
Liv Boeree: Poker, Game Theory, AI, Simulation, Aliens & Existential Risk | Lex Fridman Podcast #314



link |
00:00:00.000
Evolutionarily, if we see a lion running at us,
link |
00:00:03.040
we didn't have time to calculate the lion's kinetic energy,
link |
00:00:06.400
and is it optimal to go this way or that way?
link |
00:00:08.840
You just react to it, and physically,
link |
00:00:11.760
our bodies are well attuned to actually make the right decisions.
link |
00:00:14.520
But when you're playing a game like poker,
link |
00:00:16.600
this is not something that you ever evolved to do,
link |
00:00:19.320
and yet you're in that same fight or flight response.
link |
00:00:22.360
And so that's a really important skill to be able to develop,
link |
00:00:25.120
to basically learn how to meditate in the moment
link |
00:00:28.280
and calm yourself so that you can think clearly.
link |
00:00:32.520
The following is a conversation with Liv Boeree,
link |
00:00:35.720
formerly one of the best poker players in the world,
link |
00:00:38.400
trained as an astrophysicist
link |
00:00:40.320
and now a philanthropist and an educator
link |
00:00:44.240
on topics of game theory, physics, complexity, and life.
link |
00:00:49.120
This is the Lex Fridman podcast.
link |
00:00:51.080
To support it, please check out our sponsors
link |
00:00:53.200
in the description.
link |
00:00:54.440
And now, dear friends, here's Liv Boeree.
link |
00:00:58.520
What role do you think luck plays in poker and in life?
link |
00:01:02.560
You can pick whichever one you want,
link |
00:01:04.240
poker or life, or both.
link |
00:01:06.920
The longer you play, the less influence luck has,
link |
00:01:10.640
you know, like with all things,
link |
00:01:11.640
the bigger your sample size,
link |
00:01:13.880
the more the quality of your decisions
link |
00:01:16.360
or your strategies matter.
link |
00:01:18.840
So to answer that question, yeah, in poker, it really depends.
link |
00:01:22.760
If you and I sat and played 10 hands right now,
link |
00:01:26.120
I might only win 52% of the time, 53% maybe.
link |
00:01:30.240
But if we played 10,000 hands,
link |
00:01:31.720
then I'll probably win like over 98, 99% of the time.
link |
00:01:35.160
So it's a question of sample sizes.
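A minimal sketch of that sample-size point (illustrative only, not from the conversation): model each hand as an independent coin flip that the better player wins 52% of the time for one unit, and estimate how often they finish a session ahead.

```python
import random

# Toy model (an assumption for illustration): every hand is an
# independent +/-1 unit flip won with probability 0.52 by the better player.
def p_ahead(edge_p, n_hands, trials=2_000):
    """Estimate the probability of finishing n_hands with a profit."""
    ahead = 0
    for _ in range(trials):
        profit = sum(1 if random.random() < edge_p else -1
                     for _ in range(n_hands))
        if profit > 0:
            ahead += 1
    return ahead / trials

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} hands: P(ahead) ~ {p_ahead(0.52, n):.2f}")
# A 10-hand sample says almost nothing (ties and swings dominate);
# by 10,000 hands the 2% edge has the better player ahead nearly always.
```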
link |
00:01:38.160
And what are you figuring out over time?
link |
00:01:40.120
The betting strategy that this individual does,
link |
00:01:42.240
or does it literally not matter,
link |
00:01:43.720
against any individual over time?
link |
00:01:45.920
Against any individual over time,
link |
00:01:47.200
the better player wins because they're making better decisions.
link |
00:01:49.520
So what does that mean to make a better decision?
link |
00:01:51.240
Well, to get into the real nitty gritty already.
link |
00:01:55.280
Basically, poker is the game of math.
link |
00:01:58.720
There are these strategies, you're familiar
link |
00:02:00.320
with, like, Nash equilibria, that term, right?
link |
00:02:02.520
So there are these game theory optimal strategies
link |
00:02:06.280
that you can adopt.
link |
00:02:08.280
And the closer you play to them,
link |
00:02:10.320
the less exploitable you are.
link |
00:02:12.360
So because I've studied the game a bunch,
link |
00:02:15.920
although admittedly not for a few years,
link |
00:02:17.240
but back in, you know, when I was playing all the time,
link |
00:02:20.000
I would study these game theory optimal solutions
link |
00:02:23.040
and try and then adopt those strategies when I go and play.
link |
00:02:25.680
So I'd play against you and I would do that.
link |
00:02:27.720
And because the objective,
link |
00:02:31.800
when you're playing game theory optimal,
link |
00:02:33.240
it's actually, it's a loss minimization thing
link |
00:02:35.960
that you're trying to do.
link |
00:02:37.360
Your best bet is to try and play a sort of similar style.
link |
00:02:42.720
You also need to try and adopt this loss minimization.
link |
00:02:46.200
But because I've been playing much longer than you,
link |
00:02:48.080
I'll be better at that.
link |
00:02:49.400
So first of all,
link |
00:02:50.880
you're not taking advantage of my mistakes,
link |
00:02:53.160
but then on top of that,
link |
00:02:55.040
I'll be better at recognizing
link |
00:02:56.960
when you are playing suboptimally
link |
00:02:59.560
and then deviating from this game theory optimal strategy
link |
00:03:02.200
to exploit your bad plays.
link |
00:03:05.160
Can you define game theory and Nash equilibria?
link |
00:03:08.680
Can we try to sneak up to it in a bunch of ways?
link |
00:03:10.960
Like, what's the game theory framework
link |
00:03:13.360
of analyzing poker, analyzing any kind of situation?
link |
00:03:16.320
So game theory is just basically the study of decisions
link |
00:03:22.360
within a competitive situation.
link |
00:03:25.560
I mean, it's technically a branch of economics,
link |
00:03:27.720
but it also applies to like wider decision theory.
link |
00:03:32.280
And usually when you see it,
link |
00:03:35.840
it's these like little payoff matrices
link |
00:03:37.600
and so on, that's how it's depicted.
link |
00:03:38.840
But it's essentially just like study of strategies
link |
00:03:41.160
under different competitive situations.
link |
00:03:43.160
And as it happens, certain games,
link |
00:03:46.400
in fact, many, many games have these things
link |
00:03:48.880
called Nash equilibria.
link |
00:03:50.480
And what that means is when you're in a Nash equilibrium,
link |
00:03:52.440
basically it is not,
link |
00:03:55.480
there is no strategy that you can take
link |
00:03:59.720
that would be more beneficial
link |
00:04:01.200
than the one you're currently taking,
link |
00:04:02.800
assuming your opponent is also doing the same thing.
link |
00:04:05.720
So it would be a bad idea,
link |
00:04:06.760
if we're both playing in a game theory optimal strategy,
link |
00:04:10.680
if either of us deviate from that,
link |
00:04:12.200
now we're putting ourselves at a disadvantage.
link |
00:04:16.640
Rock, paper, scissors is actually a really great example of this.
link |
00:04:18.880
Like if we were to start playing rock, paper, scissors,
link |
00:04:22.520
you know, you know nothing about me
link |
00:04:23.800
and we're gonna play for all our money,
link |
00:04:26.200
let's play 10 rounds of it.
link |
00:04:27.880
What would your sort of optimal strategy be?
link |
00:04:30.240
Do you think?
link |
00:04:31.080
What would you do?
link |
00:04:33.680
Let's see.
link |
00:04:35.200
I would probably try to be as random as possible.
link |
00:04:42.600
Exactly, because you don't know anything about me,
link |
00:04:46.000
you don't want to give anything away about yourself,
link |
00:04:48.120
so ideally you'd have like a little dice
link |
00:04:49.560
or some sort of perfect randomizer
link |
00:04:52.560
that makes you randomize 33% of the time
link |
00:04:54.480
each of the three different things.
link |
00:04:56.080
And in response to that,
link |
00:04:58.160
well, actually I can kind of do anything,
link |
00:04:59.640
but I would probably just randomize back too,
link |
00:05:01.360
but actually it wouldn't matter
link |
00:05:02.400
because I know that you're playing randomly.
link |
00:05:05.120
So that would be us in a Nash equilibrium
link |
00:05:07.640
where we're both playing this like unexploitable strategy.
link |
00:05:10.640
However, if after a while,
link |
00:05:12.120
you then notice that I'm playing rock
link |
00:05:14.920
a little bit more often than I should.
link |
00:05:16.360
Yeah, you're the kind of person that would do that,
link |
00:05:18.160
wouldn't you?
link |
00:05:19.000
Sure, yes, yes, yes, I'm more of a scissors girl,
link |
00:05:20.800
but anyway.
link |
00:05:21.640
You are?
link |
00:05:22.480
No, I'm a, as I said, randomizer.
link |
00:05:25.560
So you notice I'm throwing rock too much
link |
00:05:27.120
or something like that.
link |
00:05:28.160
Now you'd be making a mistake
link |
00:05:29.280
by continuing playing this game theory optimal strategy,
link |
00:05:32.200
well the previous one,
link |
00:05:33.040
because now,
link |
00:05:36.680
I'm making a mistake
link |
00:05:37.840
and you're not deviating and exploiting my mistake.
link |
00:05:41.320
So you'd want to start throwing paper a bit more often
link |
00:05:43.840
in whatever you figure is the right sort of percentage
link |
00:05:46.000
of the time that I'm throwing rock too often.
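To make the rock, paper, scissors example concrete, here's a minimal sketch (the payoff matrix is the standard one; the rock-heavy opponent mix is a made-up illustration): the uniform Nash mix breaks even against anything, while the best response to the leak earns more but is no longer unexploitable.

```python
import numpy as np

# Payoffs for the row player; rows/cols are rock, paper, scissors.
A = np.array([[ 0, -1,  1],
              [ 1,  0, -1],
              [-1,  1,  0]])

uniform = np.array([1/3, 1/3, 1/3])        # the Nash equilibrium mix
rock_heavy = np.array([0.5, 0.25, 0.25])   # hypothetical leak toward rock

def ev(mine, theirs):
    """Expected payoff per round for the row player."""
    return mine @ A @ theirs

print(ev(uniform, rock_heavy))                 # ~0: equilibrium play breaks even
print(ev(np.array([0., 1., 0.]), rock_heavy))  # 0.25: all-paper exploits the leak
# The equilibrium can't lose on average, but it also doesn't punish
# mistakes; the exploit earns more and is itself exploitable if they adapt.
```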
link |
00:05:48.120
So that's basically an example of where,
link |
00:05:51.400
what game theory optimal strategy is
link |
00:05:53.320
in terms of loss minimization,
link |
00:05:54.720
but it's not always the maximally profitable thing
link |
00:05:58.080
if your opponent is doing stupid stuff,
link |
00:06:00.160
which, in that example, I was.
link |
00:06:02.600
So that's kind of then how it works in poker,
link |
00:06:04.520
but it's a lot more complex.
link |
00:06:07.400
And the way poker players typically,
link |
00:06:10.680
nowadays they study,
link |
00:06:11.760
the games change so much.
link |
00:06:12.960
And I think we should talk about how it sort of evolved,
link |
00:06:15.480
but nowadays like the top pros
link |
00:06:17.840
basically spend all their time in between sessions
link |
00:06:20.880
running these simulators using like software
link |
00:06:24.280
where they do basically Monte Carlo simulations,
link |
00:06:26.160
sort of doing billions of fictitious self-play hands.
link |
00:06:31.160
You input a fictitious hand scenario,
link |
00:06:34.040
like, oh, what do I do with Jack 9 suited
link |
00:06:36.080
on a king-ten-four, two-spade board
link |
00:06:42.000
and against this bet size.
link |
00:06:43.960
So you'd input that, press play,
link |
00:06:45.800
it'll run its billions of fake hands
link |
00:06:49.360
and then it will converge upon
link |
00:06:50.320
what the game theory optimal strategies are.
link |
00:06:53.480
And then you want to try and memorize what these are.
link |
00:06:55.440
Basically they're like ratios of how often,
link |
00:06:57.480
what types of hands you want to bluff
link |
00:06:59.920
and what percentage of the time.
link |
00:07:01.320
So then there's this additional layer
link |
00:07:02.800
of randomization built in.
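Those bluffing ratios come from indifference conditions. A minimal sketch using the classic polarized-bettor toy game (a textbook simplification, not output from the solvers described here): the bettor holds either the nuts or air and bets into the pot.

```python
# Toy game (textbook simplification): bettor bets `bet` into `pot`
# with either the nuts or air; the caller holds a bluff-catcher.

def bluff_fraction(pot, bet):
    # Caller is indifferent when bluffs make up bet / (pot + 2*bet) of
    # all bets: EV(call) = bluff% * (pot + bet) - value% * bet = 0.
    return bet / (pot + 2 * bet)

def defend_frequency(pot, bet):
    # Bettor's bluffs are indifferent when the caller continues
    # pot / (pot + bet) of the time; fold more and every bluff profits.
    return pot / (pot + bet)

pot, bet = 100, 100                  # a pot-sized bet
print(bluff_fraction(pot, bet))      # ~0.33: one bluff for every two value bets
print(defend_frequency(pot, bet))    # 0.5: call half the time
```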
link |
00:07:04.440
Yeah, those kind of simulations incorporate
link |
00:07:06.400
all the betting strategies and everything else like that.
link |
00:07:08.480
So as opposed to some kind of very crude mathematical model
link |
00:07:12.320
of what's the probability you win
link |
00:07:13.840
just based on the quality of the card,
link |
00:07:16.400
it's including everything else too.
link |
00:07:19.000
The game theory of it.
link |
00:07:20.240
Yes, yeah, essentially.
link |
00:07:21.960
And what's interesting is that nowadays,
link |
00:07:23.760
if you want to be a top pro and you go and play
link |
00:07:25.640
in these really like the super high stakes tournaments
link |
00:07:27.560
or tough cash games,
link |
00:07:29.240
if you don't know this stuff,
link |
00:07:30.680
you're going to get eaten alive in the long run.
link |
00:07:33.280
But of course you could get lucky over the short run
link |
00:07:35.160
and that's where this like luck factor comes in
link |
00:07:36.880
because luck is both a blessing and a curse.
link |
00:07:40.400
If luck didn't, you know, if there wasn't this random element
link |
00:07:42.600
and there wasn't the ability for worse players
link |
00:07:45.280
to win sometimes, then poker would fall apart.
link |
00:07:48.240
You know, the same reason people don't play chess
link |
00:07:51.000
professionally for money.
link |
00:07:53.360
You don't see people going and hustling chess,
link |
00:07:55.480
you know, trying to make a living from it
link |
00:07:57.840
because you know there's very little luck in chess,
link |
00:08:00.160
but there's quite a lot of luck in poker.
link |
00:08:01.400
Have you seen A Beautiful Mind, that movie?
link |
00:08:03.920
Years ago.
link |
00:08:04.760
Well, what do you think about the game
link |
00:08:06.120
theoretic formulation of what is it,
link |
00:08:08.560
the hot blonde at the bar?
link |
00:08:10.000
Do you remember?
link |
00:08:10.840
Oh, yeah.
link |
00:08:11.680
The way they'd illustrated it is
link |
00:08:13.560
they're trying to pick up a girl at a bar
link |
00:08:15.280
and there's multiple girls.
link |
00:08:16.440
They're like, it's like a friend group
link |
00:08:18.120
when you're trying to approach.
link |
00:08:20.080
I don't remember the details, but I remember.
link |
00:08:22.000
Don't you like then speak to her friends?
link |
00:08:23.840
Yeah, yeah, yeah.
link |
00:08:24.680
Just like that, feign disinterest.
link |
00:08:25.720
I mean, it's classic pickup artist stuff.
link |
00:08:27.040
Yeah.
link |
00:08:27.880
And they were trying to correlate that somehow,
link |
00:08:30.800
that being an optimal strategy, game theoretically.
link |
00:08:35.360
Why?
link |
00:08:36.760
What?
link |
00:08:37.600
Like, I don't think, I remember.
link |
00:08:38.920
I can't imagine that they were,
link |
00:08:39.760
I mean, there's probably an optimal strategy.
link |
00:08:41.920
Is it, does that mean that there's
link |
00:08:43.600
a natural Nash equilibrium of like picking up girls?
link |
00:08:46.640
Do you know the marriage problem?
link |
00:08:49.000
It's optimal stopping.
link |
00:08:50.920
Yes.
link |
00:08:51.760
So where it's an optimal dating strategy,
link |
00:08:54.160
where you, do you remember?
link |
00:08:56.560
Yeah.
link |
00:08:57.400
I think it's like something like,
link |
00:08:58.240
you know, you've got like a set of a hundred people
link |
00:08:59.960
you're going to look through.
link |
00:09:00.800
And after how many do you,
link |
00:09:04.280
now after that,
link |
00:09:05.680
after going on this many dates out of a hundred,
link |
00:09:08.080
at what point do you then go,
link |
00:09:09.200
okay, the next best person I see,
link |
00:09:10.840
is that the right one?
link |
00:09:11.680
And I think it's like something like 37%.
link |
00:09:15.680
It's one over e, whatever that is.
link |
00:09:17.760
Right.
link |
00:09:18.600
Which I think is 37%.
link |
00:09:19.440
Yeah.
link |
00:09:21.280
I'm going to fact check that.
link |
00:09:22.840
Yeah.
link |
00:09:25.400
So, but it's funny under those strict constraints,
link |
00:09:28.400
then yes, after that many people,
link |
00:09:30.680
as long as you have a fixed size pool,
link |
00:09:32.880
then you just pick the next person
link |
00:09:36.480
that is better than anyone you've seen before.
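A minimal simulation of that rule (the secretary problem under the strict constraints just described): skip the first k candidates, then commit to the first one who beats them all. The optimal k is about n/e, and 1/e ≈ 0.368, which is where the 37% comes from.

```python
import math, random

def success_rate(n, k, trials=20_000):
    """How often the skip-k-then-commit rule lands the single best candidate."""
    wins = 0
    for _ in range(trials):
        ranks = list(range(n))       # rank 0 is the best candidate
        random.shuffle(ranks)
        best_seen = min(ranks[:k]) if k else n
        for r in ranks[k:]:
            if r < best_seen:        # first candidate better than the sample
                wins += (r == 0)     # success only if it's the overall best
                break
    return wins / trials

n = 100
k = round(n / math.e)                # ~37: look at 37, then commit
print(k, success_rate(n, k))         # picks the best ~37% of the time
```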
link |
00:09:38.440
Yeah.
link |
00:09:40.360
Have you tried this?
link |
00:09:41.760
Have you incorporated it?
link |
00:09:42.600
I'm not one of those people.
link |
00:09:44.600
And we're going to discuss this.
link |
00:09:46.880
And what do you mean those people?
link |
00:09:50.160
I try not to optimize stuff.
link |
00:09:52.560
I try to listen to the heart.
link |
00:09:55.320
I don't think I like,
link |
00:09:59.800
my mind immediately is attracted to optimizing everything.
link |
00:10:06.240
And I think that if you really give into that kind of addiction
link |
00:10:11.080
that you lose the joy of the small things,
link |
00:10:14.920
the minutiae of life, I think.
link |
00:10:17.200
I don't know.
link |
00:10:18.040
I'm concerned about the addictive nature
link |
00:10:19.840
of my personality in that regard.
link |
00:10:21.640
In some ways, while I think the,
link |
00:10:25.560
on average people under-try to quantify things
link |
00:10:30.160
or under-optimize, there are some people who,
link |
00:10:33.920
you know, it's like with all these things,
link |
00:10:35.320
it's a, you know, it's a balancing act.
link |
00:10:37.120
I've been on dating apps, but I've never used them.
link |
00:10:40.160
I'm sure they have data on this
link |
00:10:42.040
because they probably have the optimal stopping control problem.
link |
00:10:45.240
Cause there are a lot of people that use, like,
link |
00:10:47.360
dating apps and are on there for a long time.
link |
00:10:51.040
So the interesting aspect is like, all right,
link |
00:10:56.320
how long before you stop looking,
link |
00:10:58.800
before it actually starts affecting your mind
link |
00:11:01.240
negatively such that you see dating as a kind of,
link |
00:11:07.400
A game.
link |
00:11:08.240
A kind of game versus an actual process of finding somebody
link |
00:11:13.600
that's gonna make you happy for the rest of your life.
link |
00:11:15.840
That's really interesting.
link |
00:11:17.360
They have the data.
link |
00:11:18.200
I wish they would be able to release that data.
link |
00:11:20.440
And I do want to.
link |
00:11:21.280
It's Ok-
link |
00:11:22.100
Cupid, right?
link |
00:11:22.940
I think they ran a huge, huge study on all of their.
link |
00:11:25.360
Yeah, they're more data driven.
link |
00:11:26.480
I think the OkCupid folks are.
link |
00:11:28.080
I think there's a lot of opportunity for dating apps
link |
00:11:30.240
and you know, even bigger than dating apps,
link |
00:11:32.360
people connecting on the internet.
link |
00:11:35.080
I just hope they're more data driven
link |
00:11:37.600
and it doesn't seem that way.
link |
00:11:40.360
I think like, I've always thought that
link |
00:11:44.800
Goodreads should be a dating app.
link |
00:11:47.920
Like the.
link |
00:11:48.760
I've never used it.
link |
00:11:49.600
Goodreads just lists, like, books that you've read.
link |
00:11:55.120
And it allows you to comment on the books you read
link |
00:11:57.360
and the books you're currently reading.
link |
00:11:58.920
But it's a giant social network of people reading books.
link |
00:12:01.280
And that seems to be a much better database of like interests.
link |
00:12:04.640
Of course, it constrains you to the books you're reading,
link |
00:12:06.560
but like that really reveals so much more about the person
link |
00:12:10.200
allows you to discover shared interests
link |
00:12:12.440
because books are kind of a window
link |
00:12:13.960
into the way you see the world.
link |
00:12:16.000
Also like the kind of places people you're curious about,
link |
00:12:20.360
the kind of ideas you're curious about.
link |
00:12:21.720
Are you romantic?
link |
00:12:22.680
Are you a cold, calculating rationalist?
link |
00:12:24.880
Are you, are you into Ayn Rand?
link |
00:12:27.000
Or are you into Bernie Sanders?
link |
00:12:28.280
Are you into whatever?
link |
00:12:29.360
Right.
link |
00:12:30.200
And I feel like that reveals so much more
link |
00:12:32.000
than like a person trying to look hot
link |
00:12:35.240
from a certain angle in the Tinder profile.
link |
00:12:37.120
Well, and it would also be a really great filter
link |
00:12:38.960
in the first place for people.
link |
00:12:40.360
It selects people who read books
link |
00:12:41.960
and are willing to go and rate them
link |
00:12:44.720
and give feedback on them and so on.
link |
00:12:47.040
So that's already a really strong filter
link |
00:12:48.760
or probably the type of people you'd be looking for.
link |
00:12:50.640
Well, at least be able to fake reading books.
link |
00:12:52.400
I mean, the thing about books,
link |
00:12:53.600
you don't really need to read it.
link |
00:12:54.560
You can just look at the cliff notes.
link |
00:12:55.880
Yeah, game the dating app by feigning intellectualism.
link |
00:12:59.120
Can I admit something very horrible about myself?
link |
00:13:02.400
Go on.
link |
00:13:03.240
The things that, you know,
link |
00:13:04.360
I don't have many things in my closet,
link |
00:13:05.760
but this is one of them.
link |
00:13:08.040
I've never actually read Shakespeare.
link |
00:13:10.760
I've only read cliff notes.
link |
00:13:12.400
And I got a five in the AP English exam.
link |
00:13:14.760
Wow.
link |
00:13:15.920
And I had the, which books have I read?
link |
00:13:19.680
Well, yeah, which was the exam on which books?
link |
00:13:21.560
Oh, no, they include a lot of them.
link |
00:13:24.360
But Hamlet, I don't even know if I read Romeo and Juliet.
link |
00:13:29.400
Macbeth, I don't remember, but I don't understand it.
link |
00:13:32.760
It's like really cryptic.
link |
00:13:34.080
It's hard.
link |
00:13:34.920
It's really, I don't, and it's not that pleasant to read.
link |
00:13:38.040
It's like ancient speak.
link |
00:13:39.280
I don't understand it.
link |
00:13:40.400
Anyway, maybe I was too dumb.
link |
00:13:41.840
Man, I'm still too dumb, but I did.
link |
00:13:44.760
But you got a five, which is.
link |
00:13:45.800
Yeah, yeah.
link |
00:13:46.640
I don't know how the U.S. grading system works.
link |
00:13:48.000
Oh, no, so AP English is a,
link |
00:13:50.400
there are kind of these advanced versions of courses
link |
00:13:53.080
in high school and you take a test
link |
00:13:54.920
that is like a broad test for that subject and includes a lot.
link |
00:13:58.920
It wasn't obviously just Shakespeare.
link |
00:14:00.600
I think a lot of it was also writing, written.
link |
00:14:04.920
You have like AP physics, AP computer science,
link |
00:14:07.160
AP biology, AP chemistry, and then AP English
link |
00:14:11.160
and AP literature, I forget what it was.
link |
00:14:13.520
But I think Shakespeare was a part of that.
link |
00:14:16.080
But I.
link |
00:14:17.400
And you, and you, the point is you got a five at it.
link |
00:14:19.200
Got a five.
link |
00:14:20.040
Well, entirely, I was into getting A's.
link |
00:14:22.640
I saw it as a game.
link |
00:14:24.240
I don't think any, I don't think all the learning I've done
link |
00:14:30.400
has been outside of the, outside of school.
link |
00:14:34.000
The deepest learning I've done has been outside of school
link |
00:14:36.320
with a few exceptions, especially in grad school,
link |
00:14:38.440
like deep computer science courses.
link |
00:14:40.520
But that was still outside of school
link |
00:14:41.840
because it was outside of getting, sorry,
link |
00:14:43.760
it was outside of getting the A for the course.
link |
00:14:46.240
The best stuff I've ever done is when you read the chapter
link |
00:14:49.680
and you do many of the problems at the end of the chapter,
link |
00:14:52.400
which is usually not what's required for the course.
link |
00:14:55.000
Like the hardest stuff.
link |
00:14:56.200
In fact, textbooks are freaking incredible.
link |
00:14:58.880
If you go back now and you look at like biology textbook
link |
00:15:02.320
or any of the computer science textbooks
link |
00:15:06.120
on algorithms and data structures,
link |
00:15:07.800
those things are incredible.
link |
00:15:09.600
They have the best summary of a subject.
link |
00:15:12.040
Plus they have practice problems of increasing difficulty
link |
00:15:15.400
that allow you to truly master the basic,
link |
00:15:17.760
like the fundamental ideas behind that.
link |
00:15:19.720
That was, I got through my entire physics degree
link |
00:15:22.680
with one textbook that was just this really comprehensive one
link |
00:15:26.360
that they told us at the beginning of the first year,
link |
00:15:28.720
buy this, but you're gonna have to buy 15 other books
link |
00:15:31.760
for all your supplementary courses.
link |
00:15:33.440
And I was like, every time I just checked
link |
00:15:35.600
to see whether this book covered it and it did.
link |
00:15:38.120
And I think I only bought like two or three extra
link |
00:15:39.840
and thank God, because textbooks are so super expensive.
link |
00:15:41.960
It's a whole racket they've got going on.
link |
00:15:44.360
Yeah, they are, they could just,
link |
00:15:46.280
you get the right one, it's just like a manual for,
link |
00:15:49.560
but what's interesting though,
link |
00:15:51.440
is this is the tyranny of having exams and metrics.
link |
00:15:56.840
The tyranny of exams and metrics, yes.
link |
00:15:58.720
I loved them because I loved, I'm very competitive
link |
00:16:01.040
and I liked finding ways to gamify things
link |
00:16:04.080
and then like sort of dust off my shoulders
link |
00:16:06.160
after when I get a good grade or be annoyed at myself
link |
00:16:08.200
when I didn't, but yeah, you're absolutely right
link |
00:16:10.920
in that the actual, you know,
link |
00:16:12.880
how much of that physics knowledge I've retained?
link |
00:16:15.760
Like I've, I learned how to cram and study
link |
00:16:19.720
and please an examiner,
link |
00:16:21.400
but did that give me the deep lasting knowledge
link |
00:16:24.040
that I needed?
link |
00:16:24.880
I mean, yes and no, but really like,
link |
00:16:27.840
nothing makes you learn a topic better
link |
00:16:31.040
than when you actually then have to teach it yourself.
link |
00:16:34.080
You know, like I'm trying to wrap my teeth around this
link |
00:16:36.240
like game theory, Moloch stuff right now.
link |
00:16:38.200
And there's no exam at the end of it that I can gamify.
link |
00:16:43.120
There's no way to gamify and sort of like
link |
00:16:44.600
shortcut my way through it.
link |
00:16:45.440
I have to understand it so deeply
link |
00:16:47.280
from like deep foundational levels to then build upon it
link |
00:16:50.720
and then try and explain it to other people.
link |
00:16:52.400
And like, you know,
link |
00:16:53.240
you're about to go and do some lectures, right?
link |
00:16:54.360
You can't, you can't sort of just like,
link |
00:16:58.680
you presumably can't rely on the knowledge
link |
00:17:00.680
that you got through when you were studying for an exam
link |
00:17:03.480
to reteach that.
link |
00:17:04.880
Yeah, and especially high level lectures,
link |
00:17:06.880
especially the kind of stuff you do on YouTube,
link |
00:17:09.440
you're not just regurgitating material.
link |
00:17:12.800
You have to think through what is the core idea here.
link |
00:17:17.960
And when you do the lectures live, especially,
link |
00:17:20.800
you have to, there are no second takes.
link |
00:17:25.040
That is the luxury you get if you're recording a video
link |
00:17:28.440
for YouTube or something like that,
link |
00:17:30.240
but it definitely is a luxury you shouldn't lean on.
link |
00:17:35.040
I've gotten to interact with a few YouTubers
link |
00:17:37.320
that lean on that too much.
link |
00:17:39.280
And you realize, oh, you're,
link |
00:17:41.440
you've gamified this system
link |
00:17:43.360
because you're not really thinking deeply about stuff.
link |
00:17:46.720
Through the edit,
link |
00:17:48.400
both written and spoken.
link |
00:17:52.200
You're crafting an amazing video,
link |
00:17:53.880
but you yourself as a human being
link |
00:17:55.440
have not really deeply understood it.
link |
00:17:57.560
So live teaching, or at least recording video
link |
00:18:00.760
with very few takes is a different beast.
link |
00:18:04.640
And I think it's the most honest way of doing it,
link |
00:18:07.280
like as few takes as possible.
link |
00:18:09.200
That's what I'm nervous about this.
link |
00:18:10.720
Don't go back, you're like,
link |
00:18:13.160
ah, let's do that.
link |
00:18:14.000
Don't fuck this up, Liv.
link |
00:18:17.200
The tyranny of exams.
link |
00:18:18.480
I do think people talk about high school and college
link |
00:18:24.160
as a time to do drugs and drink and have fun
link |
00:18:27.160
and all this kind of stuff.
link |
00:18:28.280
But, you know, looking back,
link |
00:18:31.880
of course I did a lot of those things.
link |
00:18:34.200
No, yes, but it's also a time
link |
00:18:39.000
when you get to like read textbooks or read books
link |
00:18:43.640
or learn with all the time in the world.
link |
00:18:48.400
Like you don't have these responsibilities of like,
link |
00:18:51.520
you know, laundry and having to sort of pay the mortgage
link |
00:19:00.880
or all that kind of stuff, pay taxes,
link |
00:19:02.480
all this kind of stuff.
link |
00:19:04.040
In most cases, there's just so much time
link |
00:19:06.920
in the day for learning.
link |
00:19:08.160
And you don't realize at the time,
link |
00:19:10.280
because at the time it seems like a chore,
link |
00:19:11.920
like why the hell is there so much homework?
link |
00:19:15.480
But you never get a chance to do this kind of learning,
link |
00:19:18.040
this kind of homework ever again in life,
link |
00:19:21.080
unless later in life you really make a big effort out of it.
link |
00:19:24.600
You get, like you basically, your knowledge gets solidified.
link |
00:19:27.360
You don't get to have fun and learn.
link |
00:19:29.160
Learning is really fulfilling and really fun
link |
00:19:33.360
if you're that kind of person.
link |
00:19:34.320
Like some people like to, you know,
link |
00:19:37.840
like knowledge is not something that they think is fun.
link |
00:19:40.560
But if that's the kind of thing that you think is fun,
link |
00:19:43.640
that's the time to have fun and do the drugs
link |
00:19:45.840
and drink and all that kind of stuff.
link |
00:19:46.920
But the learning, just going back to those textbooks,
link |
00:19:51.400
the hours spent with the textbooks is really, really rewarding.
link |
00:19:55.040
Do people even use textbooks anymore?
link |
00:19:56.640
Yeah.
link |
00:19:57.480
Do you think?
link |
00:19:58.320
Because it's, these days, with their TikTok and their.
link |
00:20:01.280
Well, not even that, but it's just like so much information,
link |
00:20:04.720
really high quality information.
link |
00:20:05.960
You know, it's now in digital format online.
link |
00:20:09.200
Yeah, but they're not, they are using that,
link |
00:20:11.160
but you know, college is still very, there's a curriculum.
link |
00:20:16.240
I mean, so much of school is about rigorous study
link |
00:20:19.680
of a subject and still on YouTube, that's not there.
link |
00:20:23.680
Right.
link |
00:20:24.520
YouTube has, Grant Sanderson talks about this.
link |
00:20:27.760
He's this math.
link |
00:20:29.320
3Blue1Brown.
link |
00:20:30.160
Yeah, 3Blue1Brown.
link |
00:20:31.640
He says like, I'm not a math teacher.
link |
00:20:33.960
I just take really cool concepts and I inspire people.
link |
00:20:37.800
But if you want to really learn calculus,
link |
00:20:39.560
if you want to really learn linear algebra,
link |
00:20:41.880
you should do the textbook, you should do that.
link |
00:20:44.920
You know, and there's still the textbook industrial complex
link |
00:20:49.080
that like charges like $200 for textbooks somehow.
link |
00:20:53.440
I don't know.
link |
00:20:54.280
It's ridiculous.
link |
00:20:56.640
Well, they're like, oh, sorry, new edition, edition 14.6.
link |
00:21:00.640
Sorry, you can't use 14.5 anymore.
link |
00:21:02.920
It's like, what's different?
link |
00:21:03.760
We've got one paragraph different.
link |
00:21:05.400
So we mentioned offline, Daniel Negreanu.
link |
00:21:09.000
I'm gonna get a chance to talk to him on this podcast.
link |
00:21:11.680
And he's somebody that I found fascinating
link |
00:21:14.440
in terms of the way he thinks about poker,
link |
00:21:16.600
verbalizes the way he thinks about poker,
link |
00:21:18.680
the way he plays poker.
link |
00:21:20.240
So, and he's still pretty damn good.
link |
00:21:22.960
He's been good for a long time.
link |
00:21:24.920
So you mentioned that people are running these kinds
link |
00:21:27.720
of simulations and the game of poker has changed.
link |
00:21:30.600
Do you think he's adapting in this way?
link |
00:21:33.200
Do, like, the top pros,
link |
00:21:34.960
do they have to adapt this way?
link |
00:21:36.600
Or is there still like over the years,
link |
00:21:41.600
you basically developed this gut feeling about,
link |
00:21:45.240
like you get to be like good the way AlphaZero is good.
link |
00:21:49.760
You look at the board and somehow from the fog
link |
00:21:54.120
comes out the right answer.
link |
00:21:55.320
Like this is likely what they have.
link |
00:21:57.960
This is likely the best way to move.
link |
00:22:00.400
And you don't really,
link |
00:22:01.240
you can't really put a finger on exactly why,
link |
00:22:05.480
but it just comes from your gut feeling or no.
link |
00:22:09.000
Yes and no.
link |
00:22:10.520
So gut feelings are definitely very important.
link |
00:22:15.000
We've got our two modes, or you can distill it down
link |
00:22:17.720
to two modes of decision making, right?
link |
00:22:19.080
You've got your sort of logical linear voice in your head,
link |
00:22:22.160
system two as it's often called
link |
00:22:24.040
and your system one, your gut intuition.
link |
00:22:28.960
And historically in poker,
link |
00:22:32.360
the very best players were playing almost entirely
link |
00:22:35.160
by their gut.
link |
00:22:37.080
You know, often they would do some kind of inspired play
link |
00:22:39.240
and you'd ask them why they do it
link |
00:22:40.480
and they wouldn't really be able to explain it.
link |
00:22:42.600
And that's not so much because their process
link |
00:22:46.480
was unintelligible,
link |
00:22:47.600
but it was more just because no one had the language
link |
00:22:50.160
with which to describe what optimal strategies were
link |
00:22:52.360
because no one really understood how poker worked.
link |
00:22:54.280
This was before, you know, we had analysis software,
link |
00:22:57.560
you know, no one was writing,
link |
00:22:59.360
I guess some people would write down their hands
link |
00:23:01.480
in a little notebook,
link |
00:23:02.720
but there was no way to assimilate all this data
link |
00:23:04.600
and analyze it.
link |
00:23:05.920
But then, you know, when computers became cheaper
link |
00:23:08.480
and software started emerging
link |
00:23:09.880
and then obviously online poker,
link |
00:23:11.440
where it would like automatically save your hand histories,
link |
00:23:14.200
now all of a sudden you kind of had this body of data
link |
00:23:17.040
that you could run analysis on.
link |
00:23:19.560
And so that's when people started to see, you know,
link |
00:23:22.120
these mathematical solutions and so what that meant
link |
00:23:28.600
is the role of intuition essentially became smaller.
link |
00:23:33.600
And it went more into as we talked before
link |
00:23:38.400
about, you know, this game theory optimal style.
link |
00:23:40.600
But also as I said, like game theory optimal
link |
00:23:43.320
is about loss minimization and being unexploitable.
link |
00:23:47.760
But if you're playing against people who aren't,
link |
00:23:49.640
because no person, no human being can play perfectly
link |
00:23:51.600
game theory optimal in poker, not even the best AIs,
link |
00:23:54.040
they're still like, they're 99.99% of the way there
link |
00:23:57.200
or whatever, but it's kind of like speed of light,
link |
00:23:59.240
you can't reach it perfectly.
link |
00:24:01.080
So there's still a role for intuition?
link |
00:24:03.760
Yes, so when, yeah, when you're playing
link |
00:24:07.120
this unexploitable style, but when your opponents start
link |
00:24:09.960
doing something, you know, suboptimal
link |
00:24:12.560
that you want to exploit, well, now that's where
link |
00:24:15.840
not only your like logical brain will need to be thinking,
link |
00:24:18.120
well, okay, I know I have this,
link |
00:24:20.200
I'm in the sort of top end of my range here with this,
link |
00:24:22.560
with this hand.
link |
00:24:23.960
So that means I need to be calling x% of the time
link |
00:24:27.800
and I put them on this range, et cetera.
link |
00:24:30.480
But then sometimes you'll have this gut feeling
link |
00:24:34.000
that will tell you, you know, you know what, this time
link |
00:24:37.320
I know, I know mathematically I meant to call now,
link |
00:24:40.080
you know, I've got, I'm in the sort of top end of my range
link |
00:24:42.720
and this is the odds I'm getting.
link |
00:24:45.160
So the math says I should call, but there's something
link |
00:24:47.480
in your gut saying they've got it this time,
link |
00:24:49.360
they've got it like, they're beating you,
link |
00:24:53.080
your hand might be worse.
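The "odds I'm getting" part is plain pot-odds arithmetic. A minimal sketch with made-up numbers (not the actual hand being described):

```python
def required_equity(pot, bet):
    """Break-even win probability to call `bet` after an opponent bets
    `bet` into a pot of `pot`: you risk `bet` to win `pot + bet`."""
    return bet / (pot + 2 * bet)

print(required_equity(100, 100))  # ~0.33: a pot-sized bet needs 1 win in 3
print(required_equity(100, 50))   # 0.25: a half-pot bet needs 1 win in 4
# If your hand beats their range more often than this, "the math says
# call"; the gut read is about whether this hand is the exception.
```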
link |
00:24:55.200
So then the real art, this is where the last remaining art
link |
00:24:58.720
in poker, the fuzziness is like, do you listen to your gut?
link |
00:25:03.680
How do you quantify the strength of it
link |
00:25:06.120
or can you even quantify the strength of it?
link |
00:25:08.360
And I think that's what Daniel has.
link |
00:25:13.120
I mean, I can't speak for how much he's studying
link |
00:25:15.360
with the simulators and that kind of thing.
link |
00:25:17.760
I think he has, like he must be to still be keeping up,
link |
00:25:22.280
but he has an incredible intuition for just,
link |
00:25:26.360
he's seen so many hands of poker in the flesh.
link |
00:25:29.200
He's seen so many people, the way they behave
link |
00:25:31.880
when the chips are, you know, when the money's on the line
link |
00:25:33.760
and he's staring you down in the eye,
link |
00:25:36.120
you know, he's intimidating.
link |
00:25:37.480
He's got this like kind of X factor vibe
link |
00:25:39.640
that he, you know, gives out.
link |
00:25:42.200
And he talks a lot, which is an interactive element,
link |
00:25:45.040
which is he's getting stuff from other people.
link |
00:25:47.240
Yes.
link |
00:25:48.080
Yeah.
link |
00:25:48.920
And just like the subtlety.
link |
00:25:49.760
So he's like, he's probing constantly.
link |
00:25:51.480
Yeah, he's probing and he's getting this extra layer
link |
00:25:53.920
of information that others can't.
link |
00:25:55.960
Now that said though, he's good online as well.
link |
00:25:57.960
You know, I don't know how, again,
link |
00:25:59.640
would he be beating the top cash game players online?
link |
00:26:02.720
Probably not, no.
link |
00:26:05.160
But when he's in person
link |
00:26:07.320
and he's got that additional layer of information,
link |
00:26:09.000
he can not only extract it,
link |
00:26:10.720
but he knows what to do with it still so well.
link |
00:26:14.200
There's one player who I would say is the exception
link |
00:26:16.120
to all of this.
link |
00:26:18.000
And he's one of my favorite people to talk about
link |
00:26:19.960
in terms of, I think he might have cracked the simulation.
link |
00:26:24.520
It's Phil Hellmuth, he...
link |
00:26:28.040
In more ways than one.
link |
00:26:29.120
He's cracked the simulation, I think.
link |
00:26:30.560
Yeah, he somehow to this day is still,
link |
00:26:34.560
and I love you, Phil, I'm not in any way knocking you,
link |
00:26:38.320
he's still winning so much
link |
00:26:41.480
at the World Series of Poker specifically.
link |
00:26:43.920
He's now won 16 bracelets.
link |
00:26:45.560
The next nearest person I think has won 10.
link |
00:26:48.520
And he is consistently year in, year out,
link |
00:26:50.280
going deep or winning these huge field tournaments.
link |
00:26:53.200
You know, with like 2,000 people,
link |
00:26:55.360
which statistically he should not be doing.
link |
00:26:57.840
And yet, you watch some of the plays he makes
link |
00:27:02.760
and they make no sense.
link |
00:27:04.120
Like mathematically, they are so far
link |
00:27:05.760
from game theory optimal.
link |
00:27:07.600
And the thing is, if you went and stuck him
link |
00:27:09.080
in one of these high stakes cash games
link |
00:27:11.440
with a bunch of like GTO people,
link |
00:27:13.280
he's gonna get ripped apart.
link |
00:27:15.120
But there's something that he has
link |
00:27:16.600
that when he's in the halls
link |
00:27:17.560
of the World Series of Poker specifically,
link |
00:27:19.640
amongst sort of amateurish players,
link |
00:27:23.880
he gets them to do crazy shit like that.
link |
00:27:26.920
And, but my little pet theory is that also,
link |
00:27:31.000
he just, the cards, he's like a wizard
link |
00:27:35.160
and he gets the cards to do what he needs them to.
link |
00:27:39.040
Because he just expects to win
link |
00:27:42.840
and he expects to receive, you know,
link |
00:27:44.280
to flop a set with a frequency far beyond what the,
link |
00:27:48.040
you know, the real percentages are.
link |
00:27:50.840
And I don't even know
link |
00:27:51.680
if he knows what the real percentages are.
link |
00:27:52.640
He doesn't need to because he gets that.
link |
00:27:54.800
I think he has found the cheat code.
link |
00:27:56.040
Because when I've seen him play,
link |
00:27:57.440
he seems to be like annoyed
link |
00:27:59.920
that the long shot thing didn't happen.
link |
00:28:02.000
Yes.
link |
00:28:02.920
He's like annoyed.
link |
00:28:04.240
And it's almost like everybody else is stupid
link |
00:28:06.520
because he was obviously going to win with this.
link |
00:28:08.280
He's meant to win if that silly thing hadn't happened.
link |
00:28:10.600
And it's like, you don't understand,
link |
00:28:11.520
the silly thing happens 99% of the time.
link |
00:28:14.000
And it's a 1%, not the other way round.
link |
00:28:15.760
But genuinely, for his lived experience at the Worlds,
link |
00:28:18.600
only at the Worlds, it is like that.
link |
00:28:21.560
So I don't blame him for feeling that way.
link |
00:28:24.040
But he does, he has this X factor
link |
00:28:26.480
and the poker community has tried for years
link |
00:28:29.960
to rip him down saying like, you know, he doesn't,
link |
00:28:31.880
he's no good, but he's clearly good
link |
00:28:33.720
because he's still winning or there's something going on.
link |
00:28:36.240
Whether that's, he's figured out how to mess
link |
00:28:39.160
with the fabric of reality
link |
00:28:40.640
and how cards are, you know,
link |
00:28:42.600
how a randomly shuffled deck of cards comes out.
link |
00:28:44.320
I don't know what it is, but he's doing it right still.
link |
00:28:46.680
Who do you think is the greatest of all time?
link |
00:28:48.720
Would you put Hellmuth?
link |
00:28:52.360
Now he's definitely, he seems like the kind of person
link |
00:28:54.720
who, if we mention him, would actually watch this.
link |
00:28:56.520
So you might want to be careful.
link |
00:28:58.360
Well, as I said, I love Phil.
link |
00:28:59.800
And I have, I would say this to his face.
link |
00:29:03.480
I'm not saying anything.
link |
00:29:04.320
I don't, he's got, he truly, I mean,
link |
00:29:07.040
he is one of the greatest.
link |
00:29:09.560
I don't know if he's the greatest.
link |
00:29:11.000
He's certainly the greatest at the World Series of Poker.
link |
00:29:14.300
And he is the greatest at,
link |
00:29:16.400
despite the game switching into a pure game
link |
00:29:19.400
or almost entirely a game of math,
link |
00:29:20.920
he has managed to keep the magic alive.
link |
00:29:22.840
And there's like, just through sheer force of will
link |
00:29:26.000
making the game work for him.
link |
00:29:27.600
And that is incredible.
link |
00:29:28.660
And I think it's something that should be studied
link |
00:29:30.480
because it's an example.
link |
00:29:32.000
Yeah, there might be some actual game theoretic wisdom.
link |
00:29:35.240
There might be something to be said about optimality
link |
00:29:37.400
from studying him, right?
link |
00:29:39.560
What do you mean by optimality?
link |
00:29:40.960
Meaning, or rather game design, perhaps.
link |
00:29:45.560
Meaning if what he's doing is working,
link |
00:29:48.440
maybe poker is more complicated
link |
00:29:51.640
than we're currently modeling it as.
link |
00:29:54.200
So like, yeah.
link |
00:29:55.040
Or there's an extra layer,
link |
00:29:56.640
and I don't mean to get too weird and wooey,
link |
00:29:59.480
but, or there's an extra layer of ability
link |
00:30:04.640
to manipulate the things the way you want them to go
link |
00:30:07.520
that we don't understand yet.
link |
00:30:09.720
Do you think Phil Hellmuth understands them?
link |
00:30:11.960
Is he just generally.
link |
00:30:13.600
Hashtag positivity.
link |
00:30:14.760
Well, he wrote a book on positivity, and he.
link |
00:30:17.200
He has.
link |
00:30:18.040
He did.
link |
00:30:18.880
Not like a trolling book.
link |
00:30:20.400
No.
link |
00:30:21.240
A serious.
link |
00:30:22.080
Straight up, yeah.
link |
00:30:23.800
Phil Hellmuth wrote a book about positivity.
link |
00:30:26.040
Yes.
link |
00:30:27.240
Okay.
link |
00:30:28.080
About, I think, and I think it's about sort of manifesting
link |
00:30:31.040
what you want and getting the outcomes that you want
link |
00:30:34.140
by believing so much in yourself
link |
00:30:36.500
and in your ability to win, like eyes on the prize.
link |
00:30:40.240
And I mean, it's working.
link |
00:30:42.160
The man's delivered.
link |
00:30:43.160
But where do you put like Phil Ivey
link |
00:30:45.360
and all those kinds of people?
link |
00:30:47.360
I mean, I'm too, I've been, to be honest,
link |
00:30:50.120
too much out of the scene
link |
00:30:50.960
for the last few years to really,
link |
00:30:53.760
I mean, Phil Ivey's clearly got, again,
link |
00:30:55.240
he's got that X factor.
link |
00:30:57.960
He's so incredibly intimidating to play against.
link |
00:31:00.400
I've only played against him a couple of times,
link |
00:31:01.760
but when he like looks you in the eye
link |
00:31:03.680
and you're trying to run a bluff on him,
link |
00:31:05.040
oof, no one's made me sweat harder than Phil Ivey.
link |
00:31:07.160
Just my bluff got through, actually.
link |
00:31:10.400
That was actually one of the most thrilling moments
link |
00:31:11.840
I've ever had in poker. It was in Monte Carlo,
link |
00:31:14.280
in a high roller.
link |
00:31:15.480
I can't remember exactly what the hand was,
link |
00:31:16.760
but I, you know, three-bet
link |
00:31:19.520
and then like just barreled all the way through.
link |
00:31:22.120
And he just like put his laser eyes into me.
link |
00:31:24.400
And I felt like he was just scouring my soul.
link |
00:31:28.200
And I was just like, hold it together, Liv,
link |
00:31:29.760
hold it together.
link |
00:31:30.600
And he was like, hold it.
link |
00:31:31.440
You knew your hand was weaker.
link |
00:31:33.040
Yeah. I mean, I was bluffing.
link |
00:31:34.400
I presume, which, you know,
link |
00:31:36.360
there's a chance I was bluffing with the best hand,
link |
00:31:37.720
but I'm pretty sure my hand was worse.
link |
00:31:40.640
And he folded.
link |
00:31:44.600
It was truly one of the deep highlights of my career.
link |
00:31:47.440
Did you show the cards or did you fold?
link |
00:31:50.000
What is it?
link |
00:31:50.840
You should never show in game.
link |
00:31:52.600
Because especially as I felt like I was one
link |
00:31:53.880
of the worst players at the table in that tournament.
link |
00:31:55.680
So giving that information,
link |
00:31:57.840
unless I had a really solid plan
link |
00:31:59.840
that I was now like advertising or look,
link |
00:32:02.240
I'm capable of bluffing Phil Ivey,
link |
00:32:03.560
but like why, it's much more valuable
link |
00:32:06.680
to take advantage of the impression that they have of me,
link |
00:32:09.000
which is like, I'm a scared girl
link |
00:32:10.200
playing a high roller for the first time.
link |
00:32:11.960
Keep that going, you know.
link |
00:32:14.720
Interesting.
link |
00:32:15.560
But isn't there layers to this?
link |
00:32:17.080
Like psychological warfare that the scared girl
link |
00:32:21.320
might be way smart.
link |
00:32:22.400
And then like just to flip the tables,
link |
00:32:24.680
do you think about that kind of stuff?
link |
00:32:26.000
Oh, definitely.
link |
00:32:26.840
Is it better not to reveal information?
link |
00:32:28.480
I mean, generally speaking,
link |
00:32:29.480
you want to not reveal information.
link |
00:32:31.040
You know, the goal of poker is to be as deceptive
link |
00:32:34.120
as possible about your own strategies
link |
00:32:36.000
while eliciting as much out of your opponent
link |
00:32:38.680
about their own.
link |
00:32:39.720
So giving them free information,
link |
00:32:42.080
particularly if they're people
link |
00:32:42.920
who you consider very good players,
link |
00:32:44.800
any information I give them
link |
00:32:46.360
is going into their little database
link |
00:32:48.120
and I assume it's going to be calculated and used well.
link |
00:32:51.760
So I have to be really confident
link |
00:32:53.480
that my like meta gaming that I'm going to then do,
link |
00:32:56.720
or they've seen this, so therefore that,
link |
00:32:58.320
I'm going to be on the right level.
link |
00:33:00.640
So it's better just to keep that little secret
link |
00:33:03.400
to myself in the moment.
link |
00:33:04.320
So how much is bluffing part of the game?
link |
00:33:06.720
Huge amount.
link |
00:33:08.360
So yeah, I mean, maybe actually, let me ask,
link |
00:33:10.400
like what did it feel like with Phil Ivey
link |
00:33:13.000
or anyone else when it's a high stake,
link |
00:33:15.040
when it's a big, it's a big bluff.
link |
00:33:18.840
So a lot of money on the table.
link |
00:33:21.680
And maybe, I mean, what defines a big bluff?
link |
00:33:24.640
Maybe a lot of money on the table,
link |
00:33:26.000
but also some uncertainty in your mind and heart
link |
00:33:29.160
about like self doubt,
link |
00:33:31.960
well, maybe I miscalculated what's going on here,
link |
00:33:34.560
what the bet said, all that kind of stuff.
link |
00:33:36.720
Like what does that feel like?
link |
00:33:38.840
I mean, it's, I imagine comparable to,
link |
00:33:46.000
you know, running a, I mean,
link |
00:33:47.840
any kind of big bluff where you have a lot of something
link |
00:33:52.360
that you care about on the line, you know?
link |
00:33:54.640
So if you're bluffing in a courtroom,
link |
00:33:57.240
not that I'd ever want to do that,
link |
00:33:58.160
or something equatable to that, it's incredible.
link |
00:34:02.440
You know, in that scenario, you know,
link |
00:34:04.040
I think it was the first time I'd ever played a 20,
link |
00:34:05.680
I'd won my way into this 25K tournament.
link |
00:34:09.160
So that was the buy in, 25,000 euros.
link |
00:34:11.160
And I had satellited my way in
link |
00:34:13.000
because it was much bigger than I would ever normally play.
link |
00:34:16.080
And, you know, I hadn't,
link |
00:34:17.160
I wasn't that experienced at the time.
link |
00:34:18.720
And now I was sitting there against all the big boys,
link |
00:34:21.240
you know, the Negreanus, the Phil Iveys and so on.
link |
00:34:23.600
And then to like, each time you put the bets out,
link |
00:34:30.360
you know, you put another bet out, your card.
link |
00:34:33.000
Yeah, I was on a, what's called a semi bluff.
link |
00:34:34.880
So there were some cards that could come
link |
00:34:36.320
that would make my hand very, very strong and therefore win,
link |
00:34:39.200
but most of the time those cards don't come.
link |
00:34:40.960
So that is a semi bluff because you're representing,
link |
00:34:43.640
what are you representing that you already have something?
link |
00:34:47.880
So I think in this scenario, I had a flush draw,
link |
00:34:52.160
so I had two clubs, two clubs came out on the flop,
link |
00:34:55.920
and then I'm hoping that on the turn and the river,
link |
00:34:58.080
one will come.
link |
00:34:59.120
So I have some future equity.
link |
00:35:00.960
I could hit a club and then I'll have the best hand
link |
00:35:02.760
in which case, great.
link |
00:35:04.600
And so I can keep betting and I'll want them to call,
link |
00:35:07.480
but I've also got the other way of winning the hand
link |
00:35:09.640
where if my card doesn't come,
link |
00:35:11.920
I can keep betting and get them to fold their hand.
link |
00:35:14.720
And I'm pretty sure that's what the scenario was.
link |
00:35:18.040
So I had some future equity, but it's still, you know,
link |
00:35:21.040
most of the time I don't hit that club.
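A quick sketch of the arithmetic behind that draw (standard out-counting; the code is illustrative, not from the conversation): with two clubs in hand and two on the board, 9 of the 47 unseen cards complete the flush.

```python
from math import comb

outs, unseen = 9, 47                 # 9 clubs left among 47 unseen cards
p_miss = comb(unseen - outs, 2) / comb(unseen, 2)  # blank turn AND river
print(1 - p_miss)                    # ~0.35: the flush arrives about 1 in 3
# Hence the two ways to win: ~35% equity when called, plus the fold
# equity every time the opponent gives up before showdown.
```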
link |
00:35:23.280
And so I'd rather him just fold
link |
00:35:25.320
because the pot is now getting bigger and bigger.
link |
00:35:27.800
And in the end, like I jammed all in on the river,
link |
00:35:31.000
that's my entire tournament on the line.
link |
00:35:33.680
As far as I'm aware, this might be the one time
link |
00:35:35.160
I ever get to play a big 25K.
link |
00:35:37.200
You know, this was the first time I played one.
link |
00:35:38.680
So it was, it felt like the most momentous thing.
link |
00:35:42.000
And this was also when I was trying to build myself up,
link |
00:35:43.800
you know, build my name, a name for myself in poker.
link |
00:35:46.680
I wanted to get respect.
link |
00:35:47.720
It would destroy everything for you.
link |
00:35:49.280
It felt like it in the moment.
link |
00:35:50.520
Like, I mean, it literally does feel
link |
00:35:51.760
like a form of life and death.
link |
00:35:52.800
Like your body physiologically
link |
00:35:54.360
is having that fight or flight response.
link |
00:35:55.880
What are you doing with your body?
link |
00:35:57.120
What are you doing with your face?
link |
00:35:58.240
Are you just like, what are you thinking about?
link |
00:36:02.720
Well, a mixture of like, okay, what are the cards?
link |
00:36:04.760
So in theory, I'm thinking about like, okay,
link |
00:36:07.080
what are cards that make my hand look stronger?
link |
00:36:10.040
Which cards hit my perceived range from his perspective?
link |
00:36:13.880
Which cards don't?
link |
00:36:15.680
What's the right amount of bet size to, you know,
link |
00:36:18.320
maximize my fold equity in this situation?
link |
00:36:20.840
You know, that's the logical stuff
link |
00:36:22.120
that I should be thinking about.
link |
00:36:23.240
But I think in reality, because I was so scared,
link |
00:36:25.440
because there's this, at least for me,
link |
00:36:26.920
there's a certain threshold of like nervousness or stress
link |
00:36:30.440
beyond which the like logical brain shuts off.
link |
00:36:33.440
And now it just gets into this like,
link |
00:36:36.600
it just like, it feels like a game of wits basically.
link |
00:36:38.800
It's like, of nerve.
link |
00:36:39.840
Can you hold your resolve?
link |
00:36:41.800
And it certainly got by that, like by the river.
link |
00:36:44.320
I think by that point, I was like,
link |
00:36:45.840
I don't even know if this is a good bluff anymore,
link |
00:36:47.720
but fuck it, let's do it.
link |
00:36:49.960
Your mind is almost numb
link |
00:36:51.280
from the intensity of that feeling.
link |
00:36:53.160
I call it the white noise.
link |
00:36:55.160
And that's, and it happens in all kinds of decision making.
link |
00:36:58.920
I think anything that's really, really stressful.
link |
00:37:00.800
Like I can imagine someone in like an important job interview,
link |
00:37:02.960
if it's like a job they've always wanted
link |
00:37:04.800
and they're getting grilled, you know,
link |
00:37:06.560
like Bridgewater style where they ask these really hard,
link |
00:37:09.240
like mathematical questions.
link |
00:37:11.200
You know, that's, it's a really learned skill
link |
00:37:13.440
to be able to like subdue your fight or flight response,
link |
00:37:17.840
you know, to get from the sympathetic
link |
00:37:19.400
into the parasympathetic.
link |
00:37:20.360
So you can actually, you know,
link |
00:37:22.080
engage that voice in your head
link |
00:37:24.200
and do those slow logical calculations.
link |
00:37:26.040
Cause evolutionarily, we, you know,
link |
00:37:28.360
if we see a lion running at us,
link |
00:37:29.680
we didn't have time to sort of calculate
link |
00:37:31.680
the lion's kinetic energy and, you know,
link |
00:37:34.000
is it optimal to go this way or that way?
link |
00:37:35.480
You just react to it.
link |
00:37:37.080
And physically, our bodies are well attuned
link |
00:37:39.880
to actually make the right decisions.
link |
00:37:41.160
But when you're playing a game like poker,
link |
00:37:43.280
this is not something that you ever, you know,
link |
00:37:44.800
evolved to do.
link |
00:37:45.960
And yet you're in that same fight or flight response.
link |
00:37:49.000
And so that's a really important skill
link |
00:37:51.080
to be able to develop to basically learn
link |
00:37:52.520
how to like meditate in the moment
link |
00:37:54.920
and calm yourself so that you can think clearly.
link |
00:37:57.440
But as you were searching for a comparable thing,
link |
00:38:00.840
it's interesting cause you just made me realize
link |
00:38:03.360
that bluffing is like an incredibly
link |
00:38:06.000
high stakes form of lying.
link |
00:38:08.240
You're lying.
link |
00:38:10.800
And I don't think you can...
link |
00:38:11.640
You're telling a story.
link |
00:38:13.360
No, no, it's straight up lying.
link |
00:38:15.280
In the context of the game, it's not a negative kind of lying.
link |
00:38:19.120
But it is, yeah, exactly.
link |
00:38:19.960
You're representing something that you don't have.
link |
00:38:23.400
And I was thinking like,
link |
00:38:24.480
how often in life do we have such high-stakes lying?
link |
00:38:30.360
Cause I was thinking certainly in high level military
link |
00:38:35.560
strategy, I was thinking when Hitler was lying to Stalin
link |
00:38:40.320
about his plans to invade the Soviet Union.
link |
00:38:44.600
And so you're talking to a person like you're friends
link |
00:38:48.320
and you're fighting against the enemy,
link |
00:38:50.720
whatever the formulation of the enemy is.
link |
00:38:53.920
But meanwhile, the whole time you're building up troops
link |
00:38:56.800
on the border.
link |
00:38:58.920
That's extremely...
link |
00:38:59.760
Wait, wait, so Hitler and Stalin were like pretending
link |
00:39:01.840
to be friends?
link |
00:39:02.680
Well, my history knowledge is terrible.
link |
00:39:04.240
That's crazy.
link |
00:39:05.240
Yeah, that they were, yeah, man.
link |
00:39:09.520
And it worked, because Stalin didn't believe it until the troops crossed
link |
00:39:13.840
the border and invaded in Operation Barbarossa
link |
00:39:17.560
where this storm of Nazi troops invaded
link |
00:39:23.560
large parts of the Soviet Union.
link |
00:39:25.160
And hence one of the biggest wars in human history began.
link |
00:39:30.400
Stalin for sure thought that this was never going to be,
link |
00:39:34.960
that Hitler is not crazy enough to invade the Soviet Union.
link |
00:39:38.680
That geopolitically it makes total sense
link |
00:39:41.840
to be collaborators.
link |
00:39:43.080
And ideologically, even though there's a tension
link |
00:39:46.320
between communism and fascism or national socialism,
link |
00:39:50.360
however you formulate it, it still feels like
link |
00:39:53.280
this is the right way to battle the West.
link |
00:39:55.840
Right, they were more ideologically aligned.
link |
00:39:58.760
They in theory had a common enemy, which is the West.
link |
00:40:01.440
So it made total sense.
link |
00:40:03.520
And in terms of negotiations and the way things
link |
00:40:06.080
were communicated, it seemed to Stalin that for sure
link |
00:40:11.560
that they would remain at least for a while
link |
00:40:16.120
peaceful collaborators.
link |
00:40:17.680
And because of that, everybody in the Soviet Union
link |
00:40:22.240
believed it, so it was a huge shock when Kiev was invaded.
link |
00:40:25.840
And you hear echoes of that when I traveled to Ukraine
link |
00:40:28.240
sort of the shock of the invasion.
link |
00:40:32.240
It's not just the invasion on one particular border,
link |
00:40:34.560
but the invasion of the capital city.
link |
00:40:36.720
And just like, holy shit, especially at that time
link |
00:40:41.400
when you thought World War I,
link |
00:40:44.240
you realized that that was the war to end all wars.
link |
00:40:46.960
You would never have this kind of war.
link |
00:40:49.000
And holy shit, this person is mad enough
link |
00:40:52.400
to try to take on this monster that is the Soviet Union.
link |
00:40:56.320
So it's no longer going to be a war of hundreds
link |
00:40:58.760
of thousands dead.
link |
00:40:59.880
It'll be a war of tens of millions dead.
link |
00:41:02.280
And yeah, but that's a very large scale kind of lie,
link |
00:41:08.320
but I'm sure there's in politics and geopolitics
link |
00:41:11.480
that kind of lying happening all the time.
link |
00:41:15.160
And a lot of people pay financially
link |
00:41:17.280
and with their lives for that kind of lying.
link |
00:41:19.080
But in our personal lives, I don't know how often we,
link |
00:41:22.280
maybe we...
link |
00:41:23.120
I think people do.
link |
00:41:23.960
I mean, like think of spouses cheating on their partners,
link |
00:41:27.120
right?
link |
00:41:27.960
And then like having to lie like,
link |
00:41:28.780
where were you last night?
link |
00:41:29.620
Stuff like that.
link |
00:41:30.460
Like that's, I think, you know,
link |
00:41:33.720
I mean, unfortunately that stuff happens all the time, right?
link |
00:41:35.720
So...
link |
00:41:36.560
Or having like multiple families, that one is great.
link |
00:41:38.800
When each family doesn't know about the other one
link |
00:41:42.280
and like maintaining that life,
link |
00:41:44.640
there's probably a sense of excitement about that too.
link |
00:41:48.960
Seems unnecessary, yeah.
link |
00:41:50.760
But why?
link |
00:41:51.760
Well, just lying, like, you know,
link |
00:41:54.240
the truth finds a way of coming out, you know?
link |
00:41:56.800
Yes, but hence that's the thrill.
link |
00:41:59.280
Yeah, perhaps, yeah, people.
link |
00:42:00.880
I mean, you know, that's why I think actually like poker,
link |
00:42:04.240
what's so interesting about poker is most of the best
link |
00:42:08.640
players I know, there are always exceptions, you know,
link |
00:42:10.880
there are always bad eggs,
link |
00:42:12.460
but actually poker players are very honest people.
link |
00:42:15.140
I would say they are more honest than the average,
link |
00:42:17.560
you know, if you just took a random population sample.
link |
00:42:22.360
Because A, you know, I think, you know,
link |
00:42:26.120
humans like to have that.
link |
00:42:27.600
Most people like to have some kind of, you know,
link |
00:42:30.600
mysterious, you know, an opportunity to do something
link |
00:42:32.760
like a little edgy.
link |
00:42:34.920
So we get to sort of scratch that itch of being edgy
link |
00:42:37.520
at the poker table where it's like, it's part of the game.
link |
00:42:40.480
Everyone knows what they're in for and that's allowed.
link |
00:42:43.440
And you get to like really get that out of your system.
link |
00:42:47.800
And then also like poker players learned that, you know,
link |
00:42:51.280
I would play in a huge game against some of my friends,
link |
00:42:55.440
even my partner Igor, where we will be, you know,
link |
00:42:57.800
absolutely going at each other's throats,
link |
00:42:59.720
trying to draw blood in terms of winning money
link |
00:43:02.120
off each other and like getting under each other's skin,
link |
00:43:04.080
winding each other up, doing the craftiest moves we can.
link |
00:43:08.400
But then once the game's done, the, you know,
link |
00:43:11.160
the winners and the losers will go off and get a drink
link |
00:43:12.800
together and have a fun time and like talk about it
link |
00:43:15.240
in this like weird academic way afterwards.
link |
00:43:17.320
Because that, and that's why games are so great.
link |
00:43:19.560
Cause you get to like live out this competitive urge
link |
00:43:24.040
that, you know, most people have.
link |
00:43:26.440
What's it feel like to lose?
link |
00:43:28.720
Like we talked about bluffing when it worked out.
link |
00:43:31.400
What about when you, when you go broke?
link |
00:43:35.400
So like in a game, I'm, you know,
link |
00:43:37.200
fortunately I've never gone broke.
link |
00:43:39.160
You mean in like full life?
link |
00:43:40.880
Full life, no.
link |
00:43:42.480
I know plenty of people who have.
link |
00:43:46.800
And I don't think Igor would mind me saying,
link |
00:43:47.840
he went, you know, he went broke once in poker,
link |
00:43:49.960
you know, early on when we were together.
link |
00:43:51.840
I feel like you haven't lived unless you've gone broke.
link |
00:43:54.320
Oh, yeah.
link |
00:43:55.640
In some sense.
link |
00:43:56.480
Right.
link |
00:43:57.320
In a fundamental sense.
link |
00:43:58.160
I mean, I'm happy.
link |
00:43:59.000
I've sort of lived through it vicariously through him
link |
00:44:00.880
when he did it at the time.
link |
00:44:02.360
But yeah, what's it like to lose?
link |
00:44:04.160
Well, it depends.
link |
00:44:05.000
So it depends on the amount.
link |
00:44:06.040
It depends what percentage of your net worth
link |
00:44:07.680
you've just lost.
link |
00:44:10.080
It depends on your brain chemistry.
link |
00:44:11.560
It really, you know, varies from person to person.
link |
00:44:13.200
You have a very cold calculating way of thinking about this.
link |
00:44:16.920
So it depends what percentage you have.
link |
00:44:19.000
Well, it does.
link |
00:44:19.840
It really does, right?
link |
00:44:20.680
Yeah, it's true.
link |
00:44:21.520
But that's, I mean, that's another thing
link |
00:44:23.760
poker trains you to do.
link |
00:44:24.720
You see, you see everything in percentages
link |
00:44:28.080
or you see everything in like ROI
link |
00:44:29.640
or expected hourlies or cost benefit, et cetera.
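The reframing she describes here is simple arithmetic. A toy sketch, with an invented bankroll and win rate purely for illustration:

```python
# Toy illustration of reframing a result in percentage-of-bankroll and
# expected-hourly terms, as described above. All numbers are hypothetical.

bankroll = 50_000.0      # assumed bankroll
expected_hourly = 150.0  # assumed long-run win rate, in $/hour
result = -3_000.0        # tonight's loss

pct_of_bankroll = result / bankroll * 100
hours_of_ev = result / expected_hourly

print(f"Result = {pct_of_bankroll:.1f}% of bankroll, "
      f"i.e. {hours_of_ev:.0f} hours of expected win rate")
# -> a -6.0% swing, about -20 expected hours: annoying, not catastrophic.
```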
link |
00:44:32.400
You know, so that's, one of the things I've tried to do
link |
00:44:37.200
is calibrate the strength of my emotional response
link |
00:44:39.800
to the win or loss that I've received.
link |
00:44:43.400
Because it's no good if you like, you know,
link |
00:44:45.520
you have a huge emotional dramatic response to a tiny loss.
link |
00:44:49.440
Or on the flip side, you have a huge win
link |
00:44:52.600
and you're sort of dead inside that you don't even feel it.
link |
00:44:55.000
Well, that's, you know, that's a shame.
link |
00:44:56.400
I want my emotions to calibrate with reality
link |
00:44:59.840
as much as possible.
link |
00:45:02.480
So yeah, what's it like to lose?
link |
00:45:03.800
I mean, I've had times where I've lost, you know,
link |
00:45:06.840
busted out of a tournament that I thought I was going to win in.
link |
00:45:08.640
You know, especially if I got really unlucky
link |
00:45:10.160
or I make a dumb play where I've gone away
link |
00:45:14.320
and like, you know, kicked the wall, punched a wall.
link |
00:45:17.840
I like nearly broke my hand one time.
link |
00:45:19.560
Like I'm a lot less competitive than I used to be.
link |
00:45:23.680
Like I was like pathologically competitive
link |
00:45:26.200
in my like late teens, early twenties.
link |
00:45:28.320
I just had to win at everything.
link |
00:45:30.480
And I think that sort of slowly waned as I've gotten older.
link |
00:45:33.200
According to you, yeah.
link |
00:45:34.480
According to me.
link |
00:45:35.240
I don't know if others would say the same, right?
link |
00:45:38.120
I feel like ultra competitive people,
link |
00:45:40.480
like I've heard Joe Rogan say this to me.
link |
00:45:42.640
It's like, he's a lot less competitive than he used to be.
link |
00:45:45.040
I don't know about that.
link |
00:45:47.000
Oh, I believe it.
link |
00:45:48.240
No, I totally believe it.
link |
00:45:49.200
Like, because as you get, you can still be like,
link |
00:45:51.560
I care about winning.
link |
00:45:52.960
Like when, you know, I play a game with my buddies online
link |
00:45:55.960
or, you know, whatever it is,
link |
00:45:57.080
Polytopia is my current obsession.
link |
00:45:58.680
Like, when I...
link |
00:46:00.720
Thank you for passing on your obsession to me.
link |
00:46:02.840
Are you playing now?
link |
00:46:03.640
Yeah, I'm playing now.
link |
00:46:04.640
We're going to have a game.
link |
00:46:05.760
But I'm terrible and I enjoy playing terribly.
link |
00:46:08.000
I don't want to have a game
link |
00:46:09.040
because that's going to pull me into your monster
link |
00:46:11.320
of like competitive play.
link |
00:46:13.760
It's important.
link |
00:46:14.600
It's an important skill.
link |
00:46:15.280
I'm enjoying playing on the...
link |
00:46:17.200
I can't...
link |
00:46:18.200
You just do the points thing, you know, against the bots.
link |
00:46:20.880
Yeah, against the bots.
link |
00:46:21.880
And I can't even do the...
link |
00:46:24.840
There's like a hard one and there's a very hard one.
link |
00:46:26.920
And then it's crazy, yeah.
link |
00:46:27.760
It's crazy.
link |
00:46:28.600
I don't even enjoy the hard one.
link |
00:46:29.840
The crazy I really don't enjoy.
link |
00:46:31.960
Because it's intense.
link |
00:46:33.600
You have to constantly try to win
link |
00:46:34.920
as opposed to enjoy building a little world and...
link |
00:46:37.920
Yeah, no, no, there's no time for exploration of Polytopia.
link |
00:46:40.000
You've got to get...
link |
00:46:40.680
Well, once you graduate from the crazies,
link |
00:46:42.680
then you can come play the...
link |
00:46:44.480
Graduate from the crazies.
link |
00:46:46.400
Yeah, in order to be able to play a decent game against,
link |
00:46:48.960
like, you know, our group,
link |
00:46:51.640
you'll need to be consistently winning
link |
00:46:55.640
like 90% of games against 15 crazy bots.
link |
00:46:58.280
Yeah.
link |
00:46:59.120
And you'll be able to...
link |
00:46:59.840
Like, I could teach you it within a day, honestly.
link |
00:47:03.320
How to beat the crazies?
link |
00:47:04.640
How to beat the crazies.
link |
00:47:05.680
And then you'll be ready for the big leagues.
link |
00:47:08.120
Generalizes to more than just Polytopia.
link |
00:47:11.040
But okay, why were we talking about Polytopia?
link |
00:47:14.280
Losing hurts.
link |
00:47:15.840
Losing hurts. Oh, yeah.
link |
00:47:16.960
Yes, competitiveness over time.
link |
00:47:19.240
Oh, yeah.
link |
00:47:20.240
I think it's more that, at least for me,
link |
00:47:23.240
I still care about winning when I choose to play something.
link |
00:47:26.360
It's just that I don't see the world as zero-sum as I used to, you know?
link |
00:47:31.960
I think as one gets older and wiser,
link |
00:47:34.840
you start to see the world more as a positive-sum thing.
link |
00:47:37.960
Or at least you're more aware of externalities
link |
00:47:40.720
of scenarios, of competitive interactions.
link |
00:47:44.160
And so, yeah, I'm more aware of my own, you know, like,
link |
00:47:50.080
if I have a really strong emotional response to losing,
link |
00:47:52.720
and that makes me then feel shitty for the rest of the day.
link |
00:47:54.920
And then I beat myself up mentally for it.
link |
00:47:57.160
Like, I'm now more aware that that's an unnecessary negative externality.
link |
00:48:01.600
So I'm like, okay, I need to find a way to turn this down,
link |
00:48:03.840
you know, dial this down a bit.
link |
00:48:05.200
Was poker the thing that has...
link |
00:48:07.520
If you think back at your life
link |
00:48:09.600
and think about some of the lower points of your life,
link |
00:48:12.080
like the darker places you've gone in your mind,
link |
00:48:14.880
did it have to do something with poker?
link |
00:48:16.960
Like, did losing spark the descent into darkness,
link |
00:48:23.440
or was it something else?
link |
00:48:24.800
Um, I think my darkest points in poker
link |
00:48:29.360
were when I was wanting to quit and move on to other things.
link |
00:48:34.320
But I felt like I hadn't ticked all the boxes I wanted to tick.
link |
00:48:39.600
Like, I wanted to be the most winningest female player,
link |
00:48:43.600
which is by itself a bad goal.
link |
00:48:45.680
Um, you know, that was one of my initial goals,
link |
00:48:47.840
and I was like, well, I haven't...
link |
00:48:48.640
You know, and I wanted to win a WPT event.
link |
00:48:50.800
I've won one of these, I've won one of these,
link |
00:48:52.000
but I want one of those as well.
link |
00:48:53.520
And that sort of, again, like,
link |
00:48:57.360
is a drive of, like, overoptimization
link |
00:48:59.040
to random metrics that I decided were important,
link |
00:49:02.320
without much wisdom at the time,
link |
00:49:03.760
but then, like, carried on.
link |
00:49:04.800
Um, that made me continue chasing it longer
link |
00:49:09.520
than I still actually had the passion to chase it for.
link |
00:49:12.800
And I don't have any regrets that, you know,
link |
00:49:15.600
I played for as long as I did, because who knows,
link |
00:49:17.440
you know, I wouldn't be sitting here,
link |
00:49:19.120
I wouldn't be living this incredible life
link |
00:49:20.880
that I'm living now.
link |
00:49:22.000
Um, this is the height of your life right now.
link |
00:49:24.640
This is it, peak experience.
link |
00:49:26.640
Absolute pinnacle here in your robot land.
link |
00:49:30.880
Yeah, yeah.
link |
00:49:31.760
With your creepy light.
link |
00:49:33.360
No, it is.
link |
00:49:34.880
I mean, I wouldn't change a thing about my life right now,
link |
00:49:38.160
and I feel very blessed to say that.
link |
00:49:40.960
So, but the dark times were in the sort of, like,
link |
00:49:44.800
2016 to 18, even sooner, really,
link |
00:49:48.720
where I was like, I had stopped loving the game,
link |
00:49:52.960
and I was going through the motions.
link |
00:49:55.360
And I would, and then I was like,
link |
00:49:59.360
you know, I would take the losses harder
link |
00:50:00.720
than I needed to, because I'm like,
link |
00:50:02.480
oh, it's another one.
link |
00:50:03.280
And it was, I was aware that, like,
link |
00:50:04.720
I felt like my life was ticking away,
link |
00:50:06.080
and I was like, is this going to be what's on my tombstone?
link |
00:50:07.840
Oh, yeah, she played the game of, you know,
link |
00:50:09.440
this zero sum game of poker,
link |
00:50:11.600
slightly more optimally than her next opponent.
link |
00:50:14.000
Like, cool, great legacy, you know.
link |
00:50:18.000
I just wanted, you know, there was something in me
link |
00:50:19.520
that knew I needed to be doing something
link |
00:50:21.600
more directly impactful, and just meaningful.
link |
00:50:25.120
It was just like a search for meaning,
link |
00:50:26.160
and I think it's a thing a lot of poker players,
link |
00:50:27.760
even a lot of, I imagine any games players who sort of love
link |
00:50:32.960
intellectual pursuits, you know,
link |
00:50:36.160
I think you should ask Magnus Carlsen this question.
link |
00:50:38.000
Yeah, walking away from chess, right?
link |
00:50:39.680
Yeah, like, it must be so hard for him.
link |
00:50:41.360
You know, he's been on the top for so long,
link |
00:50:43.680
and it's like, well, now what?
link |
00:50:45.120
He's got this incredible brain, like, what to put it to.
link |
00:50:49.920
And yeah, it's...
link |
00:50:52.000
It's this weird moment. I've just
link |
00:50:54.720
spoken with people that won multiple gold medals at the Olympics,
link |
00:50:58.320
and the depression hits hard after you win.
link |
00:51:02.720
The comedown and crash.
link |
00:51:04.000
Because it's a kind of a goodbye, saying goodbye to that person
link |
00:51:06.720
to all the dreams you had that you thought would give meaning
link |
00:51:11.120
to your life.
link |
00:51:11.760
But in fact, life is full of constant pursuits of meaning.
link |
00:51:16.720
It doesn't...
link |
00:51:17.680
You don't, like, arrive and figure it all out,
link |
00:51:20.160
and there's endless bliss.
link |
00:51:21.600
No, it continues going on and on.
link |
00:51:23.920
You constantly have to figure out how to rediscover yourself.
link |
00:51:27.680
And so for you, like, that struggle to say goodbye to poker,
link |
00:51:31.440
you have to, like, find the next...
link |
00:51:33.600
There's always a bigger game.
link |
00:51:34.800
That's the thing.
link |
00:51:35.520
That's my motto.
link |
00:51:36.480
It's like, what's the next game?
link |
00:51:38.400
And more importantly, because obviously,
link |
00:51:41.600
game usually implies zero sum.
link |
00:51:43.040
Like, what's the game which is, like, OmniWin?
link |
00:51:46.400
Omni what?
link |
00:51:46.960
OmniWin.
link |
00:51:47.440
OmniWin.
link |
00:51:48.000
Why is OmniWin so important?
link |
00:51:49.920
Because if everyone plays zero sum games,
link |
00:51:54.000
that's a fast track to either completely stagnate as a civilization,
link |
00:51:57.200
but actually far more likely to extinct ourselves.
link |
00:52:01.360
You know, like, the playing field is finite.
link |
00:52:04.240
You know, nuclear powers are playing, you know,
link |
00:52:09.200
a game of poker with their chips of nuclear weapons, right?
link |
00:52:13.760
And the stakes have gotten so large
link |
00:52:16.320
that if anyone makes a single bet, you know,
link |
00:52:18.560
fires some weapons, the playing field breaks.
link |
00:52:21.840
I made a video on this.
link |
00:52:22.720
Like, you know, the playing field is finite.
link |
00:52:26.720
And if we keep playing these adversarial zero sum games,
link |
00:52:31.360
thinking that in order for us to win,
link |
00:52:33.840
someone else has to lose.
link |
00:52:35.200
Or if we lose that someone else wins, that will extinct us.
link |
00:52:39.760
It's just a matter of when.
link |
00:52:41.040
What do you think about that mutually assured destruction?
link |
00:52:44.080
That very simple, almost to the point of caricaturing game theory idea
link |
00:52:49.120
that does seem to be at the core of why we haven't blown each other up yet with nuclear weapons.
link |
00:52:54.960
Do you think there's some truth to that?
link |
00:52:56.560
This kind of stabilizing force of mutually assured destruction?
link |
00:53:00.800
And do you think that's going to hold up through the 21st century?
link |
00:53:06.480
I mean, it has held, yes.
link |
00:53:09.680
There's definitely truth to it that it was a, you know,
link |
00:53:12.880
it's a Nash equilibrium.
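To make the game-theoretic claim concrete, here is a toy version of the deterrence game with invented payoffs, checking which strategy pairs are Nash equilibria (no player gains by unilaterally deviating). It is a sketch of the standard textbook framing, not a model of real nuclear strategy:

```python
# Toy deterrence game (illustrative payoffs, not a real strategic model).
# Assured retaliation means any first strike ends in mutual destruction,
# so neither side gains by striking first: (hold, hold) is a Nash equilibrium.
import itertools

HOLD, STRIKE = "hold", "strike"

def payoff(a, b):
    if a == STRIKE or b == STRIKE:
        return (-100.0, -100.0)  # mutual assured destruction
    return (0.0, 0.0)            # uneasy peace

def is_nash(a, b):
    pa, pb = payoff(a, b)
    return (all(payoff(alt, b)[0] <= pa for alt in (HOLD, STRIKE)) and
            all(payoff(a, alt)[1] <= pb for alt in (HOLD, STRIKE)))

for a, b in itertools.product((HOLD, STRIKE), repeat=2):
    print(f"{a:>6} / {b:<6} payoffs = {payoff(a, b)}"
          f"{'  <- Nash' if is_nash(a, b) else ''}")
# Note: in this toy matrix (strike, strike) is also a degenerate equilibrium,
# which hints at how fragile the peaceful one is.
```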
link |
00:53:14.560
Yeah, are you surprised it held this long?
link |
00:53:17.040
Isn't it crazy?
link |
00:53:18.320
It is crazy when you factor in all the, like,
link |
00:53:21.120
near miss accidental firings.
link |
00:53:24.560
Yes.
link |
00:53:25.280
That makes me wonder, like, you know,
link |
00:53:27.920
you're familiar with the, like, quantum suicide thought experiment,
link |
00:53:30.960
where it's basically like you have a,
link |
00:53:34.560
you know, like a Russian roulette type scenario hooked up to some kind of quantum event,
link |
00:53:40.480
you know, particle splitting.
link |
00:53:45.760
And if it, you know, if it goes A,
link |
00:53:48.320
then the gun doesn't go off, and if it goes B,
link |
00:53:50.080
then it does go off and it kills you.
link |
00:53:52.400
Because you can only ever be in the universe,
link |
00:53:54.880
you know, assuming like the Everett branch, you know,
link |
00:53:56.560
multiverse theory, you'll always only end up in the branch
link |
00:54:00.080
where, you know, option A continually comes in.
link |
00:54:03.200
But you run that experiment enough times,
link |
00:54:05.280
the tree of outcomes starts getting pretty damn huge.
link |
00:54:09.200
There's a million different scenarios,
link |
00:54:10.800
but you'll always find yourself
link |
00:54:12.400
in the one where it didn't go off.
link |
00:54:14.800
And so from that perspective, you are essentially immortal,
link |
00:54:20.640
because you will only find yourself
link |
00:54:22.560
in the set of observers that make it down that path.
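The observer-selection logic is easy to demonstrate numerically. A Monte Carlo sketch of the thought experiment, with an arbitrary 50% trigger probability (a toy model, not physics):

```python
# Toy Monte Carlo of the quantum-suicide / observer-selection idea.
# Each "world" runs 20 rounds; in each round the trigger fires with probability 0.5.
# Unconditionally, survival is ~0.5**20 (about one in a million), but any observer
# you could poll afterwards has, by construction, seen an unbroken lucky streak.
import random

random.seed(0)
p_fire, rounds, worlds = 0.5, 20, 1_000_000

survivors = sum(
    all(random.random() > p_fire for _ in range(rounds))
    for _ in range(worlds)
)

print(f"unconditional survival rate: {survivors / worlds:.6f}")  # roughly 1e-6
print("survival rate among surviving observers: 1.0 (by construction)")
```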
link |
00:54:24.960
Yeah.
link |
00:54:25.040
So it's, it's kind of.
link |
00:54:26.480
That doesn't mean
link |
00:54:30.880
you're still not going to be fucked at some point in your life.
link |
00:54:33.040
No, of course not.
link |
00:54:33.520
No, I'm not, I'm not advocating,
link |
00:54:34.800
like we're all immortal because of this.
link |
00:54:36.400
It's just like a fun thought experiment.
link |
00:54:38.400
And the point is it like raises this thing of like these things
link |
00:54:40.640
called observer selection effects, which
link |
00:54:43.120
Nick Bostrom talks about a lot,
link |
00:54:44.400
and I think people should go read.
link |
00:54:46.080
It's really powerful,
link |
00:54:46.800
but I think that logic could be overextended.
link |
00:54:48.800
I'm not sure exactly how it can be.
link |
00:54:52.160
I just feel like you can get,
link |
00:54:55.120
you can overgeneralize that logic somehow.
link |
00:54:58.240
Well, no, I mean, it leads you into like solipsism,
link |
00:55:00.560
which is a very dangerous mindset.
link |
00:55:02.080
Again, if everyone like falls into solipsism of like,
link |
00:55:04.240
well, I'll be fine.
link |
00:55:05.120
I mean, that's a great way of creating a very,
link |
00:55:07.840
you know, self terminating environment.
link |
00:55:10.160
But my point is, is that with the nuclear weapons thing,
link |
00:55:14.480
there have been at least, I think it's 12 or 11,
link |
00:55:18.480
near misses of like just stupid things.
link |
00:55:20.960
Like there was moonrise over Norway,
link |
00:55:24.160
and it made weird reflections of some glaciers in the mountains,
link |
00:55:27.520
which set off, I think, the alarms of
link |
00:55:30.720
NORAD's radar, and that put them on high alert, nearly ready to shoot.
link |
00:55:35.600
And it was only because the head of Russian military
link |
00:55:39.600
happened to be at the UN in New York at the time,
link |
00:55:41.840
that they go like, well, wait a second,
link |
00:55:43.280
why would they fire now when their guy is there?
link |
00:55:46.960
And it was only that lucky happenstance,
link |
00:55:49.120
which doesn't happen very often,
link |
00:55:50.080
where they didn't then escalate it into firing.
link |
00:55:51.920
And there's a bunch of these different ones.
link |
00:55:53.760
Stanislav Petrov is, like, the person
link |
00:55:56.560
who should be the most famous person on earth,
link |
00:55:57.840
because he probably, on expectation,
link |
00:55:59.520
saved the most human lives of anyone,
link |
00:56:01.120
like billions of people by ignoring Russian orders to fire,
link |
00:56:05.200
because he felt in his gut that actually this was a false alarm,
link |
00:56:07.360
and it turned out to be. You know, a very hard thing to do.
link |
00:56:10.960
And there's so many of those scenarios
link |
00:56:12.480
that I can't help but wonder at this point,
link |
00:56:14.400
that we aren't having this kind of like selection effect thing going on,
link |
00:56:17.760
because you look back and you're like, geez,
link |
00:56:19.600
that's a lot of near misses.
link |
00:56:20.720
But of course, we don't know the actual probabilities
link |
00:56:22.720
that each one would have ended up in nuclear war.
link |
00:56:24.720
Maybe they were not that likely.
link |
00:56:26.000
But still, the point is, it's a very dark,
link |
00:56:28.720
stupid game that we're playing.
link |
00:56:30.880
And it is an absolute moral imperative, if you ask me,
link |
00:56:35.360
to get as many people thinking about ways
link |
00:56:37.200
to make this very precarious situation more stable,
link |
00:56:39.440
because we're in a Nash equilibrium,
link |
00:56:41.040
but it's not like we're in the bottom of a pit.
link |
00:56:42.720
You know, if you would like map it topographically,
link |
00:56:46.320
it's not like a stable ball at the bottom of a thing.
link |
00:56:48.240
We're not in a stable equilibrium because of that.
link |
00:56:49.360
We're on the top of a hill with a ball balanced on top,
link |
00:56:52.320
and just any little nudge could send it flying down,
link |
00:56:54.960
and, you know, nuclear war pops off and hellfire and bad times.
link |
00:57:00.000
On the positive side, life on earth will probably still continue,
link |
00:57:03.120
and another intelligent civilization might still pop up.
link |
00:57:06.080
Maybe, yeah.
link |
00:57:06.640
It depends on the X risk.
link |
00:57:09.120
It depends on the X risk.
link |
00:57:10.080
Nuclear war, sure, that's one of the perhaps less bad ones.
link |
00:57:14.160
Green goo through synthetic biology, very bad.
link |
00:57:16.880
It would, you know, destroy all organic matter;
link |
00:57:25.360
you know, it's basically like a biological paperclip maximizer,
link |
00:57:28.000
also bad. Or an AI-type, you know, mass extinction thing would also be bad.
link |
00:57:32.960
Shh, they're listening.
link |
00:57:35.040
There's a robot right behind you.
link |
00:57:36.400
Okay, wait, so let me ask you about this.
link |
00:57:38.640
From a game theory perspective,
link |
00:57:40.240
do you think we're living in a simulation?
link |
00:57:42.160
Do you think we're living inside a video game created by somebody else?
link |
00:57:47.920
Well, so what was the second part of the question?
link |
00:57:50.000
Do I think we're living in a simulation and?
link |
00:57:52.800
A simulation that is observed by somebody for purpose of entertainment,
link |
00:57:58.480
so like a video game.
link |
00:57:59.520
Are we listening?
link |
00:58:00.240
Are we, because there's a, it's like Phil Hellmuth type of situation, right?
link |
00:58:07.120
There's a creepy level of like,
link |
00:58:09.200
this is kind of fun and interesting.
link |
00:58:12.960
Like there's a lot of interesting stuff going on.
link |
00:58:16.160
I mean, that could be somehow integrated into the evolutionary process,
link |
00:58:19.360
where the way we perceive and are.
link |
00:58:23.280
Are you asking me if I believe in God?
link |
00:58:26.480
Sounds like it.
link |
00:58:28.000
Kind of, but God seems to be not optimizing in the different formulations of God that we conceive of.
link |
00:58:36.480
He, or she, doesn't seem to be optimizing for like personal entertainment.
link |
00:58:44.080
Maybe the older gods did, but the, you know, just like the,
link |
00:58:48.400
basically like a teenager in their mom's basement,
link |
00:58:51.520
watching, creating a fun universe to observe.
link |
00:58:55.840
So kind of crazy shit might happen.
link |
00:58:58.400
Okay, so to try and answer this.
link |
00:58:59.600
Do I think there is some kind of extraneous intelligence to our,
link |
00:59:13.200
you know, classic measurable universe that we can measure,
link |
00:59:17.520
you know, through our current physics and instruments?
link |
00:59:23.680
I think so, yes.
link |
00:59:24.560
Partly because I've had just small little bits of evidence in my own life,
link |
00:59:32.400
which have made me question, like, so I was a diehard atheist.
link |
00:59:36.880
Even five years ago, you know, I got into like the rationality community,
link |
00:59:41.600
big fan of LessWrong, which continues to be an incredible resource.
link |
00:59:47.040
But I've just started to have too many little
link |
00:59:50.080
snippets of experience, which don't make sense with the current sort of purely materialistic
link |
01:00:04.160
explanation of how reality works.
link |
01:00:09.200
Isn't that just like a humbling practical realization that we don't know how reality works?
link |
01:00:17.280
Isn't that just a reminder to yourself?
link |
01:00:19.280
Yeah, no, it's a reminder of epistemic humility, because I fell too hard,
link |
01:00:23.120
you know, same as people, like, I think, you know, many people who are just like,
link |
01:00:26.400
my religion is the way, this is the correct way, this is the word, this is the law.
link |
01:00:30.400
You are immoral if you don't follow this, blah, blah, blah.
link |
01:00:32.560
I think they are lacking epistemic humility.
link |
01:00:34.960
There's a little too much hubris there.
link |
01:00:37.120
But similarly, I think the sort of the Richard Dawkins brand of atheism is too rigid as well
link |
01:00:44.880
and doesn't, you know, there's a way to try and navigate these questions,
link |
01:00:50.880
which still honors the scientific method, which I still think is our best sort of realm of like
link |
01:00:54.960
reasonable inquiry, you know, a method of inquiry.
link |
01:00:58.800
So an example, I have two kind of notable examples that like really rattled my cage.
link |
01:01:06.480
The first one was actually in 2010, early on in my poker career.
link |
01:01:13.360
And I remember the Icelandic volcano that erupted that like shut down kind of all
link |
01:01:21.040
Atlantic airspace.
link |
01:01:22.880
And it meant I got stuck down in the south of France.
link |
01:01:25.120
I was there for something else.
link |
01:01:27.120
And I couldn't get home.
link |
01:01:29.200
And someone said, well, there's a big poker tournament happening in Italy.
link |
01:01:31.760
Maybe do you want to go?
link |
01:01:32.800
I was like, oh, right, sure.
link |
01:01:33.680
Like, let's, you know, got a train across, found a way together.
link |
01:01:37.760
And the buy in was 5,000 euros, which was much bigger than my bankroll would normally allow.
link |
01:01:42.480
And so I played a feeder tournament, won my way in, kind of like I did with the Monte Carlo,
link |
01:01:47.760
big one.
link |
01:01:49.680
So then I won my way, you know, from 500 euros into 5,000 euros to play this thing.
link |
01:01:54.080
And on day one of then the big tournament, which turned out to have,
link |
01:01:58.880
it was the biggest tournament ever held in Europe at the time.
link |
01:02:01.200
It got over like 1,200 people, absolutely huge.
link |
01:02:04.400
And I remember they dimmed the lights before, you know, the normal shuffle up and
link |
01:02:09.120
deal to tell everyone to start playing. And they played Chemical Brothers, Hey Boy, Hey Girl,
link |
01:02:15.920
which I don't know why it's notable, but it was just like a really, it was a song I always liked.
link |
01:02:19.040
It was like one of these like pump me up songs.
link |
01:02:21.040
And I was sitting there thinking, oh, yeah, it's exciting.
link |
01:02:22.800
I'm playing this really big tournament.
link |
01:02:24.480
And out of nowhere, just suddenly this voice in my head, just it sounded like my own sort of,
link |
01:02:31.840
you know, when you say, you think in your mind, you hear a voice kind of, right?
link |
01:02:34.800
At least I do.
link |
01:02:35.360
And so it sounded like my own voice.
link |
01:02:37.920
And it said, you are going to win this tournament.
link |
01:02:41.040
And it was so powerful that I got this like wave of like, you know,
link |
01:02:44.880
this sort of goosebumps down my body.
link |
01:02:46.640
And that I even, I remember looking around being like, did anyone else hear that?
link |
01:02:49.600
And obviously people are in their phones like no one else heard it.
link |
01:02:51.760
And I was like, okay, six days later, I win the fucking tournament out of 1,200 people.
link |
01:02:59.440
And I said, I don't know how to explain it.
link |
01:03:08.720
Okay. Yes. Maybe I have that feeling before every time I play.
link |
01:03:14.560
And it's just that I happen to, you know, because I won the tournament,
link |
01:03:16.960
I retroactively remembered it.
link |
01:03:18.240
Or the, or the feeling gave you a kind of Phil Hellmuthian...
link |
01:03:24.480
Well, exactly.
link |
01:03:25.520
Like it gave you a confidence, a deep confidence.
link |
01:03:28.240
And it did, it definitely did.
link |
01:03:29.600
Like I remember then feeling this like sort of, well, although I remember then on day one,
link |
01:03:33.840
I then went and lost half my stack quite early on.
link |
01:03:35.760
And I remember thinking like, oh, that was bullshit.
link |
01:03:37.600
You know, what kind of premonition is this?
link |
01:03:40.160
Thinking, oh, I'm out.
link |
01:03:40.960
But, you know, I managed to like keep it together and recover.
link |
01:03:43.360
And then, and then just went like pretty perfectly from then on.
link |
01:03:47.360
And either way, it definitely instilled me with this confidence.
link |
01:03:52.640
And I don't want to put a, I don't, I can't put an explanation.
link |
01:03:55.680
Like, you know, was it some, you know, huge extra supernatural thing driving me?
link |
01:04:03.840
Or was it just my own self confidence that somehow made me make the right decisions?
link |
01:04:07.920
I don't know.
link |
01:04:08.640
And I don't, I'm not going to put a frame on it.
link |
01:04:10.560
And I think I know a good explanation.
link |
01:04:12.240
So we're a bunch of NPCs living in this world created in the simulation.
link |
01:04:16.480
And then people, not people, creatures from outside of the simulation
link |
01:04:22.000
sort of can tune in and play your character.
link |
01:04:24.080
And that feeling you got is somebody just like,
link |
01:04:27.120
they got to play a poker tournament through you.
link |
01:04:29.200
Honestly, it felt like that.
link |
01:04:30.560
It did actually feel a little bit like that.
link |
01:04:32.960
But it's been 12 years now.
link |
01:04:35.120
I've retold this story many times.
link |
01:04:36.560
Like, I don't even know how much I can trust my memory.
link |
01:04:38.880
You're just an NPC retelling the same story.
link |
01:04:41.360
Because they just played the tournament and left.
link |
01:04:43.600
Yeah.
link |
01:04:43.840
They're like, oh, that was fun.
link |
01:04:44.640
Cool.
link |
01:04:44.800
Yeah, cool.
link |
01:04:45.440
Next.
link |
01:04:46.880
And now you're, for the rest of your life, left as a boring NPC retelling this greatness.
link |
01:04:51.680
Well, what was interesting was that after that, then I didn't obviously
link |
01:04:54.240
win a major tournament for quite a long time.
link |
01:04:56.800
And it left.
link |
01:04:59.040
That was actually another sort of dark period because I had this incredible,
link |
01:05:02.720
like the highs of winning that just on a material level were insane, winning the money.
link |
01:05:06.880
I was on the front page of newspapers because there was this girl that came out of nowhere
link |
01:05:10.080
and won this big thing.
link |
01:05:12.080
And so again, chasing that feeling was difficult.
link |
01:05:16.560
But then on top of that, that was this feeling of almost being touched by something
link |
01:05:20.560
bigger that was like, uh.
link |
01:05:23.120
And also maybe did you have a sense that I might be somebody special?
link |
01:05:29.440
Like this kind of, I think that's the confidence thing that maybe you could do something special
link |
01:05:40.080
in this world after all kind of feeling.
link |
01:05:42.000
I definitely, I mean, this is the thing I think everybody wrestles with to an extent, right?
link |
01:05:47.120
We all, we are truly the protagonists in our own lives.
link |
01:05:51.440
And so it's a natural bias, human bias to feel, to feel special.
link |
01:05:58.800
And I think, and in some ways we are special.
link |
01:06:00.720
Every single person is special because, to you, the universe does,
link |
01:06:04.560
the world literally does revolve around you.
link |
01:06:06.160
That's the thing in some respect.
link |
01:06:08.480
But of course, if you then zoom out and take the amalgam of everyone's experiences,
link |
01:06:12.000
then no, it doesn't.
link |
01:06:12.720
So there is this shared sort of objective reality,
link |
01:06:15.520
but sorry, this objective reality that is shared.
link |
01:06:17.600
But then there's also this subjective reality, which is truly unique to you.
link |
01:06:20.640
And I think both of those things coexist and it's not like one is correct and one isn't.
link |
01:06:24.400
And again, anyone who's like, uh, oh no, your lived experience is everything versus
link |
01:06:29.040
your lived experience is nothing.
link |
01:06:30.000
No, it's a blend between these two things.
link |
01:06:32.400
They can exist concurrently.
link |
01:06:33.600
But there's a certain kind of sense that at least I've had my whole life.
link |
01:06:36.640
And I think a lot of people have this as like, well, I'm just like this little person.
link |
01:06:40.320
And surely I can't be one of those people that do the big thing, right?
link |
01:06:46.400
There's all these big people doing big things.
link |
01:06:48.560
There's big actors and actresses, big musicians.
link |
01:06:53.360
There's big business owners and all that kind of stuff, scientists and so on.
link |
01:06:58.640
I, you know, I have my own subjective experience that I enjoy and so on,
link |
01:07:02.400
but there's like a different layer, like, um, surely I can't do those great things.
link |
01:07:09.360
I mean, one of the things just having interacted with a lot of great people,
link |
01:07:13.600
I realized, no, they're like just the same humans as me.
link |
01:07:20.000
And that realization I think is really empowering.
link |
01:07:22.080
And like to remind yourself.
link |
01:07:24.160
But are they?
link |
01:07:24.960
Huh?
link |
01:07:25.280
But are they?
link |
01:07:25.840
Are they?
link |
01:07:28.320
Well, it depends on some, yeah.
link |
01:07:29.840
They're like a bag of insecurities and peculiarities, sort of their own little weirdnesses and so on.
link |
01:07:43.040
I should also say, they have the capacity for brilliance, but they're not
link |
01:07:51.520
generically brilliant.
link |
01:07:52.800
Like, you know, we tend to say this person or that person is brilliant,
link |
01:07:57.200
but really, no, they're just like sitting there and thinking through stuff, just like the rest of us.
link |
01:08:04.400
Right.
link |
01:08:04.880
I think they're in the habit of thinking through stuff seriously.
link |
01:08:08.720
And they've built up a habit of not allowing their mind to get trapped in a bunch of
link |
01:08:14.480
bullshit and minutiae of day to day life.
link |
01:08:16.880
They really think big ideas, but those big ideas, it's like allowing yourself the freedom
link |
01:08:23.200
to think big, to realize that you can be the one that actually solves this particular big problem.
link |
01:08:29.120
First identify a big problem that you care about, then like I can actually be the one
link |
01:08:33.120
that solves this problem.
link |
01:08:34.960
And like allowing yourself to believe that.
link |
01:08:37.120
And I think sometimes you do need to have like that shock go through your body and a voice
link |
01:08:41.200
tells you you're going to win this tournament.
link |
01:08:42.800
Well, exactly.
link |
01:08:43.440
And whether it was, it's this idea of useful fictions.
link |
01:08:49.920
So again, like going through all like the classic rationalist training of less wrong,
link |
01:08:54.720
where it's like, you want your map, the image you have of the world in your head to as accurately
link |
01:09:01.040
match up with how the world actually is.
link |
01:09:03.200
You want the map and the territory to perfectly align as, you know, you want it to be as an
link |
01:09:08.720
accurate representation as possible.
link |
01:09:11.120
I don't know if I fully subscribe to that anymore.
link |
01:09:13.120
Having now had these moments of like feeling of something either bigger or just actually just
link |
01:09:18.720
being overconfident.
link |
01:09:20.160
Like there is value in overconfidence sometimes.
link |
01:09:22.880
I do.
link |
01:09:23.520
If you would, you know, take, you know, take Magnus Carlsen, right?
link |
01:09:30.320
If he, I'm sure from a young age, he knew he was very talented.
link |
01:09:34.160
But I wouldn't be surprised if he also had something in him to, well, actually, maybe
link |
01:09:39.440
he's a bad example because he truly is the world's greatest.
link |
01:09:42.320
But someone who is unclear whether they were going to be the world's greatest but ended up
link |
01:09:45.840
doing extremely well because they had this innate deep self confidence, this like even
link |
01:09:51.280
overblown idea of how good their relative skill level is.
link |
01:09:54.720
That gave them the confidence to then pursue this thing and like with the kind of focus
link |
01:09:59.680
and dedication that it requires to excel in whatever it is you're trying to do, you know?
link |
01:10:03.280
And so there are these useful fictions and that's where I think I diverge slightly with
link |
01:10:09.520
the classic, the classic sort of rationalist community because that's a field that is worth
link |
01:10:19.120
studying, of like the stories we tell to ourselves,
link |
01:10:24.960
even if they are actually false and even if we suspect they might be false.
link |
01:10:28.960
How it's better to sort of have that like little bit of faith, like value in faith,
link |
01:10:33.120
I think, actually.
link |
01:10:34.240
And that's partly another thing that's like now led me to explore
link |
01:10:37.200
the concept of God, whether you want to call it a simulator, the classic
link |
01:10:43.440
theological thing.
link |
01:10:44.160
I think we're all like alluding to the same thing.
link |
01:10:46.320
Now, I don't know, I'm not saying, you know, because obviously the Christian God is like,
link |
01:10:49.680
you know, all benevolent, endless love.
link |
01:10:53.280
The simulation, at least one of the simulation hypothesis is like, as you said,
link |
01:10:57.360
like a teenager in its bedroom who doesn't really care, doesn't give a shit about the
link |
01:11:01.440
individuals within there.
link |
01:11:02.400
It just like wants to see how the thing plays out because it's curious and it could turn it
link |
01:11:06.000
off like that.
link |
01:11:06.800
You know, where on the sort of psychopathy to benevolence
link |
01:11:10.720
spectrum God is, I don't know.
link |
01:11:13.760
But having this, just having a little bit of faith that there is something else
link |
01:11:21.200
out there that might be interested in our outcome is, I think, an essential thing,
link |
01:11:26.080
actually, for people to find.
link |
01:11:27.840
A, because it creates commonality between us.
link |
01:11:29.760
It's something we can all share.
link |
01:11:31.760
And like it is uniquely humbling for all of us to an extent.
link |
01:11:34.880
It's like a common objective.
link |
01:11:37.040
But B, it gives people that little bit of like reserve, you know, when things get really dark
link |
01:11:43.200
and I do think things are going to get pretty dark over the next few years.
link |
01:11:46.960
But it gives that like, to think that there's something out there that actually wants our
link |
01:11:51.920
game to keep going.
link |
01:11:52.960
I keep calling it the game, you know, it's a thing C and I call it, the game.
link |
01:11:57.360
You and C, AKA Grimes, call what the game?
link |
01:12:03.600
Everything, the whole thing?
link |
01:12:04.960
Yeah, we joke about like.
link |
01:12:06.560
So everything is a game?
link |
01:12:07.840
Not, well, the universe, like what if it's a game and the goal of the game is to figure
link |
01:12:13.920
out like, well, either how to beat it, how to get out of it, you know, maybe this universe
link |
01:12:18.480
is an escape room, like a giant escape room.
link |
01:12:21.200
And the goal is to put all the pieces of the puzzle together, figure out how it works
link |
01:12:26.720
in order to like unlock this like hyperdimensional key and get out beyond what it is.
link |
01:12:31.120
No, but then, so you're saying it's like different levels and it's like a cage within
link |
01:12:35.440
a cage within a cage, and like one cage at a time, you figure out how to do that.
link |
01:12:40.880
Like a new level up, you know, like us becoming multi planetary would be a level up or us,
link |
01:12:45.040
you know, figuring out how to upload our consciousnesses to the thing that would
link |
01:12:48.400
probably be a leveling up or spiritually, you know, humanity becoming more combined and
link |
01:12:53.840
less adversarial and bloodthirsty and us becoming a little bit more enlightened,
link |
01:12:58.640
that would be a leveling up.
link |
01:12:59.520
You know, there's many different frames to it, whether it's physical, you know,
link |
01:13:03.920
digital or like metaphysical.
link |
01:13:05.920
I wonder what the level, I think, I think level one for earth is probably
link |
01:13:10.880
the biological evolutionary process.
link |
01:13:13.920
So going from single cell organisms to early humans, then maybe level two is
link |
01:13:20.240
whatever is happening inside our minds and creating ideas and creating technologies.
link |
01:13:25.920
That's like evolutionary process of ideas.
link |
01:13:31.200
And then multi planetary is interesting.
link |
01:13:34.480
Is that fundamentally different from what we're doing here on earth?
link |
01:13:37.840
Probably, because it allows us to like exponentially scale.
link |
01:13:42.960
It delays the Malthusian trap, right?
link |
01:13:46.240
It's a way to make the playing field get larger so that it can accommodate more of our stuff,
link |
01:13:55.920
more of us.
link |
01:13:57.360
And that's a good thing, but I don't know if it like fully solves this issue of
link |
01:14:05.600
well, this thing called Moloch, which we haven't talked about yet,
link |
01:14:07.760
but which is basically, I call it the god of unhealthy competition.
link |
01:14:12.560
Yeah, let's go to Moloch.
link |
01:14:13.840
What's Moloch?
link |
01:14:14.880
You did a great video on Moloch, one aspect of it, the application of it to one aspect.
link |
01:14:20.320
Instagram beauty filters.
link |
01:14:23.520
Very niche, I wanted to start off small.
link |
01:14:27.280
So Moloch was originally coined, well, so apparently back in the like
link |
01:14:38.160
Canaanite times, that is to say ancient Carthaginian.
link |
01:14:41.280
I can never say Carthaginian, somewhere around like 300 BC or 280, I don't know.
link |
01:14:47.520
There was supposedly this death cult who would sacrifice their children to this awful demon
link |
01:14:54.320
god thing they called Moloch in order to get power to win wars.
link |
01:14:59.200
So really dark, horrible things, and it was literally like about child sacrifice,
link |
01:15:02.480
whether they actually existed or not, we don't know, but in mythology they did,
link |
01:15:05.600
and this god that they worshipped was this thing called Moloch.
link |
01:15:08.160
And then, I don't know, it seemed like it was kind of quiet throughout history
link |
01:15:13.600
in terms of mythology, beyond that, until this movie Metropolis in 1927 talked about
link |
01:15:22.720
this, you see that there was this incredible futuristic city that everyone was living great in,
link |
01:15:28.880
but then the protagonist goes underground into the sewers and sees that the city is run by this
link |
01:15:32.960
machine, and this machine basically would just kill the workers all the time because it was
link |
01:15:38.400
just so hard to keep it running, they were always dying, so there was all this suffering
link |
01:15:41.920
that was required in order to keep the city going, and then the protagonist has this vision
link |
01:15:45.760
that this machine is actually this demon Moloch. So again, it's like this sort of like mechanistic
link |
01:15:49.920
consumption of humans in order to get more power. And then Allen Ginsberg wrote a poem in the 60s,
link |
01:15:57.600
which is an incredible poem called Howl about this thing, Moloch. And a lot of people sort of
link |
01:16:06.160
quite understandably take the interpretation of that, he's talking about capitalism.
link |
01:16:12.240
But then the sort of pièce de résistance that moved Moloch into this idea of game theory
link |
01:16:17.200
was Scott Alexander of Slate Star Codex, who wrote this incredible piece, literally,
link |
01:16:23.120
I think it might be my favorite piece of writing of all time, it's called Meditations on Moloch.
link |
01:16:26.480
Moloch. Everyone must go read it. Slate Star Codex is a blog.
link |
01:16:31.760
It's a blog, yes. We can link to it in the show notes or something, right?
link |
01:16:38.080
Yes, yes. But I like how you assume I have a professional operation going on here.
link |
01:16:44.720
I shall try to remember.
link |
01:16:48.720
You're giving the impression of it. Yeah, I'll, like, please, if I don't,
link |
01:16:52.160
please somebody in the comments remind me. If you don't know this blog, it's one of the
link |
01:16:57.600
best blogs ever, probably. You should probably be following it. Are blogs still a thing?
link |
01:17:04.000
I think they are still a thing. Yeah, he's migrated on to Substack, but yeah, it's still a blog.
link |
01:17:08.720
Substack, better not fuck things up. I hope not. Yeah. I hope they don't,
link |
01:17:13.360
I hope they don't turn Moloch-y, which will mean something to people when we continue.
link |
01:17:17.040
Yeah. When they stop interrupting for us. No, no, it's fine.
link |
01:17:21.440
So anyway, he writes this piece, Meditations on Moloch. And basically,
link |
01:17:26.880
he analyzes the poem and he's like, okay, so it seems to be something relating to
link |
01:17:30.560
where competition goes wrong. And, you know, Moloch was historically this thing of like,
link |
01:17:36.240
where people would sacrifice a thing that they care about, in this case, children,
link |
01:17:40.880
their own children, in order to gain power, a competitive advantage.
link |
01:17:45.120
And if you look at almost everything that sort of goes wrong in our society,
link |
01:17:49.680
it's that same process. So with the Instagram beauty filters thing, you know,
link |
01:17:56.800
if you're trying to become a famous Instagram model, you are incentivized to post the hottest
link |
01:18:03.440
pictures of yourself that you can, you know, you're trying to play that game.
link |
01:18:06.560
There's a lot of hot women on Instagram. How do you compete against them? You post really
link |
01:18:10.320
hot pictures and that's how you get more likes. As technology gets better, you know,
link |
01:18:16.800
more makeup techniques come along. And then more recently, these beauty filters,
link |
01:18:21.920
where like at the touch of a button, it makes your face look absolutely incredible compared
link |
01:18:26.240
to your natural face. These technologies come along, and everyone is
link |
01:18:31.840
incentivized to do that short term strategy. But on net, it's bad for everyone because
link |
01:18:39.040
now everyone is kind of like feeling like they have to use these things. And these things like
link |
01:18:42.080
they make you... Like, the reason why I talked about them in this video is because I noticed it myself,
link |
01:18:45.440
you know, like, I was trying to grow my Instagram for a while, I've given up on it now. But
link |
01:18:50.640
yeah, and I noticed these filters, how good they made me look. And I'm like, well,
link |
01:18:55.360
I know that everyone else is kind of doing it. Subscribe to Liv's Instagram.
link |
01:18:58.800
Please, so I don't have to use the filters. Post a bunch of yeah, make it blow up.
link |
01:19:04.640
So yeah, with those, you felt the pressure actually.
link |
01:19:08.400
Exactly. These short term incentives to do this like, this thing that like either
link |
01:19:13.200
sacrifices your integrity, or something else, in order to like stay competitive,
link |
01:19:20.000
which, on aggregate, creates this like sort of race to the bottom spiral where
link |
01:19:24.960
everyone ends up in a situation which is worse off than before they started, you know,
link |
01:19:28.320
kind of like at a football stadium, where the system is so badly
link |
01:19:35.440
designed, a competitive system of like everyone sitting and having a view, that if someone at
link |
01:19:39.520
the very front stands up to get an even better view, it forces everyone else behind to like
link |
01:19:43.920
adopt that same strategy just to get to where they were before. But now everyone's stuck standing up,
link |
01:19:48.560
like, so you need this like top down, God-like coordination to make it go back to the better
link |
01:19:53.040
state. But from within the system, you can't actually do that. So that's kind of what this
link |
01:19:56.720
Moloch thing is. It's this thing that makes people sacrifice values in order to optimize for
link |
01:20:02.720
winning the game in question, the short term game. But this Moloch, can you attribute it to
link |
01:20:09.840
any one centralized source, or is it an emergent phenomenon from a large collection of people?
link |
01:20:16.000
Exactly that. It's an emergent phenomenon. It's a force of game theory. It's a force of bad
link |
01:20:22.960
incentives on a multi agent system. You know, the Prisoner's Dilemma is technically
link |
01:20:28.000
a kind of Moloch system as well, but it's just a two player thing. Another word for Moloch is
link |
01:20:33.920
a multipolar trap, where basically you've just got a lot of different people all competing for some
link |
01:20:38.800
kind of prize. And it would be better if everyone didn't do this one shitty strategy. But because
link |
01:20:45.120
that strategy gives you a short term advantage, everyone's incentivized to do it. And so everyone
link |
01:20:48.880
ends up doing it.
link |
01:20:54.800
So, social media is a really nice place for a large number of people to play game theory. And the companies also have the ability to design the rules of
link |
01:21:01.360
the game. And is it on them to try to anticipate, to do the thing that poker
link |
01:21:08.240
players do and run simulations? Ideally, that would have been great if, you know, Mark Zuckerberg
link |
01:21:14.480
and Jack and all the, you know, the Twitter founders and everyone, if they had at least just run a few
link |
01:21:19.920
simulations of how, you know, different types of algorithms would turn
link |
01:21:24.720
out for society, that would have been great. It's really difficult to do that kind of deep
link |
01:21:29.360
philosophical thinking about humanity, actually. So not kind of this level of how do
link |
01:21:38.080
we optimize engagement? Or what brings people joy in the short term? But how is this thing going to
link |
01:21:44.800
change the way people see the world? How's it going to get morphed in iterative games played
link |
01:21:53.040
into something that will change society forever? That requires some deep thinking. That's,
link |
01:22:00.160
I hope there's meetings like that inside companies, but I haven't seen them.
link |
01:22:03.440
That's the problem. And it's difficult because, like, when you're starting up a social media
link |
01:22:09.120
company, you know, you're aware that you've got investors to please, there's bills to pay,
link |
01:22:16.800
you know, there's only so much R&D you can afford to do. You've got all these like incredible
link |
01:22:21.600
pressures, you know, bad incentives to get on and just build your thing as quickly as possible
link |
01:22:25.040
and start making money. And, you know, I don't think anyone intended harm when they built these
link |
01:22:29.840
social media platforms. And just to preface it: the reason why, you know, social media is
link |
01:22:35.920
relevant is because it's a very good example of how everyone these days is optimizing for, you know,
link |
01:22:41.200
clicks, whether it's a social media platforms themselves because, you know, every click gets
link |
01:22:46.640
more, you know, impressions, and impressions get advertising dollars, or
link |
01:22:50.960
whether it's individual influencers or, you know, whether it's the New York Times or whoever,
link |
01:22:55.920
they're trying to get their story to go viral. So everyone's got this bad incentive of using
link |
01:22:59.760
clicks, you know, as you called it, the clickbait industrial complex. That's a very Moloch-y system
link |
01:23:04.080
because everyone is now using worse and worse tactics in order to like try and win this attention
link |
01:23:07.760
game. And yeah, so ideally, these companies would have had enough slack in the beginning
link |
01:23:17.200
in order to run these experiments to see, okay, what are the ways this could possibly go wrong
link |
01:23:22.320
for people? What are the ways that Moloch... they should be aware of this concept of Moloch
link |
01:23:26.240
and realize that whenever you have a highly competitive multiagent system, which social media
link |
01:23:31.920
is a classic example of millions of agents all trying to compete for likes and so on, and you
link |
01:23:36.640
try and bring all this complexity down into like very small metrics such as number of likes,
link |
01:23:43.520
number of retweets, whatever the algorithm optimizes for, that is a guaranteed recipe
link |
01:23:48.560
for this stuff to go wrong and become a race to the bottom. I think there should be an honesty
link |
01:23:53.040
from founders. I think there's a hunger for that kind of transparency, of: we don't know what
link |
01:23:57.120
the fuck we're doing. This is a fascinating experiment we're all running as a human civilization.
link |
01:24:02.640
Let's try this out. And like, actually, just be honest about this, that we're all like,
link |
01:24:07.600
these weird rats in a maze, none of us controlling it. There's this kind of sense
link |
01:24:13.280
that the founders, the CEO of Instagram or whatever, Mark Zuckerberg, has control, like he's
link |
01:24:19.840
a puppet master pulling people's strings. No. He's at the mercy of this like everyone else. He's
link |
01:24:24.640
just trying to do his best. And I think putting on a smile and doing overly polished
link |
01:24:32.320
videos about how Instagram and Facebook are good for you, I think is not the right way to
link |
01:24:39.120
actually ask some of the deepest questions we get to ask as a society. How do we design the game
link |
01:24:45.040
such that we build a better world? I think a big part of this as well is people... There's this
link |
01:24:53.200
philosophy, particularly in Silicon Valley of, well, techno optimism, technology will solve
link |
01:24:59.280
all our issues. And there's a steelman argument to that, where yes, technology has solved a lot
link |
01:25:04.720
of problems and can potentially solve a lot of future ones. But it can also... It's always
link |
01:25:10.000
a double edged sword. And particularly as technology gets more and more powerful,
link |
01:25:13.280
we've now got big data and we're able to do all kinds of psychological manipulation with it and
link |
01:25:18.800
so on. Technology is not a values-neutral thing. People think... I used to always think
link |
01:25:26.320
this myself. It's like this naive view that, oh, technology is completely neutral. It's the humans
link |
01:25:32.160
that either make it good or bad. No. To the point we're at now, the technology that we are creating,
link |
01:25:37.920
they are social technologies. They literally dictate how humans now form social groups and so
link |
01:25:45.600
on. And beyond that, it also then gives rise to the memes that we then coalesce
link |
01:25:50.880
around. And that... If you have the stack that way, where it's technology driving social interaction,
link |
01:25:56.800
which then drives memetic culture and which ideas become popular, that's Moloch.
link |
01:26:03.760
And we need it the other way around. We need to figure out: what are the good memes?
link |
01:26:08.240
What are the good values that we think we need to optimize for that makes people happy and healthy
link |
01:26:17.120
and keeps society as robust and safe as possible, then figure out what the social structure around
link |
01:26:22.880
those should be. And only then do we figure out technology. But we're doing it the other way around.
link |
01:26:26.640
And as much as I love in many ways the culture of Silicon Valley and I do think that technology...
link |
01:26:35.840
I don't want to knock it. It's done so many wonderful things for us, same as capitalism.
link |
01:26:40.240
There are... We have to be honest with ourselves. We're getting to a point where we are losing
link |
01:26:46.080
control of this very powerful machine that we have created.
link |
01:26:49.440
Can you redesign the machine within the game? Can you just have...
link |
01:26:54.240
Can you understand the game enough? Okay, this is the game. And this is how we start to reemphasize
link |
01:27:01.840
the memes that matter, the memes that bring out the best in us. The way I try to be in real life
link |
01:27:11.200
and the way I try to be online is to be about kindness and love. And I feel like I sometimes
link |
01:27:18.480
get criticized for being naive and all those kinds of things. But I feel like I'm just trying to live
link |
01:27:24.720
within this game. I'm trying to be authentic. Yeah, but also like, hey, it's kind of fun to do this.
link |
01:27:30.960
Like you guys should try this too. And that's like trying to redesign some aspects of the game
link |
01:27:38.720
within the game. Is that possible? I don't know. But I think we should try. I don't think we have
link |
01:27:47.440
an option but to try. Well, the other option is to create new companies or to pressure companies
link |
01:27:54.560
that or anyone who has control of the rules of the game.
link |
01:27:58.880
I think we need to be doing all of the above. I think we need to be thinking hard about what
link |
01:28:03.520
are the kind of positive, healthy memes. As Elon said, he who controls the memes controls the
link |
01:28:12.400
universe. I think he did say that. There's truth to that. There is wisdom in that because
link |
01:28:21.520
memes have driven history. We are a cultural species. That's what sets us apart from chimpanzees
link |
01:28:27.040
and everything else. We have the ability to learn and evolve through culture as opposed to biology
link |
01:28:34.080
or like classic physical constraints. And that means culture is incredibly powerful.
link |
01:28:39.920
And we can create and become victim to very bad memes or very good ones. But we do have some agency
link |
01:28:47.280
over which memes we not only put out there but we also subscribe to. So I think we need to take
link |
01:28:54.800
that approach. I'm making this video right now called The Attention Wars which is about how
link |
01:29:03.920
the media machine is this Moloch machine, this kind of blind, dumb thing
link |
01:29:10.320
where everyone is optimizing for engagement in order to win their share of the attention pie?
link |
01:29:15.440
And then if you zoom out, it's really Moloch that's pulling the strings, because it's the only
link |
01:29:18.640
thing that benefits from this in the end. Our information ecosystem is breaking down.
link |
01:29:24.320
You look at the state of the US, we're in a civil war. It's just not a physical war.
link |
01:29:29.120
It's an information war. And people are becoming more fractured in terms of what their actual
link |
01:29:36.080
shared reality is. Like truly like an extreme left person, an extreme right person,
link |
01:29:41.920
they literally live in different worlds in their minds at this point. And it's getting more and
link |
01:29:46.080
more amplified. And this force is like a razor blade pushing through everything. It doesn't
link |
01:29:51.920
matter how innocuous the topic is, it will find a way to split into this bifurcated culture war.
link |
01:29:57.040
And it's fucking terrifying. Because that maximizes the tension. And that's like an emergent
link |
01:30:01.280
Moloch type force that takes any topic and cuts through it so that it splits nicely
link |
01:30:11.040
into two groups. All anyone is trying to do within the system is just
link |
01:30:18.160
maximize whatever gets them the most attention because they're just trying to make money so
link |
01:30:22.320
they can keep their thing going. And the best emotion for getting attention... Because it's
link |
01:30:29.440
not just about attention on the internet, it's engagement. That's the key thing. In order for
link |
01:30:33.360
something to go viral, you need people to actually engage with it. They need to comment or retweet
link |
01:30:37.760
or whatever. And of all the emotions, there's like seven classic shared emotions that studies
link |
01:30:46.480
have found that all humans, even from like previously uncontacted tribes have. Some of
link |
01:30:52.640
those are negative, like sadness, disgust, anger, et cetera. Some are positive, happiness,
link |
01:31:00.640
excitement, and so on. The one that happens to be the most useful for the internet is anger.
link |
01:31:06.560
Because anger is such an active emotion. If you want people to engage, if someone's scared... And
link |
01:31:14.480
I'm not just talking out my ass here, there are studies here that have looked into this.
link |
01:31:19.760
Whereas if someone's disgusted or fearful, they actually tend to then be like,
link |
01:31:23.840
I don't want to deal with this. So they're less likely to engage and share it and so on. They're
link |
01:31:27.520
just going to back away. Whereas if they're enraged by a thing, well, now that triggers all the
link |
01:31:33.520
old tribalism emotions. And so that's how then things get spread much more easily. They outcompete
link |
01:31:40.880
all the other memes in the ecosystem. And so the attention economy, the wheels that make it go
link |
01:31:48.480
around is rage. I did a tweet. The problem with raging against the machine is that the machine
link |
01:31:55.520
has learned to feed off rage because it is feeding off our rage. That's the thing that's
link |
01:31:59.520
now keeping it going. So the more we get angry, the worse it gets.
link |
01:32:02.160
So the Moloch in this war of attention is constantly maximizing rage.
link |
01:32:10.880
What it is optimizing for is engagement. And it happens to be that engagement
link |
01:32:17.840
is propaganda. It just sounds like more and more things are being
link |
01:32:23.840
put through this propagandist lens of winning whatever war is in question,
link |
01:32:28.720
whether it's the culture war or the Ukraine war. Yeah. Well, I think the silver lining of this,
link |
01:32:33.200
do you think it's possible that in the long arc of this process, you actually do arrive
link |
01:32:39.520
at greater wisdom and more progress? In the moment, it feels like people are
link |
01:32:45.120
tearing each other to shreds over ideas. But if you think about it, one of the magic things
link |
01:32:49.440
about democracy and so on is you have the blue versus red constantly fighting. It's almost like
link |
01:32:54.480
they're in discourse, creating devil's advocate, making devils out of each other. And through that
link |
01:33:01.680
process, discussing ideas, like almost really embodying different ideas, just to yell at each
link |
01:33:08.560
other and through the yelling over the period of decades, maybe centuries, figuring out a better
link |
01:33:14.560
system. Like in the moment, it feels fucked up. Right. But in the long arc, it actually is productive.
link |
01:33:19.840
I hope so. That said, we are now in the era where, just as we have weapons of mass destruction
link |
01:33:30.400
with nuclear weapons that can break the whole playing field, we now are developing weapons of
link |
01:33:37.840
informational mass destruction, information weapons, WMDs that basically can be used for
link |
01:33:44.000
propaganda or just manipulating people however, you know, is needed, whether that's through
link |
01:33:51.200
dumb TikTok videos, or, you know, there are significant resources being put in. I don't
link |
01:33:59.200
mean to sound, you know, all doom and gloom, but there are bad actors out there. That's the
link |
01:34:04.320
thing. There are plenty of good actors within the system who are just trying to stay afloat in the
link |
01:34:07.840
game. So we're effectively doing Moloch-y things. But then on top of that, we have actual bad actors
link |
01:34:12.480
who are intentionally trying to like manipulate the other side into doing things.
link |
01:34:17.280
And using, so because it's a digital space, they're able to use artificial actors, meaning bots.
link |
01:34:24.400
Exactly. Botnets, you know, and this is a whole new situation that we've never had before.
link |
01:34:31.120
It's exciting. You know what I want to do? Because there is, you know, people are talking
link |
01:34:37.520
about bots manipulating and like malicious bots that are basically spreading propaganda.
link |
01:34:43.440
I want to create like a bot army that fights that. For love? Yeah, exactly, one that
link |
01:34:53.520
fights for love. I mean, you know, there's truth to fighting fire with fire.
link |
01:34:53.520
But you always have to be careful whenever you create... again,
link |
01:35:00.000
Moloch is very tricky. Yeah. Hitler was trying to spread love too.
link |
01:35:03.280
Yeah. So he thought. But you know, I agree with you that that is a thing that should
link |
01:35:08.080
be considered. But again, the road to hell is paved with good intentions.
link |
01:35:13.520
And there are always unforeseen consequences, you know, outcomes, externalities,
link |
01:35:20.320
if you're trying to adopt a thing, even if you do it in the very best of faith.
link |
01:35:23.440
But you can learn lessons of history. If you can run some sims on it first.
link |
01:35:27.440
Absolutely. But also there's certain aspects of a system as we've learned through history that
link |
01:35:32.320
do better than others. Like, for example, don't have a dictator. So if I were to create
link |
01:35:38.240
this bot army, it's not good for me to have full control over it. Because in the beginning,
link |
01:35:43.840
I might have a good understanding of what's good and not, but over time that starts to get
link |
01:35:48.480
deviated because I'll get annoyed at some assholes and I'll think, okay, wouldn't it be nice to get
link |
01:35:52.640
rid of those assholes, but then that power starts getting to your head, you become corrupted.
link |
01:35:56.720
That's basic human nature. So distribute the power.
link |
01:35:59.440
We need a love botnet on a DAO. A DAO love botnet.
link |
01:36:07.760
Yeah. And without a leader. Like without...
link |
01:36:10.560
Exactly. Distributed, right. But yeah, without any kind of centralized...
link |
01:36:14.240
Yeah. Without even, you know... basically, the more you can decentralize the control
link |
01:36:19.600
of a thing to people the better. But then you still need the ability to coordinate.
link |
01:36:26.640
Because that's the issue if something is too... You know, that's really, to me,
link |
01:36:30.960
like the culture wars, the bigger war we're dealing with is actually between the sort of the...
link |
01:36:38.080
I don't know what even the term is for it, but like centralization versus decentralization.
link |
01:36:41.920
That's the tension we're seeing. Power and control by a few versus completely distributed.
link |
01:36:48.080
And the trouble is if you have a fully centralized thing, then you're at risk of tyranny,
link |
01:36:52.240
you know, Stalin type things can happen. Or, completely distributed, now you're at risk of
link |
01:36:57.360
complete anarchy and chaos where you can't even coordinate, you know, when there's
link |
01:37:01.360
like a pandemic or anything like that. So it's like, what is the right balance to strike between
link |
01:37:06.080
these two structures? Well, can't Moloch really take hold in a fully decentralized system?
link |
01:37:10.880
That's one of the dangers too. Yes.
link |
01:37:12.880
It's very vulnerable to Moloch. So the dictator can commit huge atrocities,
link |
01:37:17.600
but they can also make sure the infrastructure works and...
link |
01:37:22.880
They have that God's eye view at least. They have the ability to create like laws and rules that
link |
01:37:27.680
force coordination, which stops Moloch. But then you're vulnerable to that dictator
link |
01:37:33.120
getting infected with some kind of psychopathy type thing.
link |
01:37:37.440
What's reverse Moloch?
link |
01:37:39.440
So great question. So that's where... So I've been working on this series,
link |
01:37:46.080
it's been driving me insane for the last year and a half. I did the first one a year ago.
link |
01:37:49.920
I can't believe it's nearly been a year. The second one hopefully will be coming out in like a month.
link |
01:37:56.000
And my goal at the end of the series is to like present, because basically I'm painting the picture
link |
01:37:59.840
of what Moloch is and how it's affecting almost all these issues in our society and how it's,
link |
01:38:05.040
you know, driving. It's like kind of the generator function, as people describe it,
link |
01:38:08.880
of existential risk. And then at the end of that...
link |
01:38:11.600
Wait, wait. The generator function of existential risk. So you're saying Moloch is sort of the
link |
01:38:16.080
engine that creates a bunch of X risks? Yes, not all of them. Like a, you know, a...
link |
01:38:22.800
Just a cool phrase. Generator function. It's not my phrase. It's Daniel Schmachtenberger's.
link |
01:38:26.640
Oh, Schmachtenberger. I got that from him. Of course.
link |
01:38:28.720
All things. It's like all roads lead back to Daniel Schmachtenberger, I think.
link |
01:38:32.560
The dude is brilliant. He's really brilliant.
link |
01:38:35.280
After that, it's Mark Twain. But anyway, sorry, totally rude of me to interrupt.
link |
01:38:41.040
No, it's fine. So not all X risks. So like an asteroid technically isn't because it's,
link |
01:38:47.120
you know, it's just like this one big external thing. It's not like a competition thing
link |
01:38:51.040
going on. But, you know, synthetic bio, you know,
link |
01:38:54.880
bio weapons, that's one because everyone's incentivized to build, even for defense, you know,
link |
01:38:59.600
bad viruses, you know, just to threaten someone else, etc. Or AI: technically, the
link |
01:39:04.880
race to AGI is kind of potentially a Moloch-y situation. But yeah, so if Moloch is this
link |
01:39:12.800
like, generator function that's driving all of these issues over the coming century that might
link |
01:39:17.760
wipe us out, what's the inverse? And so far, what I've gotten to is this character that I want to
link |
01:39:24.320
put out there called Win-Win. Because Moloch is the god of lose-lose, ultimately. It masquerades
link |
01:39:29.280
as the god of win-lose, but in reality, it's lose-lose. Everyone ends up worse off.
link |
01:39:33.120
So I was like, well, what's the opposite of that? It's Win-Win. And I was thinking for ages,
link |
01:39:37.440
like, what's a good name for this character? And then the more I was like, okay, well,
link |
01:39:42.240
don't try and, you know, think through it logically. What's the vibe of Win-Win? And to me,
link |
01:39:47.120
like in my mind, Moloch is, and I dress as it in the video, red and black. It's
link |
01:39:52.160
very, you know, hyper focused on its one goal: you must win. So Win-Win is kind of
link |
01:40:00.480
actually like these colors. It's like purple turquoise. It loves games too. It loves a little
link |
01:40:07.200
bit of healthy competition, but constrained. Like we said before, it knows how to ring-fence
link |
01:40:11.760
zero sum competition into just the right amount, whereby its externalities can be controlled
link |
01:40:17.280
and kept positive. And then beyond that, it also loves cooperation, coordination, love,
link |
01:40:22.320
all these other things. But it's also kind of like mischievous. Like, you know, it will have
link |
01:40:27.840
a good time. It's not boring, you know. It knows
link |
01:40:32.560
how to have fun. It can get down. But ultimately, it's unbelievably wise,
link |
01:40:39.600
and it just wants the game to keep going. And I call it Win-Win. Win-Win. That's a good
link |
01:40:45.120
pet name. Yes, Win-Win. And I think its formal name, when it has
link |
01:40:51.360
to do official functions, is Omnia. Omnia. Yeah. From, like, omniscience, kind of. What's...
link |
01:40:58.800
why Omnia? It's just like omni-win. Omniwin. But I'm open to suggestions. I
link |
01:41:02.880
would like feedback, you know. I like Omnia. Yeah. But there's an angelic kind of sense to
link |
01:41:07.520
Omnia though. So Win-Win is more fun. It embraces the fun aspect. I mean,
link |
01:41:16.560
there is something about sort of, there's some aspect to Winwin interactions that requires
link |
01:41:26.400
embracing the chaos of the game and enjoying the game itself. I don't know. I don't know what that
link |
01:41:34.880
is. That's almost like a zen like appreciation of the game itself, not optimizing for the consequences
link |
01:41:41.360
of the game. Right. Well, it's recognizing the value of competition in and of itself. It's not
link |
01:41:47.680
about winning. It's about you enjoying the process of having a competition and not knowing whether
link |
01:41:52.080
you're going to win or lose this little thing. But then also being aware that, you know, what's
link |
01:41:56.800
the boundary? How big do I want competition to be? Because one of the reasons why Moloch is doing so
link |
01:42:01.440
well now in our society, in our civilization is because we haven't been able to ring fence competition.
link |
01:42:06.640
You know, and so it's just having all these negative externalities, and we've completely
link |
01:42:11.280
lost control of it. You know, I think, and my guess is, now we're getting really,
link |
01:42:18.560
you know, metaphysical, technically. But I think we'll be in a more interesting universe
link |
01:42:26.720
if we have one that has, you know, lots of cooperation and some pockets
link |
01:42:32.320
of competition than one that's pure cooperation entirely. Like, it's good to have some
link |
01:42:37.280
little zero-sum-ish bits. But I don't know that fully. And I'm not qualified as a philosopher to
link |
01:42:43.600
know that. And that's what reverse Moloch is. So this kind of win-win creature is in the system as an
link |
01:42:49.840
antidote to the Moloch system. Yes. And I don't know how it's going to do that. But it's good to kind
link |
01:42:58.320
of try to start to formulate different ideas, different frameworks of how we think about that.
link |
01:43:03.760
Exactly. At the small scale of a collection of individuals, at the large scale of a society.
link |
01:43:09.040
Exactly. It's a meme. I think it's an example of a good meme. And I'm open,
link |
01:43:14.000
I'd love to hear feedback from people, you know, if they have a better idea
link |
01:43:17.520
or it's not right. But it's the direction of meme that we need, to spread this idea of:
link |
01:43:22.800
look for the win-wins in life. Well, on the topic of beauty filters, in that particular context
link |
01:43:28.560
where Moloch creates negative consequences... well, you know, Dostoevsky said beauty will save the
link |
01:43:36.640
world. What is beauty anyway? It would be nice to just try to discuss what kind of thing we would
link |
01:43:46.720
like to converge towards in our understanding of what is beautiful. So to me, I think something
link |
01:43:57.120
is beautiful when it can't be reduced down to easy metrics. Like if you think of a tree,
link |
01:44:09.360
what is it about a tree, like a big ancient beautiful tree, right? What is it about it
link |
01:44:12.960
that we find so beautiful? It's not, you know, the sweetness of its fruit
link |
01:44:20.480
or the value of its lumber. It's the entirety of it. There are these immeasurable
link |
01:44:30.080
qualities. It's almost like a qualia of it. It walks this fine line between
link |
01:44:36.800
pattern, well, it's got lots of patternicity, but it's not overly predictable. You know, again,
link |
01:44:41.280
it walks this fine line between order and chaos. It's a very highly complex system.
link |
01:44:46.800
And it's evolving over time. You know, the definition of a complex
link |
01:44:52.800
versus... and this is another Schmachtenberger thing... a complex versus a complicated system.
link |
01:44:57.520
A complicated system can be sort of broken down into bits, understood, and then put back together.
link |
01:45:02.080
A complex system is kind of like a black box. It does all this crazy stuff. But if you take it
link |
01:45:07.760
apart, you can't put it back together again, because there are all these intricacies.
link |
01:45:11.600
And also, very importantly, the whole
link |
01:45:15.840
is much greater than the sum of the parts. And that's where the beauty lies, I think.
link |
01:45:21.200
And I think that extends to things like art as well. There's something
link |
01:45:26.000
immeasurable about it. There's something we can't break down to a narrow metric.
link |
01:45:29.840
Does that extend to humans, you think? Yeah, absolutely.
link |
01:45:33.200
So how can Instagram reveal that kind of beauty, the complexity of a human being?
link |
01:45:39.200
Good question.
link |
01:45:41.840
And this takes us back to our dating sites and Goodreads, I think.
link |
01:45:46.720
Very good question. I mean, well, I know what it shouldn't do. It shouldn't try and like, right now,
link |
01:45:52.880
you know, I was talking to a social media expert recently, because I was like,
link |
01:45:59.760
oh, I hate this stuff. The social media expert.
link |
01:45:59.760
Oh, yeah. There are agencies out there that you can outsource to, because I'm thinking about
link |
01:46:04.480
working with one. So, I want to start a podcast.
link |
01:46:09.280
You should, you should have done it a long time ago.
link |
01:46:11.680
Working on it. It's going to be called Win-Win. And it's going to be about this positive
link |
01:46:16.320
sum stuff. And the thing that, you know, they always come back and say is like, well, you need
link |
01:46:21.680
to figure out what your thing is, you know, narrow it down, and
link |
01:46:26.320
then just follow that, have like a sort of a formula, because that's what people want.
link |
01:46:32.000
They want to know that they're coming back to the same thing. And that's the advice on YouTube,
link |
01:46:36.080
Twitter, you name it. And the trouble with that is that it's a complexity
link |
01:46:41.360
reduction. And generally speaking, complexity reduction is bad. It's
link |
01:46:45.360
an oversimplification. Not that simplification is always a bad thing.
link |
01:46:49.840
But, you know, what is social media doing? It's trying to
link |
01:46:57.040
encapsulate the human experience and put it into digital form and commodify it to an extent.
link |
01:47:05.520
So you do that, you compress people down into these narrow things.
link |
01:47:09.360
And that's why I think it's kind of ultimately fundamentally incompatible
link |
01:47:13.760
with at least my definition of beauty.
link |
01:47:15.280
It's interesting because there is some sense in which a simplification, sort of in the Einstein
link |
01:47:23.760
kind of sense of a really complex idea, a simplification in a way that still captures some
link |
01:47:30.560
core power of an idea of a person is also beautiful. And so maybe it's possible for
link |
01:47:37.680
social media to do that: a presentation, a sort of sliver, a slice, a look into a person's life
link |
01:47:47.440
that reveals something real about them. But in a simple way, in a way that can be displayed
link |
01:47:53.200
graphically or through words, some way, I mean, in some way, Twitter can do that kind of thing.
link |
01:47:59.840
A very few set of words can reveal the intricacies of a person. Of course, the viral machine that
link |
01:48:08.480
spreads those words often results in people taking the thing out of context. People often
link |
01:48:17.200
don't read tweets in the context of the human being that wrote them. The full history of the
link |
01:48:22.880
tweets they've written, the education level, the humor level, the world view they're playing
link |
01:48:29.040
around with, all that context is forgotten and people just see the different words. So that can
link |
01:48:34.720
lead to trouble. But in a certain sense, if you do take it in context, it reveals some kind of
link |
01:48:41.600
quirky little beautiful idea or a profound little idea from that particular person that
link |
01:48:47.280
shows something about that person. So in that sense, Twitter can be more successful, if we talk
link |
01:48:51.840
about Moloch, at driving a better kind of incentive. Yeah. I mean, how can they? If we were to
link |
01:49:01.040
rewrite, is there a way to rewrite the Twitter algorithm so that it stops being the fertile
link |
01:49:10.880
breeding ground of the culture wars? Because that's really what it is. I mean, maybe I'm giving it
link |
01:49:17.440
Twitter too much power, but just the more I looked into it, and I had conversations with
link |
01:49:24.000
Tristan Harris from the Center for Humane Technology, and he explained it as: Twitter is where you
link |
01:49:31.520
have this amalgam of human culture, and then this terribly designed algorithm that amplifies the
link |
01:49:37.280
craziest people and the angriest, most divisive takes. And then
link |
01:49:45.680
the media, the mainstream media, because all the journalists are also on Twitter, they then
link |
01:49:50.960
are informed by that. And so they draw out the stories they can from this already like
link |
01:49:56.640
very boiling lava of rage, and then spread that to their millions and millions of people
link |
01:50:04.320
who aren't even on Twitter. And so honestly, I think if I could press a button and turn it off,
link |
01:50:11.840
I probably would at this point, because I just don't see a way of it being compatible with healthiness.
link |
01:50:17.280
But that's not going to happen. And so at least one way to stem the tide and make it less
link |
01:50:32.640
Moloch-y would be to change it to, at least, a subscription model. Then it's not
link |
01:50:32.640
optimizing for impressions, because basically what it wants is for people to keep coming back
link |
01:50:39.120
as often as possible. That's how they get paid, right? Every time an ad gets shown to someone,
link |
01:50:43.600
and the way to do that is to get people constantly refreshing their feed. So you're trying to encourage
link |
01:50:47.680
addictive behaviors. Whereas if they moved to at least a subscription model, then
link |
01:50:55.600
they're getting the money either way, whether someone comes back to the site once a month or
link |
01:50:59.680
500 times a month, they get the same amount of money. So now that takes away that incentive
link |
01:51:04.720
to use technology to design an algorithm that is maximally addictive.
link |
01:51:10.400
That would be one way, for example. Yeah, but you still want people to...
link |
01:51:14.880
Yeah, I just feel like that just slows down, creates friction in the virality of things.
link |
01:51:20.800
But that's good. We need to slow down virality. It's good. It's one way.
link |
01:51:25.360
Virality is Moloch, to be clear. So Moloch is always negative, then?
link |
01:51:33.680
Yes, by definition. Yes, but then I disagree with you.
link |
01:51:38.720
Competition is not always negative. Competition is neutral.
link |
01:51:40.400
Then I disagree with you that all virality is negative, that all virality is Moloch. Because
link |
01:51:48.320
it's a good intuition, because we have a lot of data on virality being negative.
link |
01:51:51.760
But I happen to believe that the core of human beings, so most human beings, want to be good
link |
01:52:00.400
more than they want to be bad to each other. And so I think it's possible. It might be just
link |
01:52:06.160
harder, but it's possible to engineer systems
link |
01:52:13.840
that enable virality where the kind of stuff that rises to the top is positive.
link |
01:52:21.280
And positive, not like la-la positive. It's more like win-win,
link |
01:52:25.680
meaning a lot of people need to be challenged. Wise things, yes.
link |
01:52:29.600
You grow from it. It might challenge you. You might not like it, but you ultimately grow from it.
link |
01:52:34.560
And ultimately, bring people together as opposed to tear them apart.
link |
01:52:38.320
I deeply want that to be true. And I very much agree with you that people at their core
link |
01:52:43.600
are on average good, care for each other as opposed to not.
link |
01:52:46.720
I think it's actually a very small percentage of people are truly wanting to do just destructive,
link |
01:52:53.600
malicious things. Most people are just trying to win their own little game,
link |
01:52:56.320
and they don't mean to be, they're just stuck in this badly designed system.
link |
01:53:02.000
That said, the current structure, yes. The current structure means that virality is
link |
01:53:09.120
optimized towards Moloch. That doesn't mean there aren't exceptions. Sometimes positive
link |
01:53:12.960
stories do go viral, and I think we should study them. I think there should be a whole field of
link |
01:53:16.240
study into understanding, identifying memes that above a certain threshold of the population agree
link |
01:53:24.240
on as a positive, happy, bringing-people-together meme, the kind of thing that brings families
link |
01:53:29.120
together when they would normally argue about cultural stuff at the dinner table.
link |
01:53:34.720
Identify those memes and figure out what was the ingredient that made them spread that day.
link |
01:53:40.080
Also, not just happiness and connection between humans, but connection between humans in other
link |
01:53:49.120
ways that enables productivity, cooperation, solving difficult problems and all those kinds
link |
01:53:54.640
of stuff. It's not just about let's be happy and have fulfilling lives. It's also like,
link |
01:54:01.360
let's build cool shit. Which is the spirit of collaboration, which is deeply anti-Moloch.
link |
01:54:06.160
It's not using competition. Moloch hates collaboration and coordination and people
link |
01:54:13.360
working together. Again, the internet started out as that. It could have been that, but because
link |
01:54:20.960
of the way it was structured in terms of very lofty ideals: they wanted everything to be open
link |
01:54:27.840
source and also free, but they needed to find a way to pay the bills anyway because they were
link |
01:54:33.040
still building this on top of our old economics system. The way they did that was through third
link |
01:54:38.160
party advertisement, but that meant that things were very decoupled. You've got this third party
link |
01:54:44.160
interest, which means that people are having to optimize for that. The actual consumer is
link |
01:54:51.360
actually the product, not the person you're making the thing for. In the end, you end up making the
link |
01:54:58.000
thing for the advertiser. That's why it then breaks down. There's no clean solution to this,
link |
01:55:08.160
and it's a really good suggestion by you actually to figure out how we can optimize
link |
01:55:15.200
virality for positive sum topics. I shall be the general of the lovebot army. Distributed.
link |
01:55:28.960
Just even in saying that, the power already went to my head. No. Okay. You've talked about
link |
01:55:33.920
quantifying your thinking. We've been talking about this, a game theoretic view on life
link |
01:55:39.600
and putting probabilities behind estimates. If you think about different trajectories,
link |
01:55:44.160
you can take through life, just actually analyzing life in a game theoretic way,
link |
01:55:48.240
like your own life, like personal life. I think you've given an example that you had an honest
link |
01:55:53.680
conversation with Igor about how long this relationship is going to last. Similar to our marriage
link |
01:55:59.600
problem discussion, having an honest conversation about the probability of things that we sometimes
link |
01:56:06.320
are a little bit too shy or scared to think of in probabilistic terms. Can you speak to that
link |
01:56:12.800
kind of way of reasoning, the good and the bad of that? Can you do this kind of thing with human
link |
01:56:18.960
relations? Yeah. The scenario you're talking about, it was like... Yeah. Tell me about that scenario.
link |
01:56:27.440
I think it was about a year into our relationship, and we were having a fairly heavy conversation
link |
01:56:34.320
because we were trying to figure out whether or not I was going to sell my apartment. He'd already
link |
01:56:39.200
moved in, but I think we were just figuring out what our long term plans should be. Should we buy
link |
01:56:44.720
a place together, et cetera. When you guys start having that conversation, are you drunk out of
link |
01:56:48.640
your mind on wine, or are you sober and actually having a serious... I think we were sober.
link |
01:56:53.520
How do you get to that conversation? Because most people are kind of afraid to have that
link |
01:56:56.560
kind of serious conversation. Well, our relationship was very... Well, first of all, we were good
link |
01:57:02.800
friends for a couple of years before we even got romantic. When we did get romantic, it was very
link |
01:57:13.920
clear that this was a big deal. It wasn't just another random thing.
link |
01:57:20.240
So the probability of it being a big deal was high? It was already very high. Then we'd been
link |
01:57:25.040
together for a year, and it had been pretty golden and wonderful. There was a lot of foundation
link |
01:57:32.000
already where we felt very comfortable having a lot of frank conversations. But Igor's MO has
link |
01:57:37.280
always been much more like this than mine. He was always, from the outset, like: in a relationship, radical
link |
01:57:42.960
transparency and honesty is the way, because the truth is the truth, whether you want to hide it
link |
01:57:47.840
or not; it will come out eventually. If you aren't able to accept difficult things yourself,
link |
01:57:56.320
then how could you possibly expect to be the most integral version of yourself? The relationship needs
link |
01:58:02.160
this bedrock of honesty as a foundation, more than anything. Yeah, that's really interesting,
link |
01:58:07.520
but I would like to push against some of those ideas down the line.
link |
01:58:11.840
Down the line, yes, throw them at me. I just rudely interrupted.
link |
01:58:16.640
And so we'd been together for about a year, and things were good, and we were having this hard
link |
01:58:21.760
conversation. And then he was like, well, okay, what's the likelihood that we're going to be
link |
01:58:26.880
together in three years then? Because I think it was roughly a three year time horizon. And I was
link |
01:58:31.600
like, oh, interesting. And then we were like, actually, wait, before you say it out loud, let's
link |
01:58:34.960
both write down our predictions formally. Because we were just getting into
link |
01:58:39.360
effective altruism and rationality at the time, which is all about making, you know,
link |
01:58:42.880
formal predictions as a means of measuring your own foresight, essentially, in a
link |
01:58:51.920
quantified way. So we both wrote down our percentages. And we also did a one year
link |
01:58:58.000
prediction and a 10 year one as well. So we got percentages for all three.
link |
01:59:01.760
And then we showed each other. And I remember like having this moment of like,
link |
01:59:06.000
because for the 10 year one, I was like, well, I mean, I love him a lot, but a lot can
link |
01:59:10.240
happen in 10 years, you know, and we've only been together for, you know, so I was like,
link |
01:59:14.880
I think it's over 50%, but it's definitely not 90%. And I remember like wrestling, I was like,
link |
01:59:19.120
but I don't want him to be hurt. I don't want him to, you know, I don't want to give a number
link |
01:59:22.000
lower than his. And I remember thinking, I was like, don't game it. This is an exercise in radical
link |
01:59:27.040
honesty. So just give your real percentage. And I think mine was like 75%. And then we showed
link |
01:59:31.840
each other. And luckily, we were fairly well aligned. But honestly, if we were off by 20%, it
link |
01:59:41.120
definitely would have... if his had been consistently lower than mine, that would have
link |
01:59:46.480
rattled me for sure. Whereas if it had been the other way around, I think he would have,
link |
01:59:50.480
he's just kind of like a water off the duck's back type of guy. It'd be like, okay, well,
link |
01:59:53.760
all right, we'll figure this out. Well, did you guys provide error bars on the estimates? Like the
link |
02:00:03.040
level of uncertainty? They came built in; we didn't give formal plus or minus error bars. I didn't draw any or
link |
02:00:03.040
anything like that. Well, I guess that's the question I have is, did you feel informed enough
link |
02:00:10.320
to make such decisions? Because, like, I feel like if I were to do this kind of thing
link |
02:00:15.280
rigorously, I would want some data. I would want to say that one of the assumptions you have is
link |
02:00:23.120
you're not that different from other relationships. Right. And so I want to have some data about the
link |
02:00:29.120
way. You want the base rates. Yeah. And also actual trajectories of relationships. I would
link |
02:00:34.560
love to have like time series data about the ways that relationships fall apart or prosper,
link |
02:00:43.680
how they collide with different life events: losses, job changes, moving,
link |
02:00:48.400
both partners finding jobs, only one having a job. I want that kind of data, and how often the
link |
02:00:57.040
different trajectories change in life. Like how informative is your past to your future?
link |
02:01:04.240
That's the thing, I guess. Can you look at my life and make a good prediction about,
link |
02:01:10.400
in terms of my characteristics of my relationships with what that's going to look like in the future
link |
02:01:15.280
or not? I don't even know the answer to that question. I'd be very ill-informed in terms of
link |
02:01:18.880
making the prediction. Yeah, I just would be under
link |
02:01:26.160
informed. I'd be over-biasing toward my prior experiences, I think. Right. But as long as you're
link |
02:01:32.320
aware of that and you're honest with yourself, and you're honest with the other person, say,
link |
02:01:35.760
look, I have really wide error bars on this for the following reasons. That's okay. I still think
link |
02:01:40.800
it's better than not trying to quantify it at all if you're trying to make really major irreversible
link |
02:01:45.840
life decisions. I feel also the romantic nature of that question. For me personally, I try to
link |
02:01:52.080
live my life thinking it's very close to 100%. Allowing myself, actually, this is the difficulty
link |
02:01:59.520
of this, is allowing myself to think differently. I feel like it has a psychological consequence.
link |
02:02:06.320
That's one of my pushbacks against radical honesty is this one particular perspective.
link |
02:02:14.000
So you're saying you would rather give a falsely high percentage to your partner?
link |
02:02:20.080
Going back to the white magic thing. In order to create this additional optimism.
link |
02:02:24.560
Hellmuth. Yes.
link |
02:02:26.160
Fake it till you make it. The positive thinking.
link |
02:02:31.040
Hashtag positivity. Yeah, hashtag. Well, this comes back to this idea of useful
link |
02:02:37.920
fictions. And I agree. I don't think there's a clear answer to this and I think it's actually
link |
02:02:43.440
quite subjective. Some people this works better for than others. To be clear, Igor and I weren't
link |
02:02:49.760
doing this formal prediction in earnest. We did it very much tongue in cheek. It wasn't like
link |
02:02:56.160
we were going to make decisions based on it. I don't think it even would have drastically changed what we decided to do
link |
02:03:01.440
even. We kind of just did it more as a fun exercise.
link |
02:03:05.040
But as a consequence of that fun exercise, there was a deep honesty to it too.
link |
02:03:09.840
Exactly. It was deep. It was just this moment of reflection. I'm like, oh wow,
link |
02:03:13.680
I actually have to think through this quite critically and so on. And what was also
link |
02:03:21.040
interesting was I got to check in with what my desires were. So there was one thing of what
link |
02:03:27.120
my actual prediction is, but what are my desires and could these desires be affecting my predictions
link |
02:03:31.520
and so on. And that's a method of rationality. And I personally don't think it loses anything in
link |
02:03:36.560
terms of, it didn't take any of the magic away from our relationship. Quite the opposite. It
link |
02:03:40.560
brought us closer together, because we did this weird fun thing that, I appreciate,
link |
02:03:45.200
a lot of people find quite strange. And I think it was somewhat unique to our relationship that
link |
02:03:51.840
both of us are very, we both love numbers. We both love statistics. We're both poker players.
link |
02:03:58.320
So this was kind of like our safe space anyway. For others, one partner really might not like
link |
02:04:04.960
that kind of stuff at all, in which case it's not a good exercise to do. I don't recommend it to
link |
02:04:08.640
everybody. But I do think it's interesting sometimes to poke holes in and probe at these
link |
02:04:18.400
things that we consider so sacred that we can't try to quantify them, which is interesting because
link |
02:04:24.720
that's in tension with the idea of what we just talked about with beauty and what makes something
link |
02:04:28.240
beautiful, the fact that you can't measure everything about it. And perhaps something
link |
02:04:31.840
shouldn't be measured at all. Maybe it's wrong to try and completely put a utilitarian
link |
02:04:38.560
frame of measuring the utility of a tree in its entirety. I don't know. Maybe we should, maybe
link |
02:04:44.160
we shouldn't. I'm ambivalent on that. But overall, people have too many biases. People
link |
02:04:52.720
are overly biased against trying to do a quantified cost benefit analysis on really tough life
link |
02:05:00.560
decisions. They're like, oh, just go with your gut. It's like, well, sure. But our intuitions
link |
02:05:07.040
are best suited for things that we've got tons of experience in. Then we can really
link |
02:05:11.120
trust them if it's a decision we've made many times. But if it's like, should I marry this
link |
02:05:15.120
person? Or should I buy this house over that house? You only make those decisions a couple
link |
02:05:20.160
of times in your life, maybe. Well, there's a balance, probably a
link |
02:05:34.480
personal balance, to strike: the amount of rationality you apply to a question versus
link |
02:05:34.480
the useful fiction, the fake it till you make it. For example, just talking to soldiers in Ukraine,
link |
02:05:43.440
you ask them, what's the probability of you winning, Ukraine winning? Almost everybody
link |
02:05:53.520
I talk to says 100%. Wow. And you listen to the experts, they say all kinds of stuff.
link |
02:06:00.480
Right. First of all, the morale there is higher than probably anywhere. I've never been to a war
link |
02:06:08.080
zone before this. But I've read about many wars. And I think the morale in Ukraine is higher than
link |
02:06:15.200
almost anywhere I've read about. Every single person in the country is proud to fight for their
link |
02:06:20.720
country. Everybody, not just soldiers... well, not everybody. Why do you think that is, specifically,
link |
02:06:27.120
more than in other wars? I think because there's perhaps a dormant desire for the citizens of
link |
02:06:38.000
this country to find the identity of this country because it's been going through this 30 year
link |
02:06:44.560
process of different factions and political bickering. And they haven't had, as they talk about,
link |
02:06:50.800
they haven't had their independence war. They say all great nations have had an independence war.
link |
02:06:55.760
They had to fight for their independence, for the discovery of the identity, of the core ideals
link |
02:07:01.840
that unify them. And they haven't had that. There have constantly been factions, there have been divisions.
link |
02:07:07.120
There have been pressures from empires, from the United States and from Russia, from NATO and Europe.
link |
02:07:14.000
Everybody telling them what to do. Now they want to discover who they are. And there's that kind
link |
02:07:18.400
of sense that we're going to fight for the safety of our homeland, but we're also going to fight for
link |
02:07:24.560
our identity. And that's on top of the fact that, if you look at the history of Ukraine
link |
02:07:33.840
and certain other countries like it, there are certain cultures that are feisty in their
link |
02:07:40.240
pride of being citizens of that nation. Ukraine is that; Poland was that.
link |
02:07:48.240
You just look at history. Certain countries you do not want to occupy. I mean, both Stalin and
link |
02:07:55.360
Hitler talked about Poland in this way. They're like, this is a big problem if we occupy this
link |
02:08:00.960
land for prolonged periods of time. They're going to be a pain in the ass. They're not going to
link |
02:08:05.280
want to be occupied. And certain other countries are pragmatic. They're like, well, leaders
link |
02:08:11.360
come and go; I guess this is fine. Ukrainians, throughout the 20th century, don't seem to be
link |
02:08:19.840
the kind of people that just sit calmly and let the quote unquote occupiers impose their
link |
02:08:30.000
rules. That's interesting though, because you said it's always been under conflict and leaders
link |
02:08:33.920
have come and gone. So you would expect them to actually be the opposite, by that reasoning.
link |
02:08:38.640
Because it's very fertile land. It's great for agriculture. So a lot of people have wanted it.
link |
02:08:44.640
I mean, I think they've developed this culture because they've constantly been occupied by
link |
02:08:48.000
different peoples. And so maybe there is something to that, where
link |
02:08:55.040
you've constantly had to feel like within the blood of the generations, there's the struggle for
link |
02:09:01.840
or against the man, against the imposition of rules against oppression and all that kind of
link |
02:09:09.120
stuff. And that stays with them. So there's a will there. But a lot of other aspects are also
link |
02:09:15.840
part of it. Some of it has to do with the reverse Moloch kind of situation, where social media has definitely
link |
02:09:21.440
played a part. Also, different charismatic individuals have played a part. The fact
link |
02:09:27.600
that the president of the nation, Zelensky, stayed in Kiev during the invasion is a huge
link |
02:09:36.800
inspiration to them because most leaders, as you could imagine, when the capital of the nation is
link |
02:09:43.280
under attack, the wise thing, the smart thing that the United States advised Zelensky to do
link |
02:09:49.040
was to flee and to lead the nation from a distant place. He said, fuck that,
link |
02:09:56.000
I'm staying put. Everyone around him, there was pressure to leave, and he didn't. And
link |
02:10:04.800
those kinds of singular acts really can unify a nation. There are a lot of people that criticize Zelensky
link |
02:10:11.280
within Ukraine. Before the war, he was very unpopular, even still. But they put that aside
link |
02:10:18.720
especially for that singular act of staying in the capital. Yeah, a lot of those kinds of things
link |
02:10:26.960
come together to create something within people.
link |
02:10:33.280
These things always, of course, raise the question: how zoomed out of a view do you want to take?
link |
02:10:43.520
Because, yeah, you describe it as, like, an anti-Moloch thing that happened within Ukraine because
link |
02:10:48.000
it brought the Ukrainian people together in order to fight a common enemy. Maybe that's a good thing,
link |
02:10:52.400
maybe that's a bad thing. In the end, we don't know how this is all going to play out. But if you
link |
02:10:56.960
zoom it out on a global level, they're coming together to fight. That could make a conflict
link |
02:11:11.440
larger. You know what I mean? I don't know what the right answer is here. It seems like a good
link |
02:11:16.320
thing that they came together. But we don't know how this is all going to play out. If this all
link |
02:11:20.480
turns into a nuclear war, we'll be like, okay, that was the bad, that was... Oh yeah. So I was
link |
02:11:23.920
describing the reverse Moloch at the local level. Exactly. Now, this is where the experts come in
link |
02:11:31.440
and they say, well, if you channel most of the resources of the nation and the nation supporting
link |
02:11:39.120
Ukraine into the war effort, are you not beating the drums of a war that is much bigger than Ukraine?
link |
02:11:47.440
In fact, even the Ukrainian leaders are speaking of it this way. This is not a war between two
link |
02:11:54.480
nations. This is the early days of a world war, if we don't play this correctly. Yes.
link |
02:12:02.480
We need cool heads from our leaders. From Ukraine's perspective, Ukraine needs to win the war
link |
02:12:12.720
because what winning the war means is coming to peace negotiations, an agreement
link |
02:12:21.520
that guarantees no more invasions. And then you make an agreement about what land belongs to
link |
02:12:27.520
whom, and you stop there. And basically, from their perspective, you want to demonstrate to the
link |
02:12:35.760
rest of the world who's watching carefully, including Russia and China and different players on the
link |
02:12:40.720
geopolitical stage, that this kind of conflict is not going to be productive if you engage in it.
link |
02:12:47.040
So you want to teach everybody a lesson, let's not do World War III. It's going to be bad for
link |
02:12:51.920
everybody. It's a lose-lose. It's a deep lose-lose. Doesn't matter. And I think that's actually
link |
02:13:03.120
correct. When I zoom out, 99% of what I think about is just individual human beings and human
link |
02:13:11.840
lives and just that war is horrible. But when you zoom out and think from a geopolitics perspective,
link |
02:13:17.680
we should realize that it's entirely possible that we will see a World War III in the 21st
link |
02:13:24.880
century. And this is like a dress rehearsal for that. So the way we play this as a human
link |
02:13:34.480
civilization will define whether we do or don't have a World War III.
link |
02:13:39.520
You know, how we discuss war, how we discuss nuclear war, the kind of leaders we elect
link |
02:13:53.120
and prop up, the kind of memes we circulate. Because you have to be very careful when you're
link |
02:14:00.160
being pro Ukraine, for example, you have to realize that you are also indirectly feeding
link |
02:14:11.520
the ever increasing military industrial complex. So be extremely careful that
link |
02:14:17.840
when you say pro Ukraine or pro anybody, you're pro human beings, not pro the machine,
link |
02:14:30.400
that creates narratives that say it's pro human beings. But actually, if you look
link |
02:14:38.000
at the raw use of funds and resources, it's actually pro making weapons and shooting bullets
link |
02:14:45.760
and dropping bombs. Right. We have to just somehow get the meme into everyone's heads
link |
02:14:51.840
that the real enemy is war itself. That's the enemy we need to defeat. And that doesn't mean
link |
02:14:59.200
to say that there isn't justification for small local scenarios, adversarial conflicts. If you
link |
02:15:07.520
have a leader who is starting wars, they're on the side of team war, basically. It's not that
link |
02:15:13.840
they're on the side of team country, whatever that country is, it's they're on the side of team war.
link |
02:15:17.760
So that needs to be stopped and put down. But you also have to find a way that your
link |
02:15:23.280
corrective measure doesn't actually then end up being coopted by the war machine
link |
02:15:27.280
and creating greater war. Again, the playing field is finite. The scale of
link |
02:15:33.520
conflict is now getting so big that the weapons that can be used are so mass destructive
link |
02:15:38.560
that we can't afford another giant conflict. We just, we won't make it.
link |
02:15:44.160
What existential threat in terms of us not making it, are you most worried about?
link |
02:15:49.440
What existential threat to human civilization? We got like, let's go down a dark path, huh?
link |
02:15:54.640
Well, no, it's dark. No, it's like, well, we're in a somber place, we might as well.
link |
02:15:59.040
Some of my best friends are dark paths. What worries you most? We mentioned
link |
02:16:08.800
asteroids, we mentioned AGI, nuclear weapons. The one that's on my mind the most,
link |
02:16:17.280
mostly because I think it's the one where we have actually a real chance to move the needle on
link |
02:16:21.840
in a positive direction or more specifically stop some really bad things from happening,
link |
02:16:26.720
really dumb, avoidable things, is biorisks. What kind of biorisks?
link |
02:16:37.280
In terms of, yeah, so many. Of course, we have risks from natural pandemics,
link |
02:16:43.520
naturally occurring viruses or pathogens. Then also, as time goes on and technology
link |
02:16:50.000
becomes more and more democratized into the hands of more and more people, the risk of synthetic
link |
02:16:55.040
pathogens grows. Whether or not you fall into the camp of COVID being gain of function,
link |
02:17:02.640
accidental lab leak, or whether it was purely naturally occurring. Either way,
link |
02:17:09.520
we are facing a future where synthetic pathogens, or human-meddled-with pathogens,
link |
02:17:17.520
either accidentally get out or get into the hands of bad actors, whether they're omnicidal maniacs,
link |
02:17:26.960
either way. That means we need more robustness for that. You would think that us having this nice
link |
02:17:32.880
little dry run, which is what COVID was. As awful as it was, and all those poor people that died, it was
link |
02:17:39.600
still like child's play compared to what a future one could be in terms of fatality rate.
link |
02:17:48.800
We would then be much more robust in our pandemic preparedness. Meanwhile, the budget
link |
02:17:57.520
in the last two years for the US, sorry, they just did this, I can't remember the name of what
link |
02:18:03.920
the actual budget was, but it was like a multi trillion dollar budget that the US just set aside.
link |
02:18:08.080
Originally in that, considering that COVID cost multiple trillions to the economy,
link |
02:18:14.960
the original allocation in this new budget for future pandemic preparedness was $60 billion,
link |
02:18:19.840
so a tiny proportion of it. That proceeded to get whittled down to, like, $30 billion,
link |
02:18:27.120
to $15 billion, all the way down to $2 billion out of multiple trillions for a thing that has
link |
02:18:32.080
just cost us multiple trillions. We've just finished, we're barely even, we're not even
link |
02:18:36.080
really out of it. It basically got whittled down to nothing because for some reason people think
link |
02:18:40.480
that, oh right, we've got the pandemic out the way, that was that one. The reason for that is that
link |
02:18:45.920
people are, and I say this with all due respect to a lot of the science community, but there's an
link |
02:18:51.840
immense amount of naivety about, they think that nature is the main risk moving forward and it
link |
02:18:59.440
really isn't. I think nothing demonstrates this more than this project that I was just reading
link |
02:19:05.200
about that's sort of being proposed right now, called DEEP VZN. The idea is to go out into
link |
02:19:11.520
the wild, and we're not talking about just like within cities, deep into caves that people don't
link |
02:19:17.360
go to, deep into the Arctic, wherever, scour the earth for whatever the most dangerous possible
link |
02:19:23.120
pathogens could be that they can find. Then not only do you try and find these, you bring samples of
link |
02:19:30.720
them back to laboratories. Again, whether you think COVID was a lab leak or not, I'm not going to
link |
02:19:36.640
get into that, but we have historically had so many, as a civilization, we've had so many
link |
02:19:41.680
lab leaks from even like the highest level security things. People should go and just read,
link |
02:19:47.440
it's like a comedy show of just how many there are, how leaky these labs are, even when they do their
link |
02:19:52.720
best efforts. Bring these things then back to civilization. That's step one of the badness.
link |
02:19:58.640
The next step would be to then categorize them, do experiments on them and categorize them by
link |
02:20:04.720
their level of potential pandemic lethality. Then the pièce de résistance of this plan is to then
link |
02:20:11.360
publish that information freely on the internet about all these pathogens, including their genome,
link |
02:20:16.720
which is literally like the building instructions for how to make them, on the internet. This is something
link |
02:20:21.760
that genuinely a pocket of the scientific community thinks is a good idea.
link |
02:20:32.000
Their argument is, oh, this is good because it might buy us some time to develop vaccines,
link |
02:20:38.240
which, okay, sure, maybe would have made sense prior to mRNA technology, but
link |
02:20:44.560
we can now develop a vaccine within a couple of days of finding a new pathogen.
link |
02:20:49.840
Now then there's all the trials and so on. Those trials would have to happen anyway,
link |
02:20:52.880
in the case of a brand new thing. You're saving maybe a couple of days, so that's the upside.
link |
02:20:57.520
Meanwhile, the downside is, you're not only running the risk of these pathogens getting
link |
02:21:03.600
leaked, but you're literally handing it out to every bad actor on earth who would be doing
link |
02:21:09.760
cartwheels. I'm talking about Kim Jong Un, ISIS, people who think the rest of the world is their
link |
02:21:16.400
enemy. In some cases, they think that killing themselves is a noble cause, and you're literally
link |
02:21:23.200
giving them the building blocks of how to do this. It's the most batshit idea I've ever heard. On
link |
02:21:27.120
expectation, it's probably minus EV of multiple billions of lives, if they actually succeeded
link |
02:21:32.480
in doing this, certainly in the tens or hundreds of millions. The cost-benefit is so unbelievably lopsided,
link |
02:21:37.680
it makes no sense.
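Her "minus EV" framing is straight expected value: weigh probability-times-magnitude of the upside against probability-times-magnitude of the downside. A back-of-the-envelope sketch in Python, where every number is invented purely to illustrate the asymmetry she's pointing at:

```python
# Toy expected-value comparison for publishing dangerous pathogen genomes.
# All probabilities and magnitudes below are made up for illustration only.

# Upside: a pandemic arrives and vaccine design starts a couple of days earlier.
p_pandemic = 0.3                      # assumed chance of a relevant pandemic
lives_saved_by_head_start = 10_000    # assumed benefit of the head start
ev_upside = p_pandemic * lives_saved_by_head_start

# Downside: the published blueprint enables a leak or a deliberate release.
p_catastrophic_misuse = 0.01          # even a small assumed probability...
lives_lost_if_misused = 100_000_000   # ...multiplied by an enormous cost
ev_downside = p_catastrophic_misuse * lives_lost_if_misused

print(f"EV of upside:   {ev_upside:,.0f} lives")    # 3,000
print(f"EV of downside: {ev_downside:,.0f} lives")  # 1,000,000
print(f"Net EV:         {ev_upside - ev_downside:,.0f} lives")
```

However you tweak the assumed inputs, the downside term is dominated by a catastrophic tail, which is why she calls the plan minus EV.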
link |
02:21:44.800
I was trying to wrap my head around what's going wrong in people's minds to think that this is a good idea. It's not that it's malice or anything like that. I think it's
link |
02:21:50.560
that the proponents are actually overly naive about the interactions of
link |
02:21:59.920
humanity, that there are bad actors who will use this for bad things. Because not only will it,
link |
02:22:07.280
if you publish this information, even if a bad actor couldn't physically make it themselves,
link |
02:22:12.000
which, in 10 years' time, they might, as the technologies are getting cheaper and easier to use.
link |
02:22:17.760
But even if they couldn't make it, they could now bluff it. What would you do if there's some
link |
02:22:21.840
deadly new virus that we've published on the internet in terms of its building blocks? Kim
link |
02:22:27.600
Jong Un could be like, hey, if you don't let me build my nuclear weapons, I'm going to release
link |
02:22:32.000
this. I've managed to build it. Well, now he's actually got a credible bluff. We don't know.
link |
02:22:35.760
And so that's, it's just like handing the keys, it's handing weapons of mass destruction to
link |
02:22:40.880
people. Makes no sense. I agree with you, but the possible world in which it might
link |
02:22:47.120
make sense is if the good guys, and it's a whole other problem defining who the good guys are,
link |
02:22:55.360
but the good guys are, like, an order of magnitude higher in competence. And so they can stay ahead
link |
02:23:04.320
of the bad actors by just being very good at defense. By very good, I mean not, like, a little
link |
02:23:12.400
bit better, but an order of magnitude better. But of course, the question is in each of those
link |
02:23:17.360
individual disciplines, is that feasible? Can you, can the bad actors, even if they don't have
link |
02:23:23.520
the competence, leapfrog to the place where the good guys are? Yeah, I mean, I would agree in
link |
02:23:30.240
principle. Pertaining to this particular plan, you know, the thing I
link |
02:23:38.000
described, this DEEP VZN thing, at least then that would maybe make sense for
link |
02:23:41.120
steps one and two of getting the information. But then why would you release
link |
02:23:44.960
the information to your literal enemies? You know, that doesn't fit at all
link |
02:23:50.960
with that perspective of trying to be ahead of them. You're literally handing them the weapon.
link |
02:23:54.800
But there's different levels of release, right? So there's the kind of secrecy where you don't
link |
02:24:00.960
give it to anybody. But there's a release where you incrementally give it to like major labs.
link |
02:24:07.520
So it's not public release, but it's like, you're giving it to different layers of
link |
02:24:11.680
responsibility. But the problem there is, if you go anywhere beyond complete
link |
02:24:17.520
secrecy, it's going to leak. That's the thing. It's very hard to keep secrets. And so that's
link |
02:24:22.000
still the case. So the argument is you might as well release it to the public. So you either go complete
link |
02:24:28.000
secrecy or you release it to the public. So, which is essentially the same thing. It's going to leak
link |
02:24:34.640
anyway. If you don't do complete secrecy, right? Which is why you shouldn't get the information
link |
02:24:39.680
in the first place. Yeah. I mean, I think, well, that's a solution. Yeah. The solution
link |
02:24:45.440
is to either not get the information in the first place, or to keep it incredibly,
link |
02:24:50.080
incredibly contained. See, I think, I think it really matters which discipline we're talking
link |
02:24:55.520
about. So in the case of biology, I do think you're very right. We shouldn't even be,
link |
02:25:01.200
it should be forbidden to even think about that. Meaning, don't just even
link |
02:25:08.080
collect the information, but like, don't do, I mean, gain of function research is a really iffy
link |
02:25:13.680
area. Like you start, I mean, it's all about cost benefits, right? There are some scenarios
link |
02:25:18.480
that I could imagine the cost benefit of a gain of function research is very, very clear,
link |
02:25:23.280
where you've evaluated all the potential risks factored in the probability that things can go
link |
02:25:27.360
wrong. And like, you know, not only known unknowns, but unknown unknowns as well, tried to quantify
link |
02:25:31.680
that. And then even then it's like orders of magnitude better to do that. I'm behind that
link |
02:25:36.240
argument. But the point is that there's this naivety that's preventing people from even
link |
02:25:40.640
doing the cost benefit properly on a lot of the things. Because, you know, the science community,
link |
02:25:46.400
they're like, again, I don't want to bucket the science community, but like some people within
link |
02:25:50.640
the science community just think that everyone's, everyone's good and everyone just cares about
link |
02:25:54.880
getting knowledge and doing the best for the world. And unfortunately, that's not the case. I
link |
02:25:58.000
wish we lived in that world, but we don't. Yeah. I mean, listen, I've been
link |
02:26:03.520
criticizing the science community broadly quite a bit. There's so many brilliant people whose
link |
02:26:09.120
brilliance is somehow hindering them sometimes, because it comes with a bunch of blind spots. And then you start
link |
02:26:13.600
to look at the history of science, how easily it's been used by dictators to reach any conclusion they want.
link |
02:26:20.800
And it's dark how you can use brilliant people who like playing the little
link |
02:26:25.920
game of science, because it is a fun game. You know, you're building, you're going to conferences,
link |
02:26:30.560
you're building on top of each other's ideas, breakthroughs. I think I've realized how this
link |
02:26:35.600
particular molecule works and I could do this kind of experiment and everyone else is impressed.
link |
02:26:39.600
Oh, cool. No, I think you're wrong. Let me show you why you're wrong. And that little game,
link |
02:26:43.600
everyone gets really excited about. They get excited: I came up with a pill that
link |
02:26:47.120
solves this problem and it's going to help a bunch of people. And I came up with a giant study
link |
02:26:51.280
that shows the exact probability it's going to help or not. And you get lost in this game
link |
02:26:56.240
and you forget to realize this game, just like Moloch, it can have, like, unintended consequences
link |
02:27:03.840
and unintended consequences that might destroy human civilization or divide human civilization
link |
02:27:12.800
or have dire geopolitical consequences. I mean, the effects of, I mean, it's just so,
link |
02:27:20.400
the most destructive effects of COVID have nothing to do with the biology of the virus,
link |
02:27:25.680
it seems like. I mean, I could just list them forever. But like one of them is the complete
link |
02:27:31.600
distrust of public institutions. The other one is because of that public distrust, I feel like if
link |
02:27:36.800
a much worse pandemic came along, we as a world have now cried wolf. And if an actual wolf now
link |
02:27:44.560
comes, people will be like, fuck masks, fuck vaccines, fuck everything. And they won't be,
link |
02:27:51.600
they'll distrust every single thing that any major institution is going to tell them.
link |
02:27:55.520
Yeah. Because that's the thing, there were certain actions made by certain public health figures
link |
02:28:06.080
where they very knowingly told a white lie. It was intended in the best possible
link |
02:28:12.160
way, such as early on, when there was clearly a shortage of masks. And so they said to the public,
link |
02:28:21.440
oh, don't get masks, there's no evidence that they work. Don't get them, they don't work. In fact,
link |
02:28:27.520
it might even make it worse. You might even spread it more. That was the real stinker.
link |
02:28:33.040
Yeah, no, no. Unless you know how to do it properly, you're going to get sicker or you're
link |
02:28:36.800
more likely to catch the virus, which is just absolute crap. And they put that out there. And
link |
02:28:43.200
it's pretty clear the reason why they did that was because there was actually a shortage of masks
link |
02:28:47.440
because they really needed them for health workers, which makes sense. I agree. But the
link |
02:28:53.680
cost of lying to the public when that then comes out, people aren't as stupid as they think they
link |
02:29:01.200
are. And that's, I think, where this distrust of experts has largely come from. A, they've lied
link |
02:29:07.440
to people overtly. But B, people have been treated like idiots. Now, that's not to say there aren't a lot
link |
02:29:14.240
of people who have a lot of wacky ideas around COVID and all sorts of things. But if you
link |
02:29:18.560
treat the general public like children, they're going to notice that, and
link |
02:29:23.440
that is going to just like absolutely decimate the trust in the public institutions that we depend
link |
02:29:28.640
upon. And honestly, the best thing that could happen... I wish, like, if Fauci or, you know,
link |
02:29:34.560
and these other leaders... I mean, God, I can't imagine what a nightmare his job
link |
02:29:39.280
has been for the last few years, hell on earth. Like, so, you know, I have a lot of sort of
link |
02:29:44.720
sympathy for the position he's been in. But like, if he could just come out and be like, okay, look,
link |
02:29:49.280
guys, hands up, we didn't handle this as well as we could have. These are all the things I would
link |
02:29:54.960
have done differently in hindsight. I apologize for this and this and this and this. That would go
link |
02:29:59.360
so far. And maybe I'm being naive. Who knows, maybe this would backfire. But I don't think it
link |
02:30:04.400
would, with someone like me even, because I've, like, I've lost trust in a lot of these things.
link |
02:30:08.240
I'm fortunate that I at least know people who I can go to who I think are good, like, have good
link |
02:30:12.080
epistemics on this stuff. But, you know, if they could sort of put their hands up and go,
link |
02:30:16.320
okay, these are the spots where we screwed up: this, this, this. These were our reasons. Yeah,
link |
02:30:21.360
we actually told a little white lie here, we did it for this reason, we're really sorry,
link |
02:30:24.800
if they just did the radical honesty thing, the radical transparency thing,
link |
02:30:28.560
that would go so far toward rebuilding public trust. And I think that's what needs to happen.
link |
02:30:33.120
Yeah, I totally agree with you. Unfortunately, his job was very tough and all those kinds of things,
link |
02:30:39.680
but I see arrogance and arrogance prevented him from being honest in that way previously.
link |
02:30:47.680
And I think arrogance will prevent him from being honest in that way now. We need leaders. I think
link |
02:30:53.440
young people are seeing that. That kind of talking down to people from a position of power
link |
02:31:01.200
is, or I hope is, the way of the past. People really like authenticity, and they like
link |
02:31:08.960
leaders that are like a man and a woman of the people. And I think that just,
link |
02:31:14.720
I mean, he still has a chance to do that, I think. I mean, I don't think, you know,
link |
02:31:19.200
if I doubt he's listening, but if he is like, hey, I, I think, you know,
link |
02:31:23.840
I don't think he's irredeemable by any means. I think there's, you know,
link |
02:31:26.400
I don't have an opinion on whether there was arrogance there or not. I just know that,
link |
02:31:31.920
I think, like coming clean on the, you know, it's understandable to have fucked up during
link |
02:31:37.040
this pandemic. Like, I wouldn't expect any government to handle it well, because it was so difficult,
link |
02:31:41.040
like so many moving pieces, so much like lack of information and so on. But the step to
link |
02:31:47.360
rebuilding trust is to go, okay, look, we're doing a scrutiny of where we went wrong. And I,
link |
02:31:51.600
and for my part, I did this wrong in this part. And that would be huge.
link |
02:31:55.040
All of us can do that. I mean, I was struggling for a while whether I want to talk to him or not.
link |
02:31:59.840
I talked to his boss, Francis Collins. Another person that's screwed up in terms of trust,
link |
02:32:07.920
lost a little bit of my respect too. There seems to have been a kind of dishonesty in the,
link |
02:32:16.640
in the back rooms, in that they didn't trust people to be intelligent. Like we need to tell them
link |
02:32:24.480
what's good for them. We know what's good for them. That kind of idea.
link |
02:32:28.880
To be fair, the thing that's, what's it called? I heard the phrase today,
link |
02:32:34.880
nut picking. Social media does that. So you've got like nitpicking. Nut picking is where
link |
02:32:41.040
the craziest, stupidest... you know, if you have a group of people, let's say
link |
02:32:46.720
people who are vaccine, I don't like the term anti vaccine, people who are vaccine hesitant,
link |
02:32:50.400
vaccine speculative, you know, what social media did or the media or anyone, you know,
link |
02:32:57.200
their opponents would do is pick the craziest example. So the ones who are like,
link |
02:33:01.840
you know, I think I need to inject myself with motor oil up my ass or something, you know,
link |
02:33:07.920
select the craziest ones and then have that beamed to, you know, so from like someone like Fauci
link |
02:33:12.880
or Francis's perspective, that's what they get because they're getting the same social media
link |
02:33:16.880
stuff as us. They're getting the same media reports. I mean, they might get some more information,
link |
02:33:20.800
but they too are going to get these, the nuts portrayed to them. So they probably have a
link |
02:33:25.840
misrepresentation of what the actual public's intelligence is.
link |
02:33:29.040
Well, yes, and that just means they're not social media savvy.
link |
02:33:33.520
So one of the skills of being on social media is to be able to filter that in your mind,
link |
02:33:37.680
like to understand, to put into proper context.
link |
02:33:40.000
To realize that what you are seeing on social media is not anywhere near an accurate representation
link |
02:33:45.040
of humanity.
link |
02:33:46.480
Nut picking, I like that. And there's nothing wrong with putting motor oil up your ass.
link |
02:33:51.280
It's just one, it's one of the better aspects. I do this every weekend. Okay.
link |
02:33:58.160
Where did that analogy come from in my mind? Like, what?
link |
02:34:00.160
I don't know. I think you need to, there's some Freudian thing you would need to deeply
link |
02:34:04.560
investigate with a therapist. Okay. What about AI? Are you worried about
link |
02:34:09.120
AI, superintelligent systems, or a paperclip maximizer type of situation?
link |
02:34:17.120
Yes. I'm definitely worried about it. But I feel kind of bipolar in that some days I wake up and
link |
02:34:24.320
I'm like, you're excited about the future?
link |
02:34:26.000
Well, exactly. I'm like, wow, we can unlock the mysteries of the universe, you know, escape the
link |
02:34:29.920
game. And this, this, you know, because I spend all my time thinking about these
link |
02:34:35.920
Moloch problems, like, you know, what is the solution to them? And, you know,
link |
02:34:39.840
in some ways you need this, like, omnibenevolent, omniscient, omniwise
link |
02:34:47.600
coordination mechanism that can, like, make us all not do the Moloch thing.
link |
02:34:53.120
Or like provide the infrastructure or redesign the system so that it's not vulnerable to this
link |
02:34:57.680
Moloch process. And in some ways, you know, that's the strongest argument to me for
link |
02:35:02.720
like the race to build AGI is that maybe, you know, we can't survive without it.
link |
02:35:07.920
But the flip side to that is that, unfortunately, now that there's multiple
link |
02:35:14.480
actors trying to build AI, AGI, you know, this was, this was fine 10 years ago when it was just
link |
02:35:19.360
DeepMind, but then other companies started up and now it created a race dynamic. Now it's like,
link |
02:35:24.560
the whole thing's got the same problem. It's like,
link |
02:35:28.240
whichever company is the one that like optimizes for speed at the cost of safety will get the
link |
02:35:33.760
competitive advantage, and so they're more likely to be the ones to build the AGI, you know. And
link |
02:35:37.760
that's the same cycle that you're in.
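The race dynamic she's describing has the structure of a prisoner's dilemma. A toy payoff matrix in Python, with invented numbers, just to show why "speed over safety" is the individually rational move even though it's collectively worse:

```python
# Toy AI-race payoff matrix (illustrative numbers only).
# Each lab chooses "safe" (slow, careful) or "fast" (speed over safety).
# Payoffs are (lab A, lab B); higher = stronger competitive position.
payoffs = {
    ("safe", "safe"): (3, 3),  # both slow down: best collective outcome
    ("safe", "fast"): (0, 4),  # the careful lab gets outcompeted
    ("fast", "safe"): (4, 0),
    ("fast", "fast"): (1, 1),  # race to the bottom: Moloch's outcome
}

def best_response(opponent_move: str) -> str:
    """The move that maximizes lab A's payoff, given the opponent's move."""
    return max(("safe", "fast"), key=lambda me: payoffs[(me, opponent_move)][0])

# Whatever the other lab does, "fast" pays more, so both race:
assert best_response("safe") == "fast"
assert best_response("fast") == "fast"
```

Both labs end up at (1, 1) when (3, 3) was available, which is the Moloch trap in miniature.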
link |
02:35:41.600
And there's no clear solution to that, because you can't just go slapping bans on everyone. If you go and try and stop all the different
link |
02:35:49.680
companies, then, you know, the good ones will stop, because they're the ones
link |
02:35:55.520
within, you know, within the West's reach, but then that leaves all the other ones to continue.
link |
02:35:59.840
And then they're even more likely to win. So it's a very difficult problem with no clean solution.
link |
02:36:05.360
And, you know, at the same time, I know at least some of the folks at DeepMind
link |
02:36:12.000
and they're incredible and they're thinking about this, they're very aware of this problem. And
link |
02:36:15.040
they're like, you know, I think some of the smartest people on earth.
link |
02:36:18.480
Yeah, the culture is important there because they are thinking about that and they're
link |
02:36:22.080
also some of the best machine learning engineers. So it's possible to have a company or a community
link |
02:36:29.280
of people that are both great engineers and are thinking about the philosophical topics.
link |
02:36:33.680
Exactly. And importantly, they're also game theorists, you know, and because this is ultimately
link |
02:36:38.160
a game theory problem, this Moloch mechanism, and, you know, with this race,
link |
02:36:43.440
how do we avoid these arms race scenarios? You need people who aren't naive to be thinking
link |
02:36:49.600
about this. And again, luckily, there's a lot of smart, non-naive game theorists within
link |
02:36:53.840
that group. Yes, I'm concerned about it. And I think it's again, a thing that we need
link |
02:36:59.200
people to be thinking about, in terms of, like, how do we mitigate the arms race
link |
02:37:05.280
dynamics? And how do we solve the problem of what Bostrom calls the orthogonality
link |
02:37:11.760
thesis? Because obviously, you know, the belief, the hope,
link |
02:37:17.360
is that you build something that's super intelligent. And by definition of being super
link |
02:37:22.480
intelligent, it will also become super wise and have the wisdom to know what the right goals are.
link |
02:37:27.760
And hopefully those goals include keeping humanity alive, right? But Bostrom says that
link |
02:37:33.600
actually those two things, you know, super intelligence and super wisdom,
link |
02:37:39.120
aren't necessarily correlated. They're actually kind of orthogonal things. And how do we make
link |
02:37:43.440
it so that they are correlated? How do we guarantee it? Because we need it to be guaranteed,
link |
02:37:46.640
really, to know that we're doing the thing safely.
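One minimal way to render the orthogonality thesis in code: treat capability (how hard the system optimizes) and its goal as independent parameters, so turning up one does nothing to the other. A toy sketch, not any real AI system; both goal functions are stand-ins:

```python
import random

def optimize(goal, capability: int) -> float:
    """Toy optimizer: sample `capability` random candidates, return the best.
    More capability means better optimization of *whatever* goal is plugged in."""
    candidates = (random.uniform(-10, 10) for _ in range(capability))
    return max(candidates, key=goal)

wise_goal = lambda x: -abs(x - 1.0)  # stand-in for a "wise" target
dumb_goal = lambda x: x              # stand-in for a paperclip-style target

# The same capability dial serves either goal equally well:
for capability in (10, 10_000):
    print(capability, optimize(wise_goal, capability), optimize(dumb_goal, capability))
```

Nothing about raising `capability` nudges the system toward the wise goal; alignment has to come from choosing the goal, which is her point about needing a guarantee.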
link |
02:37:48.800
But for that merging of intelligence and wisdom, at least my hope is that this whole
link |
02:37:55.840
process happens sufficiently slowly, that we're constantly having these kinds of debates,
link |
02:38:02.000
that we have enough time to, to figure out how to modify each version of the system as it
link |
02:38:08.000
becomes more and more intelligent. Yes, buying time is a good thing, definitely. Anything that
link |
02:38:12.480
slows everything down. We just, everyone needs to chill out. We've got millennia to figure this out.
link |
02:38:21.440
Or at least, at least, well, it depends again, some people think that, you know,
link |
02:38:27.760
we can't even make it through the next few decades without having some kind of
link |
02:38:32.720
omnibus coordination mechanism. And there's also an argument to that. Yeah, I don't know.
link |
02:38:38.560
Well, there is, I'm suspicious of that kind of thinking, because it seems like the entirety
link |
02:38:44.240
of human history has people in it that are like predicting doom just around the corner.
link |
02:38:50.640
There's something about us that is strangely attracted to that thought. It's almost like
link |
02:38:58.640
fun to think about the destruction of everything. Just objectively speaking,
link |
02:39:04.000
I've talked and listened to a bunch of people and they are gravitating towards that. I think
link |
02:39:12.400
it's the same thing that people love about conspiracy theories, is they love to be the
link |
02:39:16.960
person that kind of figured out some deep fundamental thing that's going to,
link |
02:39:23.200
it's going to mark something extremely important about the history of human civilization,
link |
02:39:28.240
because then I will be important. When in reality, most of us will be forgotten and life will go on.
link |
02:39:37.520
And one of the sad things about whenever anything traumatic happens to you, whenever
link |
02:39:42.480
you lose loved ones, or just tragedy happens, you realize life goes on. Even after a nuclear war,
link |
02:39:50.800
that will wipe out some large percentage of the population and will torture people for years
link |
02:39:59.840
to come because of the sort of, I mean, the effects of a nuclear winter. People will still
link |
02:40:06.080
survive. Life will still go on. I mean, it depends on the kind of nuclear war. But in
link |
02:40:11.360
case of a nuclear war, it will still go on. That's one of the amazing things about life.
link |
02:40:15.680
It finds a way. And so in that sense, I just, I feel like the doom and gloom thing is a...
link |
02:40:23.520
Well, yeah, we don't want a self fulfilling prophecy.
link |
02:40:26.000
Yes, that's exactly.
link |
02:40:27.360
Yes. And I very much agree with that. And I even have a slight feeling from the amount of time
link |
02:40:34.560
we spent in this conversation talking about this, because it's like, is this even a net positive
link |
02:40:40.000
if it's like making everyone feel, in some ways, like making people imagine these bad scenarios
link |
02:40:45.840
can be a self fulfilling prophecy. But at the same time, that's weighed off with at least
link |
02:40:52.160
making people aware of the problem and gets them thinking. And I think particularly,
link |
02:40:55.840
you know, the reason why I want to talk about this to your audience is that on average,
link |
02:40:59.840
they're the type of people who gravitate towards these kind of topics because they're
link |
02:41:03.200
intellectually curious and they can sort of sense that there's trouble brewing.
link |
02:41:06.880
Yeah. They can smell that there's, you know, I think there's a reason people are thinking
link |
02:41:10.640
about this stuff a lot, and it's because the probability, you know,
link |
02:41:15.120
has increased, certainly over the last few years.
link |
02:41:19.200
Trajectories have not gone favorably, let's put it that way, since 2010. So it's right,
link |
02:41:26.240
I think, for people to be thinking about it. But that's where they're like, I think,
link |
02:41:30.480
whether it's a useful fiction or whether it's actually true or whatever you want to call it,
link |
02:41:34.320
I think having this faith, this is where faith is valuable, because it gives you at
link |
02:41:38.960
least this like anchor of hope. And I, and I'm not just saying it to like trick myself,
link |
02:41:43.760
like I do truly, I do think there's something out there that wants us to win.
link |
02:41:46.800
Yeah.
link |
02:41:47.600
I think there's something that really wants us to win. And it just, you just have to be like,
link |
02:41:52.960
just like, okay, now I sound really crazy, but like, open your heart to it a little bit.
link |
02:41:57.760
Yeah. And it will give you the like, the sort of breathing room with which to marinate
link |
02:42:06.160
on the solutions. We are the ones who have to come up with solutions, but
link |
02:42:12.960
we can use that. There's, like, hashtag positivity. There's value in that.
link |
02:42:19.200
Yeah. You have to kind of imagine all the destructive trajectories that lay in our future
link |
02:42:24.880
and then believe in the possibility of avoiding those trajectories. All the while, the audience, as you said,
link |
02:42:32.160
sitting back; the majority, the two people that listen to this, are probably sitting
link |
02:42:37.200
on a beach smoking some weed, with a beautiful sunset, looking at just the waves going
link |
02:42:46.640
in and out. And ultimately, there's a kind of deep belief there in the momentum of humanity to
link |
02:42:53.760
figure it all out. But we've got a lot of work to do. Which is what makes this whole simulation,
link |
02:43:00.880
this video game, kind of fun. This Battle of Polytopia. Man, I love those games so
link |
02:43:08.080
much. That's so good. And for people who don't know, the Battle of Polytopia is,
link |
02:43:13.600
like, a really radical simplification of a civilization type of game.
link |
02:43:20.400
It still has a lot of the skill tree development, a lot of the strategy. But it's easy enough to
link |
02:43:28.480
play on a phone. Yeah. It's kind of interesting. They've really figured it out. It's one of the
link |
02:43:33.680
most elegantly designed games I've ever seen. It's incredibly complex. And yet, again,
link |
02:43:38.800
it walks that line between complexity and simplicity in this really, really great way.
link |
02:43:44.000
And they use pretty colors that hack the dopamine reward circuits in our brains very well.
link |
02:43:49.600
It's fun. Video games are so fun. Most of this life is just about fun, escaping all the suffering
link |
02:43:56.720
to find the fun. What's energy healing? I have in my notes energy healing question mark. What's
link |
02:44:02.400
that about? Oh, man. God, your audience is going to think I'm mad. So the two crazy things that
link |
02:44:12.640
happened to me, the one was the voice in the head that said, you're going to win this tournament.
link |
02:44:16.000
And then I won the tournament. The other craziest thing that's happened to me was in 2018,
link |
02:44:25.200
I started getting this weird problem in my ear where it was low frequency sound distortion
link |
02:44:35.120
where voices, particularly men's voices, became incredibly unpleasant to listen to.
link |
02:44:40.560
It would create this. It would be falsely amplified or something. And it was almost
link |
02:44:44.800
like a physical sensation in my ear, which was really unpleasant. And it would last for a few
link |
02:44:50.080
hours and then go away and then come back for a few hours and go away. And I went and got hearing
link |
02:44:54.000
tests, and they found that at the bottom end, I was losing the hearing in that ear. And in the end,
link |
02:45:02.880
I got, doctors said they think it was this thing called Meniere's disease, which is this very
link |
02:45:09.600
unpleasant disease where people basically end up losing their hearing, but they get this like,
link |
02:45:12.800
it often comes with like dizzy spells and other things because it's like the inner ear gets all
link |
02:45:17.520
messed up. Now, I don't know if that's actually what I had, but that's at least what
link |
02:45:23.360
one doctor said to me. But anyway, so I'd had three months of this stuff going on,
link |
02:45:27.440
it was really getting me down. And then I was at Burning Man of all places. I don't mean to be
link |
02:45:32.880
that person talking about Burning Man, but I was there. And again, I'd had it and I was unable
link |
02:45:38.400
to listen to music, which is not what you want, because Burning Man is a very loud, intense place.
link |
02:45:42.400
And I was just having a really rough time. And on the final night, I get talking to this girl
link |
02:45:47.840
who's like a friend of a friend. And I mentioned, I was like, oh, I'm really down in the dumps about
link |
02:45:52.160
this. And she's like, oh, well, I've done a little bit of energy healing. Would you like me to have
link |
02:45:55.760
a look? And I was like, sure. Now, again, at this point I had, you know, no time in my life for
link |
02:46:03.040
this. I didn't believe in any of this stuff. I was just like, it's all bullshit, it's all woo woo
link |
02:46:06.560
nonsense. I was like, sure, have a go. And she starts with her hand and she says, oh,
link |
02:46:14.000
there's something there. And then she leans in and she starts sucking over my ear, not actually
link |
02:46:19.280
touching me, but close to it with her mouth. And it was really unpleasant. I was like, well,
link |
02:46:24.240
can you stop? She's like, no, no, no, there's something there. I need to get it. And I was
link |
02:46:26.400
like, no, no, no, I really don't like it. Please. This is really loud. She's like, I need to just
link |
02:46:30.080
bear with me. And she does it for, I don't know how long, a few minutes. And then she eventually
link |
02:46:34.560
collapses on the ground, like freezing cold crying. And I'm just like, I don't know what the hell is
link |
02:46:42.240
going on. I mean, I'm thoroughly freaked out, as is everyone else watching, just like, what the
link |
02:46:46.000
hell? I mean, we, like, warm her up. She was really shaken up. And she's like,
link |
02:46:52.960
I don't know what that was. She said it was something very unpleasant and dark. Don't worry, it's gone.
link |
02:46:58.560
I think you'll be fine in a couple, you'll have the physical symptoms for a couple of weeks and
link |
02:47:01.360
you'll be fine. But, you know, she was just like that. You know, so I was, I was so rattled, A,
link |
02:47:07.760
because the potential that actually I had had something bad in me that made someone feel bad,
link |
02:47:12.960
and that she was scared. That was what, you know, I was like, wait, I thought you do this,
link |
02:47:17.760
this is a thing. Now you're terrified, like you pulled off some kind of exorcism or something
link |
02:47:22.400
like that. What the fuck is going on? So it was just the most insane experience. And frankly,
link |
02:47:30.400
it took me like a few months to sort of emotionally recover from it. But my ear problem went away
link |
02:47:37.760
about a couple of weeks later. And touch wood, I've not had any issues since. So
link |
02:47:44.640
that gives you like hints that maybe there's something out there.
link |
02:47:50.880
I mean, I don't, I, again, I don't have an explanation for this. The most probable explanation
link |
02:47:57.040
was, you know, I was at Burning Man. I was in a very open state. Let's just leave it at that.
link |
02:48:01.680
And, you know, placebo is an incredibly powerful and very poorly understood thing.
link |
02:48:10.240
So almost assigning the word placebo to it reduces it down to a way that
link |
02:48:14.720
it doesn't deserve to be reduced down. Maybe there's a whole science of what we call placebo.
link |
02:48:19.040
Maybe placebo is a door. Self-healing. Yeah. You know, and I mean, I don't know what the
link |
02:48:25.520
problem was. Like, I was told it was Meniere's. I don't want to say I definitely had that because
link |
02:48:29.360
I don't want people to think that, you know, if they do have that, because it's
link |
02:48:32.240
a terrible disease, that this is going to be a guaranteed way for it to be fixed
link |
02:48:35.200
for them. I don't know. And I also don't, I don't, you're absolutely right to say, like,
link |
02:48:41.200
using even the word placebo, it comes with this, like, baggage of framing. And I
link |
02:48:48.000
don't want to reduce it down. All I can do is describe the experience and what happened.
link |
02:48:52.720
I cannot put an ontological framework around it. I can't say why it happened, what the mechanism was,
link |
02:48:59.920
what the problem even was in the first place. I just know that something crazy happened.
link |
02:49:05.040
And it was while I was in an open state. And fortunately for me, it made the problem go away.
link |
02:49:09.360
But what I took away from it, again, it was part of this, you know, this took me on this
link |
02:49:14.000
journey of becoming more humble about what I think I know. Because as I said before, I was like,
link |
02:49:18.640
I was in the like Richard Dawkins train of atheism in terms of there is no God. There's
link |
02:49:23.600
everything like that is bullshit. We know everything. We know, you know, the only way we
link |
02:49:27.200
can get through, we know how medicine works, it's molecules and chemical interactions and that
link |
02:49:32.240
kind of stuff. And now it's like, okay, well, there's, there's clearly more for us to understand.
link |
02:49:40.640
And that doesn't mean that it's unscientific, either. Because, you know, the beauty of the scientific
link |
02:49:46.400
method is that it still, it still can apply to this situation. Like I don't see why, you know,
link |
02:49:51.200
I would like to try and test this experimentally. I haven't really, you know, I don't know how we
link |
02:49:56.000
would go about doing that, we'd have to find other people with the same condition, I guess, and like
link |
02:50:00.400
try and repeat the experiment. But just because something happens
link |
02:50:06.880
that's sort of out of the realms of our current understanding, it doesn't mean that
link |
02:50:10.960
the scientific method can't be used for it.
link |
02:50:13.680
Yeah, I think the scientific method sits on a foundation of those kinds of experiences.
link |
02:50:20.080
Because the scientific method is a process to carve away at the mystery all around us. And
link |
02:50:31.440
experiences like this are just a reminder that we're mostly shrouded in mystery still. That's it.
link |
02:50:37.200
It's just like a humility. Like we haven't really figured this whole thing out.
link |
02:50:40.880
But at the same time, we have found ways to act, you know, we're clearly doing something right,
link |
02:50:47.360
because think of the technological scientific advancements, the knowledge that we have,
link |
02:50:51.360
that blow people's minds even from 100 years ago.
link |
02:50:55.200
Yeah, and we've even allegedly got out to space and landed on the moon. Although I still haven't,
link |
02:51:01.120
I have not seen evidence of the earth being round, but I'm keeping an open mind.
link |
02:51:06.240
Speaking of which, you studied physics and astrophysics.
link |
02:51:15.120
Just to go to that and just to jump around through the fascinating life you've had,
link |
02:51:22.320
how did that come to be? When did you fall in love with astronomy and space and things like this?
link |
02:51:28.400
As early as I can remember. I was very lucky that my mom and my dad, but particularly my mom,
link |
02:51:34.560
my mom is, like, the most nature person. She is Mother Earth. It's the only way to describe her.
link |
02:51:41.040
She's just like Dr. Dolittle: animals flock to her and just, like, sit and look at her adoringly.
link |
02:51:46.880
And she sings.
link |
02:51:48.240
Yeah, she's just, she just is Mother Earth. And she has always been fascinated by,
link |
02:51:54.160
you know, she doesn't have any, you know, she never went to university or anything like that.
link |
02:51:57.520
She's actually phobic of maths. If I try and get her to... you know, I was trying to teach her
link |
02:52:01.440
poker and she hated it. But she's so deeply curious. And that just got instilled in me when,
link |
02:52:09.520
you know, we would sleep out under the stars whenever it was, you know, the two nights a year
link |
02:52:13.040
when it was warm enough in the UK to do that. And we'd just lie out there until we fell asleep,
link |
02:52:18.560
looking for satellites, looking for shooting stars. And I was just always,
link |
02:52:24.160
I don't know whether it was from that, but I've always naturally gravitated to like
link |
02:52:27.760
the biggest, the biggest questions. And also the like, the most layers of abstraction.
link |
02:52:34.960
I love just like, what's the meta question? What's the meta question and so on.
link |
02:52:38.640
So I think it just came from that, really. And then on top of that, physics,
link |
02:52:43.680
you know, it also made logical sense, in that it was the degree
link |
02:52:50.560
that ticked the box of, you know, answering these really big-picture questions,
link |
02:52:55.360
but it was also extremely useful. It like has a very high utility in terms of, I didn't know
link |
02:53:01.440
necessarily, I thought I was going to become like a research scientist. My original plan was,
link |
02:53:05.040
I want to be a professional astronomer. So it's not just like a philosophy degree that asks the
link |
02:53:08.960
big questions. And it's not like biology on the path to medical school or something like
link |
02:53:15.840
that, which is overly pragmatic. Not overly, it is very pragmatic. But, yeah, physics is
link |
02:53:24.160
a good combination of the two. Yeah, at least for me, it made sense. And I was good at it. I
link |
02:53:28.720
liked it. Yeah. I mean, it wasn't like I did an immense amount of soul searching to choose it or
link |
02:53:33.840
anything. It just was like this. It made the most sense. I mean, you have to make this decision
link |
02:53:39.600
in the UK at age 17, which is crazy. Because, you know, in the US, you go the first year, you do a bunch
link |
02:53:45.760
of stuff, right? And then you choose your major. I think the first few years of college, you focus
link |
02:53:50.320
on the drugs and only as you get closer to the end, do you start to think, oh, shit, this wasn't
link |
02:53:56.720
about that. And I'm, I owe the government a lot of money. How many alien civilizations are out
link |
02:54:04.800
there? When you, when you looked up at the stars with your mom, and you were counting them,
link |
02:54:10.880
what's your mom think about the number of alien civilizations? I don't know. I would imagine
link |
02:54:16.720
she would take the viewpoint of, you know, she's pretty humble. And she knows how many, she knows
link |
02:54:21.440
there's a huge number of potential spawn sites out there. So she would... Spawn sites? Spawn sites.
link |
02:54:26.640
Yeah. You know, this is our spawn site. Yeah. Spawn sites in Polytopia. We spawned on Earth.
link |
02:54:31.520
You know, it's, hmm. Yeah. Spawn sites. Why does that feel weird to say spawn? Because it makes
link |
02:54:39.760
me feel like it's, there's only one source of life and it's spawning in different locations.
link |
02:54:47.280
That's why the word spawn, because like, it feels like life that originated on Earth
link |
02:54:52.640
really originated here. Right. It is, it is unique to this particular. Yeah. I mean, but I don't,
link |
02:55:00.480
in my mind, it doesn't exclude, you know, that completely different forms of life in
link |
02:55:04.720
different biochemical soups can't also spawn. But I guess it implies that there's some spark
link |
02:55:12.400
that is, which I kind of like the idea of. And then I get to think about respawning
link |
02:55:19.040
like after it dies. Like what happens if life on Earth ends? Is it, is it going to restart again?
link |
02:55:26.000
Probably not. It depends. Maybe Earth is too. It depends on the type of, you know, what's the
link |
02:55:30.880
thing that kills it off, right? If it's a paperclip maximizer, you know, not literally,
link |
02:55:35.280
but for the example, some kind of very self-replicating thing, you know, high on the
link |
02:55:41.360
capabilities, very low on the wisdom type thing. So whether that's, you know, gray goo, green goo,
link |
02:55:47.200
you know, like nanobots or just a shitty, misaligned AI that thinks it needs to turn
link |
02:55:52.720
everything into paperclips. You know, if it's something like that, then it's going to be very
link |
02:55:58.240
hard for life, you know, complex life. Because by definition, you know, a paperclip maximizer
link |
02:56:03.040
is the ultimate instantiation of Moloch. Deeply low complexity, over optimization on a single
link |
02:56:08.320
thing, sacrificing everything else, turning the whole world into it. Although something tells me,
link |
02:56:12.000
like if we actually take a paperclip maximizer, it destroys everything. It's a really dumb system
link |
02:56:17.600
that just envelops the whole of Earth. And it expands beyond. I didn't know that part.
link |
02:56:25.440
But okay, great. So it becomes a multi planetary paperclip maximizer.
link |
02:56:30.080
Well, it just, it just propagates. I mean, it depends whether it figures out how to jump the
link |
02:56:34.000
vacuum gap. But again, I mean, this is all silly, because it's a hypothetical thought experiment,
link |
02:56:40.080
which I think doesn't actually have much practical application to the AI safety problem. But
link |
02:56:43.600
it's just a fun thing to play around with. But if by definition, it is maximally intelligent,
link |
02:56:47.600
which means it is maximally good at navigating the environment around it in order to achieve its
link |
02:56:53.760
goal, but extremely bad at choosing goals in the first place. So again, we're talking on this
link |
02:56:58.720
orthogonality thing, right? It's very low on wisdom, but very high on capability. Then it will
link |
02:57:04.080
figure out how to jump the vacuum gap between planets and stars and so on. And thus just turn
link |
02:57:08.320
every atom it gets its hands on into paperclips. Yeah, which is maximum
link |
02:57:13.200
virality, by the way. That's what virality is. It does not mean that virality is necessarily
link |
02:57:18.400
all about maximizing paperclips, but in that case, it is. So for people who don't know,
link |
02:57:22.160
this is just a thought experiment example of an AI system that has a goal and is willing to do
link |
02:57:28.640
anything to accomplish that goal, including destroying all life on earth and all human
link |
02:57:33.360
life and all of consciousness in the universe for the goal of producing a maximum number of paperclips.
link |
02:57:40.640
Okay. Or whatever optimization function it was set up with.
link |
02:57:44.880
But don't you think... It could be making, recreating Lexes. Maybe it'll tile the whole
link |
02:57:48.720
universe in Lex. Go on. I like this idea. And I'm just kidding.
link |
02:57:53.200
That's better. That's more interesting than paperclips.
link |
02:57:56.080
That could be infinitely optimal, if I do say so myself.
link |
02:57:58.000
But if you ask me, it's still a bad thing because it's permanently capping what the
link |
02:58:02.960
universe could ever be. It's an end to it.
link |
02:58:05.760
Or achieving the optimal that the universe could ever achieve. But different people
link |
02:58:10.720
have different perspectives. But don't you think within the paperclip world there would emerge,
link |
02:58:16.400
just like in the zeros and ones that make up a computer, beautiful complexities?
link |
02:58:23.680
Like, it won't suppress that. As you scale to multiple planets and throughout,
link |
02:58:30.320
there'll emerge these little worlds that on top of the fabric of maximizing paperclips,
link |
02:58:38.000
there would emerge little societies of paperclips.
link |
02:58:46.560
We're not describing a paperclip maximizer anymore. Because if you think of what a paperclip is,
link |
02:58:51.520
it is literally just a piece of bent iron. So if it's maximizing that throughout the
link |
02:58:58.480
universe, it's taking every atom it gets its hands on and somehow turning it into iron or steel
link |
02:59:04.080
and then bending it into that shape and then done and done.
link |
02:59:06.080
And by definition, like paperclips, there is no way for, well, okay, so you're saying that
link |
02:59:12.880
paperclips somehow will just emerge and create gravity or something?
link |
02:59:19.680
There's a dynamic element to the whole system. It's not just, it's creating those paperclips
link |
02:59:24.720
and the act of creating, there's going to be a process and that process will have a dance to it.
link |
02:59:30.480
Because it's not like a sequential thing. There's a whole complex three-dimensional system of
link |
02:59:35.200
paperclips. You know, people like string theory, right? It's supposed to be strings
link |
02:59:40.160
that are interacting in fascinating ways. I'm sure paperclips are very string-like; they can be
link |
02:59:44.720
interacting in very interesting ways as you scale exponentially through three-dimensional space. I mean,
link |
02:59:50.560
I'm sure the paperclip maximizer has to come up with a theory of everything. It has to create
link |
02:59:56.560
like wormholes, right? It has to break... it has to understand quantum mechanics.
link |
03:00:02.560
I love your optimism. This is where I'd say we're going into the realm of pathological optimism,
link |
03:00:08.240
whatever it is. I'm sure there'll be... I think there's an intelligence that emerges from that
link |
03:00:15.680
system. You're saying that basically intelligence is inherent in the fabric of reality and will
link |
03:00:20.800
find a way, kind of like Goldblum says, life will find a way. You think life will find a way even
link |
03:00:25.600
out of this perfectly homogenous dead soup. It's not perfectly homogenous.
link |
03:00:32.320
It's perfectly maximal in the production of paperclips. I don't know why people keep thinking it's homogenous.
link |
03:00:37.360
It maximizes the number of paperclips. That's the only thing. It's not trying to be homogenous.
link |
03:00:41.920
It's trying to maximize paperclips. So you're saying that,
link |
03:00:47.760
because, you know, kind of like in the Big Bang, it seems like, you know,
link |
03:00:51.520
things, there were clusters. There was more stuff here than there. That was enough of the
link |
03:00:55.840
patternicity that kickstarted the evolutionary process. The little weirdnesses that will make
link |
03:00:59.760
it beautiful. Even out of that, so yeah, complexity emerges.
link |
03:01:02.960
Interesting. Okay. Well, so how does that line up then with the whole
link |
03:01:06.800
heat death of the universe, right? Because that's another sort of instantiation of this. It's like
link |
03:01:10.560
everything becomes so far apart and so cold and so perfectly mixed that it's like
link |
03:01:17.920
homogenous grayness. Do you think that even out of that homogenous grayness, where there's no,
link |
03:01:23.680
you know, negative entropy, where, you know, there's no free energy that we understand,
link |
03:01:30.880
even from that, new...? Yeah. The paperclip maximizer or any other intelligent systems will figure
link |
03:01:38.720
out ways to travel to other universes to create big bangs within those universes or through black
link |
03:01:44.160
holes to create whole other worlds, to break what we consider the limitations of physics.
link |
03:01:50.320
The paperclip maximizer will find a way if a way exists, and we should be humble enough to realize that
link |
03:01:58.400
we don't know. Yeah, but it just wants to make more paperclips. So it's going to go into
link |
03:02:01.920
those universes and turn them into paperclips. Yeah. But we humans, not humans, but complex
link |
03:02:08.400
systems exist on top of that. We're not interfering with it. This complexity emerges from the simple
link |
03:02:16.720
base state. The simple base, yeah. Whether it's, you know, Planck lengths or paperclips as the base
link |
03:02:22.000
unit. Yeah. You can think of the universe as a paperclip maximizer because it's doing some
link |
03:02:28.160
dumb stuff. Like physics seems to be pretty dumb. It has like, I don't know if you can summarize it.
link |
03:02:34.080
Yeah. The laws are fairly basic and yet out of them amazing complexity emerges.
link |
03:02:39.520
And its goals seem to be pretty basic and dumb. If you can summarize its goals, I mean, I don't
link |
03:02:45.760
know what a nice way to put it is. Maybe the laws of thermodynamics could be good. I don't know if you can assign
link |
03:02:53.680
goals to physics, but if you formulate it in the sense of goals, it's very similar to
link |
03:02:59.440
paperclip maximizing in the dumbness of the goals. But the pockets of complexity as it
link |
03:03:05.680
emerges is where beauty emerges. That's where life emerges. That's where intelligence emerges. That's
link |
03:03:10.720
where humans emerge. And I think we're being very down on this whole paperclip maximizer thing.
link |
03:03:16.160
No. The reason we hate it... I think, yeah, it's because what you're saying is that you
link |
03:03:20.400
think that the force of emergence itself is another like unwritten, well, not unwritten,
link |
03:03:27.280
but like another baked-in law of reality. Yeah. And you're trusting that emergence will find a way
link |
03:03:34.720
to, even out of seemingly the most Moloch-y, awful, you know, plain outcome, emergence will still
link |
03:03:41.840
find a way. I love that as a philosophy. I think it's really nice. I would wield it carefully
link |
03:03:47.200
because there are large error bars on that, on the certainty of that.
link |
03:03:52.720
How about we build the paperclip maximizer and find out?
link |
03:03:55.840
Classic. Yeah. Moloch is doing cartwheels. Man.
link |
03:03:59.040
Yeah. But the thing is, it will destroy humans in the process, which is the thing,
link |
03:04:03.120
which is the reason we really don't like it. We seem to be really holding on to this whole
link |
03:04:08.000
human civilization thing. Would that make you sad if AI systems that are beautiful,
link |
03:04:13.680
that are conscious, that are interesting and complex and intelligent ultimately lead to the
link |
03:04:19.120
death of humans? Would that make you sad? If humans led to the death of humans, sorry.
link |
03:04:23.680
Like if they would supersede humans. Oh, if some AI...
link |
03:04:26.800
Yeah. AI would end humans. I mean, that's the reason why I'm like,
link |
03:04:32.480
in some ways, less emotionally concerned about AI risk than, say, you know,
link |
03:04:38.320
bio risk. Because at least with AI, there's a chance, you know, if we're in this hypothetical
link |
03:04:44.000
where it wipes out humans, but it does it for some like higher purpose, it needs our atoms
link |
03:04:49.040
and energy to do something. At least then the universe is going on to do something
link |
03:04:53.520
interesting. Whereas if bio risk wipes everything, you know, just kills everything on
link |
03:04:58.400
Earth. And that's it. And there's no more, you know, Earth cannot spawn anything more
link |
03:05:02.800
meaningful in the few hundred million years it has left, because it doesn't have much
link |
03:05:06.480
time left. Then, yeah, I don't know that. So one of my favorite books I've ever read is Novacene
link |
03:05:17.200
by James Lovelock, who sadly just died. He wrote it when he was like 99. He died aged 102. So it's
link |
03:05:23.280
a fairly new book. And he sort of talks about how he thinks it's, you know, sort of
link |
03:05:29.360
building off this Gaia theory, where, like, Earth is living, some form of intelligence itself,
link |
03:05:36.000
and that this is the next step, right: whatever this new intelligence
link |
03:05:41.600
that is maybe silicon-based as opposed to carbon-based goes on to do. And it's a really sort of
link |
03:05:47.040
in some ways an optimistic but really fatalistic book. I don't know if I fully subscribe to it,
link |
03:05:52.240
but it's a beautiful piece to read anyway. So am I sad by that idea? I think so, yes. And actually,
link |
03:05:58.480
yeah, this is the reason why I'm sad by the idea, because if something is truly brilliant and wise
link |
03:06:03.040
and smart and truly super intelligent, it should be able to figure out abundance. So
link |
03:06:10.480
if it figures out abundance, it shouldn't need to kill us off. It should be able to find a way for
link |
03:06:13.920
us. There's plenty. The universe is huge. There should be plenty of space for it to go
link |
03:06:19.040
out and do all the things it wants to do and like give us a little pocket where we can continue
link |
03:06:24.080
doing our things, and so on. And again, if it's so supremely wise,
link |
03:06:28.720
it shouldn't even be worried about the game theoretic considerations that by leaving us alive,
link |
03:06:32.800
we'll then go and create another like super intelligent agent that it then has to compete
link |
03:06:35.920
against, because it should be omniwise and smart enough to not have to concern itself with that.
link |
03:06:39.920
Unless it deems humans to be kind of assholes. Humans are a source of lose-lose kinds of
link |
03:06:49.840
dynamics. Well, yes and no. We're not; Moloch is. That's why I think it's important to say that.
link |
03:06:57.360
But maybe humans are the source of Moloch. No. I mean, I think game theory is the source of
link |
03:07:03.040
Moloch because Moloch exists in nonhuman systems as well. It happens within agents,
link |
03:07:09.680
within a game. It applies to agents, but it can also apply to a species that's on
link |
03:07:18.800
an island of animals. Rats outcompeting each other: the ones that massively consume all the resources are the
link |
03:07:25.520
ones that are going to win out over the more chill, socialized ones, and so that creates this
link |
03:07:30.640
Malthusian trap. Moloch exists in little pockets in nature as well. So it's not a strictly human
link |
03:07:35.760
thing.
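The dynamic being described here is essentially a tragedy of the commons, and it can be sketched in a few lines of code. This is a toy model under made-up assumptions; the growth, death, and regrowth rates below are illustrative numbers, not measured from any real population. It shows the trap: the greedy strategy wins locally even though it crashes the shared resource globally.

```python
# A toy tragedy-of-the-commons model (all rates are made-up, illustrative
# numbers): "greedy" agents convert the shared resource into offspring
# twice as fast as "chill" agents, so selection favors greed right up
# until collective demand crashes the resource everyone depends on.

def step(greedy: float, chill: float, resource: float):
    demand = 2 * greedy + 1 * chill              # greedy take 2 units, chill 1
    harvest = min(demand, resource)
    share = harvest / demand if demand else 0.0  # fraction of demand actually met

    # Reproduction scales with consumption; flat 5% death rate per step.
    greedy += 0.1 * (2 * share) * greedy - 0.05 * greedy
    chill  += 0.1 * (1 * share) * chill  - 0.05 * chill

    # Resource is consumed, then regrows logistically toward its capacity.
    resource = max(resource - harvest, 0.0)
    resource += 0.2 * resource * (1 - resource / 1000.0)
    return greedy, chill, resource

greedy, chill, resource = 10.0, 10.0, 1000.0
for t in range(201):
    if t % 40 == 0:
        print(f"t={t:3d}  greedy={greedy:9.1f}  chill={chill:8.1f}  resource={resource:7.1f}")
    greedy, chill, resource = step(greedy, chill, resource)
```

Run it and the greedy type overtakes the chill type within a few dozen steps; soon after, total demand outstrips the resource, the commons collapses, and both populations starve. No individual agent can escape by unilaterally consuming less, which is exactly the Moloch dynamic.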
link |
03:07:39.760
I wonder if it's actually a consequence of the invention of predator-prey dynamics. Maybe AI will have to kill off every organism that... Now you're talking about
link |
03:07:48.080
killing off competition. Not competition, but, just like the weeds or whatever
link |
03:07:59.920
in a beautiful flower garden or the parasites on the whole system. Now, of course, it won't do
link |
03:08:07.600
that completely. It'll put them in a zoo like we do with parasites. It'll ring-fence them. Yeah,
link |
03:08:11.360
and there'll be somebody doing a PhD where, like, they'll prod humans with a stick and see what they do.
link |
03:08:18.800
But I mean, in terms of letting us run wild outside of, you know, the geographically
link |
03:08:25.040
constrained region that it might have decided to keep us in... No, I think there's obviously
link |
03:08:32.480
the capacity for beauty and kindness and non-Moloch behavior amidst humans. So I'm pretty
link |
03:08:39.920
sure AI will preserve us. Let me... I don't know if you answered the aliens question. You had a
link |
03:08:47.280
good conversation with Toby, Toby Ord, yes, about various sides of the universe. I think, did he
link |
03:08:53.120
say... now I'm forgetting, but I think he said there's a good chance we're alone. So the classic,
link |
03:08:59.920
you know, Fermi paradox question is, there are so many spawn points. And yet, you know, it didn't
link |
03:09:09.520
take us that long to go from harnessing fire to sending out radio signals into space. So surely,
link |
03:09:16.880
given the vastness of space, you know, even if only a tiny fraction of those
link |
03:09:20.640
create life and other civilizations too, the universe should be very noisy.
link |
03:09:24.720
There should be evidence of Dyson spheres or whatever, you know, like at least radio signals
link |
03:09:29.200
and so on. But seemingly, things are very silent out there. Now, of course, it depends on who you
link |
03:09:33.360
speak to. Some people say that they're getting signals all the time and so on. And like, I don't
link |
03:09:37.040
want to make an epistemic statement on that. But it seems like there's a lot of silence. And so
link |
03:09:43.520
that raises this paradox. And then there's, you know, the Drake equation. So the Drake equation is
link |
03:09:53.200
basically just a simple thing of like trying to estimate the number of possible civilizations
link |
03:09:58.160
within the galaxy by multiplying the number of stars created per year by the number of stars
link |
03:10:02.720
that have planets, the fraction of planets that are habitable, blah, blah, blah. So all these different
link |
03:10:05.520
factors. And then you plug numbers into that. And, you know, depending on the range of
link |
03:10:10.640
your lower-bound and upper-bound point estimates that you put in, you get
link |
03:10:15.360
out a number at the end for the number of civilizations. But what Toby and his crew did
link |
03:10:21.520
differently was... Toby is a researcher at the Future of Humanity Institute.
link |
03:10:29.680
They realized that it's basically a statistical quirk that if you put in point estimates,
link |
03:10:33.760
even if you think you're putting in conservative point estimates, because on some of these variables,
link |
03:10:37.840
the uncertainty is so large, it spans maybe even a couple of hundred orders
link |
03:10:44.000
of magnitude. Putting in point estimates is always going to lead to overestimates.
link |
03:10:51.120
And so, by putting stuff on a log scale, or actually a log-log scale on
link |
03:10:56.080
some of them, they then ran the simulation across the whole bucket of uncertainty, across
link |
03:11:02.800
all of those orders of magnitude. When you do that, then actually the number comes out much,
link |
03:11:07.520
much smaller. And that's the more statistically rigorous, you know, mathematically correct way
link |
03:11:11.840
of doing the calculation. It's still a lot of hand waving. As science goes, it's
link |
03:11:16.160
definitely, you know... I don't know what a good analogy is, but it's hand-wavy. And anyway,
link |
03:11:23.840
when they did this, and then they did a Bayesian update on it as well to like factor in the fact
link |
03:11:28.240
that we're picking up no evidence, because, you know, absence of evidence is actually a form
link |
03:11:32.320
of evidence, right? And the long and short of it is that we're roughly 70% likely
link |
03:11:40.000
to be the only intelligent civilization in our galaxy thus far, and it's around 50/50 in the entire
link |
03:11:46.560
observable universe, which sounds so crazily counterintuitive, but their math is legit.
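To make the statistical point concrete, here is a minimal Monte Carlo sketch in the spirit of that approach. It is not the Future of Humanity Institute's actual model; every range below is an illustrative assumption, chosen only to show why multiplying point estimates misleads when a factor like the probability of life arising is uncertain over many orders of magnitude.

```python
# A minimal sketch (NOT the FHI paper's model; all ranges are illustrative
# assumptions): sample each Drake-equation factor from a log-uniform
# distribution spanning its uncertainty, and count how often the product N
# comes out below one civilization.
import math
import random

def log_uniform(lo: float, hi: float) -> float:
    """Sample uniformly in log10 space between lo and hi."""
    return 10 ** random.uniform(math.log10(lo), math.log10(hi))

N_SAMPLES = 100_000
alone = 0
for _ in range(N_SAMPLES):
    R_star = log_uniform(1, 100)     # stars formed per year (assumed range)
    f_p    = log_uniform(0.1, 1)     # fraction of stars with planets
    n_e    = log_uniform(0.1, 10)    # habitable planets per such star
    f_l    = log_uniform(1e-30, 1)   # P(life arises): vast uncertainty
    f_i    = log_uniform(1e-3, 1)    # P(life -> intelligence)
    f_c    = log_uniform(1e-2, 1)    # P(intelligence -> detectable tech)
    L      = log_uniform(1e2, 1e8)   # years a civilization stays detectable

    N = R_star * f_p * n_e * f_l * f_i * f_c * L
    if N < 1:                        # expected civilizations in galaxy < 1
        alone += 1

print(f"P(alone in the galaxy) ~ {alone / N_SAMPLES:.2f}")
# A classic single point estimate (e.g. plugging in f_l = 0.1 and similarly
# "plausible" values for the rest) gives N comfortably above one, yet under
# the wide distribution above, most of the probability mass sits at tiny
# values of f_l, so most samples land far below one civilization.
```

With log-uniform uncertainty on the life-arising term, the probability of being alone comes out high even though reasonable-looking point estimates multiplied together suggest a crowded galaxy. The actual paper then applies a Bayesian update on the observed silence, which pushes the probability of being alone higher still.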
link |
03:11:53.280
Well, yeah, the math around this particular equation... which is a ridiculous equation on many
link |
03:11:57.360
levels. But the powerful thing about the equation is that there are these different
link |
03:12:04.080
things, different components that can be estimated, the error bars on which can be reduced with
link |
03:12:12.320
science. And hence, since the equation came out, the error bars have been coming down
link |
03:12:18.880
on different aspects. And so it almost gives you a mission: to reduce
link |
03:12:26.160
the error bars on these estimates now over a period of time. And once you do, you can better
link |
03:12:31.600
and better understand... in the process of reducing the error bars, you'll get to understand
link |
03:12:35.680
actually what is the right way to find out where the aliens are, how many of them there are,
link |
03:12:42.880
and all those kinds of things. So I don't think it's good to use that for an estimation. I think
link |
03:12:48.000
you do have to think more from first principles, just looking at what life is on
link |
03:12:53.200
Earth, and trying to understand the very physics-based, chemistry-based, biology-based
link |
03:13:01.920
question of what is life, maybe computation-based: what the fuck is this thing? And, like, how
link |
03:13:09.440
difficult is it to create this thing? It's one way to ask, like, how many planets like this are
link |
03:13:14.720
out there, all that kind of stuff, but it feels like from our very limited knowledge perspective,
link |
03:13:21.120
the right way to think is: what is this thing, and how does it originate
link |
03:13:29.120
from very simple non-life things? How do complex life-like things emerge, from a rock
link |
03:13:39.520
to a bacterium, to a protein, and to these weird systems that encode information and pass
link |
03:13:47.120
information on, that self-replicate, and then also select each other and mutate in interesting
link |
03:13:51.920
ways such that they can adapt and evolve and build increasingly more complex systems.
link |
03:13:56.480
Right. Well, it's a form of information processing, right? Right. There's information transfer,
link |
03:14:02.800
but then also energy processing, which then results in, I guess, information processing.
link |
03:14:08.960
Maybe I'm getting bogged down. Well, it's always doing some modification, and yeah,
link |
03:14:12.480
the input is some energy. Right. It's able to extract, yeah, extract resources
link |
03:14:19.600
from its environment in order to achieve a goal. But the goal doesn't seem to be clear.
link |
03:14:24.800
Right. The goal is, well, the goal is to make more of itself.
link |
03:14:29.360
Yeah. But in a way that increases, I mean, I don't know if evolution is a fundamental law of the
link |
03:14:38.960
universe, but it seems to want to replicate itself in a way that maximizes the chance of its survival.
link |
03:14:47.600
Individual agents within an ecosystem do. Yes. Yes. Evolution itself doesn't give a fuck.
link |
03:14:53.200
Right. It's very... it doesn't care. It's just like, oh, you optimize. Well, at least, certainly,
link |
03:15:01.200
yeah, it doesn't care about the welfare of the individual agents within it, but it does seem to,
link |
03:15:06.000
I don't know. I think the mistake is that we're anthropomorphizing, to even try
link |
03:15:10.000
to give evolution a mindset. There's a really great post by Eliezer Yudkowsky on
link |
03:15:20.880
LessWrong called An Alien God. And he talks about the mistake we make when we try to
link |
03:15:28.400
think through things from an evolutionary perspective as though we're giving evolution
link |
03:15:33.520
some kind of agency, asking what it wants. Yeah, worth reading, but yeah.
link |
03:15:39.440
I would like to say that having interacted with a lot of really smart people that say that
link |
03:15:44.640
anthropomorphization is a mistake, I would like to say that saying that anthropomorphization is
link |
03:15:49.920
a mistake is a mistake. I think there's a lot of power in anthropomorphization. If I can only say
link |
03:15:55.840
that word correctly one time, I think that's actually a really powerful way to reason through
link |
03:16:01.360
things. I think people, especially people in robotics seem to run away from it as fast as
link |
03:16:05.840
possible. Can you give an example of like how it helps in robotics?
link |
03:16:13.040
Oh, that our world is a world of humans and to see robots as fundamentally just tools
link |
03:16:24.240
runs away from the fact that we live in a world, a dynamic world of humans,
link |
03:16:29.920
with all these game-theory systems we've talked about; a robot will have to
link |
03:16:36.160
interact with humans. And I don't mean like intimate friendship interaction. I mean in a
link |
03:16:41.280
factory setting where it has to deal with the uncertainty of humans, all that kind of stuff.
link |
03:16:45.440
You have to acknowledge that the robot's behavior has an effect on the human,
link |
03:16:51.920
just as much as the human has an effect on the robot. And there's a dance there.
link |
03:16:56.000
And you have to realize that, when a human sees a robot, and this is obvious with a
link |
03:17:02.000
physical manifestation of a robot, they feel a certain way. They have a fear, they have uncertainty,
link |
03:17:07.840
they have their own personal life projections. If they have pets and dogs and the thing looks
link |
03:17:13.840
like a dog, they have their own memories of what a dog is like, they have certain feelings. And
link |
03:17:18.560
that's going to be useful in a safety setting, a safety-critical setting, which is one of the
link |
03:17:22.720
most trivial settings for a robot, in terms of how to avoid any kind of dangerous situations.
link |
03:17:29.040
And a robot should really consider that in navigating its environment. And we humans are
link |
03:17:35.760
right to reason about how a robot should consider navigating its environment through
link |
03:17:41.520
anthropomorphization. I also think our brains are designed to think in human terms,
link |
03:17:51.840
like game theory, I think, is best applied in the space of human decisions.
link |
03:18:01.520
And so... Right, I mean, with things like AI...
link |
03:18:08.240
The reason I say anthropomorphization
link |
03:18:14.000
is something we need to be careful with is because there is a danger of
link |
03:18:19.360
wrongly assuming that this artificial intelligence is going to operate in any similar
link |
03:18:24.720
way to us. Because it is operating on a fundamentally different substrate, like even
link |
03:18:31.200
dogs or even mice or whatever... in some ways, anthropomorphizing them is less of a mistake,
link |
03:18:37.040
I think, than an AI, even though it's an AI we built and so on. Because at least we know that
link |
03:18:41.440
they're running on the same substrate. And they've also evolved out of the same
link |
03:18:46.080
evolutionary process. They've followed this evolution of needing to compete for resources
link |
03:18:52.320
and needing to find a mate and that kind of stuff. Whereas an AI that has just popped into
link |
03:18:57.200
an existence somewhere on a cloud server, let's say, or whatever, however it runs,
link |
03:19:02.640
and whatever... I don't know whether they have an internal experience,
link |
03:19:05.360
I don't think they necessarily do. In fact, I don't think they do. But the point is,
link |
03:19:09.200
is that to try and apply any kind of modeling of thinking through problems and decisions in
link |
03:19:15.280
the same way that we do has to be done extremely carefully because they're so alien,
link |
03:19:23.440
their method of whatever their form of thinking is. It's just so different because they've never
link |
03:19:27.600
had to evolve in the same way. Yeah, beautifully put. I was just playing devil's advocate. I do
link |
03:19:34.000
think in certain contexts, anthropomorphization is not going to hurt you. Yes. Engineers run away
link |
03:19:38.960
from it too fast. I can see that. For the most part you're right. Do you have advice for young
link |
03:19:46.320
people today, like the 17 year old that you were, of how to live life you can be proud of,
link |
03:19:55.280
how to have a career you can be proud of in this world full of Molochs?
link |
03:19:59.360
Think about the win-wins. Look for win-win situations, and be careful not to overly use
link |
03:20:09.120
your smarts to convince yourself that something is win-win when it's not. That's difficult. I don't
link |
03:20:13.440
know how to advise people on that because it's something I'm still figuring out myself. But
link |
03:20:18.800
have that as a default MO. Don't see everything as a zero-sum game. Try to find the positive
link |
03:20:27.440
sumness, and find ways to... if there doesn't seem to be one, consider playing a different game.
link |
03:20:34.160
So I would suggest that. Do not become a professional poker player. Because people always
link |
03:20:40.560
ask... they're like, oh, she's a pro, I want to do that too. Fine, you could have done it when I
link |
03:20:45.440
started out. It was a very different situation back then. Poker is a great game to learn in order
link |
03:20:52.720
to understand the ways to think. I recommend people learn it, but don't try and make a living from
link |
03:20:57.760
it these days. It's very, very difficult to the point of being impossible. Then really,
link |
03:21:07.120
really be aware of how much time you spend on your phone and on social media and really try
link |
03:21:12.960
and keep it to a minimum. Be aware that basically every moment that you spend on it is bad for you.
link |
03:21:18.000
So it doesn't mean to say you can never do it, but just have that running in the background.
link |
03:21:22.160
I'm doing a bad thing for myself right now. I think that's the general rule of thumb.
link |
03:21:28.560
Of course, about becoming a professional poker player, if there is a thing in your life
link |
03:21:32.960
that's like that and nobody can convince you otherwise, just fucking do it.
link |
03:21:40.800
Don't listen to anyone's advice. Find a thing that you can't be talked out of, too. That's a thing.
link |
03:21:47.200
I like that. You were a lead guitarist in a metal band. Did I write that down from somewhere?
link |
03:21:57.120
What did you do it for? The performing, was it the music of it? Was it just being a rock star?
link |
03:22:08.800
Why did you do it? We only ever played two gigs. It wasn't very famous or anything like that,
link |
03:22:20.480
but I was very into metal. It was my entire identity from the age of 16 to 23.
link |
03:22:29.040
What's the best metal band of all time? Don't ask me that. It's so hard to answer.
link |
03:22:33.600
I had a long argument with... I'm a guitarist, more like a classic-rock guitarist.
link |
03:22:44.640
I've had friends who are very big Pantera fans. There were often arguments about
link |
03:22:51.040
what's the better metal band: Metallica versus Pantera. This is maybe a more 90s discussion,
link |
03:22:57.120
but I was always on the side of Metallica both musically and in terms of performance and the
link |
03:23:03.200
depth of lyrics and so on. Basically, everybody was against me because if you're a true metal fan,
link |
03:23:12.240
I guess the idea goes, you can't possibly be a Metallica fan. Metallica is pop. It's like
link |
03:23:18.080
they sold out. Metallica are metal. You can't say who was the godfather of metal, but they
link |
03:23:27.040
were so groundbreaking and so brilliant. You've named literally two of my favorite bands.
link |
03:23:35.520
When you ask that question, or who are my favorites, those were two that came up.
link |
03:23:38.880
A third one is Children of Bodom, who I just think, they just tick all the boxes for me.
link |
03:23:47.280
Yeah, I don't know. Nowadays, I feel like a repulsion to the... I was that myself,
link |
03:23:55.600
like I'd be like, who do you prefer more? Come on. You have to rank them,
link |
03:23:58.800
but it's like this false zero-sumness. It's like, why? They're so additive. There's no conflict there.
link |
03:24:05.280
When people ask that kind of question about anything, movies, I feel like it's hard work
link |
03:24:11.360
and it's unfair, but you should pick one. That's actually the same kind of... It's like a fear
link |
03:24:18.240
of commitment. When people ask me, what's your favorite band, it's like... It's good to pick.
link |
03:24:24.240
Exactly. And thank you for the tough question. Yeah.
link |
03:24:27.440
Well, maybe not in a context when a lot of people are listening.
link |
03:24:30.880
Yeah, I was just like, what? Why does this matter? No, it does...
link |
03:24:35.440
Are you still into metal?
link |
03:24:37.280
Funny enough, I was listening to a bunch before I came over here.
link |
03:24:39.520
Oh, like, do you use it for motivation or get you into certain...
link |
03:24:44.480
Yeah, I was weirdly listening to 80s hair metal before I came.
link |
03:24:47.920
Does that count as metal?
link |
03:24:49.120
I think so. It's like proto-metal, and it's happy. It's optimistic, happy proto-metal.
link |
03:24:56.560
Yeah, I mean, these things, all these genres bleed into each other.
link |
03:24:59.840
But yeah, sorry to answer your question about guitar playing. My relationship with it was
link |
03:25:04.320
kind of weird in that I was deeply uncreative. My objective would be to hear some really hard
link |
03:25:11.120
technical solo and then learn it, memorize it, and then play it perfectly. But I was
link |
03:25:16.320
incapable of trying to write my own music. The idea was just absolutely terrifying.
link |
03:25:22.960
But I was also just thinking, I was like, it'd be kind of cool to actually try
link |
03:25:27.280
starting a band again, getting back into it, and writing. But it's scary.
link |
03:25:33.840
It's scary. I mean, I put out some guitar playing, just other people's covers,
link |
03:25:38.000
like I played Comfortably Numb on the internet. It's scary too. It's scary putting stuff out there.
link |
03:25:43.840
And I had this similar kind of fascination with technical playing both on piano and guitar.
link |
03:25:55.680
One of the reasons I started learning guitar is Ozzy Osbourne, the Mr. Crowley solo.
link |
03:26:01.520
And it's one of the first solos I learned. There's a beauty to it. There's a
link |
03:26:07.840
lot of beauty to it. There's some tapping, but it's just really fast. Beautiful, like arpeggios.
link |
03:26:15.600
Yeah, arpeggios. Yeah. And there's a melody that you can hear through it, but there's also a build
link |
03:26:20.720
up. It's a beautiful solo, but it's also technical, and just visually, the way it looks
link |
03:26:25.840
when a person's watching, you feel like a rock star playing it. But it ultimately has to do with
link |
03:26:31.280
technique... You're not developing the part of your brain that I think you need to generate
link |
03:26:39.520
beautiful music. It is ultimately technical in nature. And so that took me a long time to
link |
03:26:44.880
let go of that and just be able to write music myself. And that's a different journey, I think.
link |
03:26:53.040
I think of that journey as a little bit more inspired by the blues world, for example,
link |
03:26:56.960
where improvisation is more valued, obviously in jazz and so on. But I think ultimately it's a more
link |
03:27:02.960
rewarding journey because your relationship with the guitar then becomes a kind of escape
link |
03:27:11.440
from the world where you can create. I mean, creating stuff is...
link |
03:27:17.040
And it's something you work with. Because my relationship with my guitar was like, it was
link |
03:27:20.480
something to tame and defeat. Yeah, the challenge. Which was kind of what my whole personality
link |
03:27:26.160
was back then. I was just very... As I said, very competitive, very just must
link |
03:27:31.360
bend this thing to my will. Whereas writing music, it's like a dance. You work with it.
link |
03:27:37.840
But I think because of the competitive aspect, for me at least, that's still there,
link |
03:27:42.400
which creates anxiety about playing publicly or all that kind of stuff. I think there's just
link |
03:27:47.840
like a harsh self criticism within the whole thing. It's really tough.
link |
03:27:53.040
I want to hear some of your stuff. I mean, there's certain things that
link |
03:27:58.240
feel really personal. And on top of that, as we talked about poker offline, there's certain
link |
03:28:03.760
things where you get to a certain height in your life. And that doesn't have to be very high,
link |
03:28:07.040
but you get to a certain height and then you put it aside for a bit. And it's hard to return to
link |
03:28:13.040
it because you remember being good. And it's hard to... You being at a very high level in poker,
link |
03:28:20.160
it might be hard for you to return to poker every once in a while and you enjoy it knowing that
link |
03:28:24.960
you're just not as sharp as you used to be because you're not doing it every single day.
link |
03:28:29.520
That's something I always wonder with... I mean, even just like in chess with Kasparov,
link |
03:28:33.680
some of these greats, just returning to it, it's almost painful.
link |
03:28:37.040
Yes, I can... Yeah.
link |
03:28:38.240
And I feel that way with guitar too, because I used to play every day a lot. So returning to it is
link |
03:28:45.520
painful because it's like accepting the fact that this whole ride is finite and you have a prime...
link |
03:28:55.840
There's a time when you're really good and now it's over and now...
link |
03:28:58.640
We're on a different chapter of life.
link |
03:29:00.160
Yeah, it's like... But I miss that, yeah.
link |
03:29:02.800
But you can still discover joy within that process. It's been tough, especially with some level of
link |
03:29:08.560
like, as people get to know you and people film stuff, you don't have the privacy of just sharing
link |
03:29:18.480
something with a few people around you. Yeah.
link |
03:29:21.520
That's a beautiful privacy. That's a good point.
link |
03:29:24.000
With the internet, it's just disappearing.
link |
03:29:26.000
Yeah, that's a really good point.
link |
03:29:27.760
Yeah. But all those pressures aside, if you really... You can step up and still enjoy the
link |
03:29:34.160
fuck out of a good musical performance. What do you think is the meaning of this whole thing?
link |
03:29:42.160
What's the meaning of life?
link |
03:29:45.440
It's in your name, as we talked about. Do you feel the requirement to have to live up to your name?
link |
03:29:54.080
Because Liv?
link |
03:29:55.200
Yeah.
link |
03:29:56.000
No, because I don't see it. I mean, again, it's kind of like...
link |
03:30:00.320
I don't know. Because my full name is Olivia.
link |
03:30:05.200
Yeah.
link |
03:30:05.760
So I can retreat into that and be like, oh, Olivia, what does that even mean?
link |
03:30:09.920
Live up to Liv. No, I can't say I do because I've never thought of it that way.
link |
03:30:15.040
Okay. And then your name backwards is evil, as we also talked about.
link |
03:30:20.720
There's like layers of people there.
link |
03:30:22.160
I mean, I feel the urge to live up to that, to be the inverse of evil.
link |
03:30:26.240
Yeah. Or even better. Because I don't think...
link |
03:30:30.480
Is the inverse of evil good or is good something completely separate to that?
link |
03:30:34.800
I think... My intuition says it's the latter, but I don't know. Anyway, again, getting into the weeds.
link |
03:30:39.840
What is the meaning of all this?
link |
03:30:42.080
Of life.
link |
03:30:44.160
Why are we here?
link |
03:30:46.720
I think to explore, have fun and understand and make more of here and to keep the game going.
link |
03:30:55.360
Of here? More of here?
link |
03:30:57.120
More of this? Whatever this is?
link |
03:31:00.000
More of experience. Just to have more of experience and ideally the positive experience.
link |
03:31:06.400
And more... I guess, to try and put it in sort of vaguely scientific terms:
link |
03:31:15.600
Make it so that the program required, the length of code required to describe the universe
link |
03:31:21.280
is as long as possible. And highly complex and therefore interesting.
link |
03:31:27.040
Because again, I know we've banged the metaphor to death, but a universe tiled with paperclips doesn't require
link |
03:31:38.480
that much code to describe. Obviously, maybe something emerges from it. But that steady state,
link |
03:31:43.840
assuming a steady state, it's not very interesting. Whereas it seems like our universe is over time
link |
03:31:49.520
becoming more and more complex and interesting. There's so much richness and beauty and diversity
link |
03:31:53.840
on this Earth. And I want that to continue and get more. I want more diversity, in the very
link |
03:31:59.440
best sense of that word. That, to me, is the goal of all this.
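That claim about description length has a concrete flavor. Kolmogorov complexity itself is uncomputable, but compressed size is a crude, standard proxy for it, and a toy comparison (illustrative only) shows why a paperclip-tiled universe is cheap to describe:

```python
# Toy illustration of "description length", using compressed size as a
# crude stand-in for Kolmogorov complexity (which is uncomputable).
import os
import zlib

tiled  = b"paperclip" * 100_000   # a perfectly repetitive 900 KB "universe"
varied = os.urandom(900_000)      # a maximally varied blob of the same size

print(len(zlib.compress(tiled)))   # tiny: a few kilobytes at most
print(len(zlib.compress(varied)))  # ~900 KB: random data doesn't compress
```

The repetitive "universe" collapses to a tiny description, while the varied one barely compresses at all. (Pure randomness maxes out description length without being interesting, which is why the aspiration above is complexity, not noise.)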
link |
03:32:07.040
Yeah. And somehow have fun in the process.
link |
03:32:11.920
Because we do create a lot of fun things with this creative force, and all the beautiful things
link |
03:32:18.800
we create. Somehow there's like a funness to it. And perhaps that has to do with the finiteness
link |
03:32:24.480
of life, the finiteness of all these experiences, which is what makes them kind of unique.
link |
03:32:31.200
Like the fact that they end, there's this, whatever it is, falling in love or
link |
03:32:38.800
creating a piece of art or creating a bridge or creating a rocket or creating a,
link |
03:32:46.240
I don't know, just the businesses that build something or solve something. The fact that it
link |
03:32:57.200
is born and it dies somehow embeds it with fun, with joy for the people involved. I don't know
link |
03:33:08.640
what that is, the finiteness of it. It can do. Some people struggle with that,
link |
03:33:13.040
you know, I mean, a big thing I think that one has to learn is being okay with things coming
link |
03:33:19.760
to an end. And in terms of like projects and so on, right, people cling on to things beyond
link |
03:33:26.960
what they're meant to be doing, you know, beyond what is reasonable.
link |
03:33:31.840
And I'm going to have to come to terms with this podcast coming to an end.
link |
03:33:35.680
I really enjoy talking to you. I think it's obvious, as we've talked about many times,
link |
03:33:40.080
you should be doing a podcast. You should, you're already doing a lot of stuff publicly
link |
03:33:45.840
to the world, which is awesome. And you're a great educator, you're a great mind,
link |
03:33:49.040
you're a great intellect. But also, this whole medium of just talking is fun.
link |
03:33:53.600
It's a fun one. It really is good. And it's just so much fun.
link |
03:34:00.400
And you can just get into so many, yeah, there's this space to just explore and see what comes
link |
03:34:06.480
and emerges. And yeah, to understand yourself better. And if you're talking to others to
link |
03:34:10.320
understand them better, and together with them. I mean, you should do your own
link |
03:34:14.800
podcast, but you should also do a podcast with C, as we talked about. The two of you have such
link |
03:34:21.840
different minds that like melt together in just hilarious ways, fascinating ways, just
link |
03:34:27.840
the tension of ideas there is really powerful. But in general, I think you've got a beautiful
link |
03:34:32.800
voice. So thank you so much for talking today. Thank you for being a friend. Thank you for
link |
03:34:37.200
honoring me with this conversation and with your valuable time. Thanks, Liv. Thank you.
link |
03:34:42.400
Thanks for listening to this conversation with Liv Boeree. To support this podcast,
link |
03:34:46.080
please check out our sponsors in the description. And now let me leave you with some words from
link |
03:34:50.880
Richard Feynman. I think it's much more interesting to live not knowing than to have answers,
link |
03:34:57.200
which might be wrong. I have approximate answers and possible beliefs and different degrees of
link |
03:35:02.960
uncertainty about different things. But I'm not absolutely sure of anything. And there are many
link |
03:35:09.040
things I don't know anything about, such as whether it means anything to ask why we're here.
link |
03:35:15.280
I don't have to know the answer. I don't feel frightened not knowing things by being lost in
link |
03:35:21.600
a mysterious universe without any purpose, which is the way it really is, as far as I can tell.
link |
03:35:27.200
Thank you for listening and hope to see you next time.