
Jaron Lanier: Virtual Reality, Social Media & the Future of Humans and AI | Lex Fridman Podcast #218



link |
00:00:00.000
The following is a conversation with Jaron Lanier,
link |
00:00:03.060
a computer scientist, visual artist, philosopher,
link |
00:00:06.200
writer, futurist, musician,
link |
00:00:08.080
and the founder of the field of virtual reality.
link |
00:00:11.680
To support this podcast,
link |
00:00:12.800
please check out our sponsors in the description.
link |
00:00:15.880
As a side note, you may know
link |
00:00:17.480
that Jaron is a staunch critic of social media platforms.
link |
00:00:20.720
He and I agree on many aspects of this,
link |
00:00:23.760
except perhaps I am more optimistic
link |
00:00:26.360
about it being possible to build better platforms.
link |
00:00:29.700
And better artificial intelligence systems
link |
00:00:32.200
that put long-term interests
link |
00:00:33.880
and happiness of human beings first.
link |
00:00:36.640
Let me also say a general comment
link |
00:00:38.480
about these conversations.
link |
00:00:40.240
I try to make sure I prepare well,
link |
00:00:42.520
remove my ego from the picture,
link |
00:00:44.400
and focus on making the other person shine
link |
00:00:47.200
as we try to explore the most beautiful
link |
00:00:49.140
and insightful ideas in their mind.
link |
00:00:51.800
This can be challenging
link |
00:00:53.320
when the ideas that are close to my heart
link |
00:00:55.280
are being criticized.
link |
00:00:57.200
In those cases, I do offer a little pushback,
link |
00:00:59.960
but respectfully, and then move on,
link |
00:01:02.720
trying to have the other person come out
link |
00:01:04.320
looking wiser in the exchange.
link |
00:01:06.640
I think there's no such thing as winning
link |
00:01:08.560
in conversations nor in life.
link |
00:01:11.600
My goal is to learn and to have fun.
link |
00:01:14.040
I ask that you don't see my approach
link |
00:01:15.880
to these conversations as weakness.
link |
00:01:18.000
It is not.
link |
00:01:19.240
It is my attempt at showing respect
link |
00:01:21.560
and love for the other person.
link |
00:01:24.160
That said, I also often just do a bad job of talking,
link |
00:01:28.560
but you probably already knew that.
link |
00:01:30.840
So please give me a pass on that as well.
link |
00:01:33.480
This is the Lex Fridman Podcast,
link |
00:01:35.560
and here is my conversation with Jaron Lanier.
link |
00:01:39.560
You're considered the founding father of virtual reality.
link |
00:01:44.640
Do you think we will one day spend most
link |
00:01:47.360
or all of our lives in virtual reality worlds?
link |
00:01:51.580
I have always found the very most valuable moment
link |
00:01:56.420
in virtual reality to be the moment
link |
00:01:58.380
when you take off the headset and your senses are refreshed
link |
00:02:01.220
and you perceive physicality afresh,
link |
00:02:05.180
as if you were a newborn baby,
link |
00:02:07.020
but with a little more experience.
link |
00:02:08.980
So you can really notice just how incredibly strange
link |
00:02:13.740
and delicate and peculiar and impossible the real world is.
link |
00:02:18.740
So the magic is, and perhaps forever,
link |
00:02:21.860
will be in the physical world?
link |
00:02:23.700
Well, that's my take on it.
link |
00:02:25.340
That's just me.
link |
00:02:26.180
I mean, I think I don't get to tell everybody else
link |
00:02:29.580
how to think or how to experience virtuality.
link |
00:02:32.180
And at this point,
link |
00:02:33.020
there have been multiple generations of younger people
link |
00:02:36.460
who've come along and liberated me
link |
00:02:39.580
from having to worry about these things.
link |
00:02:41.980
But I should say also, even in what some,
link |
00:02:45.940
well, I called it mixed reality back in the day.
link |
00:02:48.820
In these days, it's called augmented reality,
link |
00:02:51.620
but with something like a HoloLens,
link |
00:02:53.580
even then, like one of my favorite things
link |
00:02:56.500
is to augment a forest,
link |
00:02:57.860
not because I think the forest needs augmentation,
link |
00:03:00.220
but when you look at the augmentation next to a real tree,
link |
00:03:04.100
the real tree just pops out as being astounding.
link |
00:03:07.220
It's interactive, it's changing slightly all the time
link |
00:03:12.220
if you pay attention,
link |
00:03:13.060
and it's hard to pay attention to that
link |
00:03:15.060
but when you compare to virtuality, all of a sudden you do.
link |
00:03:18.500
And even in practical applications,
link |
00:03:20.740
my favorite early application of virtuality,
link |
00:03:24.700
which we prototyped going back to the '80s
link |
00:03:27.380
when I was working with Dr. Joe Rosen at Stanford Med
link |
00:03:30.660
near where we are now,
link |
00:03:32.660
we made the first surgical simulator.
link |
00:03:34.940
And to go from the fake anatomy of the simulation,
link |
00:03:39.940
which is incredibly valuable for many things,
link |
00:03:42.500
for designing procedures,
link |
00:03:43.700
for training for all kinds of things,
link |
00:03:45.180
then to go to the real person,
link |
00:03:47.100
boy, it's really something like,
link |
00:03:49.700
surgeons really get woken up by that transition.
link |
00:03:52.740
It's very cool.
link |
00:03:53.580
So I think the transition is actually more valuable
link |
00:03:55.580
than the simulation.
link |
00:03:57.980
That's fascinating, I never really thought about that.
link |
00:04:01.180
It's almost, it's like traveling elsewhere
link |
00:04:05.100
in the physical space can help you appreciate
link |
00:04:07.700
how much you value your home once you return.
link |
00:04:11.380
Well, that's how I take it.
link |
00:04:13.100
I mean, once again,
link |
00:04:14.820
people have different attitudes towards it.
link |
00:04:17.140
All are welcome.
link |
00:04:18.340
What do you think is the difference
link |
00:04:19.940
between the virtual world and the physical meat space world
link |
00:04:23.380
that you are still drawn,
link |
00:04:25.460
for you personally, still drawn to the physical world?
link |
00:04:28.500
Like there's clearly, then, a distinction.
link |
00:04:31.180
Is there some fundamental distinction
link |
00:04:32.780
or is it the peculiarities of the current set of technology?
link |
00:04:37.100
In terms of the kind of virtual reality that we have now,
link |
00:04:41.060
it's made of software and software is terrible stuff.
link |
00:04:44.740
Software is always the slave of its own history,
link |
00:04:49.100
its own legacy.
link |
00:04:50.780
It's always infinitely arbitrarily messy and arbitrary.
link |
00:04:56.300
Working with it brings out
link |
00:04:58.300
a certain kind of nerdy personality in people,
link |
00:05:00.300
or at least in me, which I'm not that fond of.
link |
00:05:04.300
And there are all kinds of things about software I don't like.
link |
00:05:07.620
And so that's different from the physical world.
link |
00:05:09.940
It's not something we understand as you just pointed out.
link |
00:05:13.540
On the other hand,
link |
00:05:14.740
I'm a little mystified when people ask me,
link |
00:05:16.740
well, do you think the universe is a computer?
link |
00:05:19.820
And I have to say, well, I mean,
link |
00:05:23.100
what on earth could you possibly mean
link |
00:05:24.740
if you say it isn't a computer?
link |
00:05:26.700
If it isn't a computer,
link |
00:05:28.580
it wouldn't follow principles consistently
link |
00:05:32.540
and it wouldn't be intelligible
link |
00:05:34.260
because what else is a computer ultimately?
link |
00:05:37.140
And we have physics, we have technology,
link |
00:05:40.140
so we can do technology, so we can program it.
link |
00:05:42.620
So, I mean, of course it's some kind of computer,
link |
00:05:44.820
but I think trying to understand it as a Turing machine
link |
00:05:48.140
is probably a foolish approach.
link |
00:05:51.620
Right, that's the question.
link |
00:05:53.540
Whether this computer we call the universe
link |
00:05:57.540
performs the kind of computation that can be modeled
link |
00:06:00.220
as a universal Turing machine,
link |
00:06:02.780
or is it something much more fancy,
link |
00:06:05.620
so fancy, in fact,
link |
00:06:08.540
that it may be beyond our cognitive capabilities
link |
00:06:11.020
to understand?
link |
00:06:12.660
Turing machines are kind of,
link |
00:06:16.380
I call them teases in a way,
link |
00:06:18.660
because if you have an infinitely smart programmer
link |
00:06:23.180
with an infinite amount of time,
link |
00:06:24.660
an infinite amount of memory,
link |
00:06:25.900
and an infinite clock speed,
link |
00:06:28.260
then they're universal, but that cannot exist.
link |
00:06:31.380
So they're not universal in practice,
link |
00:06:33.140
and they actually are, in practice,
link |
00:06:36.260
a very particular sort of machine
link |
00:06:38.260
within the constraints,
link |
00:06:40.660
within the conservation principles of any reality
link |
00:06:44.020
that's worth being in, probably.
link |
00:06:46.460
And so, I think universality of a particular model
link |
00:06:55.660
is probably a deceptive way to think,
link |
00:06:58.540
even though at some sort of limit,
link |
00:07:00.740
of course, something like that's gotta be true
link |
00:07:05.100
at some sort of high enough limit,
link |
00:07:07.380
but it's just not accessible to us, so what's the point?
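A concrete way to ground this: any Turing machine you can actually run is a finite object, with a bounded tape and a finite step budget; universality only appears in the idealized limit of unbounded time and memory. Here is a minimal, illustrative simulator in Python; the toy machine (a unary incrementer) is a made-up example, not anything from the conversation.

```python
# Minimal Turing machine simulator, illustrative sketch only.
# Any instance we can actually run has a finite tape and a finite
# step budget: the practical, non-universal case described above.

def run_tm(rules, tape, state="start", blank="_", max_steps=1000):
    """rules maps (state, symbol) -> (new_state, write_symbol, move)."""
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):      # finite clock: no unbounded runs
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break                   # halt: no applicable rule
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1))

# Toy machine (hypothetical example): append one '1' to a unary number.
rules = {
    ("start", "1"): ("start", "1", "R"),  # scan past existing 1s
    ("start", "_"): ("done", "1", "R"),   # write one more 1, then halt
}
print(run_tm(rules, "111"))  # -> 1111
```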
link |
00:07:10.460
Well, to me, the question of whether we're living
link |
00:07:12.980
inside a computer or a simulation
link |
00:07:15.460
is interesting in the following way.
link |
00:07:18.500
There's a technical question here.
link |
00:07:20.940
How difficult is it to build a machine
link |
00:07:25.860
not that simulates the universe,
link |
00:07:28.380
but that makes it sufficiently realistic
link |
00:07:31.500
that we wouldn't know the difference,
link |
00:07:33.460
or better yet, sufficiently realistic
link |
00:07:36.220
that we would kind of know the difference,
link |
00:07:37.860
but we would prefer to stay in the virtual world anyway?
link |
00:07:41.020
I wanna give you a few different answers.
link |
00:07:42.460
I wanna give you the one that I think
link |
00:07:43.980
has the most practical importance
link |
00:07:45.860
to human beings right now,
link |
00:07:47.460
which is that there's a kind of an assertion
link |
00:07:51.580
sort of built into the way the question's usually asked
link |
00:07:54.260
that I think is false,
link |
00:07:55.700
which is a suggestion that people have a fixed level
link |
00:07:59.500
of ability to perceive reality in a given way.
link |
00:08:03.100
And actually, people are always learning, evolving,
link |
00:08:08.140
forming themselves.
link |
00:08:09.100
We're fluid too.
link |
00:08:10.380
We're also programmable, self-programmable,
link |
00:08:13.700
changing, adapting, and so my favorite way to get at this
link |
00:08:18.740
is to talk about the history of other media.
link |
00:08:20.980
So for instance, there was a peer-reviewed paper
link |
00:08:23.580
that showed that an early wire recorder
link |
00:08:26.420
playing back an opera singer behind a curtain
link |
00:08:28.580
was indistinguishable from a real opera singer.
link |
00:08:31.260
And so now, of course, to us,
link |
00:08:32.380
it would not only be distinguishable,
link |
00:08:34.140
but it would be very blatant
link |
00:08:35.460
because the recording would be horrible,
link |
00:08:37.620
but to the people at the time,
link |
00:08:39.140
without the experience of it, it seemed plausible.
link |
00:08:43.740
There was an early demonstration
link |
00:08:46.300
of extremely crude video teleconferencing
link |
00:08:49.540
between New York and DC in the 30s,
link |
00:08:53.420
I think, that people viewed
link |
00:08:55.100
as being absolutely realistic and indistinguishable,
link |
00:08:57.260
which to us would be horrible.
link |
00:08:59.820
And there are many other examples.
link |
00:09:00.940
Another one, one of my favorite ones
link |
00:09:02.180
is in the Civil War era,
link |
00:09:04.140
there were itinerant photographers
link |
00:09:06.100
who collected photographs of people
link |
00:09:07.740
who just looked kind of like a few archetypes.
link |
00:09:10.660
So you could buy a photo of somebody
link |
00:09:12.420
who looked kind of like your loved one
link |
00:09:15.420
to remind you of that person
link |
00:09:17.060
because actually photographing them was inconceivable
link |
00:09:20.500
and hiring a painter was too expensive
link |
00:09:22.340
and you didn't have any way for the painter
link |
00:09:23.860
to represent them remotely anyway.
link |
00:09:25.420
How would they even know what they looked like?
link |
00:09:27.620
So these are all great examples
link |
00:09:29.540
of how in the early days of different media,
link |
00:09:32.260
we perceived the media as being really great,
link |
00:09:34.260
but then we evolved through the experience of the media.
link |
00:09:37.740
This gets back to what I was saying,
link |
00:09:38.780
maybe the greatest gift of photography
link |
00:09:40.740
is that we can see the flaws in a photograph
link |
00:09:42.620
and appreciate reality more.
link |
00:09:44.460
Maybe the greatest gift of audio recording
link |
00:09:46.700
is that we can distinguish that opera singer now
link |
00:09:49.900
from that recording of the opera singer
link |
00:09:52.500
on the horrible wire recorder.
link |
00:09:53.740
So we shouldn't limit ourselves
link |
00:09:57.420
by some assumption of stasis that's incorrect.
link |
00:10:01.220
So that's my first answer,
link |
00:10:03.780
which is I think the most important one.
link |
00:10:05.260
Now, of course, somebody might come back and say,
link |
00:10:07.260
oh, but technology can go so far,
link |
00:10:09.300
there must be some point at which it would surpass.
link |
00:10:11.500
That's a different question.
link |
00:10:13.060
I think that's also an interesting question,
link |
00:10:14.700
but I think the answer I just gave you
link |
00:10:16.020
is actually the more important answer
link |
00:10:17.540
to the more important question.
link |
00:10:18.900
That's profound, yeah.
link |
00:10:20.260
But can you, the second question,
link |
00:10:23.140
which you're now making me realize is way different.
link |
00:10:26.820
Is it possible to create worlds
link |
00:10:28.500
in which people would want to stay
link |
00:10:31.580
instead of the real world?
link |
00:10:32.900
Well, like en masse,
link |
00:10:35.740
like large numbers of people.
link |
00:10:38.220
What I hope is, as I said before,
link |
00:10:41.340
I hope that the experience of virtual worlds
link |
00:10:44.260
helps people appreciate this physical world
link |
00:10:48.740
we have and feel tender towards it
link |
00:10:51.740
and keep it from getting too fucked up.
link |
00:10:54.580
That's my hope.
link |
00:10:56.980
Do you see all technology in that way?
link |
00:10:58.740
So basically, technology helps us appreciate
link |
00:11:02.820
the more sort of technology-free aspect of life.
link |
00:11:08.220
Well, media technology.
link |
00:11:10.740
You know, I mean, you can stretch that.
link |
00:11:13.460
I mean, you can, let me say,
link |
00:11:15.260
I could definitely play McLuhan
link |
00:11:17.340
and turn this into a general theory.
link |
00:11:19.060
It's totally doable.
link |
00:11:20.060
The program you just described is totally doable.
link |
00:11:23.180
In fact, I will psychically predict
link |
00:11:25.100
that if you did the research,
link |
00:11:26.100
you could find 20 PhD theses that do that already.
link |
00:11:29.260
I don't know, but they might exist.
link |
00:11:31.300
But I don't know how much value there is
link |
00:11:34.900
in pushing a particular idea that far.
link |
00:11:38.780
Claiming that reality isn't a computer,
link |
00:11:40.980
in some sense, seems incoherent to me
link |
00:11:42.780
because we can program it.
link |
00:11:44.860
We have technology.
link |
00:11:46.220
It has, it seems to obey physical laws.
link |
00:11:48.820
What more do you want from it to be a computer?
link |
00:11:50.700
I mean, it's a computer of some kind.
link |
00:11:52.180
We don't know exactly what kind.
link |
00:11:53.500
We might not know how to think about it.
link |
00:11:54.820
We're working on it.
link |
00:11:56.420
Sorry to digress, but you're absolutely right.
link |
00:11:59.140
Like that's my fascination with the AI as well.
link |
00:12:01.900
It helps, in the case of AI,
link |
00:12:05.180
I see it as a set of techniques
link |
00:12:07.260
that help us understand ourselves, understand us humans.
link |
00:12:10.140
In the same way, virtual reality,
link |
00:12:12.380
and you're putting it brilliantly,
link |
00:12:14.340
it's a way to help us understand reality.
link |
00:12:18.900
To appreciate and open our eyes more richly to reality.
link |
00:12:23.740
That's certainly how I see it.
link |
00:12:26.100
And I wish people who become incredibly fascinated,
link |
00:12:29.860
who go down the rabbit hole of the different fascinations
link |
00:12:33.940
with whether we're in a simulation or not,
link |
00:12:35.900
or there's a whole world of variations on that.
link |
00:12:40.540
I wish they'd step back and think about their own motivations
link |
00:12:42.900
and exactly what they mean, you know?
link |
00:12:45.780
And I think the danger with these things is,
link |
00:12:52.860
so if you say, is the universe some kind of computer broadly?
link |
00:12:56.340
It has to be,
link |
00:12:57.180
because it's not coherent to say that it isn't.
link |
00:12:59.820
On the other hand, to say that that means,
link |
00:13:02.260
you know, anything about what kind of computer,
link |
00:13:05.140
that's something very different.
link |
00:13:06.340
And the same thing is true for the brain.
link |
00:13:07.900
The same thing is true for anything
link |
00:13:10.460
where you might use computational metaphors.
link |
00:13:12.060
Like we have to have a bit of modesty about where we stand.
link |
00:13:14.940
And the problem I have with these framings of computation
link |
00:13:19.340
as these ultimate cosmic questions
link |
00:13:21.060
is that it has a way of getting people to pretend
link |
00:13:23.940
they know more than they do.
link |
00:13:25.340
Can you maybe, this is a therapy session.
link |
00:13:28.180
Can you just listen to me for a second?
link |
00:13:30.380
I really like the Elder Scrolls series.
link |
00:13:32.260
It's a role-playing game, Skyrim, for example.
link |
00:13:36.780
Why do I enjoy so deeply just walking around that world?
link |
00:13:42.340
And then there's people you could talk to
link |
00:13:45.140
and you can just like, it's an escape.
link |
00:13:48.060
But you know, my life is awesome.
link |
00:13:49.820
I'm truly happy.
link |
00:13:51.300
But I also am happy with the music that's playing
link |
00:13:55.140
and the mountains and carrying around a sword
link |
00:13:59.220
and just that, I don't know what that is.
link |
00:14:02.380
It's very pleasant though to go there.
link |
00:14:04.620
And I miss it sometimes.
link |
00:14:06.540
I think it's wonderful to love artistic creations.
link |
00:14:12.380
It's wonderful to love contact with other people.
link |
00:14:15.980
It's wonderful to love play and ongoing, evolving meaning
link |
00:14:22.300
and patterns with other people.
link |
00:14:24.140
I think it's a good thing.
link |
00:14:27.660
You know, I'm not like anti-tech
link |
00:14:31.860
and I'm certainly not anti-digital tech.
link |
00:14:34.420
I'm anti, as everybody knows by now,
link |
00:14:37.260
I think the, you know, manipulative economy
link |
00:14:40.500
of social media is making everybody nuts and all that.
link |
00:14:42.380
So I'm anti that stuff.
link |
00:14:43.980
But the core of it, of course, I worked for many, many years
link |
00:14:47.620
on trying to make that stuff happen
link |
00:14:49.180
because I think it can be beautiful.
link |
00:14:51.020
Like I don't like, why not?
link |
00:14:54.620
You know, and by the way, there's a thing about humans
link |
00:14:59.140
which is we're problematic, any kind of social interaction
link |
00:15:06.980
with other people is going to have its problems.
link |
00:15:10.140
People are political and tricky.
link |
00:15:13.980
And like, I love classical music,
link |
00:15:16.220
but when you actually go to a classical music thing
link |
00:15:18.540
and it turns out, oh, actually this is like a backroom power
link |
00:15:21.020
deal kind of place and a big status ritual as well.
link |
00:15:24.180
And that's kind of not as fun.
link |
00:15:27.860
That's part of the package.
link |
00:15:28.980
And the thing is it's always going to be.
link |
00:15:30.660
There's always going to be a mix of things.
link |
00:15:34.700
I don't think the search for purity is going to get you
link |
00:15:39.780
anywhere, so I'm not worried about that.
link |
00:15:42.300
I worry about the really bad cases
link |
00:15:44.500
where we're becoming, where we're making ourselves crazy
link |
00:15:47.460
or cruel enough that we might not survive.
link |
00:15:49.220
And I think, you know, the social media criticism
link |
00:15:52.060
rises to that level.
link |
00:15:53.420
But I'm glad you enjoy it.
link |
00:15:54.900
I think it's great.
link |
00:15:57.340
And I like that you basically say that every experience
link |
00:16:00.060
has both beauty and darkness as in with classical music.
link |
00:16:03.620
I also play classical piano, so I appreciate it very much.
link |
00:16:07.140
But it's interesting.
link |
00:16:08.020
I mean, even in the darkest, Man's Search
link |
00:16:11.220
for Meaning with Viktor Frankl in the concentration camps,
link |
00:16:15.780
even there, there's opportunity to discover beauty.
link |
00:16:20.860
And so that's the interesting thing about humans
link |
00:16:25.060
is the capacity to discover beauty in the darkest
link |
00:16:28.580
moments.
link |
00:16:29.100
But there's always the dark parts, too.
link |
00:16:31.580
Well, I mean, our situation is structurally difficult.
link |
00:16:37.020
We are structurally difficult?
link |
00:16:40.740
No, it is.
link |
00:16:41.260
It's true.
link |
00:16:42.140
We perceive socially.
link |
00:16:43.540
We depend on each other for our sense of place
link |
00:16:48.700
and perception of the world.
link |
00:16:50.780
I mean, we're dependent on each other.
link |
00:16:52.340
And yet there's also a degree in which we inevitably
link |
00:16:58.780
let each other down.
link |
00:17:01.060
We are set up to be competitive as well as supportive.
link |
00:17:05.180
I mean, our fundamental situation
link |
00:17:08.300
is complicated and challenging.
link |
00:17:10.620
And I wouldn't have it any other way.
link |
00:17:13.540
OK, let's talk about one of the most challenging things.
link |
00:17:17.060
One of the things I, unfortunately,
link |
00:17:19.100
am very afraid of, being human, allegedly.
link |
00:17:23.420
You wrote an essay on death and consciousness,
link |
00:17:26.300
in which you write, quote: certainly the fear of death
link |
00:17:29.980
has been one of the greatest driving forces
link |
00:17:31.980
in the history of thought and in the formation
link |
00:17:35.180
of the character of civilization.
link |
00:17:37.340
And yet it is underacknowledged.
link |
00:17:39.780
The great book on the subject, The Denial of Death
link |
00:17:42.300
by Ernest Becker deserves a reconsideration.
link |
00:17:45.180
I'm Russian, so I have to ask you about this.
link |
00:17:47.100
What's the role of death in life?
link |
00:17:48.860
See, you would have enjoyed coming to our house,
link |
00:17:51.620
because my wife is Russian.
link |
00:17:54.060
And we also have a piano of such spectacular qualities.
link |
00:17:58.420
You would have freaked out.
link |
00:18:00.740
But anyway, we'll let all that go.
link |
00:18:04.220
So the context in which I remember that essay,
link |
00:18:08.980
sort of, this was from maybe the 90s or something.
link |
00:18:12.020
And I used to publish in a journal called
link |
00:18:16.100
The Journal of Consciousness Studies,
link |
00:18:17.660
because I was interested in these endless debates
link |
00:18:21.140
about consciousness and science,
link |
00:18:24.500
which certainly continue today.
link |
00:18:28.380
And I was interested in how the fear of death
link |
00:18:35.380
and the denial of death played into different
link |
00:18:42.900
philosophical approaches to consciousness.
link |
00:18:45.700
Because I think on the one hand, the sort of sentimental school
link |
00:18:58.740
of dualism, meaning the feeling that there's something
link |
00:19:01.340
apart from the physical brain, some kind of soul
link |
00:19:04.340
or something else, is obviously motivated,
link |
00:19:07.620
a hope that whatever that is will survive death and continue.
link |
00:19:11.580
And that's a very core aspect of a lot of the world
link |
00:19:14.940
religions, not all of them, not really, but most of them.
link |
00:19:21.220
The thing I noticed is that the opposite of those,
link |
00:19:26.900
which might be the sort of hardcore, no,
link |
00:19:29.220
the brain's a computer and that's it,
link |
00:19:31.300
in a sense, we're motivated in the same way
link |
00:19:36.300
with a remarkably similar chain of arguments,
link |
00:19:40.700
which is, no, the brain's a computer
link |
00:19:43.740
and I'm going to figure it out in my lifetime
link |
00:19:45.580
and upload it, upload myself and I'll live forever.
link |
00:19:48.180
That's interesting.
link |
00:19:50.540
Yeah, that's like the implied thought, right?
link |
00:19:53.540
Yeah, and so it's kind of this, in a funny way,
link |
00:19:56.700
it's the same thing.
link |
00:19:58.460
It's peculiar to notice that these people who would appear
link |
00:20:07.060
to be opposites in character and cultural references
link |
00:20:11.500
and their ideas actually are remarkably similar.
link |
00:20:16.580
And to an incredible degree, the sort of hardcore computationalist
link |
00:20:23.580
idea about the brain has turned into medieval Christianity
link |
00:20:28.900
with together, like there's the people who are afraid
link |
00:20:31.420
that if you have the wrong thought,
link |
00:20:32.620
you'll piss off the super AIs of the future
link |
00:20:34.660
who will come back and zap you and all that stuff.
link |
00:20:38.540
It's really turned into medieval Christianity
link |
00:20:41.700
all over again.
link |
00:20:43.060
So Ernest Becker's idea that death, the fear of death
link |
00:20:47.580
is the worm at the core, which is like that's
link |
00:20:52.180
the core motivator of everything we see humans have created.
link |
00:20:57.020
The question is if that fear of mortality is somehow
link |
00:20:59.740
core, is like a prerequisite to consciousness. You just moved
link |
00:21:06.460
across this vast cultural chasm that separates me
link |
00:21:11.540
from most of my colleagues in a way.
link |
00:21:13.180
And I can't answer what you just said on that level
link |
00:21:15.460
without this huge deconstruction.
link |
00:21:17.660
Yes.
link |
00:21:18.140
Should I do it?
link |
00:21:18.900
Yes, what's the chasm?
link |
00:21:20.220
OK.
link |
00:21:21.300
Let us travel across this vast chasm.
link |
00:21:23.100
OK, I don't believe in AI.
link |
00:21:24.900
I don't think there's any AI.
link |
00:21:26.140
There's just algorithms.
link |
00:21:27.220
We make them.
link |
00:21:27.780
We control them.
link |
00:21:28.380
Now, they're tools.
link |
00:21:29.860
They're not creatures.
link |
00:21:30.700
Now, this is something that rubs a lot of people the wrong way.
link |
00:21:34.660
And don't I know it?
link |
00:21:36.060
When I was young, my main mentor was Marvin
link |
00:21:38.700
Minsky, who's the principal author of the computer
link |
00:21:43.340
as creature rhetoric that we still use.
link |
00:21:46.940
He was the first person to have the idea at all.
link |
00:21:48.780
But he certainly populated the AI culture
link |
00:21:52.900
with most of its tropes, I would say.
link |
00:21:55.300
Because a lot of the stuff people will say,
link |
00:21:56.940
oh, did you hear this new idea about AI?
link |
00:21:58.580
And I'm like, yeah, I heard it in 1978.
link |
00:22:00.340
Sure, yeah, I remember that.
link |
00:22:01.900
So Marvin was really the person.
link |
00:22:03.620
And Marvin and I used to argue all the time about this stuff
link |
00:22:08.460
because I always rejected it.
link |
00:22:10.260
And of all of his, I wasn't formally his student,
link |
00:22:17.740
but I worked for him as a researcher.
link |
00:22:19.780
But of all of his students and student-
link |
00:22:22.220
like people, his young adoptees,
link |
00:22:26.620
I think I was the one who argued with him about this stuff
link |
00:22:29.420
in particular.
link |
00:22:30.100
And he loved it.
link |
00:22:31.140
Yeah, I would have loved to hear that conversation.
link |
00:22:33.180
It was fun.
link |
00:22:34.100
Did you ever converge to a place?
link |
00:22:36.660
Oh, no, no.
link |
00:22:37.420
So the very last time I saw him, he was quite frail.
link |
00:22:40.220
And I was in Boston.
link |
00:22:44.100
And I was going to the old house in Brookline,
link |
00:22:45.740
his amazing house.
link |
00:22:47.340
And one of our mutual friends said,
link |
00:22:49.100
hey, listen, Marvin's so frail.
link |
00:22:51.980
Don't do the argument with him.
link |
00:22:54.020
Don't argue about AI.
link |
00:22:56.380
And so I said, but Marvin loves that.
link |
00:22:58.820
And so I showed up.
link |
00:23:00.020
And he was frail.
link |
00:23:01.540
He looked up and he said, are you ready to argue?
link |
00:23:08.140
He's such an amazing person for that.
link |
00:23:10.220
So it's hard to summarize this because it's decades of stuff.
link |
00:23:16.180
The first thing to say is that nobody
link |
00:23:19.100
can claim absolute knowledge about whether somebody
link |
00:23:23.140
or something else is conscious or not.
link |
00:23:25.900
This is all a matter of faith.
link |
00:23:27.740
And in fact, I think the whole idea of faith
link |
00:23:31.780
needs to be updated.
link |
00:23:32.900
So it's not about God, but it's just about stuff in the universe.
link |
00:23:36.180
We have faith in each other being conscious.
link |
00:23:39.340
And then I used to frame this as a thing called
link |
00:23:42.700
the circle of empathy in my old papers.
link |
00:23:45.300
And then it turned into a thing for the animal rights movement.
link |
00:23:49.020
So I noticed Peter Singer using it.
link |
00:23:50.460
I don't know if it was a coincidence or not, but anyway,
link |
00:23:53.780
there's this idea that you draw a circle around yourself
link |
00:23:56.140
and the stuff inside is more like you, might be conscious,
link |
00:23:59.220
might be deserving of your empathy, of your consideration,
link |
00:24:02.180
and the stuff outside the circle isn't.
link |
00:24:04.300
And outside the circle might be a rock or I don't know.
link |
00:24:12.700
And that circle is fundamentally based on faith.
link |
00:24:15.460
Well, your faith in what is and what isn't.
link |
00:24:17.940
The thing about this circle is it can't be pure faith.
link |
00:24:21.380
It's also a pragmatic decision.
link |
00:24:23.820
And this is where things get complicated.
link |
00:24:26.020
If you try to make it too big, you suffer from incompetence.
link |
00:24:29.900
If you say, I don't want to kill a bacteria,
link |
00:24:33.300
I will not brush my teeth.
link |
00:24:34.540
I don't know, what do you do?
link |
00:24:35.980
Like there's a competence question
link |
00:24:39.100
where you do have to draw the line.
link |
00:24:40.980
People who make it too small become cruel.
link |
00:24:44.380
People are so clannish and political
link |
00:24:46.380
and so worried about themselves ending up
link |
00:24:48.620
on the bottom of society that they are always
link |
00:24:52.140
ready to gang up on some designated group.
link |
00:24:54.180
And so there's always this,
link |
00:24:56.220
We're always trying to shove somebody out of the circle.
link |
00:24:58.820
And so aren't you shoving AI outside the circle?
link |
00:25:01.540
Well, give me a second.
link |
00:25:02.220
All right.
link |
00:25:02.700
So there's a pragmatic consideration here.
link |
00:25:05.740
And so the biggest questions are probably
link |
00:25:09.780
fetuses and animals lately, but AI is getting there.
link |
00:25:13.380
Now, with AI, I think, and I've had this discussion
link |
00:25:20.500
so many times, people say, but aren't you
link |
00:25:22.500
afraid if you exclude AI, you'd be cruel to some consciousness?
link |
00:25:26.340
And then I would say, well, if you include AI,
link |
00:25:29.500
you make yourself, you exclude yourself
link |
00:25:32.820
from being able to be a good engineer or designer.
link |
00:25:35.860
And so you're facing incompetence immediately.
link |
00:25:38.780
So I really think we need to subordinate algorithms
link |
00:25:41.460
and be much more skeptical of them.
link |
00:25:43.580
Your intuition, you speak about this brilliantly
link |
00:25:45.900
with social media, how things can go wrong.
link |
00:25:48.980
Isn't it possible to design systems that show compassion,
link |
00:25:56.300
not to manipulate you, but give you control
link |
00:25:59.860
and make your life better if you so choose to?
link |
00:26:02.740
Grow together with systems.
link |
00:26:04.020
In the way we grow with dogs and cats, with pets,
link |
00:26:07.100
with significant others in that way,
link |
00:26:09.420
we grow to become better people.
link |
00:26:11.540
I don't understand why that's fundamentally not possible.
link |
00:26:14.460
You're saying oftentimes you get into trouble
link |
00:26:18.100
by thinking you know what's good for people.
link |
00:26:20.220
Well, look, there's this question of what
link |
00:26:23.260
frame we're speaking in.
link |
00:26:25.620
Do you know who Alan Watts was?
link |
00:26:27.660
So Alan Watts once said, morality is like gravity,
link |
00:26:32.140
that in some absolute cosmic sense,
link |
00:26:34.500
there can't be morality, because at some point
link |
00:26:36.460
it all becomes relative.
link |
00:26:37.700
And who are we anyway?
link |
00:26:39.100
Like morality is relative to us tiny creatures.
link |
00:26:42.100
But here on Earth, we're with each other.
link |
00:26:45.540
This is our frame.
link |
00:26:46.420
And morality is a very real thing.
link |
00:26:47.900
Same thing with gravity.
link |
00:26:48.780
At some point, you get into interstellar space
link |
00:26:52.140
and you might not feel much of it.
link |
00:26:53.980
But here we are on Earth.
link |
00:26:55.180
And I think in the same sense, I think
link |
00:26:58.460
this identification with a frame that's quite remote
link |
00:27:04.420
cannot be separated from a feeling
link |
00:27:07.540
of wanting to feel sort of separate from and superior
link |
00:27:11.020
to other people or something like that.
link |
00:27:12.660
There's an impulse behind it that I really have to reject.
link |
00:27:16.100
And we're just not competent yet to talk
link |
00:27:18.860
about these kinds of absolutes.
link |
00:27:20.980
OK, so I agree with you that a lot of technologies sort
link |
00:27:24.540
of lack this basic respect, understanding,
link |
00:27:27.700
and love for humanity.
link |
00:27:29.180
There's a separation there.
link |
00:27:30.620
The thing I'd like to push back against,
link |
00:27:32.420
it's not that you disagree.
link |
00:27:33.620
But I believe you can create technologies
link |
00:27:36.220
and you can create a new kind of technologist engineer that
link |
00:27:41.300
does build systems that respect humanity, not just respect
link |
00:27:45.380
but admire humanity, that have empathy for common humans,
link |
00:27:49.500
have compassion.
link |
00:27:51.460
So I mean, no, no, no.
link |
00:27:52.580
I think, yeah, I mean, I think musical instruments
link |
00:27:57.300
are a great example of that.
link |
00:27:58.780
Musical instruments are technologies
link |
00:28:00.260
that help people connect in fantastic ways.
link |
00:28:02.260
And that's a great example.
link |
00:28:06.580
My invention or design during the pandemic period
link |
00:28:11.300
was this thing called Together Mode,
link |
00:28:12.500
where people see themselves seated sort of in a classroom
link |
00:28:17.980
or a theater instead of in squares.
link |
00:28:20.260
And it allows them to semi-consciously perform to each other
link |
00:28:26.060
as if they have proper eye contact,
link |
00:28:29.500
as if they're paying attention to each other nonverbally
link |
00:28:31.620
and weirdly that turns out to work.
link |
00:28:34.020
And so it promotes empathy so far as I can tell.
link |
00:28:36.980
I hope it is of some use to somebody.
link |
00:28:40.620
The AI idea isn't really new.
link |
00:28:43.100
I would say it was born with Adam Smith's Invisible Hand
link |
00:28:47.060
with this idea that we build this algorithmic thing
link |
00:28:49.660
and it gets a bit beyond us and then we
link |
00:28:52.260
think it must be smarter than us.
link |
00:28:53.940
And the thing about the Invisible Hand
link |
00:28:55.660
is absolutely everybody has some line they draw where they say,
link |
00:28:59.460
now, we're going to take control of this thing.
link |
00:29:01.660
They might have different lines, they
link |
00:29:03.260
might care about different things,
link |
00:29:04.460
but everybody ultimately became a Keynesian
link |
00:29:06.780
because it just didn't work.
link |
00:29:07.900
It really wasn't that smart.
link |
00:29:09.180
It was sometimes smart and sometimes it failed.
link |
00:29:11.700
And so people who really, really, really
link |
00:29:16.980
want to believe in the Invisible Hand as infinitely smart
link |
00:29:20.860
screw up their economies terribly.
link |
00:29:22.700
You have to recognize the economy as a subservient tool.
link |
00:29:27.380
Everybody does when it's to their advantage.
link |
00:29:29.860
They might not when it's not to their advantage.
link |
00:29:31.780
That's kind of an interesting game that happens.
link |
00:29:34.220
But the thing is, it's just like that with our algorithms.
link |
00:29:38.140
You can have a Chicago economic philosophy
link |
00:29:45.060
about your computer and say, no, no, no, my thing's come alive.
link |
00:29:47.340
It's smarter than anything.
link |
00:29:49.060
I think that there is a deep loneliness within all of us.
link |
00:29:52.820
This is what we seek.
link |
00:29:54.020
We seek love from each other.
link |
00:29:56.300
I think AI can help us connect deeper.
link |
00:29:59.700
This is what you criticize social media for.
link |
00:30:02.300
I think there's much better ways of doing social media that
link |
00:30:05.060
doesn't lead to manipulation, but instead
link |
00:30:07.420
leads to deeper connection between humans, leads
link |
00:30:10.020
to you becoming a better human being.
link |
00:30:11.900
And what that requires is some agency on the part of AI
link |
00:30:15.540
to be almost like a therapist, I mean a companion.
link |
00:30:18.620
It's not telling you what's right.
link |
00:30:22.060
It's not guiding you as if it's an all knowing thing.
link |
00:30:25.340
It's just another companion that you can leave at any time.
link |
00:30:28.740
You have complete transparency control over.
link |
00:30:31.860
There's a lot of mechanisms that you
link |
00:30:33.380
can have that are counter to how current social media operates
link |
00:30:38.900
that I think is subservient to humans or no, deeply respects
link |
00:30:44.580
human beings and empathetic to their experience
link |
00:30:47.780
and all those kinds of things.
link |
00:30:48.860
I think it's possible to create AI systems like that.
link |
00:30:51.580
And I think that's a technical discussion
link |
00:30:54.620
of whether they need to have something that looks like AI
link |
00:31:02.020
versus algorithms, something that has identity, something
link |
00:31:05.980
that has a personality, all those kinds of things.
link |
00:31:09.060
AI systems, and you've spoken extensively
link |
00:31:11.460
how AI systems manipulate you within social networks.
link |
00:31:17.180
And the biggest problem isn't necessarily
link |
00:31:21.140
that social networks present you
link |
00:31:28.380
with advertisements that then get you to buy stuff.
link |
00:31:31.140
That's not the biggest problem.
link |
00:31:32.300
The biggest problem is they then manipulate you.
link |
00:31:36.300
They alter your human nature to get you to buy stuff
link |
00:31:41.420
or to get you to do whatever the advertiser wants.
link |
00:31:46.620
Maybe you can correct me.
link |
00:31:47.460
Yeah, I don't see it quite that way,
link |
00:31:49.820
but we can work with that as an approximation.
link |
00:31:52.020
Sure.
link |
00:31:53.140
I think the actual thing is even more ridiculous and stupider
link |
00:31:56.340
than that, but that's OK.
link |
00:31:58.140
So my question is, let's not use the word AI,
link |
00:32:02.380
but how do we fix it?
link |
00:32:05.340
Oh, fixing social media.
link |
00:32:07.900
That diverts us into this whole other field in my view,
link |
00:32:11.020
which is economics, which I always thought was really boring,
link |
00:32:14.180
but we have no choice but to turn to economics
link |
00:32:16.340
if we want to fix this problem, because it's
link |
00:32:18.300
all about incentives.
link |
00:32:19.820
But I've been around this thing since it started,
link |
00:32:24.260
and I've been in the meetings where the social media companies
link |
00:32:30.300
sell themselves to the people who put the most money into them,
link |
00:32:33.860
which are usually the big advertising holding companies
link |
00:32:36.260
and whatnot.
link |
00:32:36.780
And there's this idea that I think is kind of a fiction.
link |
00:32:41.340
And maybe it's even been recognized as that by everybody
link |
00:32:45.100
that the algorithm will get really good at getting people
link |
00:32:48.860
to buy something, because I think people have looked
link |
00:32:51.260
at their returns and looked at what happens,
link |
00:32:53.180
and everybody recognizes it's not exactly right.
link |
00:32:56.340
It's more like a cognitive access blackmail payment
link |
00:33:02.020
at this point.
link |
00:33:03.940
Just to be connected, you're paying the money.
link |
00:33:06.020
It's not so much the persuasion algorithms.
link |
00:33:08.660
So Stanford renamed its program, but it used
link |
00:33:10.700
to be called Engaged Persuade.
link |
00:33:12.340
The Engaged Part works.
link |
00:33:13.540
The Persuade Part is iffy, but the thing
link |
00:33:16.340
is that once people are engaged, in order for you
link |
00:33:19.460
to exist as a business, in order for you to be known at all,
link |
00:33:21.940
you have to put money into it.
link |
00:33:23.140
Oh, that's dark.
link |
00:33:24.460
Oh, no, that doesn't work, but they have to.
link |
00:33:27.020
But it's a giant cognitive access blackmail scheme
link |
00:33:31.460
at this point, because the science
link |
00:33:34.100
behind the Persuade Part, it's not entirely a failure,
link |
00:33:39.580
but we play make-believe that it works more than it does.
link |
00:33:46.940
The damage doesn't come from that.
link |
00:33:48.820
Honestly, as I've said in my books,
link |
00:33:51.260
I'm not anti-advertising.
link |
00:33:53.380
I actually think advertising can be demeaning, and annoying,
link |
00:33:58.260
and banal, and ridiculous, and take up a lot of our time
link |
00:34:03.220
with stupid stuff.
link |
00:34:04.300
Like, there's a lot of ways to criticize advertising
link |
00:34:06.980
that's accurate.
link |
00:34:09.020
And it can also lie, and all kinds of things.
link |
00:34:11.580
However, if I look at the biggest picture,
link |
00:34:13.980
I think advertising, at least as it was understood before,
link |
00:34:17.380
social media, helped bring people into modernity in a way
link |
00:34:20.580
that overall actually did benefit people overall.
link |
00:34:24.620
And you might say, am I contradicting myself,
link |
00:34:27.300
because I was saying you shouldn't manipulate people?
link |
00:34:29.140
Yeah, I am.
link |
00:34:29.900
Probably here.
link |
00:34:30.460
I mean, I'm not pretending to have this perfect, airtight
link |
00:34:33.740
worldview without some contradictions.
link |
00:34:35.460
I think there's a bit of a contradiction there.
link |
00:34:37.900
Well, looking at the long arc of history,
link |
00:34:39.380
advertisement has, in some parts, benefited society.
link |
00:34:43.660
Yeah, because it funded some efforts that perhaps
link |
00:34:46.620
benefited society.
link |
00:34:47.340
I mean, I think there's a thing where sometimes I think
link |
00:34:51.820
it's actually been of some use.
link |
00:34:53.940
Now, where the damage comes is a different thing, though.
link |
00:34:59.060
Social media, algorithms on social media
link |
00:35:03.340
have to work on feedback loops, where they present you
link |
00:35:06.060
with stimulus.
link |
00:35:06.820
They have to see if you respond to the stimulus.
link |
00:35:09.020
Now, the problem is that the measurement mechanism
link |
00:35:12.500
for telling if you respond in the engagement feedback loop
link |
00:35:16.460
is very, very crude.
link |
00:35:17.660
It's things like whether you click more,
link |
00:35:19.540
or occasionally if you're staring at the screen more,
link |
00:35:21.620
if there's a forward facing camera that's activated,
link |
00:35:23.900
but typically there isn't.
link |
00:35:25.540
So you have this incredibly crude back channel of information.
link |
00:35:28.940
And so it's crude enough that it only
link |
00:35:31.260
catches sort of the more dramatic responses from you.
link |
00:35:35.740
And those are the fight or flight responses.
link |
00:35:37.700
Those are the things where you get scared or pissed off,
link |
00:35:40.100
or aggressive, or horny.
link |
00:35:43.420
These are these ancient, what are sometimes called
link |
00:35:46.140
the lizard brain circuits or whatever.
link |
00:35:48.980
These fast response, old, old, old evolutionary business
link |
00:35:54.100
circuits that we have that are helpful in survival
link |
00:35:58.300
once in a while, but are not us at our best.
link |
00:36:00.500
They're not who we want to be.
link |
00:36:01.660
They're not how we relate to each other.
link |
00:36:03.460
They're this old business.
link |
00:36:05.140
So then just when you're engaged using those intrinsically,
link |
00:36:08.180
totally aside from whatever the topic is,
link |
00:36:11.100
you start to get incrementally just a little bit more
link |
00:36:14.140
paranoid, xenophobic, aggressive.
link |
00:36:17.260
You get a little stupid, and you become a jerk.
link |
00:36:20.980
And it happens slowly.
link |
00:36:22.420
It happens.
link |
00:36:23.500
It's not like everybody is instantly transformed.
link |
00:36:26.060
But it does happen progressively,
link |
00:36:28.020
where people who get hooked kind of get drawn more and more
link |
00:36:30.740
into this pattern of being at their worst.
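To make those mechanics concrete, here is a deliberately oversimplified sketch of the kind of engagement feedback loop being described. It is not any platform's actual code; the item names and click probabilities are invented for illustration. The only back channel is a crude click signal, and whatever provokes the strongest reaction gets reinforced.

```python
import random

# Illustrative engagement loop, not any real platform's algorithm.
# The feed only observes clicks; content that triggers fight-or-flight
# reactions gets clicked more, so its weight grows, with no notion of
# whether the user is better off.

ITEMS = {                   # hypothetical catalog: item -> click probability
    "calm_essay": 0.02,
    "cute_pets": 0.05,
    "outrage_bait": 0.12,   # inflammatory content draws more clicks
}
weights = {item: 1.0 for item in ITEMS}

def user_clicked(item):
    """Crude one-bit back channel: did the user click?"""
    return random.random() < ITEMS[item]

for _ in range(10_000):
    # Serve items in proportion to their learned weights.
    item = random.choices(list(weights), list(weights.values()))[0]
    if user_clicked(item):
        weights[item] *= 1.01   # reinforce whatever got the click

total = sum(weights.values())
for item, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{item:>12}: {w / total:.0%} of the feed")
```

Nothing in the loop models the user's longer-term state; the drift toward the inflammatory item falls out of the crude signal alone.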
link |
00:36:33.660
Would you say that people are able to,
link |
00:36:35.780
when they get hooked in this way,
link |
00:36:37.500
look back at themselves from 30 days ago and say,
link |
00:36:41.460
I am less happy with who I am now,
link |
00:36:45.100
or I'm not happy with who I am now versus who I was 30 days ago?
link |
00:36:48.780
Are they able to self reflect when you take yourself
link |
00:36:51.420
outside of the lizard brain?
link |
00:36:52.620
Sometimes.
link |
00:36:54.180
I wrote a book about people suggesting
link |
00:36:56.420
people take a break from their social media
link |
00:36:57.940
to see what happens.
link |
00:36:58.900
And maybe even the title of the book
link |
00:37:01.780
was just the arguments to delete your account.
link |
00:37:04.180
Yeah, 10 arguments.
link |
00:37:05.700
10 arguments.
link |
00:37:06.460
Although I always said, I don't know that you should.
link |
00:37:08.500
I can give you the arguments.
link |
00:37:09.620
It's up to you.
link |
00:37:10.420
I'm always very clear about that.
link |
00:37:11.780
But I don't have a social media account, obviously.
link |
00:37:15.580
And it's not that easy for people to reach me.
link |
00:37:18.900
They have to search out an old fashioned email address
link |
00:37:21.300
on a super crappy, antiquated website.
link |
00:37:23.780
It's actually a bit, I don't make it easy.
link |
00:37:26.140
And even with that, I get this huge flood of mail
link |
00:37:28.700
from people who say, oh, I quit my social media.
link |
00:37:30.580
I'm doing so much better, I can't believe how bad it was.
link |
00:37:33.260
But the thing is, for me, a huge flood of mail
link |
00:37:36.060
would be an imperceptible trickle from the perspective
link |
00:37:38.620
of Facebook, right?
link |
00:37:39.980
And so I think it's rare for somebody
link |
00:37:43.620
to look at themselves and say, oh, boy,
link |
00:37:45.340
I just screwed myself over.
link |
00:37:47.780
It's a really hard thing to ask of somebody.
link |
00:37:49.620
None of us find that easy, right?
link |
00:37:51.300
Well, the reason I asked is, is it
link |
00:37:54.580
possible to design social media systems that
link |
00:37:58.580
optimize for some longer term metrics of you
link |
00:38:03.180
being happy with yourself?
link |
00:38:04.580
Well, see, I don't think you should try to engineer
link |
00:38:07.140
personal growth or happiness.
link |
00:38:08.380
I think what you should do is design a system that's
link |
00:38:11.020
just respectful of the people and subordinates itself
link |
00:38:13.700
to the people and doesn't have perverse incentives.
link |
00:38:16.780
And then at least there's a chance
link |
00:38:18.180
of something decent happening.
link |
00:38:19.780
You'll have to recommend stuff, right?
link |
00:38:22.100
So you're saying, be respectful.
link |
00:38:24.420
What does that actually mean engineering wise?
link |
00:38:26.900
Yeah, curation, people have to be responsible.
link |
00:38:30.260
Algorithms shouldn't be recommending.
link |
00:38:31.700
Algorithms don't understand enough to recommend.
link |
00:38:33.500
Algorithms are crap in this era.
link |
00:38:35.260
I mean, I'm sorry, they are.
link |
00:38:37.020
And I'm not saying this as somebody
link |
00:38:38.420
is a critic from the outside.
link |
00:38:39.420
I'm in the middle of it.
link |
00:38:40.260
I know what they can do.
link |
00:38:41.260
I know the math.
link |
00:38:41.940
I know what the corpora are.
link |
00:38:45.300
I know the best ones.
link |
00:38:46.980
Our office is funding GPT-3 and all these things
link |
00:38:49.860
that are at the edge of what's possible.
link |
00:38:53.500
And they do not have it yet.
link |
00:38:57.380
I mean, it still is statistical emergent pseudo-semantics.
link |
00:39:02.100
It doesn't actually have the representation
link |
00:39:04.140
of anything emerging.
link |
00:39:05.100
It's just not like, I mean, that I'm speaking the truth here
link |
00:39:07.700
and you know it.
link |
00:39:08.580
Well, let me push back on this.
link |
00:39:11.900
There's several truths here.
link |
00:39:13.100
So you're speaking to the way certain companies operate
link |
00:39:16.060
currently.
link |
00:39:16.980
I don't think it's outside the realm of what's technically
link |
00:39:20.380
feasible to do.
link |
00:39:21.740
They're just not incentivized, like companies are not
link |
00:39:23.660
incentivized to fix this thing.
link |
00:39:26.060
I am aware that, for example, the YouTube search
link |
00:39:29.820
and discovery has been very helpful to me.
link |
00:39:33.380
And there's a huge number of, there's so many videos
link |
00:39:37.660
that it's nice to have a little bit of help.
link |
00:39:39.700
Have you done.
link |
00:39:40.380
But I'm still in control.
link |
00:39:41.380
Let me ask you something.
link |
00:39:42.180
Have you done the experiment of letting YouTube
link |
00:39:45.020
recommend videos to you either starting
link |
00:39:47.260
from an absolutely anonymous random place
link |
00:39:50.380
where it doesn't know who you are or from knowing who you
link |
00:39:52.580
or somebody else is and then going 15 or 20 hops?
link |
00:39:55.300
Have you ever done that and just let it go, top video recommendation,
link |
00:39:58.940
and then just go 20 hops?
link |
00:40:00.140
No, I haven't.
link |
00:40:00.900
I've done that many times now.
link |
00:40:03.100
I have. Because of how large YouTube is and how widely it's
link |
00:40:07.020
used, it's very hard to get to enough scale
link |
00:40:10.700
to get a statistically solid result on this.
link |
00:40:14.300
I've done it with high school kids,
link |
00:40:15.820
with dozens of kids doing it at a time.
link |
00:40:18.380
Every time I've done an experiment, the majority
link |
00:40:21.140
of times after about 17 or 18 hops,
link |
00:40:23.460
you end up in really weird, paranoid, bizarre territory.
link |
00:40:27.180
Because ultimately, that is the stuff, the algorithm
link |
00:40:29.980
rewards the most because of the feedback
link |
00:40:31.940
creepiness I was just talking about.
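The hop experiment itself is simple enough to script. A hedged sketch follows: top_recommendation is a hypothetical placeholder (there is no such one-call public API, so in practice it means recording the top "Up next" item by hand or by scraping, ideally from a fresh anonymous session), but the loop is exactly the procedure described above.

```python
# Sketch of the "follow the top recommendation for ~20 hops" experiment.
# `top_recommendation` is a hypothetical stand-in: record the top
# "Up next" video manually or via scraping, ideally from a fresh,
# anonymous session so personalization doesn't confound the trail.

def top_recommendation(video_id: str) -> str:
    """Hypothetical: return the top recommended video for video_id."""
    raise NotImplementedError("record manually or via scraping")

def hop_trail(seed_video: str, hops: int = 20) -> list[str]:
    """Walk the recommendation chain, returning the visited trail."""
    trail = [seed_video]
    for _ in range(hops):
        trail.append(top_recommendation(trail[-1]))
    return trail

# Usage: run many trails from many seeds and sessions, then have
# raters label where each trail ends up; a single trail is only
# an anecdote, as noted above.
```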
link |
00:40:34.060
So I'm not saying that the video never
link |
00:40:37.140
recommends something cool.
link |
00:40:38.340
I'm saying that its fundamental core
link |
00:40:40.460
is one that promotes a paranoid style, that promotes
link |
00:40:44.660
increasing irritability, that promotes xenophobia,
link |
00:40:48.020
that promotes fear, anger, promotes selfishness,
link |
00:40:51.340
promotes separation between people.
link |
00:40:55.100
The thing is, it's very hard to do this work solidly.
link |
00:40:58.220
Many have repeated this experiment,
link |
00:40:59.980
and yet it still is kind of anecdotal.
link |
00:41:02.220
I'd like to do a large citizen science thing sometime
link |
00:41:05.540
and do it, but then I think the problem with that
link |
00:41:07.300
is YouTube would detect it and then change it.
link |
00:41:09.660
Yes, I love that kind of stuff.
link |
00:41:12.100
So Jack Dorsey has spoken about doing healthy conversations
link |
00:41:16.420
on Twitter or optimizing for healthy conversations.
link |
00:41:19.140
What that requires within Twitter are most likely
link |
00:41:21.980
citizen experiments of what healthy conversations
link |
00:41:25.940
actually look like and how do you incentivize
link |
00:41:28.620
those healthy conversations.
link |
00:41:30.100
You're describing what often happens
link |
00:41:33.100
and what is currently happening.
link |
00:41:34.780
What I'd like to argue is it's possible to strive
link |
00:41:38.140
for healthy conversations, not in a dogmatic way of saying,
link |
00:41:42.780
I know what healthy conversations are and I will tell you.
link |
00:41:45.540
I think one way to do this is to try to look around
link |
00:41:48.420
at social, maybe not things that are officially social media,
link |
00:41:51.860
but things where people are together online
link |
00:41:54.020
and see which ones have more healthy conversations.
link |
00:41:56.780
Even if it's hard to be completely objective
link |
00:42:00.580
in that measurement, you can kind of at least crudely
link |
00:42:03.180
agree.
link |
00:42:04.020
You could do subjective annotation of this,
link |
00:42:05.860
like have a large crowdsource.
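A crude version of that measurement is straightforward to sketch: sample threads from each venue, have many raters score them, and compare venues by mean score and rater agreement. Everything below, the 1-to-5 scale and the numbers, is a hypothetical illustration, not a validated instrument.

```python
from statistics import mean, stdev

# Hypothetical crowdsourced annotation: several raters score sampled
# threads from each venue on a 1-5 "healthy conversation" scale.
# Venue names and scores are invented for illustration.

ratings = {
    "venue_a": [[4, 5, 4], [3, 4, 4], [5, 4, 5]],  # per-thread rater scores
    "venue_b": [[2, 1, 2], [3, 2, 1], [1, 2, 2]],
}

for venue, threads in ratings.items():
    thread_means = [mean(t) for t in threads]
    spread = mean(stdev(t) for t in threads)  # low spread = raters agree
    print(f"{venue}: mean={mean(thread_means):.2f}, rater spread={spread:.2f}")
```

Even a crude ranking like this lets you compare venues without dogmatically defining "healthy" up front.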
link |
00:42:08.300
One that I've been really interested in is GitHub,
link |
00:42:11.620
because it could change, I'm not saying it'll always be,
link |
00:42:16.060
but for the most part, GitHub has had a relatively
link |
00:42:19.940
quite low poison quotient and I think there's a few things
link |
00:42:24.500
about GitHub that are interesting.
link |
00:42:26.620
One thing about it is that people have a stake in it.
link |
00:42:29.580
It's not just empty status games.
link |
00:42:31.860
There's actual code or there's actual stuff being done.
link |
00:42:35.180
And I think as soon as you have a real world stake
link |
00:42:37.500
in something, you have a motivation to not screw
link |
00:42:41.860
up that thing.
link |
00:42:42.700
And I think that that's often missing,
link |
00:42:45.500
that there's no incentive for the person to really preserve
link |
00:42:49.100
something if they get a little bit of attention
link |
00:42:51.540
from dumping on somebody's TikTok or something.
link |
00:42:55.980
They don't pay any price for it, but you
link |
00:42:57.780
have to kind of get decent with people
link |
00:43:00.780
when you have a shared stake, a little secret.
link |
00:43:03.180
So GitHub does a bit of that.
link |
00:43:06.900
GitHub is wonderful, yes.
link |
00:43:08.620
But I'm tempted to play the Jaren back at you,
link |
00:43:13.340
which is that, so GitHub currently is amazing,
link |
00:43:16.460
but the thing is, if you have a stake,
link |
00:43:18.420
then if it's a social media platform,
link |
00:43:20.460
they can use the fact that you have a stake to manipulate you
link |
00:43:23.420
because you want to preserve the stake.
link |
00:43:25.300
So like.
link |
00:43:26.220
Right, well, this gets us into the economics.
link |
00:43:29.260
So there's this thing called Data Dignity
link |
00:43:30.900
that I've been studying for a long time.
link |
00:43:33.020
I wrote a book about an earlier version of it called
link |
00:43:35.220
Who Owns the Future, and the basic idea of it
link |
00:43:39.300
is that, once again, this is a three hour conversation.
link |
00:43:43.380
It's a fascinating topic.
link |
00:43:44.260
Let me do the fastest version of this I can do.
link |
00:43:46.780
The fastest way I know how to do this
link |
00:43:48.780
is to compare two futures, all right?
link |
00:43:51.940
So future one is then the normative one,
link |
00:43:55.620
the one we're building right now,
link |
00:43:56.900
and future two is going to be Data Dignity.
link |
00:44:00.180
And I'm going to use a particular population.
link |
00:44:03.100
I live on the hill in Berkeley,
link |
00:44:05.340
and one of the features about the hill
link |
00:44:07.020
is that as the climate changes, we might burn down
link |
00:44:09.380
and we'll lose our houses or die or something.
link |
00:44:11.500
Like it's dangerous, you know, and it didn't used to be.
link |
00:44:14.260
And so who keeps us alive?
link |
00:44:17.020
Well, the city does.
link |
00:44:18.380
The city does some things.
link |
00:44:19.540
The electric company kind of sort of,
link |
00:44:21.500
maybe hopefully better, individual people who own property,
link |
00:44:26.060
take care of their property, that's all nice.
link |
00:44:27.660
But there's this other middle layer,
link |
00:44:29.300
which is fascinating to me,
link |
00:44:30.980
which is that the groundskeepers
link |
00:44:33.580
who work up and down that hill,
link |
00:44:35.340
many of whom are not legally here,
link |
00:44:38.700
many of whom don't speak English,
link |
00:44:40.540
cooperate with each other to make sure trees don't touch
link |
00:44:44.340
to transfer fire easily from lot to lot.
link |
00:44:46.660
They have this whole little web that's keeping us safe.
link |
00:44:49.180
I didn't know about this at first.
link |
00:44:50.500
I just started talking to them
link |
00:44:52.500
because they were out there during the pandemic.
link |
00:44:54.340
And so I'd try to just see who are these people?
link |
00:44:56.820
Who are these people who are keeping us alive?
link |
00:44:59.340
Now, I want to talk about the two different faiths
link |
00:45:01.460
for those people under future one and future two.
link |
00:45:04.900
Future one, some weird kindergarten paint job van
link |
00:45:10.420
with all these cameras and weird things drives up,
link |
00:45:12.500
observes what the gardeners and groundskeepers are doing.
link |
00:45:15.580
A few years later, some amazing robots
link |
00:45:18.220
that can shimmy up trees and all this show up,
link |
00:45:20.500
all those people are out of work
link |
00:45:21.620
and there are these robots doing the thing.
link |
00:45:23.140
And the robots are good and they can scale to more land
link |
00:45:26.380
and they're actually good.
link |
00:45:28.460
But then there are all these people out of work
link |
00:45:29.860
and these people have lost dignity.
link |
00:45:31.340
They don't know what they're going to do.
link |
00:45:32.940
And then somebody will say, well, they go on basic income,
link |
00:45:35.500
whatever, they become wards of the state.
link |
00:45:39.060
My problem with that solution is every time in history
link |
00:45:42.620
that you've had some centralized thing
link |
00:45:44.380
that's doling out the benefits,
link |
00:45:45.660
that things get seized by people
link |
00:45:47.300
because it's too centralized and it gets seized.
link |
00:45:49.540
This happened to every communist experiment I can find.
link |
00:45:53.260
So I think that turns into a poor future
link |
00:45:56.060
that will be unstable.
link |
00:45:57.740
I don't think people will feel good in it.
link |
00:45:59.220
I think it'll be a political disaster
link |
00:46:01.460
where there's a sequence of people
link |
00:46:02.540
seizing this central source of the basic income.
link |
00:46:06.820
And you'll say, oh, no, an algorithm can do it.
link |
00:46:08.380
Then people will seize the algorithm.
link |
00:46:09.700
They'll seize control.
link |
00:46:11.220
Unless the algorithm is decentralized
link |
00:46:13.540
and it's impossible to seize the control.
link |
00:46:15.580
Yeah, but 60 something people own a quarter of all the Bitcoin.
link |
00:46:22.660
The things that we think are decentralized
link |
00:46:24.100
are not decentralized.
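The concentration point is checkable in principle, since balances of Bitcoin addresses are public: compute the share of supply held by the top k holders. A sketch with made-up numbers, not real chain data:

```python
def top_share(balances, k):
    """Fraction of total supply held by the k largest balances."""
    ranked = sorted(balances, reverse=True)
    return sum(ranked[:k]) / sum(ranked)

# Toy distribution: a few whales, a long tail of small holders.
balances = [100_000] * 60 + [10] * 1_000_000
print(f"top 60 hold {top_share(balances, 60):.0%} of supply")
```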
link |
00:46:25.940
So let's go to future two.
link |
00:46:27.820
Future two, the gardeners see that van with all the cameras
link |
00:46:32.460
and the kindergarten paint job.
link |
00:46:33.660
And they say, the groundskeepers,
link |
00:46:35.420
and they say, hey, the robots are coming.
link |
00:46:37.660
We're gonna form a data union.
link |
00:46:38.940
And amazingly, California has a little baby data union.
link |
00:46:42.740
Really?
link |
00:46:43.580
A law emerging on the books.
link |
00:46:44.420
Yes. Interesting.
link |
00:46:45.260
That's interesting.
link |
00:46:46.100
And so they'll, and what they say,
link |
00:46:49.180
we're gonna form a data union and we're gonna,
link |
00:46:53.860
not only are we gonna sell our data to this place,
link |
00:46:56.340
but we're gonna make it better than it would have been
link |
00:46:57.940
if they were just grabbing it without our cooperation.
link |
00:47:00.100
And we're gonna improve it.
link |
00:47:01.780
We're gonna make the robots more effective.
link |
00:47:03.380
We're gonna make them better
link |
00:47:04.220
and we're gonna be proud of it.
link |
00:47:05.340
We're gonna become a new class of experts that are respected.
link |
00:47:09.900
And then here's the interesting,
link |
00:47:11.740
there's two things that are different about that world
link |
00:47:14.540
from future one.
link |
00:47:15.660
One thing, of course, the people have more pride.
link |
00:47:17.660
They have more sense of ownership of agency, but what the robots do changes.
link |
00:47:27.260
Instead of just like this functional,
link |
00:47:29.980
like we'll figure out how to keep the neighborhood from burning down,
link |
00:47:33.540
you have this whole creative community
link |
00:47:35.340
that wasn't there before thinking,
link |
00:47:36.500
well, how can we make these robots better
link |
00:47:38.020
so we can keep on earning money?
link |
00:47:39.700
There'll be waves of creative groundskeeping
link |
00:47:44.300
with spiral pumpkin patches and waves of cultural things.
link |
00:47:47.980
There'll be new ideas like,
link |
00:47:49.500
wow, I wonder if we could do something about climate change mitigation
link |
00:47:53.180
with how we do this.
link |
00:47:54.460
What about, what about fresh water?
link |
00:47:56.420
Can we, what about, can we make the food healthier?
link |
00:47:59.220
What about, what about all of a sudden,
link |
00:48:00.500
there'll be this whole creative community on the case?
link |
00:48:03.300
And isn't it nicer to have a high tech future
link |
00:48:06.140
with more creative classes
link |
00:48:07.580
than one with more dependent classes?
link |
00:48:09.220
Isn't that a better future?
link |
00:48:10.460
But, but, but, but, but.
link |
00:48:12.500
Future one and future two have the same robots
link |
00:48:16.460
and the same algorithms.
link |
00:48:17.620
There's no technological difference.
link |
00:48:19.380
There's only a human difference.
link |
00:48:20.780
Yeah.
link |
00:48:21.620
And that second future two, that's data dignity.
link |
00:48:25.700
The economy that you're,
link |
00:48:26.780
I mean, the game theory here is on the humans.
link |
00:48:29.260
And then the technology is just the tools
link |
00:48:31.780
that enable, you know, I mean,
link |
00:48:34.060
I think you can believe in AI and be in future two.
link |
00:48:37.620
I just think it's a little harder.
link |
00:48:38.740
You have to do, you have to do more
link |
00:48:40.940
contortions.
link |
00:48:42.660
It's possible.
link |
00:48:43.500
So in the case of social media,
link |
00:48:46.140
what does data dignity look like?
link |
00:48:49.260
Is it people getting paid for their data?
link |
00:48:51.540
Yeah.
link |
00:48:52.380
I think what should happen is in the future,
link |
00:48:55.420
there should be massive data unions
link |
00:48:59.460
for people putting content into the system.
link |
00:49:04.060
And those data unions should smooth out
link |
00:49:06.060
the results a little bit.
link |
00:49:07.020
So it's not winner take all,
link |
00:49:08.700
but at the same time,
link |
00:49:10.540
and people have to pay for it too.
link |
00:49:11.700
They have to pay for Facebook
link |
00:49:13.700
the way they pay for Netflix
link |
00:49:14.980
with an allowance for the poor.
link |
00:49:17.460
There has to be a way out too.
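A minimal sketch of the smoothing idea, assuming a data union simply blends an equal split with a proportional split of its revenue; the blend weight is a made-up policy knob, not a mechanism from any existing union:

```python
def union_payouts(revenue, contributions, blend=0.3):
    """Split `revenue` among members: `blend` of it equally,
    the rest in proportion to each member's contribution."""
    total = sum(contributions.values())
    n = len(contributions)
    return {
        member: blend * revenue / n
              + (1 - blend) * revenue * c / total
        for member, c in contributions.items()
    }

# A star contributor still earns more, but nobody gets zero.
print(union_payouts(1000.0, {"ana": 80, "bo": 15, "cy": 5}))
```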
link |
00:49:20.340
But the thing is people do pay for Netflix.
link |
00:49:22.260
It's a going concern.
link |
00:49:24.420
People pay for Xbox and PlayStation.
link |
00:49:26.340
Like people,
link |
00:49:27.180
there's enough people to pay for stuff they want
link |
00:49:29.020
this could happen too.
link |
00:49:29.860
It's just that this precedent started
link |
00:49:31.420
that moved in the wrong direction.
link |
00:49:33.140
And then what has to happen,
link |
00:49:36.460
the economy's a measuring device.
link |
00:49:38.100
If it's an honest measuring device,
link |
00:49:40.980
the outcomes for people form a normal distribution,
link |
00:49:44.340
a bell curve.
link |
00:49:45.500
And then so there should be a few people
link |
00:49:47.020
who do really well,
link |
00:49:47.860
a lot of people who do okay.
link |
00:49:49.460
And then we should have an expanding economy
link |
00:49:51.500
reflecting more and more creativity and expertise
link |
00:49:54.700
flowing through the network.
link |
00:49:56.420
And that expanding economy moves the result
link |
00:49:58.700
just a bit forward.
link |
00:49:59.540
So more people are getting money out of it
link |
00:50:01.740
than are putting money into it.
link |
00:50:02.980
So it gradually expands the economy
link |
00:50:04.620
and lifts all boats.
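That claim can be rendered as a toy simulation: assume net outcomes are bell-curve distributed and let growth shift the mean above zero; the fraction of net winners then rises past half. The numbers are purely illustrative:

```python
import random

def fraction_net_positive(mean_outcome, spread=1.0, people=100_000):
    """Share of participants whose net outcome is positive when
    outcomes are drawn from a bell curve centered on mean_outcome."""
    draws = (random.gauss(mean_outcome, spread) for _ in range(people))
    return sum(d > 0 for d in draws) / people

print(fraction_net_positive(0.0))   # stagnant economy: ~50% net winners
print(fraction_net_positive(0.25))  # expanding economy: noticeably more
```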
link |
00:50:05.620
And the society has to support the lower wing
link |
00:50:09.420
of the bell curve too,
link |
00:50:10.740
but not universal basic income.
link |
00:50:12.140
It has to be for the,
link |
00:50:14.340
because if it's an honest economy,
link |
00:50:16.740
there will be that lower wing.
link |
00:50:19.260
And we have to support those people.
link |
00:50:20.820
There has to be a safety net.
link |
00:50:22.900
But see what I believe,
link |
00:50:25.100
I'm not gonna talk about AI,
link |
00:50:26.100
but I will say that I think there'll be
link |
00:50:28.860
more and more algorithms that are useful.
link |
00:50:31.220
And so I don't think everybody's gonna be supplying data
link |
00:50:34.820
to groundskeeping robots,
link |
00:50:36.140
nor do I think everybody's gonna make their living
link |
00:50:38.020
with TikTok videos.
link |
00:50:38.860
I think in both cases,
link |
00:50:40.220
there'll be a rather small contingent
link |
00:50:42.820
that do well enough at either of those things.
link |
00:50:45.220
But I think there might be many, many, many, many
link |
00:50:48.340
of those niches that start to evolve
link |
00:50:49.940
as they're more and more algorithms,
link |
00:50:51.140
more and more robots.
link |
00:50:52.180
And it's that large number
link |
00:50:54.620
that will create the economic potential
link |
00:50:56.660
for a very large part of society
link |
00:50:58.620
to become members of new creative classes.
link |
00:51:01.620
So do you think it's possible to create a social network
link |
00:51:06.340
that competes with Twitter and Facebook
link |
00:51:07.980
that's large and centralized in this way?
link |
00:51:10.100
Not centralized, sort of large, large.
link |
00:51:12.220
How to get, all right, so I gotta tell you
link |
00:51:14.620
how to get from what I'm talking,
link |
00:51:16.620
how to get from where we are to anything kind of in the zone
link |
00:51:19.500
of what I'm talking about is challenging.
link |
00:51:23.740
I know some of the people who run,
link |
00:51:26.020
like I know Jack Dorsey,
link |
00:51:27.140
and I view Jack as somebody who's actually,
link |
00:51:34.860
I think he's really striving and searching
link |
00:51:36.980
and trying to find a way to make it better,
link |
00:51:40.100
but is kind of like, it's very hard to do it while in flight.
link |
00:51:44.260
And he's under enormous business pressure too.
link |
00:51:47.460
So Jack Dorsey to me is a fascinating study
link |
00:51:49.660
because I think his mind is in a lot of good places.
link |
00:51:52.700
He's a good human being,
link |
00:51:54.540
but there's a big Titanic ship
link |
00:51:56.500
that's already moving in one direction.
link |
00:51:58.020
It's hard to know what to do with it.
link |
00:51:59.220
I think that's the story of Twitter.
link |
00:52:02.740
One of the things that I observed is that
link |
00:52:04.660
if you just wanna look at the human side,
link |
00:52:06.580
meaning like how are people being changed?
link |
00:52:08.740
How do they feel?
link |
00:52:09.620
What does the culture like?
link |
00:52:11.500
Almost all of the social media platforms that get big
link |
00:52:15.940
have an initial sort of honeymoon period
link |
00:52:18.020
where they're actually kind of sweet and cute.
link |
00:52:20.300
Like if you look at the early years of Twitter,
link |
00:52:22.340
it was really sweet and cute,
link |
00:52:23.740
but also look at Snap, TikTok.
link |
00:52:27.500
And then what happens is as they scale
link |
00:52:30.420
and the algorithms become more influential
link |
00:52:32.740
instead of just the early people,
link |
00:52:34.100
when it gets big enough that it's the algorithm running it,
link |
00:52:36.900
then you start to see the rise of the paranoid style
link |
00:52:39.620
and then they start to get dark.
link |
00:52:40.780
And we've seen that shift in TikTok rather recently.
link |
00:52:43.820
But I feel like that scaling reveals the flaws
link |
00:52:48.700
within the incentives.
link |
00:52:51.660
I feel like I'm torturing you.
link |
00:52:52.860
I'm sorry.
link |
00:52:53.700
No, it's not torturing.
link |
00:52:54.540
No, because I have hope for the world with humans
link |
00:53:00.380
and I have hope for a lot of things that humans create,
link |
00:53:02.860
including technology.
link |
00:53:04.380
And I just, I feel it is possible
link |
00:53:06.860
to create social media platforms
link |
00:53:09.020
that incentivize different things than the current.
link |
00:53:13.420
I think the current incentivization
link |
00:53:15.820
is around like the dumbest possible thing
link |
00:53:18.100
that was invented like 20 years ago, however long.
link |
00:53:21.780
And it just works and so nobody's changing it.
link |
00:53:24.180
I just think that there could be a lot of innovation
link |
00:53:26.660
for more, see, you kind of push back this idea
link |
00:53:29.540
that we can't know what longterm growth or happiness is.
link |
00:53:33.180
I, if you give control to people
link |
00:53:35.660
to define what their longterm happiness and goals are,
link |
00:53:39.460
then that optimization can happen
link |
00:53:42.500
for each of those individual people.
link |
00:53:45.980
Well, I mean, imagine a future
link |
00:53:49.300
where probably a lot of people would love
link |
00:53:55.300
to make their living doing TikTok dance videos,
link |
00:53:59.220
but people recognize generally
link |
00:54:01.340
that's kind of hard to get into.
link |
00:54:03.100
Nonetheless, dance crews have an experience
link |
00:54:06.900
that's very similar to programmers working together
link |
00:54:09.580
on GitHub.
link |
00:54:10.420
So the future is like a cross between TikTok and GitHub
link |
00:54:13.300
and they get together and they have, they're,
link |
00:54:16.900
they have rights.
link |
00:54:17.740
They're negotiating, they're negotiating for returns.
link |
00:54:20.860
They join different artist societies in order
link |
00:54:23.660
to soften the blow of the randomness
link |
00:54:26.380
of who gets the network effect benefit
link |
00:54:28.660
because nobody can know that.
link |
00:54:30.540
And they, and I think an individual person
link |
00:54:35.020
might join a thousand different data unions
link |
00:54:37.260
in the course of their lives or maybe even 10,000.
link |
00:54:39.780
I don't know, but the point is that we'll have like these
link |
00:54:42.380
very hedged, distributed portfolios
link |
00:54:45.180
of different data unions we're part of.
link |
00:54:47.100
And some of them might just trickle in a little money
link |
00:54:49.580
for nonsense stuff where we're contributing
link |
00:54:52.300
to health studies or something.
link |
00:54:53.820
And, but I think people will find their way,
link |
00:54:56.300
they'll find their way to the right GitHub like community
link |
00:54:59.580
in which they find their value in the context
link |
00:55:03.140
of supplying inputs and data and taste and correctives
link |
00:55:08.140
and all of this into the algorithms
link |
00:55:10.700
and the robots of the future.
link |
00:55:12.100
And that is a way to resist the lizard brain based
link |
00:55:18.580
funding system.
link |
00:55:20.420
It's an alternate economic system
link |
00:55:22.740
that rewards productivity, creativity,
link |
00:55:26.020
value as perceived by others.
link |
00:55:27.860
It's a genuine market.
link |
00:55:28.860
It's not doled out from a center.
link |
00:55:30.500
There's not some communist person deciding who's valuable.
link |
00:55:33.740
It's an actual market.
link |
00:55:35.100
And the, the money is made by supporting that instead of just
link |
00:55:41.420
grabbing people's attention in the cheapest possible way,
link |
00:55:44.100
which is definitely how you get the lizard brain.
link |
00:55:46.260
Yeah.
link |
00:55:47.100
Okay.
link |
00:55:47.940
So we're finally at the agreement.
link |
00:55:49.700
But I, I just think that so, yeah, I'll tell you what,
link |
00:55:56.700
how I think we fix social media, there's a few things.
link |
00:56:00.580
There's a few things that I think are important.
link |
00:56:03.100
There's a few things, there's a few things.
link |
00:56:05.540
So one, I think people should have complete control over
link |
00:56:08.180
their data and transparency of what that data is
link |
00:56:11.780
and how it's being used if they do hand over the control.
link |
00:56:14.820
Another thing they should be able to delete, walk away
link |
00:56:17.220
with their data at any moment, easy.
link |
00:56:19.740
Like with a single click of a button, maybe two buttons.
link |
00:56:22.180
I don't know.
link |
00:56:23.020
Just easily walk away with their data.
link |
00:56:26.140
The other is control of the algorithm,
link |
00:56:28.180
individualized control of the algorithm for them.
link |
00:56:31.300
So each one has their own algorithm.
link |
00:56:33.540
Each person has their own algorithm.
link |
00:56:34.900
They get to be the decider of what they see in this world.
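A sketch of what "each person has their own algorithm" might look like in code: the ranking weights live with the user, are fully inspectable, and can be edited or zeroed out. The feature names here are invented for illustration:

```python
# The user owns and can inspect or edit every weight; setting them all
# to zero is the "raw waterfall" with no algorithmic mediation.
my_weights = {"recency": 0.5, "from_friends": 0.4, "outrage": -1.0}

def my_feed(posts, weights):
    def score(post):
        return sum(weights[k] * post.get(k, 0.0) for k in weights)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "recency": 0.9, "from_friends": 0.0, "outrage": 0.8},
    {"id": 2, "recency": 0.4, "from_friends": 1.0, "outrage": 0.0},
]
print([p["id"] for p in my_feed(posts, my_weights)])  # friend post wins
```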
link |
00:56:39.180
And to me, that, I mean, that's,
link |
00:56:41.300
I guess fundamentally decentralized
link |
00:56:43.820
in terms of the key decisions being made.
link |
00:56:46.180
But if that's made transparent,
link |
00:56:47.500
I feel like people will choose that system
link |
00:56:50.180
over Twitter of today, over Facebook of today.
link |
00:56:53.620
When they have the ability to walk away,
link |
00:56:55.340
to control their data and to control
link |
00:56:57.500
the kinds of thing they see.
link |
00:56:59.100
Now, let's walk away from the term AI, you're right.
link |
00:57:03.060
In this case, you have full control
link |
00:57:06.060
of the algorithms that help you if you want
link |
00:57:09.340
to use their help, but you can also say adieu
link |
00:57:11.740
to those algorithms and just consume the raw,
link |
00:57:15.980
beautiful waterfall of the internet.
link |
00:57:19.500
I think that, to me, that would not only fix social media,
link |
00:57:23.620
but I think it would make a lot more money.
link |
00:57:25.460
So I would like to challenge the idea.
link |
00:57:27.060
I know you're not presenting that,
link |
00:57:28.420
but that the only way to make a ton of money
link |
00:57:31.900
is to operate like Facebook is.
link |
00:57:33.940
I think you can make more money by giving people control.
link |
00:57:37.780
Yeah, I mean, I certainly believe that.
link |
00:57:39.980
We're definitely in the territory
link |
00:57:41.660
of wholehearted agreement here.
link |
00:57:46.460
I do want to caution against one thing,
link |
00:57:48.660
which is making a future that benefits programmers versus,
link |
00:57:53.100
like this idea that people are in control of their data.
link |
00:57:55.340
So years ago, I cofounded an advisory board for the EU
link |
00:57:59.860
with a guy named Giovanni Buttarelli who passed away.
link |
00:58:02.220
It's one of the reasons I wanted to mention it.
link |
00:58:03.540
A remarkable guy who'd been,
link |
00:58:06.140
he was originally a prosecutor
link |
00:58:07.740
who was throwing mafiosi in jail in Sicily.
link |
00:58:12.060
So he was like this intense guy who was like,
link |
00:58:15.060
I've dealt with death threats.
link |
00:58:17.180
Mark Zuckerberg doesn't scare me or whatever.
link |
00:58:19.020
So we worked on this path of saying,
link |
00:58:22.220
let's make it all about transparency and consent.
link |
00:58:24.260
And it was one of the feeders
link |
00:58:25.580
that led to this huge data privacy
link |
00:58:30.700
and protection framework in Europe called the GDPR.
link |
00:58:34.020
And so therefore we've been able
link |
00:58:36.460
to have empirical feedback on how that goes.
link |
00:58:39.140
And the problem is that most people actually get stymied
link |
00:58:44.060
by the complexity of that kind of management.
link |
00:58:46.940
They have trouble and reasonably so.
link |
00:58:49.740
I don't, I'm like a techie.
link |
00:58:51.580
I can go in and I can figure out what's going on.
link |
00:58:54.500
But most people really do.
link |
00:58:56.820
And so there's a problem that it differentially benefits
link |
00:59:03.180
those who kind of have a technical mindset and can go in
link |
00:59:06.220
and sort of have a feeling for how this stuff works.
link |
00:59:09.020
I kind of still want to come back to incentives.
link |
00:59:11.580
And so if the incentive for whoever's,
link |
00:59:15.100
if the commercial incentive is to help the creative people
link |
00:59:17.620
of the future make more money because you get a cut of it,
link |
00:59:20.500
that's how you grow an economy.
link |
00:59:22.300
Not the programmers.
link |
00:59:24.100
Well, some of them will be programmers.
link |
00:59:25.660
It's not anti programmer.
link |
00:59:26.700
I'm just saying that it's not only programmers.
link |
00:59:29.860
So yeah, you have to make sure the incentives are right.
link |
00:59:35.660
I mean, I think control is an interface problem
link |
00:59:40.500
to where you have to create something
link |
00:59:41.780
that's compelling to everybody,
link |
00:59:45.100
to the creatives, to the public.
link |
00:59:48.100
I mean, there's a, I don't know, creative commons,
link |
00:59:52.020
like the licensing, there's a bunch of legal speak
link |
00:59:57.300
just in general, the whole legal profession.
link |
01:00:00.340
It's nice when it can be simplified
link |
01:00:01.860
in the way that you can truly simply understand,
link |
01:00:04.020
everybody can simply understand the basics.
link |
01:00:07.860
In the same way, it should be very simple to understand
link |
01:00:12.660
how the data is being used
link |
01:00:14.900
and what data is being used for people.
link |
01:00:17.580
But then you're arguing that in order for that to happen,
link |
01:00:20.580
you have to have the incentives aligned.
link |
01:00:22.460
I mean, a lot of the reason that money works
link |
01:00:26.620
is actually information hiding and information loss.
link |
01:00:30.260
Like one of the things about money is a particular dollar
link |
01:00:33.900
you might have passed through your enemy's hands
link |
01:00:36.460
and you don't know it.
link |
01:00:37.780
But also, I mean, this is what Adam Smith,
link |
01:00:40.380
if you wanna give the most charitable interpretation
link |
01:00:42.860
possible to the invisible hand is what he was saying,
link |
01:00:46.020
is that there's this whole complicated thing
link |
01:00:48.620
and not only do you not need to know about it,
link |
01:00:50.660
the truth is you'd never be able to follow it
link |
01:00:52.220
if you tried and just let the economic incentives
link |
01:00:55.860
solve for this whole thing and that in a sense,
link |
01:01:00.300
every transaction is like a neuron in a neural net.
link |
01:01:02.580
If he'd had that metaphor, he would have used it
link |
01:01:05.700
and let the whole thing settle to a solution
link |
01:01:08.060
and don't worry about it.
link |
01:01:09.740
I think this idea of having incentives
link |
01:01:13.700
that reduce complexity for people can be made to work.
link |
01:01:17.380
And that's an example of an algorithm
link |
01:01:19.260
that could be manipulative or not,
link |
01:01:20.620
going back to your question before about,
link |
01:01:22.020
can you do it in a way that's not manipulative?
link |
01:01:24.500
And I would say a GitHub like,
link |
01:01:28.140
if you just have this vision,
link |
01:01:29.340
GitHub plus TikTok combined, is it possible?
link |
01:01:33.340
I think it is, I really think it is.
link |
01:01:34.860
I'm not gonna be able to unsee that idea
link |
01:01:38.740
of creatives on TikTok collaborating in the same way
link |
01:01:41.100
that people on GitHub collaborate.
link |
01:01:42.820
Why not?
link |
01:01:43.660
I like that kind of version.
link |
01:01:44.580
Why not?
link |
01:01:45.700
I like it, I love it.
link |
01:01:46.540
I just, like right now when people use,
link |
01:01:48.940
by the way, father of teenage daughter, so.
link |
01:01:51.740
It's all about TikTok, right?
link |
01:01:53.620
So, you know, when people use TikTok,
link |
01:01:55.620
there's a lot of, it's kind of funny,
link |
01:01:59.140
I was gonna say cattiness,
link |
01:02:00.180
but I was just using the cat as this exemplar
link |
01:02:03.020
of what we're coming up with.
link |
01:02:04.700
I contradict myself, but anyway,
link |
01:02:06.300
there's all this cattiness where people are like,
link |
01:02:07.860
ee, ee, ee, ee, ee, ee, ee.
link |
01:02:09.780
And I just, what about people getting together
link |
01:02:13.660
and saying, okay, we're gonna work on this move,
link |
01:02:16.580
we're gonna get a better, can we get a better musician?
link |
01:02:18.780
And they do that, but that's the part
link |
01:02:22.060
that's kind of off the books right now.
link |
01:02:25.020
You know, that should be like right there,
link |
01:02:26.380
that should be the center.
link |
01:02:27.220
That's where the, that's the really best part.
link |
01:02:29.540
Well, that's where the invention of git, period,
link |
01:02:32.060
the versioning is brilliant.
link |
01:02:33.420
And so some of the things you're talking about,
link |
01:02:36.860
technology, algorithms, tools can empower.
link |
01:02:40.420
And that's the thing, for humans to connect,
link |
01:02:43.780
to collaborate and so on.
link |
01:02:45.100
Can we upset more people a little bit?
link |
01:02:48.060
You already.
link |
01:02:49.380
Maybe, we'd have to try.
link |
01:02:50.860
No, no, can we, can ask you to elaborate,
link |
01:02:53.900
because my intuition was that you would be a supporter
link |
01:02:57.020
of something like cryptocurrency and Bitcoin,
link |
01:02:59.340
because it fundamentally emphasizes decentralization.
link |
01:03:03.140
What do you, so can you elaborate?
link |
01:03:05.700
Yeah, okay, look.
link |
01:03:06.900
Your thoughts on Bitcoin.
link |
01:03:08.060
I, it's kind of funny.
link |
01:03:11.620
I wrote, I've been advocating
link |
01:03:15.140
some kind of digital currency for a long time.
link |
01:03:17.700
And when the, when the, when Bitcoin came out
link |
01:03:23.020
and the original paper on, on blockchain,
link |
01:03:26.820
my heart kind of sank, because I thought, oh my God,
link |
01:03:30.420
we're applying all of this fancy thought
link |
01:03:32.780
on all these very careful distributed security measures
link |
01:03:36.420
to recreate the gold standard.
link |
01:03:38.620
Like it's just so retro, it's so dysfunctional.
link |
01:03:42.180
It's so useless from an economic point of view.
link |
01:03:44.100
So it's always, and then the other thing
link |
01:03:46.420
is using computational inefficiency at a boundless scale
link |
01:03:50.260
as your form of security is a crime against this atmosphere.
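What "computational inefficiency as security" refers to is proof-of-work: grinding nonces until a hash clears a difficulty target, with expected energy cost growing exponentially in the difficulty. A minimal sketch of the puzzle, not Bitcoin's actual target encoding:

```python
import hashlib

def proof_of_work(block_data: bytes, difficulty: int) -> int:
    """Find a nonce whose hash has `difficulty` leading zero hex digits.
    Expected work (and energy) grows ~16x per extra digit."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(proof_of_work(b"example block", 4))  # easy here; ruinous at scale
```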
link |
01:03:54.020
Obviously, a lot of people know that now,
link |
01:03:55.540
but we knew that at the start.
link |
01:03:57.700
Like the thing is when the first paper came out,
link |
01:03:59.820
I remember a lot of people saying, oh my God,
link |
01:04:01.500
if this thing scales, it's a carbon disaster, you know?
link |
01:04:04.540
And, and I, I just like, I'm just mystified,
link |
01:04:09.340
but that's a different question than what you asked.
link |
01:04:11.500
Can you have a cryptographic currency
link |
01:04:15.220
or at least some kind of digital currency
link |
01:04:17.460
that's of a benefit?
link |
01:04:18.380
And absolutely, like I'm,
link |
01:04:20.420
and there are people who are trying to be thoughtful
link |
01:04:22.180
about this, you should, if you haven't,
link |
01:04:23.900
you should interview Vitalik Buterin sometime.
link |
01:04:25.980
Yeah, I've interviewed him twice.
link |
01:04:27.700
Okay, so like there are people in the community
link |
01:04:30.100
who are trying to be thoughtful
link |
01:04:31.060
and trying to figure out how to do this better.
link |
01:04:33.020
It has nice properties though, right?
link |
01:04:34.420
So that one of the nice properties
link |
01:04:35.860
is that, unlike government centralized money, it's hard to control.
link |
01:04:39.260
And then the other one, to fix some of the issues
link |
01:04:41.420
that you're referring to,
link |
01:04:42.260
I'm sort of playing devil's advocate here is,
link |
01:04:44.340
you know, there's lightning network,
link |
01:04:45.580
there's ideas how you build stuff on top of Bitcoin,
link |
01:04:49.980
similar with gold,
link |
01:04:51.140
that allow you to have this kind of vibrant economy
link |
01:04:54.460
that operates not on the blockchain,
link |
01:04:56.260
but outside the blockchain
link |
01:04:57.420
and use this Bitcoin for like checking the security
link |
01:05:02.300
of those transactions.
link |
01:05:03.340
So Bitcoin's not new, it's been around for a while.
link |
01:05:05.900
I've been watching it closely.
link |
01:05:08.100
I've not seen one example of it creating economic growth.
link |
01:05:12.980
There was this obsession with the idea
link |
01:05:14.460
that government was the problem.
link |
01:05:16.180
That idea that government's the problem,
link |
01:05:18.500
let's say government earned that wrath honestly,
link |
01:05:22.860
because if you look at some of the things
link |
01:05:25.180
that governments have done in recent decades,
link |
01:05:27.180
it's not a pretty story.
link |
01:05:28.980
Like after a very small number of people
link |
01:05:33.420
in the US government decided to bomb and landmine
link |
01:05:37.460
Southeast Asia, it's hard to come back
link |
01:05:40.180
and say, oh, government's a great thing.
link |
01:05:41.820
But then the problem is that this resistance to government
link |
01:05:48.900
is basically resistance to politics.
link |
01:05:51.020
It's a way of saying,
link |
01:05:52.380
if I can get rich, nobody should bother me.
link |
01:05:54.300
It's a way of not having obligations to others.
link |
01:05:56.940
And that ultimately is a very suspect motivation.
link |
01:06:00.740
But does that mean that the impulse,
link |
01:06:04.180
that the government should not overreach its power is flawed?
link |
01:06:09.340
Well, I mean, what I wanna ask you to do
link |
01:06:12.100
is to replace the word government with politics.
link |
01:06:15.500
Like our politics is people having to deal with each other.
link |
01:06:19.940
My theory about freedom is that the only authentic
link |
01:06:23.540
form of freedom is perpetual annoyance.
link |
01:06:26.500
All right, so annoyance means
link |
01:06:29.340
you're actually dealing with people
link |
01:06:30.660
because people are annoying.
link |
01:06:31.860
Perpetual means that that annoyance is survivable
link |
01:06:34.580
so it doesn't destroy us all.
link |
01:06:36.220
So if you have perpetual annoyance, then you have freedom.
link |
01:06:38.660
And that's politics.
link |
01:06:39.820
That's politics.
link |
01:06:40.660
If you don't have perpetual annoyance,
link |
01:06:42.900
something's gone very wrong
link |
01:06:44.340
and you've suppressed those people
link |
01:06:45.620
that it's only temporary,
link |
01:06:46.460
it's gonna come back and be horrible.
link |
01:06:48.420
You should seek perpetual annoyance.
link |
01:06:51.060
I'll invite you to a Berkeley City Council meeting
link |
01:06:52.900
so you can know what that feels like.
link |
01:06:54.060
What perpetual annoyance feels like.
link |
01:06:57.500
But anyway, so freedom is being,
link |
01:06:59.740
the test of freedom is that you're annoyed by other people.
link |
01:07:02.180
If you're not, you're not free.
link |
01:07:03.540
If you're not, you're trapped in some temporary illusion
link |
01:07:06.180
that's gonna fall apart.
link |
01:07:07.820
Now, this quest to avoid government
link |
01:07:10.580
is really a quest to avoid that political feeling
link |
01:07:13.020
but you have to have it.
link |
01:07:14.180
You have to deal with it.
link |
01:07:16.540
And it sucks, but that's the human situation.
link |
01:07:19.340
That's the human condition.
link |
01:07:20.700
And this idea that we're gonna have this abstract thing
link |
01:07:22.940
that protects us from having to deal with each other
link |
01:07:25.300
is always an illusion.
link |
01:07:26.780
The idea, and I apologize,
link |
01:07:28.740
I overstretched the use of the word government.
link |
01:07:32.340
The idea is there should be some punishment from the people
link |
01:07:37.340
when a bureaucracy, when a set of people
link |
01:07:41.180
or a particular leader, like in an authoritarian regime,
link |
01:07:44.620
which more than half the world currently lives under,
link |
01:07:47.220
if you, like if they become,
link |
01:07:51.900
they start, stop representing the people.
link |
01:07:53.820
It stops being like a Berkeley meeting
link |
01:07:56.700
and starts being more like a dictatorial kind of situation.
link |
01:08:01.700
And so the point is, it's nice to give people,
link |
01:08:06.100
the populace in a decentralized way,
link |
01:08:08.980
power to resist that kind of government
link |
01:08:14.540
becoming authoritarian.
link |
01:08:15.820
Yeah, but people, see this idea that the problem
link |
01:08:18.380
is always the government being powerful is false.
link |
01:08:21.580
The problem can also be criminal gangs.
link |
01:08:23.660
The problem can also be weird cults.
link |
01:08:25.580
The problem can be abusive,
link |
01:08:29.380
abusive clergy.
link |
01:08:30.500
The problem can be infrastructure that fails.
link |
01:08:35.220
The problem can be poisoned water.
link |
01:08:37.660
The problem can be failed electric grids.
link |
01:08:39.980
The problem can be a crappy education system
link |
01:08:45.680
that makes the whole society less and less able
link |
01:08:49.460
to create value.
link |
01:08:51.420
There are all these other problems
link |
01:08:52.860
that are different from an overbearing government.
link |
01:08:54.660
Like you have to keep some sense of perspective
link |
01:08:56.980
and not be obsessed with only one kind of problem
link |
01:08:59.300
because then the others will pop up.
link |
01:09:01.220
But empirically speaking,
link |
01:09:02.500
some problems are bigger than others.
link |
01:09:05.340
So like some groups of people,
link |
01:09:08.900
like governments or gangs or companies lead
link |
01:09:11.500
to problems more than others.
link |
01:09:12.340
Are you a US citizen?
link |
01:09:13.580
Yes.
link |
01:09:14.420
Has the government ever really been a problem for you?
link |
01:09:16.580
Well, okay.
link |
01:09:17.420
So first of all, I grew up in the Soviet Union.
link |
01:09:20.140
And actually, yeah, my wife did too.
link |
01:09:22.340
So I have seen, you know.
link |
01:09:25.540
Sure.
link |
01:09:26.900
And has the government bothered me?
link |
01:09:28.900
I would say that that's a really complicated question,
link |
01:09:32.820
especially because the United States is such,
link |
01:09:35.500
it's a special place in like a lot of other countries.
link |
01:09:39.660
My wife's family were refuseniks.
link |
01:09:41.820
And so we have like a very,
link |
01:09:43.100
and her dad was sent to the gulag
link |
01:09:46.220
for what it's worth on my father's side,
link |
01:09:49.260
all but a few were killed by a pogrom
link |
01:09:51.460
in a post Soviet pogrom in Ukraine.
link |
01:09:57.140
So I would say because you did a little trick
link |
01:10:00.140
an eloquent trick of language,
link |
01:10:02.900
that you switched to the United States
link |
01:10:04.820
to talk about government.
link |
01:10:06.260
So I believe, unlike my friend Michael Malice,
link |
01:10:10.620
who's an anarchist,
link |
01:10:12.060
I believe government can do a lot of good in the world.
link |
01:10:15.780
That is exactly what you're saying,
link |
01:10:17.140
which is it's politics.
link |
01:10:19.700
The thing that Bitcoin folks and cryptocurrency folks argue
link |
01:10:22.940
is that one of the big ways
link |
01:10:25.060
that government can control the populace
link |
01:10:26.780
is centralized banks, like controlling the money.
link |
01:10:30.100
That was the case in the Soviet Union too.
link |
01:10:32.300
Inflation can really make poor people suffer.
link |
01:10:38.620
And so what they argue is this is one way
link |
01:10:42.180
to go around that power that government has
link |
01:10:46.300
of controlling the monetary system.
link |
01:10:48.580
So that's a way to resist.
link |
01:10:50.220
That's not actually saying government bad.
link |
01:10:53.460
That's saying some of the ways
link |
01:10:55.700
that central banks get into trouble
link |
01:10:59.780
can be resisted through decentralization.
link |
01:11:01.340
So let me ask you, on balance today in the real world,
link |
01:11:05.140
in terms of actual facts.
link |
01:11:07.740
Do you think cryptocurrencies are doing more
link |
01:11:10.180
to prop up corrupt, murderous, horrible regimes
link |
01:11:13.980
or to resist those regimes?
link |
01:11:15.780
Where do you think the balance is right now?
link |
01:11:17.580
I know exactly having talked to a lot of cryptocurrency folks
link |
01:11:21.500
what they would tell me, right?
link |
01:11:26.860
No, no, no.
link |
01:11:27.700
I'm asking it as a real question.
link |
01:11:29.380
There's no way to know the answer.
link |
01:11:30.820
There's no way to know the answer perfectly.
link |
01:11:32.700
However, I gotta say, if you look at people
link |
01:11:36.460
who've been able to decode blockchains
link |
01:11:39.780
and they do leak a lot of data,
link |
01:11:41.100
they're not as secure as is widely thought.
link |
01:11:43.580
There are a lot of unknown Bitcoin whales
link |
01:11:47.260
from pretty early and they're huge.
link |
01:11:49.740
And if you ask who are these people,
link |
01:11:54.860
there's evidence that a lot of them are quite often
link |
01:11:57.620
not the people you'd wanna support, let's say.
link |
01:12:00.220
And I just don't, like I think empirically,
link |
01:12:03.820
this idea that there's some intrinsic way
link |
01:12:07.260
that bad governments will be disempowered
link |
01:12:13.620
and people will be able to resist them more
link |
01:12:16.100
than new villains or even villainous governments
link |
01:12:18.900
will be empowered.
link |
01:12:19.740
There's no basis for that assertion.
link |
01:12:21.860
It's just this kind of circumstantial.
link |
01:12:24.540
And I think in general,
link |
01:12:28.420
Bitcoin ownership is one thing,
link |
01:12:31.220
but Bitcoin transactions have tended
link |
01:12:33.540
to support criminality more than productivity.
link |
01:12:37.420
Of course, they would argue that was the story
link |
01:12:39.620
of its early days,
link |
01:12:40.740
that now more and more Bitcoin is being used
link |
01:12:43.860
for legitimate transactions.
link |
01:12:46.460
But that's a different,
link |
01:12:47.620
I didn't say for legitimate transactions,
link |
01:12:49.260
I said for economic growth, for creativity.
link |
01:12:52.100
Like I think what's happening is people are using it
link |
01:12:57.020
a little bit for, I don't know,
link |
01:12:59.220
maybe some of these companies make it available
link |
01:13:02.700
for this and that, buy a Tesla with it or something.
link |
01:13:07.380
Investing in a startup, hard,
link |
01:13:10.300
it might have happened a little bit,
link |
01:13:11.500
but it's not an engine of productivity,
link |
01:13:13.860
creativity and economic growth.
link |
01:13:16.220
Whereas old fashioned currency still is.
link |
01:13:18.500
And anyway, I'm, look, I think something,
link |
01:13:23.900
I'm pro the idea of digital currencies.
link |
01:13:27.020
I am anti the idea of economics,
link |
01:13:34.660
wiping out politics as a result.
link |
01:13:37.740
I think they have to exist in some balance
link |
01:13:40.020
to avoid the worst dysfunctions of each.
link |
01:13:42.460
In some ways there's parallels to our discussion
link |
01:13:44.780
of algorithms: with cryptocurrency, you're pro the idea,
link |
01:13:50.980
but it can be used to manipulate,
link |
01:13:54.380
it can be used poorly by the aforementioned humans.
link |
01:13:59.340
Well, I think that you can make better designs
link |
01:14:02.220
and worse designs.
link |
01:14:03.500
And I think, and you know, the thing about cryptocurrency
link |
01:14:06.300
that's so interesting is how many of us
link |
01:14:09.700
are responsible for the poor designs
link |
01:14:12.740
because we're all so hooked on that Horatio Alger story
link |
01:14:16.860
on like, I'm gonna be the one who gets the viral benefit.
link |
01:14:20.020
You know, way back when all this stuff was starting,
link |
01:14:22.860
I remember it would have been in the 80s,
link |
01:14:24.820
somebody had the idea of using viral
link |
01:14:26.740
as a metaphor for network effect.
link |
01:14:29.660
And the whole point was to talk about
link |
01:14:32.300
how bad network effect was,
link |
01:14:33.660
that it always created distortions
link |
01:14:35.860
that ruined the usefulness of economic incentives
link |
01:14:39.220
that created dangerous distortions.
link |
01:14:42.540
Like, but then somehow even after the pandemic,
link |
01:14:45.620
we think of viral as this good thing
link |
01:14:47.140
because we imagine ourselves as the virus, right?
link |
01:14:49.380
We wanna be on the beneficiary side of it.
link |
01:14:52.180
But of course, you're not likely to be.
link |
01:14:54.540
There is a sense because money is involved,
link |
01:14:57.020
people are not reasoning clearly always
link |
01:15:01.580
because they want to be part of that first viral wave
link |
01:15:06.580
that makes them rich.
link |
01:15:07.620
And that blinds people from their basic morality.
link |
01:15:11.380
I had an interesting conversation.
link |
01:15:13.460
I don't, I sort of feel like
link |
01:15:14.820
I should respect some people's privacy,
link |
01:15:16.500
but some of the initial people who started Bitcoin,
link |
01:15:20.900
I remember having an argument about,
link |
01:15:23.060
like it's intrinsically a Ponzi scheme,
link |
01:15:26.580
like the early people have more than the later people.
link |
01:15:29.580
And the further down the chain you get,
link |
01:15:31.820
the more you're subject to gambling like dynamics,
link |
01:15:34.900
where it's more and more random
link |
01:15:36.180
and more and more subject to weird network effects and whatnot,
link |
01:15:38.900
unless you're a very small player, perhaps,
link |
01:15:41.420
and you're just buying something,
link |
01:15:43.060
but even then you'll be subject to fluctuations
link |
01:15:45.260
because the whole thing is just kind of,
link |
01:15:47.220
like as it fluctuates,
link |
01:15:49.060
it's going to wave around the little people more.
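The early-versus-late asymmetry he describes can be shown with a toy model: if the buy-in price rises with each adoption wave and everyone could exit at the same final price, the return multiple falls monotonically with entry time. Purely illustrative numbers:

```python
def entry_returns(final_price=100.0, waves=10):
    """Return multiple for each adoption wave, assuming price rises
    linearly with each wave and everyone sells at final_price."""
    return [final_price / buy_in for buy_in in range(1, waves + 1)]

for wave, multiple in enumerate(entry_returns(), start=1):
    print(f"wave {wave:2d}: {multiple:5.1f}x")
```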
link |
01:15:51.820
And I remember the conversation turned to gambling
link |
01:15:55.380
because gambling is a pretty large economic sector.
link |
01:15:58.220
And it's always struck me as being nonproductive.
link |
01:16:01.580
Like somebody goes to Las Vegas and they lose money.
link |
01:16:03.820
And so one argument is, well, they got entertainment.
link |
01:16:07.060
They paid for entertainment as they lost money.
link |
01:16:09.100
So that's fine.
link |
01:16:10.460
And Las Vegas does up the losing of money
link |
01:16:13.540
in an entertaining way.
link |
01:16:14.380
So why not?
link |
01:16:15.220
It's like going to a show.
link |
01:16:16.180
So that's one argument.
link |
01:16:17.700
The argument that was made to me was different from that.
link |
01:16:19.860
It's that, no, what they're doing
link |
01:16:21.420
is they're getting a chance to experience hope.
link |
01:16:23.900
And a lot of people don't get that chance.
link |
01:16:25.540
And so that's really worth it,
link |
01:16:26.700
even if they're going to lose.
link |
01:16:27.660
They have that moment of hope
link |
01:16:29.100
and they need to be able to experience that.
link |
01:16:31.300
And it's a very interesting argument.
link |
01:16:35.020
That's so heartbreaking because I've seen that.
link |
01:16:40.100
I have a little bit of a sense of that.
link |
01:16:41.780
I've talked to some young people who invest in cryptocurrency.
link |
01:16:46.020
And what I see is this hope.
link |
01:16:48.500
This is the first thing that gave them hope.
link |
01:16:50.420
And that's so heartbreaking to me
link |
01:16:52.980
that you've gotten hope from that.
link |
01:16:55.580
So much is invested.
link |
01:16:56.860
It's like hope from somehow becoming rich
link |
01:17:00.060
as opposed to something deeper to me, I apologize.
link |
01:17:02.380
But money is in the longterm
link |
01:17:04.980
not going to be a source of that deep meaning.
link |
01:17:07.940
It's good to have enough money,
link |
01:17:09.820
but it should not be the source of hope.
link |
01:17:12.140
And it's heartbreaking to me
link |
01:17:13.180
how many people it's the source of hope for.
link |
01:17:16.260
Yeah.
link |
01:17:18.460
You've just described the psychology of virality
link |
01:17:21.340
or the psychology of trying to base the civilization
link |
01:17:25.780
on semi random occurrences of network effect peaks.
link |
01:17:29.740
And it doesn't really work.
link |
01:17:32.300
I mean, I think we need to get away from that.
link |
01:17:34.220
We need to soften those peaks
link |
01:17:38.060
and, except Microsoft, which deserves every penny,
link |
01:17:40.460
but in every other case.
link |
01:17:42.020
Well, you mentioned GitHub.
link |
01:17:43.980
I think what Microsoft did with GitHub was brilliant.
link |
01:17:46.140
I was very, okay, if I can give a, not a critical,
link |
01:17:51.180
but on Microsoft,
link |
01:17:54.340
because they recently purchased Bethesda,
link |
01:17:57.100
so Elder Scrolls is in their hands.
link |
01:17:59.900
I'm watching you, Microsoft,
link |
01:18:01.380
not screw up my favorite game, so.
link |
01:18:03.900
Yeah, look, I'm not speaking for Microsoft.
link |
01:18:07.060
I have an explicit arrangement with them
link |
01:18:09.100
where I don't speak for them.
link |
01:18:10.580
Obviously, that should be very clear.
link |
01:18:12.220
I do not speak for them.
link |
01:18:14.500
I am not saying, I like them.
link |
01:18:17.420
I think Satya's amazing.
link |
01:18:20.620
The term data dignity was coined by Satya.
link |
01:18:23.700
Like, so we have, it's kind of extraordinary,
link |
01:18:27.180
but Microsoft's this giant thing.
link |
01:18:29.420
It's gonna screw up this or that.
link |
01:18:31.180
It's not, I don't know.
link |
01:18:33.500
It's kind of interesting.
link |
01:18:35.020
I've had a few occasions in my life
link |
01:18:36.820
to see how things work from the inside of some big thing.
link |
01:18:39.900
And, you know, it's always just people kind of,
link |
01:18:42.620
it's, I don't know.
link |
01:18:44.700
There's always like coordination problems.
link |
01:18:48.220
And there's always.
link |
01:18:49.540
There's human problems.
link |
01:18:50.660
Oh, God.
link |
01:18:51.500
And there's some good people, there's some bad people.
link |
01:18:52.700
It's always.
link |
01:18:53.540
I hope Microsoft doesn't screw up your game.
link |
01:18:56.260
And I hope they bring Clippy back.
link |
01:18:57.900
You should never kill Clippy.
link |
01:18:59.500
Bring Clippy back.
link |
01:19:00.340
Oh, Clippy, but Clippy promotes the myth of AI.
link |
01:19:04.060
Well, that's why I think you're wrong.
link |
01:19:06.340
How about if we, all right,
link |
01:19:07.980
could we bring back Bob instead of Clippy?
link |
01:19:10.260
Which one was Bob?
link |
01:19:11.220
Oh, Bob was another thing.
link |
01:19:13.180
Bob was this other screen character
link |
01:19:15.260
who was supposed to be the voice of AI.
link |
01:19:16.900
Cortana, Cortana, would Cortana do it for you?
link |
01:19:19.340
No, Cortana is too corporate.
link |
01:19:21.340
I like it.
link |
01:19:23.020
I like it.
link |
01:19:23.860
It was fine.
link |
01:19:24.700
There's a woman in Seattle
link |
01:19:26.260
who's like the model for Cortana.
link |
01:19:27.580
Did Cortana's voice and was that voice?
link |
01:19:29.260
There was like.
link |
01:19:30.100
No, the voice is great.
link |
01:19:31.140
We had her as a, she used to walk around
link |
01:19:34.380
and if you were wearing HoloLens for a bit,
link |
01:19:36.300
I don't think that's happening anymore.
link |
01:19:38.180
I think, I don't think you should
link |
01:19:39.180
turn software into a creature.
link |
01:19:41.140
I think.
link |
01:19:41.980
Well, you and I.
link |
01:19:42.820
Get a cat, just get a cat.
link |
01:19:43.660
You and I, you and I, well, get a dog.
link |
01:19:46.060
Get a dog.
link |
01:19:46.900
Get a dog, yeah.
link |
01:19:47.980
Yeah, you're a.
link |
01:19:49.100
A hedgehog.
link |
01:19:49.940
A hedgehog.
link |
01:19:50.780
Yeah.
link |
01:19:51.900
You coauthored a paper.
link |
01:19:54.340
You mentioned Lee Smolin, titled
link |
01:19:56.900
The Autodidactic Universe,
link |
01:20:00.180
which describes our universe as one
link |
01:20:01.900
that learns its own physical laws.
link |
01:20:06.420
That's a trippy and beautiful and powerful idea.
link |
01:20:09.300
What are, what would you say are the key ideas in this paper?
link |
01:20:13.140
Okay.
link |
01:20:13.980
Well, I should say that paper reflected work from last year
link |
01:20:18.780
and the project, the program has moved quite a lot.
link |
01:20:21.700
So it's a little, there's a lot of stuff
link |
01:20:23.460
that's not published that I'm quite excited about.
link |
01:20:25.380
So I have to kind of keep my frame
link |
01:20:27.900
in that, in that last year's thing.
link |
01:20:30.340
So I have to try to be a little careful about that.
link |
01:20:33.940
We can think about it in a few different ways.
link |
01:20:37.380
The core of the paper, the technical core of it
link |
01:20:40.740
is a triple correspondence.
link |
01:20:43.860
One part of it was already established
link |
01:20:47.140
and then another part is in the process.
link |
01:20:49.820
The part that was established was, of course,
link |
01:20:53.180
understanding different theories of physics
link |
01:20:55.540
as matrix models.
link |
01:20:57.260
The part that was fresher is understanding those
link |
01:21:01.780
as a machine learning system,
link |
01:21:03.580
so that we could move fluidly
link |
01:21:04.980
between these different ways of describing systems.
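As a cartoon of that correspondence (not the paper's actual construction): treat a candidate "law" as a parameter and let it update against observed dynamics by gradient descent, which is literally a machine learning loop. Everything below is a toy assumption:

```python
import random

# Toy 'law': next_state = w * state. The true hidden law has w = 0.7.
TRUE_W = 0.7
w = random.random()       # the universe's initial guess at its own law
lr = 0.05

for step in range(200):
    state = random.uniform(-1, 1)
    observed_next = TRUE_W * state
    predicted = w * state
    # Gradient of squared error wrt w: the standard learning update.
    w -= lr * 2 * (predicted - observed_next) * state

print(f"learned law w = {w:.3f} (true {TRUE_W})")
```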
link |
01:21:07.540
And the reason to wanna do that
link |
01:21:10.420
is just to have more tools and more options
link |
01:21:12.740
because, well, theoretical physics is really hard
link |
01:21:17.700
and a lot of programs have kind of run into a state
link |
01:21:23.540
where they feel a little stalled, I guess.
link |
01:21:25.500
I wanna be delicate about this
link |
01:21:26.940
because I'm not a physicist.
link |
01:21:27.860
I'm the computer scientist collaborating.
link |
01:21:29.700
So I don't mean to diss anybody.
link |
01:21:32.380
So this is almost like,
link |
01:21:33.540
gives a framework for generating new ideas in physics.
link |
01:21:36.860
As we start to publish more about where it's gone,
link |
01:21:40.140
I think you'll start to see there's tools
link |
01:21:43.620
and ways of thinking about theories
link |
01:21:46.540
that I think open up some new paths
link |
01:21:50.660
that will be of interest.
link |
01:21:53.580
There's the technical core of it,
link |
01:21:55.100
which is this idea of a correspondence
link |
01:21:57.500
to give you more facility.
link |
01:21:58.860
But then there's also the storytelling part of it.
link |
01:22:01.500
And this is something Lee loves stories and I do.
link |
01:22:06.700
And the idea here is that
link |
01:22:11.060
a typical way of thinking about physics
link |
01:22:15.300
is that there's some kind of starting condition
link |
01:22:18.420
and then there's some principle
link |
01:22:19.620
by which the starting condition evolves.
link |
01:22:22.620
And the question is like, why the starting condition?
link |
01:22:26.460
Like how the starting condition has to get kind of,
link |
01:22:30.940
it has to be fine tuned
link |
01:22:32.660
and all these things about it have to be kind of perfect.
link |
01:22:35.980
And so we were thinking, well, look, what if we could push
link |
01:22:40.020
the storytelling about where the universe comes from
link |
01:22:42.220
much further back by starting with really simple things
link |
01:22:45.460
that evolve and then through that evolution,
link |
01:22:47.460
explain how things got to be,
link |
01:22:48.580
how they are through very simple principles, right?
link |
01:22:51.420
And so we've been exploring a variety of ways
link |
01:22:55.260
to push the start of the storytelling
link |
01:22:57.820
further and further back, which is,
link |
01:23:02.540
it's really kind of interesting,
link |
01:23:03.780
because, like, for all of that,
link |
01:23:07.100
Lee is sometimes considered
link |
01:23:11.580
to have a radical quality in the physics world.
link |
01:23:13.940
But he's still like, no, the kind of time
link |
01:23:18.380
we're talking about,
link |
01:23:19.980
in which evolution happens, is the same time we're in now,
link |
01:23:22.540
and we're talking about something that starts and continues.
link |
01:23:25.900
And I'm like, well, what if there's some other kind of time
link |
01:23:28.500
that's time like, and it sounds like metaphysics,
link |
01:23:31.220
but there's an ambiguity, you know, like,
link |
01:23:34.740
it has to start from something and it's kind of interesting.
link |
01:23:38.100
So there's this, a lot of the math can be thought of either way,
link |
01:23:42.620
which is kind of interesting.
link |
01:23:44.180
So it pushes so far back that basically all the things
link |
01:23:46.700
we take for granted in physics start becoming
link |
01:23:48.820
emergent, it's all emergent.
link |
01:23:51.180
I really want to emphasize this is all super baby steps.
link |
01:23:53.580
I don't want to over claim.
link |
01:23:54.580
It's like, I think a lot of the things we're doing,
link |
01:23:57.580
we're approaching some old problems
link |
01:23:59.180
in a pretty fresh way, informed by machine learning.
link |
01:24:02.380
There's been a zillion papers about how you can think of
link |
01:24:04.940
the universe as a big neural net
link |
01:24:06.340
or how you can think of different ideas in physics
link |
01:24:09.260
as being quite similar to or even equivalent to
link |
01:24:12.420
some of the ideas in machine learning.
link |
01:24:14.420
And that actually works out crazy well.
link |
01:24:18.820
Like, I mean, that is actually kind of eerie
link |
01:24:21.140
when you look at it, like there's probably
link |
01:24:24.580
two or three dozen papers that have this quality
link |
01:24:26.940
and some of them are just crazy good
link |
01:24:28.620
and it's very interesting.
link |
01:24:30.700
What we're trying to do is take those kinds of observations
link |
01:24:34.140
and turn them into an actionable framework
link |
01:24:35.940
where you can then start to do things
link |
01:24:38.860
with landscapes of theories that you couldn't do before
link |
01:24:40.660
and that sort of thing.
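To give a rough flavor of the kind of correspondence being described, here is a toy sketch, and only a sketch: it is not the construction from the paper. It treats the matrix of a simple one-matrix model as the learnable parameter of an optimizer, so that gradient descent on the action plays the role of a learning rule. The double-well action and all numerical choices below are illustrative assumptions.

```python
import numpy as np

# Toy sketch only (not the paper's method): a one-matrix model with a
# double-well action S(M) = -tr(M^2) + g * tr(M^4), viewed as a
# "learning system" whose weights are the matrix entries.
# Gradient descent on S stands in for the learning dynamics.

rng = np.random.default_rng(0)
N, g, lr, steps = 8, 0.1, 0.01, 2000

# Small random symmetric (real Hermitian) initial condition.
A = rng.normal(size=(N, N))
M = 0.1 * (A + A.T) / 2.0

def action(M, g):
    M2 = M @ M
    return -np.trace(M2) + g * np.trace(M2 @ M2)

def grad_action(M, g):
    # For symmetric M: dS/dM = -2M + 4g M^3.
    return -2.0 * M + 4.0 * g * np.linalg.matrix_power(M, 3)

for _ in range(steps):
    M -= lr * grad_action(M, g)

# The eigenvalues settle near +/- sqrt(1/(2g)), the minima of the
# per-eigenvalue potential -x^2 + g*x^4: a stationary "learned" state.
print("final action:", action(M, g))
print("eigenvalues:", np.round(np.linalg.eigvalsh(M), 3))
```

The only point of the sketch is that the same object can be read two ways: as a matrix model relaxing to a saddle point of its action, or as a learner converging to the stationary points of a loss.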
link |
01:24:42.580
So in that context, or maybe beyond,
link |
01:24:46.220
how do you explain us humans?
link |
01:24:48.060
How unlikely are we, this intelligent civilization?
link |
01:24:51.100
Or are there a lot of others,
link |
01:24:53.340
or are we alone in this universe?
link |
01:24:57.380
Yeah.
link |
01:25:00.020
You seem to appreciate humans very much.
link |
01:25:03.460
I've grown fond of us.
link |
01:25:04.940
We're okay.
link |
01:25:06.780
We have our nice qualities.
link |
01:25:13.100
I like that.
link |
01:25:14.700
I mean, we're kind of weird.
link |
01:25:16.380
We sprout this hair on our heads and then we're,
link |
01:25:18.300
I don't know, we're sort of a weird animal.
link |
01:25:20.380
That's a feature, not a bug, I think, the weirdness.
link |
01:25:23.900
I hope so.
link |
01:25:30.740
I think if I'm just gonna answer you
link |
01:25:33.700
in terms of truth, the first thing I'd say
link |
01:25:36.980
is we're not in a privileged enough position,
link |
01:25:40.860
at least as yet, to really know much
link |
01:25:43.380
about who we are, how we are,
link |
01:25:47.580
what we're really like in the context of something larger,
link |
01:25:50.260
what that context is, like all that stuff,
link |
01:25:52.700
we might learn more in the future,
link |
01:25:54.020
our descendants might learn more,
link |
01:25:55.220
but we don't really know very much.
link |
01:25:57.540
Which you can either view as frustrating or charming
link |
01:26:00.620
like that first year of TikTok or something.
link |
01:26:04.540
All roads lead back to TikTok.
link |
01:26:06.140
I like it.
link |
01:26:06.980
Well, lately, but in terms of,
link |
01:26:09.500
there's another level at which I can think about it where
link |
01:26:16.140
I sometimes think that if you are just quiet
link |
01:26:22.020
and you do something that gets you in touch
link |
01:26:23.980
with the way reality happens,
link |
01:26:25.860
and for me, it's playing music,
link |
01:26:28.180
sometimes it seems like you can feel a bit
link |
01:26:31.500
of how the universe is,
link |
01:26:32.940
and it feels like there's a lot more going on in it,
link |
01:26:36.140
and there is a lot more life
link |
01:26:37.540
and a lot more stuff happening
link |
01:26:39.620
and a lot more stuff flowing through it.
link |
01:26:41.460
I'm not speaking as a scientist now,
link |
01:26:42.980
this is kind of more my artist's side talking,
link |
01:26:46.220
and it's, I feel like I'm suddenly
link |
01:26:49.420
in multiple personalities with you, but.
link |
01:26:51.420
Well, Kerouac, Jack Kerouac said that music
link |
01:26:55.260
is the only truth.
link |
01:26:56.580
What do you think? It sounds like you might agree, at least in part.
link |
01:27:01.500
There's a passage in Kerouac's book, Doctor Sax,
link |
01:27:05.580
where somebody tries to just explain the whole situation
link |
01:27:08.020
with reality and people in like a paragraph,
link |
01:27:10.140
and I couldn't reproduce it for you here,
link |
01:27:11.980
but it's like, yeah, like there are these bulbous things
link |
01:27:15.020
that walk around and they make these sounds,
link |
01:27:16.500
you can sort of understand them,
link |
01:27:17.700
but only kind of, and then there's like this,
link |
01:27:19.300
and it's just like this amazing, like just really quick,
link |
01:27:22.020
like if some spirit being or something
link |
01:27:25.020
was gonna show up in our reality
link |
01:27:26.460
and didn't know anything about it,
link |
01:27:27.580
it's like a little basic intro of like,
link |
01:27:29.540
okay, here's what's going on here,
link |
01:27:31.300
an incredible passage.
link |
01:27:32.660
Yeah, yeah.
link |
01:27:33.940
It's like a one or two sentence summary
link |
01:27:36.620
in The Hitchhiker's Guide to the Galaxy, right?
link |
01:27:38.820
Of what this is.
link |
01:27:40.340
Mostly harmless.
link |
01:27:41.380
Mostly harmless.
link |
01:27:43.060
Yeah.
link |
01:27:43.900
But do you think there's truth to that,
link |
01:27:44.740
that music somehow connects to something
link |
01:27:47.100
that words cannot?
link |
01:27:48.980
Yeah, music is something that just towers above me.
link |
01:27:52.780
I don't, I don't,
link |
01:27:54.900
I don't feel like I have an overview of it.
link |
01:27:57.860
It's just the reverse.
link |
01:27:58.820
I don't, I don't fully understand it
link |
01:28:00.740
because on one level it's simple.
link |
01:28:02.260
Like you can say, oh, it's,
link |
01:28:03.660
it's a thing people evolved to coordinate our brains
link |
01:28:07.580
on a pattern level or a, or something like that.
link |
01:28:11.940
There's all these things you can say about music,
link |
01:28:13.820
which are, you know, some of that's probably true.
link |
01:28:16.900
It's also, there's kind of like this,
link |
01:28:21.900
this is the mystery of meaning.
link |
01:28:26.140
Like there's a way that just instead
link |
01:28:29.820
of just being pure abstraction,
link |
01:28:31.300
music can have like this kind of
link |
01:28:33.180
substantiality to it that is philosophically impossible.
link |
01:28:39.940
I don't know what to do with it.
link |
01:28:41.180
Yeah.
link |
01:28:42.020
The amount of understanding I feel I have
link |
01:28:44.060
when I hear the right song at the right time
link |
01:28:48.220
is not comparable to anything I can read
link |
01:28:51.620
on Wikipedia.
link |
01:28:53.660
Anything I can understand, read through in language.
link |
01:28:56.940
There's, the music does connect us to something.
link |
01:28:59.700
There's this thing there.
link |
01:29:00.820
Yeah, there's, there's,
link |
01:29:03.140
there's some kind of a thing in it.
link |
01:29:04.980
And I've never ever,
link |
01:29:06.980
I've come across a lot of explanations
link |
01:29:09.860
from all kinds of interesting people,
link |
01:29:12.220
like that it's some kind of a flow language
link |
01:29:16.500
between people or between people and how they perceive
link |
01:29:18.980
and that kind of thing.
link |
01:29:19.940
And that sort of explanation is fine,
link |
01:29:24.060
but it's not, it's not quite it either.
link |
01:29:26.540
Yeah.
link |
01:29:27.380
There's a, there's something about music
link |
01:29:29.580
that makes me believe that panpsychism could possibly be true,
link |
01:29:34.100
which is that everything in the universe is conscious.
link |
01:29:36.820
It makes me think,
link |
01:29:39.620
makes me humble about how much or how little
link |
01:29:44.020
I understand about the functioning of our universe,
link |
01:29:48.340
that everything might be conscious.
link |
01:29:50.580
Most people interested in theoretical physics
link |
01:29:54.220
eventually land in panpsychism,
link |
01:29:57.980
but I, I'm not one of them.
link |
01:30:00.180
I, I still think there's this pragmatic imperative
link |
01:30:06.020
to treat people as special.
link |
01:30:09.180
So I will proudly be a dualist.
link |
01:30:12.820
With people and cats.
link |
01:30:14.300
People and cats.
link |
01:30:15.140
I'm not quite sure where to draw the line
link |
01:30:19.300
or why the line's there or anything like that,
link |
01:30:21.340
but I don't think I should be required to. All the same
link |
01:30:23.660
questions are equally mysterious with no line.
link |
01:30:26.020
So I don't feel disadvantaged by that.
link |
01:30:28.620
So I shall remain a dualist,
link |
01:30:30.500
but if you listen to anyone trying to explain
link |
01:30:36.140
where consciousness is in a dualistic sense,
link |
01:30:38.660
either believing in souls or some special thing
link |
01:30:41.220
in the brain or something,
link |
01:30:42.420
you pretty much say, screw this,
link |
01:30:44.300
I'm going to be a panpsychist.
link |
01:30:45.660
Hahaha.
link |
01:30:51.500
Fair enough.
link |
01:30:52.340
Well put.
link |
01:30:53.460
Are there moments in your life that happened
link |
01:30:55.940
that were defining, in a way that you hope others,
link |
01:30:59.980
your daughter, might think about?
link |
01:31:01.340
Well, listen, I gotta say,
link |
01:31:02.620
the moments that defined me were not the good ones.
link |
01:31:06.340
The moments that defined me were often horrible.
link |
01:31:09.620
I've had successes, you know,
link |
01:31:14.900
but if you ask what defined me,
link |
01:31:17.900
my mother's death, being under the World Trade Center
link |
01:31:24.700
in the attack. The things that have had the most effect on me
link |
01:31:30.780
were sort of real world terrible things,
link |
01:31:35.260
which I don't wish on young people at all.
link |
01:31:37.580
And this is the thing that's hard
link |
01:31:41.220
about giving advice to young people
link |
01:31:42.820
that they have to learn their own lessons
link |
01:31:48.420
and lessons don't come easily.
link |
01:31:53.020
And a world which avoids hard lessons
link |
01:31:56.380
will be a stupid world, you know,
link |
01:31:58.380
and I don't know what to do with it.
link |
01:32:00.180
That's a little bundle of truth
link |
01:32:03.060
that has a bit of a fatalistic quality to it,
link |
01:32:05.220
but I don't, this is like what I was saying
link |
01:32:07.940
that, you know, freedom equals eternal annoyance.
link |
01:32:09.940
Like you can't, like, there's a degree
link |
01:32:14.660
to which honest advice is not that pleasant to give.
link |
01:32:20.460
And I don't want young people to have to know
link |
01:32:24.300
about everything.
link |
01:32:25.740
I think I think...
link |
01:32:26.580
You don't wanna wish hardship on them.
link |
01:32:28.020
Yeah, I think they deserve to have
link |
01:32:31.940
a little grace period of naivety that's pleasant.
link |
01:32:34.780
I mean, I do, you know, if it's possible,
link |
01:32:37.660
if it's...
link |
01:32:40.180
These things are, this is like, this is tricky stuff.
link |
01:32:42.540
I mean, if you...
link |
01:32:48.180
Okay, so let me try a little bit on this advice thing.
link |
01:32:51.020
I think one thing, any serious broad advice
link |
01:32:55.660
will have been given 1,000 times before for 1,000 years.
link |
01:32:58.340
So I'm not gonna, I'm not going to claim originality,
link |
01:33:03.020
but I think trying to find a way
link |
01:33:07.980
to really pay attention to what you're feeling fundamentally,
link |
01:33:13.220
what your sense of the world is,
link |
01:33:14.740
what your intuition is, if you feel like an intuitive person,
link |
01:33:17.820
what you're...
link |
01:33:22.620
Like to try to escape the constant sway
link |
01:33:26.700
of social perception or manipulation, whatever you wish to call it,
link |
01:33:30.220
not to escape it entirely, that would be horrible,
link |
01:33:32.180
but to find, to find cover from it once in a while,
link |
01:33:37.420
to find a sense of being anchored in that,
link |
01:33:41.100
to believe in experience as a real thing.
link |
01:33:44.140
Believing in experience as a real thing is very dualistic.
link |
01:33:47.220
That goes with my philosophy of dualism.
link |
01:33:50.820
I believe there's something magical,
link |
01:33:52.660
and instead of squirting the magic dust on the programs,
link |
01:33:55.980
I think experience is something real
link |
01:33:58.260
and something apart and something mystical and something...
link |
01:34:00.420
Your own personal experience that you just have,
link |
01:34:04.740
and then you're saying,
link |
01:34:06.140
silence the rest of the world enough to hear that,
link |
01:34:08.260
like whatever that magic dust is from that experience.
link |
01:34:11.340
Find what is there.
link |
01:34:13.540
And I think that's one thing.
link |
01:34:18.140
Another thing is to recognize that kindness requires genius,
link |
01:34:24.900
that it's actually really hard,
link |
01:34:27.180
that facile kindness is not kindness,
link |
01:34:30.020
and that it'll take you a while to have the skills.
link |
01:34:33.500
Kind impulses, wanting to be kind,
link |
01:34:35.620
you can have right away.
link |
01:34:37.540
To be effectively kind is hard.
link |
01:34:40.100
To be effectively kind, yeah.
link |
01:34:41.860
It takes skill, it takes hard lessons.
link |
01:34:50.980
You'll never be perfect at it.
link |
01:34:53.860
To the degree you get anywhere with it,
link |
01:34:55.540
it's the most rewarding thing ever.
link |
01:35:00.340
Let's see, what else would I say?
link |
01:35:04.980
I would say when you're young,
link |
01:35:07.700
you can be very overwhelmed
link |
01:35:12.500
by social and interpersonal emotions.
link |
01:35:16.660
You'll have broken hearts and jealousies.
link |
01:35:20.580
You'll feel socially down the ladder
link |
01:35:24.020
instead of up the ladder.
link |
01:35:25.780
It feels horrible when that happens.
link |
01:35:27.860
All of these things.
link |
01:35:29.300
And you have to remember what a fragile crust
link |
01:35:33.780
all that stuff is.
link |
01:35:35.380
And it's hard because right when it's happening,
link |
01:35:37.460
it's just so intense.
link |
01:35:43.060
And if I was actually giving this advice to my daughter,
link |
01:35:48.260
she'd already be out of the room.
link |
01:35:49.460
So this is for some hypothetical teenager
link |
01:35:55.620
that doesn't really exist,
link |
01:35:56.500
that really wants to sit and listen to my wisdom.
link |
01:35:59.220
Or for your daughter 10 years from now.
link |
01:36:01.780
Maybe.
link |
01:36:03.140
Can I ask you a difficult question?
link |
01:36:06.500
Yeah, sure.
link |
01:36:07.380
You talked about losing your mom.
link |
01:36:10.660
Yeah.
link |
01:36:11.860
Do you miss her?
link |
01:36:14.980
Yeah, I mean, I'm still connected to her through music.
link |
01:36:17.620
She was a young prodigy piano player in Vienna.
link |
01:36:24.820
And she survived the concentration camp
link |
01:36:27.860
and then died in a car accident here in the US.
link |
01:36:33.620
What music makes you think of her?
link |
01:36:35.540
Is there a song that connects you?
link |
01:36:38.100
Well, you know, she was in Vienna.
link |
01:36:40.580
So she had the whole Viennese music thing going,
link |
01:36:48.340
which is this incredible school of absolute skill
link |
01:36:54.580
and romance bundled together.
link |
01:36:56.180
And wonderful on the piano, especially.
link |
01:36:58.740
I learned to play some of the Beethoven sonatas for her.
link |
01:37:01.780
And I played them in this exaggerated, drippy way.
link |
01:37:04.500
I remember when I was a kid.
link |
01:37:05.940
And exaggerated, meaning too full of emotion?
link |
01:37:09.460
Yeah, it's not the only way to play Beethoven.
link |
01:37:13.060
I mean, I didn't know there's any other way.
link |
01:37:14.820
That's a reasonable question.
link |
01:37:16.180
I mean, the fashion these days is to be slightly
link |
01:37:18.820
Apollonian even with Beethoven.
link |
01:37:20.900
But one imagines that actual Beethoven playing
link |
01:37:24.740
might have been different.
link |
01:37:25.860
I don't know.
link |
01:37:28.420
I've gotten to play a few instruments he played
link |
01:37:30.740
and try to see if I could feel anything
link |
01:37:32.260
about how it might have been for him.
link |
01:37:33.540
I don't know, really.
link |
01:37:34.980
I was always against the clinical precision
link |
01:37:37.460
of classical music.
link |
01:37:38.420
I thought a great piano player should be in pain.
link |
01:37:46.420
Emotionally, like, truly feel the music and make it messy.
link |
01:37:55.540
Sure.
link |
01:37:56.180
Maybe play classical music the way, I don't know,
link |
01:38:00.420
a blues pianist plays blues.
link |
01:38:04.420
It seems like they actually got happier.
link |
01:38:06.740
And I'm not sure if Beethoven got happier.
link |
01:38:08.980
I think it's a different kind of concept
link |
01:38:12.500
of the place of music.
link |
01:38:16.020
I think the blues, the whole African American tradition
link |
01:38:20.660
was initially surviving awful, awful circumstances.
link |
01:38:24.500
You could say there was some of that in the concentration
link |
01:38:27.060
camps and all that, too.
link |
01:38:29.540
And it's not that Beethoven's circumstances were brilliant,
link |
01:38:33.540
but he kind of also, I don't know, this is hard.
link |
01:38:38.820
I mean, it would seem that his misery
link |
01:38:41.460
was somewhat self-imposed, maybe. I don't know.
link |
01:38:44.820
It's kind of interesting.
link |
01:38:45.940
I've known some people who loathed Beethoven.
link |
01:38:48.420
The late composer Pauline Oliveros,
link |
01:38:51.700
this wonderful modernist composer,
link |
01:38:53.140
I played in her band for a while.
link |
01:38:54.980
And she was like, oh, Beethoven, that's the worst music ever.
link |
01:38:58.260
It's all ego.
link |
01:38:59.460
It completely, it turns information into,
link |
01:39:04.020
I mean, it turns emotion into your enemy.
link |
01:39:06.820
And it's ultimately all about your own self importance,
link |
01:39:11.220
which has to be at the expense of others,
link |
01:39:13.540
because what else could it be?
link |
01:39:15.460
And blah, blah, blah.
link |
01:39:16.820
So she had, I shouldn't say, I don't mean it to be dismissive.
link |
01:39:19.220
I'm just saying, like, her position on Beethoven
link |
01:39:21.860
was very negative and very unimpressed,
link |
01:39:24.820
which is really interesting for me.
link |
01:39:26.180
The manner of the music.
link |
01:39:27.220
I think, I don't know.
link |
01:39:29.460
I mean, she's not here to speak for herself,
link |
01:39:30.980
so it's a little hard for me to answer that question.
link |
01:39:33.780
But it was interesting, because I'd always thought of Beethoven.
link |
01:39:35.780
It's like, whoa, you know, this is like Beethoven.
link |
01:39:37.780
It's like, really, the dude, you know?
link |
01:39:40.180
And she's like, eh, you know, Beethoven, Schmeethoven,
link |
01:39:43.620
you know, it's like not really happening.
link |
01:39:44.980
Yeah, still, even though it's cliche,
link |
01:39:46.580
I like playing personally just for myself,
link |
01:39:49.060
Moonlight Sonata.
link |
01:39:50.020
I mean, I just...
link |
01:39:52.180
Moonlight's amazing.
link |
01:39:53.620
You know, you're talking about comparing the blues
link |
01:40:00.180
and that sensibility from Europe.
link |
01:40:02.180
It's so different in so many ways.
link |
01:40:04.180
One of the musicians I play with is Jon Batiste,
link |
01:40:07.220
who has the band on Colbert's show.
link |
01:40:09.060
And he'll sit there playing jazz
link |
01:40:11.380
and suddenly go into Moonlight.
link |
01:40:12.660
He loves Moonlight.
link |
01:40:13.620
And what's kind of interesting is he's found a way
link |
01:40:19.620
to do Beethoven.
link |
01:40:20.660
And he, by the way, he can really do Beethoven.
link |
01:40:22.660
Like, he went through Juilliard and one time he was at my house
link |
01:40:27.220
and he's like, hey, do you have the book of Beethoven's sonatas?
link |
01:40:29.220
I say, yeah. He says, I want to find one I haven't played.
link |
01:40:31.220
And then he sight read through the whole damn thing perfectly.
link |
01:40:33.220
And I'm like, oh, God, just get out of here.
link |
01:40:35.780
I can't even deal with this.
link |
01:40:37.220
But anyway, he has this way of, with the same persona
link |
01:40:45.220
and the same philosophy, moving from the blues into Beethoven,
link |
01:40:48.660
that's really, really fascinating to me.
link |
01:40:50.820
It's like, I don't want to say he plays it as if it were jazz,
link |
01:40:55.620
but he kind of does.
link |
01:40:56.980
It's kind of really, and he talks,
link |
01:40:59.620
while he was sight reading, he talks like Beethoven's talking to him.
link |
01:41:02.420
Like, he's like, oh, yeah, here he's doing this.
link |
01:41:04.420
I can't do Jon.
link |
01:41:05.380
But it's like, it's really interesting.
link |
01:41:08.420
Like, it's very different.
link |
01:41:09.220
Like, for me, I was introduced to Beethoven
link |
01:41:11.860
as like almost like this godlike figure.
link |
01:41:14.020
And I presume Pauline was too.
link |
01:41:16.020
That was really kind of oppressive for an artist to deal with.
link |
01:41:18.020
And for him, it's just like...
link |
01:41:19.380
It's a conversation he's having.
link |
01:41:21.220
He's playing James P. Johnson or something.
link |
01:41:23.620
It's like another musician who did something and they're talking.
link |
01:41:25.940
And it's very cool to be around.
link |
01:41:27.780
It's very kind of freeing to see someone have that relationship.
link |
01:41:34.500
I would love to hear him play Beethoven.
link |
01:41:36.020
That sounds amazing.
link |
01:41:37.700
He's great.
link |
01:41:39.540
We talked about Ernest Becker and how much value he puts on our mortality
link |
01:41:46.660
and our denial of our mortality.
link |
01:41:49.620
Do you think about your mortality?
link |
01:41:51.780
Do you think about your own death?
link |
01:41:53.460
You know, what's funny is I used to not be able to,
link |
01:41:56.100
but as you get older, you just know people who die
link |
01:41:58.340
and there's all these things, and it just becomes familiar
link |
01:42:00.820
and more ordinary, which is what it is.
link |
01:42:06.660
But are you afraid?
link |
01:42:10.020
Sure, although less so.
link |
01:42:11.860
And it's not like I didn't have some kind of insight or revelation to become less afraid.
link |
01:42:18.740
I think I just, like I say, it's kind of familiarity.
link |
01:42:23.220
It's just knowing people who've died and I really believe in the future.
link |
01:42:30.180
I have this optimism that people or this whole thing of life on earth,
link |
01:42:35.380
this whole thing we're part of, I don't know where to draw that circle,
link |
01:42:37.780
but this thing is going somewhere and has some kind of value.
link |
01:42:45.700
And you can't both believe in the future and want to live forever.
link |
01:42:49.700
You have to make room for it.
link |
01:42:50.900
You know, like you have to, that optimism has to also come with its own like humility.
link |
01:42:55.860
You have to make yourself small to believe in the future.
link |
01:42:59.060
And so it actually in a funny way comforts me.
link |
01:43:04.180
Wow, that's so funny.
link |
01:43:05.860
Wow, that's powerful.
link |
01:43:10.580
And optimism requires you to kind of step down after a time.
link |
01:43:16.660
Yeah, I mean, that said, life seems kind of short, but you know, whatever.
link |
01:43:21.620
Do you think there's, I've tried to find, I can't find the complaint department.
link |
01:43:24.580
You know, I really want to, I want to bring this up,
link |
01:43:26.820
but the customer service number never answers and, like, the email bounces.
link |
01:43:31.860
Do you think there's meaning to it, to life?
link |
01:43:34.260
Ah, well, see, meaning's a funny word.
link |
01:43:38.100
Like we say all these things as if we know what they mean,
link |
01:43:40.260
but meaning, we don't know what we mean when we say meaning.
link |
01:43:42.980
Like we obviously do not.
link |
01:43:44.260
And it's a funny little mystical thing.
link |
01:43:48.340
I think it ultimately connects to that sense of experience that dualists tend to believe in.
link |
01:43:56.100
Because there's the why. Like if you look up at the stars and you experience that awe inspiring,
link |
01:44:01.460
like joy, whatever, when you look up to the stars, I don't know why.
link |
01:44:07.060
For me, that kind of makes me feel joyful, maybe a little bit melancholy,
link |
01:44:11.940
just some weird soup of feelings.
link |
01:44:14.820
And ultimately the question is like, why are we here in this vast universe?
link |
01:44:22.020
That question, why?
link |
01:44:23.140
Have you been able in some way, maybe through music, answer it for yourself?
link |
01:44:38.260
My impulse is to feel like it's not quite the right question to ask,
link |
01:44:42.980
but I feel like going down that path is just too tedious for the moment.
link |
01:44:48.020
And I don't want to do it, but.
link |
01:44:49.860
The wrong question.
link |
01:44:53.940
Well, just because, you know, I don't know what meaning is.
link |
01:44:57.780
And I think I do know that sense of awe.
link |
01:45:01.540
I grew up in southern New Mexico and the stars were so vivid.
link |
01:45:08.580
I've had some weird misfortunes, but I've had some weird luck also.
link |
01:45:15.220
One of our near neighbors was the head of optics research at White Sands.
link |
01:45:20.980
And when he was young, he discovered Pluto.
link |
01:45:22.900
His name was Clyde Tombaugh.
link |
01:45:25.060
And he taught me how to make telescopes, grinding mirrors and stuff.
link |
01:45:28.820
And my dad had also made telescopes when he was a kid.
link |
01:45:31.380
But Clyde had, like, backyard telescopes that would put a lot to shame.
link |
01:45:37.140
I mean, he really, he did his telescopes, you know?
link |
01:45:39.860
And so I remember he'd let me go and play with them and just like looking at a globular cluster.
link |
01:45:46.100
And you're seeing the actual photons.
link |
01:45:47.620
And with a good telescope, it's really like this object.
link |
01:45:49.940
Like, you can really tell this isn't coming through some intervening information structure.
link |
01:45:55.220
This is like the actual photons.
link |
01:45:56.820
And it's really a three dimensional object.
link |
01:45:59.540
And you have even a feeling for the vastness of it.
link |
01:46:02.580
And it's, I don't know.
link |
01:46:06.740
So I definitely, I was very, very fortunate to have a connection to this guy that way when I was a kid.
link |
01:46:15.220
To have had that experience, again, the emphasis on experience.
link |
01:46:22.580
It's kind of funny.
link |
01:46:23.460
Like, I feel like sometimes, like I've taken, when she was younger,
link |
01:46:28.500
I took my daughter and her friends to the telescope.
link |
01:46:31.780
There are a few around here that kids can go and use.
link |
01:46:34.500
And they would like look at Jupiter's moons or something.
link |
01:46:37.140
I think like Galilean moons.
link |
01:46:38.980
And I don't know if they quite had that because it's like too,
link |
01:46:45.140
it's been just too normalized.
link |
01:46:46.980
And I think maybe when I was growing up, screens weren't that common yet.
link |
01:46:52.020
And maybe it's like too confusable with the screen.
link |
01:46:55.060
I don't know.
link |
01:46:56.180
You know, somebody brought this up in conversation to me somewhere.
link |
01:47:01.060
I don't remember who, but they kind of posited this idea that if humans,
link |
01:47:06.340
early humans weren't able to see the stars, like if Earth's atmosphere were such that it was cloudy,
link |
01:47:12.020
then we would not have developed human civilization.
link |
01:47:14.740
There's something about being able to look up and see a vast universe
link |
01:47:20.420
that's fundamental to the development of human civilization.
link |
01:47:23.620
I thought that was a curious kind of thought.
link |
01:47:25.700
That reminds me of that old Isaac Asimov story where there's this planet where they finally get
link |
01:47:33.060
to see what's in the sky once in a while.
link |
01:47:34.900
And it turns out they're in the middle of a globular cluster.
link |
01:47:38.340
I forget what happens exactly.
link |
01:47:39.620
God, that's from when I was the same age as a kid.
link |
01:47:41.860
I don't really remember.
link |
01:47:44.260
But yeah, I don't know.
link |
01:47:47.300
It might be right.
link |
01:47:47.940
I'm just thinking of all the civilizations that grew up under clouds.
link |
01:47:51.300
I mean, the Vikings needed a special diffracting piece of mica to navigate
link |
01:47:58.580
because they could never see the sun.
link |
01:47:59.940
They had this thing called a sunstone that they found in this one cave.
link |
01:48:02.740
Do you know about that?
link |
01:48:03.940
So they were trying to navigate boats in the North Atlantic without being able to see the sun
link |
01:48:11.620
because it was cloudy.
link |
01:48:12.420
And so they used a chunk of mica to diffract it in order to be able to align where the sun
link |
01:48:21.940
really was because they couldn't tell by eye and navigate.
link |
01:48:24.580
So I'm just saying there are a lot of civilizations that are pretty impressive
link |
01:48:27.620
that had to deal with a lot of clouds.
link |
01:48:31.540
The Amazonians invented our agriculture and they were probably under clouds a lot.
link |
01:48:36.020
I don't know.
link |
01:48:36.500
I don't know.
link |
01:48:37.300
To me personally, the question of the meaning of life becomes most vibrant, most apparent
link |
01:48:46.020
when you look up at the stars because it makes me feel very small.
link |
01:48:52.100
We are small.
link |
01:48:54.580
But then you ask, it still feels like we're special.
link |
01:48:59.380
And then the natural question is like, well, if we are special as I think we are,
link |
01:49:04.420
why the heck are we here in this vast universe?
link |
01:49:09.220
That ultimately is the question of the meaning of life.
link |
01:49:13.540
I mean, look, there's a confusion sometimes in trying to use, to set up a question or a
link |
01:49:24.420
thought experiment or something that's defined in terms of a context to explain something
link |
01:49:30.180
where there is no larger context.
link |
01:49:31.860
And that's a category error.
link |
01:49:34.580
If we want to do it in physics or in computer science, it's hard to talk about the universe
link |
01:49:42.180
as a Turing machine because a Turing machine has an external clock and an observer and
link |
01:49:46.980
an input and output.
link |
01:49:48.100
There's a larger context implied in order for it to be defined at all.
link |
01:49:51.460
And so if you're talking about the universe, you can't talk about it coherently as a Turing
link |
01:49:55.700
machine.
link |
01:49:56.100
Quantum mechanics is like that.
link |
01:49:57.780
Quantum mechanics has an external clock and has some kind of external context,
link |
01:50:02.500
depending on your interpretation, that's either the observer or whatever.
link |
01:50:09.060
And they're similar that way.
link |
01:50:11.220
So maybe Turing machines and quantum mechanics can be better friends or something because
link |
01:50:17.220
they have a similar setup.
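To make concrete how much context a Turing machine presumes, here is a minimal sketch; the toy bit-flipping machine and every name in it are illustrative assumptions, not anything from the conversation.

```python
# Minimal Turing machine sketch. Notice how much lives *outside* the
# machine itself: the input tape is supplied externally, the loop below
# is the external clock, and an external observer reads off the result.

# Transition table for a toy machine that flips bits until a blank:
# (state, symbol) -> (new_state, symbol_to_write, head_move)
DELTA = {
    ("flip", "0"): ("flip", "1", +1),
    ("flip", "1"): ("flip", "0", +1),
    ("flip", "_"): ("halt", "_", 0),
}

def run(tape_input: str, max_steps: int = 1000) -> str:
    tape = dict(enumerate(tape_input))  # context: input handed in from outside
    state, head = "flip", 0
    for _ in range(max_steps):          # context: the external clock
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        state, write, move = DELTA[(state, symbol)]
        tape[head] = write
        head += move
    # context: an external observer interprets the tape as output
    return "".join(tape[i] for i in sorted(tape))

print(run("0110"))  # -> "1001_"
```

The transition table alone is not a computation; it only becomes one inside that scaffolding of input, clock, and observer, which is the sense in which the machine is defined relative to a larger context.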
link |
01:50:18.340
But the thing is, if you have something that's defined in terms of an outer context,
link |
01:50:22.820
you can't talk about ultimates with it because obviously it's not suited for that.
link |
01:50:28.100
So there's some ideas that are their own context.
link |
01:50:31.220
General relativity is its own context.
link |
01:50:33.460
It's different.
link |
01:50:34.260
It's why it's hard to unify.
link |
01:50:35.460
And I think the same thing is true when we talk about these types of questions.
link |
01:50:42.820
Like meaning is in a context, and to talk about ultimate meaning is therefore a category
link |
01:50:52.020
error. It's not a resolvable way of thinking.
link |
01:50:59.140
It might be a way of thinking that is experientially or aesthetically valuable
link |
01:51:08.900
because it is awesome in the sense of awe inspiring.
link |
01:51:16.020
But to try to treat it analytically is not sensible.
link |
01:51:19.460
Maybe that's what music and poetry are for.
link |
01:51:22.260
Yeah, maybe.
link |
01:51:22.980
I think music actually does escape any particular context.
link |
01:51:27.060
That's how it feels to me, but I'm not sure about that.
link |
01:51:28.980
That's once again, crazy artists talking, not scientists.
link |
01:51:33.300
Well, you do both masterfully.
link |
01:51:36.980
Jaron, like I said, I'm a big fan of everything you've done, of you as a human being.
link |
01:51:41.940
I appreciate the fun argument we had today that will, I'm sure, continue for 30 years.
link |
01:51:48.500
As it did with Marvin Minsky.
link |
01:51:51.540
Honestly, I deeply appreciate that you spend your really valuable time with me today.
link |
01:51:55.540
It was a really great conversation.
link |
01:51:56.740
Thank you so much.
link |
01:51:58.340
Thanks for listening to this conversation with Jaron Lanier.
link |
01:52:01.620
To support this podcast, please check out our sponsors in the description.
link |
01:52:06.020
And now let me leave you with some words from Jaron Lanier himself.
link |
01:52:10.740
A real friendship ought to introduce each person to unexpected weirdness in the other.
link |
01:52:16.420
Thank you for listening and hope to see you next time.