Jaron Lanier: Virtual Reality, Social Media & the Future of Humans and AI | Lex Fridman Podcast #218


00:00:00.000
The following is a conversation with Jaron Lanier, a computer scientist, visual artist, philosopher, writer, futurist, musician, and the founder of the field of virtual reality. To support this podcast, please check out our sponsors in the description. As a side note, you may know that Jaron is a staunch critic of social media platforms. He and I agree on many aspects of this, except perhaps I am more optimistic about it being possible to build better platforms, and better artificial intelligence systems that put the long-term interests and happiness of human beings first. Let me also say a general comment about these conversations. I try to make sure I prepare well, remove my ego from the picture, and focus on making the other person shine as we try to explore the most beautiful and insightful ideas in their mind. This can be challenging when the ideas that are close to my heart are being criticized. In those cases, I do offer a little pushback, but respectfully, and then move on, trying to have the other person come out looking wiser in the exchange. I think there's no such thing as winning in conversations, nor in life. My goal is to learn and to have fun. I ask that you don't see my approach to these conversations as weakness. It is not. It is my attempt at showing respect and love for the other person. That said, I also often just do a bad job of talking, but you probably already knew that. So please give me a pass on that as well. This is the Lex Fridman Podcast, and here is my conversation with Jaron Lanier.

00:01:39.520
You're considered the founding father of virtual reality. Do you think we will one day spend most or all of our lives in virtual reality worlds?

00:01:51.540
I have always found the very most valuable moment in virtual reality to be the moment when you take off the headset and your senses are refreshed, and you perceive physicality afresh, as if you were a newborn baby, but with a little more experience. So you can really notice just how incredibly strange and delicate and peculiar and impossible the real world is. So the magic is, and perhaps forever will be, in the physical world. Well, that's my take on it. That's just me. I mean, I don't get to tell everybody else how to think or how to experience virtual reality, and at this point, there have been multiple generations of younger people who've come along and liberated me from having to worry about these things. But I should say also, even in what I called mixed reality back in the day, and these days is called augmented reality, with something like a HoloLens, even then, one of my favorite things is to augment a forest, not because I think the forest needs augmentation, but because when you look at the augmentation next to a real tree, the real tree just pops out as being astounding. It's interactive, it's changing slightly all the time if you pay attention, and it's hard to pay attention to that, but when you compare it to virtual reality, all of a sudden you do. And even in practical applications: my favorite early application of virtual reality, which we prototyped going back to the 80s when I was working with Dr. Joe Rosen at Stanford Med, near where we are now, we made the first surgical simulator. And to go from the fake anatomy of the simulation, which is incredibly valuable for many things, for designing procedures, for training, for all kinds of things, to the real world, to the real person, boy, surgeons really get woken up by that transition. It's very cool. So I think the transition is actually more valuable than the simulation.

00:03:58.500
That's fascinating. I never really thought about that. It's almost like traveling elsewhere in physical space can help you appreciate how much you value your home once you return.

00:04:11.940
Well, that's how I take it. I mean, once again, people have different attitudes towards it. All are welcome.

00:04:18.740
What do you think is the difference between the virtual world and the physical meatspace world, such that you personally are still drawn to the physical world? There clearly is a distinction. Is there some fundamental distinction, or is it the peculiarities of the current set of technology?

00:04:37.580
In terms of the kind of virtual reality that we have now, it's made of software, and software is terrible stuff. Software is always the slave of its own history, its own legacy. It's always infinitely messy and arbitrary. Working with it brings out a certain kind of nerdy personality in people, or at least in me, which I'm not that fond of. And there are all kinds of things about software I don't like. And so that's different from the physical world. It's not something we understand, as you just pointed out. On the other hand, I'm a little mystified when people ask me, well, do you think the universe is a computer? And I have to say, well, what on earth could you possibly mean if you say it isn't a computer? If it isn't a computer, it wouldn't follow principles consistently, and it wouldn't be intelligible, because what else is a computer, ultimately? And we have physics, we have technology, so we can do technology, so we can program it. So, of course it's some kind of computer. But I think trying to understand it as a Turing machine is probably a foolish approach.

00:05:52.840
Right, that's the question: whether this computer we call the universe performs the kind of computation that can be modeled as a universal Turing machine, or whether it's something much more fancy, so fancy, in fact, that it may be beyond our cognitive capabilities to understand.

00:06:12.640
Turing machines are kind of, I call them teases, in a way, because if you have an infinitely smart programmer with an infinite amount of time, an infinite amount of memory, and an infinite clock speed, then they're universal. But that cannot exist, so they're not universal in practice. In practice, they are a very particular sort of machine within the constraints, within the conservation principles, of any reality that's worth being in, probably. And so I think universality of a particular model is probably a deceptive way to think. Even though, at some sort of high enough limit, of course something like that's gotta be true, it's just not accessible to us, so what's the point?

00:07:10.440
Well, to me, the question of whether we're living inside a computer or a simulation is interesting in the following way. There's a technical question here: how difficult is it to build a machine, not that simulates the universe, but that makes it sufficiently realistic that we wouldn't know the difference, or better yet, sufficiently realistic that we would kinda know the difference, but we would prefer to stay in the virtual world anyway?

00:07:41.000
I wanna give you a few different answers. I wanna give you the one that I think has the most practical importance to human beings right now, which is that there's a kind of an assertion sort of built into the way the question is usually asked that I think is false, which is a suggestion that people have a fixed level of ability to perceive reality in a given way. And actually, people are always learning, evolving, forming themselves. We're fluid, too. We're also programmable, self-programmable, changing, adapting. And so my favorite way to get at this is to talk about the history of other media. So for instance, there was a peer-reviewed paper that showed that an early wire recorder playing back an opera singer behind a curtain was indistinguishable from a real opera singer. Now, of course, to us it would not only be distinguishable, it would be very blatant, because the recording would be horrible. But to the people at the time, without the experience of it, it seemed plausible. There was an early demonstration of extremely crude video teleconferencing between New York and DC in the 30s, I think, that people viewed as being absolutely realistic and indistinguishable, which to us would be horrible. And there are many other examples. Another one, one of my favorites, is that in the Civil War era there were itinerant photographers who collected photographs of people who just looked kind of like a few archetypes. So you could buy a photo of somebody who looked kind of like your loved one, to remind you of that person, because actually photographing them was inconceivable, and hiring a painter was too expensive, and you didn't have any way for the painter to represent them remotely anyway; how would they even know what they looked like? So these are all great examples of how, in the early days of different media, we perceived the media as being really great, but then we evolved through the experience of the media. This gets back to what I was saying: maybe the greatest gift of photography is that we can see the flaws in a photograph and appreciate reality more. Maybe the greatest gift of audio recording is that we can now distinguish that opera singer from the recording of the opera singer on the horrible wire recorder. So we shouldn't limit ourselves by some assumption of stasis that's incorrect. So that's the first thing, that's my first answer, which is, I think, the most important one. Now, of course, somebody might come back and say, oh, but you know, technology can go so far; there must be some point at which it would surpass us. That's a different question. I think that's also an interesting question, but I think the answer I just gave you is actually the more important answer to the more important question.

00:10:18.960
That's profound, yeah. But can you, the second question, which you're now making me realize is way different: is it possible to create worlds in which people would want to stay, instead of the real world?

00:10:32.920
Well.

00:10:33.900
Like, en masse, like large numbers of people.

00:10:38.260
What I hope is, you know, as I said before, I hope that the experience of virtual worlds helps people appreciate this physical world we have, and feel tender towards it, and keep it from getting too fucked up. That's my hope.

00:10:57.040
Do you see all technology in that way? So basically, technology helps us appreciate the more technology-free aspects of life.

00:11:08.240
Well, media technology. You know, you can stretch that. I mean, let me say, I could definitely play McLuhan and turn this into a general theory. It's totally doable. The program you just described is totally doable. In fact, I will psychically predict that if you did the research, you could find 20 PhD theses that do that already. I don't know, but they might exist. But I don't know how much value there is in pushing a particular idea that far. Claiming that reality isn't a computer in some sense seems incoherent to me, because we can program it, we have technology, it seems to obey physical laws. What more do you want from it to be a computer? I mean, it's a computer of some kind. We don't know exactly what kind. We might not know how to think about it. We're working on it, but.

00:11:57.440
Sorry to interrupt, but you're absolutely right. That's my fascination with AI as well: in the case of AI, I see it as a set of techniques that help us understand ourselves, understand us humans. In the same way, virtual reality, and you're putting it brilliantly, is a way to help us understand reality, to appreciate and open our eyes more richly to reality.

00:12:23.700
That's certainly how I see it. And I wish people who become incredibly fascinated, who go down the rabbit hole of the different fascinations with whether we're in a simulation or not, or, you know, there's a whole world of variations on that, I wish they'd step back and think about their own motivations and exactly what they mean, you know? And I think the danger with these things is: if you say, is the universe some kind of computer, broadly, it has to be, because it's not coherent to say that it isn't. On the other hand, to say that that means you know anything about what kind of computer, that's something very different. And the same thing is true for the brain. The same thing is true for anything where you might use computational metaphors. We have to have a bit of modesty about where we stand. And the problem I have with these framings of computation as these ultimate cosmic questions is that it has a way of getting people to pretend they know more than they do.

00:13:25.340
Can you maybe, this is a therapy session, psychoanalyze me for a second? I really liked the Elder Scrolls series. It's a role-playing game, Skyrim, for example. Why do I enjoy so deeply just walking around that world? And then there's people you can talk to, and you can just, like... it's an escape. But you know, my life is awesome. I'm truly happy, but I also am happy with the music that's playing in the mountains, and carrying around a sword, and just that. I don't know what that is. It's very pleasant, though, to go there. And I miss it sometimes.

00:14:06.540
I think it's wonderful to love artistic creations. It's wonderful to love contact with other people. It's wonderful to love play and ongoing, evolving meaning and patterns with other people. I think it's a good thing. I'm not anti-tech, and I'm certainly not anti-digital tech. I'm anti, as everybody knows by now, the manipulative economy of social media, which I think is making everybody nuts and all that. So I'm anti that stuff. But the core of it, of course, I worked for many, many years on trying to make that stuff happen, because I think it can be beautiful. Like, why not? And by the way, there's a thing about humans, which is we're problematic. Any kind of social interaction with other people is gonna have its problems. People are political and tricky. And like, I love classical music, but when you actually go to a classical music thing, it turns out, oh, actually, this is like a backroom power-deal kind of place, and a big status ritual as well. And that's kind of not as fun. That's part of the package. And the thing is, there's always gonna be a mix of things. I don't think the search for purity is gonna get you anywhere. So I'm not worried about that. I worry about the really bad cases, where we're making ourselves crazy or cruel enough that we might not survive. And I think the social media criticism rises to that level. But I'm glad you enjoy it. I think it's great.

00:15:57.380
And I like that you basically say that every experience has both beauty and darkness, as with classical music. I also play classical piano, so I appreciate it very much. But it's interesting: even in the darkness, there's Man's Search for Meaning, with Viktor Frankl in the concentration camps. Even there, there's opportunity to discover beauty. And so that's the interesting thing about humans: the capacity to discover beauty in the darkest of moments. But there's always the dark parts too.

00:16:31.660
Well, I mean, our situation is structurally difficult. No, it is, it's true. We perceive socially, we depend on each other for our sense of place and perception of the world. I mean, we're dependent on each other. And yet there's also a degree to which we inevitably let each other down. We are set up to be competitive as well as supportive. Our fundamental situation is complicated and challenging, and I wouldn't have it any other way.

00:17:13.580
Okay, let's talk about one of the most challenging things, one of the things I unfortunately am very afraid of, being human, allegedly. You wrote an essay on death and consciousness in which you write, quote: Certainly the fear of death has been one of the greatest driving forces in the history of thought and in the formation of the character of civilization, and yet it is underacknowledged. The great book on the subject, The Denial of Death by Ernest Becker, deserves a reconsideration. I'm Russian, so I have to ask you about this: what's the role of death in life?

00:17:51.740
See, you would have enjoyed coming to our house, because my wife is Russian, and we also have a piano of such spectacular qualities, you would have freaked out. But anyway, we'll let all that go. So, the context in which I remember that essay, this was from maybe the 90s or something. I used to publish in a journal called the Journal of Consciousness Studies, because I was interested in these endless debates about consciousness and science, which certainly continue today. And I was interested in how the fear of death and the denial of death played into different philosophical approaches to consciousness. Because I think, on the one hand, the sort of sentimental school of dualism, meaning the feeling that there's something apart from the physical brain, some kind of soul or something else, is obviously motivated in a sense by a hope that whatever that is will survive death and continue. And that's a very core aspect of a lot of the world religions, not all of them, but most of them. The thing I noticed is that the opposite of those, which might be the sort of hardcore, no, the brain's a computer and that's it, is in a sense motivated in the same way, with a remarkably similar chain of arguments, which is: no, the brain's a computer, and I'm gonna figure it out in my lifetime, and upload myself, and I'll live forever.

00:19:48.220
That's interesting.

00:19:50.540
Yeah, that's like the implied thought, right?

00:19:53.540
Yeah, and so, in a funny way, it's kind of the same thing. It's peculiar to notice that these people, who would appear to be opposites in character and cultural references and in their ideas, actually are remarkably similar. And to an incredible degree, this sort of hardcore computationalist idea about the brain has turned into medieval Christianity all over again, with the people who are afraid that if you have the wrong thought, you'll piss off the super AIs of the future, who will come back and zap you, and all that stuff. It's really turned into medieval Christianity all over again.

00:20:43.100
So this is Ernest Becker's idea that the fear of death is the worm at the core, that it's the core motivator of everything we see humans have created. The question is if that fear of mortality is somehow core, like a prerequisite to consciousness.

00:21:03.740
You just moved across this vast cultural chasm that separates me from most of my colleagues, in a way. And I can't answer what you just said on that level without this huge deconstruction. Should I do it?

00:21:19.060
Yes, what's the chasm?

00:21:20.280
Okay.

00:21:21.380
Let us travel across this vast chasm.

00:21:23.220
Okay, I don't believe in AI. I don't think there's any AI. There's just algorithms: we make them, we control them. They're tools, they're not creatures. Now, this is something that rubs a lot of people the wrong way, and don't I know it. When I was young, my main mentor was Marvin Minsky, who's the principal author of the computer-as-creature rhetoric that we still use. He was the first person to have the idea at all, but he certainly populated AI culture with most of its tropes, I would say, because a lot of people will say, oh, did you hear this new idea about AI? And I'm like, yeah, I heard it in 1978. Sure, yeah, I remember that. So Marvin was really the person. And Marvin and I used to argue all the time about this stuff, because I always rejected it. I wasn't formally his student, but I worked for him as a researcher. And of all of his students and student-like people, his young adoptees, I think I was the one who argued with him about this stuff in particular, and he loved it.

00:22:31.460
Yeah, I would have loved to hear that conversation.

00:22:33.260
It was fun.

00:22:34.180
Did you ever converge to a place?

00:22:36.780
Oh, no, no. So the very last time I saw him, he was quite frail. I was in Boston, and I was going to the old house in Brookline, his amazing house, and one of our mutual friends said, hey, listen, Marvin's so frail, don't do the argument with him. Don't argue about AI, you know? And so I said, but Marvin loves that. And so I showed up, and he was frail. He looked up and he said, are you ready to argue?

00:23:04.500
He's such an amazing person for that.

00:23:10.300
So it's hard to summarize this, because it's decades of stuff. The first thing to say is that nobody can claim absolute knowledge about whether somebody or something else is conscious or not. This is all a matter of faith. And in fact, I think the whole idea of faith needs to be updated, so it's not about God, but it's just about stuff in the universe. We have faith in each other being conscious. I used to frame this as a thing called the circle of empathy in my old papers, and then it turned into a thing for the animal rights movement too. I noticed Peter Singer using it; I don't know if it was coincidental. But anyway, there's this idea that you draw a circle around yourself, and the stuff inside is more like you, might be conscious, might be deserving of your empathy, of your consideration, and the stuff outside the circle isn't. And outside the circle might be a rock, or, I don't know. And that circle is fundamentally based on faith.

00:24:15.460
Well, if you don't know it... your faith in what is and what isn't.

00:24:17.960
The thing about this circle is it can't be pure faith. It's also a pragmatic decision, and this is where things get complicated. If you try to make it too big, you suffer from incompetence. If you say, I don't wanna kill a bacteria, I will not brush my teeth... I don't know, like, what do you do? There's a competence question, where you do have to draw the line. People who make it too small become cruel. People are so clannish and political, and so worried about themselves ending up on the bottom of society, that they are always ready to gang up on some designated group. We're always trying to shove somebody out of the circle. And so...

00:24:59.660
So aren't you shoving AI outside the circle?

00:25:01.540
Well, give me a second.

00:25:02.380
All right.

00:25:03.200
So there's a pragmatic consideration here. And the biggest questions are probably fetuses and animals lately, but AI is getting there. Now, with AI, I've had this discussion so many times. People say, but aren't you afraid, if you exclude AI, you'd be cruel to some consciousness? And then I would say, well, if you include AI, you exclude yourself from being able to be a good engineer or designer, and so you're facing incompetence immediately. So I really think we need to subordinate algorithms and be much more skeptical of them.

00:25:43.580
Your intuition, you speak about this brilliantly with social media, how things can go wrong. Isn't it possible to design systems that show compassion, that don't manipulate you but give you control and make your life better, if you so choose, like growing together with systems? The way we grow with dogs and cats, with pets, with significant others; in that way, they grow to become better people. I don't understand why that's fundamentally not possible. You're saying oftentimes you get into trouble by thinking you know what's good for people.

00:26:20.200
Well, look, there's this question of what framework we're speaking in. Do you know who Alan Watts was? So Alan Watts once said, morality is like gravity: in some absolute cosmic sense, there can't be morality, because at some point it all becomes relative, and who are we anyway? Morality is relative to us tiny creatures. But here on earth, we're with each other, this is our frame, and morality is a very real thing. Same thing with gravity. At some point, you get into interstellar space and you might not feel much of it, but here we are on earth. And in the same sense, I think this identification with a frame that's quite remote cannot be separated from a feeling of wanting to feel sort of separate from and superior to other people, or something like that. There's an impulse behind it that I really have to reject. And we're just not competent yet to talk about these kinds of absolutes.

00:27:21.000
Okay, so I agree with you that a lot of technologists sort of lack this basic respect, understanding, and love for humanity. There's a separation there. The thing I'd like to push back against, it's not that you disagree, but I believe you can create technologies, and you can create a new kind of technologist, engineer, that does build systems that respect humanity, not just respect, but admire humanity, that have empathy for common humans, have compassion.

00:27:51.920
I mean, no, no, no, I think, yeah, I think musical instruments are a great example of that. Musical instruments are technologies that help people connect in fantastic ways, and that's a great example. My invention or design during the pandemic period was this thing called Together mode, where people see themselves seated sort of in a classroom or a theater instead of in squares. And it allows them to semi-consciously perform to each other as if they have proper eye contact, as if they're paying attention to each other nonverbally, and weirdly, that turns out to work. And so it promotes empathy, so far as I can tell. I hope it is of some use to somebody. The AI idea isn't really new. I would say it was born with Adam Smith's invisible hand: this idea that we build this algorithmic thing, it gets a bit beyond us, and then we think it must be smarter than us. And the thing about the invisible hand is that absolutely everybody has some line they draw where they say, no, no, no, we're gonna take control of this thing. They might have different lines, they might care about different things, but everybody ultimately became a Keynesian, because it just didn't work. It really wasn't that smart. It was sometimes smart and sometimes it failed, you know? People who really, really, really wanna believe the invisible hand is infinitely smart screw up their economies terribly. You have to recognize the economy as a subservient tool. Everybody does when it's to their advantage; they might not when it's not to their advantage. That's kind of an interesting game that happens. But the thing is, it's just like that with our algorithms. You can have a sort of Chicago-school economic philosophy about your computer: say, no, no, no, my thing's come alive, it's smarter than anything.

00:29:48.080
I think that there is a deep loneliness within all of us. This is what we seek: we seek love from each other. I think AI can help us connect deeper. This is what you criticize social media for, and I think there are much better ways of doing social media, ways that don't lead to manipulation but instead lead to deeper connection between humans, that lead to you becoming a better human being. And what that requires is some agency on the part of the AI, to be almost like a therapist, I mean, a companion. It's not telling you what's right. It's not guiding you as if it's an all-knowing thing. It's just another companion that you can leave at any time, that you have complete transparency and control over. There are a lot of mechanisms you can have that are counter to how current social media operates, that are subservient to humans, or no, that deeply respect human beings and are empathetic to their experience, and all those kinds of things. I think it's possible to create AI systems like that. And whether they need to have something that looks more like AI versus algorithms, something that has an identity, something that has a personality, all those kinds of things, I mean, that's a technical discussion. You've spoken extensively about how AI systems manipulate you within social networks. And the biggest problem isn't necessarily that social networks present you with advertisements that then get you to buy stuff. That's not the biggest problem. The biggest problem is that they then manipulate you. They alter your human nature to get you to buy stuff, or to get you to do whatever the advertiser wants. Or maybe you can correct me.

00:31:47.480
Yeah, I don't see it quite that way, but we can work with that as an approximation.

00:31:52.040
Sure, so my...

00:31:53.120
I think the actual thing is even sort of more ridiculous and stupider than that, but that's okay, let's...

00:31:58.160
So my question is, let's not use the word AI, but how do we fix it?

00:32:05.320
Oh, fixing social media. That diverts us into this whole other field, in my view, which is economics, which I always thought was really boring, but we have no choice but to turn into economists if we wanna fix this problem, because it's all about incentives. But I've been around this thing since it started, and I've been in the meetings where the social media companies sell themselves to the people who put the most money into them, which are usually the big advertising holding companies and whatnot. And there's this idea, that I think is kind of a fiction, and maybe it's even been recognized as that by everybody, that the algorithm will get really good at getting people to buy something. Because I think people have looked at their returns and looked at what happens, and everybody recognizes it's not exactly right. It's more like a cognitive-access blackmail payment at this point. Just to be connected, you're paying the money. It's not so much that the persuasion algorithms... So Stanford renamed its program, but it used to be called Engage Persuade. The engage part works; the persuade part is iffy. But the thing is that once people are engaged, in order for you to exist as a business, in order for you to be known at all, you have to put money into the...

00:33:23.080
Oh, that's dark.

00:33:24.440
Oh, no, that's not... It doesn't work, but they have to... But they're still... It's a giant cognitive-access blackmail scheme at this point.

00:33:32.520
So because the science behind the persuade part, it's not entirely a failure, but it's not what... We play make-believe that it works more than it does. The damage doesn't come... Honestly, as I've said in my books, I'm not anti-advertising. I actually think advertising can be demeaning and annoying and banal and ridiculous, and take up a lot of our time with stupid stuff. There are a lot of ways to criticize advertising that are accurate, and it can also lie, and all kinds of things. However, if I look at the biggest picture, I think advertising, at least as it was understood before social media, helped bring people into modernity in a way that overall actually did benefit people. And you might say, am I contradicting myself, because I was saying you shouldn't manipulate people? Yeah, I am, probably, here. I mean, I'm not pretending to have this perfect airtight worldview without some contradictions. I think there's a bit of a contradiction there, so.

00:34:37.880
Well, looking at the long arc of history, advertisement has, in some parts, benefited society, because it funded some efforts that perhaps...

00:34:46.600
Yeah, I mean, I think there's a thing where sometimes it's actually been of some use. Now, where the damage comes from is a different thing, though. Algorithms on social media have to work on feedback loops: they present you with stimulus, and they have to see if you respond to the stimulus. Now, the problem is that the measurement mechanism for telling if you respond, in the engagement feedback loop, is very, very crude. It's things like whether you click more, or occasionally whether you're staring at the screen more, if there's a forward-facing camera that's activated, but typically there isn't. So you have this incredibly crude back channel of information, and it's crude enough that it only catches sort of the more dramatic responses from you, and those are the fight-or-flight responses. Those are the things where you get scared or pissed off or aggressive or horny. These are these ancient, the sort of what are sometimes called the lizard-brain circuits or whatever, these fast-response, old, old, old evolutionary circuits that we have, that are helpful in survival once in a while but are not us at our best. They're not who we wanna be. They're not how we relate to each other. They're this old business. So then, when you're engaged using those, intrinsically, totally aside from whatever the topic is, you start to get incrementally just a little bit more paranoid, xenophobic, aggressive. You get a little stupid and you become a jerk, and it happens slowly. It's not like everybody's instantly transformed, but it does kind of happen progressively, where people who get hooked kind of get drawn more and more into this pattern of being at their worst.

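To make that feedback loop concrete, here is a minimal toy sketch in Python. Nothing in it is any platform's actual code; the item names, the click model, and every number are invented for illustration. The only point is that a ranker which can see nothing but a crude click signal drifts toward whatever triggers fast, reflexive responses.

# A toy sketch of the crude engagement feedback loop described above.
# Nothing here is any platform's real code; the item names, the click
# model, and all numbers are invented for illustration.
import random

random.seed(0)

ITEMS = {            # item -> "arousal" level (0 = calm, 1 = enraging)
    "calm_essay": 0.1,
    "cat_video": 0.3,
    "hot_take": 0.7,
    "outrage_bait": 0.9,
}

clicks = {item: 1.0 for item in ITEMS}   # smoothed click counts
shows = {item: 2.0 for item in ITEMS}    # smoothed impression counts

def user_clicks(arousal):
    # Assumption: reflexive fight-or-flight content gets clicked a bit
    # more often, because clicks are the only signal the loop can see.
    return random.random() < 0.2 + 0.6 * arousal

for _ in range(10_000):
    if random.random() < 0.05:           # occasional exploration
        item = random.choice(list(ITEMS))
    else:                                # otherwise rank by observed CTR
        item = max(ITEMS, key=lambda i: clicks[i] / shows[i])
    shows[item] += 1
    if user_clicks(ITEMS[item]):
        clicks[item] += 1

for item in ITEMS:
    print(f"{item:13s} CTR={clicks[item] / shows[item]:.2f} shows={shows[item]:.0f}")
# With these toy numbers the ranker ends up showing "outrage_bait" most
# of the time, even though nothing in the code "wants" outrage; it only
# ever sees the crude click back channel.
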
00:36:33.640
Would you say that people are able to, when they get hooked in this way, look back at themselves from 30 days ago and say, I am less happy with who I am now, or I'm not happy with who I am now versus who I was 30 days ago? Are they able to self-reflect, when you take yourself outside of the lizard brain?

00:36:52.600
Sometimes. I wrote a book suggesting that people take a break from their social media to see what happens, and maybe even... actually, the title of the book was just the arguments to delete your account.

00:37:04.160
Yeah, 10 arguments.

00:37:05.880
10 arguments. Although I always said, I don't know that you should. I can give you the arguments; it's up to you. I'm always very clear about that. But you know, I don't have a social media account, obviously, and it's not that easy for people to reach me. They have to search out an old-fashioned email address on a super crappy, antiquated website. I don't make it easy. And even with that, I get this huge flood of mail from people who say, oh, I quit my social media, I'm doing so much better, I can't believe how bad it was. But the thing is, what's for me a huge flood of mail would be an imperceptible trickle from the perspective of Facebook, right? And so I think it's rare for somebody to look at themselves and say, oh boy, I sure screwed myself over. It's a really hard thing to ask of somebody. None of us find that easy, right? It's just hard.

00:37:52.600
The reason I asked this is: is it possible to design social media systems that optimize for some longer-term metric of you being happy with yourself?

00:38:04.520
Well, see, I don't think you should try to engineer personal growth or happiness. I think what you should do is design a system that's just respectful of the people, that subordinates itself to the people and doesn't have perverse incentives. And then at least there's a chance of something decent happening.

00:38:19.800
You have to recommend stuff, right? So you're saying, like, be respectful. What does that actually mean, engineering-wise?

00:38:26.880
Yeah, curation. People have to be responsible. Algorithms shouldn't be recommending. Algorithms don't understand enough to recommend. Algorithms are crap in this era. I mean, I'm sorry, they are. And I'm not saying this as a critic from the outside; I'm in the middle of it. I know what they can do. I know the math. I know what the corpora are. I know the best ones. Our office is funding GPT-3 and all these things that are at the edge of what's possible, and they do not have it yet. I mean, it still is statistical emergent pseudo-semantics. It doesn't actually have deep representation emerging of anything. I mean, I'm speaking the truth here, and you know it.

00:39:08.600
Well, let me push back on this. There are several truths here. So one, you're speaking to the way certain companies operate currently. I don't think it's outside the realm of what's technically feasible to do; there's just no incentive. Like, companies are not... why fix this thing? I am aware that, for example, the YouTube search and discovery has been very helpful to me. There are so many videos that it's nice to have a little bit of help. But I'm still in control.

00:39:40.800
Let me ask you something. Have you done the experiment of letting YouTube recommend videos to you, either starting from an absolutely anonymous, random place where it doesn't know who you are, or from knowing who you or somebody else is, and then going 15 or 20 hops? Have you ever done that, just letting it go, top video recommended, and then just going 20 hops?

00:39:59.640
No, I've not.

00:40:00.480
I've done that many times now. Because of how large YouTube is and how widely it's used, it's very hard to get to enough scale to get a statistically solid result on this. I've done it with high school kids, with dozens of kids doing it at a time. Every time I've done the experiment, the majority of times, after about 17 or 18 hops you end up in really weird, paranoid, bizarre territory, because ultimately that is the stuff the algorithm rewards the most, because of the feedback crudeness I was just talking about. So I'm not saying that it never recommends something cool. I'm saying that its fundamental core is one that promotes a paranoid style, that promotes increasing irritability, that promotes xenophobia, promotes fear, anger, promotes selfishness, promotes separation between people. And the thing is, it's very hard to do this work solidly. Many have repeated this experiment, and yet it still is kind of anecdotal. I'd like to do a large citizen-science thing sometime and do it, but then I think the problem with that is YouTube would detect it and then change it.

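As an illustration of why long chains behave so differently from single hops, here is a toy Markov-chain sketch. The levels and probabilities are assumptions made up for this example, not measurements of YouTube; it only shows how a small per-hop bias toward more extreme content compounds over 18 or 20 hops.

# A toy Markov-chain model of the "20 hops" experiment described above.
# The levels, probabilities, and bias are assumptions for illustration,
# not measurements of YouTube.
import random

random.seed(1)

LEVELS = ["mainstream", "edgy", "fringe", "paranoid"]

def next_level(i):
    # Assumption: each hop drifts one step toward the extreme with
    # probability 0.45, back toward the mainstream with 0.30,
    # and stays put otherwise.
    r = random.random()
    if r < 0.45:
        return min(i + 1, len(LEVELS) - 1)
    if r < 0.75:
        return max(i - 1, 0)
    return i

def walk(hops):
    i = 0                 # start at a mainstream video
    for _ in range(hops):
        i = next_level(i)
    return LEVELS[i]

trials = 10_000
ends = [walk(18) for _ in range(trials)]
for level in LEVELS:
    print(f"after 18 hops: {level:10s} {ends.count(level) / trials:.0%}")
# With these made-up numbers, well over half of the 18-hop walks end at
# "fringe" or "paranoid", even though each single hop looks nearly neutral.
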
link |
00:41:09.080
Yes, I love that kind of stuff on Twitter.
link |
00:41:11.640
So Jack Dorsey has spoken about doing healthy conversations
link |
00:41:15.920
on Twitter or optimizing for healthy conversations.
link |
00:41:18.600
What that requires within Twitter
link |
00:41:20.160
are most likely citizen experiments
link |
00:41:23.520
of what does healthy conversation actually look like
link |
00:41:26.520
and how do you incentivize those healthy conversations
link |
00:41:29.160
you're describing what often happens
link |
00:41:32.160
and what is currently happening.
link |
00:41:33.960
What I'd like to argue is it's possible
link |
00:41:36.040
to strive for healthy conversations,
link |
00:41:39.040
not in a dogmatic way of saying,
link |
00:41:42.040
I know what healthy conversations are and I will tell you.
link |
00:41:44.800
I think one way to do this is to try to look around
link |
00:41:47.760
at social, maybe not things that are officially social media,
link |
00:41:51.200
but things where people are together online
link |
00:41:53.360
and see which ones have more healthy conversations,
link |
00:41:56.120
even if it's hard to be completely objective
link |
00:42:00.000
in that measurement, you can at least do it crudely.
link |
00:42:02.520
You could do subjective annotation
link |
00:42:05.240
like have a large crowdsourced annotation.
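A sketch of what that crowdsourced annotation might look like, with invented thread names and scores; the agreement measure is an assumed proxy, not an established metric.

```python
from statistics import mean, stdev

# Hypothetical crowd-sourced "conversation health" labels: several
# raters score the same thread from 1 (toxic) to 5 (healthy); we keep
# a crude agreement measure so low-consensus labels can be discounted.
ratings = {
    "thread_a": [4, 5, 4, 4],
    "thread_b": [1, 5, 2, 4],  # raters disagree, so low confidence
}

for thread, scores in ratings.items():
    agreement = 1.0 / (1.0 + stdev(scores))  # 1.0 = perfect agreement
    print(thread, "health:", round(mean(scores), 2),
          "agreement:", round(agreement, 2))
```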
link |
00:42:07.640
One that I've been really interested in is GitHub
link |
00:42:10.960
because it could change.
link |
00:42:14.360
I'm not saying it'll always be, but for the most part,
link |
00:42:17.280
GitHub has had a relatively low poison quotient.
link |
00:42:21.640
And I think there's a few things about GitHub
link |
00:42:24.680
that are interesting.
link |
00:42:26.480
One thing about it is that people have a stake in it.
link |
00:42:29.400
It's not just empty status games.
link |
00:42:31.720
There's actual code or there's actual stuff being done.
link |
00:42:35.040
And I think as soon as you have a real world stake
link |
00:42:37.360
in something, you have a motivation
link |
00:42:40.760
to not screw up that thing.
link |
00:42:42.520
And I think that that's often missing
link |
00:42:45.360
that there's no incentive for the person
link |
00:42:48.080
to really preserve something.
link |
00:42:49.480
If they get a little bit of attention
link |
00:42:51.440
from dumping on somebody's TikTok or something,
link |
00:42:55.840
they don't pay any price for it.
link |
00:42:56.840
But you have to kind of get decent with people
link |
00:43:00.640
when you have a shared stake, a little secret.
link |
00:43:03.040
So GitHub does a bit of that.
link |
00:43:06.720
GitHub is wonderful, yes.
link |
00:43:08.520
But I'm tempted to play the Jaron Lanier back at you,
link |
00:43:13.200
which is that, so GitHub currently is amazing.
link |
00:43:16.320
But the thing is, if you have a stake,
link |
00:43:18.280
then if it's a social media platform,
link |
00:43:20.360
they can use the fact that you have a stake
link |
00:43:22.440
to manipulate you because you want to preserve the stake.
link |
00:43:25.200
So like, so like.
link |
00:43:26.120
Right, well, this is why,
link |
00:43:27.680
all right, this gets us into the economics.
link |
00:43:29.160
So there's this thing called data dignity
link |
00:43:30.800
that I've been studying for a long time.
link |
00:43:32.920
I wrote a book about an earlier version of it
link |
00:43:34.840
called Who Owns the Future?
link |
00:43:36.960
And the basic idea of it is that,
link |
00:43:41.680
once again, this is a 30 year conversation.
link |
00:43:43.360
It's a fascinating topic.
link |
00:43:44.200
Let me do the fastest version of this I can do.
link |
00:43:46.720
The fastest way I know how to do this
link |
00:43:48.720
is to compare two futures, all right?
link |
00:43:51.880
So future one is the normative one,
link |
00:43:55.520
the one we're building right now.
link |
00:43:56.960
And future two is gonna be data dignity, okay?
link |
00:44:00.160
And I'm gonna use a particular population.
link |
00:44:03.000
I live on the hill in Berkeley.
link |
00:44:05.200
And one of the features about the hill
link |
00:44:06.880
is that as the climate changes,
link |
00:44:08.080
we might burn down and we'll lose our houses
link |
00:44:10.720
or die or something.
link |
00:44:11.560
Like it's dangerous, you know, and it didn't used to be.
link |
00:44:14.160
And so who keeps us alive?
link |
00:44:16.920
Well, the city does.
link |
00:44:18.280
The city does some things.
link |
00:44:19.440
The electric company kind of sort of,
link |
00:44:21.360
maybe hopefully better.
link |
00:44:23.360
Individual people who own property
link |
00:44:25.920
take care of their property.
link |
00:44:26.880
That's all nice.
link |
00:44:27.720
But there's this other middle layer,
link |
00:44:29.160
which is fascinating to me,
link |
00:44:30.880
which is that the groundskeepers
link |
00:44:33.480
who work up and down that hill,
link |
00:44:35.240
many of whom are not legally here,
link |
00:44:38.600
many of whom don't speak English,
link |
00:44:40.440
cooperate with each other
link |
00:44:42.480
to make sure trees don't touch
link |
00:44:44.240
and transfer fire easily from lot to lot.
link |
00:44:46.560
They have this whole little web
link |
00:44:48.040
that's keeping us safe.
link |
00:44:49.080
I didn't know about this at first.
link |
00:44:50.400
I just started talking to them
link |
00:44:52.400
because they were out there during the pandemic.
link |
00:44:54.240
And so I try to just see who are these people?
link |
00:44:56.720
Who are these people who are keeping us alive?
link |
00:44:59.240
Now, I want to talk about the two different phases
link |
00:45:01.400
for those people in your future one and future two.
link |
00:45:04.800
Future one, some weird like kindergarten paint job van
link |
00:45:10.320
with all these like cameras and weird things,
link |
00:45:11.880
drives up, observes what the gardeners
link |
00:45:13.680
and groundskeepers are doing.
link |
00:45:15.520
A few years later, some amazing robots
link |
00:45:18.120
that can shimmy up trees and all this show up.
link |
00:45:20.400
All those people are out of work
link |
00:45:21.520
and there are these robots doing the thing
link |
00:45:23.040
and the robots are good.
link |
00:45:23.960
And they can scale to more land
link |
00:45:26.280
and they're actually good.
link |
00:45:28.360
But then there are all these people out of work
link |
00:45:29.920
and these people have lost dignity.
link |
00:45:31.240
They don't know what they're going to do.
link |
00:45:32.840
And then somebody will say,
link |
00:45:34.240
well, they go on basic income, whatever.
link |
00:45:35.680
They become wards of the state.
link |
00:45:38.960
My problem with that solution is every time in history
link |
00:45:42.520
that you've had some centralized thing
link |
00:45:44.280
that's doling out the benefits,
link |
00:45:45.560
that thing gets seized by people
link |
00:45:47.200
because it's too centralized.
link |
00:45:49.480
This happened to every communist experiment I can find.
link |
00:45:53.160
So I think that turns into a poor future
link |
00:45:55.960
that will be unstable.
link |
00:45:57.640
I don't think people will feel good in it.
link |
00:45:59.160
I think it'll be a political disaster
link |
00:46:01.400
with a sequence of people seizing this central source
link |
00:46:04.440
of the basic income.
link |
00:46:06.720
And you'll say, oh no, an algorithm can do it.
link |
00:46:08.320
Then people will seize the algorithm.
link |
00:46:09.680
They'll seize control.
link |
00:46:11.160
Unless the algorithm is decentralized
link |
00:46:13.480
and it's impossible to seize the control.
link |
00:46:15.520
Yeah, but 60 something people
link |
00:46:20.000
own a quarter of all the Bitcoin.
link |
00:46:21.880
Like the things that we think are decentralized
link |
00:46:24.040
are not decentralized.
link |
00:46:25.840
So let's go to future two.
link |
00:46:27.720
Future two, the gardeners see that van with all the cameras
link |
00:46:32.360
and the kindergarten paint job,
link |
00:46:33.560
and they, the groundskeepers, say,
link |
00:46:35.320
hey, the robots are coming.
link |
00:46:37.560
We're going to form a data union.
link |
00:46:38.880
And amazingly, California has a little baby data union law
link |
00:46:43.240
emerging on the books.
link |
00:46:44.080
Yes.
link |
00:46:45.560
And so they say, we're going to form a data union
link |
00:46:52.520
and we're going to,
link |
00:46:53.600
not only are we going to sell our data to this place,
link |
00:46:56.280
but we're going to make it better than it would have been
link |
00:46:57.880
if they were just grabbing it without our cooperation.
link |
00:47:00.040
And we're going to improve it.
link |
00:47:01.720
We're going to make the robots more effective.
link |
00:47:03.320
We're going to make them better
link |
00:47:04.160
and we're going to be proud of it.
link |
00:47:05.280
We're going to become a new class of experts
link |
00:47:08.400
that are respected.
link |
00:47:09.760
And then here's the interesting,
link |
00:47:11.680
there's two things that are different about that world
link |
00:47:14.480
from future one.
link |
00:47:15.600
One thing, of course, the people have more pride.
link |
00:47:17.600
They have more sense of ownership, of agency,
link |
00:47:23.800
but what the robots do changes.
link |
00:47:27.200
Instead of just like this functional,
link |
00:47:29.960
like we'll figure out how to keep the neighborhood
link |
00:47:31.560
from burning down,
link |
00:47:33.520
you have this whole creative community
link |
00:47:35.320
that wasn't there before thinking,
link |
00:47:36.480
well, how can we make these robots better
link |
00:47:38.000
so we can keep on earning money?
link |
00:47:39.680
There'll be waves of creative groundskeeping
link |
00:47:44.240
with spiral pumpkin patches
link |
00:47:46.320
and waves of cultural things.
link |
00:47:47.920
There'll be new ideas like,
link |
00:47:49.440
wow, I wonder if we could do something
link |
00:47:51.520
about climate change mitigation with how we do this.
link |
00:47:54.400
What about fresh water?
link |
00:47:56.400
Can we make the food healthier?
link |
00:47:59.120
What about, all of a sudden,
link |
00:48:00.400
there'll be this whole creative community on the case?
link |
00:48:03.280
And isn't it nicer to have a high tech future
link |
00:48:06.080
with more creative classes
link |
00:48:07.520
than one with more dependent classes?
link |
00:48:09.200
Isn't that a better future?
link |
00:48:10.400
But, future one and future two
link |
00:48:14.560
have the same robots and the same algorithms.
link |
00:48:17.440
There's no technological difference.
link |
00:48:19.200
There's only a human difference.
link |
00:48:21.520
And that second future, future two, that's data dignity.
link |
00:48:25.600
The economy that you're, I mean,
link |
00:48:27.120
the game theory here is on the humans
link |
00:48:29.120
and then the technology is just the tools
link |
00:48:31.600
that enable both possibilities.
link |
00:48:33.360
I mean, I think you can believe in AI
link |
00:48:36.240
and be in future two.
link |
00:48:37.440
I just think it's a little harder.
link |
00:48:38.560
You have to do more contortions, it's possible.
link |
00:48:43.280
So in the case of social media,
link |
00:48:46.080
what does data dignity look like?
link |
00:48:49.120
Is it people getting paid for their data?
link |
00:48:51.440
Yeah, I think what should happen is in the future
link |
00:48:55.200
there should be massive data unions
link |
00:48:59.280
for people putting content into the system
link |
00:49:03.920
and those data unions should smooth out
link |
00:49:05.840
the results a little bit.
link |
00:49:06.800
So it's not winner take all. But at the same time,
link |
00:49:10.400
people have to pay for it too.
link |
00:49:11.600
They have to pay for Facebook
link |
00:49:13.520
the way they pay for Netflix
link |
00:49:14.960
with an allowance for the poor.
link |
00:49:17.360
There has to be a way out too.
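One way to picture the smoothing a data union could do; a minimal sketch with invented earnings and an assumed flat-dividend rule, not a real mechanism design.

```python
# Hypothetical data-union payout smoothing: raw creator earnings are
# heavy-tailed, so the union takes a cut of gross earnings and pays
# everyone a flat dividend, softening (not erasing) winner-take-all.
raw_earnings = {"alice": 12_000, "bob": 300, "carol": 40, "dan": 9_000}
union_cut = 0.15

pool = sum(raw_earnings.values()) * union_cut
dividend = pool / len(raw_earnings)

smoothed = {name: round(gross * (1 - union_cut) + dividend)
            for name, gross in raw_earnings.items()}
print(smoothed)  # top earners still do best, but the floor rises
```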
link |
00:49:20.160
But the thing is people do pay for Netflix.
link |
00:49:22.080
It's a going concern.
link |
00:49:24.320
People pay for Xbox and PlayStation.
link |
00:49:26.160
Like people, there's enough people
link |
00:49:27.760
to pay for stuff they want.
link |
00:49:28.880
This could happen too.
link |
00:49:29.680
It's just that this precedent started
link |
00:49:31.280
that moved it in the wrong direction.
link |
00:49:33.040
And then what has to happen,
link |
00:49:34.640
the economy is a measuring device.
link |
00:49:36.320
If it's an honest measuring device,
link |
00:49:39.280
the outcomes for people form a normal distribution,
link |
00:49:42.560
a bell curve.
link |
00:49:43.600
And then, so there should be a few people
link |
00:49:45.280
who do really well, a lot of people who do okay.
link |
00:49:47.680
And then we should have an expanding economy
link |
00:49:49.840
reflecting more and more creativity and expertise
link |
00:49:52.960
flowing through the network.
link |
00:49:54.640
And that expanding economy moves the result
link |
00:49:57.040
just a bit forward.
link |
00:49:57.840
So more people are getting money out of it
link |
00:50:00.000
than are putting money into it.
link |
00:50:01.280
So it gradually expands the economy
link |
00:50:03.040
and lifts all boats.
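The contrast between a bell-curve economy and a winner-take-all one can be sketched numerically; the distributions and parameters below are purely illustrative.

```python
import random

random.seed(1)
N = 100_000

# "Honest measuring device" economy: outcomes form a bell curve
# clustered around a livable middle.
bell = [random.gauss(50_000, 15_000) for _ in range(N)]

# Winner-take-all network-effect economy: a heavy-tailed power law.
pareto = [30_000 * random.paretovariate(1.5) for _ in range(N)]

def median(xs):
    return sorted(xs)[len(xs) // 2]

top_1pct_share = sum(sorted(pareto)[-N // 100:]) / sum(pareto)
print("bell median:", round(median(bell)))
print("pareto median:", round(median(pareto)),
      "| top 1% share:", round(top_1pct_share, 2))
```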
link |
00:50:04.480
And the society has to support the lower wing
link |
00:50:08.000
of the bell curve too, but not universal basic income.
link |
00:50:10.720
It has to be for them, because
link |
00:50:12.800
if it's an honest economy,
link |
00:50:15.280
there will be that lower wing
link |
00:50:17.760
and we have to support those people.
link |
00:50:19.360
There has to be a safety net.
link |
00:50:21.600
But see what I believe, I'm not gonna talk about AI,
link |
00:50:24.720
but I will say that I think there'll be more
link |
00:50:27.680
and more algorithms that are useful.
link |
00:50:29.760
And so I don't think everybody's gonna be supplying data
link |
00:50:33.440
to grounds keeping robots,
link |
00:50:34.800
nor do I think everybody's gonna make their living
link |
00:50:36.640
with TikTok videos.
link |
00:50:37.440
I think in both cases,
link |
00:50:38.800
there'll be a rather small contingent
link |
00:50:41.680
that do well enough at either of those things.
link |
00:50:43.920
But I think there might be many, many, many,
link |
00:50:46.880
many of those niches that start to evolve
link |
00:50:48.640
as there are more and more algorithms,
link |
00:50:49.920
more and more robots.
link |
00:50:50.880
And it's that large number that will create
link |
00:50:54.160
the economic potential for a very large part of society
link |
00:50:57.280
to become members of new creative classes.
link |
00:51:00.080
Do you think it's possible to create a social network
link |
00:51:05.040
that competes with Twitter and Facebook
link |
00:51:06.640
that's large and centralized in this way?
link |
00:51:08.800
Not centralized, sort of large, large.
link |
00:51:10.880
All right, so I gotta tell you,
link |
00:51:13.280
how to get from where we are
link |
00:51:16.400
to anything kind of in the zone
link |
00:51:18.160
of what I'm talking about is challenging.
link |
00:51:22.400
I know some of the people who run,
link |
00:51:24.640
like I know Jack Dorsey a tiny bit,
link |
00:51:26.960
and I view Jack as somebody who's actually,
link |
00:51:34.720
I think he's really striving and searching
link |
00:51:36.800
and trying to find a way to make it better,
link |
00:51:40.000
but is kind of like,
link |
00:51:42.240
it's very hard to do it while in flight
link |
00:51:44.080
and he's under enormous business pressure too.
link |
00:51:46.240
So Jack Dorsey to me is a fascinating study
link |
00:51:49.520
because I think his mind is in a lot of good places.
link |
00:51:52.480
He's a good human being,
link |
00:51:54.400
but there's a big Titanic ship
link |
00:51:56.320
that's already moving in one direction.
link |
00:51:57.760
It's hard to know what to do with it.
link |
00:51:59.040
I think that's the story of Twitter.
link |
00:52:02.560
One of the things that I observed is that
link |
00:52:04.480
if you just wanna look at the human side,
link |
00:52:06.400
meaning like how are people being changed?
link |
00:52:08.560
How do they feel?
link |
00:52:09.360
What's the culture like?
link |
00:52:11.360
Almost all of the social media platforms that get big
link |
00:52:15.760
have an initial sort of honeymoon period
link |
00:52:17.840
where they're actually kind of sweet and cute.
link |
00:52:19.680
Yeah.
link |
00:52:20.080
Like if you look at the early years of Twitter,
link |
00:52:22.160
it was really sweet and cute,
link |
00:52:23.520
but also look at Snap, TikTok.
link |
00:52:27.360
And then what happens is as they scale
link |
00:52:30.240
and the algorithms become more influential
link |
00:52:32.560
instead of just the early people,
link |
00:52:33.920
when it gets big enough that it's the algorithm running it,
link |
00:52:36.720
then you start to see the rise of the paranoid style
link |
00:52:39.440
and then they start to get dark.
link |
00:52:40.640
And we've seen that shift in TikTok rather recently.
link |
00:52:43.600
But I feel like that scaling reveals the flaws
link |
00:52:48.560
within the incentives.
link |
00:52:51.520
I feel like I'm torturing you.
link |
00:52:52.720
I'm sorry.
link |
00:52:53.520
It's not torture.
link |
00:52:54.320
No, because I have hope for the world with humans
link |
00:53:00.160
and I have hope for a lot of things that humans create,
link |
00:53:02.640
including technology.
link |
00:53:04.160
And I just, I feel it is possible to create
link |
00:53:07.520
social media platforms that incentivize
link |
00:53:11.760
different things than the current ones do.
link |
00:53:13.280
I think the current incentivization is around
link |
00:53:16.000
like the dumbest possible thing that was invented
link |
00:53:19.040
like 20 years ago, however long.
link |
00:53:21.600
And it just works and so nobody's changing it.
link |
00:53:24.000
I just think that there could be a lot of innovation
link |
00:53:26.560
for more, see, you kind of push back this idea
link |
00:53:29.360
that we can't know what longterm growth or happiness is.
link |
00:53:33.600
If you give control to people to define
link |
00:53:36.160
what their longterm happiness and goals are,
link |
00:53:39.280
then that optimization can happen
link |
00:53:42.320
for each of those individual people.
link |
00:53:43.840
Well, I mean, imagine a future where
link |
00:53:53.040
probably a lot of people would love to make their living
link |
00:53:57.600
doing TikTok dance videos, but people recognize generally
link |
00:54:01.920
that's kind of hard to get into.
link |
00:54:03.760
Nonetheless, dance crews have an experience
link |
00:54:07.440
that's very similar to programmers working together on GitHub.
link |
00:54:10.800
So the future is like a cross between TikTok and GitHub
link |
00:54:14.000
and they get together and they have rights.
link |
00:54:18.160
They're negotiating for returns.
link |
00:54:21.280
They join different artists societies
link |
00:54:23.600
in order to soften the blow of the randomness
link |
00:54:26.800
of who gets the network effect benefit
link |
00:54:29.120
because nobody can know that.
link |
00:54:31.040
And I think an individual person
link |
00:54:35.520
might join a thousand different data unions
link |
00:54:37.680
in the course of their lives, or maybe even 10,000.
link |
00:54:40.160
I don't know, but the point is that we'll have
link |
00:54:42.000
like these very hedged, distributed portfolios
link |
00:54:45.520
of different data unions we're part of.
link |
00:54:47.760
And some of them might just trickle in a little money
link |
00:54:50.080
for nonsense stuff where we're contributing
link |
00:54:52.720
to health studies or something.
link |
00:54:54.880
But I think people will find their way.
link |
00:54:56.880
They'll find their way to the right GitHub like community
link |
00:55:00.320
in which they find their value in the context
link |
00:55:03.680
of supplying inputs and data and taste
link |
00:55:07.200
and correctives and all of this into the algorithms
link |
00:55:11.200
and the robots of the future.
link |
00:55:14.640
And that is a way to resist
link |
00:55:18.720
the lizard brain based funding mechanisms.
link |
00:55:22.880
It's an alternate economic system
link |
00:55:25.120
that rewards productivity, creativity,
link |
00:55:28.800
value as perceived by others.
link |
00:55:30.240
It's a genuine market.
link |
00:55:31.360
It's not doled out from a center.
link |
00:55:32.960
There's not some communist person deciding who's valuable.
link |
00:55:36.160
It's actual market.
link |
00:55:38.560
And the money is made by supporting that
link |
00:55:43.280
instead of just grabbing people's attention
link |
00:55:46.320
in the cheapest possible way,
link |
00:55:47.520
which is definitely how you get the lizard brain.
link |
00:55:49.760
Yeah, okay.
link |
00:55:51.040
So we're finally in agreement.
link |
00:55:55.600
But I just think that...
link |
00:55:59.120
So yeah, I'll tell you how I think we fix social media.
link |
00:56:03.120
There's a few things.
link |
00:56:05.360
So one, I think people should have complete control
link |
00:56:07.840
over their data and transparency of what that data is
link |
00:56:11.600
and how it's being used if they do hand over the control.
link |
00:56:14.640
Another thing: they should be able to delete,
link |
00:56:16.400
walk away with their data at any moment, easily.
link |
00:56:19.520
Like with a single click of a button, maybe two buttons,
link |
00:56:22.000
I don't know, just easily walk away with their data.
link |
00:56:26.240
The other is control of the algorithm,
link |
00:56:28.080
individualized control of the algorithm for them.
link |
00:56:31.120
So each one has their own algorithm.
link |
00:56:33.360
Each person has their own algorithm.
link |
00:56:34.720
They get to be the decider of what they see in this world.
link |
00:56:38.960
And to me, that's, I guess, fundamentally decentralized
link |
00:56:43.680
in terms of the key decisions being made.
link |
00:56:45.920
But if that's made transparent,
link |
00:56:47.360
I feel like people will choose that system
link |
00:56:50.080
over Twitter of today, over Facebook of today,
link |
00:56:53.360
when they have the ability to walk away,
link |
00:56:55.200
to control their data
link |
00:56:56.560
and to control the kinds of things they see.
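A minimal sketch of what "each person has their own algorithm" could mean in code, assuming a platform that exposes raw candidate posts and accepts a user-supplied scorer; every name and field here is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Post:
    author: str
    topic: str
    engagement: float  # platform-measured clicks/likes

def my_ranker(post: Post) -> float:
    # User-defined: prefer long-term interests over raw engagement.
    # Note the platform's engagement signal can be ignored entirely.
    topic_weights = {"science": 2.0, "music": 1.5, "outrage": 0.1}
    return topic_weights.get(post.topic, 1.0)

def build_feed(posts: List[Post],
               ranker: Callable[[Post], float]) -> List[Post]:
    # The platform supplies candidates; the user supplies the ranking.
    return sorted(posts, key=ranker, reverse=True)

posts = [Post("a", "outrage", 9.0), Post("b", "science", 2.0),
         Post("c", "music", 4.0)]
for p in build_feed(posts, my_ranker):
    print(p.topic, p.author)
```

The design point is that the ranking function itself, not just its settings, lives on the user's side.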
link |
00:56:58.960
Now, let's walk away from the term AI.
link |
00:57:01.520
You're right.
link |
00:57:02.320
In this case, you have full control
link |
00:57:05.360
of the algorithms that help you
link |
00:57:07.360
if you want to use their help.
link |
00:57:09.760
But you can also say FU to those algorithms
link |
00:57:12.320
and just consume the raw, beautiful waterfall
link |
00:57:17.680
of the internet.
link |
00:57:18.800
I think that, to me, that not only fixes social media,
link |
00:57:22.960
but I think it would make a lot more money.
link |
00:57:24.800
So I would like to challenge the idea.
link |
00:57:26.560
I know you're not presenting that,
link |
00:57:27.760
but that the only way to make a ton of money
link |
00:57:30.400
is to operate like Facebook is.
link |
00:57:32.320
I think you can make more money by giving people control.
link |
00:57:36.240
Yeah, I mean, I certainly believe that.
link |
00:57:38.320
We're definitely in the territory
link |
00:57:40.160
of a wholehearted agreement here.
link |
00:57:44.880
I do want to caution against one thing,
link |
00:57:47.040
which is making a future that benefits programmers
link |
00:57:50.880
versus this idea that people are in control of their data.
link |
00:57:53.680
So years ago, I cofounded an advisory board for the EU
link |
00:57:58.240
with a guy named Giovanni Buttarelli,
link |
00:57:59.120
who passed away.
link |
00:58:00.960
It's one of the reasons I wanted to mention it.
link |
00:58:02.320
A remarkable guy who'd been,
link |
00:58:04.960
he was originally a prosecutor
link |
00:58:06.640
who was throwing mafioso in jail in Sicily.
link |
00:58:10.960
So he was like this intense guy who was like,
link |
00:58:13.920
I've dealt with death threats.
link |
00:58:16.080
Mark Zuckerberg doesn't scare me or whatever.
link |
00:58:17.840
So we worked on this path of saying,
link |
00:58:21.040
let's make it all about transparency and consent.
link |
00:58:23.120
And it was one of the feeders that led to this huge data
link |
00:58:26.960
privacy and protection framework in Europe called the GDPR.
link |
00:58:30.960
And so therefore we've been able to have empirical feedback
link |
00:58:35.520
on how that goes.
link |
00:58:36.320
And the problem is that most people actually get stymied
link |
00:58:41.280
by the complexity of that kind of management.
link |
00:58:44.080
They have trouble and reasonably so.
link |
00:58:46.960
I don't, I'm like a techie.
link |
00:58:48.720
I can go in and I can figure out what's going on.
link |
00:58:51.600
But most people really do get stymied.
link |
00:58:54.000
And so there's a problem that it differentially benefits
link |
00:59:00.000
those who kind of have a technical mindset
link |
00:59:02.240
and can go in and sort of have a feeling
link |
00:59:04.000
for how this stuff works.
link |
00:59:05.840
I kind of still want to come back to incentives.
link |
00:59:08.320
And so if the incentive for whoever is,
link |
00:59:11.840
if the commercial incentive is to help the creative people
link |
00:59:14.320
of the future make more money,
link |
00:59:15.440
because you get a cut of it,
link |
00:59:17.440
that's how you grow an economy.
link |
00:59:19.040
Not the programmers.
link |
00:59:20.080
Well, some of them will be programmers.
link |
00:59:24.800
It's not anti programmer.
link |
00:59:26.560
I'm just saying that it's not only programmers, you know?
link |
00:59:30.480
So, yeah, you have to make sure the incentives are right.
link |
00:59:35.520
I mean, control is an interface problem,
link |
00:59:40.320
where you have to create something that's compelling
link |
00:59:43.920
to everybody, to the creatives, to the public.
link |
00:59:47.920
I mean, there's, I don't know, Creative Commons,
link |
00:59:51.920
like the licensing, there's a bunch of legal speak
link |
00:59:57.200
just in general, the whole legal profession.
link |
01:00:00.160
It's nice when it can be simplified
link |
01:00:01.680
in the way that you can truly simply understand.
link |
01:00:03.920
Everybody can simply understand the basics.
link |
01:00:07.680
In the same way, it should be very simple to understand
link |
01:00:12.560
how the data is being used
link |
01:00:14.800
and what data is being used for people.
link |
01:00:17.440
But then you're arguing that in order for that to happen,
link |
01:00:20.400
you have to have the incentives aligned.
link |
01:00:22.400
I mean, a lot of the reason that money works
link |
01:00:26.560
is actually information hiding and information loss.
link |
01:00:30.160
Like one of the things about money
link |
01:00:32.160
is a particular dollar you get
link |
01:00:34.240
might have passed through your enemy's hands
link |
01:00:36.320
and you don't know it.
link |
01:00:37.600
But also, I mean, this is what Adam Smith,
link |
01:00:40.320
if you wanna give the most charitable interpretation possible
link |
01:00:43.440
to the invisible hand is what he was saying,
link |
01:00:45.920
is that like there's this whole complicated thing
link |
01:00:48.480
and not only do you not need to know about it,
link |
01:00:50.480
the truth is you'd never be able to follow it if you tried
link |
01:00:52.720
and just like let the economic incentives
link |
01:00:55.840
solve for this whole thing.
link |
01:00:58.160
And that in a sense, every transaction
link |
01:01:00.880
is like a neuron in a neural net.
link |
01:01:02.960
If he'd had that metaphor, he would have used it
link |
01:01:05.600
and let the whole thing settle to a solution
link |
01:01:07.920
and don't worry about it.
link |
01:01:10.080
I think this idea of having incentives
link |
01:01:13.520
that reduce complexity for people
link |
01:01:15.760
can be made to work.
link |
01:01:17.200
And that's an example of an algorithm
link |
01:01:19.040
that could be manipulative or not,
link |
01:01:20.480
going back to your question before
link |
01:01:21.600
about can you do it in a way that's not manipulative?
link |
01:01:24.320
And I would say a GitHub like,
link |
01:01:28.000
if you just have this vision,
link |
01:01:29.120
GitHub plus TikTok combined, is it possible?
link |
01:01:33.200
I think it is.
link |
01:01:34.160
I really think it is.
link |
01:01:34.640
I'm not gonna be able to unsee that idea
link |
01:01:38.560
of creatives on TikTok collaborating
link |
01:01:40.320
in the same way that people on GitHub collaborate.
link |
01:01:42.400
Why not?
link |
01:01:42.880
I like that kind of version.
link |
01:01:44.480
Why not?
link |
01:01:45.440
I like it, I love it.
link |
01:01:46.320
I just like, right now when people use,
link |
01:01:48.640
by the way, I'm the father of a teenage daughter.
link |
01:01:50.240
It's all about TikTok, right?
link |
01:01:53.440
So, when people use TikTok,
link |
01:01:55.440
there's a lot of, it's kind of funny,
link |
01:01:58.960
I was gonna say cattiness,
link |
01:01:59.920
but I was just using the cat
link |
01:02:01.440
as this exemplar of what we're talking about.
link |
01:02:04.480
I contradict myself.
link |
01:02:05.440
But anyway, there's all this cattiness
link |
01:02:07.120
where people are like,
link |
01:02:07.760
ew, this person's, ew.
link |
01:02:09.600
And I just, what about people getting together
link |
01:02:13.440
and kind of saying,
link |
01:02:14.640
okay, we're gonna work on this move.
link |
01:02:16.320
We're gonna get a better,
link |
01:02:17.120
can we get a better musician?
link |
01:02:18.320
Like, and they do that,
link |
01:02:20.160
but that's the part
link |
01:02:21.920
that's kind of off the books right now.
link |
01:02:25.120
That should be like right there.
link |
01:02:26.160
That should be the center.
link |
01:02:27.040
That's where the, that's the really best part.
link |
01:02:29.280
Well, that's where the invention of Git, period,
link |
01:02:31.840
the versioning, is brilliant.
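The versioning idea being praised, content addressing, can be sketched in a few lines; this is a toy, not real Git, which hashes a "blob" header plus the content and builds trees and commits on top.

```python
import hashlib

# Toy content-addressed store in the spirit of Git: every version is
# stored under the hash of its content, so identical content dedupes
# and history can be an immutable chain of hashes.
store = {}

def put(content: bytes) -> str:
    oid = hashlib.sha1(content).hexdigest()
    store[oid] = content
    return oid

v1 = put(b"dance move: spin")
v2 = put(b"dance move: spin, then drop")
print(v1 != v2, store[v1].decode())
```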
link |
01:02:33.280
And so some of the things
link |
01:02:35.440
you're talking about,
link |
01:02:36.720
technology, algorithms, tools can empower.
link |
01:02:40.240
And that's the thing for humans to connect,
link |
01:02:43.600
to collaborate and so on.
link |
01:02:44.880
Can we upset more people a little bit?
link |
01:02:49.200
Maybe we'd have to try.
link |
01:02:50.640
No, no.
link |
01:02:51.040
Can we, can I ask you to elaborate?
link |
01:02:53.680
Because my intuition was that
link |
01:02:55.760
you would be a supporter of something
link |
01:02:57.120
like cryptocurrency and Bitcoin
link |
01:02:59.200
because it fundamentally emphasizes decentralization.
link |
01:03:02.880
What do you, so can you elaborate?
link |
01:03:05.520
Yeah.
link |
01:03:06.240
Okay, look.
link |
01:03:06.720
Your thoughts on Bitcoin.
link |
01:03:07.920
It's kind of funny.
link |
01:03:10.560
I've been advocating
link |
01:03:14.960
some kind of digital currency for a long time.
link |
01:03:17.520
And when Bitcoin came out
link |
01:03:22.800
and the original paper on blockchain,
link |
01:03:26.160
my heart kind of sank because I thought,
link |
01:03:29.200
Oh my God, we're applying all of this fancy thought
link |
01:03:32.560
and all these very careful distributed security
link |
01:03:35.280
measures to recreate the gold standard.
link |
01:03:38.400
Like it's just so retro.
link |
01:03:40.320
It's so dysfunctional.
link |
01:03:42.000
It's so useless from an economic point of view.
link |
01:03:44.000
And then the other thing
link |
01:03:46.320
is using computational inefficiency
link |
01:03:48.880
at a boundless scale as your form of security
link |
01:03:51.680
is a crime against the atmosphere.
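The "computational inefficiency as security" point is the proof-of-work scheme; a toy sketch of the idea, not Bitcoin's actual block format: keep hashing until the digest clears a difficulty target, so the security cost scales directly with burned compute.

```python
import hashlib

# Minimal proof-of-work sketch: miners must burn compute until a hash
# falls below a difficulty target. Raising difficulty_bits multiplies
# the expected work, which is exactly the energy-use objection above.
def mine(block_data: bytes, difficulty_bits: int) -> int:
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine(b"toy block", difficulty_bits=20)  # ~1M hashes on average
print("found nonce:", nonce)
```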
link |
01:03:53.920
Obviously a lot of people know that now,
link |
01:03:55.360
but we knew that at the start.
link |
01:03:57.440
Like the thing is when the first paper came out,
link |
01:03:59.600
I remember a lot of people saying,
link |
01:04:00.560
Oh my God, if this thing scales,
link |
01:04:02.400
it's a carbon disaster, you know?
link |
01:04:04.400
And I'm just mystified,
link |
01:04:09.120
but that's a different question than what you asked,
link |
01:04:11.440
can you have a cryptographic currency
link |
01:04:15.040
or at least some kind of digital currency
link |
01:04:17.280
that's of a benefit?
link |
01:04:18.240
And absolutely.
link |
01:04:19.440
And there are people who are trying
link |
01:04:21.360
to be thoughtful about this.
link |
01:04:22.560
If you haven't,
link |
01:04:23.680
you should interview Vitalik Buterin sometime.
link |
01:04:25.760
Yeah, I've interviewed him twice.
link |
01:04:27.600
Okay.
link |
01:04:28.080
So like there are people in the community
link |
01:04:29.920
who are trying to be thoughtful
link |
01:04:30.880
and trying to figure out how to do this better.
link |
01:04:32.800
It has nice properties though, right?
link |
01:04:34.240
So one of the nice properties is that,
link |
01:04:36.000
unlike government-centralized currency, it's hard to control.
link |
01:04:38.240
And then the other one, to fix some of the issues
link |
01:04:40.720
that you're referring to,
link |
01:04:41.600
(I'm sort of playing devil's advocate here) is,
link |
01:04:43.600
you know, the Lightning Network.
link |
01:04:44.880
There are ideas for how you build stuff
link |
01:04:48.400
on top of Bitcoin, similar to gold,
link |
01:04:50.400
that allow you to have this kind of vibrant economy
link |
01:04:53.760
that operates not on the blockchain,
link |
01:04:55.600
but outside the blockchain.
link |
01:04:56.720
And you use Bitcoin for, like,
link |
01:05:00.800
checking the security of those transactions.
link |
01:05:02.640
So Bitcoin's not new.
link |
01:05:03.760
It's been around for a while.
link |
01:05:05.280
I've been watching it closely.
link |
01:05:07.440
I've not seen one example of it
link |
01:05:11.040
creating economic growth.
link |
01:05:12.880
There was this obsession with the idea
link |
01:05:14.320
that government was the problem,
link |
01:05:16.080
that idea that government's the problem.
link |
01:05:18.400
Let's say government earned that wrath, honestly,
link |
01:05:22.720
because if you look at some of the things
link |
01:05:25.040
that governments have done in recent decades,
link |
01:05:27.040
it's not a pretty story.
link |
01:05:28.960
Like, after a very small number
link |
01:05:32.960
of people in the US government decided to bomb
link |
01:05:35.920
and landmine Southeast Asia, it's hard to come back
link |
01:05:40.080
and say, oh, government's this great thing.
link |
01:05:41.760
But then the problem is that this resistance
link |
01:05:47.120
to government is basically resistance to politics.
link |
01:05:50.880
It's a way of saying, if I can get rich,
link |
01:05:53.120
nobody should bother me.
link |
01:05:54.240
It's a way of not having obligations to others.
link |
01:05:56.880
And that ultimately is a very suspect motivation.
link |
01:06:00.400
But does that mean that the impulse that the government
link |
01:06:06.560
should not overreach its power is flawed?
link |
01:06:09.280
Well, I mean, what I want to ask you to do
link |
01:06:12.080
is to replace the word government with politics.
link |
01:06:15.520
Like our politics is people having to deal with each other.
link |
01:06:20.000
My theory about freedom is that the only authentic form
link |
01:06:23.760
of freedom is perpetual annoyance.
link |
01:06:26.400
All right.
link |
01:06:27.120
So annoyance means you're actually dealing with people
link |
01:06:30.560
because people are annoying.
link |
01:06:31.760
Perpetual means that that annoyance is survivable
link |
01:06:34.480
so it doesn't destroy us all.
link |
01:06:36.080
So if you have perpetual annoyance,
link |
01:06:37.600
then you have freedom.
link |
01:06:38.400
And that's politics.
link |
01:06:39.680
That's politics.
link |
01:06:40.560
If you don't have perpetual annoyance,
link |
01:06:42.800
something's gone very wrong and you've suppressed those people
link |
01:06:45.440
and it's only temporary.
link |
01:06:46.400
It's going to come back and be horrible.
link |
01:06:48.400
You should seek perpetual annoyance.
link |
01:06:50.880
I'll invite you to a Berkeley city council meeting
link |
01:06:52.800
so you can know what that feels like.
link |
01:06:53.920
What perpetual annoyance feels like.
link |
01:06:57.280
But anyway, so freedom is being...
link |
01:06:59.840
The test of freedom is that you're annoyed by other people.
link |
01:07:02.080
If you're not, you're not free.
link |
01:07:03.440
If you're not, you're trapped in some temporary illusion
link |
01:07:06.000
that's going to fall apart.
link |
01:07:07.680
Now, this quest to avoid government
link |
01:07:10.400
is really a quest to avoid that political feeling,
link |
01:07:12.880
but you have to have it.
link |
01:07:14.080
You have to deal with it.
link |
01:07:16.480
And it sucks, but that's the human situation.
link |
01:07:19.200
That's the human condition.
link |
01:07:20.560
And this idea that we're going to have this abstract thing
link |
01:07:22.800
that protects us from having to deal with each other
link |
01:07:25.200
is always an illusion.
link |
01:07:26.640
The idea, and I apologize,
link |
01:07:28.560
I overstretched the use of the word government.
link |
01:07:32.320
The idea is there should be some punishment from the people
link |
01:07:37.200
when a bureaucracy, when a set of people
link |
01:07:40.960
or a particular leader, like in an authoritarian regime,
link |
01:07:44.400
which more than half the world currently lives under,
link |
01:07:47.040
if they become, they stop representing the people,
link |
01:07:53.680
it stops being like a Berkeley meeting
link |
01:07:56.560
and starts being more like a dictatorial kind of situation.
link |
01:08:01.520
And so the point is, it's nice to give people,
link |
01:08:05.920
the populace in a decentralized way,
link |
01:08:08.880
power to resist that kind of government becoming overly authoritarian.
link |
01:08:15.360
Yeah, but see, this idea that the problem
link |
01:08:18.160
is always the government being powerful is false.
link |
01:08:21.360
The problem can also be criminal gangs.
link |
01:08:23.440
The problem can also be weird cults.
link |
01:08:25.440
The problem can be abusive clergy.
link |
01:08:30.320
The problem can be infrastructure that fails.
link |
01:08:35.040
The problem can be poisoned water.
link |
01:08:37.440
The problem can be failed electric grids.
link |
01:08:39.760
The problem can be a crappy education system
link |
01:08:45.440
that makes the whole society less and less able to create value.
link |
01:08:51.120
There are all these other problems
link |
01:08:52.640
that are different from an overbearing government.
link |
01:08:54.480
Like you have to keep some sense of perspective
link |
01:08:56.800
and not be obsessed with only one kind of problem
link |
01:08:59.120
because then the others will pop up.
link |
01:09:01.040
But empirically speaking, some problems are bigger than others.
link |
01:09:05.120
So like some groups of people,
link |
01:09:08.720
like governments or gangs or companies lead to problems.
link |
01:09:12.080
Are you a US citizen?
link |
01:09:13.520
Yes.
link |
01:09:14.080
Has the government ever really been a problem for you?
link |
01:09:16.480
Well, okay.
link |
01:09:17.200
So first of all, I grew up in the Soviet Union.
link |
01:09:20.480
Yeah, my wife did too.
link |
01:09:22.240
So I have seen, and has the government bothered me?
link |
01:09:28.800
I would say that that's a really complicated question,
link |
01:09:32.720
especially because the United States is such,
link |
01:09:34.720
it's a special place like a lot of other countries.
link |
01:09:39.440
My wife's family were refuseniks.
link |
01:09:41.680
And so we have like a very,
link |
01:09:43.040
and her dad was sent to the Gulag.
link |
01:09:46.080
For what it's worth on my father's side,
link |
01:09:49.120
all but a few were killed by a pogrom
link |
01:09:51.360
in a post Soviet pogrom in Ukraine.
link |
01:09:57.040
So I would say, because you did a little
link |
01:10:00.000
eloquent trick of language
link |
01:10:02.720
that you switched to the United States
link |
01:10:04.640
to talk about government.
link |
01:10:06.160
So I believe unlike my friend,
link |
01:10:09.840
Michael Malice, who's an anarchist,
link |
01:10:11.920
I believe government can do a lot of good in the world.
link |
01:10:15.600
That is exactly what you're saying,
link |
01:10:16.960
which is it's politics.
link |
01:10:19.680
The thing that Bitcoin folks and cryptocurrency folks argue
link |
01:10:22.800
is that one of the big ways that government
link |
01:10:25.520
can control the populace is centralized bank,
link |
01:10:27.600
like control the money.
link |
01:10:29.920
That was the case in the Soviet Union too.
link |
01:10:32.160
Inflation can really make poor people suffer.
link |
01:10:38.480
And so what they argue is this is one way to go around
link |
01:10:43.600
that power that government has
link |
01:10:46.160
of controlling the monetary system.
link |
01:10:48.400
So that's a way to resist.
link |
01:10:50.080
That's not actually saying government bad.
link |
01:10:53.280
That's saying some of the ways
link |
01:10:55.520
that central banks get into trouble
link |
01:10:59.600
can be resisted through decentralization.
link |
01:11:01.120
So let me ask you on balance today in the real world
link |
01:11:04.960
in terms of actual facts,
link |
01:11:07.520
do you think cryptocurrencies are doing more
link |
01:11:10.000
to prop up corrupt, murderous, horrible regimes
link |
01:11:13.840
or to resist those regimes?
link |
01:11:15.600
Where do you think the balance is right now?
link |
01:11:17.440
I know exactly having talked to a lot of cryptocurrency folks
link |
01:11:21.360
what they would tell me, right?
link |
01:11:22.720
It's hard, I don't... No, no.
link |
01:11:27.440
I'm asking it as a real question.
link |
01:11:29.200
There's no way to know the answer perfectly.
link |
01:11:32.640
However, I gotta say, if you look at people
link |
01:11:36.400
who've been able to decode blockchains
link |
01:11:39.680
and they do leak a lot of data.
link |
01:11:41.040
They're not as secure as is widely thought.
link |
01:11:43.520
There are a lot of unknown Bitcoin whales
link |
01:11:47.200
from pretty early and they're huge.
link |
01:11:49.760
And if you ask, who are these people?
link |
01:11:54.880
There's evidence that a lot of them are quite
link |
01:11:57.600
not the people you'd wanna support, let's say.
link |
01:12:00.240
And I just don't, like, I think empirically
link |
01:12:03.760
this idea that there's some intrinsic way
link |
01:12:07.200
that bad governments will be disempowered
link |
01:12:13.520
and people will be able to resist them more
link |
01:12:16.000
than new villains or even villainous governments
link |
01:12:18.800
will be empowered.
link |
01:12:19.600
There's no basis for that assertion.
link |
01:12:21.840
It just is kind of circumstantial.
link |
01:12:23.840
And I think in general, Bitcoin ownership is one thing,
link |
01:12:30.560
but Bitcoin transactions have tended
link |
01:12:32.960
to support criminality more than productivity.
link |
01:12:36.640
Of course, they would argue that was the story
link |
01:12:38.960
of its early days, that now more and more Bitcoin
link |
01:12:42.480
is being used for legitimate transactions, but...
link |
01:12:46.240
That's the difference.
link |
01:12:47.040
I didn't say for legitimate transactions.
link |
01:12:48.560
I said for economic growth, for creativity.
link |
01:12:51.440
Like, I think what's happening is people are using it
link |
01:12:56.400
a little bit for buying, I don't know,
link |
01:12:58.720
maybe some of these companies make it available
link |
01:13:02.160
for this and that, they buy a Tesla with it or something.
link |
01:13:04.480
Investing in a startup? Hard. It might've happened
link |
01:13:10.480
a little bit, but it's not an engine of productivity,
link |
01:13:13.360
creativity, and economic growth,
link |
01:13:15.600
whereas old fashioned currency still is.
link |
01:13:17.840
And anyway, look, I think something...
link |
01:13:24.080
I'm pro the idea of digital currencies.
link |
01:13:28.000
I am anti the idea of economics wiping out politics
link |
01:13:36.160
as a result.
link |
01:13:37.520
I think they have to exist in some balance
link |
01:13:39.840
to avoid the worst dysfunctions of each.
link |
01:13:42.320
In some ways, there's parallels to our discussion
link |
01:13:44.640
of algorithms and cryptocurrency: you're pro the idea,
link |
01:13:50.720
but it can be used to manipulate,
link |
01:13:54.240
it can be used poorly by the aforementioned humans.
link |
01:13:59.200
Well, I think that you can make better designs
link |
01:14:02.000
and worse designs.
link |
01:14:04.400
And the thing about cryptocurrency that's so interesting
link |
01:14:07.680
is how many of us are responsible for the poor designs
link |
01:14:12.560
because we're all so hooked on that Horatio Alger story
link |
01:14:16.720
of, like, I'm gonna be the one who gets the viral benefit.
link |
01:14:20.720
Way back when all this stuff was starting,
link |
01:14:22.720
I remember it would have been in the 80s,
link |
01:14:24.720
somebody had the idea of using viral
link |
01:14:26.640
as a metaphor for network effect.
link |
01:14:29.600
And the whole point was to talk about
link |
01:14:32.160
how bad network effect was,
link |
01:14:33.520
that it always created distortions
link |
01:14:35.760
that ruined the usefulness of economic incentives
link |
01:14:39.200
that created dangerous distortions.
link |
01:14:42.400
Like, but then somehow, even after the pandemic,
link |
01:14:45.440
we think of viral as this good thing
link |
01:14:46.960
because we imagine ourselves as the virus, right?
link |
01:14:49.200
We wanna be on the beneficiary side of it.
link |
01:14:52.000
But of course, you're not likely to be.
link |
01:14:54.480
There is a sense because money is involved,
link |
01:14:56.880
people are not always reasoning clearly
link |
01:15:01.440
because they want to be part of that first viral wave
link |
01:15:06.400
that makes them rich.
link |
01:15:07.520
And that blinds people to their basic morality.
link |
01:15:11.200
I had an interesting conversation.
link |
01:15:14.000
I sort of feel like I should respect some people's privacy,
link |
01:15:16.400
but some of the initial people who started Bitcoin,
link |
01:15:20.720
I remember having an argument about like,
link |
01:15:24.320
it's intrinsically a Ponzi scheme,
link |
01:15:26.400
like the early people have more than the later people.
link |
01:15:29.520
And the further down the chain you get,
link |
01:15:31.680
the more you're subject to gambling like dynamics
link |
01:15:34.800
where it's more and more random
link |
01:15:36.000
and more and more subject to weird network effects
link |
01:15:37.760
and whatnot unless you're a very small player perhaps
link |
01:15:41.280
and you're just buying something,
link |
01:15:42.880
but even then you'll be subject to fluctuations
link |
01:15:45.120
because the whole thing is just kind of,
link |
01:15:48.080
as it fluctuates,
link |
01:15:48.960
it's gonna wave around the little people more.
link |
01:15:51.680
And I remember the conversation turned to gambling
link |
01:15:55.200
because gambling is a pretty large economic sector.
link |
01:15:58.080
And it's always struck me as being nonproductive.
link |
01:16:01.440
Like somebody goes to Las Vegas and they lose money.
link |
01:16:03.680
And so one argument is, well, they got entertainment.
link |
01:16:06.880
They paid for entertainment as they lost money.
link |
01:16:08.960
So that's fine.
link |
01:16:10.320
And Las Vegas does up the losing of money
link |
01:16:13.360
in an entertaining way.
link |
01:16:14.160
So why not?
link |
01:16:14.720
It's like going to a show.
link |
01:16:15.920
So that's one argument.
link |
01:16:17.520
The argument that was made to me was different from that.
link |
01:16:19.680
It's that, no, what they're doing
link |
01:16:21.200
is they're getting a chance to experience hope.
link |
01:16:23.680
And a lot of people don't get that chance.
link |
01:16:25.360
And so that's really worth it.
link |
01:16:26.560
Even if they're gonna lose,
link |
01:16:27.440
they have that moment of hope
link |
01:16:28.960
and they need to be able to experience that.
link |
01:16:31.120
And it was a very interesting argument.
link |
01:16:33.600
That's so heartbreaking, but I've seen that.
link |
01:16:39.920
I have a little bit of that sense.
link |
01:16:41.600
I've talked to some young people
link |
01:16:43.040
who invest in cryptocurrency.
link |
01:16:45.840
And what I see is this hope.
link |
01:16:48.320
This is the first thing that gave them hope.
link |
01:16:50.240
And that's so heartbreaking to me
link |
01:16:52.800
that you've gotten hope from that.
link |
01:16:55.440
So much is invested.
link |
01:16:56.720
It's like hope from somehow becoming rich
link |
01:16:59.840
as opposed to something else. To me,
link |
01:17:01.520
I apologize, but money is in the longterm
link |
01:17:04.960
not going to be a source of that deep meaning.
link |
01:17:07.760
It's good to have enough money,
link |
01:17:09.600
but it should not be the source of hope.
link |
01:17:11.920
And it's heartbreaking to me
link |
01:17:13.040
for how many people it is the source of hope.
link |
01:17:16.160
Yeah, you've just described the psychology of virality
link |
01:17:21.200
or the psychology of trying to base a civilization
link |
01:17:25.600
on semi random occurrences of network effect peaks.
link |
01:17:28.640
Yeah, and it doesn't really work.
link |
01:17:31.680
I mean, I think we need to get away from that.
link |
01:17:33.600
We need to soften those peaks
link |
01:17:37.440
except Microsoft, which deserves every penny,
link |
01:17:39.840
but in every other case.
link |
01:17:41.280
Well, you mentioned GitHub.
link |
01:17:43.360
I think what Microsoft did with GitHub was brilliant.
link |
01:17:45.600
I was very happy.
link |
01:17:47.120
Okay, if I can give a, not a criticism,
link |
01:17:50.560
but on Microsoft because they recently purchased Bethesda.
link |
01:17:56.480
So Elder Scrolls is in their hands.
link |
01:17:58.400
I'm watching you, Microsoft,
link |
01:18:01.200
do not screw up my favorite game.
link |
01:18:03.840
Yeah, well, look, I'm not speaking for Microsoft.
link |
01:18:06.880
I have an explicit arrangement with them
link |
01:18:08.960
where I don't speak for them, obviously,
link |
01:18:11.120
like that should be very clear.
link |
01:18:12.080
I do not speak for them.
link |
01:18:14.880
I am now saying I like them.
link |
01:18:17.280
I think Satya is amazing.
link |
01:18:20.560
The term data dignity was coined by Satya.
link |
01:18:23.520
Like, so, you know, we have, it's kind of extraordinary,
link |
01:18:27.040
but, you know, Microsoft's this giant thing.
link |
01:18:29.280
It's going to screw up this or that.
link |
01:18:30.560
You know, it's not, I don't know.
link |
01:18:33.440
It's kind of interesting.
link |
01:18:34.880
I've had a few occasions in my life
link |
01:18:36.720
to see how things work from the inside of some big thing.
link |
01:18:39.760
And, you know, it's always just people kind of,
link |
01:18:44.000
I don't know, there's always like coordination problems.
link |
01:18:48.720
There's always human problems.
link |
01:18:50.480
Oh God, there's some good people.
link |
01:18:51.600
There's some bad people.
link |
01:18:52.560
It's always, I hope Microsoft doesn't screw up your game.
link |
01:18:55.120
And I hope they bring Clippy back.
link |
01:18:57.760
You should never kill Clippy.
link |
01:18:59.360
Bring Clippy back.
link |
01:19:00.080
Oh, Clippy.
link |
01:19:01.120
But Clippy promotes the myth of AI.
link |
01:19:03.840
Well, that's why, this is why I think you're wrong.
link |
01:19:06.240
How about if we, all right.
link |
01:19:07.840
Could we bring back Bob instead of Clippy?
link |
01:19:10.000
Which one was Bob?
link |
01:19:11.040
Oh, Bob was another thing.
link |
01:19:13.040
Bob was this other screen character
link |
01:19:15.040
who was supposed to be the voice of AI.
link |
01:19:16.720
Cortana?
link |
01:19:17.440
Cortana?
link |
01:19:17.920
Would Cortana do it for you?
link |
01:19:19.200
Cortana is too corporate.
link |
01:19:20.640
I like it, Cortana's fine.
link |
01:19:23.680
There's a woman in Seattle who's like the model for Cortana,
link |
01:19:27.440
did Cortana's voice.
link |
01:19:28.400
The voice?
link |
01:19:29.120
There was like,
link |
01:19:29.760
No, the voice is great.
link |
01:19:31.680
We had her as a, she used to walk around
link |
01:19:34.560
if you were wearing HoloLens for a bit.
link |
01:19:36.160
I don't think that's happening anymore.
link |
01:19:38.000
I don't think you should turn software
link |
01:19:40.160
into a creature.
link |
01:19:41.040
Well, you and I,
link |
01:19:42.080
Get a cat, just get a cat.
link |
01:19:43.280
You and I, you and I.
link |
01:19:44.320
Well, get a dog.
link |
01:19:45.840
Get a dog.
link |
01:19:46.320
Or a dog, yeah.
link |
01:19:47.840
Yeah.
link |
01:19:48.240
Or a hedgehog.
link |
01:19:49.680
A hedgehog.
link |
01:19:50.640
Yeah.
link |
01:19:51.760
You coauthored a paper, you mentioned Lee Smolin,
link |
01:19:56.320
titled The Autodidactic Universe,
link |
01:20:00.000
which describes our universe as one that learns its own physical laws.
link |
01:20:06.240
That's a trippy and beautiful and powerful idea.
link |
01:20:09.520
What are, what would you say are the key ideas in this paper?
link |
01:20:12.560
Ah, okay.
link |
01:20:13.680
Well, I should say that paper reflected work from last year
link |
01:20:18.640
and the project, the program has moved quite a lot.
link |
01:20:21.440
So it's a little, there's a lot of stuff that's not published
link |
01:20:24.080
that I'm quite excited about.
link |
01:20:25.200
So I have to kind of keep my frame
link |
01:20:28.560
in that last year's thing.
link |
01:20:30.160
So I have to try to be a little careful about that.
link |
01:20:33.760
We can think about it in a few different ways.
link |
01:20:37.200
The core of the paper, the technical core of it
link |
01:20:40.640
is a triple correspondence.
link |
01:20:43.760
One part of it was already established
link |
01:20:46.960
and then another part is in the process.
link |
01:20:49.600
The part that was established was, of course,
link |
01:20:53.040
understanding different theories of physics as matrix models.
link |
01:20:57.040
The part that was fresher is understanding those
link |
01:21:01.600
as machine learning systems so that we could move fluidly
link |
01:21:04.800
between these different ways of describing systems.
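A loose illustration of reading one object two ways, under invented assumptions rather than the paper's actual formalism: take a toy one-matrix model with action S(M) = tr(M^2)/2 + g tr(M^4)/4 and treat the same object either as physics (an action to extremize) or as machine learning (a loss minimized by gradient descent).

```python
import numpy as np

# Toy one-matrix model read as a learning system. NOT the paper's
# construction; it only shows the "action as loss" change of view.
rng = np.random.default_rng(0)
g, lr = 0.1, 0.01
M = rng.normal(size=(8, 8))
M = (M + M.T) / 2  # real symmetric matrix

def action(M):
    return 0.5 * np.trace(M @ M) + 0.25 * g * np.trace(M @ M @ M @ M)

for step in range(500):
    grad = M + g * (M @ M @ M)  # dS/dM for symmetric M
    M -= lr * grad               # gradient descent on the "loss" S

print("final action:", round(float(action(M)), 6))  # flows toward M = 0
```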
link |
01:21:07.440
And the reason to want to do that is to just have more tools
link |
01:21:11.680
and more options because, well,
link |
01:21:15.920
theoretical physics is really hard
link |
01:21:17.520
and a lot of programs have kind of run into a state
link |
01:21:23.360
where they feel a little stalled, I guess.
link |
01:21:25.680
I want to be delicate about this
link |
01:21:26.720
because I'm not a physicist,
link |
01:21:27.680
I'm the computer scientist collaborating.
link |
01:21:29.600
So I don't mean to diss anybody's program.
link |
01:21:32.080
So this is almost like gives a framework
link |
01:21:34.560
for generating new ideas in physics.
link |
01:21:37.280
As we start to publish more about where it's gone,
link |
01:21:40.000
I think you'll start to see there's tools
link |
01:21:42.800
and ways of thinking about theories
link |
01:21:45.840
that I think open up some new paths
link |
01:21:49.520
that will be of interest.
link |
01:21:52.800
There's the technical core of it,
link |
01:21:54.320
which is this idea of a correspondence
link |
01:21:56.720
to give you more facility.
link |
01:21:58.080
But then there's also the storytelling part of it.
link |
01:22:00.720
And this is something, Lee loves stories and I do too.
link |
01:22:05.920
And the idea here is that a typical way
link |
01:22:13.760
of thinking about physics is that there's some kind
link |
01:22:17.040
of starting condition and then there's some principle
link |
01:22:19.360
by which the starting condition evolves.
link |
01:22:23.920
And the question is like, why the starting condition?
link |
01:22:28.640
The starting condition has to be fine-tuned
link |
01:22:32.400
and all these things about it have to be kind of perfect.
link |
01:22:35.760
And so we were thinking, well, look,
link |
01:22:37.360
what if we could push the storytelling
link |
01:22:40.720
about where the universe comes from much further back
link |
01:22:42.960
by starting with really simple things that evolve
link |
01:22:46.080
and then through that evolution,
link |
01:22:47.280
explain how things got to be how they are
link |
01:22:48.880
through very simple principles, right?
link |
01:22:51.200
And so we've been exploring a variety of ways
link |
01:22:55.120
to push the start of the storytelling
link |
01:22:57.680
further and further back,
link |
01:23:00.400
and it's really kind of interesting
link |
01:23:03.600
because, like,
link |
01:23:07.040
Lee is sometimes considered
link |
01:23:11.360
to have a radical quality in the physics world.
link |
01:23:13.840
But he still says, no,
link |
01:23:18.240
the kind of time we're talking about,
link |
01:23:19.760
in which evolution happens, is the same time we're in now,
link |
01:23:22.320
and we're talking about something that starts and continues.
link |
01:23:25.680
And I'm like, well, what if there's some other kind
link |
01:23:27.760
of time that's timelike? And it sounds like metaphysics,
link |
01:23:31.040
but there's an ambiguity, you know, like,
link |
01:23:34.560
it has to start from something
link |
01:23:36.160
and it's kind of interesting.
link |
01:23:37.920
So there's this: a lot of the math
link |
01:23:41.440
can be thought of either way, which is kind of interesting.
link |
01:23:44.000
So you push this so far back that basically
link |
01:23:46.000
all the things that we take for granted in physics
link |
01:23:47.920
start becoming emergent.
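In the same hedged spirit, here is a tiny toy of the "laws become emergent" storytelling, again a sketch of my own rather than anything from the paper: a state evolves under a rule with a free parameter, and the parameter is itself nudged by the dynamics, so the effective law is learned rather than fixed in the starting condition.

```python
# Toy "autodidactic" dynamics (illustrative only): the state x evolves
# under a law parameterized by w, and w drifts to reduce an internal
# cost, so the "law" itself changes as the system runs.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4)   # a toy "state of the universe"
w = 2.0                      # the current "law": x -> x - 0.1 * w * x
eta, dw = 0.05, 1e-3         # adaptation rate and finite-difference step

for _ in range(200):
    x_next = x - 0.1 * w * x                         # evolve state under current law
    cost = np.sum(x_next ** 2)                       # internal cost of the current law
    cost_up = np.sum((x - 0.1 * (w + dw) * x) ** 2)  # cost if the law were nudged
    w -= eta * (cost_up - cost) / dw                 # move the law downhill in cost
    x = x_next

print(f"adapted law parameter w = {w:.3f}")
```

Nothing here is physics; it only makes the phrase "a universe that learns its own laws" concrete enough to compute with.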
link |
01:23:50.960
I really wanna emphasize this is all super baby steps.
link |
01:23:53.440
I don't wanna overclaim.
link |
01:23:54.480
It's like, I think a lot of the things we're doing,
link |
01:23:57.440
we're approaching some old problems
link |
01:23:59.040
in a pretty fresh, informed way.
link |
01:24:02.240
There's been a zillion papers about how you can think
link |
01:24:04.640
of the universe as a big neural net
link |
01:24:06.160
or how you can think of different ideas in physics
link |
01:24:09.120
as being quite similar to, or even equivalent
link |
01:24:11.680
to some of the ideas in machine learning.
link |
01:24:15.360
And that actually works out crazy well.
link |
01:24:18.720
Like, I mean, that is actually kind of eerie
link |
01:24:21.040
when you look at it, like there's probably
link |
01:24:24.800
two or three dozen papers that have this quality
link |
01:24:26.800
and some of them are just crazy good.
link |
01:24:28.480
And it's very interesting.
link |
01:24:30.480
What we're trying to do is take those kinds
link |
01:24:33.200
of observations and turn them into an actionable framework
link |
01:24:35.760
where you can then start to do things
link |
01:24:38.640
with landscapes of theories that you couldn't do before
link |
01:24:40.480
and that sort of thing.
link |
01:24:42.480
So in that context, or maybe beyond,
link |
01:24:46.000
how do you explain us humans?
link |
01:24:47.920
How unlikely are we, this intelligent civilization
link |
01:24:50.960
or are there a lot of others, or are we alone in this universe?
link |
01:24:54.800
Yeah.
link |
01:24:57.520
You seem to appreciate humans very much.
link |
01:25:03.280
I've grown fond of us.
link |
01:25:06.240
We're okay.
link |
01:25:09.200
We have our nice qualities.
link |
01:25:12.960
I like that.
link |
01:25:14.560
I mean, we're kind of weird.
link |
01:25:16.240
We sprout this hair on our heads and then we're,
link |
01:25:18.160
I don't know, we're sort of weird animals.
link |
01:25:20.240
That's the feature, not a bug, I think.
link |
01:25:22.400
The weirdness.
link |
01:25:23.120
I hope so.
link |
01:25:24.320
I hope so.
link |
01:25:30.160
I think if I'm just going to answer you in terms of truth,
link |
01:25:35.360
the first thing I'd say is we're not in a privileged enough
link |
01:25:39.040
position, at least as yet, to really know much about who we
link |
01:25:44.720
are, how we are, what we're really like in the context
link |
01:25:48.320
of something larger, what that context is,
link |
01:25:50.640
like all that stuff.
link |
01:25:51.440
We might learn more in the future.
link |
01:25:52.880
Our descendants might learn more, but we don't really know
link |
01:25:55.200
very much, which you can either view as frustrating or charming
link |
01:25:59.440
like that first year of TikTok or something.
link |
01:26:03.200
All roads lead back to TikTok.
link |
01:26:04.960
I like it.
link |
01:26:05.520
Well, lately.
link |
01:26:07.120
But in terms of, there's another level at which I can think
link |
01:26:10.240
about it where I sometimes think that if you are just quiet
link |
01:26:19.840
and you do something that gets you in touch with the way
link |
01:26:22.640
reality happens, and for me it's playing music, sometimes it
link |
01:26:27.440
seems like you can feel a bit of how the universe is.
link |
01:26:30.880
And it feels like there's a lot more going on in it and there
link |
01:26:34.240
is a lot more life and a lot more stuff happening and a lot
link |
01:26:38.000
more stuff flowing through it.
link |
01:26:39.120
I'm not speaking as a scientist now.
link |
01:26:40.800
This is kind of a more my artist side talking and I feel like
link |
01:26:46.560
I'm suddenly in multiple personalities with you.
link |
01:26:50.960
Jack Kerouac said that music is the only truth.
link |
01:26:57.520
It sounds like you might agree, at least in part.
link |
01:27:01.360
There's a passage in Kerouac's book, Doctor Sax,
link |
01:27:04.560
where somebody tries to just explain the whole
link |
01:27:07.360
situation with reality and people in like a paragraph.
link |
01:27:10.000
And I couldn't reproduce it for you here, but it's like, yeah,
link |
01:27:13.200
like there are these bulbous things that walk around and
link |
01:27:15.520
they make these sounds, you can sort of understand them, but
link |
01:27:17.680
only kind of, and then there's like this, and it's just like
link |
01:27:19.600
this amazing, like just really quick, like if some spirit
link |
01:27:24.240
being or something was going to show up in our reality and
link |
01:27:26.400
knew nothing about it, it's like a little basic intro
link |
01:27:29.120
of like, okay, here's what's going on here.
link |
01:27:30.400
It's an incredible passage.
link |
01:27:32.400
Yeah.
link |
01:27:32.720
Yeah.
link |
01:27:33.760
It's like a one or two sentence summary in
link |
01:37:36.800
Hitchhiker's Guide to the Galaxy, right?
link |
01:27:38.880
Of what this...
link |
01:27:40.160
Mostly harmless.
link |
01:27:41.280
Mostly harmless.
link |
01:27:42.880
Do you think there's truth to that, that music somehow
link |
01:27:45.760
connects to something that words cannot?
link |
01:27:48.880
Yeah.
link |
01:27:49.200
Music is something that just towers above me.
link |
01:27:52.560
I don't feel like I have an overview of it.
link |
01:27:57.680
It's just the reverse.
link |
01:27:58.640
I don't fully understand it because on one level it's simple.
link |
01:28:02.080
Like you can say, oh, it's a thing people evolved to
link |
01:28:06.160
coordinate our brains on a pattern level or something like that.
link |
01:28:11.760
There's all these things you can say about music, which are,
link |
01:28:14.240
you know, some of that's probably true.
link |
01:28:16.800
It's also, there's kind of like this, this is the mystery of
link |
01:28:25.520
meaning.
link |
01:28:26.000
Like there's a way that just instead of just being pure
link |
01:28:30.400
abstraction, music can have like this kind of substantiality
link |
01:28:34.160
to it that is philosophically impossible.
link |
01:28:39.760
I don't know what to do with it.
link |
01:28:41.120
Yeah.
link |
01:28:41.760
The amount of understanding I feel I have when I hear the
link |
01:28:45.520
right song at the right time is not comparable to anything I
link |
01:28:51.120
can read on Wikipedia.
link |
01:28:53.520
Anything I can understand, read through in language.
link |
01:28:57.200
The music does connect us to something.
link |
01:28:59.520
There's this thing there.
link |
01:29:00.720
Yeah, there's some kind of a thing in it.
link |
01:29:04.800
And I've come across a lot of explanations
link |
01:29:09.760
from all kinds of interesting people like that it's some kind
link |
01:29:13.440
of a flow language between people or between people and how
link |
01:29:18.160
they perceive and that kind of thing.
link |
01:29:20.880
And that sort of explanation is fine, but it's not quite it
link |
01:29:26.000
either.
link |
01:29:26.400
Yeah.
link |
01:29:27.040
There's something about music that makes me believe that
link |
01:29:31.360
panpsychism could possibly be true, which is that everything
link |
01:29:35.520
in the universe is conscious.
link |
01:29:36.720
It makes me think, makes me humble about how much or how
link |
01:29:43.520
little I understand about the functions of our universe that
link |
01:29:48.480
everything might be conscious.
link |
01:29:50.560
Most people interested in theoretical physics eventually
link |
01:29:54.640
land in panpsychism, but I'm not one of them.
link |
01:30:00.080
I still think there's this pragmatic imperative to treat
link |
01:30:08.080
people as special.
link |
01:30:09.120
So I will proudly be a dualist about people and cats.
link |
01:30:14.880
Yeah, I'm not quite sure where to draw the line or why the
link |
01:30:19.600
line's there or anything like that.
link |
01:30:21.120
But I don't think I should be required to; all the same
link |
01:30:23.600
questions are equally mysterious with no line.
link |
01:30:25.920
So I don't feel disadvantaged by that.
link |
01:30:28.480
So I shall remain a dualist.
link |
01:30:30.320
But if you listen to anyone trying to explain where
link |
01:30:36.640
consciousness is in a dualistic sense, either believing in
link |
01:30:39.520
souls or some special thing in the brain or something, you
link |
01:30:42.400
pretty much say, screw this.
link |
01:30:44.080
I'm going to be a panpsychist.
link |
01:30:51.200
Fair enough.
link |
01:30:52.000
Well put.
link |
01:30:53.280
Are there moments in your life that were
link |
01:30:56.320
defining, in a way that you hope others, like your daughter, might learn from?
link |
01:31:00.160
Well, listen, I got to say the moments that defined me were
link |
01:31:04.400
not the good ones.
link |
01:31:06.240
The moments that defined me were often horrible.
link |
01:31:12.320
I've had successes, you know, but if you ask what defined
link |
01:31:16.720
me, my mother's death, being under the World Trade Center
link |
01:31:24.640
and the attack. The things that have had the most effect on me
link |
01:31:30.720
were sort of real world, terrible things,
link |
01:31:35.120
which I don't wish on young people at all.
link |
01:31:38.640
And this is the thing that's hard about giving advice to
link |
01:31:42.080
young people that they have to learn their own lessons.
link |
01:31:48.320
And lessons don't come easily.
link |
01:31:52.000
And a world which avoids hard lessons will be a stupid
link |
01:31:56.560
world, you know, and I don't know what to do with it.
link |
01:31:59.200
That's a little bundle of truth that has a bit of a fatalistic
link |
01:32:03.440
quality to it, but I don't—this is like when I'm saying
link |
01:32:07.040
that, you know, freedom equals eternal annoyance.
link |
01:32:08.960
Like, you can't—like, there's a degree to which honest
link |
01:32:14.560
advice is not that pleasant to give.
link |
01:32:19.280
And I don't want young people to have to know about
link |
01:32:24.960
everything.
link |
01:32:25.600
You don't want to wish hardship on them.
link |
01:32:27.920
Yeah, I think they deserve to have a little grace period
link |
01:32:33.040
of naiveté that's pleasant.
link |
01:32:34.640
I mean, I do, you know, if it's possible, if it's—these
link |
01:32:40.240
things are—this is like—this is tricky stuff.
link |
01:32:42.400
I mean, if you—okay, so let me try a little bit on this
link |
01:32:50.000
advice thing.
link |
01:32:50.880
I think one thing—and any serious, broad advice will
link |
01:32:55.680
have been given a thousand times before for a thousand
link |
01:32:57.920
years, so I'm not going to claim originality, but I think
link |
01:33:04.480
trying to find a way to really pay attention to what you're
link |
01:33:11.600
feeling fundamentally, what your sense of the world is, what
link |
01:33:14.720
your intuition is, if you feel like an intuitive person, what
link |
01:33:17.920
you're—like, to try to escape the constant sway of social
link |
01:33:27.120
perception or manipulation, whatever you wish—not to
link |
01:33:30.400
escape it entirely, that would be horrible, but to find cover
link |
01:33:35.120
from it once in a while, to find a sense of being anchored
link |
01:33:39.760
in that, to believe in experience as a real thing.
link |
01:33:44.000
Believing in experience as a real thing is very dualistic.
link |
01:33:47.040
That goes with my philosophy of dualism.
link |
01:33:50.640
I believe there's something magical, and instead of squirting
link |
01:33:53.840
the magic dust on the programs, I think experience is something
link |
01:33:57.840
real and something apart, something mystical and something—
link |
01:34:00.240
Your own personal experience that you just have, and then
link |
01:34:04.800
you're saying silence the rest of the world enough to hear
link |
01:34:07.760
that—like, whatever that magic dust is in that experience.
link |
01:34:11.280
Find what is there, and I think that's one thing.
link |
01:34:18.080
Another thing is to recognize that kindness requires genius,
link |
01:34:24.720
that it's actually really hard, that facile kindness is not
link |
01:34:29.120
kindness, and that it'll take you a while to have the skills.
link |
01:34:33.280
Kind impulses, wanting to be kind, you can have right
link |
01:34:36.240
away. To be effectively kind is hard.
link |
01:34:39.280
To be effectively kind, yeah.
link |
01:34:41.760
It takes skill. It takes hard lessons.
link |
01:34:50.880
You'll never be perfect at it. To the degree you get anywhere
link |
01:34:55.120
with it, it's the most rewarding thing ever.
link |
01:35:01.040
Let's see, what else would I say?
link |
01:35:02.480
I would say when you're young, you can be very overwhelmed
link |
01:35:12.640
by social and interpersonal emotions. You'll have broken hearts and
link |
01:35:19.920
jealousies. You'll feel socially down the ladder instead of up the
link |
01:35:24.880
ladder. It feels horrible when that happens. All of these things.
link |
01:35:28.720
And you have to remember what a fragile crust all that stuff is,
link |
01:35:35.440
and it's hard because right when it's happening, it's just so intense.
link |
01:35:46.000
If I was actually giving this advice to my daughter, she'd already
link |
01:35:48.880
be out of the room. This is for some hypothetical teenager that
link |
01:35:55.760
doesn't really exist that really wants to sit and listen to my
link |
01:35:58.880
voice. For your daughter 10 years from now? Maybe.
link |
01:36:03.120
Can I ask you a difficult question?
link |
01:36:06.480
Yeah, sure.
link |
01:36:07.280
You talked about losing your mom.
link |
01:36:10.640
Yeah.
link |
01:36:11.840
Do you miss her?
link |
01:36:14.960
Yeah, I mean, I still connect to her through music. She was
link |
01:36:18.160
a young prodigy piano player in Vienna, and she survived the
link |
01:36:26.640
concentration camp and then died in a car accident here in the US.
link |
01:36:32.960
What music makes you think of her? Is there a song that connects?
link |
01:36:38.160
Well, she was in Vienna, so she had the whole Viennese music thing
link |
01:36:46.080
going, which is this incredible school of absolute skill and
link |
01:36:54.640
romance bundled together and wonderful on the piano, especially.
link |
01:36:58.640
I learned to play some of the Beethoven sonatas for her, and I
link |
01:37:01.920
played them in this exaggerated, drippy way I remember when I was
link |
01:37:05.440
a kid.
link |
01:37:06.800
Exaggerated meaning too full of emotion?
link |
01:37:09.440
Yeah, just like...
link |
01:37:11.360
Isn't that the only way to play Beethoven? I mean, I didn't know
link |
01:37:14.000
there's any other way.
link |
01:37:14.800
That's a reasonable question. I mean, the fashion these days is to
link |
01:37:17.920
be slightly Apollonian even with Beethoven, but one imagines that
link |
01:37:23.440
actual Beethoven playing might have been different. I don't
link |
01:37:26.160
know. I've gotten to play a few instruments he played and tried
link |
01:37:31.040
to see if I could feel anything about how it might have been for
link |
01:37:33.280
him. I don't know, really.
link |
01:37:34.880
I was always against the clinical precision of classical music.
link |
01:37:38.400
I thought a great piano player should be, like, in pain, like,
link |
01:37:47.040
you know, emotionally, like, truly feel the music and make it
link |
01:37:55.280
messy, sort of maybe play classical music the way, I don't
link |
01:38:00.080
know, a blues pianist plays the blues.
link |
01:38:02.640
It seems like they actually got happier, and I'm not sure if
link |
01:38:05.920
Beethoven got happier. I think it's a different kind of concept
link |
01:38:10.720
of the place of music. I think the blues, the whole African
link |
01:38:17.840
American tradition was initially surviving awful, awful
link |
01:38:21.840
circumstances. So you could say, you know, there was some of
link |
01:38:23.840
that in the concentration camps and all that too. And it's not
link |
01:38:29.280
that Beethoven's circumstances were brilliant, but he kind of
link |
01:38:32.400
also, I don't know, this is hard. Like, I mean, it would
link |
01:38:38.080
seem his misery was somewhat self-imposed, maybe
link |
01:38:41.280
through, I don't know. It's kind of interesting, like, I've
link |
01:38:44.240
known some people who loathed Beethoven, like the
link |
01:38:47.840
late composer Pauline Oliveros, this wonderful modernist
link |
01:38:50.640
composer. I played in her band for a while, and she was like,
link |
01:38:54.160
oh, Beethoven, like, that's the worst music ever. It's like,
link |
01:38:56.560
all ego. It completely, it turns information, I mean, it
link |
01:39:02.400
turns emotion into your enemy. And it's ultimately all about
link |
01:39:08.240
your own self importance, which has to be at the expense of
link |
01:39:11.200
others. What else could it be? And blah, blah, blah. So she
link |
01:39:15.200
had, I shouldn't say, I don't mean to be dismissive, but I'm
link |
01:39:17.200
just saying, like, her position on Beethoven was very negative
link |
01:39:21.680
and very unimpressed, which is really interesting given
link |
01:39:24.160
the manner of the music. I think, I don't know. I mean,
link |
01:39:27.440
she's not here to speak for herself. So it's a little hard
link |
01:39:29.360
for me to answer that question. But it was interesting because
link |
01:39:32.560
I'd always thought of Beethoven as like, whoa, you know, this
link |
01:39:34.400
is like Beethoven is like really the dude, you know, and it's
link |
01:39:38.320
just like, Beethoven, Schmadovan, you know, it's like
link |
01:39:42.000
not really happening. Yeah, I still, even though it's cliche,
link |
01:39:44.240
I like playing personally, just for myself, Moonlight Sonata.
link |
01:39:47.440
I mean, I just, Moonlight's amazing. I mean, it's like,
link |
01:39:52.560
Moonlight's amazing. You know, you're talking
link |
01:39:59.200
about comparing the blues, and that sensibility from Europe
link |
01:40:02.640
is so different in so many ways. One of the musicians I
link |
01:40:06.240
play with is Jon Batiste, who has the band on the Colbert Show,
link |
01:40:09.600
and he'll sit there playing jazz and suddenly go into
link |
01:40:12.880
Moonlight. He loves Moonlight. And what's kind of interesting
link |
01:40:16.240
is he's found a way to do Beethoven. And he, by the way,
link |
01:40:22.000
he can really do Beethoven. Like, he went through Juilliard
link |
01:40:25.680
and one time he was at my house, he said, hey, do you
link |
01:40:28.800
have the book of Beethoven's Sonatas? I want to
link |
01:40:30.720
find one I haven't played. I said yeah, and then he sight-read through the
link |
01:40:32.320
whole damn thing perfectly. And I'm like, oh, God, I just
link |
01:40:35.760
get out of here. I can't even deal with this. But anyway,
link |
01:40:41.200
but anyway, the thing is he has this way of, with the same
link |
01:40:45.360
persona and the same philosophy, moving from the blues into
link |
01:40:48.640
Beethoven that's really, really fascinating to me. It's like,
link |
01:40:53.680
I don't want to say he plays it as if it were jazz, but he
link |
01:40:56.560
kind of does. And while he
link |
01:41:00.640
was sight reading, he'd talk like Beethoven was talking to him.
link |
01:41:03.200
Like he's like, oh yeah, here, he's doing this. I can't do
link |
01:41:05.840
Jon, but you know, it's like, it's really interesting. Like
link |
01:41:09.040
it's very different. Like for me, I was introduced to
link |
01:41:11.920
Beethoven as like almost like this godlike figure, and I
link |
01:41:14.960
presume Pauline was too, and that was really kind of oppressive
link |
01:41:17.680
for an artist to deal with. And for him, it's just like the
link |
01:41:20.160
conversation. He's playing James P. Johnson or something. It's
link |
01:41:23.680
like another musician who did something and they're talking
link |
01:41:25.920
and it's very cool to be around. It's very kind of freeing
link |
01:41:30.800
to see someone have that relationship. I would love to
link |
01:41:35.040
hear him play Beethoven. That sounds amazing. He's great. We
link |
01:41:39.760
talked about Ernest Becker and how much value he puts on our
link |
01:41:45.840
mortality and our denial of our mortality. Do you think about
link |
01:41:50.720
your mortality? Do you think about your own death? You know
link |
01:41:53.760
what's funny is I used to not be able to, but as you get older,
link |
01:41:57.120
you just know people who die and there's all these things
link |
01:41:59.040
that just become familiar and more ordinary,
link |
01:42:04.960
which is what it is. But are you afraid? Sure, although less
link |
01:42:11.600
so. And it's not like I had some kind of insight or
link |
01:42:18.880
revelation to become less afraid. I think I just, like I
link |
01:42:22.880
say, it's kind of familiarity. It's just knowing people who've
link |
01:42:27.440
died and I really believe in the future. I have this optimism
link |
01:42:34.240
that people or this whole thing of life on Earth, this whole
link |
01:42:37.920
thing we're part of, I don't know where to draw that circle,
link |
01:42:39.920
but this thing is going somewhere and has some kind of
link |
01:42:47.280
value and you can't both believe in the future and want
link |
01:42:51.600
to live forever. You have to make room for it. You know, like
link |
01:42:54.000
you have to, that optimism has to also come with its own like
link |
01:42:58.000
humility. You have to make yourself small to believe in
link |
01:43:01.280
the future and so it actually in a funny way comforts me.
link |
01:43:06.960
Wow, that's powerful. And optimism requires you to kind
link |
01:43:13.520
of step down after a time. Yeah, I mean, that said, life
link |
01:43:18.720
seems kind of short, but you know, whatever. Do you think
link |
01:43:22.000
there's... I've tried to find, I can't find the complaint
link |
01:43:24.080
department. You know, I really want to bring this
link |
01:43:26.720
up, but the customer service number never answers and
link |
01:43:29.440
the email bounces, one way or another. So yeah, do you think there's
link |
01:43:32.480
meaning to it, to life? We'll see. Meaning is a funny word.
link |
01:43:38.080
We say all these things as if we know what they mean, but
link |
01:43:40.480
meaning, we don't know what we mean when we say meaning.
link |
01:43:43.200
We obviously do not, and it's a funny little
link |
01:43:47.600
mystical thing. I think it ultimately connects to that
link |
01:43:50.080
sense of experience that dualists tend to believe in.
link |
01:43:56.160
I guess there are whys. Like, if you look up at the stars and
link |
01:43:58.960
you experience that awe-inspiring joy, whatever it is,
link |
01:44:04.960
when you look up at the stars, I don't know why, for me
link |
01:44:07.440
that kind of makes me feel joyful, maybe a little bit
link |
01:44:11.280
melancholy, just some weird soup of feelings. And ultimately
link |
01:44:15.680
the question is, why are we here in this vast universe?
link |
01:44:22.080
That question: why?
link |
01:44:25.120
Have you been able in some way maybe through music answer it
link |
01:44:30.080
for yourself?
link |
01:44:37.520
My impulse is to feel like it's not quite the right question
link |
01:44:42.480
to ask, but I feel like going down that path is just too
link |
01:44:46.800
tedious for the moment and I don't want to do it.
link |
01:44:51.840
But the wrong question? Well, just because, you know, I don't know
link |
01:44:56.640
what meaning is, and I think I do know that sense of awe. I
link |
01:45:01.680
grew up in southern New Mexico and the stars were so vivid.
link |
01:45:08.560
I've had some weird misfortunes, but I've had some
link |
01:45:13.200
weird luck also. One of our near neighbors was the head of
link |
01:45:19.360
optics research at White Sands and when he was young he
link |
01:45:22.000
discovered Pluto. His name was Clyde Tombaugh and he taught me
link |
01:45:26.000
how to make telescopes, grinding mirrors and stuff. My dad
link |
01:45:29.200
had also made telescopes when he was a kid, but Clyde had
link |
01:45:33.840
backyard telescopes that would put a lot of others to shame.
link |
01:45:37.120
I mean, he really did his telescopes, you know. And so
link |
01:45:40.480
I remember he'd let me go and play with him, just like looking at a
link |
01:45:45.040
globular cluster, and you're seeing the actual photons, and with a good
link |
01:45:48.080
telescope it's really like this object. You can really tell
link |
01:45:51.760
this isn't coming through some intervening information structure; this
link |
01:45:55.360
is like the actual photons, and it's really a three dimensional object,
link |
01:45:59.520
and you have even a feeling for the vastness of it.
link |
01:46:02.560
I don't know. I definitely was
link |
01:46:08.000
very, very fortunate to have a connection to the sky that way
link |
01:46:13.440
when I was a kid. To have had that experience.
link |
01:46:17.200
Again, the emphasis on experience.
link |
01:46:22.560
It's kind of funny. Like, I feel like sometimes,
link |
01:46:25.920
like, when she was younger I took my daughter and her friends
link |
01:46:30.000
to a telescope, there are a few around here that
link |
01:46:33.440
kids can go and use, and they would look at Jupiter's moons or something,
link |
01:46:37.120
I think the Galilean moons, and I don't know if they quite
link |
01:46:41.440
had that, because it's like,
link |
01:46:44.960
it's been just too normalized. And I think maybe
link |
01:46:49.760
when I was growing up screens weren't that common yet, and maybe it's too
link |
01:46:53.440
confusable with the screen. I don't know. You know, somebody
link |
01:46:57.920
brought up in conversation to me somewhere, I don't remember who,
link |
01:47:02.080
but they kind of posited this idea that
link |
01:47:05.200
if early humans weren't able to see the stars, like if
link |
01:47:09.120
Earth's atmosphere was such that it was cloudy,
link |
01:47:12.080
we would not have developed human civilization. There's something about
link |
01:47:15.600
being able to look up and see a vast universe,
link |
01:47:20.480
that's fundamental to the development of human civilization.
link |
01:47:23.920
I thought that was a curious kind of thought. That reminds me of that
link |
01:47:28.800
old Isaac Asimov story where, you know, there's this planet where they
link |
01:47:32.720
finally get to see what's in the sky once in a while, and it turns out they're in
link |
01:47:35.680
the middle of a globular cluster and there are all these stars, and
link |
01:47:38.320
I forget what happens exactly. God, that's from when I was a
link |
01:47:41.680
kid, I don't really remember. Yeah. But, um, I don't know.
link |
01:47:46.720
It might be right. I'm just thinking of all the
link |
01:47:49.200
civilizations that grew up under clouds. I mean, like,
link |
01:47:52.240
the Vikings needed a special diffracting piece of mica to navigate
link |
01:47:58.640
because they could never see the sun. They had this thing called a sunstone
link |
01:48:01.200
that they found from this one cave, you know about that?
link |
01:48:03.920
So they were trying to navigate
link |
01:48:07.840
boats, you know, in the North Atlantic without being able to see the sun
link |
01:48:11.680
because it was cloudy, and so they used
link |
01:48:17.520
a chunk of mica to diffract the light in order to be able to align where the sun really
link |
01:48:22.240
was, because they couldn't tell by eye, and navigate. So
link |
01:48:25.040
I'm just saying there are a lot of civilizations that are pretty impressive
link |
01:48:27.680
that had to deal with a lot of clouds.
link |
01:48:31.520
The Amazonians invented our agriculture, and they were probably under
link |
01:48:35.600
clouds a lot. I don't know. To me personally, the question of the
link |
01:48:39.840
meaning of life becomes most
link |
01:48:44.080
vibrant, most apparent, when you look up at the stars,
link |
01:48:47.680
because it makes me feel very small. And yet we're not small,
link |
01:48:54.560
it still feels that we're special. And then the natural
link |
01:49:00.720
question is, well, if we are special, as I think we
link |
01:49:04.240
are, why the heck are we here in this vast
link |
01:49:08.000
universe? That ultimately is the question of...
link |
01:49:12.560
Right, well, the meaning of life. I mean, look,
link |
01:49:15.920
there's a confusion sometimes in trying
link |
01:49:22.800
to set up a question or a thought experiment or something
link |
01:49:26.720
that's defined in terms of a context to explain something
link |
01:49:30.960
where there is no larger context, and that's a category error.
If we want to do it in physics, or, well, in computer science:
link |
01:49:41.360
it's hard to talk about the universe as a Turing machine, because a Turing
link |
01:49:44.560
machine has an external clock and an observer and an
link |
01:49:47.920
input and output. There's a larger context implied in order for it to be
link |
01:49:51.280
defined at all, and so if you're talking about the
link |
01:49:53.600
universe, you can't talk about it coherently as a Turing machine.
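To make the "external context" point concrete, here is a minimal, hypothetical sketch (mine, not Lanier's): even the simplest Turing machine borrows almost everything from outside its own transition table, the input tape, the loop that acts as its clock, the halting convention, and the observer who reads off the result.

```python
# Minimal Turing machine sketch: a one-state unary-increment machine.
# Note what the definition borrows from outside the machine itself:
# the input tape, the step-driving loop (the clock), the halting
# convention, and the observer who reads the tape afterward.

def run_turing_machine(transitions, tape, state="start", head=0, max_steps=1000):
    cells = dict(enumerate(tape))         # external input, supplied by us
    for _ in range(max_steps):            # external clock driving the steps
        if state == "halt":
            break
        symbol = cells.get(head, "_")     # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    # the "observer": we, outside the machine, read the tape as output
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# Transition table: scan right over 1s, write a 1 at the first blank, halt.
transitions = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "R"),
}

print(run_turing_machine(transitions, "111"))  # -> "1111"
```

Everything that makes the run meaningful, what goes in and what the output means, lives in that outer context, which is the point being made.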
link |
01:49:57.200
Quantum mechanics is like that: quantum mechanics has an external clock and has
link |
01:50:01.040
some kind of external context, depending on your interpretation,
link |
01:50:04.800
that's either, you know, the observer or whatever.
link |
01:50:08.480
They're similar that way, so maybe
link |
01:50:12.000
Turing machines and quantum mechanics can be
link |
01:50:16.000
better friends or something, because they have a similar setup. But the thing is, if
link |
01:50:19.040
you have something that's defined in terms of an outer context, you can't
link |
01:50:24.240
talk about ultimates with it, because obviously
link |
01:50:27.600
it's not suited for that. So there are some ideas that
link |
01:50:30.960
are their own context. General relativity is its own context;
link |
01:50:34.400
it's different, and that's why it's hard to unify. And
link |
01:50:37.840
I think the same thing is true when we talk about
link |
01:50:42.320
these types of questions. Meaning is in a context, and
link |
01:50:49.920
to talk about ultimate meaning is therefore a category error. It's not
link |
01:50:53.440
a resolvable way of thinking.
link |
01:50:59.120
It might be a way of thinking that is experientially
link |
01:51:06.320
or aesthetically valuable, because it is awesome in the sense of
link |
01:51:13.280
awe inspiring, but to try to treat it analytically is not
link |
01:51:18.960
sensible. Maybe that's what music and poetry are for.
link |
01:51:22.320
Yeah, maybe. I think so. I think music actually does
link |
01:51:25.680
escape any particular context. That's how it feels to me, but I'm not sure about
link |
01:51:28.800
that. That's once again the crazy artist talking, not the scientist.
link |
01:51:33.360
Well, you do both masterfully, Jaron. Like I said, I'm a big fan
link |
01:51:38.960
of everything you've done, of you as a human being.
link |
01:51:41.280
I appreciate the fun argument we had today, which will, I'm sure,
link |
01:51:47.360
continue for 30 years, as it did with Marvin Minsky. Honestly,
link |
01:51:52.160
I deeply appreciate that you spent your really valuable time with me today.
link |
01:51:55.520
It was a really great conversation. Thank you so much.
link |
01:51:58.400
Thanks for listening to this conversation with Jaron Lanier.
link |
01:52:01.600
To support this podcast, please check out our sponsors in the description.
link |
01:52:06.080
And now, let me leave you with some words from Jaron Lanier himself:
link |
01:52:10.800
"A real friendship ought to introduce each person
link |
01:52:13.840
to unexpected weirdness in the other." Thank you for listening. I hope to see
link |
01:52:19.120
you next time.