
Mark Zuckerberg: Meta, Facebook, Instagram, and the Metaverse | Lex Fridman Podcast #267



link |
00:00:00.000
Let's talk about free speech and censorship.
link |
00:00:02.560
You don't build a company like this
link |
00:00:04.040
unless you believe that people expressing themselves
link |
00:00:06.440
is a good thing.
link |
00:00:07.280
Let me ask you as a father,
link |
00:00:08.640
there's a weight heavy on you
link |
00:00:09.760
that people get bullied on social networks.
link |
00:00:13.000
I care a lot about how people feel
link |
00:00:14.440
when they use our products
link |
00:00:15.460
and I don't want to build products that make people angry.
link |
00:00:19.320
Why do you think so many people dislike you?
link |
00:00:23.440
Some even hate you.
link |
00:00:25.580
And how do you regain their trust and support?
link |
00:00:30.160
The following is a conversation with Mark Zuckerberg,
link |
00:00:32.720
CEO of Facebook, now called Meta.
link |
00:00:36.760
Please allow me to say a few words
link |
00:00:38.680
about this conversation with Mark Zuckerberg,
link |
00:00:41.300
about social media,
link |
00:00:42.680
and about what troubles me in the world today,
link |
00:00:45.440
and what gives me hope.
link |
00:00:47.760
If this is not interesting to you,
link |
00:00:49.480
I understand, please skip.
link |
00:00:51.540
I believe that at its best,
link |
00:00:54.620
social media puts a mirror to humanity
link |
00:00:57.800
and reveals the full complexity of our world,
link |
00:01:01.060
shining a light on the dark aspects of human nature
link |
00:01:04.080
and giving us hope, a way out,
link |
00:01:06.700
through compassionate but tense chaos of conversation
link |
00:01:09.780
that eventually can turn into understanding,
link |
00:01:12.780
friendship, and even love.
link |
00:01:15.600
But this is not simple.
link |
00:01:17.460
Our world is not simple.
link |
00:01:19.580
It is full of human suffering.
link |
00:01:22.340
I think about the hundreds of millions of people
link |
00:01:24.460
who are starving and who live in extreme poverty,
link |
00:01:28.460
the one million people who take their own life every year,
link |
00:01:31.600
the 20 million people that attempt it,
link |
00:01:33.940
and the many, many more millions who suffer quietly
link |
00:01:37.380
in ways that numbers can never know.
link |
00:01:40.740
I'm troubled by the cruelty and pain of war.
link |
00:01:44.660
Today, my heart goes out to the people of Ukraine.
link |
00:01:48.540
My grandfather spilled his blood on this land,
link |
00:01:52.220
held the line as a machine gunner
link |
00:01:54.020
against the Nazi invasion, surviving impossible odds.
link |
00:01:59.180
I am nothing without him.
link |
00:02:01.380
His blood runs in my blood.
link |
00:02:04.980
My words are useless here.
link |
00:02:07.560
I send my love.
link |
00:02:09.020
It's all I have.
link |
00:02:11.160
I hope to travel to Russia and Ukraine soon.
link |
00:02:14.140
I will speak to citizens and leaders,
link |
00:02:16.780
including Vladimir Putin.
link |
00:02:19.980
As I've said in the past, I don't care about access,
link |
00:02:22.700
fame, money, or power, and I'm afraid of nothing.
link |
00:02:27.660
But I am who I am, and my goal in conversation
link |
00:02:31.100
is to understand the human being before me,
link |
00:02:33.460
no matter who they are, no matter their position.
link |
00:02:36.540
And I do believe the line between good and evil
link |
00:02:40.060
runs through the heart of every man.
link |
00:02:43.540
So this is it.
link |
00:02:45.380
This is our world.
link |
00:02:47.420
It is full of hate, violence, and destruction.
link |
00:02:51.720
But it is also full of love, beauty,
link |
00:02:55.300
and the insatiable desire to help each other.
link |
00:02:59.140
The people who run the social networks
link |
00:03:01.320
that show this world, that show us to ourselves,
link |
00:03:05.380
have the greatest of responsibilities.
link |
00:03:08.500
In a time of war, pandemic, atrocity,
link |
00:03:11.740
we turn to social networks to share real human insights
link |
00:03:14.460
and experiences, to organize protests and celebrations,
link |
00:03:18.620
to learn and to challenge our understanding of the world,
link |
00:03:21.940
of our history and of our future,
link |
00:03:24.300
and above all, to be reminded of our common humanity.
link |
00:03:28.500
When the social networks fail,
link |
00:03:30.500
they have the power to cause immense suffering.
link |
00:03:33.640
And when they succeed,
link |
00:03:35.020
they have the power to lessen that suffering.
link |
00:03:37.820
This is hard.
link |
00:03:39.340
It's a responsibility, perhaps,
link |
00:03:41.080
almost unlike any other in history.
link |
00:03:44.080
This podcast conversation attempts to understand the man
link |
00:03:47.260
and the company who take this responsibility on,
link |
00:03:50.680
where they fail and where they hope to succeed.
link |
00:03:54.260
Mark Zuckerberg's feet are often held to the fire,
link |
00:03:57.920
as they should be, and this actually gives me hope.
link |
00:04:01.540
The power of innovation and engineering,
link |
00:04:03.900
coupled with the freedom of speech
link |
00:04:05.560
in the form of its highest ideal,
link |
00:04:07.700
I believe can solve any problem in the world.
link |
00:04:11.140
But that's just it, both are necessary,
link |
00:04:14.780
the engineer and the critic.
link |
00:04:17.620
I believe that criticism is essential, but cynicism is not.
link |
00:04:23.260
And I worry that in our public discourse,
link |
00:04:25.660
cynicism too easily masquerades as wisdom, as truth,
link |
00:04:30.380
becomes viral and takes over,
link |
00:04:32.220
and worse, suffocates the dreams of young minds
link |
00:04:35.460
who want to build solutions to the problems of the world.
link |
00:04:39.260
We need to inspire those young minds.
link |
00:04:41.540
At least for me, they give me hope.
link |
00:04:44.580
And one small way I'm trying to contribute
link |
00:04:47.160
is to have honest conversations like these
link |
00:04:49.520
that don't just ride the viral wave of cynicism,
link |
00:04:53.260
but seek to understand the failures
link |
00:04:54.900
and successes of the past, the problems before us,
link |
00:04:57.940
and the possible solutions
link |
00:04:59.540
in this very complicated world of ours.
link |
00:05:02.560
I'm sure I will fail often,
link |
00:05:05.820
and I count on the critic to point it out when I do.
link |
00:05:10.180
But I ask for one thing,
link |
00:05:12.540
and that is to fuel the fire of optimism,
link |
00:05:15.180
especially in those who dream to build solutions,
link |
00:05:18.340
because without that, we don't have a chance
link |
00:05:21.780
on this too fragile, tiny planet of ours.
link |
00:05:25.820
This is the Lex Fridman podcast.
link |
00:05:28.020
To support it, please check out our sponsors
link |
00:05:30.260
in the description.
link |
00:05:31.620
And now, dear friends, here's Mark Zuckerberg.
link |
00:05:40.980
Can you circle all the traffic lights, please?
link |
00:05:53.880
You actually did it.
link |
00:05:54.860
That is very impressive performance.
link |
00:05:56.820
Okay, now we can initiate the interview procedure.
link |
00:06:00.020
Is it possible that this conversation is happening
link |
00:06:02.980
inside a metaverse created by you,
link |
00:06:05.380
by Meta many years from now,
link |
00:06:07.180
and we're doing a memory replay experience?
link |
00:06:10.180
I don't know the answer to that.
link |
00:06:11.260
Then I'd be some computer construct
link |
00:06:15.220
and not the person who created that Meta company.
link |
00:06:18.980
But that would truly be Meta.
link |
00:06:21.380
Right, so this could be somebody else
link |
00:06:23.360
using the Mark Zuckerberg avatar
link |
00:06:26.420
who can do the Mark and the Lex conversation replay
link |
00:06:29.200
from four decades ago when Meta was first sort of...
link |
00:06:33.340
I mean, it's not gonna be four decades
link |
00:06:34.780
before we have photorealistic avatars like this.
link |
00:06:38.020
So I think we're much closer to that.
link |
00:06:40.100
Well, that's something you talk about
link |
00:06:41.380
is how passionate you are about the idea
link |
00:06:43.500
of the avatar representing who you are in the metaverse.
link |
00:06:46.580
So I do these podcasts in person.
link |
00:06:51.100
You know, I'm a stickler for that,
link |
00:06:52.260
because there's a magic to the in person conversation.
link |
00:06:55.780
How long do you think it'll be before
link |
00:06:58.140
you can have the same kind of magic in the metaverse,
link |
00:07:00.620
the same kind of intimacy in the chemistry,
link |
00:07:02.620
whatever the heck is there when we're talking in person?
link |
00:07:06.060
How difficult is it?
link |
00:07:07.100
How long before we have it in the metaverse?
link |
00:07:10.340
Well, I think this is like the key question, right?
link |
00:07:12.940
Because the thing that's different about virtual
link |
00:07:17.500
and hopefully augmented reality
link |
00:07:19.140
compared to all other forms of digital platforms before
link |
00:07:22.400
is this feeling of presence, right?
link |
00:07:24.420
The feeling that you're right,
link |
00:07:25.860
that you're in an experience
link |
00:07:27.000
and that you're there with other people or in another place.
link |
00:07:29.700
And that's just different from all of the other screens
link |
00:07:32.300
that we have today, right?
link |
00:07:33.900
Phones, TVs, all the stuff.
link |
00:07:36.780
They're trying to, in some cases, deliver experiences
link |
00:07:39.500
that feel high fidelity,
link |
00:07:43.040
but at no point do you actually feel like you're in it, right?
link |
00:07:46.380
At some level, your content is trying to sort of convince you
link |
00:07:49.660
that this is a realistic thing that's happening,
link |
00:07:51.980
but all of the kind of subtle signals are telling you,
link |
00:07:54.860
no, you're looking at a screen.
link |
00:07:56.500
So the question about how you develop these systems is like,
link |
00:08:00.700
what are all of the things that make the physical world
link |
00:08:03.860
all the different cues?
link |
00:08:04.900
So I think on visual presence and spatial audio,
link |
00:08:13.080
we're making reasonable progress.
link |
00:08:15.440
Spatial audio makes a huge difference.
link |
00:08:16.940
I don't know if you've tried this experience,
link |
00:08:19.620
workrooms that we launched where you have meetings.
link |
00:08:21.960
And I basically made a rule for all of the top,
link |
00:08:26.380
you know, management folks at the company
link |
00:08:27.760
that they need to be doing standing meetings
link |
00:08:29.520
in workrooms already, right?
link |
00:08:31.720
I feel like we got to dog food this,
link |
00:08:33.260
you know, this is how people are gonna work in the future.
link |
00:08:35.760
So we have to adopt this now.
link |
00:08:38.800
And there were already a lot of things
link |
00:08:40.780
that I think feel significantly better
link |
00:08:42.280
than like typical Zoom meetings,
link |
00:08:44.740
even though the avatars are a lot lower fidelity.
link |
00:08:48.620
You know, the idea that you have spatial audio,
link |
00:08:50.780
you're around a table in VR with people.
link |
00:08:53.940
If someone's talking from over there,
link |
00:08:55.260
it sounds like it's talking from over there.
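For readers curious how a directional cue like that gets produced, here is a minimal sketch (illustrative only, not Meta's audio pipeline) that pans a mono voice between two ears based on the talker's azimuth, using constant-power panning plus a small interaural delay; production VR audio uses HRTFs, but the basic directional cue works the same way.

```python
import numpy as np

def spatialize(mono, azimuth_deg, sample_rate=48000):
    """Very rough spatializer: constant-power pan + interaural time delay.

    azimuth_deg: 0 = straight ahead, +90 = fully to the right.
    A toy stand-in for a real HRTF renderer.
    """
    az = np.deg2rad(np.clip(azimuth_deg, -90, 90))
    # Constant-power panning: equal loudness as the source sweeps around.
    pan = (az + np.pi / 2) / np.pi          # 0 (left) .. 1 (right)
    gain_l = np.cos(pan * np.pi / 2)
    gain_r = np.sin(pan * np.pi / 2)
    # Interaural time difference: sound reaches the far ear ~0.6 ms later.
    max_delay = int(0.0006 * sample_rate)
    delay = int(abs(np.sin(az)) * max_delay)
    left = np.pad(mono, (delay if az > 0 else 0, 0)) * gain_l
    right = np.pad(mono, (delay if az < 0 else 0, 0)) * gain_r
    n = max(len(left), len(right))
    left = np.pad(left, (0, n - len(left)))
    right = np.pad(right, (0, n - len(right)))
    return np.stack([left, right], axis=1)

# A voice 60 degrees to the right ends up louder and earlier in the right ear.
voice = np.sin(2 * np.pi * 220 * np.arange(48000) / 48000)
stereo = spatialize(voice, azimuth_deg=60)
```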
link |
00:08:56.860
You can see, you know, the arm gestures
link |
00:08:59.960
and stuff feel more natural.
link |
00:09:01.760
You can have side conversations,
link |
00:09:03.080
which is something that you can't really do in Zoom.
link |
00:09:04.860
I mean, I guess you can text someone over,
link |
00:09:06.940
like out of band,
link |
00:09:08.740
but if you're actually sitting around a table with people,
link |
00:09:12.940
you know, you can lean over
link |
00:09:14.200
and whisper to the person next to you
link |
00:09:15.580
and like have a conversation that you can't,
link |
00:09:17.760
you know, that you can't really do
link |
00:09:19.100
with just video communication.
link |
00:09:23.500
So I think it's interesting in what ways
link |
00:09:27.500
some of these things already feel more real
link |
00:09:29.820
than a lot of the technology that we have,
link |
00:09:32.580
even when the visual fidelity isn't quite there,
link |
00:09:35.040
but I think it'll get there over the next few years.
link |
00:09:37.180
Now, I mean, you were asking about comparing that
link |
00:09:38.740
to the true physical world,
link |
00:09:40.500
not Zoom or something like that.
link |
00:09:42.740
And there, I mean, I think you have feelings
link |
00:09:45.700
of like temperature, you know, olfactory,
link |
00:09:50.700
obviously touch, right, we're working on haptic gloves,
link |
00:09:54.380
you know, the sense that you wanna be able to,
link |
00:09:56.300
you know, put your hands down
link |
00:09:57.220
and feel some pressure from the table.
link |
00:09:59.740
You know, all of these things
link |
00:10:00.580
I think are gonna be really critical
link |
00:10:01.740
to be able to keep up this illusion
link |
00:10:04.660
that you're in a world
link |
00:10:06.820
and that you're fully present in this world.
link |
00:10:08.820
But I don't know,
link |
00:10:09.740
I think we're gonna have a lot of these building blocks
link |
00:10:11.640
within, you know, the next 10 years or so.
link |
00:10:14.220
And even before that, I think it's amazing
link |
00:10:15.920
how much you're just gonna be able to build with software
link |
00:10:18.140
that sort of masks some of these things.
link |
00:10:21.380
I realize I'm going long,
link |
00:10:22.780
but I was told we have a few hours here.
link |
00:10:25.300
So it's a...
link |
00:10:26.140
We're here for five to six hours.
link |
00:10:27.220
Yeah, so I mean, it's, look,
link |
00:10:28.740
I mean, that's on the shorter end
link |
00:10:30.460
of the congressional testimonies I've done.
link |
00:10:32.540
But it's, but, you know, one of the things
link |
00:10:36.300
that we found with hand presence, right?
link |
00:10:39.540
So the earliest VR, you just have the headset
link |
00:10:42.000
and then, and that was cool, you could look around,
link |
00:10:44.440
you feel like you're in a place,
link |
00:10:45.380
but you don't feel like you're really able to interact with it
link |
00:10:47.380
until you have hands.
link |
00:10:48.540
And then there was this big question
link |
00:10:49.620
where once you got hands,
link |
00:10:51.340
what's the right way to represent them?
link |
00:10:53.460
And initially, all of our assumptions was, okay,
link |
00:10:58.560
when I look down and see my hands in the physical world,
link |
00:11:00.420
I see an arm and it's gonna be super weird
link |
00:11:02.780
if you see, you know, just your hand.
link |
00:11:06.460
But it turned out to not be the case
link |
00:11:08.060
because there's this issue with your arms,
link |
00:11:09.860
which is like, what's your elbow angle?
link |
00:11:11.540
And if the elbow angle that we're kind of interpolating
link |
00:11:14.780
based on where your hand is and where your headset is
link |
00:11:18.660
actually isn't accurate,
link |
00:11:19.940
it creates this very uncomfortable feeling
link |
00:11:21.900
where it's like, oh, like my arm is actually out like this,
link |
00:11:24.440
but it's like showing it in here.
link |
00:11:25.940
And that actually broke the feeling of presence a lot more.
link |
00:11:29.540
Whereas it turns out that if you just show the hands
link |
00:11:31.900
and you don't show the arms,
link |
00:11:34.840
it actually is fine for people.
link |
00:11:36.220
So I think that there's a bunch
link |
00:11:38.420
of these interesting psychological cues
link |
00:11:41.060
where it'll be more about getting the right details right.
link |
00:11:44.980
And I think a lot of that will be possible
link |
00:11:46.940
even over a few year period or a five year period.
link |
00:11:49.780
And we won't need like every single thing to be solved
link |
00:11:52.100
to deliver this like full sense of presence.
link |
00:11:54.620
Yeah, it's a fascinating psychology question
link |
00:11:56.500
of what is the essence
link |
00:11:59.960
that makes in person conversation special?
link |
00:12:04.260
It's like emojis are able to convey emotion really well,
link |
00:12:08.020
even though they're obviously not photorealistic.
link |
00:12:10.580
And so in that same way, just as you're saying,
link |
00:12:12.460
just showing the hands is able
link |
00:12:14.260
to create a comfortable expression with your hands.
link |
00:12:18.180
So I wonder what that is.
link |
00:12:19.520
People in the world wars used to write letters
link |
00:12:21.920
and you can fall in love with just writing letters.
link |
00:12:24.380
You don't need to see each other in person.
link |
00:12:26.640
You can convey emotion.
link |
00:12:27.700
You can convey depth of experience with just words.
link |
00:12:32.740
So that's, I think, a fascinating place
link |
00:12:35.740
to explore psychology of like,
link |
00:12:37.460
how do you find that intimacy?
link |
00:12:39.220
Yeah, and the way that I come to all of this stuff is,
link |
00:12:42.700
I basically studied psychology and computer science.
link |
00:12:45.060
So all of the work that I do
link |
00:12:47.900
is sort of at the intersection of those things.
link |
00:12:49.900
I think most of the other big tech companies
link |
00:12:52.180
are building technology for you to interact with.
link |
00:12:55.060
What I care about is building technology
link |
00:12:56.680
to help people interact with each other.
link |
00:12:58.100
So I think it's a somewhat different approach
link |
00:12:59.900
than most of the other tech entrepreneurs
link |
00:13:02.300
and big companies come at this from.
link |
00:13:04.340
And a lot of the lessons
link |
00:13:08.980
in terms of how I think about designing products
link |
00:13:10.980
come from some just basic elements of psychology, right?
link |
00:13:15.980
In terms of our brains,
link |
00:13:19.060
you can compare it to the brains of other animals.
link |
00:13:22.080
We're very wired to specific things, facial expressions.
link |
00:13:25.560
I mean, we're very visual, right?
link |
00:13:28.200
So compared to other animals,
link |
00:13:29.340
I mean, that's clearly the main sense
link |
00:13:31.820
that most people have.
link |
00:13:32.980
But there's a whole part of your brain
link |
00:13:35.260
that's just kind of focused on reading facial cues.
link |
00:13:38.560
So when we're designing the next version of Quest
link |
00:13:42.180
or the VR headset, a big focus for us is face tracking
link |
00:13:45.820
and basically eye tracking so you can make eye contact,
link |
00:13:48.860
which again, isn't really something
link |
00:13:50.100
that you can do over a video conference.
link |
00:13:51.500
It's sort of amazing how far video conferencing
link |
00:13:55.500
has gotten without the ability to make eye contact, right?
link |
00:13:58.620
It's sort of a bizarre thing if you think about it.
link |
00:14:00.540
You're looking at someone's face,
link |
00:14:03.140
sometimes for an hour when you're in a meeting
link |
00:14:05.620
and you looking at their eyes to them
link |
00:14:08.960
doesn't look like you're looking at their eyes.
link |
00:14:11.780
You're always looking past each other, I guess.
link |
00:14:15.020
I guess you're right.
link |
00:14:15.860
You're not sending that signal.
link |
00:14:16.680
Well, you're trying to.
link |
00:14:17.520
Right, you're trying to.
link |
00:14:18.360
A lot of times, or at least I find myself,
link |
00:14:19.760
I'm trying to look into the other person's eyes.
link |
00:14:21.420
But they don't feel like you're looking to their eyes.
link |
00:14:23.060
So then the question is,
link |
00:14:23.900
all right, am I supposed to look at the camera
link |
00:14:25.220
so that way you can have a sensation
link |
00:14:27.800
that I'm looking at you?
link |
00:14:28.640
I think that that's an interesting question.
link |
00:14:30.140
And then with VR today,
link |
00:14:33.820
even without eye tracking
link |
00:14:35.660
and knowing what your eyes are actually looking at,
link |
00:14:37.500
you can fake it reasonably well, right?
link |
00:14:39.380
So you can look at where the head pose is.
link |
00:14:42.240
And if it looks like I'm kind of looking
link |
00:14:43.720
in your general direction,
link |
00:14:44.680
then you can sort of assume
link |
00:14:46.500
that maybe there's some eye contact intended
link |
00:14:48.640
and you can do it in a way where it's like,
link |
00:14:50.740
okay, maybe it's not a fixated stare,
link |
00:14:54.300
but it's somewhat natural.
link |
00:14:56.880
But once you have actual eye tracking,
link |
00:14:58.740
you can do it for real.
link |
00:15:00.180
And I think that that's really important stuff.
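A minimal sketch of that head-pose heuristic (purely illustrative, with a made-up cone threshold): compare the head's forward direction with the direction toward the other avatar, and treat eye contact as intended when the two agree within a small cone.

```python
import numpy as np

def infer_eye_contact(head_pos, head_forward, other_head_pos, cone_deg=15.0):
    """Guess intended eye contact from head pose alone (no eye tracking).

    Returns True when the other person's head falls inside a small cone
    around this user's head-forward direction; an avatar system could then
    blend the eyes toward the target rather than rendering a fixed stare.
    """
    head_pos = np.asarray(head_pos, float)
    forward = np.asarray(head_forward, float)
    forward /= np.linalg.norm(forward)
    to_other = np.asarray(other_head_pos, float) - head_pos
    to_other /= np.linalg.norm(to_other)
    angle = np.degrees(np.arccos(np.clip(forward @ to_other, -1.0, 1.0)))
    return angle <= cone_deg

# Facing roughly toward the other person -> treat as intended eye contact.
print(infer_eye_contact([0, 1.6, 0], [0, 0, 1], [0.2, 1.6, 2.0]))   # True
print(infer_eye_contact([0, 1.6, 0], [1, 0, 0], [0.2, 1.6, 2.0]))   # False
```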
link |
00:15:02.140
So when I think about Meta's contribution to this field,
link |
00:15:05.300
I have to say it's not clear to me
link |
00:15:06.640
that any of the other companies
link |
00:15:08.700
that are focused on the Metaverse
link |
00:15:11.180
or on virtual and augmented reality
link |
00:15:13.340
are gonna prioritize putting these features in the hardware
link |
00:15:15.820
because like everything, they're trade offs, right?
link |
00:15:18.260
I mean, it adds some weight to the device.
link |
00:15:21.500
Maybe it adds some thickness.
link |
00:15:22.740
You could totally see another company taking the approach
link |
00:15:24.840
of let's just make the lightest and thinnest thing possible.
link |
00:15:27.600
But I want us to design the most human thing possible
link |
00:15:31.380
that creates the richest sense of presence
link |
00:15:33.340
because so much of human emotion and expression
link |
00:15:37.900
comes from these like micro movements.
link |
00:15:39.500
If I like move my eyebrow a millimeter,
link |
00:15:41.800
you will notice and that like means something.
link |
00:15:44.640
So the fact that we're losing these signals
link |
00:15:46.840
and a lot of communication I think is a loss.
link |
00:15:49.940
So it's not like, okay, there's one feature
link |
00:15:51.700
and you add this, then it all of a sudden
link |
00:15:53.300
is gonna feel like we have real presence.
link |
00:15:55.100
You can sort of look at how the human brain works
link |
00:15:57.820
and how we express and kind of read emotions
link |
00:16:01.820
and you can just build a roadmap of that,
link |
00:16:04.700
of just what are the most important things
link |
00:16:06.460
to try to unlock over a five to 10 year period
link |
00:16:08.520
and just try to make the experience
link |
00:16:10.040
more and more human and social.
link |
00:16:12.780
When do you think would be a moment,
link |
00:16:16.640
like a singularity moment for the Metaverse
link |
00:16:19.340
where there's a lot of ways to ask this question,
link |
00:16:22.300
but people will have many or most
link |
00:16:26.580
of their meaningful experiences
link |
00:16:28.820
in the Metaverse versus the real world.
link |
00:16:31.340
And actually it's interesting to think about
link |
00:16:33.060
the fact that a lot of people are having
link |
00:16:35.560
the most important moments of their life
link |
00:16:37.280
happen in the digital sphere,
link |
00:16:39.100
especially now during COVID,
link |
00:16:41.680
like even falling in love or meeting friends
link |
00:16:45.060
or getting excited about stuff
link |
00:16:46.420
that is happening on the 2D digital plane.
link |
00:16:49.660
When do you think the Metaverse
link |
00:16:50.860
will provide those experiences for a large number,
link |
00:16:54.100
like a majority of the population?
link |
00:16:54.940
Yeah, I think it's a really good question.
link |
00:16:57.240
There was someone, I read this piece
link |
00:17:00.260
that framed this as a lot of people think
link |
00:17:03.740
that the Metaverse is about a place,
link |
00:17:06.040
but one definition of this is it's about a time
link |
00:17:10.380
when basically immersive digital worlds
link |
00:17:12.900
become the primary way that we live our lives
link |
00:17:17.100
and spend our time.
link |
00:17:18.720
I think that that's a reasonable construct.
link |
00:17:20.160
And from that perspective,
link |
00:17:21.900
I think you also just wanna look at this as a continuation
link |
00:17:25.540
because it's not like, okay,
link |
00:17:27.140
we are building digital worlds,
link |
00:17:28.940
but we don't have that today.
link |
00:17:29.820
I think you and I probably already live
link |
00:17:32.340
a very large part of our life in digital worlds.
link |
00:17:34.660
They're just not 3D immersive virtual reality,
link |
00:17:37.260
but I do a lot of meetings over video
link |
00:17:39.820
or I spend a lot of time writing things over email
link |
00:17:42.300
or WhatsApp or whatever.
link |
00:17:44.540
So what is it gonna take to get there
link |
00:17:46.060
for kind of the immersive presence version of this,
link |
00:17:48.700
which I think is what you're asking.
link |
00:17:51.060
And for that, I think that there's just a bunch
link |
00:17:52.940
of different use cases.
link |
00:17:55.640
And I think when you're building technology,
link |
00:18:00.460
I think a lot of it is just you're managing this duality
link |
00:18:05.820
where on the one hand,
link |
00:18:06.980
you wanna build these elegant things that can scale
link |
00:18:10.100
and have billions of people use them
link |
00:18:12.140
and get value from them.
link |
00:18:13.340
And then on the other hand,
link |
00:18:14.540
you're fighting this kind of ground game
link |
00:18:17.020
where there are just a lot of different use cases
link |
00:18:19.680
and people do different things
link |
00:18:20.880
and you wanna be able to unlock them.
link |
00:18:22.260
So the first ones that we basically went after
link |
00:18:25.940
were gaming with Quest and social experiences.
link |
00:18:30.340
And it goes back to when we started working
link |
00:18:32.540
on virtual reality.
link |
00:18:33.380
My theory at the time was basically
link |
00:18:37.400
people thought about it as gaming,
link |
00:18:39.440
but if you look at all computing platforms up to that point,
link |
00:18:44.160
gaming is a huge part, it was a huge part of PCs,
link |
00:18:47.440
it was a huge part of mobile,
link |
00:18:49.460
but it was also very decentralized.
link |
00:18:51.960
There wasn't, for the most part,
link |
00:18:54.240
one or two gaming companies.
link |
00:18:55.660
There were a lot of gaming companies
link |
00:18:57.440
and gaming is somewhat hits based.
link |
00:18:58.700
I mean, we're getting some games that have more longevity,
link |
00:19:01.440
but in general, there were a lot of different games
link |
00:19:05.340
out there.
link |
00:19:06.560
But on PC and on mobile,
link |
00:19:10.740
the companies that focused on communication
link |
00:19:13.700
and social interaction,
link |
00:19:15.100
there tended to be a smaller number of those
link |
00:19:17.260
and that ended up being just as important of a thing
link |
00:19:19.160
as all of the games that you did combined.
link |
00:19:21.580
I think productivity is another area.
link |
00:19:23.140
That's obviously something
link |
00:19:23.980
that we've historically been less focused on,
link |
00:19:26.020
but I think it's gonna be really important for us.
link |
00:19:27.220
With workroom, do you mean productivity
link |
00:19:29.580
in the collaborative aspect?
link |
00:19:30.860
Yeah, I think that there's a workroom's aspect of this,
link |
00:19:34.360
like a meeting aspect,
link |
00:19:35.360
and then I think that there's like a Word, Excel,
link |
00:19:39.500
productivity, either you're working or coding
link |
00:19:42.940
or knowledge work as opposed to just meetings.
link |
00:19:46.760
So you can kind of go through all these different use cases.
link |
00:19:49.620
Gaming, I think we're well on our way.
link |
00:19:51.280
Social, I think we're just the kind of preeminent company
link |
00:19:56.080
that focuses on this.
link |
00:19:57.040
And I think that that's already on Quest becoming the,
link |
00:20:00.420
if you look at the list of what are the top apps,
link |
00:20:03.460
social apps are already number one, two, three.
link |
00:20:06.460
So that's kind of becoming a critical thing, but I don't know.
link |
00:20:10.860
I would imagine for someone like you,
link |
00:20:12.580
it'll take until we get a lot of the work things dialed in.
link |
00:20:17.980
When this is just like much more adopted
link |
00:20:20.860
and clearly better than Zoom for VC,
link |
00:20:24.260
when if you're doing your coding or your writing
link |
00:20:27.100
or whatever it is in VR,
link |
00:20:29.420
which it's not that far off to imagine that
link |
00:20:31.260
because pretty soon you're just gonna be able
link |
00:20:32.660
to have a screen that's bigger than,
link |
00:20:34.220
it'll be your ideal setup and you can bring it with you
link |
00:20:36.140
and put it on anywhere
link |
00:20:37.540
and have your kind of ideal workstation.
link |
00:20:39.780
So I think that there are a few things to work out on that,
link |
00:20:42.580
but I don't think that that's more than five years off.
link |
00:20:46.940
And then you'll get a bunch of other things
link |
00:20:48.120
that like aren't even possible
link |
00:20:50.220
or you don't even think about using a phone
link |
00:20:52.040
or PC for today, like fitness, right?
link |
00:20:54.440
So, I mean, I know you're, we were talking before
link |
00:20:57.460
about how you're into running
link |
00:20:58.980
and like I'm really into a lot of things
link |
00:21:00.780
around fitness as well,
link |
00:21:02.740
different things in different places.
link |
00:21:04.100
I got really into hydrofoiling recently
link |
00:21:06.060
and surfing and I used to fence competitively.
link |
00:21:12.420
I like run.
link |
00:21:13.260
So, and you were saying that you were thinking
link |
00:21:14.860
about trying different martial arts
link |
00:21:16.380
and I tried to trick you and convince you
link |
00:21:18.180
into doing Brazilian Jiu Jitsu.
link |
00:21:19.940
Or you actually mentioned that that was one
link |
00:21:21.500
you're curious about and I don't know.
link |
00:21:23.220
Is that a trick?
link |
00:21:24.140
Yeah, I don't know.
link |
00:21:26.020
We're in the metaverse now.
link |
00:21:27.420
Yeah, no, I took that seriously.
link |
00:21:29.580
I thought that that was a real suggestion.
link |
00:21:34.260
That would be an amazing chance
link |
00:21:36.380
if we ever step on the mat together
link |
00:21:37.780
and just like roll around.
link |
00:21:39.020
I'll show you some moves.
link |
00:21:40.100
Well, give me a year to train and then we can do it.
link |
00:21:43.700
You know, you've seen Rocky IV
link |
00:21:44.780
where the Russian faces off the American.
link |
00:21:46.340
I'm the Russian in this picture.
link |
00:21:47.980
And then you're the Rocky, the underdog
link |
00:21:49.980
that gets to win in the end.
link |
00:21:51.340
The idea of me as Rocky and like fighting is...
link |
00:21:56.180
If he dies, he dies.
link |
00:21:58.080
Sorry, I just had to.
link |
00:22:00.740
I mean.
link |
00:22:01.580
Anyway, yeah.
link |
00:22:02.640
But I mean, a lot of aspects of fitness.
link |
00:22:05.900
You know, I don't know if you've tried supernatural
link |
00:22:08.780
on Quest or...
link |
00:22:10.180
So first of all, can I just comment on the fact
link |
00:22:12.060
every time I played around with Quest 2,
link |
00:22:15.020
I just, I get giddy every time I step into virtual reality.
link |
00:22:18.780
So you mentioned productivity and all those kinds of things.
link |
00:22:20.860
That's definitely something I'm excited about,
link |
00:22:23.820
but really I just love the possibilities
link |
00:22:26.780
of stepping into that world.
link |
00:22:28.820
Maybe it's the introvert in me,
link |
00:22:30.460
but it just feels like the most convenient way
link |
00:22:34.060
to travel into worlds,
link |
00:22:37.500
into worlds that are similar to the real world
link |
00:22:40.340
or totally different.
link |
00:22:41.660
It's like Alice in Wonderland.
link |
00:22:42.860
Just try out crazy stuff.
link |
00:22:44.660
The possibilities are endless.
link |
00:22:45.780
And I personally just love,
link |
00:22:50.900
get excited for stepping into those virtual worlds.
link |
00:22:53.980
So I'm a huge fan.
link |
00:22:55.020
In terms of the productivity as a programmer,
link |
00:22:58.300
I spend most of my day programming.
link |
00:23:00.060
That's really interesting also,
link |
00:23:01.980
but then you have to develop the right IDEs.
link |
00:23:04.340
You have to develop, like there has to be a threshold
link |
00:23:07.380
where a large amount of the programming community
link |
00:23:09.340
moves there, but the collaborative aspects
link |
00:23:11.860
that are possible in terms of meetings,
link |
00:23:14.260
in terms of when two coders are working together,
link |
00:23:18.380
I mean, the possibilities there are super, super exciting.
link |
00:23:21.740
I think that in building this, we sort of need to balance.
link |
00:23:27.100
There are gonna be some new things
link |
00:23:28.160
that you just couldn't do before.
link |
00:23:29.700
And those are gonna be the amazing experiences.
link |
00:23:31.540
So teleporting to any place, right?
link |
00:23:33.420
Whether it's a real place or something that people made.
link |
00:23:38.620
And I mean, some of the experiences
link |
00:23:40.340
around how we can build stuff in new ways,
link |
00:23:42.100
where a lot of the stuff that,
link |
00:23:44.740
when I'm coding stuff, it's like, all right,
link |
00:23:46.060
you code it and then you build it
link |
00:23:47.180
and then you see it afterwards.
link |
00:23:48.260
But increasingly it's gonna be possible to,
link |
00:23:50.500
you're in a world and you're building the world
link |
00:23:52.780
as you are in it and kind of manipulating it.
link |
00:23:55.780
One of the things that we showed at our Inside the Lab
link |
00:23:59.900
for recent artificial intelligence progress
link |
00:24:02.500
is this Builder Bot program,
link |
00:24:03.940
where now you can just talk to it and say,
link |
00:24:07.180
hey, okay, I'm in this world,
link |
00:24:08.460
like put some trees over there and it'll do that.
link |
00:24:10.740
And like, all right, put some bottles of water
link |
00:24:13.220
on our picnic blanket and it'll do that
link |
00:24:17.060
and you're in the world.
link |
00:24:17.900
And I think there are gonna be new paradigms for coding.
link |
00:24:19.940
So yeah, there are gonna be some things
link |
00:24:22.100
that I think are just pretty amazing,
link |
00:24:24.620
especially the first few times that you do them,
link |
00:24:26.580
but that you're like, whoa,
link |
00:24:28.300
like I've never had an experience like this.
link |
00:24:30.620
But most of your life, I would imagine,
link |
00:24:34.260
is not doing things that are amazing for the first time.
link |
00:24:38.180
A lot of this in terms of,
link |
00:24:39.620
I mean, just answering your question from before around,
link |
00:24:42.020
what is it gonna take
link |
00:24:42.860
before you're spending most of your time in this?
link |
00:24:45.060
Well, first of all, let me just say it as an aside,
link |
00:24:48.180
the goal isn't to have people spend a lot more time
link |
00:24:50.340
in computing.
link |
00:24:51.180
It's to make it so that. I'm asking for myself.
link |
00:24:52.340
Yeah, it's to make it. When will I spend all my time in?
link |
00:24:54.460
Yeah, it's to make computing more natural.
link |
00:24:57.060
But I think you will spend most of your computing time
link |
00:25:02.740
in this when it does the things
link |
00:25:04.900
that you use computing for somewhat better.
link |
00:25:07.300
So maybe having your perfect workstation
link |
00:25:10.540
is a 5% improvement on your coding productivity.
link |
00:25:15.140
Maybe it's not like a completely new thing.
link |
00:25:19.300
But I mean, look, if I could increase the productivity
link |
00:25:21.620
of every engineer at Meta by 5%,
link |
00:25:25.500
we'd buy those devices for everyone.
link |
00:25:27.620
And I imagine a lot of other companies would too.
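As a back-of-the-envelope version of that argument (all numbers hypothetical; only the 5% figure comes from the conversation), even a modest per-engineer gain dwarfs the cost of the hardware.

```python
# Hypothetical numbers, purely to show the shape of the 5% argument.
engineers = 10_000           # assumed engineering headcount
fully_loaded_cost = 300_000  # assumed yearly cost per engineer, USD
productivity_gain = 0.05     # the 5% improvement from the conversation
headset_cost = 1_500         # assumed cost of a work-grade headset, USD

value_of_gain = engineers * fully_loaded_cost * productivity_gain
hardware_cost = engineers * headset_cost

print(f"Value of 5% gain: ${value_of_gain:,.0f} per year")   # $150,000,000
print(f"Cost of headsets: ${hardware_cost:,.0f} one-time")   # $15,000,000
print(f"Payback ratio:    {value_of_gain / hardware_cost:.0f}x per year")
```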
link |
00:25:30.340
And that's how you start getting to the scale
link |
00:25:31.860
that I think makes this rival
link |
00:25:34.500
some of the bigger computing platforms that exist today.
link |
00:25:37.020
Let me ask you about identity.
link |
00:25:38.300
We talked about the avatar.
link |
00:25:40.460
How do you see identity in the Metaverse?
link |
00:25:42.740
Should the avatar be tied to your identity
link |
00:25:46.420
or can I be anything in the Metaverse?
link |
00:25:49.300
Like, can I be whatever the heck I want?
link |
00:25:52.180
Can I even be a troll?
link |
00:25:53.660
So there's exciting freeing possibilities
link |
00:25:57.420
and there's the darker possibilities too.
link |
00:26:00.740
Yeah, I mean, I think that there's gonna be a range, right?
link |
00:26:03.180
So we're working on, for expression and avatars,
link |
00:26:10.180
on one end of the spectrum are kind of expressive
link |
00:26:13.100
and cartoonish avatars.
link |
00:26:14.940
And then on the other end of the spectrum
link |
00:26:16.460
are photorealistic avatars.
link |
00:26:18.500
And I just think the reality is
link |
00:26:20.660
that there are gonna be different use cases
link |
00:26:22.100
for different things.
link |
00:26:23.260
And I guess there's another axis.
link |
00:26:25.140
So if you're going from photorealistic to expressive,
link |
00:26:28.700
there's also like representing you directly
link |
00:26:31.100
versus like some fantasy identity.
link |
00:26:33.660
And I think that there are gonna be things
link |
00:26:35.340
on all ends of that spectrum too, right?
link |
00:26:37.860
So you'll want photo, like in some experience,
link |
00:26:41.020
you might wanna be like a photorealistic dragon, right?
link |
00:26:44.300
Or if I'm playing Onward,
link |
00:26:46.980
or just this military simulator game,
link |
00:26:50.940
I think getting to be more photorealistic as a soldier
link |
00:26:53.620
in that could enhance the experience.
link |
00:26:57.780
There are times when I'm hanging out with friends
link |
00:26:59.540
where I want them to know it's me.
link |
00:27:02.060
So a kind of cartoonish or expressive version of me is good.
link |
00:27:06.180
But there are also experiences like,
link |
00:27:09.580
VRChat does this well today,
link |
00:27:11.580
where a lot of the experience is kind of dressing up
link |
00:27:14.900
and wearing a fantastical avatar
link |
00:27:17.780
that's almost like a meme or is humorous.
link |
00:27:19.580
So you come into an experience
link |
00:27:21.300
and it's almost like you have like a built in icebreaker
link |
00:27:24.540
because like you see people and you're just like,
link |
00:27:27.300
all right, I'm cracking up at what you're wearing
link |
00:27:29.940
because that's funny.
link |
00:27:30.780
And it's just like, where'd you get that?
link |
00:27:31.900
Or, oh, you made that?
link |
00:27:32.740
That's, it's awesome.
link |
00:27:35.500
Whereas, okay, if you're going into a work meeting,
link |
00:27:38.900
maybe a photorealistic version of your real self
link |
00:27:41.740
is gonna be the most appropriate thing for that.
link |
00:27:43.540
So I think the reality is there aren't going to be,
link |
00:27:47.340
it's not just gonna be one thing.
link |
00:27:50.500
You know, my own sense of kind of how you wanna
link |
00:27:54.380
express identity online has sort of evolved over time.
link |
00:27:56.860
And that, you know, early days in Facebook,
link |
00:27:58.620
I thought, okay, people are gonna have one identity.
link |
00:28:00.260
And now I think that's clearly not gonna be the case.
link |
00:28:02.100
I think you're gonna have all these different things
link |
00:28:04.420
and there's utility in being able to do different things.
link |
00:28:07.300
So some of the technical challenges
link |
00:28:10.100
that I'm really interested in around it
link |
00:28:12.140
are how do you build the software
link |
00:28:14.180
to allow people to seamlessly go between them?
link |
00:28:17.100
So say, so you could view them
link |
00:28:19.300
as just completely discrete points on a spectrum,
link |
00:28:25.140
but let's talk about the metaverse economy for a second.
link |
00:28:28.500
Let's say I buy a digital shirt
link |
00:28:31.220
for my photorealistic avatar, which by the way,
link |
00:28:34.420
I think at the time where we're spending a lot of time
link |
00:28:36.460
in the metaverse doing a lot of our work meetings
link |
00:28:38.740
in the metaverse and et cetera,
link |
00:28:40.260
I would imagine that the economy around virtual clothing
link |
00:28:42.460
as an example is going to be quite big.
link |
00:28:44.660
Why wouldn't I spend almost as much money
link |
00:28:47.100
in investing in my appearance or expression
link |
00:28:49.780
for my photorealistic avatar for meetings
link |
00:28:52.420
as I would for whatever I'm gonna wear in my video chat.
link |
00:28:55.540
But the question is, okay, so you,
link |
00:28:56.700
let's say you buy some shirt
link |
00:28:57.620
for your photorealistic avatar.
link |
00:28:59.780
Wouldn't it be cool if there was a way
link |
00:29:02.620
to basically translate that into a more expressive thing
link |
00:29:07.620
for your kind of cartoonish or expressive avatar?
link |
00:29:11.220
And there are multiple ways to do that.
link |
00:29:12.580
You can view them as two discrete points and okay,
link |
00:29:14.940
maybe if a designer sells one thing,
link |
00:29:18.220
then it actually comes in a pack and there's two
link |
00:29:19.940
and you can use either one on that,
link |
00:29:22.340
but I actually think this stuff might exist more
link |
00:29:24.420
as a spectrum in the future.
link |
00:29:26.100
And that's where I do think the direction
link |
00:29:29.460
of some of the AI advances that are happening
link |
00:29:33.380
is going to be able to help, especially stuff around like style transfer,
link |
00:29:35.980
being able to take a piece of art or express something
link |
00:29:39.860
and say, okay, paint me this photo in the style of Gauguin
link |
00:29:44.900
or whoever it is that you're interested in.
link |
00:29:49.060
Take this shirt and put it in the style
link |
00:29:51.300
of what I've designed for my expressive avatar.
link |
00:29:55.220
I think that's gonna be pretty compelling.
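Since style transfer is the mechanism being pointed at, here is a compact sketch of its core ingredient, the Gram-matrix style loss, in a standard Gatys-style PyTorch formulation (nothing Meta-specific); the same machinery would underlie restyling a garment for a different avatar.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

def gram_matrix(feat):
    # feat: (batch, channels, height, width) feature map from a conv layer.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)   # channel-correlation matrix

def style_loss(generated_feats, style_feats):
    # Sum of squared Gram-matrix differences across the chosen layers.
    return sum(F.mse_loss(gram_matrix(g), gram_matrix(s))
               for g, s in zip(generated_feats, style_feats))

# Toy usage: extract features with a frozen VGG and compare two images.
vgg = vgg16(weights="DEFAULT").features.eval()
layers = {3, 8, 15, 22}   # ReLU layers commonly used for style

def extract(img):
    feats, x = [], img
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in layers:
            feats.append(x)
    return feats

style_img = torch.rand(1, 3, 256, 256)      # stand-in for the designed asset
generated = torch.rand(1, 3, 256, 256, requires_grad=True)
loss = style_loss(extract(generated), extract(style_img))
loss.backward()   # gradients on `generated` would drive an optimization loop
```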
link |
00:29:56.940
And so the fashion, you might be buying like a generator,
link |
00:30:00.060
like a closet that generates a style.
link |
00:30:03.260
And then like with the GANs,
link |
00:30:05.540
you'll be able to infinitely generate outfits
link |
00:30:08.180
thereby making it, so the reason I wear the same thing
link |
00:30:10.780
all the time is I don't like choice.
link |
00:30:12.380
You've talked about the same thing,
link |
00:30:15.140
but now you don't even have to choose.
link |
00:30:16.700
Your closet generates your outfit for you every time.
link |
00:30:19.580
So you have to live with the outfit it generates.
link |
00:30:23.460
I mean, you could do that, although,
link |
00:30:25.500
no, I think that that's, I think some people will,
link |
00:30:27.500
but I think like, I think there's going to be a huge aspect
link |
00:30:31.300
of just people doing creative commerce here.
link |
00:30:35.900
So I think that there is going to be a big market
link |
00:30:37.860
around people designing digital clothing.
link |
00:30:41.060
But the question is, if you're designing digital clothing,
link |
00:30:43.020
do you need to design, if you're the designer,
link |
00:30:44.860
do you need to make it for each kind of specific discrete
link |
00:30:48.140
point along a spectrum, or are you just designing it
link |
00:30:51.500
for kind of a photo realistic case or an expressive case,
link |
00:30:54.140
or can you design one
link |
00:30:55.140
and have it translate across these things?
link |
00:30:57.460
If I buy a style from a designer who I care about,
link |
00:31:01.780
and now I'm a dragon, is there a way to morph that
link |
00:31:04.220
so it goes on the dragon in a way that makes sense?
link |
00:31:07.660
And that I think is an interesting AI problem
link |
00:31:09.460
because you're probably not going to make it
link |
00:31:10.820
so that designers have to go design for all those things.
link |
00:31:14.700
But the more useful the digital content is that you buy
link |
00:31:17.900
in a lot of uses, in a lot of use cases,
link |
00:31:21.220
the more that economy will just explode.
link |
00:31:23.420
And that's a lot of what all of the,
link |
00:31:28.100
we were joking about NFTs before,
link |
00:31:29.700
but I think a lot of the promise here is that
link |
00:31:32.580
if the digital goods that you buy are not just tied
link |
00:31:35.020
to one platform or one use case,
link |
00:31:37.060
they end up being more valuable,
link |
00:31:38.260
which means that people are more willing
link |
00:31:39.820
and more likely to invest in them,
link |
00:31:41.300
and that just spurs the whole economy.
link |
00:31:44.220
But the question is, that's a fascinating positive aspect,
link |
00:31:47.260
but the potential negative aspect is that
link |
00:31:50.780
you can have people concealing their identity
link |
00:31:52.660
in order to troll or even not people, bots.
link |
00:31:57.060
So how do you know in the metaverse
link |
00:31:58.780
that you're talking to a real human or an AI
link |
00:32:02.060
or a well intentioned human?
link |
00:32:03.940
Is that something you think about,
link |
00:32:04.980
something you're concerned about?
link |
00:32:06.940
Well, let's break that down into a few different cases.
link |
00:32:10.260
I mean, because knowing that you're talking to someone
link |
00:32:11.980
who has good intentions is something that I think
link |
00:32:13.880
is not even solved in pretty much anywhere.
link |
00:32:17.860
But I mean, if you're talking to someone who's a dragon,
link |
00:32:20.380
I think it's pretty clear that they're not representing
link |
00:32:22.000
themselves as a person.
link |
00:32:23.300
I think probably the most pernicious thing
link |
00:32:25.300
that you want to solve for is,
link |
00:32:30.140
I think probably one of the scariest ones is
link |
00:32:32.300
how do you make sure that someone isn't impersonating you?
link |
00:32:35.020
So, okay, you're in a future version of this conversation,
link |
00:32:39.340
and we have photorealistic avatars,
link |
00:32:41.700
and we're doing this in work rooms
link |
00:32:43.320
or whatever the future version of that is,
link |
00:32:44.980
and someone walks in who looks like me.
link |
00:32:48.860
How do you know that that's me?
link |
00:32:50.300
And one of the things that we're thinking about
link |
00:32:54.200
is it's still a pretty big AI project
link |
00:32:57.500
to be able to generate photorealistic avatars
link |
00:32:59.500
that basically can like,
link |
00:33:00.880
they work like these codecs of you, right?
link |
00:33:03.380
So you kind of have a map from your headset
link |
00:33:06.220
and whatever sensors of what your body's actually doing,
link |
00:33:08.020
and it takes the model and it kind of displays it in VR.
link |
00:33:11.180
But there's a question, which is,
link |
00:33:12.660
should there be some sort of biometric security
link |
00:33:15.420
so that when I put on my VR headset
link |
00:33:18.220
or I'm going to go use that avatar,
link |
00:33:20.940
I need to first prove that I am that?
link |
00:33:24.340
And I think you probably are gonna want something like that.
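A toy version of that kind of gate (entirely illustrative; real biometric verification is far more involved): enroll an embedding of the legitimate user, and refuse to drive the photorealistic avatar unless a fresh embedding from the headset's sensors matches it within a threshold.

```python
import numpy as np

def cosine_similarity(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

class AvatarGate:
    """Toy biometric check before a photorealistic avatar can be driven.

    The embedding stands in for whatever sensor-to-vector model the headset
    has (voice, face, iris, ...); here it is just an opaque feature vector.
    """
    def __init__(self, enrolled_embedding, threshold=0.92):
        self.enrolled = enrolled_embedding
        self.threshold = threshold

    def authorize(self, live_embedding):
        score = cosine_similarity(self.enrolled, live_embedding)
        return score >= self.threshold, score

# Enrollment happens once; every session re-verifies before the avatar loads.
rng = np.random.default_rng(0)
owner = rng.normal(size=128)
gate = AvatarGate(enrolled_embedding=owner)

same_person = owner + rng.normal(scale=0.05, size=128)   # small sensor noise
impostor = rng.normal(size=128)
print(gate.authorize(same_person))   # (True, score near 1.0)
print(gate.authorize(impostor))      # (False, score near 0.0)
```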
link |
00:33:26.780
So as we're developing these technologies,
link |
00:33:31.120
we're also thinking about the security for things like that
link |
00:33:34.500
because people aren't gonna wanna be impersonated.
link |
00:33:37.060
That's a huge security issue.
link |
00:33:41.700
Then you just get the question
link |
00:33:42.900
of people hiding behind fake accounts
link |
00:33:46.580
to do malicious things,
link |
00:33:48.300
which is not gonna be unique to the metaverse,
link |
00:33:51.020
although certainly in an environment
link |
00:33:56.140
where it's more immersive
link |
00:33:57.300
and you have more of a sense of presence,
link |
00:33:58.640
it could be more painful.
link |
00:34:01.740
But this is obviously something
link |
00:34:03.140
that we've just dealt with for years
link |
00:34:06.480
in social media and the internet more broadly.
link |
00:34:08.740
And there, I think there have been a bunch of tactics
link |
00:34:13.140
that I think we've just evolved to,
link |
00:34:17.900
we've built up these different AI systems
link |
00:34:20.480
to basically get a sense of,
link |
00:34:21.880
is this account behaving in the way that a person would?
link |
00:34:26.340
And it turns out,
link |
00:34:28.340
so in all of the work that we've done around,
link |
00:34:31.940
we call it community integrity
link |
00:34:33.340
and it's basically like policing harmful content
link |
00:34:36.920
and trying to figure out where to draw the line.
link |
00:34:38.340
And there are all these like really hard
link |
00:34:39.800
and philosophical questions around like,
link |
00:34:41.300
where do you draw the line on some of this stuff?
link |
00:34:42.880
And the thing that I've kind of found the most effective
link |
00:34:47.820
is as much as possible trying to figure out
link |
00:34:51.260
who are the inauthentic accounts
link |
00:34:53.380
or where are the accounts that are behaving
link |
00:34:55.540
in an overall harmful way at the account level,
link |
00:34:58.420
rather than trying to get into like policing
link |
00:35:00.460
what they're saying, right?
link |
00:35:01.420
Which I think the metaverse is gonna be even harder
link |
00:35:03.700
because the metaverse I think will have more properties of,
link |
00:35:07.320
it's almost more like a phone call, right?
link |
00:35:09.220
Or it's not like I post a piece of content
link |
00:35:12.380
and is that piece of content good or bad?
link |
00:35:14.700
So I think more of this stuff will have to be done
link |
00:35:16.260
at the level of the account.
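To make "working at the account level" concrete, here is a small self-contained sketch (synthetic data and features, not Meta's systems) that scores accounts from behavioral signals rather than from the content they post, using a plain logistic-regression classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Per-account behavioral features (all synthetic and hypothetical):
# [posts_per_hour, fraction_of_actions_at_night, friend_request_rate, account_age_days]
def sample_accounts(n, inauthentic):
    if inauthentic:
        return np.column_stack([
            rng.gamma(6.0, 2.0, n),        # very high posting rate
            rng.uniform(0.6, 1.0, n),      # active at odd hours
            rng.gamma(5.0, 3.0, n),        # aggressive friending
            rng.uniform(0, 60, n),         # young accounts
        ])
    return np.column_stack([
        rng.gamma(1.5, 1.0, n),
        rng.uniform(0.0, 0.5, n),
        rng.gamma(1.0, 1.0, n),
        rng.uniform(30, 3000, n),
    ])

X = np.vstack([sample_accounts(5000, False), sample_accounts(5000, True)])
y = np.concatenate([np.zeros(5000), np.ones(5000)])

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new account purely from how it behaves, not what it says.
new_account = [[25.0, 0.9, 40.0, 3.0]]
print("P(inauthentic) =", model.predict_proba(new_account)[0, 1])
```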
link |
00:35:19.420
But this is the area where,
link |
00:35:21.740
between the kind of counter intelligence teams
link |
00:35:27.140
that we built up inside the company
link |
00:35:28.480
and like years of building just different AI systems
link |
00:35:33.220
to basically detect what is a real account and what isn't.
link |
00:35:36.900
I'm not saying we're perfect,
link |
00:35:37.940
but like this is an area where I just think
link |
00:35:39.920
we are like years ahead of basically anyone else
link |
00:35:43.580
in the industry in terms of having built those capabilities.
link |
00:35:48.100
And I think that that just is gonna be incredibly important
link |
00:35:50.180
for this next wave of things.
link |
00:35:51.540
And like you said, on a technical level,
link |
00:35:53.460
on a philosophical level,
link |
00:35:54.980
it's an incredibly difficult problem to solve.
link |
00:35:59.260
By the way, I would probably like to open source my avatar
link |
00:36:03.220
so there could be like millions of Lexis walking around
link |
00:36:05.940
just like an army.
link |
00:36:07.040
Like Agent Smith?
link |
00:36:08.500
Agent Smith, yeah, exactly.
link |
00:36:10.700
So the Unity ML folks built a copy of me
link |
00:36:16.420
and they sent it to me.
link |
00:36:18.420
So there's a person running around
link |
00:36:20.220
and I've just been doing reinforcement learning on it.
link |
00:36:22.500
I was gonna release it
link |
00:36:25.420
because just to have sort of like thousands of Lexis
link |
00:36:29.860
doing reinforcement.
link |
00:36:31.180
So they fall over naturally,
link |
00:36:32.460
they have to learn how to like walk around and stuff.
link |
00:36:34.900
So I love that idea,
link |
00:36:36.660
this tension between biometric security,
link |
00:36:39.140
you want to have one identity,
link |
00:36:40.340
but then certain avatars, you might have to have many.
link |
00:36:43.620
I don't know which is better security,
link |
00:36:45.400
sort of flooding the world with Lexis
link |
00:36:48.140
and thereby achieving security
link |
00:36:49.440
or really being protective of your identity.
link |
00:36:51.780
I have to ask you a security question actually.
link |
00:36:53.860
Well, how does flooding the world with Lexis help me know
link |
00:36:56.860
in our conversation that I'm talking to the real Lex?
link |
00:36:59.640
I completely destroy the trust
link |
00:37:01.580
in all my relationships then, right?
link |
00:37:03.060
If I flood,
link |
00:37:04.180
cause then it's, yeah, that.
link |
00:37:07.820
I think that one's not gonna work that well for you.
link |
00:37:09.500
It's not gonna work that well for the original copy.
link |
00:37:11.860
It probably fits some things.
link |
00:37:13.380
Like if you're a public figure
link |
00:37:14.820
and you're trying to have a bunch of,
link |
00:37:18.500
if you're trying to show up
link |
00:37:19.480
in a bunch of different places in the future,
link |
00:37:21.060
you'll be able to do that in the metaverse.
link |
00:37:23.500
So that kind of replication I think will be useful.
link |
00:37:26.260
But I do think that you're gonna want a notion of like,
link |
00:37:29.260
I am talking to the real one.
link |
00:37:31.500
Yeah.
link |
00:37:32.700
Yeah, especially if the fake ones start outperforming you
link |
00:37:35.660
and all your private relationships
link |
00:37:37.460
and then you're left behind.
link |
00:37:38.740
I mean, that's a serious concern I have with clones.
link |
00:37:41.060
Again, the things I think about.
link |
00:37:43.340
Okay, so I recently got, I use QNAP NAS storage.
link |
00:37:48.380
So just storage for video and stuff.
link |
00:37:50.220
And I recently got hacked.
link |
00:37:51.460
This is the first time for me with ransomware.
link |
00:37:53.540
It's not me personally, it's all QNAP devices.
link |
00:37:58.700
So the question that people have
link |
00:38:00.780
is about security in general.
link |
00:38:03.300
Because I was doing a lot of the right things
link |
00:38:05.060
in terms of security and nevertheless,
link |
00:38:06.820
ransomware basically disabled my device.
link |
00:38:10.940
Is that something you think about?
link |
00:38:12.060
What are the different steps you could take
link |
00:38:13.780
to protect people's data on the security front?
link |
00:38:16.940
I think that there's different solutions for,
link |
00:38:21.380
and strategies where it makes sense to have stuff
link |
00:38:23.660
kind of put behind a fortress, right?
link |
00:38:25.460
So the centralized model versus the decentralizing.
link |
00:38:30.220
Then I think both have strengths and weaknesses.
link |
00:38:32.140
So I think anyone who says, okay,
link |
00:38:33.260
just decentralize everything, that'll make it more secure.
link |
00:38:36.660
I think that that's tough because,
link |
00:38:38.980
I mean, the advantage of something like encryption
link |
00:38:42.740
is that we run the largest encrypted service
link |
00:38:46.420
in the world with WhatsApp.
link |
00:38:47.700
And we're one of the first to roll out
link |
00:38:49.580
a multi platform encryption service.
link |
00:38:52.660
And that's something that I think was a big advance
link |
00:38:55.980
for the industry.
link |
00:38:57.180
And one of the promises that we can basically make
link |
00:38:59.300
because of that, our company doesn't see
link |
00:39:02.260
when you're sending an end-to-end
link |
00:39:04.500
encrypted message,
link |
00:39:05.860
what the content is of what you're sharing.
link |
00:39:07.860
So that way, if someone hacks Meta servers,
link |
00:39:11.540
they're not gonna be able to access the WhatsApp message
link |
00:39:14.700
that you're sending to your friend.
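The property being described, that the relay never sees plaintext, is easy to demonstrate in miniature. Below is a minimal end-to-end encryption sketch using the PyNaCl library's public-key Box (a generic construction, not the Signal-protocol ratchet WhatsApp actually uses): only ciphertext ever passes through the "server."

```python
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; only public keys are shared.
alice_sk = PrivateKey.generate()
bob_sk = PrivateKey.generate()

# Alice encrypts for Bob. A server relaying this sees only ciphertext.
sending_box = Box(alice_sk, bob_sk.public_key)
ciphertext = sending_box.encrypt(b"meet at the usual place at 7")

server_sees = bytes(ciphertext)          # opaque bytes; no keys, no plaintext
print("server view:", server_sees[:16].hex(), "...")

# Bob decrypts on his device with his private key and Alice's public key.
receiving_box = Box(bob_sk, alice_sk.public_key)
print("bob reads:", receiving_box.decrypt(ciphertext).decode())
```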
link |
00:39:16.940
And that I think matters a lot to people
link |
00:39:19.100
because obviously if someone is able to compromise
link |
00:39:21.900
a company's servers and that company has hundreds
link |
00:39:23.900
of millions or billions of people,
link |
00:39:25.220
then that ends up being a very big deal.
link |
00:39:27.900
The flip side of that is, okay,
link |
00:39:29.380
all the content is on your phone.
link |
00:39:32.900
Are you following security best practices on your phone?
link |
00:39:35.860
If you lose your phone, all your content is gone.
link |
00:39:38.060
So that's an issue.
link |
00:39:39.620
Maybe you go back up your content from WhatsApp
link |
00:39:42.300
or some other service to iCloud or something,
link |
00:39:45.740
but then you're just at Apple's whims about,
link |
00:39:47.940
are they gonna go turn over the data to some government
link |
00:39:51.920
or are they gonna get hacked?
link |
00:39:53.380
So a lot of the time it is useful to have data
link |
00:39:57.340
in a centralized place too because then you can train
link |
00:40:00.580
systems that can just do much better personalization.
link |
00:40:04.740
I think that in a lot of cases, centralized systems
link |
00:40:08.580
can offer a lot, especially if you're a serious company:
link |
00:40:13.460
you're running the state of the art stuff
link |
00:40:16.020
and you have red teams attacking your own stuff
link |
00:40:19.540
and you're putting out bounty programs
link |
00:40:24.260
and trying to attract some of the best hackers in the world
link |
00:40:26.260
to go break into your stuff all the time.
link |
00:40:27.820
So any system is gonna have security issues,
link |
00:40:30.500
but I think the best way forward is to basically try
link |
00:40:34.340
to be as aggressive and open about hardening
link |
00:40:36.460
the systems as possible, not trying to kind of hide
link |
00:40:39.140
and pretend that there aren't gonna be issues,
link |
00:40:40.740
which I think is why, over time, a lot of open source systems
link |
00:40:43.820
have gotten relatively more secure: because they're open,
link |
00:40:46.540
and rather than pretending that there aren't
link |
00:40:48.740
gonna be issues, people just surface them quicker.
link |
00:40:50.900
So I think you want to adopt that approach as a company
link |
00:40:53.760
and just constantly be hardening yourself.
link |
00:40:56.640
Trying to stay one step ahead of the attackers.
link |
00:41:01.060
It's an inherently adversarial space.
link |
00:41:03.940
I think security is interesting
link |
00:41:07.260
because of the different kind of threats
link |
00:41:09.140
that we've managed over the last five years,
link |
00:41:11.820
there are ones where basically the adversaries
link |
00:41:15.540
keep on getting better and better.
link |
00:41:16.820
So trying to kind of interfere with security
link |
00:41:21.580
is certainly one area of this.
link |
00:41:23.100
If you have nation states that are trying
link |
00:41:24.860
to interfere in elections or something,
link |
00:41:27.300
they're kind of evolving their tactics.
link |
00:41:29.460
Whereas on the other hand, I don't want to be too simplistic
link |
00:41:32.780
about it, but if someone is saying something hateful,
link |
00:41:36.660
people usually aren't getting smarter and smarter
link |
00:41:38.700
about how they say hateful things.
link |
00:41:40.460
So maybe there's some element of that,
link |
00:41:42.600
but it's a very small dynamic compared
link |
00:41:44.820
to how advanced attackers and some of these other places
link |
00:41:48.520
get over time.
link |
00:41:49.980
I believe most people are good,
link |
00:41:51.360
so they actually get better over time
link |
00:41:53.660
in being less hateful
link |
00:41:55.420
because they realize it's not fun being hateful.
link |
00:42:00.060
That's at least the belief I have.
link |
00:42:01.980
But first, bathroom break.
link |
00:42:04.940
Sure, okay.
link |
00:42:06.840
So we'll come back to AI,
link |
00:42:08.180
but let me ask some difficult questions now.
link |
00:42:11.020
The Social Dilemma is a popular documentary
link |
00:42:13.820
that raised concerns about the effects
link |
00:42:15.500
of social media on society.
link |
00:42:17.540
You responded with a point by point rebuttal titled,
link |
00:42:20.820
What the Social Dilemma Gets Wrong.
link |
00:42:23.120
People should read that.
link |
00:42:24.980
I would say the key point they make
link |
00:42:26.700
is because social media is funded by ads,
link |
00:42:29.580
algorithms want to maximize attention and engagement
link |
00:42:33.280
and an effective way to do so is to get people angry
link |
00:42:38.020
at each other, increase division and so on.
link |
00:42:40.940
Can you steel man their criticisms and arguments
link |
00:42:44.260
that they make in the documentary
link |
00:42:46.260
as a way to understand the concern
link |
00:42:48.580
and as a way to respond to it?
link |
00:42:53.060
Well, yeah, I think that's a good conversation to have.
link |
00:42:56.860
I don't happen to agree with the conclusions
link |
00:43:00.420
and I think that they make a few assumptions
link |
00:43:02.100
that are just very big jumps
link |
00:43:06.300
that I don't think are reasonable to make.
link |
00:43:08.880
But I understand overall why people would be concerned
link |
00:43:13.880
that our business model and ads in general,
link |
00:43:19.380
we do make more money
link |
00:43:20.640
as people use the service more in general, right?
link |
00:43:23.280
So as a kind of basic assumption, okay,
link |
00:43:26.680
do we have an incentive for people to build a service
link |
00:43:29.400
that people use more?
link |
00:43:31.200
Yes, on a lot of levels.
link |
00:43:32.840
I mean, we think what we're doing is good.
link |
00:43:34.480
So we think that if people are finding it useful,
link |
00:43:37.140
they'll use it more.
link |
00:43:38.560
Or if you just look at it as this sort of,
link |
00:43:41.320
if the only thing we cared about is money,
link |
00:43:43.280
which it's not, for anyone who knows me,
link |
00:43:46.200
but okay, we're a company.
link |
00:43:47.820
So let's say you just kind of simplified it down to that,
link |
00:43:51.360
then would we want people to use the services more?
link |
00:43:53.840
Yes, and then you get to the second question,
link |
00:43:57.240
which is does kind of getting people agitated
link |
00:44:02.920
make them more likely to use the services more?
link |
00:44:07.480
And I think from looking at other media in the world,
link |
00:44:12.480
especially TV, and there's the old news adage,
link |
00:44:17.280
if it bleeds, it leads.
link |
00:44:18.680
Like I think that this is,
link |
00:44:20.020
there are a bunch of reasons why someone might think
link |
00:44:25.520
that that kind of provocative content
link |
00:44:30.520
would be the most engaging.
link |
00:44:32.640
Now, what I've always found is two things.
link |
00:44:35.640
One is that what grabs someone's attention in the near term
link |
00:44:39.180
is not necessarily something
link |
00:44:40.840
that they're going to appreciate having seen
link |
00:44:43.640
or going to be the best over the long term.
link |
00:44:45.320
So I think what a lot of people get wrong
link |
00:44:47.460
is that I'm not building this company
link |
00:44:50.400
to make the most money or get people to spend the most time
link |
00:44:53.300
on this in the next quarter or the next year.
link |
00:44:55.760
I've been doing this for 17 years at this point,
link |
00:44:58.980
and I'm still relatively young,
link |
00:45:00.380
and I have a lot more that I wanna do
link |
00:45:02.040
over the coming decades.
link |
00:45:03.380
So I think that it's too simplistic to say,
link |
00:45:08.380
hey, this might increase time in the near term,
link |
00:45:11.820
therefore, it's what you're gonna do.
link |
00:45:13.440
Because I actually think a deeper look
link |
00:45:15.340
at kind of what my incentives are,
link |
00:45:17.240
the incentives of a company
link |
00:45:18.300
that are focused on the long term,
link |
00:45:20.540
is to basically do what people
link |
00:45:22.720
are gonna find valuable over time,
link |
00:45:24.160
not what is gonna draw people's attention today.
link |
00:45:26.780
The other thing that I'd say is that,
link |
00:45:29.980
I think a lot of times people look at this
link |
00:45:31.500
from the perspective of media
link |
00:45:34.580
or kind of information or civic discourse,
link |
00:45:37.780
but one other way of looking at this is just that,
link |
00:45:40.860
okay, I'm a product designer, right?
link |
00:45:42.540
Our company, we build products,
link |
00:45:45.180
and a big part of building a product
link |
00:45:47.340
is not just the function and utility
link |
00:45:49.020
of what you're delivering,
link |
00:45:50.180
but the feeling of how it feels, right?
link |
00:45:52.020
And we spend a lot of time talking about virtual reality
link |
00:45:55.660
and how the kind of key aspect of that experience
link |
00:45:58.820
is the feeling of presence, which it's a visceral thing.
link |
00:46:01.980
It's not just about the utility that you're delivering,
link |
00:46:03.940
it's about like the sensation.
link |
00:46:05.940
And similarly, I care a lot about how people feel
link |
00:46:10.420
when they use our products,
link |
00:46:11.420
and I don't want to build products that make people angry.
link |
00:46:15.300
I mean, that's like not, I think,
link |
00:46:17.020
what we're here on this earth to do,
link |
00:46:18.460
is to build something that people spend a bunch of time doing
link |
00:46:22.180
and it just kind of makes them angrier at other people.
link |
00:46:24.020
I mean, I think that that's not good.
link |
00:46:26.300
That's not what I think would be
link |
00:46:30.120
sort of a good use of our time
link |
00:46:32.040
or a good contribution to the world.
link |
00:46:33.660
So, okay, it's like people, they tell us
link |
00:46:36.500
on a per content basis, does this thing,
link |
00:46:39.220
do I like it?
link |
00:46:40.060
Do I love it?
link |
00:46:40.880
Does it make me angry?
link |
00:46:41.720
Does it make me sad?
link |
00:46:42.980
And based on that, we choose to basically show content
link |
00:46:47.220
that makes people angry less,
link |
00:46:49.180
because of course, if you're designing a product
link |
00:46:52.720
and you want people to be able to connect
link |
00:46:56.320
and feel good over a long period of time,
link |
00:46:59.180
then that's naturally what you're gonna do.
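To make the reaction-based feedback idea concrete, here is a hedged sketch of downranking content that users say makes them angry. The signal names and weights are invented for illustration and are not Meta's actual ranking model; the point is only that a long-term-oriented score can penalize predicted anger even when anger grabs short-term attention.

```python
# Hypothetical ranking sketch: positive reactions add value, anger subtracts it.
from dataclasses import dataclass

@dataclass
class ContentSignals:
    predicted_like: float   # probability the viewer will "like" it
    predicted_love: float   # probability of a "love" reaction
    predicted_angry: float  # probability of an "angry" reaction / survey response
    predicted_sad: float

def ranking_score(s: ContentSignals) -> float:
    # Invented weights: anger is penalized, so angry-making content is shown less.
    return (1.0 * s.predicted_like + 1.5 * s.predicted_love
            - 2.0 * s.predicted_angry - 0.5 * s.predicted_sad)

posts = {
    "calm_update": ContentSignals(0.4, 0.2, 0.05, 0.02),
    "outrage_bait": ContentSignals(0.3, 0.05, 0.60, 0.10),
}
ranked = sorted(posts, key=lambda p: ranking_score(posts[p]), reverse=True)
print(ranked)  # ['calm_update', 'outrage_bait']
```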
link |
00:47:02.080
So, I don't know, I think overall,
link |
00:47:06.580
I understand at a high level,
link |
00:47:10.540
if you're not thinking too deeply about it,
link |
00:47:13.680
why that argument might be appealing.
link |
00:47:16.100
But I just think if you actually look
link |
00:47:19.220
at what our real incentives are,
link |
00:47:20.940
not just like if we were trying to optimize
link |
00:47:25.100
for the next week,
link |
00:47:26.820
but like as people working on this,
link |
00:47:28.980
like why are we here?
link |
00:47:30.480
And I think it's pretty clear
link |
00:47:32.900
that that's not actually how you would wanna
link |
00:47:34.300
design the system.
link |
00:47:35.740
I guess one other thing that I'd say is that,
link |
00:47:37.780
while we're focused on the ads business model,
link |
00:47:40.820
I do think it's important to note that a lot
link |
00:47:43.400
of these issues are not unique to ads.
link |
00:47:45.380
I mean, so take like a subscription news business model,
link |
00:47:47.900
for example, I think that has just as many
link |
00:47:50.920
potential pitfalls.
link |
00:47:53.180
Maybe if someone's paying for a subscription,
link |
00:47:55.240
you don't get paid per piece of content that they look at,
link |
00:47:57.940
but say for example, I think like a bunch
link |
00:48:02.640
of the partisanship that we see could potentially
link |
00:48:06.180
be made worse by you have these kind of partisan
link |
00:48:12.620
news organizations that basically sell subscriptions
link |
00:48:15.740
and they're only gonna get people on one side
link |
00:48:17.580
to basically subscribe to them.
link |
00:48:19.900
So their incentive is not to print content
link |
00:48:22.780
or produce content that's kind of centrist
link |
00:48:26.100
or down the line either.
link |
00:48:27.820
I bet that what a lot of them find is that
link |
00:48:30.060
if they produce stuff that's kind of more polarizing
link |
00:48:32.480
or more partisan, then that is what gets
link |
00:48:35.460
them more subscribers.
link |
00:48:36.860
So I think that this stuff is all,
link |
00:48:40.260
there's no perfect business model.
link |
00:48:41.940
Everything has pitfalls.
link |
00:48:44.340
The thing that I think is great about advertising
link |
00:48:46.500
is it makes it so the consumer service is free,
link |
00:48:48.740
which if you believe that everyone should have a voice
link |
00:48:50.860
and everyone should be able to connect,
link |
00:48:52.020
then that's a great thing, as opposed to building
link |
00:48:55.000
a luxury service that not everyone can afford.
link |
00:48:57.240
But look, every business model, you have to be careful
link |
00:49:00.020
about how you're implementing what you're doing.
link |
00:49:02.500
You responded to a few things there.
link |
00:49:04.620
You spoke to the fact that there is a narrative
link |
00:49:07.260
of malevolence, like you're leaning into them,
link |
00:49:12.420
making people angry just because it makes more money
link |
00:49:15.020
in the short term, that kind of thing.
link |
00:49:16.300
So you responded to that.
link |
00:49:17.880
But there's also kind of reality of human nature.
link |
00:49:22.060
Just like you spoke about, there's fights,
link |
00:49:25.380
arguments we get in and we don't like ourselves afterwards,
link |
00:49:28.700
but we got into them anyway.
link |
00:49:30.340
So our long-term growth, I believe, for most of us,
link |
00:49:34.540
has to do with learning, challenging yourself,
link |
00:49:38.220
improving, being kind to each other,
link |
00:49:40.960
finding a community of people that you connect with
link |
00:49:47.620
on a real human level, all that kind of stuff.
link |
00:49:50.520
But it does seem when you look at social media
link |
00:49:54.660
that a lot of fights break out,
link |
00:49:56.540
a lot of arguments break out,
link |
00:49:58.180
a lot of viral content ends up being sort of outrage
link |
00:50:03.020
in one direction or the other.
link |
00:50:04.820
And so it's easy from that to infer the narrative
link |
00:50:08.020
that social media companies are letting
link |
00:50:11.220
this outrage become viral.
link |
00:50:13.940
And so they're increasing the division in the world.
link |
00:50:16.820
I mean, perhaps you can comment on that
link |
00:50:18.820
or further, how can you be,
link |
00:50:21.140
how can you push back on this narrative?
link |
00:50:25.780
How can you be transparent about this battle?
link |
00:50:28.420
Because I think it's not just motivation or financials,
link |
00:50:33.540
it's a technical problem too,
link |
00:50:36.000
which is how do you improve long-term wellbeing
link |
00:50:41.020
of human beings?
link |
00:50:43.020
I think that going through some of the design decisions
link |
00:50:47.940
would be a good conversation.
link |
00:50:49.660
But first, I actually think,
link |
00:50:51.780
I think you acknowledged that,
link |
00:50:54.260
that narrative is somewhat anecdotal.
link |
00:50:56.900
And I think it's worth grounding this conversation
link |
00:50:59.500
in the actual research that has been done on this,
link |
00:51:02.600
which by and large finds that social media
link |
00:51:07.980
is not a large driver of polarization, right?
link |
00:51:10.780
And, I mean, there's been a number of economists
link |
00:51:14.820
and social scientists and folks who have studied this.
link |
00:51:18.380
And the level of polarization varies around the world.
link |
00:51:21.220
Social media is basically in every country;
link |
00:51:23.100
Facebook's in pretty much every country
link |
00:51:24.580
except for China and maybe North Korea.
link |
00:51:27.180
And you see different trends in different places
link |
00:51:32.460
where in a lot of countries polarization is declining,
link |
00:51:37.000
in some it's flat, in the US it's risen sharply.
link |
00:51:41.660
So the question is, what are the unique phenomenon
link |
00:51:44.660
in the different places?
link |
00:51:45.980
And I think for the people who are trying to say,
link |
00:51:47.580
hey, social media is the thing that's doing this.
link |
00:51:50.220
I think that that clearly doesn't hold up
link |
00:51:52.940
because social media is a phenomenon
link |
00:51:54.480
that is pretty much equivalent
link |
00:51:56.020
in all of these different countries.
link |
00:51:57.740
And you have researchers like this economist at Stanford,
link |
00:52:00.600
Matthew Gentzkow, who has just written at length about this.
link |
00:52:05.300
And there's a bunch of books by political scientists
link |
00:52:10.420
and folks like Ezra Klein, Why We're Polarized,
link |
00:52:13.060
which basically goes through this decades long analysis in the US.
link |
00:52:17.100
Before I was born, basically talking about
link |
00:52:19.500
some of the forces in kind of partisan politics
link |
00:52:22.780
and Fox News and different things
link |
00:52:25.160
that predate the internet in a lot of ways
link |
00:52:27.640
that I think are likely larger contributors.
link |
00:52:30.100
So to the contrary on this,
link |
00:52:32.220
not only is it pretty clear that social media
link |
00:52:35.340
is not a major contributor,
link |
00:52:37.600
but most of the academic studies that I've seen
link |
00:52:40.060
actually show that social media use
link |
00:52:42.640
is correlated with lower polarization.
link |
00:52:45.400
And Gentzkow, the same person who just did the study
link |
00:52:48.640
that I cited about longitudinal polarization
link |
00:52:51.640
across different countries,
link |
00:52:54.180
also did a study that basically showed
link |
00:52:57.480
that if you looked after the 2016 election in the US,
link |
00:53:02.120
the voters who were the most polarized
link |
00:53:05.280
were actually the ones who were not on the internet.
link |
00:53:07.560
So, and there have been recent other studies,
link |
00:53:10.280
I think in Europe and around the world,
link |
00:53:12.840
basically showing that as people stop using social media,
link |
00:53:16.720
they tend to get more polarized.
link |
00:53:19.200
Then there's a deeper analysis around,
link |
00:53:21.360
okay, well, polarization actually isn't even one thing.
link |
00:53:24.740
Cause you know, having different opinions on something
link |
00:53:26.520
isn't, I don't think that that's by itself bad.
link |
00:53:28.900
What people who study this say is most problematic
link |
00:53:33.920
is what they call affective polarization,
link |
00:53:35.920
which is basically are you,
link |
00:53:37.920
do you have negative feelings towards people
link |
00:53:40.040
of another group?
link |
00:53:41.040
And the way that a lot of scholars study this
link |
00:53:43.760
is they basically ask a group,
link |
00:53:46.780
would you let your kids marry someone of group X?
link |
00:53:50.600
Whatever the groups are that you're worried
link |
00:53:53.320
that someone might have negative feelings towards.
link |
00:53:55.520
And in general, use of social media
link |
00:53:58.160
has corresponded to decreases
link |
00:53:59.880
in that kind of affective polarization.
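For readers unfamiliar with the term, here is a hedged sketch of how affective polarization is often quantified in the research literature: as the gap between warmth toward one's own group and warmth toward the other group, measured through survey items like feeling thermometers or the "would you let your kids marry someone of group X" question. The function and the numbers below are invented for illustration.

```python
# Hypothetical affective-polarization measure from survey warmth scores (0-100).
def affective_polarization(in_group_warmth: list[float],
                           out_group_warmth: list[float]) -> float:
    """Average in-group warmth minus average out-group warmth."""
    avg_in = sum(in_group_warmth) / len(in_group_warmth)
    avg_out = sum(out_group_warmth) / len(out_group_warmth)
    return avg_in - avg_out

# Responses from a made-up panel of four respondents.
respondents_in = [85, 90, 70, 80]
respondents_out = [40, 30, 55, 25]
print(affective_polarization(respondents_in, respondents_out))  # 43.75
```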
link |
00:54:01.960
So I just wanna, I think we should talk
link |
00:54:04.760
through the design decisions and how we handle
link |
00:54:07.720
the kind of specific pieces of content,
link |
00:54:10.720
but overall, I think it's just worth grounding
link |
00:54:13.280
that discussion in the research that's existed
link |
00:54:15.600
that I think overwhelmingly shows
link |
00:54:17.440
that the mainstream narrative around this
link |
00:54:19.560
is just not right.
link |
00:54:21.040
But the narrative does take hold
link |
00:54:24.060
and it's compelling to a lot of people.
link |
00:54:27.920
There's another question I'd like to ask you on this.
link |
00:54:31.300
I was looking at various polls and saw that you're
link |
00:54:34.980
one of the most disliked tech leaders today,
link |
00:54:38.080
54% unfavorable rating.
link |
00:54:41.400
Elon Musk is 23%.
link |
00:54:43.240
Basically every tech leader
link |
00:54:46.260
has a very high unfavorable rating.
link |
00:54:48.000
Maybe you can help me understand that.
link |
00:54:50.640
Why do you think so many people dislike you?
link |
00:54:54.720
Some even hate you.
link |
00:54:56.880
And how do you regain their trust and support?
link |
00:54:59.160
Given everything you just said,
link |
00:55:02.360
why are you losing the battle
link |
00:55:05.360
in explaining to people what actual impact
link |
00:55:09.440
social media has on society?
link |
00:55:12.440
Well, I'm curious if that's a US survey or world.
link |
00:55:16.760
It is US, yeah.
link |
00:55:17.960
So I think that there's a few dynamics.
link |
00:55:19.360
One is that our brand
link |
00:55:24.360
has been somewhat uniquely challenged in the US
link |
00:55:27.880
compared to other places.
link |
00:55:29.040
It's not that there aren't.
link |
00:55:29.880
I mean, in other countries, we have issues too,
link |
00:55:32.660
but I think in the US, there was this dynamic where
link |
00:55:36.920
if you look at like the net sentiment
link |
00:55:38.880
of kind of coverage or attitude towards us,
link |
00:55:42.880
before 2016, I think that there were probably
link |
00:55:44.880
very few months, if any, where it was negative.
link |
00:55:47.480
And since 2016, I think that there have probably
link |
00:55:49.440
been very few months, if any, where it's been positive.
link |
00:55:51.960
Politics.
link |
00:55:53.600
But I think it's a specific thing.
link |
00:55:55.360
And this is very different from other places.
link |
00:55:56.960
So I think in a lot of other countries in the world,
link |
00:55:59.840
the sentiment towards Meta and our services
link |
00:56:02.440
is extremely positive.
link |
00:56:04.840
In the US, we have more challenges.
link |
00:56:06.600
And I think compared to other companies,
link |
00:56:09.800
you can look at certain industries,
link |
00:56:12.520
I think if you look at it from like a partisan perspective,
link |
00:56:16.320
not from like a political perspective,
link |
00:56:18.060
but just kind of culturally,
link |
00:56:19.080
it's like there are people who are probably
link |
00:56:20.240
more left of center and there are people
link |
00:56:21.360
who are more right of center,
link |
00:56:22.520
and there's kind of blue America and red America.
link |
00:56:25.880
There are certain industries that I think
link |
00:56:27.660
maybe one half of the country has a more positive view
link |
00:56:30.880
towards than another.
link |
00:56:32.200
And I think we're in a,
link |
00:56:36.400
one of the positions that we're in that I think
link |
00:56:38.320
is really challenging is that because of a lot
link |
00:56:41.720
of the content decisions that we've basically
link |
00:56:44.760
had to arbitrate, and because we're not a partisan company,
link |
00:56:49.560
we're not a Democrat company or a Republican company,
link |
00:56:52.640
we're trying to make the best decisions we can
link |
00:56:55.080
to help people connect and help people have as much voice
link |
00:56:58.480
as they can while having some rules
link |
00:57:01.120
because we're running a community.
link |
00:57:04.920
The net effect of that is that we're kind of constantly
link |
00:57:07.420
making decisions that piss off people in both camps.
link |
00:57:12.760
And the effect that I've sort of seen is that
link |
00:57:17.160
when we make a decision that is,
link |
00:57:21.480
that's a controversial one that's gonna upset,
link |
00:57:24.440
say about half the country,
link |
00:57:27.880
those decisions are all negative sum,
link |
00:57:30.340
from a brand perspective, because it's not like,
link |
00:57:33.640
if we make that decision in one way
link |
00:57:35.640
and say half the country is happy
link |
00:57:37.840
about that particular decision that we make,
link |
00:57:40.000
they tend to not say, oh, sweet, Meta got that one right.
link |
00:57:43.720
They're just like, ah, you didn't mess that one up.
link |
00:57:46.160
But their opinion doesn't tend to go up by that much.
link |
00:57:48.960
Whereas the people who kind of are on the other side of it
link |
00:57:52.840
are like, God, how could you mess that up?
link |
00:57:55.080
How could you possibly think that that piece of content
link |
00:57:57.760
is okay and should be up and should not be censored?
link |
00:58:00.120
Or, and so I think the, whereas if you leave it up
link |
00:58:04.940
and, you know, it's, or if you take it down,
link |
00:58:09.220
the people who thought it should be taken down or,
link |
00:58:11.360
you know, it's like, all right, fine, great.
link |
00:58:12.740
You didn't mess that one up.
link |
00:58:14.080
So our internal assessment of,
link |
00:58:16.120
and the kind of analytics on our brand
link |
00:58:17.920
are basically anytime one of these big controversial things
link |
00:58:20.540
comes up in society,
link |
00:58:23.740
our brand goes down with half of the country.
link |
00:58:26.080
And then like, if you,
link |
00:58:27.600
and then if you just kind of extrapolate that out,
link |
00:58:29.600
it's just been very challenging for us to try to navigate
link |
00:58:33.200
what is a polarizing country in a principled way,
link |
00:58:36.640
where we're not trying to kind of hew to one side
link |
00:58:38.600
or the other, we're trying to do
link |
00:58:39.440
what we think is the right thing.
link |
00:58:41.040
But that's what I think is the right thing
link |
00:58:43.220
for us to do though.
link |
00:58:44.060
So, I mean, that's what we'll try to keep doing.
link |
00:58:47.360
Just as a human being, how does it feel though,
link |
00:58:50.160
when you're giving so much of your day to day life
link |
00:58:53.380
to try to heal division, to try to do good in the world,
link |
00:58:58.060
as we've talked about, that so many people in the US,
link |
00:59:02.260
the place you call home have a negative view
link |
00:59:06.480
of you as a leader, as a human being
link |
00:59:09.120
and the company you love?
link |
00:59:14.040
Well, I mean, it's not great,
link |
00:59:15.880
but I mean, look, if I wanted people to think positively
link |
00:59:21.040
about me as a person,
link |
00:59:25.760
I don't know, I'm not sure if you go build a company.
link |
00:59:27.960
I mean, it's like.
link |
00:59:28.800
Or a social media company.
link |
00:59:30.280
It seems exceptionally difficult to do
link |
00:59:32.040
with a social media company.
link |
00:59:32.880
Yeah, so, I mean, I don't know,
link |
00:59:34.800
there is a dynamic where a lot of the other people
link |
00:59:39.760
running these companies, internet companies,
link |
00:59:42.160
have sort of stepped back and they just do things
link |
00:59:45.320
that are sort of, I don't know, less controversial.
link |
00:59:49.480
And some of it may be that they just get tired over time.
link |
00:59:52.720
But, you know, it's, so I don't know.
link |
00:59:55.400
I think that, you know, running a company is hard,
link |
00:59:58.120
building something at scale is hard.
link |
00:59:59.880
You only really do it for a long period of time
link |
01:00:01.640
if you really care about what you're doing.
link |
01:00:04.240
And yeah, so, I mean, it's not great, but like,
link |
01:00:08.000
but look, I think that at some level,
link |
01:00:11.560
whether 25% of people dislike you
link |
01:00:14.960
or 75% of people dislike you,
link |
01:00:18.080
your experience as a public figure is gonna be
link |
01:00:21.040
that there's a lot of people who dislike you, right?
link |
01:00:23.360
So, I actually am not sure how different it is.
link |
01:00:28.680
You know, certainly, you know,
link |
01:00:31.200
the country's gotten more polarized
link |
01:00:32.760
and we in particular have gotten, you know,
link |
01:00:35.040
more controversial over the last five years or so.
link |
01:00:39.080
But, I don't know, I kind of think like as a public figure
link |
01:00:45.240
and leader of one of these enterprises.
link |
01:00:48.560
Comes with the job.
link |
01:00:49.400
Yeah, part of what you do is like,
link |
01:00:51.440
and look, the answer can't just be ignore it, right?
link |
01:00:54.640
Because like a huge part of the job
link |
01:00:56.680
is like you need to be getting feedback
link |
01:00:58.200
and internalizing feedback on how you can do better.
link |
01:01:00.760
But I think increasingly what you need to do
link |
01:01:02.520
is be able to figure out, you know,
link |
01:01:04.560
who are the kind of good faith critics
link |
01:01:08.000
who are criticizing you because
link |
01:01:10.640
they're trying to help you do a better job
link |
01:01:12.520
rather than tear you down.
link |
01:01:13.960
And those are the people I just think you have to cherish
link |
01:01:16.440
and like, and listen very closely
link |
01:01:19.120
to the things that they're saying,
link |
01:01:20.280
because, you know, I think it's just as dangerous
link |
01:01:23.040
to tune out everyone who says anything negative
link |
01:01:26.840
and just listen to the people who are kind of positive
link |
01:01:29.320
and support you, you know,
link |
01:01:31.240
as it would be psychologically to pay attention
link |
01:01:33.760
to people who are never gonna like you, trying to make them like you.
link |
01:01:36.600
So I think that that's just kind of a dance
link |
01:01:38.880
that people have to do.
link |
01:01:40.080
But I mean, I, you know,
link |
01:01:41.760
so you kind of develop more of a feel for like,
link |
01:01:44.720
who actually is trying to accomplish
link |
01:01:46.280
the same types of things in the world
link |
01:01:48.400
and who has different ideas about how to do that
link |
01:01:51.440
and how can I learn from those people?
link |
01:01:52.880
And like, yeah, we get stuff wrong.
link |
01:01:54.800
And when the people whose opinions I respect
link |
01:01:57.760
call me out on getting stuff wrong,
link |
01:01:59.840
that hurts and makes me wanna do better.
link |
01:02:02.120
But I think at this point, I'm pretty tuned to just,
link |
01:02:04.680
all right, if someone, if I know they're,
link |
01:02:06.360
they're kind of like operating in bad faith
link |
01:02:08.080
and they're not really trying to help,
link |
01:02:10.800
then, you know, I don't know, it's not, it's, it doesn't,
link |
01:02:13.000
you know, I think over time,
link |
01:02:13.840
it just doesn't bother you that much.
link |
01:02:15.280
But you are surrounded by people that believe in the mission
link |
01:02:18.720
that love you.
link |
01:02:21.240
Are there friends or colleagues in your inner circle
link |
01:02:23.600
you trust that call you out on your bullshit
link |
01:02:26.560
whenever your thinking may be misguided
link |
01:02:28.600
as it is for leaders at times?
link |
01:02:30.920
I think we have a famously open company culture
link |
01:02:34.840
where we sort of encourage that kind of dissent internally,
link |
01:02:39.440
which is, you know, why there's so much material
link |
01:02:41.760
internally that can leak out
link |
01:02:43.120
with people sort of disagreeing
link |
01:02:44.520
is because that's sort of the culture.
link |
01:02:47.480
You know, our management team, I think it's a lot of people,
link |
01:02:50.200
you know, there are some newer folks who come in,
link |
01:02:52.000
there are some folks who've kind of been there for a while,
link |
01:02:54.800
but there's a very high level of trust.
link |
01:02:56.840
And I would say it is a relatively confrontational
link |
01:02:59.840
group of people.
link |
01:03:01.080
And my friends and family, I think, will push me on this.
link |
01:03:04.560
But look, it's not just,
link |
01:03:06.400
but I think you need some diversity, right?
link |
01:03:09.280
It can't just be, you know,
link |
01:03:12.080
people who are your friends and family.
link |
01:03:13.760
It's also, you know, I mean, there are journalists
link |
01:03:16.920
or analysts or, you know,
link |
01:03:19.520
peer executives at other companies
link |
01:03:23.240
or, you know, other people who sort of are insightful
link |
01:03:27.600
about thinking about the world,
link |
01:03:28.760
you know, certain politicians
link |
01:03:30.800
or people kind of in that sphere
link |
01:03:32.680
who I just think have like very insightful perspectives
link |
01:03:36.160
who even if they would,
link |
01:03:39.800
they come at the world from a different perspective,
link |
01:03:41.600
which is sort of what makes the perspective so valuable.
link |
01:03:44.360
But, you know, I think fundamentally
link |
01:03:46.200
we're trying to get to the same place
link |
01:03:47.560
in terms of, you know, helping people connect more,
link |
01:03:50.680
helping the whole world function better,
link |
01:03:53.480
not just, you know, one place or another.
link |
01:03:57.120
And I don't know, I mean,
link |
01:03:58.920
those are the people whose opinions really matter to me.
link |
01:04:02.880
And I just, it's, you know,
link |
01:04:04.240
that's how I learn on a day to day basis.
link |
01:04:05.640
People are constantly sending me comments on stuff
link |
01:04:07.880
or links to things they found interesting.
link |
01:04:10.160
And I don't know, it's kind of constantly evolving
link |
01:04:13.400
this model of the world
link |
01:04:14.480
and kind of what we should be aspiring to be.
link |
01:04:16.840
You've talked about, you have a famously open culture
link |
01:04:20.880
which comes with the criticism
link |
01:04:25.440
and the painful experiences.
link |
01:04:27.280
So let me ask you another difficult question.
link |
01:04:30.960
Frances Haugen, the Facebook whistleblower,
link |
01:04:33.440
leaked the internal Instagram research
link |
01:04:35.800
into teenagers and wellbeing.
link |
01:04:38.040
Her claim is that Instagram is choosing profit
link |
01:04:41.240
over wellbeing of teenage girls.
link |
01:04:43.080
So Instagram is quote, toxic for them.
link |
01:04:46.720
Your response titled,
link |
01:04:48.120
what our research really says about teen wellbeing
link |
01:04:52.440
and Instagram says, no, Instagram research shows
link |
01:04:55.520
that on 11 of 12 wellbeing issues,
link |
01:04:58.800
teenage girls who said they struggle
link |
01:05:02.720
with those difficult issues also said
link |
01:05:04.400
that Instagram made them better rather than worse.
link |
01:05:07.600
Again, can you steel man and defend the point
link |
01:05:11.000
and Frances Haugen's characterization of the study
link |
01:05:14.800
and then help me understand the positive
link |
01:05:17.080
and negative effects of Instagram
link |
01:05:19.000
and Facebook on young people?
link |
01:05:20.880
So there are certainly questions around teen mental health
link |
01:05:25.840
that are really important.
link |
01:05:26.680
It's hard to, as a parent, it's like hard to imagine
link |
01:05:29.480
any set of questions that are sort of more important.
link |
01:05:32.040
I mean, I guess maybe other aspects of physical health
link |
01:05:34.080
or wellbeing probably come to that level,
link |
01:05:37.240
but like, these are really important questions, right?
link |
01:05:40.600
Which is why we dedicate teams to studying them.
link |
01:05:45.640
I don't think the internet or social media are unique
link |
01:05:48.880
in having these questions.
link |
01:05:50.000
I mean, there have been sort of magazines
link |
01:05:53.160
promoting certain body types for women
link |
01:05:56.320
and kids for decades,
link |
01:05:58.520
but we really care about this stuff.
link |
01:06:01.440
So we wanted to study it.
link |
01:06:02.760
And of course, we didn't expect
link |
01:06:05.000
that everything was gonna be positive all the time.
link |
01:06:07.000
So, I mean, the reason why you study this stuff
link |
01:06:08.520
is to try to improve and get better.
link |
01:06:10.760
So, I mean, look, the place where I disagree
link |
01:06:13.200
with the characterization first,
link |
01:06:15.320
I thought some of the reporting and coverage of it
link |
01:06:18.720
just took the whole thing out of proportion
link |
01:06:20.840
and that it focused on, as you said,
link |
01:06:22.640
I think there were like 20 metrics in there
link |
01:06:24.280
and on 18 or 19, the effect of using Instagram
link |
01:06:27.600
was neutral or positive on the teen's wellbeing.
link |
01:06:30.920
And there was one area where I think it showed
link |
01:06:34.560
that we needed to improve
link |
01:06:35.480
and we took some steps to try to do that
link |
01:06:37.720
after doing the research.
link |
01:06:38.800
But I think having the coverage just focus on that one
link |
01:06:41.680
without focusing on the,
link |
01:06:43.040
I mean, I think an accurate characterization
link |
01:06:45.080
would have been that kids using Instagram
link |
01:06:47.920
or not kids, teens is generally positive
link |
01:06:52.200
for their mental health.
link |
01:06:53.760
But of course, that was not the narrative that came out.
link |
01:06:55.560
So I think it's hard to,
link |
01:06:56.720
that's not a kind of logical thing to straw man,
link |
01:06:59.200
but I sort of disagree or steel man,
link |
01:07:01.440
but I sort of disagree with that overall characterization.
link |
01:07:04.040
I think anyone sort of looking at this objectively would,
link |
01:07:09.960
but then, I mean, there is this sort of intent critique
link |
01:07:15.040
that I think you were getting at before,
link |
01:07:16.360
which says, it assumes some sort of malevolence, right?
link |
01:07:19.680
It's like, which it's really hard for me
link |
01:07:23.160
to really wrap my head around this
link |
01:07:26.520
because as far as I know,
link |
01:07:29.800
it's not clear that any of the other tech companies
link |
01:07:31.720
are doing this kind of research.
link |
01:07:33.320
So why the narrative should form that we did research
link |
01:07:37.840
because we were studying an issue
link |
01:07:38.800
because we wanted to understand it to improve
link |
01:07:40.760
and took steps after that to try to improve it,
link |
01:07:43.560
that your interpretation of that would be
link |
01:07:46.280
that we did the research
link |
01:07:47.880
and tried to sweep it under the rug.
link |
01:07:49.240
It just, it sort of is like, I don't know,
link |
01:07:53.920
it's beyond credibility to me
link |
01:07:55.920
that like that's the accurate description of the actions
link |
01:07:59.000
that we've taken compared to the others in the industry.
link |
01:08:01.160
So I don't know, that's kind of, that's my view on it.
link |
01:08:05.280
These are really important issues
link |
01:08:06.600
and there's a lot of stuff
link |
01:08:07.960
that I think we're gonna be working on
link |
01:08:09.120
related to teen mental health for a long time,
link |
01:08:11.400
including trying to understand this better.
link |
01:08:14.240
And I would encourage everyone else
link |
01:08:15.280
in the industry to do this too.
link |
01:08:18.400
Yeah, I would love there to be open conversations
link |
01:08:21.880
and a lot of great research being released internally
link |
01:08:25.680
and then also externally.
link |
01:08:27.960
It doesn't make me feel good
link |
01:08:31.040
to see press obviously get way more clicks
link |
01:08:35.000
when they say negative things about social media.
link |
01:08:39.280
Objectively speaking, I can just tell
link |
01:08:42.400
that there's hunger to say negative things
link |
01:08:44.400
about social media.
link |
01:08:46.040
And I don't understand how that's supposed to lead
link |
01:08:50.720
to an open conversation about the positives
link |
01:08:53.000
and the negatives, the concerns about social media,
link |
01:08:56.000
especially when you're doing that kind of research.
link |
01:08:59.200
I mean, I don't know what to do with that,
link |
01:09:01.720
but let me ask you as a father,
link |
01:09:05.560
there's a weight heavy on you
link |
01:09:06.720
that people get bullied on social networks.
link |
01:09:10.280
So people get bullied in their private life.
link |
01:09:13.520
But now because so much of our life is in the digital world,
link |
01:09:17.080
the bullying moves from the physical world
link |
01:09:19.640
to the digital world.
link |
01:09:21.200
So you're now creating a platform
link |
01:09:24.520
on which bullying happens.
link |
01:09:26.520
And some of that bullying can lead to damage
link |
01:09:30.440
to mental health.
link |
01:09:31.840
And some of that bullying can lead to depression,
link |
01:09:35.120
even suicide.
link |
01:09:37.640
There's a weight heavy on you
link |
01:09:38.760
that people have committed suicide
link |
01:09:43.240
or will commit suicide based on the bullying
link |
01:09:46.120
that happens on social media.
link |
01:09:48.080
Yeah, I mean, there's a set of harms
link |
01:09:51.560
that we basically track and build systems to fight against.
link |
01:09:55.400
And bullying and self harm are,
link |
01:10:01.600
these are some of the biggest things
link |
01:10:03.240
that we are most focused on.
link |
01:10:10.920
For bullying, like you say,
link |
01:10:16.200
this predates the internet,
link |
01:10:18.240
and it's probably impossible to get rid of all of it.
link |
01:10:22.000
You wanna give people tools to fight it
link |
01:10:24.160
and you wanna fight it yourself.
link |
01:10:27.200
And you also wanna make sure that people have the tools
link |
01:10:28.800
to get help when they need it.
link |
01:10:30.160
So I think this isn't like a question of,
link |
01:10:33.200
can you get rid of all bullying?
link |
01:10:34.640
I mean, it's like, all right, I mean, I have two daughters
link |
01:10:39.080
and they fight and push each other around and stuff too.
link |
01:10:43.840
And the question is just,
link |
01:10:44.680
how do you handle that situation?
link |
01:10:47.080
And there's a handful of things that I think you can do.
link |
01:10:51.480
We talked a little bit before around some of the AI tools
link |
01:10:55.000
that you can build to identify
link |
01:10:56.800
when something harmful is happening.
link |
01:10:59.160
It's actually, it's very hard in bullying
link |
01:11:00.480
because a lot of bullying is very context specific.
link |
01:11:02.720
It's not like you're trying to fit a formula of like,
link |
01:11:06.320
if like looking at the different harms,
link |
01:11:09.480
someone promoting a terrorist group is like,
link |
01:11:12.280
probably one of the simpler things to generally find
link |
01:11:14.360
because things promoting that group are gonna look
link |
01:11:17.000
a certain way or feel a certain way.
link |
01:11:19.240
Bullying could just be, you know,
link |
01:11:21.840
someone making some subtle comment about someone's appearance
link |
01:11:24.880
that's idiosyncratic to them.
link |
01:11:26.800
And it could look just like humor.
link |
01:11:28.680
So humor to one person can be destructive
link |
01:11:31.000
to another human being, yeah.
link |
01:11:32.280
So with bullying, I think there are certain things
link |
01:11:36.400
that you can find through AI systems,
link |
01:11:40.240
but I think it is increasingly important
link |
01:11:42.640
to just give people more agency themselves.
link |
01:11:44.800
So we've done things like making it
link |
01:11:46.480
so people can turn off comments
link |
01:11:47.800
or take a break from hearing from a specific person
link |
01:11:52.240
without having to signal at all
link |
01:11:54.120
that they're gonna stop following them
link |
01:11:55.600
or kind of make some stand that,
link |
01:11:58.200
okay, I'm not friends with you anymore.
link |
01:11:59.400
I'm not following you.
link |
01:12:00.440
I just like, I just don't wanna hear about this,
link |
01:12:01.880
but I also don't wanna signal at all, publicly
link |
01:12:05.560
or to them, that there's been an issue.
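As an illustration of that kind of tool, here is a hedged sketch of a "quiet mute" setting: the class, field, and method names are all invented, but it shows how someone can stop seeing another person's posts without unfollowing them or notifying them, so the relationship looks unchanged publicly.

```python
# Hypothetical "take a break" / quiet-mute sketch; not any real platform's API.
from dataclasses import dataclass, field

@dataclass
class Viewer:
    username: str
    following: set[str] = field(default_factory=set)
    muted: set[str] = field(default_factory=set)  # private; never shown to others

    def take_a_break_from(self, other: str) -> None:
        # No unfollow, no notification: only the viewer's own feed changes.
        self.muted.add(other)

    def feed(self, posts: list[tuple[str, str]]) -> list[str]:
        # posts is a list of (author, text); muted authors are filtered out.
        return [text for author, text in posts
                if author in self.following and author not in self.muted]

me = Viewer("lex", following={"alice", "bob"})
me.take_a_break_from("bob")
print(me.feed([("alice", "hi"), ("bob", "rude comment")]))  # ['hi']
```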
link |
01:12:10.880
And then you get to some of the more extreme cases
link |
01:12:14.040
like you're talking about
link |
01:12:14.880
where someone is thinking about self harm or suicide.
link |
01:12:19.160
And there we've found that that is a place
link |
01:12:24.080
where AI can identify a lot
link |
01:12:26.400
as well as people flagging things.
link |
01:12:28.560
If people are expressing something
link |
01:12:31.080
that is potentially they're thinking of hurting themselves,
link |
01:12:35.000
those are cues that you can build systems
link |
01:12:37.520
in hundreds of languages around the world
link |
01:12:39.560
to be able to identify that.
link |
01:12:41.080
And one of the things that I'm actually quite proud of
link |
01:12:45.320
is we've built these systems
link |
01:12:47.400
that I think are clearly leading at this point
link |
01:12:50.960
that not only identify that,
link |
01:12:53.000
but then connect with local first responders
link |
01:12:57.040
and have been able to save, I think at this point,
link |
01:12:59.680
it's in thousands of cases,
link |
01:13:01.960
be able to get first responders to people
link |
01:13:04.560
through these systems who really need them
link |
01:13:07.800
because of specific plumbing that we've done
link |
01:13:09.600
between the AI work and being able to communicate
link |
01:13:11.680
with local first responder organizations.
link |
01:13:13.800
We're rolling that out in more places around the world.
link |
01:13:15.800
And I think the team that worked on that
link |
01:13:18.160
just did awesome stuff.
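Here is a hedged sketch of the shape of the pipeline described above: a risk score plus a user flag gating an escalation to a local responder contact. The keyword check stands in for a trained multilingual classifier, and the responder directory, threshold, and function names are all hypothetical; a production system would use vetted partner organizations and far more careful review.

```python
# Hypothetical self-harm escalation sketch; every name and value here is invented.
RESPONDERS = {"US": "us-crisis-line@example.org", "BR": "br-crisis-line@example.org"}

def risk_score(text: str) -> float:
    # Placeholder for a trained multilingual classifier.
    cues = ["i can't go on", "hurt myself", "end it all"]
    return 1.0 if any(c in text.lower() for c in cues) else 0.0

def handle_post(text: str, country: str, user_flagged: bool) -> str:
    score = risk_score(text)
    if user_flagged or score > 0.8:
        contact = RESPONDERS.get(country)
        return f"escalate to {contact}" if contact else "escalate to global team"
    return "no action"

print(handle_post("I just want to end it all", "US", user_flagged=False))
# escalate to us-crisis-line@example.org
```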
link |
01:13:19.360
So I think that that's a long way of saying,
link |
01:13:22.760
yeah, I mean, this is a heavy topic
link |
01:13:25.480
and you want to attack it in a bunch of different ways
link |
01:13:30.000
and also kind of understand that part of human nature
link |
01:13:33.240
is for people to do this to each other,
link |
01:13:36.360
which is unfortunate,
link |
01:13:37.320
but you can give people tools and build things that help.
link |
01:13:40.600
It's still one hell of a burden though.
link |
01:13:43.840
A platform that allows people
link |
01:13:46.160
to fall in love with each other
link |
01:13:48.600
is also by nature going to be a platform
link |
01:13:51.000
that allows people to hurt each other.
link |
01:13:52.880
And when you're managing such a platform, it's difficult.
link |
01:13:57.120
And I think you spoke to it,
link |
01:13:58.200
but the psychology of that, of being a leader in that space,
link |
01:14:01.280
of creating technology that's playing in this space,
link |
01:14:05.280
like you mentioned, psychology is really damn difficult.
link |
01:14:10.280
And I mean, the burden of that is just great.
link |
01:14:13.120
I just wanted to hear you speak to that point.
link |
01:14:18.720
I have to ask about the thing you've brought up a few times,
link |
01:14:23.160
which is making controversial decisions.
link |
01:14:26.520
Let's talk about free speech and censorship.
link |
01:14:29.440
So there are two groups of people pressuring Meta on this.
link |
01:14:33.920
One group is upset that Facebook, the social network,
link |
01:14:37.240
allows misinformation in quotes to be spread on the platform.
link |
01:14:41.480
The other group are concerned that Facebook censors speech
link |
01:14:44.800
by calling it misinformation.
link |
01:14:46.560
So you're getting it from both sides.
link |
01:14:48.840
You, in 2019, October at Georgetown University,
link |
01:14:54.600
eloquently defended the importance of free speech,
link |
01:14:58.240
but then COVID came and the 2020 election came.
link |
01:15:04.360
Do you worry that outside pressures
link |
01:15:06.440
from advertisers, politicians, the public,
link |
01:15:08.840
have forced Meta to damage the ideal of free speech
link |
01:15:11.840
that you spoke highly of?
link |
01:15:14.000
Just to say some obvious things upfront,
link |
01:15:16.880
I don't think pressure from advertisers
link |
01:15:18.840
or politicians directly in any way
link |
01:15:21.440
affects how we think about this.
link |
01:15:22.680
I think these are just hard topics.
link |
01:15:25.080
So let me just take you through our evolution
link |
01:15:26.880
from kind of the beginning of the company
link |
01:15:28.240
to where we are now.
link |
01:15:30.240
You don't build a company like this
link |
01:15:31.720
unless you believe that people expressing themselves
link |
01:15:34.120
is a good thing, right?
link |
01:15:35.720
So that's sort of the foundational thing.
link |
01:15:38.160
You can kind of think about our company as a formula
link |
01:15:41.880
where we think giving people voice
link |
01:15:44.240
and helping people connect creates opportunity, right?
link |
01:15:47.440
So those are the two things that we're always focused on
link |
01:15:49.640
are sort of helping people connect.
link |
01:15:50.800
We talked about that a lot,
link |
01:15:52.080
but also giving people voice
link |
01:15:53.880
and ability to express themselves.
link |
01:15:55.800
Then by the way, most of the time
link |
01:15:56.960
when people express themselves,
link |
01:15:58.120
that's not like politically controversial content.
link |
01:16:00.840
It's like expressing something about their identity
link |
01:16:04.040
that's more related to the avatar conversation
link |
01:16:06.520
we had earlier in terms of expressing some facet,
link |
01:16:08.600
but that's what's important to people on a day to day basis.
link |
01:16:11.240
And sometimes when people feel strongly enough
link |
01:16:13.480
about something, it kind of becomes a political topic.
link |
01:16:16.360
That's sort of always been a thing that we've focused on.
link |
01:16:19.120
There's always been the question of safety in this,
link |
01:16:22.320
which if you're building a community,
link |
01:16:24.360
I think you have to focus on safety.
link |
01:16:26.040
We've had these community standards from early on,
link |
01:16:28.320
and there are about 20 different kinds of harm
link |
01:16:32.640
that we track and try to fight actively.
link |
01:16:34.840
We've talked about some of them already.
link |
01:16:36.400
So it includes things like bullying and harassment.
link |
01:16:40.880
It includes things like terrorism or promoting terrorism,
link |
01:16:46.000
inciting violence, intellectual property theft.
link |
01:16:49.320
And in general, I think call it about 18 out of 20 of those.
link |
01:16:53.760
There's not really a particularly polarized definition
link |
01:16:57.200
of that.
link |
01:16:59.160
I think you're not really gonna find many people
link |
01:17:01.440
in the country or in the world
link |
01:17:03.760
who are trying to say we should be
link |
01:17:07.040
fighting terrorist content less.
link |
01:17:09.320
I think there are a couple of areas
link |
01:17:12.200
where I think that this has gotten more controversial
link |
01:17:14.000
recently, which I'll talk about.
link |
01:17:16.320
And you're right, misinformation is basically up there.
link |
01:17:20.000
And I think sometimes the definition of hate speech
link |
01:17:21.920
is up there too.
link |
01:17:22.760
But I think in general, most of the content
link |
01:17:25.760
that I think we're working on for safety
link |
01:17:29.560
is not actually, people don't kind of have these questions.
link |
01:17:32.560
So it's sort of this subset.
link |
01:17:35.280
But if you go back to the beginning of the company,
link |
01:17:37.400
this was sort of pre deep learning days.
link |
01:17:42.000
And therefore, it was me, and my roommate Dustin joined me.
link |
01:17:47.000
And if someone posted something bad,
link |
01:17:54.040
the AI technology did not exist yet
link |
01:17:57.880
to be able to go basically look at all the content.
link |
01:18:02.480
And we were a small enough outfit
link |
01:18:06.120
that no one would expect that we could review it all.
link |
01:18:08.800
Even if someone reported it to us,
link |
01:18:10.400
we basically did our best, right?
link |
01:18:11.720
It's like someone would report it
link |
01:18:12.680
and we try to look at stuff and deal with stuff.
link |
01:18:16.880
And for call it the first seven or eight years
link |
01:18:22.360
of the company, we weren't that big of a company.
link |
01:18:26.600
For a lot of that period, we weren't even really profitable.
link |
01:18:28.760
The AI didn't really exist to be able to do
link |
01:18:30.520
the kind of moderation that we do today.
link |
01:18:32.720
And then at some point in kind of the middle
link |
01:18:35.760
of the last decade, that started to flip.
link |
01:18:38.160
And we got to the point where we were sort of a larger
link |
01:18:44.160
and more profitable company.
link |
01:18:45.240
And the AI was starting to come online
link |
01:18:48.000
to be able to proactively detect
link |
01:18:50.480
some of the simpler forms of this.
link |
01:18:52.840
So things like pornography,
link |
01:18:54.800
you could train an image classifier
link |
01:18:57.600
to identify what a nipple was,
link |
01:18:59.520
or you can fight against terrorist content.
link |
01:19:01.320
You still could.
link |
01:19:02.160
There's actually papers on this, it's great.
link |
01:19:03.440
Oh, of course there are.
link |
01:19:04.280
Technical papers.
link |
01:19:05.120
Of course there are.
link |
01:19:06.480
Those are relatively easier things to train AI to do
link |
01:19:09.280
than for example, understand the nuances
link |
01:19:12.440
of what is inciting violence
link |
01:19:14.000
in a hundred languages around the world
link |
01:19:15.800
and not have the false positives of like,
link |
01:19:20.200
okay, are you posting about this thing
link |
01:19:22.360
that might be inciting violence
link |
01:19:24.040
because you're actually trying to denounce it?
link |
01:19:26.360
In which case we probably shouldn't take that down.
link |
01:19:28.280
Where if you're trying to denounce something
link |
01:19:29.520
that's inciting violence in some kind of dialect
link |
01:19:33.920
in a corner of India, as opposed to,
link |
01:19:37.200
okay, actually you're posting this thing
link |
01:19:38.440
because you're trying to incite violence.
link |
01:19:39.600
Okay, building an AI that can basically get
link |
01:19:42.360
to that level of nuance and all the languages
link |
01:19:44.400
that we serve is something that I think
link |
01:19:47.120
is only really becoming possible now,
link |
01:19:49.680
not towards the middle of the last decade.
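To illustrate the false-positive problem he describes, here is a hedged sketch: a naive keyword filter flags a post that condemns violence, while a small context-aware classifier trained on labeled examples can separate incitement from denunciation. The example texts, labels, and model are purely illustrative toy data, not anything a real moderation system would ship.

```python
# Toy illustration of incitement vs. denunciation; not a production classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def keyword_filter(text: str) -> bool:
    # Naive rule: flag anything containing the phrase, regardless of context.
    return "attack them" in text.lower()

train_texts = [
    "everyone should attack them tomorrow",         # incitement
    "join us and attack them at the rally",         # incitement
    "it is horrifying that they said attack them",  # denunciation
    "we condemn anyone calling to attack them",     # denunciation
]
train_labels = [1, 1, 0, 0]  # 1 = violating, 0 = not violating

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

post = "we strongly condemn the people who said attack them"
print(keyword_filter(post))      # True: the filter flags a post condemning violence
print(model.predict([post])[0])  # expected 0 on this toy data: context matters
```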
link |
01:19:51.920
But there's been this evolution,
link |
01:19:54.880
and I think what happened,
link |
01:19:57.560
people sort of woke up after 2016
link |
01:20:00.120
and a lot of people are like,
link |
01:20:02.560
okay, the country is a lot more polarized
link |
01:20:05.000
and there's a lot more stuff here than we realized.
link |
01:20:08.080
Why weren't these internet companies on top of this?
link |
01:20:11.800
And I think at that point it was reasonable feedback
link |
01:20:18.760
that some of this technology had started becoming possible.
link |
01:20:22.400
And at that point, I really did feel like
link |
01:20:25.320
we needed to make a substantially larger investment.
link |
01:20:27.920
We'd already worked on this stuff a lot,
link |
01:20:29.680
on AI and on these integrity problems,
link |
01:20:32.400
but that we should basically invest,
link |
01:20:35.360
have a thousand or more engineers
link |
01:20:37.080
basically work on building these AI systems
link |
01:20:39.160
to be able to go and proactively identify the stuff
link |
01:20:41.600
across all these different areas.
link |
01:20:43.680
Okay, so we went and did that.
link |
01:20:45.360
Now we've built the tools to be able to do that.
link |
01:20:48.000
And now I think it's actually a much more complicated
link |
01:20:50.560
set of philosophical rather than technical questions,
link |
01:20:53.400
which is what the exact policies should be.
link |
01:20:56.960
Now, the way that we basically hold ourselves accountable
link |
01:21:01.960
is we issue these transparency reports every quarter
link |
01:21:04.400
and the metric that we track is for each of these
link |
01:21:06.320
20 types of harmful content.
link |
01:21:10.240
How much of that content are we taking down
link |
01:21:12.400
before someone even has to report it to us?
link |
01:21:14.320
So how effective is our AI at doing this?
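As a hedged illustration of that proactive-rate metric, the numbers and category names below are invented; the point is only the arithmetic: of everything actioned in a quarter, what share was caught by automated systems before any user report.

```python
# Illustrative proactive-rate calculation (all figures and categories are made up).
actioned = {
    # category: (found proactively by AI, reported by users first)
    "adult_nudity": (9_400_000, 600_000),
    "hate_speech": (4_500_000, 1_500_000),
    "bullying_harassment": (2_000_000, 2_000_000),
}

for category, (proactive, reported) in actioned.items():
    total = proactive + reported
    rate = proactive / total
    print(f"{category}: proactive rate = {rate:.1%} of {total:,} pieces actioned")
```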
link |
01:21:17.000
But that basically creates this big question,
link |
01:21:19.440
which is okay, now we need to really be careful
link |
01:21:22.960
about how proactively we set the AI
link |
01:21:25.440
and where the exact policy lines are
link |
01:21:28.000
around what we're taking down.
link |
01:21:30.200
It's certainly at a point now where I felt like
link |
01:21:35.200
at the beginning of that journey
link |
01:21:37.720
of building those AI systems, there was a lot of push.
link |
01:21:43.160
People were saying, okay, you've got to do more.
link |
01:21:44.360
There's clearly a lot more bad content
link |
01:21:46.320
that people aren't reporting or that you're not getting to
link |
01:21:49.880
and you need to get more effective at that.
link |
01:21:51.200
And I was pretty sympathetic to that.
link |
01:21:52.920
But then I think at some point along the way,
link |
01:21:54.800
there started to be almost equal issues on both sides
link |
01:21:58.960
of, okay, actually you're kind of taking down
link |
01:22:00.960
too much stuff, right?
link |
01:22:02.080
Or some of the stuff is borderline
link |
01:22:05.560
and it wasn't really bothering anyone
link |
01:22:07.560
and they didn't report it.
link |
01:22:09.640
So is that really an issue that you need to take down?
link |
01:22:13.000
Whereas we still have the critique on the other side too
link |
01:22:15.440
where a lot of people think we're not doing enough.
link |
01:22:18.560
So as we've built the technical capacity,
link |
01:22:21.840
I think it becomes more philosophically interesting almost
link |
01:22:25.960
where you wanna be on the line.
link |
01:22:27.520
And I just think you don't want one person
link |
01:22:31.160
making those decisions.
link |
01:22:32.440
So we've also tried to innovate
link |
01:22:33.760
in terms of building out this independent oversight board,
link |
01:22:36.520
which has people who are dedicated to free expression
link |
01:22:39.520
but from around the world who people can appeal cases to.
link |
01:22:43.640
So a lot of the most controversial cases basically go to them
link |
01:22:46.200
and they make the final binding decision
link |
01:22:47.640
on how we should handle that.
link |
01:22:49.080
And then of course, their decisions,
link |
01:22:50.680
we then try to figure out what the principles are
link |
01:22:53.000
behind those and encode them into the algorithms.
link |
01:22:55.760
And how are those people chosen? Because, you know,
link |
01:22:58.080
you're outsourcing a difficult decision.
link |
01:23:00.200
Yeah, the initial people,
link |
01:23:02.560
we chose a handful of chairs for the group
link |
01:23:09.040
and we basically chose the people
link |
01:23:12.480
for a commitment to free expression
link |
01:23:16.480
and like a broad understanding of human rights
link |
01:23:19.640
and the trade-offs around free expression.
link |
01:23:21.520
So they're fundamentally people
link |
01:23:22.720
who are gonna lean towards free expression.
link |
01:23:24.880
Towards freedom of speech.
link |
01:23:26.040
Okay, so there's also this idea of fact checkers.
link |
01:23:28.560
So jumping around to the misinformation questions,
link |
01:23:31.600
especially during COVID,
link |
01:23:33.120
which was exceptionally polarizing, speaking of polarization.
link |
01:23:36.080
Can I speak to the COVID thing?
link |
01:23:38.240
I mean, I think one of the hardest set of questions
link |
01:23:40.240
around free expression,
link |
01:23:41.200
because you asked about Georgetown
link |
01:23:42.240
whether my stance has fundamentally changed.
link |
01:23:43.840
And the answer to that is no, my stance has not changed.
link |
01:23:48.400
It is fundamentally the same as when I was talking
link |
01:23:52.040
at Georgetown from a philosophical perspective.
link |
01:23:56.480
The challenge with free speech is that everyone agrees
link |
01:24:01.840
that there is a line where if you're actually
link |
01:24:05.440
about to do physical harm to people
link |
01:24:08.160
that there should be restrictions.
link |
01:24:10.560
So, I mean, there's the famous Supreme Court
link |
01:24:13.960
historical example of like,
link |
01:24:15.120
you can't yell fire in a crowded theater.
link |
01:24:18.040
The thing that everyone disagrees on
link |
01:24:20.360
is what is the definition of real harm?
link |
01:24:22.680
Where I think some people think,
link |
01:24:24.560
okay, this should only be a very literal,
link |
01:24:27.920
I mean, take it back to the bullying conversation
link |
01:24:29.840
we were just having, where is it just harm
link |
01:24:32.760
if the person is about to hurt themselves
link |
01:24:34.800
because they've been bullied so hard?
link |
01:24:36.640
Or is it actually harm like as they're being bullied?
link |
01:24:39.880
And kind of at what point in the spectrum is that?
link |
01:24:42.160
And that's the part that there's not agreement on.
link |
01:24:44.480
But I think what people agree on pretty broadly
link |
01:24:47.000
is that when there is an acute threat
link |
01:24:49.440
that it does make sense from a societal perspective
link |
01:24:52.960
to tolerate less speech.
link |
01:24:57.120
That could be potentially harmful in that acute situation.
link |
01:24:59.560
So I think where COVID got very difficult is,
link |
01:25:02.840
I don't think anyone expected this to be going on for years.
link |
01:25:06.000
But if you'd kind of asked, a priori,
link |
01:25:10.360
would a global pandemic where a lot of people are dying
link |
01:25:14.880
and catching this, is that an emergency
link |
01:25:19.040
where you'd kind of consider
link |
01:25:21.560
it problematic to basically yell fire
link |
01:25:25.600
in a crowded theater?
link |
01:25:26.840
I think that that probably passes that test.
link |
01:25:29.000
So I think that it's a very tricky situation,
link |
01:25:32.320
but I think the fundamental commitment
link |
01:25:35.240
to free expression is there.
link |
01:25:38.200
And that's what I believe.
link |
01:25:39.840
And again, I don't think you start this company
link |
01:25:41.440
unless you care about people being able
link |
01:25:42.720
to express themselves as much as possible.
link |
01:25:44.800
But I think that that's the question,
link |
01:25:48.880
is how do you define what the harm is
link |
01:25:50.480
and how acute that is?
link |
01:25:52.440
And what are the institutions that define that harm?
link |
01:25:55.440
A lot of the criticism is that the CDC, the WHO,
link |
01:25:59.720
the institutions we've come to trust as a civilization
link |
01:26:03.800
to give the line of what is and isn't harm
link |
01:26:07.760
in terms of health policy have failed in many ways,
link |
01:26:11.640
in small ways and in big ways, depending on who you ask.
link |
01:26:14.320
And then the perspective of Meta and Facebook is like,
link |
01:26:17.120
well, where the hell do I get the information
link |
01:26:20.160
of what is and isn't misinformation?
link |
01:26:22.400
So it's a really difficult place to be in,
link |
01:26:25.160
but it's great to hear that you're leaning
link |
01:26:26.720
towards freedom of speech on this aspect.
link |
01:26:30.140
And again, I think this actually calls to the fact
link |
01:26:33.000
that we need to reform institutions
link |
01:26:35.320
that help keep an open mind
link |
01:26:36.800
of what is and isn't misinformation.
link |
01:26:39.880
And misinformation has been used to bully on the internet.
link |
01:26:44.600
I mean, I just have, I'm friends with Joe Rogan
link |
01:26:46.920
and he gets called that.
link |
01:26:49.280
I remember hanging out with him in Vegas
link |
01:26:51.280
and somebody yelled, stop spreading misinformation.
link |
01:26:54.660
I mean, and there's a lot of people that follow him
link |
01:26:57.640
that believe he's not spreading misinformation.
link |
01:26:59.880
Like you can't just not acknowledge the fact
link |
01:27:02.900
that there's a large number of people
link |
01:27:05.720
that have a different definition of misinformation.
link |
01:27:08.840
And that's such a tough place to be.
link |
01:27:10.760
Like who do you listen to?
link |
01:27:11.840
Do you listen to quote unquote experts who gets,
link |
01:27:15.320
as a person who has a PhD, I gotta say,
link |
01:27:17.600
I mean, I'm not sure I know what defines an expert,
link |
01:27:21.120
especially in a new,
link |
01:27:24.080
in a totally new pandemic or a new catastrophic event,
link |
01:27:29.160
especially when politics is involved
link |
01:27:31.520
and especially when the news media are involved,
link |
01:27:33.320
which can propagate
link |
01:27:37.440
sort of outrageous narratives
link |
01:27:39.520
and thereby make a lot of money.
link |
01:27:40.720
Like what the hell?
link |
01:27:41.800
Where's the source of truth?
link |
01:27:43.200
And then everybody turns to Facebook.
link |
01:27:45.480
It's like, please tell me what the source of truth is.
link |
01:27:49.040
Well, I mean, well, how would you handle this
link |
01:27:50.740
if you were in my position?
link |
01:27:52.680
It's very, very, very, very difficult.
link |
01:27:55.160
I would say,
link |
01:27:59.400
I would more speak about how difficult the choices are
link |
01:28:02.680
and be transparent about like,
link |
01:28:04.040
what the hell do you do with this?
link |
01:28:05.360
Like here, you could just
link |
01:28:07.080
ask the exact question you just asked me,
link |
01:28:08.800
but to the broader public, like, okay, yeah,
link |
01:28:10.840
you guys tell me what to do.
link |
01:28:12.400
So like crowdsource it.
link |
01:28:14.200
And then the other aspect is when you spoke really eloquently
link |
01:28:19.800
about the fact that there's this going back and forth
link |
01:28:23.440
and now there's a feeling like you're censoring
link |
01:28:25.240
a little bit too much.
link |
01:28:26.720
So I would lean, I would try to be ahead of that feeling.
link |
01:28:30.240
I would now lean towards freedom of speech and say,
link |
01:28:33.080
we're not the ones that are going to define misinformation.
link |
01:28:36.240
Let it be a public debate, let the idea stand.
link |
01:28:40.040
And with this idea of misinformation,
link |
01:28:44.280
I place the responsibility
link |
01:28:46.360
on the poor communication skills of scientists.
link |
01:28:50.020
They should be in the battlefield of ideas
link |
01:28:52.560
and everybody who is spreading information
link |
01:28:57.400
against the vaccine, they should not be censored.
link |
01:29:00.400
They should be talked with and you should show the data,
link |
01:29:03.040
you should have open discussion
link |
01:29:04.800
as opposed to rolling your eyes and saying,
link |
01:29:07.080
I'm the expert, I know what I'm talking about.
link |
01:29:09.840
No, you need to convince people, it's a battle of ideas.
link |
01:29:13.240
So that's the whole point of freedom of speech.
link |
01:29:15.360
It's the way to defeat bad ideas
link |
01:29:17.120
is with good ideas, with speech.
link |
01:29:20.080
So like the responsibility here falls
link |
01:29:22.080
on the poor communication skills of scientists.
link |
01:29:26.560
Scientists are not communicators, but thanks to social media,
link |
01:29:32.180
they now have the power to communicate.
link |
01:29:34.040
Some of the best stuff I've seen about COVID
link |
01:29:36.800
from doctors is on social media.
link |
01:29:38.840
It's a way to learn to respond really quickly,
link |
01:29:41.520
to go faster than the peer review process.
link |
01:29:43.800
And so they just need to get way better
link |
01:29:45.460
at that communication.
link |
01:29:46.480
And also by better, I don't mean just convincing,
link |
01:29:50.060
I also mean speak with humility,
link |
01:29:51.800
don't talk down to people, all those kinds of things.
link |
01:29:54.280
And as a platform, I would say,
link |
01:29:56.860
I would step back a little bit.
link |
01:29:59.800
Not all the way, of course,
link |
01:30:00.800
because there's a lot of stuff that can cause real harm
link |
01:30:03.520
as we've talked about,
link |
01:30:04.440
but you lean more towards freedom of speech
link |
01:30:06.920
because then people from a brand perspective
link |
01:30:09.560
wouldn't be blaming you for the other ills of society,
link |
01:30:13.760
which there are many.
link |
01:30:14.600
The institutions have flaws, the political divide,
link |
01:30:19.840
obviously politicians have flaws, that's not news.
link |
01:30:23.400
The media has flaws that they're all trying to work with.
link |
01:30:28.040
And because of the central place of Facebook in the world,
link |
01:30:31.080
all of those flaws somehow kind of propagate to Facebook.
link |
01:30:34.320
And you're sitting there as Plato, the philosopher,
link |
01:30:38.160
having to answer some of the most difficult questions
link |
01:30:40.720
being asked of human civilization.
link |
01:30:43.960
So I don't know, maybe this is an American answer though,
link |
01:30:47.000
to lean towards freedom of speech.
link |
01:30:48.420
I don't know if that applies globally.
link |
01:30:51.300
So yeah, I don't know.
link |
01:30:52.640
But transparency and saying, I think as a technologist,
link |
01:30:57.400
one of the things I sense about Facebook and Meta
link |
01:30:59.560
when people talk about this company
link |
01:31:02.360
is they don't necessarily understand
link |
01:31:04.960
fully how difficult the problem is.
link |
01:31:06.880
You talked about how AI has to catch
link |
01:31:08.440
a bunch of harmful stuff really quickly.
link |
01:31:11.720
Just the sea of data you have to deal with.
link |
01:31:14.600
It's a really difficult problem.
link |
01:31:16.700
So like any of the critics,
link |
01:31:18.400
if you just hand them the helm for a week,
link |
01:31:22.840
let's see how well you can do.
link |
01:31:25.400
Like that, to me, that's definitely something
link |
01:31:28.160
that would wake people up to how difficult this problem is
link |
01:31:31.320
if there's more transparency
link |
01:31:32.640
of saying how difficult this problem is.
link |
01:31:35.580
Let me ask you about, on the AI front,
link |
01:31:37.800
just because you mentioned language and my ineloquence.
link |
01:31:41.600
Translation is something I wanted to ask you about.
link |
01:31:44.120
And first, just to give a shout out to the supercomputer.
link |
01:31:47.760
You've recently announced the AI research supercluster, RSC.
link |
01:31:51.960
Obviously, I'm somebody who loves the GPUs.
link |
01:31:54.680
It currently has 6,000 GPUs.
link |
01:31:57.120
It's built from NVIDIA DGX A100 systems that have
link |
01:32:02.160
in total 6,000 GPUs.
link |
01:32:04.080
And it will eventually, maybe this year,
link |
01:32:06.560
maybe soon, will have 16,000 GPUs.
link |
01:32:10.000
So it can do a bunch of different kinds
link |
01:32:11.680
of machine learning applications.
link |
01:32:15.040
There's a cool thing on the distributed storage aspect
link |
01:32:18.560
and all that kind of stuff.
link |
01:32:19.680
So one of the applications that I think is super exciting
link |
01:32:23.040
is translation, real time translation.
link |
01:32:26.320
I mentioned to you that having a conversation,
link |
01:32:29.120
I speak Russian fluently,
link |
01:32:30.200
I speak English somewhat fluently,
link |
01:32:32.360
and having a conversation with Vladimir Putin,
link |
01:32:34.940
say, as a use case.
link |
01:32:36.040
Me, as a user, coming to you as a use case.
link |
01:32:38.480
We both speak each other's language.
link |
01:32:42.520
I speak Russian, he speaks English.
link |
01:32:45.040
How can we have that communication go well
link |
01:32:48.000
with the help of AI?
link |
01:32:49.400
I think it's such a beautiful and a powerful application
link |
01:32:52.440
of AI to connect the world,
link |
01:32:54.720
to bridge the gap, not necessarily between me and Putin,
link |
01:32:57.560
but people that don't have that shared language.
link |
01:33:01.960
Can you just speak about your vision with translation?
link |
01:33:04.120
Because I think that's a really exciting application.
link |
01:33:06.680
If you're trying to help people connect
link |
01:33:08.000
all around the world,
link |
01:33:09.400
a lot of content is produced in one language
link |
01:33:11.600
and people in all these other places are interested in it.
link |
01:33:14.720
So being able to translate that
link |
01:33:17.720
just unlocks a lot of value on a day to day basis.
link |
01:33:20.560
I mean, so the kind of AI around translation is interesting
link |
01:33:24.400
because it's gone through a bunch of iterations.
link |
01:33:27.880
But the basic state of the art
link |
01:33:29.680
is that you don't wanna go through
link |
01:33:33.560
different kind of intermediate symbolic
link |
01:33:38.800
representations of language or something like that.
link |
01:33:42.520
You basically wanna be able to map the concepts
link |
01:33:46.920
and basically go directly from one language to another.
link |
01:33:49.300
And you just can train bigger and bigger models
link |
01:33:53.040
in order to be able to do that.
link |
01:33:54.120
And that's where the research supercluster comes in
link |
01:33:58.160
is basically a lot of the trend in machine learning
link |
01:34:01.080
is just you're building bigger and bigger models
link |
01:34:03.400
and you just need a lot of computation to train them.
link |
01:34:05.700
So it's not that the translation itself would run
link |
01:34:08.360
on the supercomputer; it's the training of the model,
link |
01:34:12.080
which could have billions or trillions of examples
link |
01:34:15.800
of just basically that.
link |
01:34:19.080
You're training models on this supercluster
link |
01:34:22.360
in days or weeks that might take a much longer period of time
link |
01:34:27.120
on a smaller cluster.
link |
01:34:28.120
So it just wouldn't be practical for most teams to do.
link |
01:34:30.200
But the translation work,
link |
01:34:34.560
we're basically getting from being able to go
link |
01:34:38.160
between about a hundred languages seamlessly today
link |
01:34:42.280
to being able to go to about 300 languages in the near term.
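For a concrete, hedged example of this direct any-to-any setup, Meta AI's openly released M2M-100 checkpoint translates between roughly a hundred languages without pivoting through English. The model name and API below come from that open-source release via the Hugging Face transformers library; treat them as an illustration, not necessarily the production system being discussed here.

```python
# Illustrative direct Russian -> English translation with the open M2M-100 checkpoint.
# Requires: pip install transformers sentencepiece
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")
model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")

tokenizer.src_lang = "ru"  # source language
encoded = tokenizer("Привет, как дела?", return_tensors="pt")
# Force the decoder to start in the target language, so the same model can go
# from any supported language to any other without an English pivot.
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("en"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Swapping the source language and the forced target token reverses the direction; the larger models being trained on the supercluster presumably follow the same direct-mapping idea at much greater scale.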
link |
01:34:46.740
So from any language to any other language.
link |
01:34:48.720
Yeah.
link |
01:34:49.560
And part of the issue when you get closer to more languages
link |
01:34:53.880
is some of these get to be pretty,
link |
01:34:59.840
not very popular languages, right?
link |
01:35:01.920
Where there isn't that much content in them.
link |
01:35:04.280
So you end up having less data
link |
01:35:07.280
and you need to kind of use a model that you've built up
link |
01:35:10.960
around other examples.
link |
01:35:12.080
And this is one of the big questions around AI
link |
01:35:14.040
is like how generalizable can things be?
link |
01:35:16.680
And that I think is one of the things
link |
01:35:18.760
that's just kind of exciting here
link |
01:35:19.800
from a technical perspective.
link |
01:35:21.300
But capturing, we talked about this with the metaverse,
link |
01:35:23.800
capturing the magic of human to human interaction.
link |
01:35:26.440
So me and Putin, okay.
link |
01:35:29.180
Again, this is a therapy session.
link |
01:35:30.020
I mean, it's a tough example
link |
01:35:31.080
because you actually both speak Russian and English.
link |
01:35:33.360
No, but that's.
link |
01:35:34.200
But in the future.
link |
01:35:35.020
I see it as a Turing test of a kind
link |
01:35:37.740
because we would both like to have an AI that improves
link |
01:35:40.440
because I don't speak Russian that well.
link |
01:35:42.240
He doesn't speak English that well.
link |
01:35:44.200
Yeah.
link |
01:35:45.040
It would be nice to outperform our abilities
link |
01:35:48.640
and it sets a really nice bar
link |
01:35:50.640
because I think AI can really help in translation
link |
01:35:53.600
for people that don't speak the language at all,
link |
01:35:55.720
but to actually capture the magic of the chemistry,
link |
01:36:00.120
in the translation, which would make the metaverse
link |
01:36:03.240
super immersive.
link |
01:36:04.800
I mean, that's exciting.
link |
01:36:05.840
You remove the barrier of language, period.
link |
01:36:08.700
Yeah, so when people think about translation,
link |
01:36:11.240
I think a lot of that is they're thinking about text to text,
link |
01:36:14.240
but speech to speech, I think is a whole nother thing.
link |
01:36:17.120
And I mean, one of the big lessons on that,
link |
01:36:19.080
which I was referring to before is I think early models,
link |
01:36:22.120
it's like, all right, they take speech,
link |
01:36:23.800
they translate it to text,
link |
01:36:25.080
translate the text to another language
link |
01:36:26.680
and then kind of output that as speech in that language.
link |
01:36:29.400
And you don't wanna do that.
link |
01:36:30.440
You just wanna be able to go directly from speech
link |
01:36:32.260
in one language to speech in another language
link |
01:36:34.180
and build up the models to do that.
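To make the contrast concrete, here is a hedged sketch of the older cascaded approach (speech to text, then text to translated text), built from open checkpoints via the Hugging Face transformers pipeline API. The model names and the audio file are illustrative assumptions, the final text-to-speech step is omitted, and the direct speech-to-speech models being described would replace this whole chain with a single model.

```python
# Cascaded Russian speech -> English text, the kind of pipeline direct models replace.
# Requires: pip install transformers torch
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")  # speech -> text
mt = pipeline("translation", model="Helsinki-NLP/opus-mt-ru-en")              # ru text -> en text

def cascaded_ru_to_en(audio_path: str) -> str:
    text_ru = asr(audio_path)["text"]             # step 1: transcribe Russian audio
    text_en = mt(text_ru)[0]["translation_text"]  # step 2: translate the transcript
    return text_en                                # step 3 (text-to-speech) omitted here

print(cascaded_ru_to_en("greeting_ru.wav"))  # hypothetical audio file
```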
link |
01:36:36.400
And I mean, I think one of the,
link |
01:36:39.140
there have been,
link |
01:36:40.760
when you look at the progress in machine learning,
link |
01:36:42.860
there have been big advances in the techniques,
link |
01:36:47.240
some of the advances in self supervised learning,
link |
01:36:51.560
which I know you talked to Yann about
link |
01:36:52.920
and he's like one of the leading thinkers in this area.
link |
01:36:55.320
I just think that that stuff is really exciting,
link |
01:36:57.560
but then you couple that with the ability
link |
01:36:59.840
to just throw larger and larger amounts of compute
link |
01:37:02.480
at training these models.
link |
01:37:04.000
And you can just do a lot of things
link |
01:37:05.600
that were harder to do before.
link |
01:37:09.360
But we're asking more of our systems too, right?
link |
01:37:12.960
So if you think about the applications
link |
01:37:14.880
that we're gonna need for the metaverse,
link |
01:37:18.400
or think about it, okay,
link |
01:37:19.400
so let's talk about AR here for a second.
link |
01:37:21.480
You're gonna have these glasses,
link |
01:37:23.140
they're gonna look hopefully
link |
01:37:24.760
like a normal ish looking pair of glasses,
link |
01:37:28.140
but they're gonna be able to put holograms in the world
link |
01:37:31.240
and intermix virtual and physical objects in your scene.
link |
01:37:35.980
And one of the things that's gonna be unique about this
link |
01:37:39.080
compared to every other computing device
link |
01:37:41.240
that you've had before,
link |
01:37:42.580
is that this is gonna be the first computing device
link |
01:37:45.160
that has all the same signals
link |
01:37:47.560
about what's going on around you that you have.
link |
01:37:49.480
Right, so your phone,
link |
01:37:50.480
you can have it take a photo or a video,
link |
01:37:54.140
but I mean, these glasses are gonna,
link |
01:37:56.600
whenever you activate them,
link |
01:37:57.480
they're gonna be able to see what you see
link |
01:37:59.000
from your perspective,
link |
01:38:00.240
they're gonna be able to hear what you hear
link |
01:38:01.540
because the microphones and all that
link |
01:38:03.440
are gonna be right around where your ears are.
link |
01:38:05.800
So you're gonna want an AI assistant,
link |
01:38:08.160
that's a new kind of AI assistant
link |
01:38:10.240
that can basically help you process the world
link |
01:38:13.880
from this first person perspective
link |
01:38:17.040
or from the perspective that you have.
link |
01:38:18.580
And the utility of that is gonna be huge,
link |
01:38:21.800
but the kinds of AI models that we're gonna need
link |
01:38:25.440
are going to be just,
link |
01:38:28.640
I don't know, there's a lot that we're gonna need
link |
01:38:30.040
to basically make advances in.
link |
01:38:31.840
But I mean, but that's why I think these concepts
link |
01:38:33.760
of the metaverse and the advances in AI
link |
01:38:36.600
are so fundamentally interlinked
link |
01:38:40.200
that I mean, they're kind of enabling each other.
link |
01:38:42.880
Yeah, like the world builder is a really cool idea.
link |
01:38:45.440
Like you can be like a Bob Ross,
link |
01:38:47.240
like I'm gonna put a little tree right here.
link |
01:38:49.120
Yeah.
link |
01:38:49.940
I need a little tree, it's missing a little tree.
link |
01:38:51.200
And then, but at scale,
link |
01:38:52.960
like enriching your experience in all kinds of ways.
link |
01:38:55.680
You mentioned the assistant too,
link |
01:38:56.960
that's really interesting how you can have AI assistants
link |
01:39:00.020
helping you out on different levels
link |
01:39:01.640
of sort of intimacy of communication.
link |
01:39:04.000
It could be just like scheduling
link |
01:39:05.480
or it could be like almost like therapy.
link |
01:39:08.120
Clearly I need some.
link |
01:39:09.880
So let me ask you,
link |
01:39:11.160
you're one of the most successful people ever.
link |
01:39:14.000
You've built an incredible company
link |
01:39:16.240
that has a lot of impact.
link |
01:39:18.080
What advice do you have for young people today?
link |
01:39:23.120
How to live a life they can be proud of?
link |
01:39:25.480
How to build something that can have a big positive impact
link |
01:39:30.420
on the world?
link |
01:39:31.260
Well, let's break that down.
link |
01:39:37.460
Cause I think a life you can be proud of, having a big positive impact,
link |
01:39:41.300
Well, you're actually listening.
link |
01:39:42.320
And how to live your life
link |
01:39:43.880
are actually three different things that I think,
link |
01:39:47.580
I mean, they could line up,
link |
01:39:48.900
but, and also like what age of people are you talking to?
link |
01:39:52.460
Cause I mean, I can like.
link |
01:39:53.500
High school and college.
link |
01:39:54.700
So you don't really know what you're doing,
link |
01:39:56.500
but you dream big.
link |
01:39:58.220
And you really have a chance to do something unprecedented.
link |
01:40:02.280
Yeah.
link |
01:40:04.180
So I guess just to.
link |
01:40:05.020
Also for people my age.
link |
01:40:06.300
Okay, so let's maybe start with the kind of most
link |
01:40:09.620
philosophical and abstract version of this.
link |
01:40:12.060
Every night when I put my daughters to bed,
link |
01:40:16.220
we go through this thing and like,
link |
01:40:20.380
they call it the good night things.
link |
01:40:21.740
Cause that's basically what we talk about at night.
link |
01:40:25.440
And I just, I go through them.
link |
01:40:29.700
Sounds like a good show.
link |
01:40:31.460
The good night things.
link |
01:40:32.700
Yeah.
link |
01:40:33.540
Priscilla's always asking, she's like,
link |
01:40:34.360
can I get good night things?
link |
01:40:35.200
Like, I don't know.
link |
01:40:36.040
You go to bed too early.
link |
01:40:37.100
But it's,
link |
01:40:41.580
but I basically go through with Max and Augie,
link |
01:40:46.340
what are the things that are most important in life?
link |
01:40:48.940
Right.
link |
01:40:49.780
That I just, it's like, what do I want them to remember
link |
01:40:51.560
and just have like really ingrained in them as they grow up?
link |
01:40:53.940
And it's health, right?
link |
01:40:56.740
Making sure that you take care of yourself
link |
01:40:58.800
and keep yourself in good shape,
link |
01:41:00.700
loving friends and family, right?
link |
01:41:02.940
Because having the relationships,
link |
01:41:05.380
the family and making time for friends,
link |
01:41:08.700
I think is perhaps one of the most important things.
link |
01:41:13.820
And then the third is maybe a little more amorphous,
link |
01:41:16.040
but it is something that you're excited about for the future.
link |
01:41:19.420
And when I'm talking to a four year old,
link |
01:41:21.300
often I'll ask her what she's excited about
link |
01:41:23.500
for tomorrow or the week ahead.
link |
01:41:25.220
But I think for most people, it's really hard.
link |
01:41:29.580
I mean, the world is a heavy place.
link |
01:41:31.440
And I think like the way that we navigate it
link |
01:41:34.780
is that we have things that we're looking forward to.
link |
01:41:37.340
So whether it is building AR glasses for the future
link |
01:41:41.620
or being able to celebrate my 10 year wedding anniversary
link |
01:41:45.520
with my wife that's coming up,
link |
01:41:47.380
it's like, I think people,
link |
01:41:48.620
you know, you have things that you're looking forward to.
link |
01:41:51.860
Or for the girls, it's often I want to see mom
link |
01:41:53.860
in the morning, right?
link |
01:41:54.700
It's just, but it's like that's a really critical thing.
link |
01:41:57.020
And then the last thing is I ask them every day,
link |
01:42:00.340
what did you do today to help someone?
link |
01:42:04.340
Because I just think that that's a really critical thing
link |
01:42:07.140
is like, it's easy to kind of get caught up in yourself
link |
01:42:10.740
and kind of stuff that's really far down the road,
link |
01:42:14.300
but like, did you do something just concrete today
link |
01:42:17.520
to help someone?
link |
01:42:18.360
And, you know, it can just be as simple as, okay, yeah,
link |
01:42:21.060
I helped set the table for lunch, right?
link |
01:42:23.420
Or, you know, this other kid in our school
link |
01:42:26.440
was having a hard time with something
link |
01:42:27.900
and I like helped explain it to him.
link |
01:42:29.260
But it's sort of like,
link |
01:42:32.940
if you were to boil down my overall life philosophy
link |
01:42:36.080
into what I try to impart to my kids,
link |
01:42:40.140
those are the things that I think are really important.
link |
01:42:43.000
So, okay, so let's say college.
link |
01:42:44.320
So if you're a graduate in college,
link |
01:42:45.860
probably more practical advice, I'm always very focused
link |
01:42:52.340
on people.
link |
01:42:53.180
And I think the most important decision
link |
01:42:57.140
you're probably gonna make if you're in college
link |
01:42:59.320
is who you surround yourself with,
link |
01:43:01.620
because you become like the people
link |
01:43:02.960
you surround yourself with.
link |
01:43:04.620
And I sort of have this hiring heuristic at Meta,
link |
01:43:09.620
which is that I will only hire someone to work for me
link |
01:43:13.620
if I could see myself working for them.
link |
01:43:17.220
Not necessarily that I want them to run the company
link |
01:43:19.020
because I like my job, but in an alternate universe,
link |
01:43:22.420
if it was their company and I was looking
link |
01:43:23.980
to go work somewhere, would I be happy to work for them?
link |
01:43:27.060
And I think that that's a helpful heuristic
link |
01:43:31.220
to help balance, you know,
link |
01:43:33.060
when you're building something like this,
link |
01:43:33.900
there's a lot of pressure to, you know,
link |
01:43:36.060
you wanna build out your team,
link |
01:43:37.380
because there's a lot of stuff that you need to get done.
link |
01:43:39.620
And everyone always says, don't compromise on quality,
link |
01:43:41.860
but there's this question of, okay,
link |
01:43:42.940
well, how do you know that someone is good enough?
link |
01:43:44.020
And I think my answer is, I would want someone
link |
01:43:46.860
to be on my team if I would work for them.
link |
01:43:50.660
But I think it's actually a pretty similar answer
link |
01:43:53.300
to like, if you were choosing friends or a partner
link |
01:43:58.300
or something like that.
link |
01:43:59.460
So when you're kind of in college,
link |
01:44:01.900
trying to figure out what your circle is gonna be,
link |
01:44:03.580
trying to figure out, you know,
link |
01:44:04.420
link |
01:44:07.620
you're evaluating different job opportunities.
link |
01:44:09.300
Who are the people, even if they're gonna be peers
link |
01:44:12.980
in what you're doing,
link |
01:44:14.420
who are the people who, in an alternate universe,
link |
01:44:17.020
you would wanna work for them,
link |
01:44:18.700
because you think you're gonna learn a lot from them,
link |
01:44:20.420
because they are kind of values aligned
link |
01:44:24.180
on the things that you care about,
link |
01:44:25.500
and they're gonna push you,
link |
01:44:28.260
but also they know different things
link |
01:44:29.500
and have different experiences
link |
01:44:30.660
that are kind of more of what you wanna become like
link |
01:44:32.940
over time.
link |
01:44:33.780
But I don't know, I think probably people are too,
link |
01:44:37.020
in general, objective focused,
link |
01:44:39.100
and maybe not focused enough on the connections
link |
01:44:42.700
and the people who they're basically building relationships
link |
01:44:46.620
with.
link |
01:44:47.460
I don't know what it says about me,
link |
01:44:48.300
but my place in Austin now has seven legged robots.
link |
01:44:53.540
So I'm surrounded myself by robots,
link |
01:44:55.420
which is probably something I should look into.
link |
01:44:59.140
What kind of world would you like to see your daughters
link |
01:45:02.500
grow up in, even after you're gone?
link |
01:45:09.300
Well, I think one of the promises of all the stuff
link |
01:45:11.500
that is getting built now is that it can be a world
link |
01:45:15.580
where more people can just live out their imagination.
link |
01:45:21.780
One of my favorite quotes,
link |
01:45:23.420
I think it was attributed to Picasso,
link |
01:45:25.140
it's that all children are artists,
link |
01:45:26.780
and the challenge is how do you remain one
link |
01:45:28.340
when you grow up?
link |
01:45:29.580
And if you have kids, this is pretty clear,
link |
01:45:33.620
I mean, they just have wonderful imaginations.
link |
01:45:36.260
And part of what I think is gonna be great
link |
01:45:38.980
about the creator economy and the metaverse
link |
01:45:41.380
and all this stuff is this notion around
link |
01:45:44.740
that a lot more people in the future
link |
01:45:46.300
are gonna get to work doing creative stuff
link |
01:45:49.020
than what I think today we would just consider
link |
01:45:51.420
traditional labor or service.
link |
01:45:53.740
And I think that that's awesome.
link |
01:45:56.300
And that's a lot of what people are here to do
link |
01:46:00.140
is collaborate together, work together,
link |
01:46:03.140
think of things that you wanna build and go do it.
link |
01:46:06.420
And I don't know, one of the things
link |
01:46:08.540
that I just think is striking,
link |
01:46:09.380
so I teach my daughters some basic coding with Scratch.
link |
01:46:13.660
I mean, they're still obviously really young,
link |
01:46:15.420
but I think of coding as building,
link |
01:46:18.340
where it's like when I'm coding,
link |
01:46:19.780
I'm building something that I want to exist.
link |
01:46:22.300
But my youngest daughter, she's very musical
link |
01:46:27.980
and pretty artistic and she thinks about coding as art.
link |
01:46:32.820
She calls it code art, not the code,
link |
01:46:35.540
but the output of what she is making.
link |
01:46:37.540
It's like, she's just very interested visually
link |
01:46:39.340
in what she can kind of output and how it can move around.
link |
01:46:42.580
And do we need to fix that?
link |
01:46:45.020
Are we good?
link |
01:46:45.860
What happened?
link |
01:46:47.460
Do we have to clap, Alexa?
link |
01:46:49.460
Yeah, so I was just talking about Augie and her code art,
link |
01:46:53.020
but I mean, to me, this is like a beautiful thing, right?
link |
01:46:56.540
The notion that like for me,
link |
01:46:58.700
coding was this functional thing and I enjoyed it.
link |
01:47:01.380
And it like helped build something utilitarian,
link |
01:47:04.620
but that for the next generation of people,
link |
01:47:06.820
it will be even more an expression
link |
01:47:10.460
of their kind of imagination and artistic sense
link |
01:47:14.900
for what they want to exist.
link |
01:47:15.940
So I don't know if that happens,
link |
01:47:17.620
if we can help bring about this world
link |
01:47:20.420
where a lot more people can,
link |
01:47:23.580
that that's like their existence going forward
link |
01:47:25.900
is being able to basically create
link |
01:47:28.620
and live out all these different kinds of art.
link |
01:47:32.940
I just think that that's like a beautiful
link |
01:47:34.420
and wonderful thing and will be very freeing for humanity
link |
01:47:37.940
to spend more of our time on the things that matter to us.
link |
01:47:40.420
Yeah, allow more and more people to express their art
link |
01:47:43.140
in the full meaning of that word.
link |
01:47:45.140
That's a beautiful vision.
link |
01:47:46.900
We mentioned that you are mortal.
link |
01:47:50.260
Are you afraid of death?
link |
01:47:51.900
Do you think about your mortality?
link |
01:47:56.300
And are you afraid of it?
link |
01:48:01.220
You didn't sign up for this on a podcast, did you?
link |
01:48:03.100
No, I mean, it's an interesting question.
link |
01:48:07.060
I mean, I'm definitely aware of it.
link |
01:48:08.700
I do a fair amount of like extreme sport type stuff.
link |
01:48:13.700
So like, so I'm definitely aware of it.
link |
01:48:19.700
And you're flirting with it a bit.
link |
01:48:22.220
I train hard.
link |
01:48:23.700
I mean, so it's like, if I'm gonna go out
link |
01:48:25.100
in like a 15 foot wave.
link |
01:48:27.700
Go out big.
link |
01:48:28.540
Well, then it's like, all right,
link |
01:48:29.700
I'll make sure we have the right safety gear
link |
01:48:31.540
and like make sure that I'm like used to that spot
link |
01:48:34.540
and all that stuff.
link |
01:48:35.580
But like, but you know, I mean, you.
link |
01:48:37.740
The risk is still there.
link |
01:48:38.940
You take some head blows along the way.
link |
01:48:40.780
Yes, but definitely aware of it.
link |
01:48:45.340
Definitely would like to stay safe.
link |
01:48:48.020
I have a lot of stuff that I want to build and want to.
link |
01:48:52.020
Does it freak you out that it's finite though?
link |
01:48:55.460
That there's a deadline when it's all over
link |
01:48:59.180
and that there'll be a time when your daughters are around
link |
01:49:01.500
and you're gone?
link |
01:49:03.060
I don't know.
link |
01:49:03.900
That doesn't freak me out.
link |
01:49:04.940
I think, I don't know.
link |
01:49:09.780
Constraints are helpful.
link |
01:49:16.220
Yeah.
link |
01:49:17.300
Yeah, the finiteness makes ice cream
link |
01:49:20.260
taste more delicious somehow.
link |
01:49:21.740
The fact that it's gonna be over.
link |
01:49:23.140
There's something about that with the metaverse too.
link |
01:49:25.700
You want, we talked about this identity earlier,
link |
01:49:28.500
like having just one, like NFTs.
link |
01:49:30.260
There's something powerful about the constraint
link |
01:49:34.340
of finiteness or uniqueness.
link |
01:49:36.900
That this moment is singular in history.
link |
01:49:39.740
But I mean, a lot of,
link |
01:49:41.060
as you go through different waves of technology,
link |
01:49:42.700
I think a lot of what is interesting is
link |
01:49:44.340
what becomes in practice infinite
link |
01:49:48.020
or kind of there can be many, many of a thing
link |
01:49:51.500
and then what ends up still being constrained.
link |
01:49:53.660
So the metaverse should hopefully allow
link |
01:50:00.340
a very large number or maybe in practice,
link |
01:50:04.220
hopefully close to an infinite amount of expression
link |
01:50:06.860
and worlds, but we'll still only have
link |
01:50:09.700
a finite amount of time.
link |
01:50:11.220
Yes.
link |
01:50:12.060
I think living longer I think is good.
link |
01:50:18.020
And obviously all of my, our philanthropic work is,
link |
01:50:21.700
it's not focused on longevity,
link |
01:50:23.380
but it is focused on trying to achieve
link |
01:50:25.940
what I think is a possible goal in this century,
link |
01:50:29.620
which is to be able to cure, prevent
link |
01:50:31.020
or manage all diseases.
link |
01:50:33.460
So I certainly think people kind of getting sick
link |
01:50:36.140
and dying is a bad thing,
link |
01:50:37.740
and I'm dedicating almost all of my capital
link |
01:50:40.260
towards advancing research in that area to push on that,
link |
01:50:44.500
which I mean, we could do a whole,
link |
01:50:45.460
another one of these podcasts about that
link |
01:50:46.940
because that's a fascinating topic.
link |
01:50:49.660
I mean, this is with your wife Priscilla Chan,
link |
01:50:51.740
you formed the Chan Zuckerberg Initiative,
link |
01:50:54.140
gave away 99% or pledged to give away 99%
link |
01:50:57.020
of your Facebook, now Meta, shares.
link |
01:50:59.260
I mean, like you said, we could talk forever
link |
01:51:01.980
about all the exciting things you're working on there,
link |
01:51:06.100
including the sort of moonshot of eradicating disease
link |
01:51:11.300
by the mid century marker.
link |
01:51:13.260
I don't actually know if you're gonna ever eradicate it,
link |
01:51:15.420
but I think you can get to a point where you
link |
01:51:17.980
can either cure things that happened, right?
link |
01:51:20.900
So people get diseases, but you can cure them.
link |
01:51:22.940
Prevent is probably closest to eradication
link |
01:51:25.620
or just be able to manage as sort of like ongoing things
link |
01:51:28.860
that are not gonna ruin your life.
link |
01:51:33.260
And I think that that's possible.
link |
01:51:34.300
I think saying that there's gonna be no disease at all
link |
01:51:37.060
probably is not possible within the next several decades.
link |
01:51:41.540
Basic thing is increase the quality of life
link |
01:51:44.300
and maybe keep the finiteness
link |
01:51:46.820
because it makes everything taste more delicious.
link |
01:51:50.180
Maybe that's just being a romantic 20th century human.
link |
01:51:54.740
Maybe, but I mean, but it was an intentional decision
link |
01:51:57.140
to not focus on our philanthropy on like explicitly
link |
01:52:01.780
on longevity or living forever.
link |
01:52:03.460
Yes.
link |
01:52:06.980
If at the moment of your death, and by the way,
link |
01:52:09.060
I like that the lights went out
link |
01:52:11.540
when we started talking about death.
link |
01:52:13.380
You get to meet God.
link |
01:52:14.220
It does make it a lot more dramatic.
link |
01:52:15.660
It does.
link |
01:52:17.940
I should get closer to the mic.
link |
01:52:19.740
At the moment of your death, you get to meet God
link |
01:52:23.020
and you get to ask one question.
link |
01:52:26.140
What question would you like to ask?
link |
01:52:29.820
Or maybe a whole conversation.
link |
01:52:31.180
I don't know.
link |
01:52:32.020
It's up to you.
link |
01:52:32.860
It's more dramatic when it's just one question.
link |
01:52:37.100
Well, if it's only one question and I died,
link |
01:52:42.980
I would just wanna know that Priscilla and my family,
link |
01:52:48.020
like if they were gonna be okay.
link |
01:52:50.900
That might depend on the circumstances of my death.
link |
01:52:54.620
But I think that in most circumstances that I can think of,
link |
01:52:58.060
that's probably the main thing that I would care about.
link |
01:53:01.100
Yeah, I think God will hear that question and be like,
link |
01:53:02.820
all right, fine, you get in.
link |
01:53:04.260
That's the right question to ask.
link |
01:53:06.700
Is it?
link |
01:53:07.540
I don't know.
link |
01:53:08.380
The humility and selfishness.
link |
01:53:09.580
All right, you're in.
link |
01:53:10.820
I mean, but well, maybe.
link |
01:53:14.540
They're gonna be fine.
link |
01:53:15.380
Don't worry, you're in.
link |
01:53:16.220
Okay, but I mean, one of the things that I think
link |
01:53:18.220
I struggle with at least is on the one hand,
link |
01:53:22.300
that's probably the thing that's closest to me
link |
01:53:25.620
and maybe the most common human experience.
link |
01:53:29.420
But I don't know, one of the things that I just struggle with
link |
01:53:32.380
in terms of running this large enterprise is like,
link |
01:53:38.100
should the thing that I care more about
link |
01:53:41.020
be that responsibility?
link |
01:53:44.860
And I think it's shifted over time.
link |
01:53:49.300
I mean, like before I really had a family
link |
01:53:52.060
that was like the only thing I cared about.
link |
01:53:53.860
And at this point, I mean, I care deeply about it,
link |
01:53:59.980
but yeah, I think that that's not as obvious of a question.
link |
01:54:06.060
Yeah, we humans are weird.
link |
01:54:07.860
You get this ability to impact millions of lives
link |
01:54:12.780
and it's definitely something, billions of lives,
link |
01:54:15.660
it's something you care about,
link |
01:54:16.980
but the weird humans that are closest to us,
link |
01:54:21.100
those are the ones that mean the most.
link |
01:54:23.700
And I suppose that's the dream of the metaverse
link |
01:54:26.140
is to connect, form small groups like that
link |
01:54:29.300
where you can have those intimate relationships.
link |
01:54:31.700
Let me ask you the big, ridiculous.
link |
01:54:33.620
Well, and to be able to be close,
link |
01:54:36.940
not just based on who you happen to be next to.
link |
01:54:39.900
I think that's what the internet is already doing
link |
01:54:41.980
is allowing you to spend more of your time
link |
01:54:44.540
not physically proximate.
link |
01:54:46.540
I mean, I always think when you think about the metaverse,
link |
01:54:49.940
people ask this question about the real world.
link |
01:54:52.140
It's like the virtual world versus the real world.
link |
01:54:54.860
And it's like, no, the real world is a combination
link |
01:54:58.180
of the virtual world and the physical world.
link |
01:55:00.100
But I think over time, as we get more technology,
link |
01:55:04.060
the physical world is becoming less of a percent
link |
01:55:06.780
of the real world.
link |
01:55:08.180
And I think that that opens up a lot of opportunities
link |
01:55:10.940
for people, because you can work in different places.
link |
01:55:13.460
You can stay closer to people
link |
01:55:17.580
who are in different places.
link |
01:55:18.420
So I think that's good.
link |
01:55:19.300
Removing barriers of geography
link |
01:55:21.380
and then barriers of language.
link |
01:55:23.140
That's a beautiful vision.
link |
01:55:25.860
Big, ridiculous question.
link |
01:55:27.580
What do you think is the meaning of life?
link |
01:55:44.100
I think that, well, there are probably a couple
link |
01:55:46.560
of different ways that I would go at this.
link |
01:55:52.020
But I think it gets back to this last question
link |
01:55:53.900
that we talked about, about the duality
link |
01:55:55.380
between you have the people around you
link |
01:55:58.660
who you care the most about,
link |
01:56:00.380
and then there's like this bigger thing
link |
01:56:03.060
that maybe you're building.
link |
01:56:05.820
And I think that in my own life, I mean,
link |
01:56:07.340
I sort of think about this tension,
link |
01:56:09.360
but I mean, it's like, I started this whole company
link |
01:56:11.860
and my life's work is around human connection.
link |
01:56:15.100
So I think it's intellectually probably the thing
link |
01:56:22.100
that I go to first is just that human connection
link |
01:56:27.720
is the meaning.
link |
01:56:29.560
And I mean, I think that it's a thing
link |
01:56:31.160
that our society probably systematically undervalues.
link |
01:56:36.860
I mean, I just remember when I was growing up
link |
01:56:39.280
and in school, it's like, do your homework
link |
01:56:43.140
and then go play with your friends after.
link |
01:56:45.200
And it's like, no, well, what if playing
link |
01:56:47.020
with your friends is the point?
link |
01:56:50.180
That sounds like an argument your daughter would make.
link |
01:56:52.340
Well, I mean, I don't know, I just think it's interesting.
link |
01:56:54.620
Homework doesn't even matter, man.
link |
01:56:56.340
Well, I think it's interesting because it's,
link |
01:56:58.540
and people, I think people tend to think
link |
01:57:02.700
about that stuff as wasting time,
link |
01:57:05.180
or that's like what you do in the free time that you have.
link |
01:57:08.100
But like, what if that's actually the point?
link |
01:57:11.140
So that's one.
link |
01:57:12.580
But here's maybe a different way of coming at this,
link |
01:57:14.760
which is maybe more like religious in nature.
link |
01:57:17.880
I mean, I always like,
link |
01:57:22.200
there's a rabbi who I've studied with
link |
01:57:25.400
who kind of gave me this,
link |
01:57:27.980
we were talking through Genesis and the Bible and the Torah
link |
01:57:31.820
and they're basically walking through,
link |
01:57:36.100
it's like, okay, you go through the seven days of creation
link |
01:57:40.740
and it's basically, it's like,
link |
01:57:45.660
why does the Bible start there?
link |
01:57:48.100
Right, it's like it could have started anywhere,
link |
01:57:49.460
right, in terms of like how to live.
link |
01:57:52.000
But basically it starts with talking about
link |
01:57:54.620
how God created people in his, her image.
link |
01:58:00.560
But the Bible starts by talking about
link |
01:58:02.720
how God created everything.
link |
01:58:04.740
So I actually think that there's like a compelling argument
link |
01:58:11.240
that I think I've always just found meaningful
link |
01:58:12.980
and inspiring that a lot of the point
link |
01:58:18.400
of what sort of religion has been telling us
link |
01:58:22.980
that we should do is to create and build things.
link |
01:58:30.060
So these things are not necessarily at odds.
link |
01:58:32.100
I mean, I think like, I mean, that's,
link |
01:58:34.780
and I think probably to some degree
link |
01:58:36.100
you'd expect me to say something like this
link |
01:58:37.720
because I've dedicated my life to creating things
link |
01:58:39.760
that help people connect.
link |
01:58:40.600
So, I mean, that's sort of the fusion of,
link |
01:58:43.780
I mean, getting back to what we talked about earlier,
link |
01:58:45.260
it's, I mean, what I studied in school
link |
01:58:46.500
was psychology and computer science, right?
link |
01:58:48.260
So it's, I mean, these are like the two themes
link |
01:58:50.580
that I care about, but I don't know for me,
link |
01:58:54.460
that's kind of what I think about, that's what matters.
link |
01:58:57.140
To create and to love, which is the ultimate form
link |
01:59:02.200
of connection.
link |
01:59:03.980
I think this is one hell of an amazing replay experience
link |
01:59:07.220
in the metaverse.
link |
01:59:08.060
So whoever is using our avatars years from now,
link |
01:59:11.980
I hope you had fun and thank you for talking today.
link |
01:59:14.780
Thank you.
link |
01:59:16.460
Thanks for listening to this conversation
link |
01:59:18.180
with Mark Zuckerberg.
link |
01:59:19.500
To support this podcast, please check out our sponsors
link |
01:59:22.060
in the description.
link |
01:59:23.660
And now, let me leave you with the end of the poem, If,
link |
01:59:27.820
by Rudyard Kipling.
link |
01:59:30.820
If you can talk with crowds and keep your virtue,
link |
01:59:34.160
or walk with kings, nor lose the common touch,
link |
01:59:37.820
if neither foes nor loving friends can hurt you,
link |
01:59:41.220
if all men count with you, but none too much.
link |
01:59:47.100
If you can fill the unforgiving minute
link |
01:59:49.260
with sixty seconds' worth of distance run,
link |
01:59:52.340
yours is the earth and everything that's in it.
link |
01:59:56.340
And which is more, you'll be a man, my son.
link |
01:59:59.580
Thank you for listening and hope to see you next time.