
Mark Zuckerberg: Meta, Facebook, Instagram, and the Metaverse | Lex Fridman Podcast #267



link |
00:00:00.000
Let's talk about free speech and censorship.
link |
00:00:02.560
You don't build a company like this
link |
00:00:04.080
unless you believe that people expressing themselves
link |
00:00:06.440
is a good thing.
link |
00:00:07.280
Let me ask you as a father,
link |
00:00:08.640
does it weigh heavy on you that
link |
00:00:10.480
people get bullied on social networks.
link |
00:00:13.000
I care a lot about how people feel
link |
00:00:14.440
when they use our products.
link |
00:00:15.440
And I don't want to build products
link |
00:00:18.120
that make people angry.
link |
00:00:19.320
Why do you think so many people dislike you?
link |
00:00:23.440
Some even hate you.
link |
00:00:25.580
And how do you regain their trust and support?
link |
00:00:30.160
The following is a conversation with Mark Zuckerberg,
link |
00:00:32.720
CEO of Facebook, now called Meta.
link |
00:00:36.760
Please allow me to say a few words
link |
00:00:38.720
about this conversation with Mark Zuckerberg,
link |
00:00:41.320
about social media,
link |
00:00:42.680
and about what troubles me in the world today.
link |
00:00:45.480
And what gives me hope.
link |
00:00:47.760
If this is not interesting to you,
link |
00:00:49.520
I understand, please skip.
link |
00:00:52.720
I believe that at its best,
link |
00:00:55.040
social media puts a mirror to humanity
link |
00:00:57.800
and reveals the full complexity of our world.
link |
00:01:01.080
Shining a light on the dark aspects of human nature
link |
00:01:04.080
and giving us hope,
link |
00:01:05.620
a way out through the compassionate
link |
00:01:07.640
but tense chaos of conversation
link |
00:01:09.800
that eventually can turn into understanding,
link |
00:01:12.800
friendship, and even love.
link |
00:01:15.600
But this is not simple.
link |
00:01:17.440
Our world is not simple.
link |
00:01:19.560
It is full of human suffering.
link |
00:01:22.360
I think about the hundreds of millions of people
link |
00:01:24.480
who are starving and who live in extreme poverty,
link |
00:01:28.480
the one million people who take their own life every year,
link |
00:01:31.600
the 20 million people who attempted it,
link |
00:01:33.920
and the many, many more millions who suffer quietly
link |
00:01:37.400
in ways that numbers can never know.
link |
00:01:40.760
I'm troubled by the cruelty and pain of war.
link |
00:01:44.640
Today, my heart goes out to the people of Ukraine.
link |
00:01:48.560
My grandfather spilled his blood on this land,
link |
00:01:52.240
held the line as a machine gunner
link |
00:01:54.000
against the Nazi invasion, surviving impossible odds.
link |
00:01:59.160
I am nothing without him.
link |
00:02:01.360
His blood runs in my blood.
link |
00:02:04.960
My words are useless here.
link |
00:02:07.560
I send my love, it's all I have.
link |
00:02:11.120
I hope to travel to Russia and Ukraine soon.
link |
00:02:14.080
I will speak to citizens and leaders,
link |
00:02:16.760
including Vladimir Putin.
link |
00:02:19.960
As I've said in the past, I don't care about access,
link |
00:02:22.640
fame, money, or power, and I'm afraid of nothing.
link |
00:02:27.640
But I am who I am, and my goal in conversation
link |
00:02:31.120
is to understand the human being before me,
link |
00:02:33.440
no matter who they are, no matter their position.
link |
00:02:36.520
And I do believe the line between good and evil
link |
00:02:40.040
runs through the heart of every man.
link |
00:02:43.520
So this is it.
link |
00:02:45.360
This is our world.
link |
00:02:47.400
It is full of hate, violence, and destruction.
link |
00:02:50.640
But it is also full of love, beauty,
link |
00:02:54.040
and the insatiable desire to help each other.
link |
00:02:57.960
The people who run the social networks
link |
00:03:00.280
that show this world, that show us to ourselves,
link |
00:03:04.280
have the greatest of responsibilities.
link |
00:03:07.360
In a time of war, pandemic, atrocity,
link |
00:03:10.640
we turn to social networks to share real human insights
link |
00:03:13.600
and experiences, to organize protests and celebrations,
link |
00:03:17.600
to learn and to challenge our understanding
link |
00:03:20.920
of the world, of our history and of our future,
link |
00:03:24.240
and above all, to be reminded of our common humanity.
link |
00:03:28.440
When the social networks fail,
link |
00:03:30.440
they have the power to cause immense suffering.
link |
00:03:33.560
And when they succeed,
link |
00:03:34.960
they have the power to lessen that suffering.
link |
00:03:37.760
This is hard.
link |
00:03:39.240
It's a responsibility, perhaps,
link |
00:03:41.000
almost unlike any other in history.
link |
00:03:44.000
This podcast conversation attempts to understand the man
link |
00:03:47.160
and the company who take this responsibility on,
link |
00:03:50.600
where they fail and where they hope to succeed.
link |
00:03:54.200
Mark Zuckerberg's feet are often held to the fire,
link |
00:03:57.880
as they should be, and this actually gives me hope.
link |
00:04:01.480
The power of innovation and engineering
link |
00:04:03.840
coupled with the freedom of speech
link |
00:04:05.480
in the form of its highest ideal,
link |
00:04:07.640
I believe, can solve any problem in the world.
link |
00:04:11.080
But that's just it.
link |
00:04:12.600
Both are necessary.
link |
00:04:14.720
The engineer and the critic.
link |
00:04:17.760
I believe that criticism is essential,
link |
00:04:20.520
but cynicism is not.
link |
00:04:23.240
And I worry that in our public discourse,
link |
00:04:25.640
cynicism too easily masquerades as wisdom, as truth,
link |
00:04:30.320
becomes viral and takes over,
link |
00:04:32.160
and, worse, suffocates the dreams of young minds
link |
00:04:35.400
who want to build solutions to the problems of the world.
link |
00:04:39.200
We need to inspire those young minds.
link |
00:04:41.480
At least for me, they give me hope.
link |
00:04:43.680
And one small way I'm trying to contribute
link |
00:04:46.520
is to have honest conversations like these
link |
00:04:48.800
that don't just ride the viral wave of cynicism,
link |
00:04:52.520
but seek to understand the failures and successes
link |
00:04:54.800
of the past, the problems before us,
link |
00:04:57.160
and the possible solutions
link |
00:04:58.840
in this very complicated world of ours.
link |
00:05:01.760
I'm sure I will fail often.
link |
00:05:05.080
And I count on the critic to point it out when I do.
link |
00:05:09.520
But I ask for one thing,
link |
00:05:11.920
and that is to fuel the fire of optimism,
link |
00:05:14.600
especially in those who dream to build solutions.
link |
00:05:17.760
Because without that, we don't have a chance
link |
00:05:21.200
on this too fragile, tiny planet of ours.
link |
00:05:25.240
This is the Lex Fridman Podcast.
link |
00:05:27.480
To support it, please check out our sponsors
link |
00:05:29.720
in the description.
link |
00:05:31.080
And now, dear friends, here's Mark Zuckerberg.
link |
00:05:35.080
Can you circle all the traffic lights, please?
link |
00:05:48.080
You actually did it.
link |
00:05:49.080
That is a very impressive performance.
link |
00:05:51.080
Okay, now we can initiate the interview procedure.
link |
00:05:54.080
Is it possible that this conversation is happening
link |
00:05:57.080
inside the metaverse created by you
link |
00:06:00.080
and not in the real world?
link |
00:06:02.080
It's happening inside the metaverse created by you,
link |
00:06:05.080
by Meta, many years from now,
link |
00:06:07.080
and we're doing a memory replay experience.
link |
00:06:10.080
I don't know the answer to that.
link |
00:06:11.080
Then I'd be some computer construct
link |
00:06:15.080
and not the person who created the Meta company.
link |
00:06:19.080
But that would truly be meta.
link |
00:06:21.080
Right, so this could be somebody else
link |
00:06:23.080
using the Mark Zuckerberg avatar.
link |
00:06:26.080
You can do the Mark and the Lex conversation replay
link |
00:06:29.080
from four decades ago.
link |
00:06:31.080
I mean, it's not going to be four decades
link |
00:06:34.080
before we have photo realistic avatars like this.
link |
00:06:37.080
So I think we're much closer to that.
link |
00:06:39.080
Well, that's something you talk about
link |
00:06:41.080
is how passionate you are about the idea
link |
00:06:43.080
of the avatar representing who you are in the metaverse.
link |
00:06:46.080
So I do these podcasts in person.
link |
00:06:50.080
I'm a stickler for that because there's a magic
link |
00:06:53.080
to the in person conversation.
link |
00:06:55.080
How long do you think it'll be before
link |
00:06:57.080
you can have the same kind of magic in the metaverse,
link |
00:07:00.080
the same kind of intimacy in the chemistry,
link |
00:07:02.080
whatever the heck it's there when we're talking in person.
link |
00:07:05.080
How difficult is it, how long before we have it in the metaverse?
link |
00:07:10.080
Well, I think this is like the key question, right?
link |
00:07:13.080
Because the thing that's different about virtual
link |
00:07:17.080
and hopefully augmented reality
link |
00:07:19.080
compared to all other forms of digital platforms before
link |
00:07:22.080
is this feeling of presence, right?
link |
00:07:24.080
The feeling that you're right, that you're in an experience
link |
00:07:26.080
and that you're there with other people or in another place.
link |
00:07:29.080
And that's just different from all the other screens
link |
00:07:32.080
that we have today, right?
link |
00:07:34.080
Phones, TVs, all the stuff.
link |
00:07:36.080
They're trying to, in some cases, deliver experiences
link |
00:07:39.080
that feel high fidelity,
link |
00:07:42.080
but at no point do you actually feel like you're in it, right?
link |
00:07:46.080
At some level, your content is trying to sort of convince you
link |
00:07:49.080
that this is a realistic thing that's happening,
link |
00:07:52.080
but all of the kind of subtle signals
link |
00:07:54.080
are telling you now you're looking at a screen.
link |
00:07:56.080
So, the question about how you develop these systems
link |
00:07:59.080
is like, what are all of the things that make
link |
00:08:02.080
the physical world feel real, all the different cues?
link |
00:08:05.080
So, I think on visual presence and spatial audio,
link |
00:08:12.080
we're making reasonable progress.
link |
00:08:15.080
Spatial audio makes a huge difference.
link |
00:08:17.080
I don't know if you've tried this experience,
link |
00:08:19.080
Workrooms, that we launched, where you have meetings.
link |
00:08:22.080
I basically made a rule for all of the top management folks
link |
00:08:27.080
of the company that they need to be doing standing meetings
link |
00:08:29.080
in workrooms already, right?
link |
00:08:31.080
I feel like we got to dog food this.
link |
00:08:33.080
This is how people are going to work in the future,
link |
00:08:35.080
so we have to adopt this now.
link |
00:08:38.080
And there are already a lot of things
link |
00:08:40.080
that I think feel significantly better
link |
00:08:42.080
than typical Zoom meetings.
link |
00:08:44.080
Even though the avatars are a lot lower fidelity,
link |
00:08:48.080
the idea that you have spatial audio,
link |
00:08:50.080
you're around a table in VR with people.
link |
00:08:53.080
If someone's talking from over there,
link |
00:08:55.080
it sounds like it's talking from over there.
link |
00:08:57.080
You can see the arm gestures and stuff feel more natural.
link |
00:09:01.080
You can have side conversations,
link |
00:09:03.080
which is something that you can't really do in Zoom.
link |
00:09:05.080
I mean, I guess you can text someone out of band.
link |
00:09:08.080
But if you're actually sitting around a table with people,
link |
00:09:12.080
you can lean over and whisper to the person next to you
link |
00:09:15.080
and have a conversation that you can't really do
link |
00:09:19.080
with just video communication.
link |
00:09:23.080
So I think it's interesting in what ways
link |
00:09:27.080
some of these things already feel more real
link |
00:09:29.080
than a lot of the technology that we have,
link |
00:09:32.080
even when the visual fidelity isn't quite there,
link |
00:09:35.080
but I think it'll get there over the next few years.
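As an aside for the reader, the "sounds like it's talking from over there" effect described above boils down to giving each ear a different gain based on where the source sits relative to the listener. Here is a minimal, hypothetical Python sketch of a constant-power pan in 2D; real systems would use full HRTFs, and every name and number below is illustrative:

```python
import numpy as np

def stereo_gains(listener_pos, listener_forward, source_pos):
    """Illustrative spatial-audio panning: louder in the ear facing the source.

    Top-down 2D sketch of the basic cue (interaural level difference) that
    makes a voice in VR sound like it comes 'from over there'.
    """
    to_source = np.asarray(source_pos, float) - np.asarray(listener_pos, float)
    to_source /= np.linalg.norm(to_source)
    forward = np.asarray(listener_forward, float)
    forward /= np.linalg.norm(forward)
    # Rightward direction perpendicular to where the listener faces.
    right = np.array([forward[1], -forward[0]])
    pan = np.dot(to_source, right)              # -1 (far left) .. 1 (far right)
    theta = (pan + 1) * np.pi / 4               # constant-power pan law
    return np.cos(theta), np.sin(theta)         # (left_gain, right_gain)

left, right = stereo_gains([0, 0], [0, 1], [3, 1])
print(f"left={left:.2f} right={right:.2f}")     # source on the right: right ear louder
```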
link |
00:09:37.080
Now, I mean, you were asking about comparing that
link |
00:09:39.080
to the true physical world, not Zoom or something like that.
link |
00:09:42.080
And there, I mean, I think you have feelings of, like,
link |
00:09:46.080
temperature, olfactory, obviously touch,
link |
00:09:51.080
we're working on haptic gloves,
link |
00:09:54.080
the sense that you wanna be able to put your hands down
link |
00:09:57.080
and feel some pressure from the table.
link |
00:09:59.080
All these things I think are gonna be really critical
link |
00:10:01.080
to be able to keep up this illusion
link |
00:10:04.080
that you're in a world
link |
00:10:06.080
and that you're fully present in this world.
link |
00:10:08.080
But I think we're gonna have a lot of these building blocks
link |
00:10:11.080
within the next 10 years or so.
link |
00:10:14.080
And even before that, I think it's amazing
link |
00:10:16.080
how much you're just gonna be able to build with software
link |
00:10:18.080
that sort of masks some of these things.
link |
00:10:21.080
I realize I'm going long,
link |
00:10:23.080
but I was told we have a few hours here.
link |
00:10:25.080
Yeah, we're here for five to six hours.
link |
00:10:27.080
Yeah, so I mean, look, I mean,
link |
00:10:29.080
that's on the shorter end of the congressional testimonies
link |
00:10:31.080
I've done.
link |
00:10:32.080
But it's, you know, one of the things that we found
link |
00:10:37.080
with hand presence, right?
link |
00:10:39.080
So the earliest VR, you just have the headset,
link |
00:10:42.080
and then, and that was cool.
link |
00:10:44.080
You could look around, you feel like you're in a place,
link |
00:10:46.080
but you don't feel like you're really able to interact with it
link |
00:10:48.080
until you have hands.
link |
00:10:49.080
And then there was this big question where once you got hands,
link |
00:10:51.080
what's the right way to represent them?
link |
00:10:54.080
And initially, all of our assumptions were,
link |
00:10:58.080
okay, when I look down and see my hands in the physical world,
link |
00:11:00.080
I see an arm, and it's gonna be super weird
link |
00:11:03.080
if you see, you know, just your hand.
link |
00:11:06.080
But it turned out to not be the case
link |
00:11:08.080
because there's this issue with your arms,
link |
00:11:10.080
like, what's your elbow angle?
link |
00:11:12.080
And if the elbow angle that we're kind of interpolating
link |
00:11:14.080
based on where your hand is and where your headset is,
link |
00:11:18.080
is actually inaccurate,
link |
00:11:20.080
it creates this very uncomfortable feeling where it's like,
link |
00:11:22.080
oh, like my arm is actually out like this,
link |
00:11:24.080
but it's like showing it in here,
link |
00:11:26.080
and that actually broke the feeling of presence a lot more.
link |
00:11:29.080
Whereas it turns out that if you just show the hands
link |
00:11:31.080
and you don't show the arms,
link |
00:11:34.080
it actually is fine for people.
link |
00:11:36.080
So I think that there's a bunch of these interesting
link |
00:11:39.080
psychological cues where it'll be more about
link |
00:11:42.080
getting the right details right.
link |
00:11:45.080
And I think a lot of that will be possible even over,
link |
00:11:47.080
you know, a few year period or a five year period,
link |
00:11:49.080
and we won't need like every single thing to be solved
link |
00:11:52.080
to deliver this like full sense of presence.
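The elbow problem Mark describes is essentially two-bone inverse kinematics: the headset and controller pin down roughly where the shoulder and hand are, but the elbow has two mirror-image solutions, so any guess can be wrong. A hypothetical 2D sketch (all names and limb lengths are illustrative, not Meta's implementation):

```python
import math

def elbow_position(shoulder, hand, upper_len, fore_len, bend_sign=1.0):
    """Two-bone IK sketch: infer an elbow from shoulder and hand positions.

    bend_sign picks one of two equally valid mirror solutions; rendering
    the wrong one is exactly the 'my arm is out like this, but it's showing
    it in here' mismatch that breaks presence.
    """
    sx, sy = shoulder
    hx, hy = hand
    d = math.hypot(hx - sx, hy - sy)
    d = min(d, upper_len + fore_len - 1e-9)      # clamp: the arm can't overstretch
    # Law of cosines: angle between the shoulder->hand line and the upper arm.
    a = math.acos((upper_len**2 + d**2 - fore_len**2) / (2 * upper_len * d))
    base = math.atan2(hy - sy, hx - sx)
    ang = base + bend_sign * a
    return (sx + upper_len * math.cos(ang), sy + upper_len * math.sin(ang))

# Two equally valid elbows for the same tracked hand position:
print(elbow_position((0, 0), (0.5, -0.2), 0.3, 0.3, bend_sign=+1))
print(elbow_position((0, 0), (0.5, -0.2), 0.3, 0.3, bend_sign=-1))
```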
link |
00:11:54.080
Yeah, it's a fascinating psychology question of
link |
00:11:56.080
what is the essence that makes in person conversation special?
link |
00:12:04.080
It's like emojis are able to convey emotion really well,
link |
00:12:08.080
even though they're obviously not photorealistic.
link |
00:12:10.080
And so in that same way, just like you're saying,
link |
00:12:12.080
just showing the hands is able to create
link |
00:12:15.080
a comfortable expression with your hands.
link |
00:12:18.080
So I wonder what that is.
link |
00:12:19.080
You know, people in the World Wars used to write letters,
link |
00:12:22.080
and you can fall in love with just writing letters.
link |
00:12:24.080
You don't need to see each other in person.
link |
00:12:26.080
You can convey emotion.
link |
00:12:28.080
You can convey depth of experience with just words.
link |
00:12:33.080
So that's, I think, a fascinating place to explore the psychology
link |
00:12:37.080
of how you find that intimacy.
link |
00:12:39.080
Yeah, and you know, the way that I come to all of this stuff is,
link |
00:12:42.080
you know, I basically studied psychology and computer science.
link |
00:12:45.080
So all of the work that I do is sort of at the intersection
link |
00:12:49.080
of those things.
link |
00:12:50.080
I think most of the other big tech companies
link |
00:12:52.080
are building technology for you to interact with.
link |
00:12:55.080
What I care about is building technology to help people
link |
00:12:57.080
interact with each other.
link |
00:12:58.080
So I think it's a somewhat different approach
link |
00:13:00.080
than most of the other tech entrepreneurs
link |
00:13:02.080
and big companies come at this from.
link |
00:13:06.080
And a lot of the lessons in terms of how I think about designing
link |
00:13:10.080
products come from some just basic elements of psychology,
link |
00:13:15.080
right?
link |
00:13:16.080
In terms of, you know, our brains, you know,
link |
00:13:19.080
you can compare to the brains of other animals,
link |
00:13:21.080
you know, we're very wired to specific things,
link |
00:13:24.080
facial expressions, right?
link |
00:13:25.080
I mean, we're very visual, right?
link |
00:13:28.080
So compared to other animals, I mean, that's clearly
link |
00:13:30.080
the main sense that most people have.
link |
00:13:33.080
But there's a whole part of your brain that's just kind of
link |
00:13:36.080
focused on reading facial cues.
link |
00:13:38.080
So, you know, when we're designing the next version of Quest
link |
00:13:42.080
or the VR headset, a big focus for us is face tracking
link |
00:13:45.080
and basically eye tracking so you can make eye contact,
link |
00:13:48.080
which again, isn't really something that you can do
link |
00:13:50.080
over video conference.
link |
00:13:51.080
It's sort of amazing how much, how far video conferencing
link |
00:13:55.080
has gotten without the ability to make eye contact, right?
link |
00:13:58.080
It's sort of a bizarre thing if you think about it.
link |
00:14:00.080
You're like looking at someone's face, you know,
link |
00:14:02.080
sometimes for, you know, an hour when you're in a meeting
link |
00:14:05.080
and like you looking at their eyes to them doesn't look
link |
00:14:09.080
like you're looking at their eyes.
link |
00:14:11.080
You're always looking past each other, I guess.
link |
00:14:14.080
Yeah, I guess you're right.
link |
00:14:15.080
You're not sending that signal.
link |
00:14:16.080
Well, you're trying to.
link |
00:14:17.080
Right, you're trying to.
link |
00:14:18.080
Like a lot of times, I mean, or at least I find myself,
link |
00:14:20.080
I'm trying to look into the other person's eyes.
link |
00:14:21.080
But they don't feel like you're looking to them.
link |
00:14:23.080
Yeah, so then the question is, all right,
link |
00:14:24.080
am I supposed to look at the camera so that way you can,
link |
00:14:26.080
you know, have a sensation that I'm looking at you?
link |
00:14:28.080
I think that that's an interesting question.
link |
00:14:30.080
And then, you know, with VR today, even without eye tracking
link |
00:14:35.080
and knowing what your eyes are actually looking at,
link |
00:14:37.080
you can fake it reasonably well, right?
link |
00:14:39.080
So you can look at like where the head pose is
link |
00:14:42.080
and if it looks like I'm kind of looking in your general direction,
link |
00:14:44.080
then you can sort of assume that maybe there's some eye contact
link |
00:14:47.080
intended and you can do it in a way where it's like,
link |
00:14:50.080
okay, maybe not.
link |
00:14:51.080
It's like a, maybe it's not a fixated stare,
link |
00:14:54.080
but it's somewhat natural.
link |
00:14:56.080
But once you have actual eye tracking,
link |
00:14:58.080
you can do it for real.
link |
00:15:00.080
And I think that that's really important stuff.
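The head-pose trick described here can be sketched in a few lines: treat the avatar as making eye contact whenever its head is pointed within some threshold of the other person, so it reads as natural attention rather than a fixated stare. A hypothetical illustration, with the 15-degree threshold an assumption:

```python
import numpy as np

def head_pose_eye_contact(head_pos, head_forward, other_head_pos, max_deg=15.0):
    """Sketch of 'faking' eye contact from head pose alone, without eye tracking.

    If my head points roughly at you, render my avatar's eyes as looking at you.
    """
    to_other = np.asarray(other_head_pos, float) - np.asarray(head_pos, float)
    to_other /= np.linalg.norm(to_other)
    forward = np.asarray(head_forward, float)
    forward /= np.linalg.norm(forward)
    angle = np.degrees(np.arccos(np.clip(np.dot(forward, to_other), -1.0, 1.0)))
    return angle < max_deg

print(head_pose_eye_contact([0, 0, 0], [1, 0, 0], [2, 0.3, 0]))  # True: roughly facing you
```

With real eye tracking, the same gate would simply use the measured gaze direction instead of the head's forward vector.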
link |
00:15:02.080
So when I think about Meta's contribution to this field,
link |
00:15:05.080
I have to say it's not clear to me that any of the other companies
link |
00:15:08.080
that are focused on the Metaverse
link |
00:15:11.080
or on virtual and augmented reality
link |
00:15:13.080
are going to prioritize putting these features in the hardware
link |
00:15:16.080
because like everything, they're tradeoffs, right?
link |
00:15:18.080
I mean, it adds some weight to the device.
link |
00:15:21.080
Maybe it adds some thickness.
link |
00:15:23.080
You could totally see another company taking the approach
link |
00:15:25.080
of let's just make the lightest and thinnest thing possible.
link |
00:15:27.080
But, you know, I want us to design the most human thing possible
link |
00:15:31.080
that creates the richest sense of presence.
link |
00:15:33.080
And because so much of human emotion and expression
link |
00:15:37.080
comes from these like micro movements.
link |
00:15:39.080
If I like move my eyebrow, you know, a millimeter,
link |
00:15:41.080
you will notice and that like means something.
link |
00:15:44.080
So the fact that we're losing these signals
link |
00:15:46.080
in a lot of communication I think is a loss.
link |
00:15:49.080
And so it's not like, okay, there's one feature
link |
00:15:51.080
and you add this, then it all of a sudden is going to feel
link |
00:15:53.080
like we have real presence.
link |
00:15:55.080
You can sort of look at how the human brain works
link |
00:15:58.080
and how we express and kind of read emotions.
link |
00:16:02.080
And you can just build a roadmap of that, you know,
link |
00:16:05.080
of just what are the most important things
link |
00:16:07.080
to try to unlock over a five to 10 year period
link |
00:16:09.080
and just try to make the experience more and more human and social.
link |
00:16:13.080
When do you think would be a moment,
link |
00:16:17.080
like a singularity moment for the metaverse
link |
00:16:20.080
where there's a lot of ways to ask this question.
link |
00:16:23.080
People will have many or most of their meaningful experiences
link |
00:16:28.080
in the metaverse versus the real world.
link |
00:16:31.080
And actually it's interesting to think about the fact
link |
00:16:33.080
that a lot of people are having the most important moments
link |
00:16:36.080
of their life happen in the digital sphere,
link |
00:16:39.080
especially now during COVID, you know,
link |
00:16:41.080
like even falling in love or meeting friends
link |
00:16:44.080
or getting excited about stuff that is happening
link |
00:16:47.080
on the 2D digital plane.
link |
00:16:49.080
When do you think the metaverse will provide those experiences
link |
00:16:52.080
for a large number, like a majority of the population?
link |
00:16:55.080
Yeah, I think it's a really good question.
link |
00:16:57.080
There was someone, you know, I read this piece that framed this as
link |
00:17:02.080
a lot of people think that the metaverse is about a place,
link |
00:17:06.080
but one definition of this is it's about a time
link |
00:17:10.080
when basically immersive digital worlds become the primary way
link |
00:17:14.080
that we live our lives and spend our time.
link |
00:17:18.080
I think that's a reasonable construct.
link |
00:17:20.080
And from that perspective, you know,
link |
00:17:22.080
I think you also just want to look at this as a continuation
link |
00:17:25.080
because it's not like, okay, we are building digital worlds,
link |
00:17:29.080
but we don't have that today.
link |
00:17:30.080
I think, you know, you and I probably already live a very large
link |
00:17:33.080
part of our life in digital worlds.
link |
00:17:35.080
They're just not 3D immersive virtual reality,
link |
00:17:37.080
but, you know, I do a lot of meetings over video
link |
00:17:40.080
and I spend a lot of time writing things over email
link |
00:17:42.080
or WhatsApp or whatever.
link |
00:17:44.080
So what is it going to take to get there
link |
00:17:46.080
for kind of the immersive presence version of this,
link |
00:17:49.080
is what you're asking.
link |
00:17:51.080
And for that, I think that there's just a bunch
link |
00:17:53.080
of different use cases.
link |
00:17:55.080
And I think when you're building technology,
link |
00:18:00.080
I think a lot of it is just you're managing this duality
link |
00:18:05.080
where on the one hand, you want to build these elegant things
link |
00:18:09.080
that can scale and have billions of people use them
link |
00:18:12.080
and get value from them.
link |
00:18:13.080
And then on the other hand, you're fighting this kind of
link |
00:18:16.080
ground game where it's just, there are just a lot
link |
00:18:19.080
of different use cases and people do different things
link |
00:18:21.080
and like you want to be able to unlock them.
link |
00:18:22.080
So the first ones that we basically went after were gaming
link |
00:18:27.080
with Quest and social experiences.
link |
00:18:30.080
And this is, you know, it goes back to when we started
link |
00:18:32.080
working on virtual reality.
link |
00:18:33.080
My theory at the time was basically,
link |
00:18:37.080
people thought about it as gaming,
link |
00:18:39.080
but if you look at all computing platforms up to that point,
link |
00:18:44.080
you know, gaming is a huge part.
link |
00:18:46.080
It was a huge part of PCs.
link |
00:18:47.080
It was a huge part of mobile,
link |
00:18:49.080
but it was also very decentralized, right?
link |
00:18:52.080
There wasn't, you know, for the most part,
link |
00:18:54.080
you know, one or two gaming companies,
link |
00:18:55.080
there were a lot of gaming companies
link |
00:18:57.080
and gaming is somewhat hits based.
link |
00:18:58.080
I mean, we're getting some games that have more longevity,
link |
00:19:01.080
but in general, you know, there were a lot
link |
00:19:04.080
of different games out there.
link |
00:19:06.080
But on PC and on mobile, the companies that focused
link |
00:19:12.080
on communication and social interaction,
link |
00:19:15.080
there tended to be a smaller number of those,
link |
00:19:17.080
and that ended up being just as important of a thing
link |
00:19:19.080
as all of the games that you did combined.
link |
00:19:21.080
I think productivity is another area.
link |
00:19:23.080
That's obviously something that we've historically
link |
00:19:24.080
been less focused on, but I think it's going
link |
00:19:26.080
to be really important for us.
link |
00:19:27.080
Would Workrooms fall under productivity,
link |
00:19:29.080
in the collaborative aspect?
link |
00:19:31.080
Yeah, I think that there's a Workrooms aspect of this,
link |
00:19:34.080
like a meeting aspect.
link |
00:19:35.080
And then I think that there's like, you know,
link |
00:19:37.080
Word, Excel, you know, productivity.
link |
00:19:41.080
Either you're like, you're working or coding
link |
00:19:43.080
or knowledge work, right?
link |
00:19:45.080
As opposed to just meetings.
link |
00:19:47.080
So you can kind of go through all these different use cases.
link |
00:19:49.080
You know, gaming, I think we're well on our way.
link |
00:19:51.080
Social, I think, we're just the kind of preeminent company
link |
00:19:56.080
that focuses on this.
link |
00:19:57.080
And I think that that's already on Quest becoming the,
link |
00:20:00.080
you know, if you look at the list of what are the top apps,
link |
00:20:03.080
you know, social apps are already, you know,
link |
00:20:05.080
number one, two, three.
link |
00:20:07.080
So that's kind of becoming a critical thing.
link |
00:20:10.080
But I don't know, I would imagine for someone like you,
link |
00:20:12.080
it'll be, you know, until we get, you know,
link |
00:20:15.080
a lot of the work things dialed in, right?
link |
00:20:17.080
When this is just like much more adopted
link |
00:20:20.080
and clearly better than Zoom for VC,
link |
00:20:24.080
when, you know, if you're doing your coding
link |
00:20:26.080
or your writing or whatever it is in VR,
link |
00:20:29.080
which it's not that far off to imagine that
link |
00:20:31.080
because pretty soon you're just going to be able to have a screen
link |
00:20:33.080
that's bigger than, you know, it'll be your ideal setup
link |
00:20:35.080
and you can bring it with you and put it on anywhere
link |
00:20:37.080
and have your kind of ideal workstation.
link |
00:20:39.080
So I think that there are a few things to work out on that,
link |
00:20:42.080
but I don't think that that's more than, you know, five years off.
link |
00:20:46.080
And then you'll get a bunch of other things
link |
00:20:48.080
that like aren't even possible
link |
00:20:50.080
or you don't even think about using a phone or PC for today,
link |
00:20:53.080
like fitness, right?
link |
00:20:54.080
So I mean, I know that you're, you know,
link |
00:20:56.080
we were talking before about how you're into running
link |
00:20:58.080
and like I'm really into, you know,
link |
00:21:00.080
a lot of things around fitness as well,
link |
00:21:02.080
you know, different things in different places.
link |
00:21:04.080
I got really into hydrofoiling recently.
link |
00:21:06.080
Nice. I saw a video.
link |
00:21:08.080
Yeah, and surfing, and I used to fence competitively, and, like, run.
link |
00:21:13.080
And you were saying that you were thinking about
link |
00:21:15.080
trying different martial arts and I tried to trick you
link |
00:21:17.080
and convince you into doing Brazilian Jiu Jitsu.
link |
00:21:19.080
You actually mentioned that that was one you were curious about.
link |
00:21:22.080
And I doubt it.
link |
00:21:23.080
Is that a trick?
link |
00:21:24.080
Yeah, I don't know.
link |
00:21:25.080
We're in the metaverse now.
link |
00:21:27.080
Yeah, I mean, I took that seriously.
link |
00:21:29.080
I thought that that was a real suggestion.
link |
00:21:34.080
That would be an amazing chance
link |
00:21:36.080
if we ever step on the mat together and just like roll around.
link |
00:21:39.080
I'll show you some moves.
link |
00:21:40.080
Well, give me a year to train and then we can do it.
link |
00:21:43.080
This is like, you know, you've seen Rocky 4
link |
00:21:45.080
where the Russian faces off against the American.
link |
00:21:47.080
I'm the Russian in this picture.
link |
00:21:48.080
And then you're Rocky, the underdog
link |
00:21:50.080
that gets to win in the end.
link |
00:21:52.080
The idea of me as Rocky and like fighting is...
link |
00:21:56.080
If he dies, he dies.
link |
00:21:58.080
Sorry. Just had to...
link |
00:22:00.080
I mean...
link |
00:22:01.080
Anyway, yeah.
link |
00:22:02.080
But I mean, a lot of aspects of fitness
link |
00:22:05.080
and I don't know if you've tried Supernatural on Quest or...
link |
00:22:10.080
So first of all, can I just comment on the fact
link |
00:22:12.080
every time I played around with Quest 2,
link |
00:22:15.080
I get giddy every time I step into virtual reality.
link |
00:22:18.080
So you mentioned productivity and all those kinds of things.
link |
00:22:21.080
That's definitely something I'm excited about.
link |
00:22:24.080
But really, I just love the possibilities of stepping into that world.
link |
00:22:28.080
It's...
link |
00:22:29.080
Maybe it's the introvert in me,
link |
00:22:31.080
but it just feels like the most convenient way
link |
00:22:34.080
to travel into worlds,
link |
00:22:37.080
worlds that are similar to the real world
link |
00:22:40.080
or totally different.
link |
00:22:41.080
It's like Alice in Wonderland.
link |
00:22:43.080
Just try out crazy stuff.
link |
00:22:45.080
The possibilities are endless.
link |
00:22:46.080
And I just...
link |
00:22:47.080
I personally just love...
link |
00:22:51.080
getting excited about stepping into those virtual worlds.
link |
00:22:54.080
So I'm a huge fan.
link |
00:22:55.080
In terms of the productivity, as a programmer,
link |
00:22:58.080
I spend most of my day programming,
link |
00:23:00.080
that's really interesting also.
link |
00:23:02.080
But then you have to develop the right IDEs.
link |
00:23:04.080
You have to develop...
link |
00:23:06.080
There has to be a threshold where a large amount
link |
00:23:08.080
of the programming community moves there.
link |
00:23:10.080
But the collaborative aspects that are possible
link |
00:23:13.080
in terms of meetings,
link |
00:23:14.080
in terms of the...
link |
00:23:16.080
when two coders are working together,
link |
00:23:18.080
I mean, the possibilities there are super, super exciting.
link |
00:23:21.080
I think that in building this,
link |
00:23:24.080
we sort of need to balance...
link |
00:23:27.080
there are going to be some new things
link |
00:23:28.080
that you just couldn't do before.
link |
00:23:29.080
And those are going to be the amazing experiences.
link |
00:23:31.080
So teleporting to any place, right?
link |
00:23:33.080
Whether it's a real place or something that people made.
link |
00:23:38.080
And I mean, some of the experiences around
link |
00:23:40.080
how we can build stuff in new ways,
link |
00:23:42.080
where a lot of the stuff that...
link |
00:23:44.080
when I'm coding stuff, it's like,
link |
00:23:45.080
all right, you code it and then you build it
link |
00:23:47.080
and then you see it afterwards.
link |
00:23:48.080
But increasingly, it's going to be possible to...
link |
00:23:50.080
you're in a world and you're building the world
link |
00:23:52.080
as you are in it and kind of manipulating it.
link |
00:23:55.080
One of the things that we showed at our inside the lab
link |
00:23:59.080
for recent artificial intelligence progress
link |
00:24:02.080
is this builder bot program where now you are...
link |
00:24:05.080
you can just talk to it and say,
link |
00:24:07.080
hey, okay, I'm in this world.
link |
00:24:08.080
Put some trees over there and it'll do that.
link |
00:24:10.080
And like, all right, put some bottles of water on...
link |
00:24:14.080
on our picnic blanket and it'll do that
link |
00:24:16.080
and you're in the world.
link |
00:24:17.080
And I think there are going to be new paradigms for coding.
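A toy, hypothetical stand-in for the builder bot interaction described above: the real system pairs speech recognition with large generative models, while this sketch just pattern-matches commands of the form "put <thing> on <place>" against a scene dictionary:

```python
def apply_command(scene, command):
    """Toy sketch of the 'builder bot' idea: you speak, the world changes."""
    words = command.lower().rstrip(".").split()
    if words and words[0] in ("put", "add") and "on" in words:
        split = words.index("on")
        thing_words = words[1:split]
        if thing_words and thing_words[0] in ("some", "a", "an"):
            thing_words = thing_words[1:]          # drop the article
        place = " ".join(words[split + 1:])
        scene.setdefault(place, []).append(" ".join(thing_words))
    return scene

scene = {}
apply_command(scene, "Put some trees on the hill.")
apply_command(scene, "Put some bottles of water on our picnic blanket.")
print(scene)  # {'the hill': ['trees'], 'our picnic blanket': ['bottles of water']}
```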
link |
00:24:19.080
So yeah, there are going to be some things
link |
00:24:21.080
that I think are just pretty amazing,
link |
00:24:24.080
especially the first few times that you do that,
link |
00:24:26.080
that you're like, whoa, like I've never had an experience
link |
00:24:29.080
like this.
link |
00:24:30.080
But most of your life, I would imagine,
link |
00:24:34.080
is not doing things that are amazing for the first time.
link |
00:24:37.080
A lot of this in terms of...
link |
00:24:39.080
I mean, just answering your question from before around,
link |
00:24:41.080
what is it going to take before you're spending
link |
00:24:43.080
most of your time in this?
link |
00:24:44.080
Well, first of all, let me just say it as an aside,
link |
00:24:48.080
the goal isn't to have people spend a lot more time
link |
00:24:50.080
in computing.
link |
00:24:51.080
I'm asking for myself, when will I spend all of my time
link |
00:24:54.080
in computing? Yeah, it's to make computing more natural.
link |
00:24:57.080
But I think you will spend more...
link |
00:25:01.080
most of your computing time in this
link |
00:25:03.080
when it does the things that you use computing for
link |
00:25:06.080
somewhat better.
link |
00:25:07.080
So maybe having your perfect workstation
link |
00:25:10.080
is a 5% improvement on your coding productivity.
link |
00:25:14.080
Maybe it's not like a completely new thing.
link |
00:25:18.080
But I mean, look, if I could increase the productivity
link |
00:25:21.080
of every engineer at Meta by 5%,
link |
00:25:24.080
we'd buy those devices for everyone.
link |
00:25:27.080
And I imagine a lot of other companies would too.
link |
00:25:30.080
And that's how you start getting to the scale
link |
00:25:32.080
that I think makes this rival some of the bigger
link |
00:25:35.080
computing platforms that exist today.
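For a rough sense of that trade-off, here is the break-even arithmetic with purely assumed numbers; neither figure comes from Meta:

```python
# Hypothetical break-even check for the '5% engineering productivity' point.
fully_loaded_cost_per_engineer = 400_000   # USD per year, assumed
productivity_gain = 0.05                   # the 5% from the conversation
headset_cost = 1_500                       # USD per device plus support, assumed

value_per_engineer = fully_loaded_cost_per_engineer * productivity_gain
print(value_per_engineer)                  # 20000.0 -> dwarfs the device cost
print(value_per_engineer / headset_cost)   # ~13x return in the first year
```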
link |
00:25:37.080
Let me ask you about identity.
link |
00:25:38.080
We talked about the avatar.
link |
00:25:40.080
How do you see identity in the Metaverse?
link |
00:25:42.080
Should the avatar be tied to your identity?
link |
00:25:46.080
Or can I be anything in the Metaverse?
link |
00:25:49.080
Like, can I be whatever the heck I want?
link |
00:25:52.080
Can I even be a troll?
link |
00:25:53.080
So there's exciting, freeing possibilities
link |
00:25:57.080
and there's the darker possibilities too.
link |
00:26:00.080
Yeah, I mean, I think that there's going to be a range.
link |
00:26:03.080
So we're working on, for expression in avatars,
link |
00:26:10.080
on one end of the spectrum are kind of expressive
link |
00:26:13.080
and cartoonish avatars.
link |
00:26:15.080
And then on the other end of the spectrum
link |
00:26:17.080
are realistic avatars.
link |
00:26:19.080
And I just think the reality is that there are going to be
link |
00:26:21.080
different use cases for different things.
link |
00:26:23.080
And I guess there's another axis.
link |
00:26:25.080
So if you're going from photo realistic to expressive,
link |
00:26:28.080
there's also like representing you directly
link |
00:26:31.080
versus like some fantasy identity.
link |
00:26:33.080
And I think that there are going to be things
link |
00:26:35.080
on all ends of that spectrum too, right?
link |
00:26:38.080
So you'll want, in some experience,
link |
00:26:41.080
you might want to be like a photo realistic dragon, right?
link |
00:26:44.080
Or if I'm playing Onward or just this military simulator game,
link |
00:26:49.080
I think getting to be more photo realistic as a soldier
link |
00:26:53.080
in that could enhance the experience.
link |
00:26:57.080
There are times when I'm hanging out with friends
link |
00:26:59.080
where I want them to know it's me,
link |
00:27:02.080
so a kind of cartoonish or expressive version of me is good.
link |
00:27:06.080
But there are also experiences like,
link |
00:27:09.080
VRChat does this well today,
link |
00:27:11.080
where a lot of the experience is kind of dressing up
link |
00:27:14.080
and wearing a fantastical avatar
link |
00:27:17.080
that's almost like a meme or is humorous.
link |
00:27:19.080
So you come into an experience and it's almost like,
link |
00:27:22.080
you have like a built in icebreaker
link |
00:27:24.080
because like you see people and you're just like,
link |
00:27:27.080
all right, I'm cracking up at what you're wearing
link |
00:27:29.080
because that's funny and it's just like,
link |
00:27:31.080
where'd you get that or oh, you made that?
link |
00:27:33.080
That's, you know, it's awesome.
link |
00:27:35.080
Whereas, you know, okay, if you're going into a work meeting,
link |
00:27:38.080
maybe a photo realistic version of your real self
link |
00:27:41.080
is going to be the most appropriate thing for that.
link |
00:27:43.080
So I think the reality is there aren't going to be,
link |
00:27:46.080
it's not just going to be one thing.
link |
00:27:49.080
You know, my own sense of kind of how you want to express
link |
00:27:54.080
identity online has sort of evolved over time
link |
00:27:56.080
in that, you know, early days in Facebook,
link |
00:27:58.080
I thought, okay, people are going to have one identity
link |
00:28:00.080
and now I think that's clearly not going to be the case.
link |
00:28:02.080
I think you're going to have all these different things
link |
00:28:04.080
and there's utility in being able to do different things.
link |
00:28:07.080
So some of the technical challenges
link |
00:28:10.080
that I'm really interested in around it
link |
00:28:12.080
are how do you build the software
link |
00:28:14.080
to allow people to seamlessly go between them?
link |
00:28:17.080
So say, so you could view them as just completely
link |
00:28:23.080
discrete points on a spectrum,
link |
00:28:25.080
but let's talk about the metaverse economy for a second.
link |
00:28:28.080
Let's say I buy a digital shirt
link |
00:28:31.080
for my photo realistic avatar,
link |
00:28:33.080
which by the way, I think at the time
link |
00:28:35.080
where we're spending a lot of time in the metaverse
link |
00:28:37.080
doing a lot of our work meetings in the metaverse,
link |
00:28:39.080
et cetera, I would imagine that the economy
link |
00:28:41.080
around virtual clothing, as an example,
link |
00:28:43.080
is going to be quite big.
link |
00:28:45.080
Why wouldn't I spend almost as much money
link |
00:28:47.080
in investing in my appearance or expression
link |
00:28:50.080
for my photo realistic avatar for meetings
link |
00:28:52.080
as I would for whatever I'm going to wear in my video chat?
link |
00:28:55.080
But the question is, okay, so let's say you buy some shirt
link |
00:28:57.080
for your photo realistic avatar.
link |
00:28:59.080
Wouldn't it be cool if there was a way to basically
link |
00:29:04.080
translate that into a more expressive thing
link |
00:29:08.080
for your cartoonish or expressive avatar?
link |
00:29:11.080
And there are multiple ways to do that.
link |
00:29:13.080
You can view them as two discrete points and, okay,
link |
00:29:15.080
maybe if a designer sells one thing,
link |
00:29:18.080
then it actually comes in a pack and there's two
link |
00:29:20.080
and you can use either one on that.
link |
00:29:22.080
But I actually think this stuff might exist more
link |
00:29:24.080
as a spectrum in the future.
link |
00:29:26.080
And that's what, I do think the direction
link |
00:29:29.080
on some of the AI advances that are happening
link |
00:29:33.080
to be able to, especially stuff around like style transfer,
link |
00:29:36.080
being able to take a piece of art or express something
link |
00:29:39.080
and say, okay, paint me this photo in the style of Gauguin
link |
00:29:44.080
or whoever it is that you're interested in.
link |
00:29:48.080
Take this shirt and put it in the style
link |
00:29:51.080
of what I've designed for my expressive avatar.
link |
00:29:54.080
I think that's going to be pretty compelling.
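The style-transfer idea referenced here, in the lineage of Gatys et al., works by matching feature correlations rather than pixels. A minimal sketch of the style loss at its core; a full pipeline would pull these feature maps from several layers of a pretrained network such as VGG and add a content loss, and the tensor shapes below are just stand-ins:

```python
import torch

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(generated_feats, style_feats):
    """Match feature correlations, not pixels: the essence of style transfer."""
    return torch.mean((gram_matrix(generated_feats) - gram_matrix(style_feats)) ** 2)

# Toy feature maps standing in for network activations:
gen = torch.rand(64, 32, 32, requires_grad=True)
sty = torch.rand(64, 32, 32)
loss = style_loss(gen, sty)
loss.backward()   # gradients would iteratively restyle the generated garment/image
print(loss.item())
```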
link |
00:29:57.080
And so the fashion, you might be buying like a generator,
link |
00:30:00.080
like a closet that generates a style.
link |
00:30:03.080
And then, like with GANs,
link |
00:30:05.080
they'll be able to infinitely generate outfits
link |
00:30:08.080
thereby making it.
link |
00:30:09.080
So the reason I wear the same thing all the time
link |
00:30:11.080
is that I don't like choice.
link |
00:30:13.080
You've talked about the same thing,
link |
00:30:15.080
but now you don't even have to choose.
link |
00:30:17.080
Your closet generates your outfit for you every time.
link |
00:30:19.080
And so you have to live with the outfit it generates.
link |
00:30:23.080
I mean, you could do that, although,
link |
00:30:25.080
no, I think that that's, I think some people will,
link |
00:30:27.080
but I think like, I think that there's going to be
link |
00:30:30.080
a huge aspect of just people doing creative commerce here.
link |
00:30:35.080
So I think that there is going to be a big market
link |
00:30:37.080
around people designing digital clothing.
link |
00:30:40.080
But the question is, if you're designing digital clothing,
link |
00:30:43.080
do you need to design, if you're the designer,
link |
00:30:45.080
do you need to make it for each kind of specific
link |
00:30:47.080
discrete point along a spectrum?
link |
00:30:49.080
Or are you just designing it for kind of a photo realistic case
link |
00:30:53.080
or an expressive case?
link |
00:30:54.080
Or can you design one and have it translate
link |
00:30:56.080
across these things?
link |
00:30:58.080
If I buy a style from a designer who I care about
link |
00:31:01.080
and now I'm a dragon, is there a way to morph that
link |
00:31:04.080
so it goes on the dragon in a way that makes sense?
link |
00:31:07.080
And that, I think, is an interesting AI problem
link |
00:31:09.080
because you're probably not going to make it
link |
00:31:11.080
so that designers have to go design for all those things.
link |
00:31:14.080
But the more useful the digital content is that you buy
link |
00:31:17.080
in a lot of uses, in a lot of use cases,
link |
00:31:21.080
the more that economy will just explode.
link |
00:31:23.080
And that's a lot of what all of the...
link |
00:31:27.080
We were joking about NFTs before,
link |
00:31:29.080
but I think a lot of the promise here
link |
00:31:31.080
is that if the digital goods that you buy
link |
00:31:34.080
are not just tied to one platform or one use case,
link |
00:31:37.080
they end up being more valuable,
link |
00:31:38.080
which means that people are more willing
link |
00:31:39.080
and more likely to invest in them
link |
00:31:41.080
and that just spurs the whole economy.
link |
00:31:44.080
But the question is, that's a fascinating positive aspect,
link |
00:31:47.080
but the potential negative aspect is that
link |
00:31:50.080
you can have people concealing their identity
link |
00:31:52.080
in order to troll or even not people, bots.
link |
00:31:56.080
So how do you know in the metaverse
link |
00:31:58.080
that you're talking to a real human or an AI
link |
00:32:01.080
or a well intentioned human?
link |
00:32:03.080
Is that something you think about,
link |
00:32:04.080
something you're concerned about?
link |
00:32:06.080
Well, let's break that down into a few different cases.
link |
00:32:09.080
I mean, because knowing that you're talking to someone
link |
00:32:11.080
who has good intentions is something that I think
link |
00:32:13.080
is not even solved in pretty much anywhere.
link |
00:32:17.080
But if you're talking to someone who's a dragon,
link |
00:32:20.080
I think it's pretty clear that they're not representing
link |
00:32:22.080
themselves as a person.
link |
00:32:23.080
I think probably the most pernicious thing
link |
00:32:25.080
that you want to solve for is,
link |
00:32:29.080
I think probably one of the scariest ones
link |
00:32:31.080
is how do you make sure that someone isn't impersonating you?
link |
00:32:34.080
Right?
link |
00:32:35.080
So like, okay, you're in a future version
link |
00:32:37.080
of this conversation.
link |
00:32:39.080
And we have photorealistic avatars
link |
00:32:41.080
and we're doing this in work rooms
link |
00:32:43.080
or whatever the future version of that is.
link |
00:32:45.080
And someone walks in who looks like me.
link |
00:32:48.080
How do you know that that's me?
link |
00:32:50.080
And one of the things that we're thinking about
link |
00:32:54.080
is it's still a pretty big AI project
link |
00:32:57.080
to be able to generate photorealistic avatars
link |
00:32:59.080
that basically can like, they work like these codecs of you.
link |
00:33:03.080
So you kind of have a map from your headset
link |
00:33:06.080
and whatever sensors of what your body's actually doing
link |
00:33:08.080
and it takes the model in it and it kind of displays it in VR.
link |
00:33:11.080
But there's a question which is,
link |
00:33:12.080
should there be some sort of biometric security
link |
00:33:15.080
so that when I put on my VR headset
link |
00:33:18.080
or I'm going to go use that avatar,
link |
00:33:21.080
I need to first prove that I am that person.
link |
00:33:24.080
And I think you probably are going to want something like that.
link |
00:33:27.080
So as we're developing these technologies,
link |
00:33:31.080
we're also thinking about the security for things like that
link |
00:33:34.080
because people aren't going to want to be impersonated.
link |
00:33:37.080
That's a huge security issue.
link |
00:33:41.080
Then you just get the question of people hiding behind
link |
00:33:44.080
fake accounts to do malicious things,
link |
00:33:48.080
which is not going to be unique to the metaverse,
link |
00:33:51.080
although certainly in an environment
link |
00:33:56.080
where it's more immersive and you have more of a sense of presence,
link |
00:33:58.080
it could be more painful.
link |
00:34:01.080
But this is obviously something that we've just dealt with
link |
00:34:04.080
for years in social media and the internet more broadly.
link |
00:34:08.080
And there, I think, there have been a bunch of tactics
link |
00:34:13.080
that I think we've just evolved to,
link |
00:34:17.080
we've built up these different AI systems
link |
00:34:20.080
to basically get a sense of,
link |
00:34:22.080
is this account behaving in the way that a person would?
link |
00:34:26.080
And it turns out,
link |
00:34:28.080
so in all of the work that we've done around,
link |
00:34:31.080
we call it community integrity,
link |
00:34:33.080
and it's basically policing harmful content
link |
00:34:36.080
and trying to figure out where to draw the line.
link |
00:34:38.080
And there are all these really hard and philosophical questions
link |
00:34:41.080
around where do you draw the line on some of this stuff.
link |
00:34:43.080
And the thing that I've kind of found the most effective
link |
00:34:47.080
is as much as possible trying to figure out
link |
00:34:51.080
who are the inauthentic accounts
link |
00:34:53.080
or where are the accounts that are behaving
link |
00:34:55.080
in an overall harmful way at the account level
link |
00:34:58.080
rather than trying to get into policing what they're saying.
link |
00:35:01.080
Which I think the metaverse is going to be even harder
link |
00:35:03.080
because the metaverse, I think, will have more properties
link |
00:35:06.080
of it's almost more like a phone call,
link |
00:35:09.080
or it's not like I post a piece of content
link |
00:35:12.080
and is that piece of content good or bad.
link |
00:35:14.080
So I think more of this stuff will have to be done
link |
00:35:16.080
at the level of the account.
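Account-level policing, as opposed to content-level, can be pictured as a classifier over behavioral features of the account. A toy, hypothetical sketch with invented features and data, not a description of Meta's actual systems:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed features per account: messages/hour, fraction sent to strangers,
# account age in days, reports received. Labels: 1 = inauthentic.
X = np.array([
    [200, 0.95,   2, 40],   # blasts strangers from a day-old account
    [  3, 0.10, 900,  0],   # ordinary long-lived account
    [150, 0.90,   5, 25],
    [  5, 0.20, 400,  1],
])
y = np.array([1, 0, 1, 0])

clf = LogisticRegression().fit(X, y)
new_account = np.array([[180, 0.9, 3, 30]])
print(clf.predict_proba(new_account)[0, 1])   # high probability of inauthentic
```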
link |
00:35:19.080
But this is the area where,
link |
00:35:21.080
between the kind of counterintelligence teams
link |
00:35:27.080
that we've built up inside the company
link |
00:35:28.080
and years of building just different AI systems
link |
00:35:33.080
to basically detect what is a real account and what isn't.
link |
00:35:36.080
I'm not saying we're perfect,
link |
00:35:38.080
but this is an area where I just think we are like years
link |
00:35:42.080
ahead of basically anyone else in the industry
link |
00:35:45.080
in terms of having built those capabilities.
link |
00:35:48.080
And I think that that just is going to be incredibly important
link |
00:35:50.080
for this next wave of things.
link |
00:35:52.080
And like you said, on a technical level,
link |
00:35:54.080
on a philosophical level,
link |
00:35:55.080
it's an incredibly difficult problem to solve.
link |
00:35:59.080
By the way, I would probably like to open source my avatar
link |
00:36:03.080
so there could be like millions of Lexes walking around,
link |
00:36:06.080
like an army. Like Agent Smith?
link |
00:36:08.080
Agent Smith, yeah, exactly.
link |
00:36:10.080
So the Unity ML folks built a copy of me
link |
00:36:16.080
and they sent it to me.
link |
00:36:18.080
So there's a person running around
link |
00:36:20.080
and I've just been doing reinforcement learning on it.
link |
00:36:22.080
I was going to release it because, you know,
link |
00:36:26.080
just to have sort of like thousands of Lexes doing reinforcement learning.
link |
00:36:31.080
So they fall over naturally.
link |
00:36:32.080
They have to learn how to like walk around and stuff.
link |
00:36:35.080
But I love that idea of this tension
link |
00:36:37.080
between biometric security.
link |
00:36:39.080
You want to have one identity,
link |
00:36:40.080
but then certain avatars you might have to have many.
link |
00:36:43.080
I don't know which is better security,
link |
00:36:45.080
sort of flooding the world with Lexes
link |
00:36:48.080
and thereby achieving security
link |
00:36:49.080
or really being protected over your identity.
link |
00:36:51.080
I had to ask a security question actually.
link |
00:36:54.080
Well, how does flooding the world with Lexes
link |
00:36:56.080
help me know in our conversation
link |
00:36:58.080
that I'm talking to the real Lex?
link |
00:37:00.080
I completely destroy the trust in all my relationships then, right?
link |
00:37:03.080
If I flood, because then it's, yeah, that...
link |
00:37:07.080
I think that one's not going to work that well for you.
link |
00:37:09.080
It's not going to work for the original copy.
link |
00:37:11.080
Well, it probably works for some things.
link |
00:37:13.080
Like if you're a public figure and you're trying to have, you know,
link |
00:37:17.080
a bunch of...
link |
00:37:18.080
If you're trying to show up in a bunch of different places
link |
00:37:20.080
in the future, you'll be able to do that in the metaverse.
link |
00:37:23.080
So that kind of replication I think will be useful.
link |
00:37:26.080
But I do think that you're going to want a notion of like,
link |
00:37:29.080
I am talking to the real one.
link |
00:37:31.080
Yeah.
link |
00:37:32.080
Especially if the fake ones start outperforming you
link |
00:37:35.080
in all your private relationships
link |
00:37:37.080
and then you're left behind.
link |
00:37:38.080
I mean, that's a serious concern I have with clones.
link |
00:37:41.080
Again, the things I think about.
link |
00:37:43.080
Okay.
link |
00:37:44.080
So I recently got...
link |
00:37:45.080
I use QNAP NAS storage.
link |
00:37:48.080
So just storage for video and stuff.
link |
00:37:50.080
And I recently got hacked.
link |
00:37:51.080
This is the first time for me with ransomware.
link |
00:37:53.080
It's not me personally.
link |
00:37:54.080
It's all QNAP devices.
link |
00:37:58.080
So the question that people have is about security in general.
link |
00:38:03.080
Because I was doing a lot of the right things in terms of security
link |
00:38:06.080
and nevertheless ransomware basically disabled my device.
link |
00:38:10.080
Yeah.
link |
00:38:11.080
Is that something you think about?
link |
00:38:12.080
What are the different steps you could take to protect people's data
link |
00:38:15.080
on the security front?
link |
00:38:17.080
I think that there are different solutions and strategies...
link |
00:38:21.080
where it makes sense to have stuff kind of put behind a fortress,
link |
00:38:25.080
so the centralized model versus decentralizing.
link |
00:38:30.080
Then I think both have strengths and weaknesses.
link |
00:38:32.080
So I think anyone who says,
link |
00:38:33.080
okay, just decentralize everything, that'll make it more secure.
link |
00:38:36.080
I think that that's tough because the advantage of something like encryption
link |
00:38:42.080
is that we run the largest encrypted service in the world with WhatsApp.
link |
00:38:47.080
And we were one of the first to roll out a multiplatform encryption service.
link |
00:38:52.080
And that's something that I think was a big advance for the industry.
link |
00:38:57.080
And one of the promises that we can basically make because of that,
link |
00:39:00.080
our company doesn't see, when you're sending an encrypted message,
link |
00:39:04.080
what the content is of what you're sharing.
link |
00:39:08.080
So that way if someone hacks Meta servers,
link |
00:39:11.080
they're not going to be able to access the WhatsApp message
link |
00:39:14.080
that you're sending to your friend.
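The end-to-end property described here, that the relay server never holds a key that opens the message, can be sketched with PyNaCl. This is illustrative only: WhatsApp actually uses the Signal protocol, which adds forward secrecy and much more on top of this basic idea:

```python
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their devices.
alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts to Bob's public key; only Bob's private key can open it.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# The server relays ciphertext but has neither private key, so it sees noise.
print(ciphertext.hex()[:32], "...")

# Bob decrypts with his private key and Alice's public key.
print(Box(bob_sk, alice_sk.public_key).decrypt(ciphertext))  # b'meet at noon'
```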
link |
00:39:17.080
And that I think matters a lot to people
link |
00:39:19.080
because obviously if someone is able to compromise a company's servers
link |
00:39:23.080
and that company has hundreds of millions or billions of people,
link |
00:39:25.080
then that ends up being a very big deal.
link |
00:39:28.080
The flip side of that is, okay, all the content is on your phone.
link |
00:39:32.080
Are you following security best practices on your phone?
link |
00:39:36.080
If you lose your phone, all your content is gone.
link |
00:39:38.080
So that's an issue.
link |
00:39:40.080
Maybe you back up your content from WhatsApp
link |
00:39:42.080
or some other service to iCloud or something,
link |
00:39:46.080
then you're just at Apple's whims about are they going to go turn over the data
link |
00:39:50.080
to some government or are they going to get hacked?
link |
00:39:53.080
So a lot of the time it is useful to have data in a centralized place too
link |
00:39:58.080
because then you can train systems that can just do much better personalization.
link |
00:40:04.080
I think that in a lot of cases, centralized systems can offer a lot,
link |
00:40:09.080
especially if you're a serious company:
link |
00:40:13.080
you're running the state-of-the-art stuff
link |
00:40:16.080
and you have red teams attacking your own stuff
link |
00:40:19.080
and you're putting out bounty programs
link |
00:40:24.080
and trying to attract some of the best hackers in the world
link |
00:40:26.080
to go break into your stuff all the time.
link |
00:40:28.080
So any system is going to have security issues,
link |
00:40:30.080
but I think the best way forward is to basically try to be as aggressive
link |
00:40:35.080
and open about hardening the systems as possible,
link |
00:40:37.080
not trying to kind of hide and pretend that there aren't going to be issues,
link |
00:40:40.080
which I think is why, over time, a lot of open source systems
link |
00:40:43.080
have gotten relatively more secure: because they're open,
link |
00:40:46.080
rather than pretending that there aren't going to be issues,
link |
00:40:49.080
people just surface them quicker.
link |
00:40:51.080
So I think you want to adopt that approach as a company
link |
00:40:53.080
and just constantly be hardening yourself.
link |
00:40:56.080
Trying to stay one step ahead of the attackers.
link |
00:41:00.080
It's an inherently adversarial space.
link |
00:41:03.080
I think security is interesting
link |
00:41:07.080
because, of the different kinds of threats that we've managed over the last five years,
link |
00:41:12.080
there are ones where basically the adversaries keep on getting better and better.
link |
00:41:17.080
So trying to kind of interfere with security is certainly one area of this.
link |
00:41:23.080
If you have like nation states that are trying to interfere in elections or something,
link |
00:41:27.080
they're kind of evolving their tactics.
link |
00:41:29.080
Whereas on the other hand, I don't want to be too simplistic about it,
link |
00:41:33.080
but if someone is saying something hateful,
link |
00:41:37.080
people usually aren't getting smarter and smarter about how they say hateful things.
link |
00:41:41.080
Maybe there's some element of that,
link |
00:41:43.080
but it's a very small dynamic compared to how advanced attackers
link |
00:41:47.080
in some of these other places get over time.
link |
00:41:50.080
I believe most people are good, so they actually get better over time
link |
00:41:54.080
and become less hateful, because they realize it's not fun being hateful.
link |
00:42:00.080
That's at least the belief I have.
link |
00:42:02.080
But first, bathroom break.
link |
00:42:05.080
Sure.
link |
00:42:06.080
Okay.
link |
00:42:07.080
So we'll come back to AI, but let me ask some difficult questions now.
link |
00:42:11.080
The Social Dilemma is a popular documentary that raised concerns about the effects of social media on society.
link |
00:42:17.080
You responded with a point-by-point rebuttal titled,
link |
00:42:21.080
What "The Social Dilemma" Gets Wrong.
link |
00:42:23.080
People should read that.
link |
00:42:25.080
I would say the key point they make is because social media is funded by ads,
link |
00:42:29.080
algorithms want to maximize attention and engagement,
link |
00:42:33.080
and an effective way to do so is to get people angry at each other,
link |
00:42:39.080
increase division, and so on.
link |
00:42:41.080
Can you steelman their criticisms and arguments that they make in the documentary
link |
00:42:46.080
as a way to understand the concern and as a way to respond to it?
link |
00:42:53.080
Well, yeah, I think that's a good conversation to have.
link |
00:42:57.080
I don't happen to agree with the conclusions,
link |
00:43:00.080
and I think that they make a few assumptions that are just very big jumps
link |
00:43:06.080
that I don't think are reasonable to make.
link |
00:43:09.080
But I understand overall why people would be concerned about our business model, and ads in general:
link |
00:43:19.080
we do make more money as people use the service more, in general, right?
link |
00:43:23.080
So as a kind of basic assumption,
link |
00:43:26.080
okay, do we have an incentive to build a service that people use more?
link |
00:43:31.080
Yes, on a lot of levels.
link |
00:43:33.080
I mean, we think what we're doing is good.
link |
00:43:35.080
So we think that if people are finding it useful, they'll use it more.
link |
00:43:38.080
Or if you just look at it as if the only thing we cared about is money,
link |
00:43:43.080
which it's not, for anyone who knows me, but okay, we're a company.
link |
00:43:48.080
So let's say you just kind of simplified it down to that.
link |
00:43:51.080
Then would we want people to use the services more?
link |
00:43:54.080
Yes. But then you get to the second question, which is,
link |
00:43:59.080
does kind of getting people agitated make them more likely to use the services more?
link |
00:44:07.080
And I think from looking at other media in the world,
link |
00:44:12.080
especially TV and, you know, there's the old news adage, if it bleeds, it leads.
link |
00:44:19.080
I think that there are a bunch of reasons why someone might think that
link |
00:44:27.080
that kind of provocative content would be the most engaging.
link |
00:44:32.080
Now, what I've always found is two things.
link |
00:44:35.080
One is that what grabs someone's attention in the near term
link |
00:44:39.080
is not necessarily something that they're going to appreciate having seen
link |
00:44:43.080
or going to be the best over the long term.
link |
00:44:45.080
So I think what a lot of people get wrong is that we're not,
link |
00:44:49.080
I'm not building this company to make the most money
link |
00:44:52.080
or get people to spend the most time on this in the next quarter or the next year.
link |
00:44:55.080
I've been doing this for 17 years at this point,
link |
00:44:59.080
and I'm still relatively young and have a lot more that I want to do over the coming decades.
link |
00:45:03.080
So I think that it's too simplistic to say,
link |
00:45:08.080
hey, this might increase time in the near term, therefore, it's what you're going to do.
link |
00:45:13.080
Because I actually think a deeper look at kind of what my incentives are,
link |
00:45:17.080
the incentives of a company that are focused on the long term,
link |
00:45:20.080
is to basically do what people are going to find valuable over time,
link |
00:45:24.080
not what is going to draw people's attention today.
link |
00:45:26.080
The other thing that I'd say is that,
link |
00:45:29.080
I think a lot of times people look at this from the perspective of media
link |
00:45:34.080
or kind of information or civic discourse,
link |
00:45:37.080
but one other way of looking at this is just that,
link |
00:45:41.080
I'm a product designer, our company, we build products,
link |
00:45:45.080
and a big part of building a product is not just the function and utility of what you're delivering,
link |
00:45:50.080
but the feeling of how it feels.
link |
00:45:52.080
We spent a lot of time talking about virtual reality and how the kind of key aspect of that experience
link |
00:45:59.080
is the feeling of presence, which is a visceral thing.
link |
00:46:02.080
It's not just about the utility that you're delivering, it's about the sensation.
link |
00:46:06.080
Similarly, I care a lot about how people feel when they use our products,
link |
00:46:11.080
and I don't want to build products that make people angry.
link |
00:46:15.080
That's not, I think, what we're here on this earth to do,
link |
00:46:18.080
is to build something that people spend a bunch of time doing
link |
00:46:22.080
and it just kind of makes them angry at other people.
link |
00:46:24.080
That's not good.
link |
00:46:26.080
That's not what I think would be sort of a good use of our time
link |
00:46:32.080
or a good contribution to the world.
link |
00:46:34.080
It's like people, they tell us on a per content basis,
link |
00:46:38.080
does this thing, do I like it? Do I love it?
link |
00:46:40.080
Does it make me angry? Does it make me sad?
link |
00:46:43.080
Based on that, we basically choose to show less of the content that makes people angry,
link |
00:46:49.080
because of course, if you're designing a product and you want people to be able to connect
link |
00:46:56.080
and feel good over a long period of time, then that's naturally what you're going to do.
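A hedged sketch of the ranking idea he's describing: per-content feedback signals feed a value score, with "makes people angry" weighted negatively. The weights and signal names below are invented for illustration, not Meta's actual ranking model.

```python
# Toy ranking: downweight content whose feedback signals skew angry.
SIGNAL_WEIGHTS = {"like": 1.0, "love": 2.0, "sad": -0.5, "angry": -2.0}

def content_value(signals: dict) -> float:
    """Weighted sum of per-content feedback counts (illustrative only)."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in signals.items())

posts = {
    "post_a": {"like": 120, "angry": 5},
    "post_b": {"like": 40, "angry": 90},
}
ranked = sorted(posts, key=lambda p: content_value(posts[p]), reverse=True)
print(ranked)  # ['post_a', 'post_b']: the anger-heavy post ranks lower
```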
link |
00:47:02.080
So, I don't know, I think overall, I understand at a high level,
link |
00:47:10.080
if you're not thinking too deeply about it, why that argument might be appealing,
link |
00:47:16.080
but I just think if you actually look at what our real incentives are,
link |
00:47:21.080
not just like, you know, if we were trying to optimize for the next week,
link |
00:47:26.080
but as people working on this, why are we here?
link |
00:47:32.080
I think it's pretty clear that that's not actually how you would want to design the system.
link |
00:47:36.080
I guess one other thing that I'd say is that while we're focused on the ads business model,
link |
00:47:41.080
I do think it's important to note that a lot of these issues are not unique to ads.
link |
00:47:46.080
So, take a subscription news business model, for example.
link |
00:47:49.080
I think that has just as many potential pitfalls.
link |
00:47:53.080
So, maybe if someone's paying for a subscription, you don't get paid per piece of content that they look at,
link |
00:47:58.080
but you know, say for example, I think like a bunch of the partisanship that we see
link |
00:48:05.080
could potentially be made worse when you have these kinds of partisan news organizations
link |
00:48:14.080
that basically sell subscriptions and they're only going to get people on one side
link |
00:48:18.080
to subscribe to them, so their incentive is not to print content
link |
00:48:23.080
or produce content that's kind of centrist or down the line either.
link |
00:48:28.080
I bet that what a lot of them find is that if they produce stuff that's kind of more polarizing
link |
00:48:33.080
or more partisan, then that is what gets them more subscribers.
link |
00:48:37.080
So, I think that this stuff is all, there's no perfect business model.
link |
00:48:42.080
Everything has pitfalls.
link |
00:48:44.080
The thing that I think is great about advertising is it makes the consumer services free,
link |
00:48:48.080
which if you believe that everyone should have a voice and everyone should be able to connect,
link |
00:48:52.080
then that's a great thing as opposed to building a luxury service that not everyone can afford.
link |
00:48:57.080
But look, I mean, every business model, you know, you have to be careful about how you're implementing what you're doing.
link |
00:49:02.080
You responded to a few things there.
link |
00:49:04.080
You spoke to the fact that, you know, there is a narrative of malevolence.
link |
00:49:09.080
Like, you know, you're leaning into the making people angry just because it makes more money in the short term,
link |
00:49:15.080
that kind of thing.
link |
00:49:16.080
So you responded to that.
link |
00:49:18.080
But there's also kind of reality of human nature, just like you spoke about.
link |
00:49:23.080
There are fights, arguments we get into, and we don't like ourselves afterwards, but we got into them anyway.
link |
00:49:30.080
So our long-term growth, I believe, for most of us, has to do with learning,
link |
00:49:36.080
challenging yourself, improving, being kind to each other, finding a community of people that,
link |
00:49:44.080
you know, you connect with on a real human level, all that kind of stuff.
link |
00:49:50.080
But it does seem when you look at social media that a lot of fights break out,
link |
00:49:56.080
a lot of arguments break out, a lot of viral content ends up being sort of outrage in one direction or the other.
link |
00:50:04.080
And so it's easy from that to infer the narrative that social media companies are letting this outrage become viral.
link |
00:50:13.080
And so they're increasing the division in the world.
link |
00:50:16.080
I mean, perhaps you can comment on that, or further: how can you push back on this narrative?
link |
00:50:25.080
How can you be transparent about this battle?
link |
00:50:28.080
Because I think it's not just motivation or financials, it's a technical problem too,
link |
00:50:36.080
which is how do you improve long term well being of human beings?
link |
00:50:42.080
I think that going through some of the design decisions would be a good conversation.
link |
00:50:49.080
But first, I actually think, you know, I think you acknowledge that, you know, that narrative is somewhat anecdotal.
link |
00:50:57.080
I think it's worth grounding this conversation and the actual research that has been done on this,
link |
00:51:03.080
which by and large finds that social media is not a large driver of polarization.
link |
00:51:11.080
And, you know, I mean, there's been a number of economists and social scientists and folks who have studied this.
link |
00:51:18.080
You know, a lot of polarization, it varies around the world.
link |
00:51:21.080
Social media is basically in every country, Facebook's in pretty much every country except for China and maybe North Korea.
link |
00:51:27.080
And you see different trends in different places where, you know, in a lot of countries, polarization is declining.
link |
00:51:36.080
In some, it's flat. In the U.S., it's risen sharply.
link |
00:51:41.080
So the question is, what are the unique phenomena in the different places?
link |
00:51:46.080
And I think for the people who are trying to say, hey, social media is the thing that's doing this,
link |
00:51:50.080
I think that that clearly doesn't hold up because social media is a phenomenon that is pretty much equivalent in all of these different countries.
link |
00:51:57.080
And you have researchers like this economist at Stanford, Matthew Gentzkow, who's just written at length about this.
link |
00:52:04.080
And, you know, there's a bunch of books by, you know, political writers;
link |
00:52:10.080
Ezra Klein's "Why We're Polarized" basically goes through this decades-long analysis in the U.S.
link |
00:52:17.080
Before I was born, basically talking about some of the forces and kind of partisan politics and Fox News
link |
00:52:24.080
and different things that predate the Internet in a lot of ways that I think are likely larger contributors.
link |
00:52:30.080
So to the contrary on this, not only is it pretty clear that social media is not a major contributor,
link |
00:52:37.080
but most of the academic studies that I've seen actually show that social media use is correlated with lower polarization.
link |
00:52:46.080
Gentzkow, the same person who just did the study that I cited about longitudinal polarization across different countries,
link |
00:52:53.080
you know, also did a study that basically showed that if you looked after, you know, the 2016 election in the U.S.,
link |
00:53:02.080
the voters who were the most polarized were actually the ones who were not on the Internet.
link |
00:53:07.080
So, and there have been recent other studies I think in Europe and around the world basically showing that as people stop using social media,
link |
00:53:16.080
they tend to get more polarized. Then there's a deeper analysis around, okay, well, polarization actually isn't even one thing.
link |
00:53:24.080
Because, you know, having different opinions on something isn't, I don't think that that's by itself bad.
link |
00:53:28.080
What people who study this say is most problematic is what they call affective polarization, which is basically:
link |
00:53:37.080
do you have negative feelings towards people of another group?
link |
00:53:40.080
And the way that a lot of scholars study this is they basically ask a group,
link |
00:53:46.080
would you let your kids marry someone of group X?
link |
00:53:50.080
Whatever the groups are that you're worried that someone might have negative feelings towards.
link |
00:53:55.080
And in general, use of social media has corresponded to decreases in that kind of affective polarization.
link |
00:54:01.080
So, I just want to, I think we should talk through the design decisions and how we handle the kind of specific pieces of content.
link |
00:54:10.080
But overall, I think it's just worth grounding that discussion in the research that's existed that I think overwhelmingly shows that the mainstream narrative around this is just not right.
link |
00:54:20.080
But the narrative does take hold and it's compelling to a lot of people.
link |
00:54:27.080
There's another question I'd like to ask you on this.
link |
00:54:30.080
I was looking at various polls and saw that you're one of the most disliked tech leaders today.
link |
00:54:37.080
54% unfavorable rating.
link |
00:54:40.080
Elon Musk is 23%.
link |
00:54:42.080
Basically, everybody who's a tech leader has a very high unfavorable rating.
link |
00:54:47.080
Maybe you can help me understand that.
link |
00:54:50.080
Why do you think so many people dislike you?
link |
00:54:53.080
Some even hate you.
link |
00:54:56.080
And how do you regain their trust and support?
link |
00:54:59.080
Given everything you just said, why are you losing the battle in explaining to people what actual impact social media has on society?
link |
00:55:11.080
Well, I'm curious if that's a US survey or for the world.
link |
00:55:16.080
It is US, yeah.
link |
00:55:18.080
So I think that there's a few dynamics.
link |
00:55:20.080
One is that our brand has been somewhat uniquely challenged in the US compared to other places.
link |
00:55:29.080
It's not that there aren't other countries where we have issues too, but I think in the US there was this dynamic where, if you look at the net sentiment of kind of coverage or attitude towards us...
link |
00:55:42.080
Before 2016, I think that there were probably very few months, if any, where it was negative.
link |
00:55:47.080
And since 2016, I think that there have probably been very few months, if any, where it's been positive.
link |
00:55:51.080
Politics.
link |
00:55:53.080
But I think it's a US-specific thing.
link |
00:55:55.080
And this is very different from other places.
link |
00:55:57.080
I think in a lot of other countries in the world, the sentiment towards meta and our services is extremely positive.
link |
00:56:04.080
In the US, we have more challenges.
link |
00:56:06.080
And I think compared to other companies, you can look at certain industries.
link |
00:56:12.080
I think if you look at it from like a partisan perspective, not from like a political perspective, but just kind of culturally.
link |
00:56:19.080
It's like there are people who are probably more left of center and there are people who are more right of center and there's, you know, kind of blue America and red America.
link |
00:56:25.080
There are certain industries that I think maybe one half of the country has a more positive view towards than another.
link |
00:56:32.080
And I think we're in a...
link |
00:56:36.080
One of the positions that we're in that I think is really challenging is that because of a lot of the content decisions that we've basically had to arbitrate.
link |
00:56:47.080
And because we're not a partisan company, right?
link |
00:56:50.080
We're not a Democrat company or a Republican company.
link |
00:56:53.080
We're trying to make the best decisions we can to help people connect and help people have as much voice as they can while, you know, having some rules.
link |
00:57:01.080
Because we're running a community.
link |
00:57:05.080
The net effect of that is that we're kind of constantly making decisions that piss off people in both camps.
link |
00:57:12.080
And the effect that I've sort of seen is that when we make a decision that is...
link |
00:57:21.080
That's a controversial one that's going to upset, say, about half the country.
link |
00:57:27.080
And those decisions are all negative-sum from a brand perspective because it's not like...
link |
00:57:33.080
Like if we make that decision in one way and, you know, say half the country is happy about that particular decision that we make, they tend to not say, oh, sweet, Meta got that one right.
link |
00:57:43.080
They're just like, ah, you didn't mess that one up, right?
link |
00:57:46.080
But their opinion doesn't tend to go up by that much.
link |
00:57:49.080
Whereas the people who kind of are on the other side of it are like, God, how could you mess that up?
link |
00:57:55.080
How could you possibly think that like that piece of content is okay and should be up and should not be censored?
link |
00:58:01.080
And so I think the...
link |
00:58:03.080
Whereas if you leave it up and, you know, it's...
link |
00:58:08.080
Or if you take it down, the people who thought it should be taken down are, you know, like, all right, fine, great, you didn't mess that one up.
link |
00:58:14.080
So our internal assessment of, kind of, analytics on our brand is basically: any time one of these big controversial things comes up in society,
link |
00:58:23.080
our brand goes down with half of the country.
link |
00:58:26.080
And then if you just kind of extrapolate that out, it's just been very challenging for us to try to navigate what is a polarizing country in a principled way.
link |
00:58:36.080
Where we're not trying to kind of hew to one side or the other.
link |
00:58:39.080
We're trying to do what we think is the right thing.
link |
00:58:41.080
But that's what I think is the right thing for us to do, though.
link |
00:58:44.080
So I mean, that's what we'll try to keep doing.
link |
00:58:47.080
But just as a human being, how does it feel, though, when you're giving so much of your day-to-day life to try to heal division, to try to do good in the world,
link |
00:58:58.080
as we've talked about, that so many people in the US, the place you call home, have a negative view of you as a leader, as a human being, and the company you love?
link |
00:59:14.080
Well, I mean, it's not great.
link |
00:59:16.080
But I mean, look, if I wanted people to think positively about me as a person, I don't know.
link |
00:59:26.080
I'm not sure if you go build a company.
link |
00:59:28.080
I mean, it's like...
link |
00:59:29.080
Or social media company.
link |
00:59:30.080
It seems exceptionally difficult to do with a social media company.
link |
00:59:33.080
Yeah.
link |
00:59:34.080
So I mean, I don't know.
link |
00:59:35.080
There is a dynamic where a lot of the other people running these companies, internet companies, have sort of stepped back.
link |
00:59:44.080
And they just do things that are sort of, I don't know, less controversial.
link |
00:59:50.080
And some of it may be that they just get tired over time.
link |
00:59:53.080
So I don't know.
link |
00:59:56.080
I think that running a company is hard.
link |
00:59:58.080
Building something at scale is hard.
link |
01:00:00.080
You only really do it for a long period of time if you really care about what you're doing.
link |
01:00:04.080
And yeah, so I mean, it's not great, but look, I think that at some level, whether 25% of people dislike you or 75% of people dislike you,
link |
01:00:18.080
your experience as a public figure is going to be that there's a lot of people who dislike you, right?
link |
01:00:23.080
So I actually am not sure how different it is.
link |
01:00:28.080
Certainly, the country's gotten more polarized and we in particular have gotten more controversial over the last five years or so.
link |
01:00:39.080
But I don't know.
link |
01:00:43.080
I kind of think like as a public figure and leader of one of these enterprises.
link |
01:00:48.080
Comes with the job.
link |
01:00:49.080
Yeah, part of what you do is like, and look, the answer can't just be to ignore it, right?
link |
01:00:54.080
Because like a huge part of the job is like, you need to be getting feedback and internalizing feedback on how you can do better.
link |
01:01:00.080
But I think increasingly what you need to do is be able to figure out, you know, who are the kind of good faith critics who are criticizing you
link |
01:01:09.080
because they're trying to help you do a better job rather than tear you down.
link |
01:01:13.080
And those are the people I just think you have to cherish and listen very closely to the things that they're saying.
link |
01:01:20.080
Because I think it's just as dangerous to tune out everyone who says anything negative and just listen to the people who are kind of positive and support you,
link |
01:01:30.080
as it would be, psychologically, to pay attention to trying to make people who are never going to like you, like you.
link |
01:01:36.080
So that's just kind of a dance that people have to do.
link |
01:01:40.080
But I mean, you know, you kind of develop more of a feel for like who actually is trying to accomplish the same types of things in the world and who has different ideas about how to do that.
link |
01:01:51.080
And how can I learn from those people?
link |
01:01:53.080
And like, yeah, we get stuff wrong.
link |
01:01:55.080
And when the people whose opinions I respect call me out on getting stuff wrong, that hurts and makes me want to do better.
link |
01:02:02.080
But I think at this point, I'm pretty tuned to just, all right, if I know they're kind of like operating in bad faith and they're not really trying to help, then, you know, I don't know.
link |
01:02:12.080
It doesn't, you know, I think over time it just doesn't bother you that much.
link |
01:02:15.080
But you are surrounded by people that believe in the mission, that love you.
link |
01:02:21.080
Are there friends or colleagues in your inner circle you trust that call you out on your bullshit whenever your thinking may be misguided, as it is for leaders at times?
link |
01:02:31.080
I think we have a famously open company culture where we sort of encourage that kind of dissent internally, which is, you know, why there's so much material internally that can leak out with people sort of disagreeing: because that's sort of the culture.
link |
01:02:47.080
You know, our management team, I think it's a lot of people, you know, there's some newer folks who come in, there's some folks who have kind of been there for a while, but there's a very high level of trust.
link |
01:02:57.080
I would say it is a relatively confrontational group of people, and my friends and family, I think, will push me on this. But look, it's not just that; I think you need some diversity, right?
link |
01:03:09.080
It can't just be, you know, people who are your friends and family.
link |
01:03:14.080
So, you know, I mean, there are journalists or analysts or, you know, peer executives at other companies or, you know, other people who sort of are insightful about thinking about the world, you know, certain politicians or people kind of in that sphere who I just
link |
01:03:33.080
think have like very insightful perspectives; even if they come at the world from a different perspective, that is sort of what makes the perspective so valuable.
link |
01:03:44.080
But, you know, I think fundamentally you're trying to get to the same place in terms of, you know, helping people connect more, helping the whole world function better, not just, you know, one place or another.
link |
01:03:57.080
And I don't know, I mean, those are the people whose opinions really matter to me.
link |
01:04:03.080
And I just, you know, that's how I learn on a day to day basis.
link |
01:04:06.080
People are constantly sending me comments on stuff or links to things they found interesting.
link |
01:04:10.080
And I don't know, it's kind of constantly evolving this model of the world and kind of what we should be aspiring to be.
link |
01:04:17.080
You've talked about, you have a famously open culture, which comes with the criticism and the painful experiences.
link |
01:04:27.080
So let me ask you another difficult question.
link |
01:04:30.080
Frances Haugen, the Facebook whistleblower, leaked the internal Instagram research into teenagers and well-being.
link |
01:04:38.080
Her claim is that Instagram is choosing profit over the well-being of teenage girls.
link |
01:04:43.080
So Instagram is, quote, "toxic" for them.
link |
01:04:46.080
Your response, titled "What Our Research Really Says About Teen Well-Being and Instagram," says no.
link |
01:04:54.080
The Instagram research shows that on 11 of 12 well-being issues,
link |
01:04:58.080
teenage girls who said they struggle with those difficult issues also said that Instagram made them better rather than worse.
link |
01:05:07.080
Again, can you steelman and defend the point in Frances Haugen's characterization of the study, and then help me understand the positive and negative effects of Instagram and Facebook on young people?
link |
01:05:20.080
So there are certainly questions around teen mental health that are really important.
link |
01:05:26.080
It's hard to, you know, as a parent, it's like hard to imagine any set of questions that are sort of more important.
link |
01:05:32.080
I mean, I guess maybe other aspects of physical health or well-being probably come up to that level.
link |
01:05:37.080
But like, these are really important questions, right, which is why we dedicate teams to studying them.
link |
01:05:45.080
You know, I don't think the internet or social media are unique in having these questions.
link |
01:05:50.080
I mean, there have been sort of magazines promoting certain body types to women and kids for decades.
link |
01:05:58.080
But, you know, we really care about this stuff. So we wanted to study it.
link |
01:06:03.080
And of course, you know, we didn't expect that everything was going to be positive all the time.
link |
01:06:07.080
So I mean, the reason why you study this stuff is to try to improve and get better.
link |
01:06:11.080
So I mean, look, the place where I disagree with the characterization first, I thought, you know, some of the reporting and coverage of it just took the whole thing out of proportion and that it focused on, as you said, I think there were like 20 metrics in there.
link |
01:06:24.080
And on 18 or 19, the effect of using Instagram was neutral or positive on the teen's well being.
link |
01:06:31.080
And there was one area where I think it showed that we needed to improve and we took some steps to try to do that after doing the research.
link |
01:06:39.080
But I think having the coverage just focus on that one without focusing on the... you know, I mean, I think an accurate characterization would have been that kids using Instagram, or not kids, teens, is generally positive for their mental health.
link |
01:06:53.080
But of course, that was not the narrative that came out. So it's hard to, that's not a kind of logical thing to strawman or steelman; I sort of disagree with that overall characterization.
link |
01:07:04.080
I think anyone sort of looking at this objectively would.
link |
01:07:09.080
But then, you know, I mean, there is this sort of intent critique that I think you were getting at before, which says, you know, it assumes some sort of malevolence, right?
link |
01:07:20.080
It's really hard for me to wrap my head around this, because as far as I know, it's not clear that any of the other tech companies are doing this kind of research.
link |
01:07:33.080
So why should the narrative form? We did the research because we're studying an issue, because we wanted to understand it to improve, and we took steps after that to try to improve it.
link |
01:07:43.080
That your interpretation of that would be that we did the research and tried to sweep it under the rug, it just sort of is, I don't know, beyond credibility to me that that's the accurate description of the actions that we've taken compared to the others in the industry.
link |
01:08:01.080
So, I don't know, that's kind of my view on it. These are really important issues and there's a lot of stuff that I think we're going to be working on related to teen mental health for a long time, including trying to understand this better.
link |
01:08:14.080
And I would encourage everyone else in the industry to do this too.
link |
01:08:18.080
Yeah, I would love there to be open conversations and a lot of great research being released internally and then also externally. It doesn't make me feel good to see press obviously get way more clicks when they say negative things about social media.
link |
01:08:39.080
Objectively speaking, I can just tell that there's hunger to say negative things about social media. And I don't understand how that's supposed to lead to an open conversation about the positives and the negatives, the concerns about social media,
link |
01:08:55.080
especially when you're doing that kind of research. I mean, I don't know what to do with that. But let me ask you as a father: does it weigh heavy on you that people get bullied on social networks?
link |
01:09:10.080
So, people get bullied in their private life. But now, because so much of our life is in the digital world, the bullying moves from the physical world to the digital world.
link |
01:09:21.080
So, you're now creating a platform on which bullying happens and some of that bullying can lead to damage to mental health and some of that bullying can lead to depression, even suicide.
link |
01:09:37.080
Does it weigh heavy on you that people have committed suicide or will commit suicide based on the bullying that happens on social media?
link |
01:09:48.080
Yeah, I mean, there's a set of harms that we basically track and build systems to fight against. And bullying and self-harm, these are some of the biggest things that we are most focused on.
link |
01:10:11.080
And for bullying, like you say, while this predates the internet, it's probably impossible to get rid of all of it.
link |
01:10:22.080
You want to give people tools to fight it, and you want to fight it yourself. And you also want to make sure that people have the tools to get help when they need it.
link |
01:10:30.080
So, I think this isn't like a question of, can you get rid of all bullying? I mean, it's like, all right. I mean, I have two daughters and they fight and push each other around and stuff too.
link |
01:10:44.080
And the question is just how do you handle that situation? And there's a handful of things that I think you can do.
link |
01:10:52.080
And we talked a little bit before around some of the AI tools that you can build to identify when something harmful is happening.
link |
01:11:00.080
It's actually very hard with bullying, because a lot of bullying is very context-specific. It's not like you're trying to fit a formula; like, looking at the different harms, you know, someone promoting a terrorist group is
link |
01:11:13.080
probably one of the simpler things to generally find, because things promoting that group are going to, you know, look a certain way or feel a certain way. Bullying could just be, you know, someone making some subtle comment about someone's appearance that's idiosyncratic to them.
link |
01:11:26.080
And it could look just like humor. So humor to one person can be destructive to another human being. Yeah.
link |
01:11:32.080
So with bullying, I think there are certain things that you can find through AI systems. But I think it is increasingly important to just give people more agency themselves.
link |
01:11:44.080
So we've done things like making it so people can turn off comments or, you know, take a break from, you know, hearing from a specific person without having to signal at all that they're going to stop following them
link |
01:11:55.080
or kind of make some stand that, okay, I'm not friends with you anymore, I'm not following you. I just don't want to hear about this, but I also don't want to signal publicly, or to them, that there's been an issue.
link |
01:12:10.080
And then you get to some of the more extreme cases like you're talking about where someone is thinking about self harm or suicide. And in there, we've found that that is a place where AI can identify a lot, as well as people flagging things.
link |
01:12:28.080
You know, if people are expressing something that, you know, suggests they're potentially thinking of hurting themselves, those are cues that you can build systems, in, you know, hundreds of languages around the world, to be able to identify.
link |
01:12:41.080
And one of the things that I'm actually quite proud of is we've built these systems that I think are clearly leading at this point, that not only identify that, but then connect with local first responders, and have been able to, I think at this point,
link |
01:13:00.080
you know, in thousands of cases, get first responders to people through these systems, who really need them, because of specific plumbing that we've done between the AI work and being able to communicate with local first responder
link |
01:13:13.080
organizations. We're rolling that out in more places around the world. And I think the team that worked on that just did awesome stuff. So I think that that's a long way of saying: yeah, I mean, this is a heavy topic, and you want to attack it in a bunch of different
link |
01:13:28.080
ways, and also kind of understand that some of human nature is for people to do this to each other, which is unfortunate, but you can give people tools and build things that help.
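A minimal sketch of the flag-and-escalate flow he describes, with a stub standing in for the trained multilingual classifier; the cue list, threshold, and function names are hypothetical, not Meta's actual system.

```python
# Toy routing: a risk score flags posts that may indicate self-harm;
# high-scoring posts go to human review, which can engage first responders.
def risk_score(text: str) -> float:
    """Stand-in for a trained multilingual classifier's probability output."""
    cues = ("can't go on", "hurt myself", "end it all")
    return 1.0 if any(cue in text.lower() for cue in cues) else 0.0

REVIEW_THRESHOLD = 0.8  # hypothetical operating point

def route(post: str) -> str:
    if risk_score(post) >= REVIEW_THRESHOLD:
        return "escalate_to_human_review"  # humans contact local responders
    return "no_action"

print(route("I feel like I can't go on"))  # escalate_to_human_review
```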
link |
01:13:40.080
It's still one hell of a burden, no? A platform that allows people to fall in love with each other is also by nature going to be a platform that allows people to hurt each other. And when you're managing such a platform, it's difficult.
link |
01:13:56.080
And I think you spoke to it, but the psychology of that, of being a leader in that space, of creating technology that's playing in this space, like you mentioned, is really damn difficult.
link |
01:14:09.080
And I mean, the burden of that is just, it's just great. I just wanted to hear you speak to that point.
link |
01:14:18.080
I have to ask about the thing you've brought up a few times, which is making controversial decisions.
link |
01:14:26.080
Let's talk about free speech and censorship. So there are two groups of people pressuring Meta on this. One group is upset that Facebook, the social network, allows misinformation, in quotes, to be spread on the platform.
link |
01:14:41.080
The other group is concerned that Facebook censors speech by calling it misinformation. So you're getting it from both sides.
link |
01:14:49.080
In October 2019, at Georgetown University, you eloquently defended the importance of free speech.
link |
01:14:58.080
But then COVID came. And the 2020 election came. Do you worry that outside pressures from advertisers, politicians, the public have forced Meta to damage the ideal of free speech that you spoke highly of?
link |
01:15:14.080
Just to say some obvious things up front, I don't think pressure from advertisers or politicians directly in any way affects how we think about this. I think these are just hard topics.
link |
01:15:25.080
So let me just take you through our evolution from kind of the beginning of the company to where we are now.
link |
01:15:30.080
You don't build a company like this unless you believe that people expressing themselves is a good thing.
link |
01:15:35.080
So that's sort of the foundational thing. You can kind of think about our company as a formula where we think giving people voice and helping people connect creates opportunity.
link |
01:15:47.080
So those are the two things that we're always focused on are sort of helping people connect. We talked about that a lot, but also giving people voice and ability to express themselves.
link |
01:15:56.080
And by the way, most of the time when people express themselves, that's not like politically controversial content.
link |
01:16:01.080
It's like expressing something about their identity that's more related to the avatar conversation we had earlier in terms of expressing some facet.
link |
01:16:09.080
But that's what's important to people on a day to day basis. And sometimes when people feel strongly enough about something, it kind of becomes a political topic.
link |
01:16:16.080
That's sort of always been a thing that we focused on. There's always been the question of safety in this, which, you know, if you're building a community, I think you have to focus on safety.
link |
01:16:26.080
We've had these community standards from early on. And there are about 20 different kinds of harm that we track and try to fight actively.
link |
01:16:35.080
And we've talked about some of them already. So it includes things like bullying and harassment.
link |
01:16:41.080
It includes things like terrorism or promoting terrorism, inciting violence, intellectual property theft.
link |
01:16:49.080
And in general, I think, call it about 18 out of 20 of those, there's not really a particularly polarized definition of that.
link |
01:16:58.080
I think you're not really going to find many people in the country or in the world who are trying to say we should be fighting terrorist content less.
link |
01:17:09.080
I think that there are a couple of areas where this has gotten more controversial recently, which I'll talk about.
link |
01:17:16.080
And you're right that misinformation is basically up there. And I think sometimes the definition of hate speech is up there too.
link |
01:17:23.080
But I think in general, most of the content that I think we're working on for safety is not actually, you know, people don't kind of have these questions.
link |
01:17:33.080
So it's sort of this subset. But if you go back to the beginning of the company, this was sort of pre deep learning days.
link |
01:17:42.080
And, you know, it was me, and my roommate Dustin joined me.
link |
01:17:48.080
And like, if someone posted something bad, you know, the AI technology did not exist yet to be able to go basically look at all the content.
link |
01:18:02.080
And we were a small enough outfit that no one would expect that we could review it all.
link |
01:18:09.080
Even if like someone reported it to us, we basically did our best, right?
link |
01:18:12.080
It's like someone would report it and we try to look at stuff and then deal with stuff.
link |
01:18:17.080
And for, call it, the first, I don't know, seven or eight years of the company, you know, we weren't that big of a company; you know, for a lot of that period, we weren't even really profitable.
link |
01:18:29.080
And the AI didn't really exist to be able to do the kind of moderation that we do today.
link |
01:18:33.080
And then at some point, in kind of the middle of the last decade, that started to flip.
link |
01:18:38.080
And we, you know, we became, it got to the point where we were sort of a larger and more profitable company and the AI was starting to come online to be able to proactively detect some of the simpler forms of this.
link |
01:18:53.080
So, things like pornography: you could train an image classifier to, you know, identify what a nipple was, or you could fight against terrorist content.
link |
01:19:01.080
There's actually papers on this. That's great.
link |
01:19:03.080
Oh, of course there are technical papers.
link |
01:19:05.080
Of course there are.
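For flavor, here is a hedged sketch of the kind of binary image classifier he's describing: fine-tuning a pretrained backbone to separate violating from benign images. This is generic PyTorch, not Meta's production system, and the labels are hypothetical.

```python
# Fine-tune a pretrained ResNet-18 as a two-class (violating/benign) classifier.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # replace head: 2 classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised step on a batch of labeled example images."""
    model.train()
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random tensors standing in for a labeled moderation dataset.
print(train_step(torch.randn(8, 3, 224, 224), torch.randint(0, 2, (8,))))
```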
link |
01:19:06.080
You know, those are relatively easier things to train AI to do than, for example, understand the nuances of what is inciting violence in 100 languages around the world and not have the false positives of like,
link |
01:19:20.080
Okay, are you posting about this thing that might be inciting violence, because you're actually trying to denounce it, in which case we probably shouldn't take that down.
link |
01:19:28.080
Where if you're trying to denounce something that's inciting violence in some kind of dialect in a corner of India, as opposed to, okay, actually you're posting this thing because you're trying to incite violence.
link |
01:19:39.080
Okay, so building an AI that can basically get to that level of nuance, in all the languages that we serve,
link |
01:19:46.080
is something that I think is only really becoming possible now, not towards the middle of the last decade. But there's been this evolution.
link |
01:19:55.080
And I think what happened, you know, people sort of woke up after 2016.
link |
01:20:00.080
And, you know, a lot of people were like, okay, the country is a lot more polarized and there's a lot more stuff here than we realized.
link |
01:20:09.080
Why weren't these internet companies on top of this?
link |
01:20:13.080
And I think at that point, it was reasonable feedback that, you know, some of this technology had started becoming possible.
link |
01:20:23.080
And at that point, I really did feel like we needed to make a substantially larger investment.
link |
01:20:29.080
We'd already worked on this stuff a lot on AI and on these integrity problems, but that we should basically invest, you know, have a thousand or more engineers basically work on building these AI systems to be able to go and
link |
01:20:41.080
proactively identify the stuff across all these different areas.
link |
01:20:44.080
Okay, so we went and did that.
link |
01:20:46.080
Now we've built the tools to be able to do that.
link |
01:20:49.080
And now I think it's actually a much more complicated set of philosophical rather than technical questions, which is: what exactly the policies should be, what's okay.
link |
01:20:58.080
Now, the way that we basically hold ourselves accountable is we issue transparency reports every quarter and the metric that we track is for each of those 20 types of harmful content.
link |
01:21:11.080
How much of that content are we taking down before someone even has to report it to us?
link |
01:21:15.080
Right.
link |
01:21:16.080
So how effective is our AI at doing this?
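The metric he describes reduces to a simple ratio. A hedged sketch, with made-up numbers:

```python
# Proactive rate: of all harmful content actioned in a quarter, what share
# did automated systems find before any user report? (Illustrative numbers.)
def proactive_rate(found_proactively: int, user_reported: int) -> float:
    return found_proactively / (found_proactively + user_reported)

print(f"{proactive_rate(970_000, 30_000):.1%}")  # 97.0%
```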
link |
01:21:18.080
But that basically creates this big question, which is, okay, now we need to really be careful about how proactive we set the AI and where the exact policy lines are around what we're taking down.
link |
01:21:31.080
It's certainly at a point now where, you know... I felt like at the beginning of that journey of building those AI systems, there was a lot of push,
link |
01:21:43.080
something like, okay, you've got to do more. There's clearly a lot more bad content that people aren't reporting or that you're not getting to, and you need to get more effective at that.
link |
01:21:51.080
And I was pretty sympathetic to that.
link |
01:21:53.080
But then I think at some point along the way, there started to be almost equal issues on both sides of, okay, actually, you're kind of taking down too much stuff, right?
link |
01:22:02.080
Or some of the stuff is borderline, and it wasn't really bothering anyone and they didn't report it.
link |
01:22:09.080
So is that really an issue that you need to take down?
link |
01:22:13.080
Whereas we still have the critique on the other side too, where a lot of people think we're not doing enough.
link |
01:22:18.080
So it's become, as we built the technical capacity, I think it becomes more philosophically interesting almost where you want to be on the line.
link |
01:22:28.080
And I just think you don't want one person making those decisions. So we've also tried to innovate in terms of building out this independent oversight board, which has people who are dedicated to free expression, but from around the world, who people can appeal cases to.
link |
01:22:44.080
So a lot of the most controversial cases basically go to them and they make the final binding decision on how we should handle that.
link |
01:22:50.080
And then, of course, their decisions, we then try to figure out what the principles are behind those and encode them into the algorithms.
link |
01:22:56.080
And how are those people chosen, to whom you're outsourcing a difficult decision?
link |
01:23:01.080
Yeah, the initial people, we chose a handful of chairs for the group.
link |
01:23:09.080
And we basically chose the people for a commitment to free expression and a broad understanding of human rights and the tradeoffs around free expression.
link |
01:23:22.080
But fundamentally, people who are going to lean towards free expression.
link |
01:23:25.080
Towards freedom of speech.
link |
01:23:26.080
Okay. So there's also this idea of fact checkers. So jumping around to the misinformation questions, especially during COVID, which, speaking of polarization, was exceptionally polarizing.
link |
01:23:36.080
Can I speak to the COVID thing?
link |
01:23:38.080
Yes.
link |
01:23:39.080
I mean, I think one of the hardest sets of questions around free expression... because you asked about Georgetown: has my stance fundamentally changed?
link |
01:23:44.080
And the answer to that is, no, my stance has not changed. It is fundamentally the same as when I was talking at Georgetown, from a philosophical perspective.
link |
01:23:56.080
The challenge with free speech is that everyone agrees that there is a line where if you're actually about to do physical harm to people, that there should be restrictions.
link |
01:24:10.080
So I mean, there's the famous Supreme Court historical example of like you can't yell fire in a crowded theater.
link |
01:24:17.080
The thing that everyone disagrees on is what is the definition of real harm, where I think some people think, okay, this should only be a very literal, I mean, take it back to the bullying conversation we were just having,
link |
01:24:30.080
where: is it just harm if the person is about to hurt themselves because they've been bullied so hard, or is it actually harm as they're being bullied? And kind of at what point in the spectrum is that? That's the part that there's not agreement on.
link |
01:24:44.080
But I think what people agree on pretty broadly is that when there is an acute threat that it does make sense from a societal perspective to tolerate less speech that could be potentially harmful in that acute situation.
link |
01:24:59.080
So I think where COVID got very difficult is, you know, I don't think anyone expected this to be going on for years.
link |
01:25:05.080
But if you'd kind of asked that a priori: a global pandemic where, you know, a lot of people are dying and catching this, is that an emergency where you'd kind of consider it, you know, problematic to basically yell fire in a crowded theater?
link |
01:25:26.080
I think that that probably passes that test. So I think that it's a very tricky situation, but I think the fundamental commitment to free expression is there.
link |
01:25:38.080
And that's what I believe. And again, I don't think you start this company unless you care about people being able to express themselves as much as possible.
link |
01:25:44.080
But I think that that's the question, right, is like, how do you define what the harm is and how acute that is.
link |
01:25:52.080
And what are the institutions that define that harm? A lot of the criticism is that the CDC, the WHO, the institutions we've come to trust as a civilization to give the line of what is and isn't harm in terms of health policy have failed, in small ways
link |
01:26:12.080
and in big ways, depending on who you ask. And then the perspective of Meta and Facebook is like, well, where the hell do I get the information of what is and isn't misinformation?
link |
01:26:22.080
So it's a really difficult place to be in. But it's great to hear that you're leaning towards freedom of speech on this aspect.
link |
01:26:30.080
And again, I think this actually calls to the fact that we need to reform institutions that help keep an open mind of what is and isn't misinformation.
link |
01:26:39.080
And misinformation has been used to bully on the internet. I mean, you know, I'm friends with Joe Rogan, and he is called a... I remember hanging out with him in Vegas and somebody yelled, "Stop spreading misinformation."
link |
01:26:54.080
I mean, and there's a lot of people that follow him that believe he's not spreading misinformation. Like you can't just not acknowledge the fact that there's a large number of people that have a different definition of misinformation.
link |
01:27:08.080
That's such a tough place to be. Like, who do you listen to? Do you listen to quote-unquote experts? Who gets to be an expert? As a person who has a PhD, I've got to say, I mean, I'm not sure I know what defines an expert,
link |
01:27:20.080
especially in a totally new pandemic or a new catastrophic event, especially when politics is involved, and especially when the news or the media is involved, which can propagate sort of outrageous narratives and thereby make a lot of money.
link |
01:27:40.080
Like, what the hell? Where is the source of truth? And then everybody turns to Facebook, it's like, please tell me what the source of truth is.
link |
01:27:48.080
Well, I mean, well, how would you handle this if you're in my position?
link |
01:27:52.080
It's very, very, very, very difficult. I would say I would more speak about how difficult the choices are and be transparent about like, what the hell do you do with this?
link |
01:28:05.080
Like here, ask the exact question you just asked me, but to the broader public: like, okay, yeah, you guys tell me what to do. So, like, crowdsource it. And then the other aspect is, you spoke really eloquently about the fact that there's this going back and forth.
link |
01:28:23.080
And now there's a feeling like you're censoring a little bit too much. And so I would lean, I would try to be ahead of that feeling. I would now lean towards freedom of speech and say, you know, we're not the ones that are going to define misinformation.
link |
01:28:35.080
Let it be a public debate. Let the idea stand. And with this idea of misinformation, I actually place the responsibility on the poor communication skills of scientists.
link |
01:28:49.080
They should be in the battlefield of ideas. And everybody who is spreading information against the vaccine, they should not be censored. They should be talked with.
link |
01:29:01.080
And you should show the data. You should have open discussion, as opposed to rolling your eyes and saying, I'm the expert. I know what I'm talking about.
link |
01:29:09.080
No, you need to convince people it's a battle of ideas. So that's the whole point of freedom of speech is the way to defeat bad ideas is with good ideas with speech.
link |
01:29:19.080
So, like, the responsibility here falls on the poor communication skills of scientists. Thanks to social media,
link |
01:29:29.080
scientists are now communicators. They have the power to communicate. Some of the best stuff I've seen about COVID from doctors is on social media.
link |
01:29:38.080
It's a way to learn, to respond really quickly, to go faster than the peer review process. And so they just need to get way better at that communication.
link |
01:29:46.080
And also by better, I don't mean just convincing. I also mean speak with humility. Don't talk down to people, all those kinds of things.
link |
01:29:54.080
And as a platform, I would say I would step back a little bit, not all the way, of course, because there's a lot of stuff that can cause real harm, as we talked about.
link |
01:30:04.080
But you lean more towards freedom of speech because then people from a brand perspective wouldn't be blaming you for the other ills of society, which there are many.
link |
01:30:14.080
The institutions have flaws. The political divide. Obviously politicians have flaws, that's not news.
link |
01:30:23.080
The media has flaws that they're all trying to work with. And because of the central place of Facebook in the world, all of those flaws somehow kind of propagate to Facebook.
link |
01:30:34.080
And you're sitting there like Plato the philosopher, having to answer some of the most difficult questions being asked of human civilization.
link |
01:30:44.080
So I don't know, maybe this is an American answer, though, to lean towards freedom of speech. I don't know if that applies globally.
link |
01:30:51.080
So yeah, I don't know. But transparency helps. I think, as a technologist, one of the things I sense when people talk about this company is that they don't necessarily understand how difficult the problem is.
link |
01:31:06.080
You talked about how AI has to catch a bunch of harmful stuff really quickly, given the sea of data you have to deal with. It's a really difficult problem.
link |
01:31:16.080
So for any of the critics, if you just handed them the helm for a week, let's see how well they could do.
link |
01:31:25.080
That, to me, is definitely something that would wake people up to how difficult this problem is. Part of transparency is saying how difficult the problem is.
link |
01:31:35.080
Let me ask you, just because you mentioned language and my own ineloquence, about translation; it's something I wanted to ask you about.
link |
01:31:44.080
And first, just to give a shout out to the supercomputer, you've recently announced the AI Research Supercluster, RSC.
link |
01:31:52.080
Obviously, I'm somebody who loves GPUs. It currently has 6,000 GPUs, NVIDIA DGX A100 systems with 6,000 GPUs in total, and eventually, maybe this year, maybe soon, it will have 16,000 GPUs.
link |
01:32:10.080
So it can do a bunch of different kinds of machine learning applications. There's a cool thing on the distributed storage aspect and all that kind of stuff.
link |
01:32:19.080
So one of the applications that I think is super exciting is translation, real time translation.
link |
01:32:26.080
I mentioned to you that, you know, having a conversation, I speak Russian fluently, I speak English somewhat fluently.
link |
01:32:32.080
And say I'm having a conversation with Vladimir Putin, as a use case, me as a user coming to you.
link |
01:32:38.080
We both speak each other's language. I speak Russian, he speaks English.
link |
01:32:44.080
How can we have that communication go well with the help of AI?
link |
01:32:49.080
I think it's such a beautiful and powerful application of AI to connect the world, to bridge the gap, not necessarily between me and Putin, but between people that don't have a shared language.
link |
01:33:01.080
Can you just speak about your vision with translation? Because I think that's a really exciting application.
link |
01:33:06.080
If you're trying to help people connect all around the world, a lot of content is produced in one language, and people in all these other places are interested in it.
link |
01:33:15.080
So being able to translate that just unlocks a lot of value on a day to day basis.
link |
01:33:20.080
And so the kind of AI around translation is interesting because it's gone through a bunch of iterations.
link |
01:33:28.080
The basic state of the art is that you don't want to go through different kinds of intermediate symbolic representations of language or anything like that.
link |
01:33:42.080
You basically want to map the concepts and go directly from one language to another, and you can just train bigger and bigger models in order to do that.
link |
01:33:54.080
And that's where the Research SuperCluster comes in: a lot of the trend in machine learning is just that you're building bigger and bigger models, and you need a lot of computation to train them.
link |
01:34:05.080
So it's not that the translation itself would run on the supercomputer; it's the training of the model, which could have billions or trillions of examples.
link |
01:34:18.080
You're training models on this supercluster in days or weeks that might take a much longer period of time on a smaller cluster.
link |
01:34:28.080
So it just wouldn't be practical for most teams to do.
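A minimal sketch of the data-parallel pattern being described, assuming PyTorch's DistributedDataParallel; the model, dataset, batch size, and learning rate here are illustrative placeholders, not Meta's actual training stack.

```python
# Sketch: data-parallel training. Each GPU holds a model replica,
# consumes a different shard of the data, and gradients are averaged
# across all GPUs after every backward pass, so more GPUs means
# shorter wall-clock time for the same number of examples.
# Launch with torchrun, one process per GPU.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

def train(model, dataset, epochs=1):
    dist.init_process_group("nccl")            # join the GPU process group
    rank = dist.get_rank()
    model = DDP(model.cuda(rank), device_ids=[rank])
    sampler = DistributedSampler(dataset)      # shards data across ranks
    loader = DataLoader(dataset, batch_size=32, sampler=sampler)
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    for epoch in range(epochs):
        sampler.set_epoch(epoch)               # reshuffle shards each epoch
        for x, y in loader:
            opt.zero_grad()
            loss = torch.nn.functional.cross_entropy(
                model(x.cuda(rank)), y.cuda(rank))
            loss.backward()                    # gradients all-reduced here
            opt.step()
```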
link |
01:34:30.080
But on the translation work, we're basically going from being able to translate between about 100 languages seamlessly today to being able to handle about 300 languages in the near term.
link |
01:34:46.080
So from any language to any other language.
link |
01:34:49.080
And part of the issue when you get closer to more languages is some of these get to be not very popular languages where there isn't that much content in them.
link |
01:35:04.080
So you end up having less data, and you need to use a model that you've built up around other examples. And this is one of the big questions around AI: how generalizable can things be?
link |
01:35:16.080
And that, I think, is one of the things that's kind of exciting here from a technical perspective.
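To illustrate that generalization point, one common pattern is to pretrain a single multilingual model on high-resource language pairs and then fine-tune it on the small low-resource corpus, so the low-resource direction inherits what the model learned elsewhere. The names below (MultilingualTranslator, load_parallel_corpus, fine_tune) are hypothetical placeholders, not a real library API.

```python
# Sketch of low-resource transfer: the model's parameters come mostly
# from high-resource language pairs; the tiny low-resource corpus only
# fine-tunes them. All names here are hypothetical placeholders.

model = MultilingualTranslator.pretrain(
    load_parallel_corpus(["en-fr", "en-ru", "en-es"]),  # lots of data
    steps=1_000_000,
)
# A low-resource pair may have only thousands of sentence pairs, but the
# shared encoder/decoder already knows how to map concepts across languages.
model = fine_tune(model, load_parallel_corpus(["en-gn"]), steps=5_000)
print(model.translate("Hello, world", src="en", tgt="gn"))
```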
link |
01:35:21.080
But capturing, and we talked about this with the metaverse, capturing the magic of human to human interaction.
link |
01:35:26.080
So me and Putin.
link |
01:35:28.080
Okay.
link |
01:35:29.080
Again, it's a tough example because you actually both speak Russian and English.
link |
01:35:33.080
But that's in the future.
link |
01:35:34.080
I see it as a Turing test of a kind, because we would both like to have an AI that improves on us, because I don't speak Russian that well.
link |
01:35:42.080
He doesn't speak English that well.
link |
01:35:44.080
It would be nice to outperform our abilities.
link |
01:35:48.080
And it sets a really nice bar because I think AI can really help in translation for people that don't speak the language at all.
link |
01:35:55.080
But to actually capture the magic of the chemistry through translation, that would make the metaverse super immersive.
link |
01:36:04.080
That's exciting.
link |
01:36:05.080
You remove the barrier of language period.
link |
01:36:08.080
Yeah.
link |
01:36:09.080
So when people think about translation, I think a lot of the time they're thinking about text to text.
link |
01:36:14.080
But speech to speech, I think, is a whole other thing.
link |
01:36:17.080
And I mean, one of the big lessons on that, which I was referring to before, is that early models were like, all right, they take speech, they translate it to text,
link |
01:36:24.080
translate the text to another language, and then output that as speech in that language.
link |
01:36:29.080
And you don't want to do that.
link |
01:36:30.080
You just want to be able to go directly from speech in one language to speech in another language and build up the models to do that.
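A sketch of the two pipelines being contrasted here; asr_model, mt_model, tts_model, and s2st_model are hypothetical placeholders standing in for the respective components, not a real API.

```python
# Early, cascaded approach: three separately trained stages, so errors
# compound and paralinguistic information (tone, emphasis) is lost at
# the text bottleneck.
def cascaded_translate(speech_src):
    text_src = asr_model.transcribe(speech_src)   # speech -> text
    text_tgt = mt_model.translate(text_src)       # text -> text
    return tts_model.synthesize(text_tgt)         # text -> speech

# Direct approach described above: one model trained end to end to map
# source-language speech straight to target-language speech.
def direct_translate(speech_src):
    return s2st_model.generate(speech_src)
```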
link |
01:36:36.080
And when you look at the progress in machine learning, there have been big advances in the techniques.
link |
01:36:46.080
Some of the advances in self supervised learning, which I know you talked to Yann about, and he's one of the leading thinkers in this area.
link |
01:36:55.080
I just think that that stuff is really exciting.
link |
01:36:57.080
But then you couple that with the ability to just throw larger and larger amounts of compute at training these models.
link |
01:37:04.080
And you can just do a lot of things that were harder to do before.
link |
01:37:09.080
But we're asking more of our systems too, right?
link |
01:37:13.080
So think about the applications that we're going to need for the metaverse. Or, think about it this way.
link |
01:37:19.080
Okay, so let's talk about AR here for a second.
link |
01:37:21.080
You're going to have these glasses.
link |
01:37:23.080
They're hopefully going to look like a normal-ish pair of glasses, but they're going to be able to put holograms in the world and intermix virtual and physical objects in your scene.
link |
01:37:36.080
And one of the things that's going to be unique about this compared to every other computing device that you've had before, is that this is going to be the first computing device that has all the same signals about what's going on around you that you have.
link |
01:37:49.080
Whereas with your phone, you can have it take a photo or a video, these glasses, whenever you activate them, are going to be able to see what you see from your perspective.
link |
01:38:00.080
They're going to be able to hear what you hear, because the microphones and all that are going to be right around where your ears are.
link |
01:38:05.080
So you're going to want a new kind of AI assistant that can basically help you process the world from this first person perspective, from the perspective that you have.
link |
01:38:18.080
And the utility of that is going to be huge, but the kinds of AI models that we're going to need are going to be just, I don't know, there's a lot that we're going to need to basically make advances in.
link |
01:38:31.080
But I mean, but that's why I think these concepts of the metaverse and the advances in AI are so fundamentally interlinked that, I mean, they're kind of enabling each other.
link |
01:38:42.080
Yeah, like the world builder is a really cool idea. Like, you can be like a Bob Ross, like, I'm going to put a little tree right here.
link |
01:38:49.080
Yeah, I need a little tree. It's missing a little tree. And then at scale, like enriching your experience in all kinds of ways.
link |
01:38:55.080
You mentioned the assistant too. That's really interesting, how you can have AI assistants helping you out at different levels of intimacy of communication.
link |
01:39:03.080
It could be just like scheduling or it could be like almost like therapy. Clearly, I need some.
link |
01:39:09.080
So let me ask you, you're one of the most successful people ever. You've built an incredible company that has a lot of impact.
link |
01:39:17.080
What advice do you have for young people today? How to live a life they can be proud of?
link |
01:39:25.080
How to build something that can have a big positive impact on the world?
link |
01:39:34.080
Well, let's break that down, because I think a life you're proud of, having a big positive impact,
link |
01:39:41.080
You're actually listening. And how to live your life, are actually three different things that I think could line up.
link |
01:39:49.080
And also, like, what age of people are you talking to? I mean, it could be high school and college, so you don't really know what you're doing, but you dream big.
link |
01:39:58.080
And you really have a chance to do something unprecedented.
link |
01:40:02.080
Yeah. So let's maybe start with the kind of most philosophical and abstract version of this.
link |
01:40:12.080
Every night when I put my daughters to bed, we go through this thing, and they call it the good night things, because it's basically what we talk about at night.
link |
01:40:25.080
And I go through them. Sounds like a good show. Yeah, the good night things. Priscilla's always asking, can I get good night things?
link |
01:40:35.080
I don't know, you go to bed too early. But I basically go through with Max and Augie, what are the things that are most important in life?
link |
01:40:49.080
It's like, what do I want them to remember and just have like really ingrained in them as they grow up? And it's health, right?
link |
01:40:56.080
Making sure that you take care of yourself and keep yourself in good shape. Loving friends and family, right?
link |
01:41:03.080
Because, you know, having the relationships with family and making time for friends is, I think, perhaps one of the most important things.
link |
01:41:13.080
And then the third is maybe a little more amorphous, but it is something that you're excited about for the future.
link |
01:41:19.080
And when I'm talking to a four year old, often I'll ask her what she's excited about for tomorrow or the week ahead.
link |
01:41:25.080
But I think for most people, it's really hard. I mean, the world is a heavy place.
link |
01:41:31.080
And I think like the way that we navigate it is that we have things that we're looking forward to.
link |
01:41:37.080
So whether it is building AR glasses for the future or being able to celebrate my 10 year wedding anniversary with my wife that's coming up.
link |
01:41:47.080
It's like, I think people, you know, you have things that you're looking forward to.
link |
01:41:51.080
Or for the girls, it's often I want to see mom in the morning, right?
link |
01:41:55.080
But it's like that's a really critical thing. And then the last thing is I ask them every day, what did you do today to help someone?
link |
01:42:04.080
Because I just think that that's a really critical thing. It's easy to kind of get caught up in yourself and stuff that's really far down the road.
link |
01:42:14.080
But did you do something just concrete today to help someone?
link |
01:42:18.080
And it can just be as simple as, okay, yeah, I helped set the table for lunch.
link |
01:42:23.080
Or this other kid in our school was having a hard time with something and I helped explain it to him.
link |
01:42:29.080
But that's sort of like, if you were to boil down my overall life philosophy into what I try to impart to my kids, those are the things that I think are really important.
link |
01:42:42.080
So, okay, let's say college, if you're graduating from college, probably more practical advice: it's almost all focused on people.
link |
01:42:53.080
And I think the most important decision you're probably going to make if you're in college is who you surround yourself with, because you become like the people you surround yourself with.
link |
01:43:04.080
And I sort of have this hiring heuristic at Meta, which is that I will only hire someone to work for me if I could see myself working for them.
link |
01:43:19.080
Not necessarily that I want them to run the company because I like my job, but in an alternate universe, if it was their company and I was looking to go work somewhere, would I be happy to work for them?
link |
01:43:29.080
And I think that that's a helpful heuristic to help balance.
link |
01:43:35.080
And when you're building something like this, there's a lot of pressure to, you want to build out your teams because there's a lot of stuff that you need to get done.
link |
01:43:41.080
And everyone always says, don't compromise on quality, but there's this question of, okay, how do you know that someone is good enough?
link |
01:43:46.080
And I think my answer is, I would want someone to be on my team if I would work for them.
link |
01:43:52.080
But I think it's actually a pretty similar answer to like, if you were choosing friends or a partner or something like that.
link |
01:44:01.080
So when you're kind of in college, trying to figure out what your circle is going to be, trying to figure out, you know, you're evaluating different job opportunities, who are the people, even if they're going to be peers in what you're doing,
link |
01:44:14.080
who are the people who, in an alternate universe, you would want to work for, because you think you're going to learn a lot from them, because they are values aligned on the things that you care about,
link |
01:44:26.080
and they're going to push you, but also they know different things and have different experiences that are kind of more of what you want to become like over time.
link |
01:44:33.080
So I don't know, I think probably people are too, in general, objective focused and maybe not focused enough on the connections and the people who they're basically building relationships with.
link |
01:44:46.080
I don't know what it says about me, but my place in Austin now has seven legged robots.
link |
01:44:53.080
So I'm surrounding myself by robots, which is probably something I should look into.
link |
01:44:58.080
What kind of world would you like to see your daughters grow up in, even after you're gone?
link |
01:45:09.080
Well, I think one of the promises of all the stuff that is getting built now is that it can be a world where more people can just live out their imagination.
link |
01:45:21.080
One of my favorite quotes, I think it's attributed to Picasso, is that all children are artists, and the challenge is how to remain one when you grow up.
link |
01:45:29.080
If you have kids, this is pretty clear, they just have wonderful imaginations.
link |
01:45:36.080
Part of what I think is going to be great about the creator economy and the metaverse and all this stuff is this notion that a lot more people in the future are going to get to work doing creative stuff
link |
01:45:48.080
than what I think today we would just consider traditional labor or service.
link |
01:45:53.080
I think that that's awesome.
link |
01:45:56.080
A lot of what people are here to do is collaborate together, work together, think of things that you want to build and go do it.
link |
01:46:06.080
One of the things that I always think is striking: I teach my daughters some basic coding with Scratch.
link |
01:46:13.080
They're still obviously really young, but I think of coding as building.
link |
01:46:18.080
When I'm coding, I'm building something that I want to exist, but my youngest daughter, she's very musical and pretty artistic and she thinks about coding as art.
link |
01:46:32.080
She calls it code art, not the code, but the output of what she is making.
link |
01:46:37.080
It's like she's just very interested in the visuals, in what she can output and how it can move around.
link |
01:46:44.080
Do we need to fix that? Are we good?
link |
01:46:46.080
What happened?
link |
01:46:47.080
Do we have to clap? Alexa?
link |
01:46:49.080
Yes, I was just talking about Augie and her code art.
link |
01:46:53.080
To me, this is a beautiful thing.
link |
01:46:56.080
The notion that for me, coding was this functional thing and I enjoyed it and it helped build something utilitarian,
link |
01:47:04.080
but that for the next generation of people, it will be even more an expression of their imagination and artistic sense for what they want to exist.
link |
01:47:16.080
If that happens, if we can help bring about this world where, for a lot more people,
link |
01:47:23.080
that's their existence going forward, being able to basically create and live out all these different kinds of art,
link |
01:47:33.080
I just think that that's a beautiful and wonderful thing and will be very freeing for humanity to spend more of our time on the things that matter to us.
link |
01:47:40.080
Yeah, allow more and more people to express their art in the full meaning of that word.
link |
01:47:45.080
That's a beautiful vision.
link |
01:47:47.080
We mentioned that you are mortal.
link |
01:47:50.080
Are you afraid of death?
link |
01:47:52.080
Do you think about your mortality?
link |
01:47:56.080
And are you afraid of it?
link |
01:48:01.080
You didn't sign up for this on a podcast.
link |
01:48:03.080
No, I mean, it's an interesting question.
link |
01:48:07.080
I'm definitely aware of it.
link |
01:48:10.080
I do a fair amount of extreme sport type stuff.
link |
01:48:18.080
So I'm definitely aware of it.
link |
01:48:22.080
And you're flirting with it a bit.
link |
01:48:24.080
I train hard.
link |
01:48:26.080
And so it's like, if I'm going to go out in a 15 foot wave, then it's like, all right, I'll make sure we have the right safety gear and make sure that I'm used to that spot and all that stuff.
link |
01:48:38.080
But the risk is still there.
link |
01:48:41.080
You take some head blows along the way.
link |
01:48:44.080
But definitely aware of it.
link |
01:48:47.080
Definitely would like to stay safe.
link |
01:48:50.080
I have a lot of stuff that I want to build.
link |
01:48:54.080
Does it freak you out that it's finite though?
link |
01:48:57.080
That there's a deadline when it's all over?
link |
01:49:01.080
And that there'll be a time when your daughters are around and you're gone?
link |
01:49:05.080
I don't know.
link |
01:49:06.080
That doesn't freak me out.
link |
01:49:07.080
I think constraints are helpful.
link |
01:49:16.080
Yeah.
link |
01:49:17.080
Yeah, the finiteness makes ice cream taste more delicious somehow.
link |
01:49:22.080
The fact that it's going to be over.
link |
01:49:23.080
There's something about that with the metaverse too.
link |
01:49:26.080
We talked about this identity earlier, like having just one with NFTs.
link |
01:49:30.080
There's something powerful about the constraint of finiteness or uniqueness that this moment is singular in history.
link |
01:49:40.080
But I mean, as you go through different waves of technology, I think a lot of what is interesting is what becomes in practice infinite, where there can be many, many of a thing.
link |
01:49:51.080
And then what ends up still being constrained.
link |
01:49:53.080
So the metaverse should hopefully allow a very large number or maybe in practice, hopefully close to an infinite amount of expression and worlds.
link |
01:50:08.080
But we'll still only have a finite amount of time.
link |
01:50:11.080
Yes.
link |
01:50:12.080
So I think living longer, I think, is good.
link |
01:50:21.080
And obviously all of our philanthropic work is not focused on longevity, but it is focused on trying to achieve what I think is a possible goal in this century, which is to be able to cure, prevent, or manage all diseases.
link |
01:50:33.080
So I certainly think people getting sick and dying is a bad thing, and I'm dedicating almost all of my capital towards advancing research in that area to push on that. I mean, we could do a whole other one of these podcasts about that.
link |
01:50:47.080
Exactly.
link |
01:50:48.080
Because that's a fascinating topic.
link |
01:50:50.080
I mean, this is with your wife, Priscilla Chan.
link |
01:50:52.080
You formed the Chan Zuckerberg Initiative, gave away, or pledged to give away, 99% of your Facebook, now Meta, shares.
link |
01:50:59.080
I mean, like you said, we could talk forever about all the exciting things you're working on there, including the sort of moonshot of eradicating disease by the mid century mark.
link |
01:51:13.080
I don't actually know if you're ever going to eradicate it, but I think you can get to a point where you can either cure things that happen, right?
link |
01:51:21.080
If people get diseases, you can cure them. Prevent is probably closest to eradication. Or just be able to manage them, as ongoing things that are not going to ruin your life.
link |
01:51:33.080
And I think that that's possible.
link |
01:51:34.080
I think saying that there's going to be no disease at all probably is not possible within the next several decades.
link |
01:51:41.080
The basic thing is increase the quality of life and maybe keep the finiteness because it makes everything taste more delicious.
link |
01:51:50.080
Maybe that's just being a romantic 20th century human.
link |
01:51:54.080
Maybe, but it was an intentional decision not to focus our philanthropy explicitly on longevity or living forever.
link |
01:52:03.080
Yes.
link |
01:52:06.080
If at the moment of your death, and by the way, I like that the lights went out when we started talking about death, you get to meet God.
link |
01:52:14.080
He does make it a lot more dramatic.
link |
01:52:15.080
He does.
link |
01:52:17.080
I should get closer to the mic.
link |
01:52:19.080
At the moment of your death, you get to meet God and you get to ask one question, what question would you like to ask?
link |
01:52:29.080
Or maybe a whole conversation.
link |
01:52:31.080
I don't know.
link |
01:52:32.080
It's up to you.
link |
01:52:33.080
It's more dramatic when it's just one question.
link |
01:52:37.080
Well, if it's only one question and I died, I would just want to know that Priscilla and my family, like if they were going to be okay.
link |
01:52:48.080
That might depend on the circumstances of my death, but I think that in most circumstances that I can think of, that's probably the main thing that I would care about.
link |
01:53:00.080
I think God would hear that question and be like, all right, fine, you get in.
link |
01:53:04.080
That's the right question to ask.
link |
01:53:06.080
Is it?
link |
01:53:07.080
I don't know.
link |
01:53:08.080
Humility and selflessness.
link |
01:53:09.080
All right.
link |
01:53:10.080
You're right.
link |
01:53:11.080
I mean, but maybe.
link |
01:53:14.080
They're going to be fine.
link |
01:53:15.080
Don't worry, you're in.
link |
01:53:16.080
But I mean, one of the things that I struggle with, at least, is that on the one hand, that's probably the thing that's closest to me, and maybe the most common human experience.
link |
01:53:29.080
But I don't know.
link |
01:53:31.080
One of the things that I just struggle with in terms of running this large enterprise is, should the thing that I care more about be that responsibility?
link |
01:53:44.080
And I think it's shifted over time.
link |
01:53:49.080
I mean, like before I really had a family that was like the only thing I cared about.
link |
01:53:54.080
And at this point, I mean, I care deeply about it, but yeah, I think that's not as obvious of a question.
link |
01:54:06.080
Yeah, we humans are weird.
link |
01:54:08.080
You get this ability to impact millions of lives, billions of lives, and it's definitely something you care about.
link |
01:54:17.080
But the weird humans that are closest to us, those are the ones that mean the most.
link |
01:54:24.080
And I suppose that's the dream of the metaverse is to connect, form small groups like that where you can have those intimate relationships.
link |
01:54:31.080
Let me ask you the big ridiculous.
link |
01:54:33.080
Well, to be able to be close, not just based on who you happen to be next to.
link |
01:54:39.080
I think that's what the internet is already doing, is allowing you to spend more of your time, not physically proximate.
link |
01:54:46.080
I mean, I always think when you think about the metaverse, people ask this question about the real world.
link |
01:54:52.080
It's like, the virtual world versus the real world.
link |
01:54:55.080
It's like, no, the real world is a combination of the virtual world and the physical world.
link |
01:55:00.080
But I think over time, as we get more technology, the physical world is becoming less of a percent of the real world.
link |
01:55:08.080
And I think that that opens up a lot of opportunities for people because, you know, you can work in different places.
link |
01:55:13.080
You can stay closer to people who are in different places.
link |
01:55:18.080
That's good.
link |
01:55:19.080
Removing barriers of geography and then barriers of language.
link |
01:55:23.080
That's a beautiful vision.
link |
01:55:25.080
Big ridiculous question. What do you think is the meaning of life?
link |
01:55:44.080
I think there are probably a couple of different ways that I would go at this.
link |
01:55:52.080
But I think it gets back to this last question that we talked about, about the duality between you have the people around you who you care the most about,
link |
01:56:00.080
and then there's like this bigger thing that maybe you're building.
link |
01:56:05.080
And I think that in my own life, I mean, I sort of think about this tension, but I started this whole company and my life's work is around human connection.
link |
01:56:15.080
So intellectually, probably the thing that I go to first is just that human connection is the meaning.
link |
01:56:29.080
And I mean, I think that it's a thing that our society probably systematically undervalues.
link |
01:56:37.080
And I just remember when I was growing up and in school, it's like, do your homework and then go play with your friends after.
link |
01:56:45.080
And it's like, no, what if playing with your friends is the point?
link |
01:56:50.080
Sounds like an argument your daughters make.
link |
01:56:53.080
Well, I mean, I don't know. I just think it's interesting.
link |
01:56:55.080
Homework doesn't even matter, man.
link |
01:56:57.080
It's interesting because it's, you know, and people, I think people tend to think about that stuff as wasting time or that's like what you do in the free time that you have.
link |
01:57:08.080
But like, what if that's actually the point?
link |
01:57:11.080
So that's one.
link |
01:57:13.080
But here's maybe a different way of coming at this, which is maybe more religious in nature.
link |
01:57:18.080
There's a rabbi who I've studied with who kind of gave me this.
link |
01:57:27.080
We were talking through Genesis, in the Bible, in the Torah, and basically walking through it.
link |
01:57:36.080
It's like, okay, you go through the seven days of creation and it's basically like, why does the Bible start there?
link |
01:57:47.080
It's like, it could have started anywhere in terms of like how to live.
link |
01:57:51.080
But basically it starts with talking about how God created people in his or her image.
link |
01:57:59.080
But the Bible starts by talking about how God created everything.
link |
01:58:04.080
So I actually think there's a compelling argument, which I've always found meaningful and inspiring, that a lot of the point of what religion has been telling us we should do is to create and build things.
link |
01:58:29.080
So these things are not necessarily at odds.
link |
01:58:32.080
I mean, I think probably to some degree you'd expect me to say something like this because I've dedicated my life to creating things that help people connect.
link |
01:58:40.080
So I mean, that's sort of the fusion of, I mean, getting back to what we talked about earlier.
link |
01:58:45.080
It's what I studied in school or psychology and computer science, right?
link |
01:58:48.080
So I mean, these are like the two themes that I care about.
link |
01:58:53.080
But I don't know, for me, that's kind of what I think about.
link |
01:58:56.080
That's what matters.
link |
01:58:57.080
To create and to love, which is the ultimate form of connection.
link |
01:59:04.080
I think this is one hell of an amazing replay experience in the metaverse.
link |
01:59:08.080
So whoever is using our avatars years from now, I hope you had fun.
link |
01:59:13.080
And thank you for talking today.
link |
01:59:15.080
Thank you.
link |
01:59:16.080
Thanks for listening to this conversation with Mark Zuckerberg.
link |
01:59:19.080
To support this podcast, please check out our sponsors in the description.
link |
01:59:23.080
And now, let me leave you with the end of the poem If, by Rudyard Kipling.
link |
01:59:30.080
If you can talk with crowds and keep your virtue, or walk with kings, nor lose the common touch.
link |
01:59:37.080
If neither foes nor loving friends can hurt you.
link |
01:59:41.080
If all men count with you, but none too much.
link |
01:59:47.080
If you can fill the unforgiving minute with sixty seconds' worth of distance run.
link |
01:59:52.080
Yours is the earth and everything that's in it.
link |
01:59:56.080
And which is more, you'll be a man, my son.
link |
02:00:01.080
Thank you for listening and hope to see you next time.