
Kevin Systrom: Instagram | Lex Fridman Podcast #243



link |
00:00:00.000
The following is a conversation with Kevin Systrom, cofounder and longtime CEO of Instagram,
link |
00:00:06.560
including for six years after Facebook's acquisition of Instagram. This is the Lex
link |
00:00:11.840
Fridman podcast. To support it, please check out our sponsors in the description. And now,
link |
00:00:17.600
here's my conversation with Kevin Systrom. At the risk of asking the Rolling Stones
link |
00:00:24.560
to play Satisfaction, let me ask you about the origin story of Instagram.
link |
00:00:28.240
Sure. So maybe some context, you, like we were talking about offline, grew up in Massachusetts,
link |
00:00:34.640
learned computer programming there, liked to play Doom II, worked at a vinyl record store,
link |
00:00:40.720
then you went to Stanford, turned down Mr. Mark Zuckerberg and Facebook, went to Florence to study
link |
00:00:48.240
photography. Those are just some random, beautiful, impossibly brief glimpses into a life. So let me
link |
00:00:55.040
ask again, can you take me through the origin story of Instagram? Give me that context.
link |
00:00:59.120
Yeah, you basically set it up. All right. So we have a fair amount of time. So I'll go into some
link |
00:01:05.360
detail. But basically what I'll say is Instagram started out of a company actually called Bourbon,
link |
00:01:13.360
and it was spelled B U R B N. And a couple of things were happening at the time. So if we zoom
link |
00:01:19.920
back to 2010, not a lot of people remember what was happening in the dot com world then. But
link |
00:01:27.360
check-in apps were all the rage. So, what's a check-in app? Gowalla, Foursquare, Hot Potato.
link |
00:01:34.240
So I'm at a place, I'm going to tell the world that I'm at this place. That's right. What's the
link |
00:01:39.040
idea behind this kind of app, by the way? You know what, I'm going to answer that, but through
link |
00:01:44.000
what Instagram became and why I believe Instagram replaced them. So the whole idea was to share
link |
00:01:49.200
with the world what you were doing specifically with your friends, right? But they were all
link |
00:01:54.240
the rage and Foursquare was getting all the press and I remember sitting around saying,
link |
00:01:57.600
hey, I want to build something, but I don't know what I want to build. What if I built a better
link |
00:02:02.320
version of Foursquare? And I asked myself, well, why don't I like Foursquare or how could it be
link |
00:02:08.480
improved? And basically I sat down and I said, I think that if you have a few extra features,
link |
00:02:16.400
it might be enough. One of which happened to be posting a photo of where you were.
link |
00:02:20.160
There were some others. It turns out that wasn't enough. My cofounder joined. We were going to
link |
00:02:24.960
attack Foursquare and the likes and try to build something interesting. And no one used it. No
link |
00:02:31.200
one cared because it wasn't enough. It wasn't different enough, right? So one day we were
link |
00:02:36.400
sitting down and we asked ourselves, okay, come-to-Jesus moment: are we going to do this
link |
00:02:41.520
startup? And if we're going to, we can't do what we're currently doing. We have to switch it up.
link |
00:02:46.880
So what do people love the most? So we sat down and we wrote out three things that we thought
link |
00:02:52.160
people uniquely loved about our products that weren't in other products. Photos happen to be the
link |
00:02:58.080
top one. So sharing a photo of what you were doing, where you were at the moment was not something
link |
00:03:04.640
products let you do really. Facebook was like, post an album of your vacation from two weeks ago,
link |
00:03:10.640
right? Twitter allowed you to post a photo but their feed was primarily text and they didn't
link |
00:03:16.000
show the photo in line. Or at least I don't think they did at the time. So even though it seems
link |
00:03:21.520
totally stupid and obvious to us now, at the moment then, posting a photo of what you were
link |
00:03:28.240
doing at the moment was like not a thing. So we decided to go after that because we'd noticed that
link |
00:03:35.120
people who used our service, the one thing they happened to like the most was posting a photo.
link |
00:03:39.440
So that was the beginning of Instagram. And yes, we went through and we added filters and
link |
00:03:44.400
there's a bunch of stories around that. But the origin of this was that we were trying to be a
link |
00:03:48.800
check-in app, realized that no one wanted another check-in app. It became a photo sharing app but
link |
00:03:54.080
one that was much more about what you're doing and where you are. And that's why when I say I think
link |
00:03:58.720
we've replaced check-in apps, it became a check-in via a photo rather than saying your location
link |
00:04:06.400
and then optionally adding a photo. When you were thinking about what people like,
link |
00:04:11.440
from where did you get a sense that this is what people like? You said we sat down, we wrote some
link |
00:04:16.560
stuff down on paper. Where is that intuition that seems fundamental to the success of
link |
00:04:24.800
an app like Instagram? Where does that idea, where does that list of three things come from?
link |
00:04:30.240
Exactly. Only after having studied machine learning now for a couple of years
have I understood it myself. I've started to make connections. We can go into this later.
link |
00:04:43.200
But obviously, the connections between machine learning and the human brain I think are
link |
00:04:52.000
stretched sometimes. At the same time, being able to back prop and being able to look at the world,
link |
00:04:59.520
try something, figure out how you're wrong, how wrong you are, and then nudge your company in
link |
00:05:06.080
the right direction based on how wrong you are. It's like a fascinating concept. We didn't know
link |
00:05:13.440
we were doing it at the time, but that's basically what we were doing. We put it out to, call it, 100
people, and you would look at their data. You would say, what are they sharing? What resonates,
link |
00:05:26.000
what doesn't resonate? We think they're going to resonate with X, but it turns out they resonate
link |
00:05:29.840
with Y. Okay, shift the company towards Y. It turns out if you do that enough quickly enough,
link |
00:05:36.240
you can get to a solution that has product market fit. Most companies fail because they sit there
link |
00:05:42.560
and either their learning rate is too slow, they sit there and they're adamant that they're right
link |
00:05:47.840
even though the data is telling them they're not right, or their learning rate is too high and
they wildly chase different ideas and never actually sit with one long enough to find a groove.
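A toy Python sketch of the analogy, with made-up numbers: treat the company's direction as a single parameter nudged each cycle by user feedback, and watch how the step size (the learning rate) decides whether you ever settle near the sweet spot.

    # Toy gradient-descent illustration of the learning-rate analogy above.
    # The "sweet spot" of 3.0 and all numbers here are invented.
    def loss(x):
        return (x - 3.0) ** 2        # how wrong the current direction is

    def grad(x):
        return 2 * (x - 3.0)         # which way the feedback says to move

    def iterate(lr, steps=20, x=0.0):
        for _ in range(steps):
            x -= lr * grad(x)        # nudge the product based on the data
        return x

    print(iterate(lr=0.01))  # too slow: after 20 cycles, still far from 3.0
    print(iterate(lr=0.4))   # converges close to the sweet spot
    print(iterate(lr=1.1))   # too high: overshoots wildly and never settles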
link |
00:05:59.360
And I think when we sat down and we wrote out those three ideas, what we were saying is,
link |
00:06:03.280
what are the three possible, whether they're local or global, maxima in our world, that
link |
00:06:11.120
users are telling us they like because they're using the product that way? It was clear people
link |
00:06:15.680
like the photos because that was the thing they were doing. And we just said, okay, what if we
link |
00:06:20.560
just cut out most of the other stuff and focus on that thing? And then it happened to be a
link |
00:06:26.080
multi billion dollar business. And it's that easy, by the way. Yeah, I guess so. Well, nobody ever
link |
00:06:32.880
writes about neural networks that miserably failed. So this particular neural network succeeded.
link |
00:06:38.320
Oh, they fail all the time, right? Yeah. But nobody writes about... The default state is failing.
link |
00:06:42.880
Yes. When you said the way people are using the app, is that the loss function for this
link |
00:06:50.160
neural network or is it also self report? Do you ever ask people what they like or do you have to
link |
00:06:56.640
track exactly what they're doing, not what they're saying? I once made a Thanksgiving dinner. Okay.
link |
00:07:03.040
And it was for relatives and I like to cook a lot. Okay. And I worked really hard on picking the
link |
00:07:10.960
specific dishes. And I was really proud because I had planned it out using a Gantt chart and
link |
00:07:18.320
like it was ready on time and everything was hot. Nice. Like, I don't know if you're a big
link |
00:07:22.320
Thanksgiving guy, but like the worst thing about Thanksgiving is when the turkey is cold and some
link |
00:07:27.280
things are hot and something... Anyway, you got a Gantt chart. Did you actually have a chart?
link |
00:07:31.120
Oh, yeah. Yeah. OmniPlan. A fairly expensive Gantt chart thing that I think maybe 10 people
link |
00:07:37.120
have purchased in the world. But I'm one of them and I use it for recipe planning only around
link |
00:07:42.640
big holidays. That's brilliant, by the way. Do people do this kind of... Over engineering?
link |
00:07:48.800
It's not over. It's just engineering. It's planning. Thanksgiving is a complicated
link |
00:07:53.840
set of events with some uncertainty with a lot of things going on. You should be able...
link |
00:07:58.240
You should be planning in this way. There should be a chart. It's not over.
link |
00:08:01.440
I mean, so what's funny is brief aside. Yes. Brilliant. I love cooking. I love food. I love
link |
00:08:08.000
coffee. And I've spent some time with some chefs who know their stuff. And they always just take
link |
00:08:14.160
out a piece of paper and just work backwards in rough order. Like, it's never perfect, but rough
link |
00:08:19.600
order. It's just like, oh, that makes sense. Why not just work backwards from the end goal, right?
link |
00:08:24.640
And put in some buffer time. And so I probably over specified it a bit using a Gantt chart. But
link |
00:08:30.080
the fact that you can do it, it's what professional kitchens roughly do. They just don't call it a
link |
00:08:36.080
Gantt chart. Or at least I don't think they do. Anyway, I was telling a story about Thanksgiving.
link |
00:08:40.240
So here's the thing. I'm sitting down. We have the meal. And then I got to know Ray Dalio
link |
00:08:48.320
fairly well over maybe the last year of Instagram. And one thing that he kept saying was like,
link |
00:08:55.040
feedback is really hard to get honestly from people. And I sat down at... After dinner, I said,
link |
00:09:02.320
guys, I want feedback. What was good and what was bad? Yes. And what's funny is like, literally,
link |
00:09:08.320
everyone just said everything was great. And I personally knew I had screwed up a handful of
link |
00:09:13.920
things. But no one would say it. And can you imagine now, not something as high stakes as
link |
00:09:20.720
Thanksgiving dinner, okay? Thanksgiving dinner, it's not that high stakes. But you're trying to
link |
00:09:25.600
build a product and everyone knows you left your job for it, and you're trying to build it out,
link |
00:09:29.360
and you're trying to make something wonderful. And it's yours, right? You designed it.
link |
00:09:35.440
Now try asking for feedback. And know that you're giving this to your friends and your family.
link |
00:09:42.160
People have trouble giving hard feedback. People have trouble saying, I don't like this,
link |
00:09:48.640
or this isn't great, or this is how it's failed me. In fact, you usually have two classes of people,
link |
00:09:56.480
people who just won't say bad things. You can literally say to them, please tell me what you
link |
00:10:01.360
hate most about this, and they won't do it. They'll try, but they won't. And then the other class
link |
00:10:06.320
of people are just negative period about everything. And it's hard to parse out what is true and what
link |
00:10:13.360
isn't. So my rule of thumb with this is you should always ask people. But at the end of the day,
link |
00:10:21.040
it's amazing what data will tell you. And that's why with whatever project I work on,
link |
00:10:25.760
even now, collecting data from the beginning on usage patterns, so engagement, how many days of
link |
00:10:32.240
the week, do they use it? How many, I don't know, if we were to go back to Instagram, how many
link |
00:10:36.880
impressions per day, right? Is that growing? Is that shrinking? And don't be like overly
link |
00:10:43.200
scientific about it, right? Because maybe you have 50 beta users or something.
link |
00:10:47.280
But what's fascinating is that data doesn't lie. People are very defensive about their time.
link |
00:10:57.360
They'll say, oh, I'm so busy. I'm sorry. I didn't get to use the app. But I don't know.
link |
00:11:03.840
You were posting on Instagram the whole time. So I don't know. At the end of the day, at Facebook,
link |
00:11:10.080
there was, before time spent became kind of a loaded term there, this idea that people's currency
link |
00:11:19.120
in their lives is time. And they only have a certain amount of time to give things, whether it's
link |
00:11:23.760
friends or family or apps or TV shows or whatever. There's no way of inventing more of it, at least
link |
00:11:29.200
not that I know of. If they don't use it, it's because it's not great. So the moral of the story
link |
00:11:38.160
is you can ask all you want, but you just have to look at the data. And data doesn't lie, right?
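As a small, hypothetical illustration of that kind of usage data: from a plain log of who used the app on which day, you can already read off days-used-per-week and impressions per active day without asking anyone anything.

    from collections import defaultdict

    # Hypothetical usage log: (user, day_of_week, impressions that day)
    events = [
        ("alice", "mon", 12), ("alice", "tue", 8), ("alice", "sat", 30),
        ("bob",   "mon", 2),
    ]

    days_active = defaultdict(set)
    impressions = defaultdict(int)
    for user, day, views in events:
        days_active[user].add(day)
        impressions[user] += views

    for user in days_active:
        # days used this week, and average impressions per active day
        print(user, len(days_active[user]),
              impressions[user] / len(days_active[user]))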
link |
00:11:45.280
I mean, there are metrics. Data can obscure the key insight if you're not careful. So
time spent in the app, that's one; there are so many metrics you can apply to this. And they will
link |
00:12:00.160
give you totally different insights, especially when you're trying to create something that doesn't
link |
00:12:05.280
obviously exist yet. So, you know, measuring maybe why you left the app or measuring special
link |
00:12:15.760
moments of happiness that will make sure you return to the app or moments of happiness that
link |
00:12:21.440
are long lasting versus like dopamine short term, all of those things. But I think I suppose in the
link |
00:12:27.760
beginning, you can just get away with just asking the question, which features are used a lot? Let's
link |
00:12:35.520
do more of that. And how hard was the decision? And I mean, maybe you can tell me what Instagram
link |
00:12:43.040
looked like in the beginning, but how hard was it to make pictures the first-class citizen?
link |
00:12:48.320
That's a revolutionary idea. Like, at whatever point Instagram became this feed of photos.
link |
00:12:56.800
That's quite brilliant. Plus, I also don't know when this happened, but they're all shaped the same.
link |
00:13:05.520
It's like, I have to tell you why that's the interesting part.
link |
00:13:10.400
Why is that? So a couple of things. One is data. Like, you're right, you can overinterpret
link |
00:13:17.600
data. Like imagine trying to fly a plane by staring at, I don't know, a single metric like
link |
00:13:24.320
airspeed. You don't know if you're going up or down. I mean, it correlates with up or down,
link |
00:13:28.400
but you don't actually know. It will never help you land the plane. So don't stare at one metric.
link |
00:13:33.760
Like it turns out you have to synthesize a bunch of metrics to know where to go.
link |
00:13:38.640
But it doesn't lie. Like if your speed is zero, unless it's not working, right? If it's zero,
link |
00:13:44.720
you're probably going to fall out of the sky. So generally, you look around and you have the scan
link |
00:13:49.600
going. Yes. And you're just asking yourself, is this working or is this not working? But people
link |
00:13:57.040
have trouble explaining how they actually feel. So it's about synthesizing both of them. So
link |
00:14:05.600
then Instagram, right? We were talking about revolutionary moment where the feed became
link |
00:14:11.760
square photos basically. And photos first and then square photos. Yeah. It was clear to me
link |
00:14:20.320
that the biggest, so I believe the biggest companies are founded when enormous technical
link |
00:14:29.440
shifts happen. And the biggest technical shift that happened right before Instagram was founded
link |
00:14:34.720
was the advent of a phone that didn't suck, the iPhone, right? Like in retrospect, we're like,
link |
00:14:40.000
oh my God, the first iPhone that almost had, like it wasn't that good. But compared to everything
link |
00:14:45.920
else at the time, it was amazing. And by the way, the first phone that had an incredible camera
link |
00:14:54.560
that could do as well as the point and shoot you might carry around was the iPhone 4. And that
link |
00:15:01.120
was right when Instagram launched. And we looked around and we said, what will change because
link |
00:15:06.080
everyone has a camera in their pocket? And it was so clear to me that the world of social networks
link |
00:15:13.120
before it was based in the desktop and sitting there and having a link you could share, right?
link |
00:15:20.560
And that wasn't going to be the case. So the question is, what would you share if you were
link |
00:15:23.680
out and about in the world? If not only did you have a camera that fit in your pocket, but by the
link |
00:15:29.760
way, that camera had a network attached to it that allowed you to share instantly. That seemed
link |
00:15:35.200
revolutionary. And a bunch of people saw it at the same time. It wasn't just Instagram. There were
link |
00:15:38.880
a bunch of competitors. The thing we did, I think, was not only, well, we focused on two things. So
link |
00:15:45.280
we wrote down those things. We circled photos and we said, I think we should invest in this.
link |
00:15:49.840
But then we said, what sucks about photos? One, they look like crap, right? They just, at least
link |
00:15:56.480
back then. Now, my phone takes pretty great photos, right? Back then, they were blurry,
link |
00:16:03.120
not so great, compressed, right? Two, it was really slow, like really slow to upload a photo.
link |
00:16:12.000
And I'll tell a fun story about that and explain to you why they're all the same size and square as
link |
00:16:16.400
well. And three, man, if you wanted to share a photo on different networks, you had to go to
link |
00:16:23.680
each of the individual apps and select all of them and upload individually. And so we were like,
link |
00:16:29.040
all right, those are the pain points. We're going to focus on that. So one, instead of, because
link |
00:16:33.920
they weren't beautiful, we were like, why don't we lean into the fact that they're not beautiful?
link |
00:16:38.240
And I remember studying in Florence, my photography teacher gave me this Holga camera. And I'm not
link |
00:16:43.200
sure everyone knows what a Holga camera is, but they're these old school plastic cameras.
link |
00:16:48.240
I think they were produced in China at the time. And I want to say the original ones,
like from the 70s or the 80s or something, were supposed to be like $3 cameras for the everyday person.
link |
00:16:59.760
They took nice medium format films, large, large negatives, but they kind of blurred the light
link |
00:17:07.680
and they kind of like light leaked into the side. And there was this whole resurgence where people
link |
00:17:12.800
looked at that and said, oh my God, this is a style, right? And I remember using that in Florence
link |
00:17:18.400
and just saying, well, why don't we just like lean into the fact that these photos suck and make them
link |
00:17:22.640
suck more, but in an artistic way. And it turns out that had product market fit. People really
link |
00:17:28.720
like that. They were willing to share their not so great photos if they looked not so great on
link |
00:17:34.000
purpose. The second part. That's where the filters come into the picture. So computational
link |
00:17:41.120
modification of photos to make them look extra crappy to where it becomes art. Yeah. And I mean,
link |
00:17:48.080
add light leaks, add like an overlay filter and make them more contrasty than they should be.
link |
00:17:52.960
The first filter we ever produced was called X-Pro II. And I designed it while I was in this
small little bed and breakfast room in Todos Santos, Mexico. I was trying to take a break from the
Burbn days. And I remember saying to my cofounder, I just need like a week to reset.
link |
00:18:10.320
And on that trip I worked on the first filter, because I said, you know, I think I can,
I think I can do this. And I literally iterated one by one over the RGB values in the array that
was the photo and just slightly shifted them. Basically, there was a function of R, a function of G,
a function of B that shifted each slightly. Not rocket science. And it turns out that
actually made your photo look pretty cool. It just mapped from one color space to another color
space. It was simple, but it was really slow. I mean, if you applied a filter, I think it used
to take two or three seconds to render. Only eventually did I figure out how to do it on the
GPU. And I'm not even sure it was the GPU, but it was using OpenGL. But anyway, I would eventually
figure that out. And then it would be instant, but it used to be really slow.
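A minimal sketch of a CPU-side per-channel remap like that, in Python with invented curves (the real X-Pro II mapping isn't public). Done pixel by pixel in a plain loop this is exactly the kind of thing that takes seconds, which is why vectorizing it, or moving it to the GPU via OpenGL, makes it feel instant.

    import numpy as np

    def xpro_like(img):
        # img is a 512x512x3 uint8 array; the curves below are made up.
        x = img.astype(np.float32) / 255.0
        r, g, b = x[..., 0], x[..., 1], x[..., 2]
        r = np.clip(r * 1.15 + 0.02, 0.0, 1.0)   # f(R): warm it up
        g = np.clip(g * 1.05, 0.0, 1.0)          # f(G): slight boost
        b = np.clip(b * 0.90 + 0.05, 0.0, 1.0)   # f(B): pull down, then lift
        return (np.stack([r, g, b], axis=-1) * 255).astype(np.uint8)

    photo = np.zeros((512, 512, 3), dtype=np.uint8)  # stand-in photo
    filtered = xpro_like(photo)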
link |
00:19:03.040
By the way, anyone who's watching or listening, it's amazing what you can get away with in a startup,
link |
00:19:09.280
as long as the product outcome is right for the user, like you can be slow, you can be terrible,
link |
00:19:15.760
you can be as long as you have product market fit, people will put up with a lot. And then the
link |
00:19:22.080
question is just about compressing, making it more performant over time so that they get that
link |
00:19:27.760
product market fit instantly. So fascinating, because there are some things where those three
seconds would make or break the app. But for some things, you're saying, not. It's hard to
link |
00:19:39.920
know when. You know, it's the problem Spotify solved, making streaming, like, work.
link |
00:19:48.400
And like delays in listening to music is a huge negative, even like slight delays.
link |
00:19:55.520
But here you're saying, I mean, how do you know when those three seconds are okay? Or you just
link |
00:20:00.800
kind of have to try it out? Because to me, my intuition would be those three seconds would
link |
00:20:08.960
kill the app. Like I would try to do the OpenGL thing. Right. So I wish I were that smart at the
link |
00:20:15.520
time. I wasn't, I just knew how to do what I knew how to do, right? And I decided, okay, like, why
link |
00:20:23.760
don't I just iterate over the values and change them? And what's interesting is that compared
link |
00:20:32.080
to the alternatives, no one else used OpenGL. Right. So everyone else was doing it the dumb
link |
00:20:36.800
way. And in fact, they were doing it at a high resolution. Now comes in the small resolution
link |
00:20:42.320
that we'll talk about for a second. By choosing 512 pixels by 512 pixels, which I believe it was
link |
00:20:49.520
at the time, we iterated over a lot fewer pixels than our competitors who were trying to do these
link |
00:20:55.280
enormous output like images. So instead of taking 20 seconds, I mean, three seconds feels pretty good,
link |
00:21:02.320
right? So on a relative basis, we were winning like a lot. Okay. So that's answer number one.
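The rough arithmetic behind that, assuming the iPhone 4's 5-megapixel camera for comparison:

    small = 512 * 512       # 262,144 pixels to filter
    full = 2592 * 1936      # ~5.0 million pixels at full resolution
    print(full / small)     # ~19x fewer pixels to touch per photo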
link |
00:21:08.320
Answer number two is we actually focused on latency in the right places. So we did this
link |
00:21:13.600
really wonderful thing when you uploaded. So the way it would work is, you know, you'd take your phone,
link |
00:21:20.720
you'd take the photo, and then you'd go to the, you'd go to the edit screen, where you would
link |
00:21:26.720
caption it. And on that caption screen, you'd start typing, you'd think, okay, like, what's a
link |
00:21:32.320
clever caption? And I said to Mike, hey, when I worked on the Gmail team, you know what they did?
link |
00:21:37.280
When you typed in your username or your email address, even before you've entered in your
link |
00:21:42.080
password, like the chance, the probability, once you enter in your username, that you're going to
actually sign in is extremely high. So they would just start loading your account in the background,
link |
00:21:53.120
not like sending it down to the desktop, that would be a security issue. But like,
link |
00:21:59.760
load it into memory on the server, like get it ready, prepare it. I always thought that was so
link |
00:22:04.880
fascinating and unintuitive. And I was like, Mike, why don't we just do that? But like,
link |
00:22:09.120
we'll just upload the photo and like assume you're going to upload the photo. And if you don't,
link |
00:22:15.200
forget about it, we'll delete it, right? So what ended up happening was people would caption their
link |
00:22:21.280
photo, they'd press done or upload. And you'd see this little progress bar just go,
link |
00:22:27.760
it was lightning fast. Okay, we were no faster than anyone else at the time. But by choosing 512
link |
00:22:34.800
by 512 and doing it in the background, it almost guaranteed that it was done by the time you
link |
00:22:40.160
captioned. And everyone, when they used it, was like, how the hell is this thing so fast? But we
were slow, we just hid the slowness. These things are just, like, a shell game,
you're just hiding the latency. That mattered to people a lot.
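A sketch of that shell game in Python, with invented function names rather than Instagram's actual client code: start the upload the moment the caption screen appears, then either attach the caption when the user hits share or throw the pending upload away.

    import threading, time

    def upload_bytes(photo_bytes):
        # Hypothetical stand-in for the slow 512x512 upload to the server.
        time.sleep(1.0)
        return "pending-media-123"

    class CaptionScreen:
        def __init__(self, photo_bytes):
            self.pending = None
            # Kick off the upload in the background while the user thinks
            # of a clever caption.
            self._t = threading.Thread(
                target=lambda: setattr(self, "pending", upload_bytes(photo_bytes)))
            self._t.start()

        def share(self, caption):
            self._t.join()  # by the time a caption is typed, usually done
            print("publish", self.pending, "with caption:", caption)

        def cancel(self):
            self._t.join()
            print("discard", self.pending)

    screen = CaptionScreen(b"...jpeg bytes...")
    time.sleep(1.2)         # the user types a caption in the meantime
    screen.share("sunset")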
link |
00:23:00.080
And I think you were willing to put up with a slow filter if it meant you could share it immediately. And of
link |
00:23:04.800
course, we added sharing options which let you distribute it really quickly. That was
link |
00:23:08.400
the third part. So latency matters, but relative to what? And then there are some tricks
you can use to just hide the latency. Like, I don't know if Spotify starts downloading the
link |
00:23:23.040
next song eagerly, I'm assuming they do. There are a bunch of ideas here that are not rocket science
link |
00:23:29.200
that really help. And all of that was stuff you were explicitly having a discussion about,
link |
00:23:36.080
like those designs and argument, you were having like arguments, discussions.
link |
00:23:41.360
I'm not sure it was arguments. I mean, I'm not sure if you've met my cofounder Mike,
link |
00:23:45.280
but he's a pretty nice guy. He's very reasonable. And we both just saw eye to eye and we're like,
link |
00:23:51.120
yeah, just like, make this fast, early seem fast. It'll be great. I mean, honestly,
link |
00:23:57.920
I think the most contentious thing, and he would say this too, initially, was I was on an iPhone 3G,
link |
00:24:05.680
so like the not so fast one, and he had a brand new iPhone 4. I was cheap. Nice.
link |
00:24:12.400
And his feed loaded super smoothly, like when he would scroll from photo to photo,
link |
00:24:18.640
buttery smooth, right? But on my phone, every time you got to a new photo, it was like,
link |
00:24:23.040
kuchunk, kuchunk, allocate memory, like all this stuff, right? I was like, Mike, that's
link |
00:24:28.080
unacceptable. And he's like, oh, come on, man, just like upgrade your phone. Basically, he
didn't actually say that, he's nicer than that. But I could tell he wished I would just stop
link |
00:24:37.360
being cheap and just get a new phone. But what's funny is we actually sat there working on that
link |
00:24:41.840
little detail for a few days before launch. And that polished experience plus the fact that
link |
00:24:49.440
uploading seemed fast for all these people who didn't have nice phones, I think meant a lot.
link |
00:24:55.760
Because far too often you see teams focus not on performance, they focus on what's the cool
link |
00:25:02.720
computer science problem they can solve, right? Can we scale this thing to a billion users? And
link |
00:25:08.400
they've got like 100, right? Yeah. You talked about loss function. So I want to come back to that.
link |
00:25:15.040
But like, the loss function is like, do you provide a great, happy, magical, whatever experience for
link |
00:25:20.960
the consumer? And listen, if it happens to involve something complex and technical, then great.
link |
00:25:26.960
But it turns out, I think most of the time, those experiences are just sitting there waiting to be
link |
00:25:33.760
built with like, not that complex solutions. But everyone is just like so stuck in their
link |
00:25:39.680
own head that they have to over engineer everything. And then they forget about the easy stuff.
link |
00:25:43.520
I mean, also, maybe to flip the loss function there is you're trying to minimize the number of
link |
00:25:49.920
times there's an unpleasant experience, right? Like the one you mentioned, where when you go to the
link |
00:25:56.160
next photo, it freezes for a little bit. So it's almost as opposed to maximizing pleasure,
link |
00:26:00.880
it's probably easier to minimize the number of, like, friction points. Yeah. And as we all know,
link |
00:26:08.880
you just make the pleasure negative and then minimize everything. So
link |
00:26:12.000
like we're mapping this all back to neural networks. But actually, can I say one thing on
link |
00:26:16.240
that? Which is, I don't know a lot about machine learning, but I feel like I've tried studying a
link |
00:26:22.320
bunch. That whole idea of reinforcement learning and planning out more than the greedy single
link |
00:26:29.760
experience, I think is the closest you can get to like ideal product design thinking,
link |
00:26:37.440
where you're not saying, hey, like, can we have a great experience just this one time? But like,
link |
00:26:42.720
what is the right way to onboard someone? What series of experiences correlate most with them
link |
00:26:48.560
hanging on long term, right? So not just saying, oh, did the photo load slowly a couple times or
link |
00:26:55.440
did they get a great photo at the top of their feed? But like, what are the things that are going
link |
00:27:00.240
to make this person come back over the next week, over the next month? And as a product designer,
link |
00:27:06.240
asking yourself, okay, I want to optimize, not just minimize bad experiences in the short run,
link |
00:27:10.960
but like, how do I get someone to engage over the next month? And I'm not going to claim that I
link |
00:27:19.040
thought that way at all at the time, because I certainly didn't. But if I were going back and
link |
00:27:23.040
giving myself any advice, it would be thinking, what are those, what are those second order effects
link |
00:27:27.600
that you can create? And it turns out having your friends on the service is an enormous win. So
link |
00:27:34.080
starting with a very small group of people that produce content that you wanted to see,
link |
00:27:38.560
which we did, we seeded the community very well, I think, ended up mattering. And so.
link |
00:27:44.960
Yeah, you said that community is one of the most important things. So it's from a metrics perspective,
link |
00:27:50.880
from maybe a philosophy perspective, building a certain kind of community within the app. See,
link |
00:27:56.880
I wasn't sure what exactly you meant by that when I've heard you say that. Maybe you can elaborate,
link |
00:28:01.760
maybe you can elaborate. But as I understand now, it can literally mean get your friends onto the
link |
00:28:08.800
app. Yeah, think of it this way. You can build an amazing restaurant or bar or whatever, right?
link |
00:28:17.760
But if you show up and you're the only one there, is it like, does it matter how good the food is?
link |
00:28:23.840
The drinks, whatever. No, these are inherently social experiences that we were working on.
link |
00:28:29.760
So the idea of having people there, like you needed to have that, otherwise,
link |
00:28:37.520
it was just a filter app. But by the way, part of the genius, I'm going to say genius, even though
link |
00:28:41.760
it wasn't really genius, was starting out masquerading as a filter app. That was awesome. The fact that you
could. So we talk about single player mode a lot, which is like, can you play the game alone?
link |
00:28:54.480
And Instagram, you could totally play alone. You could filter your photos. And a lot of people would
link |
00:28:58.800
tell me, I didn't even realize that this thing was a social network until my friends showed up.
link |
00:29:04.400
It totally worked as a single player game. And then when your friends showed up, all of a sudden,
link |
00:29:09.840
it was like, Oh, not only was this great alone, but now I actually have this trove of photos that
link |
00:29:15.440
people can look at and start liking. And then I can like theirs. And so it was this bootstrap method
link |
00:29:21.680
of how do you make the thing not suck when the restaurant is empty? Yeah, but the thing is,
link |
00:29:26.640
when you say friends, we were not necessarily referring to friends in the physical space. So
link |
00:29:31.600
you're not bringing your physical friends with you. You're also making new friends. So you're
link |
00:29:35.760
finding new community. So it's not immediately obvious to me that it's like, it's almost like
link |
00:29:41.440
building any kind of community. It was, it was both. And what we learned very early on was what
link |
00:29:47.680
made Instagram special. And the reason why you would sign up for it versus say, just sit on
link |
00:29:51.840
Facebook and look at your friends photos. Of course, we were live. And of course, it was
link |
00:29:56.800
interesting to see what your friends were doing now. But the fact that you connect with people who
link |
00:30:01.280
like took really beautiful photos in a certain style all around the world, whether they were
link |
00:30:05.920
travelers. It was the beginning, or the very beginning, of the influencer economy, these people who became
link |
00:30:11.600
professional Instagrammers way back when, right? But they took these amazing photos and some of
link |
00:30:18.480
them were photographers, right? Like professionally. And all of a sudden you had this moment in the
link |
00:30:24.880
day when you could open up this app and sure you could see what your friends were doing, but also
link |
00:30:28.560
it was like, Oh my God, that's a beautiful, beautiful waterfall or Oh my God, I didn't realize
link |
00:30:32.880
there was that corner of England or like really cool stuff. And the beauty about Instagram early
link |
00:30:39.280
on was that it was international by default. You didn't have to speak English to use it, right?
link |
00:30:46.480
Just look at the photos. Works great. We did translate. We had some pretty bad translations,
link |
00:30:52.160
but we did translate the app. And, you know, even if our translations were pretty poor, the idea
link |
00:30:58.400
that you could just connect with other people through their images was pretty powerful.
link |
00:31:03.120
How much technical difficulty was there with the programming? Like, what programming language
are we talking about? Zero. Like maybe it was hard for us, but I mean, we,
link |
00:31:15.200
there was nothing. The only thing that was complex about Instagram at the beginning
link |
00:31:20.480
technically was making it scale. And we were just plain old Objective-C for the client.
link |
00:31:27.760
So it was iPhone only at first?
link |
00:31:29.680
iPhone only, yep.
link |
00:31:30.800
As an Android person, I'm deeply offended, but go ahead.
link |
00:31:33.520
Come on. This was 2010. Oh, sure, sure. Sorry. Sorry. Android's gotten a lot better.
link |
00:31:38.960
Yeah, yeah. I take it back, you're right.
link |
00:31:41.680
If I were to do something today, I think it would be very different in terms of
link |
00:31:45.280
launch strategy, right? Android's enormous, too. But anyway, back to that moment,
link |
00:31:51.040
it was Objective-C. And then we were Python based, which is just like, this was before Python was
link |
00:31:59.360
really cool. Like now it's cool because it's all these machine learning libraries like support
link |
00:32:03.600
Python and, right? Now it's super, now it's like cool to be Python. Back then it was like, oh,
link |
00:32:09.200
Google uses Python. Like maybe you should use Python. Facebook was PHP. Like I had worked at
link |
00:32:15.920
a small startup of some ex-Googlers that used Python. So we used it, and we used a framework
called Django. It still exists and people use it, basically for the back end. And then
link |
00:32:27.520
We threw a couple of interesting things in there. I mean, we used Postgres, which was kind of fun.
It was a little bit like a hipster database at the time, right?
link |
00:32:34.240
MySQL. MySQL. Like everyone used MySQL. So like using Postgres was like an interesting decision,
link |
00:32:39.840
right? But we used it because it had a bunch of geo features built in, because we thought we were
going to be a check-in app, remember?
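For flavor, a hypothetical Django model of roughly that shape, using GeoDjango's Postgres-backed point type; the model and field names are invented, not Instagram's actual schema.

    # Hypothetical sketch, not Instagram's real models.
    from django.contrib.gis.db import models

    class Photo(models.Model):
        user = models.ForeignKey("auth.User", on_delete=models.CASCADE)
        image = models.ImageField(upload_to="photos/")
        caption = models.CharField(max_length=300, blank=True)
        # The geo part Postgres made easy: where the check-in happened.
        location = models.PointField(geography=True, null=True, blank=True)
        created_at = models.DateTimeField(auto_now_add=True)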
link |
00:32:50.160
It's also super cool now. So you were into Python before it was cool and you were into Postgres before it was cool. Yeah, we were basically like not only
link |
00:32:54.960
hipster, hipster photo company, hipster tech company, right? We also adopted Redis early
link |
00:33:02.560
and like loved it. I mean, it solved so many problems for us. And turns out that's still
link |
00:33:08.080
pretty cool. But the programming was very easy. It was like, sign up a user, have a feed. There
link |
00:33:13.920
was nothing, no machine learning at all, zero. Can you give some context? How many users at each
link |
00:33:19.760
of these stages? We're talking about 100 users, a thousand users. So the stage I just described,
link |
00:33:25.520
I mean, that technical stack lasted through probably 50 million users. I mean, seriously,
link |
00:33:33.440
like you can get away with a lot with a pretty basic stack. Like I think a lot of startups try to
link |
00:33:40.160
over engineer their solutions from the beginning to like really scale and you can get away with a
link |
00:33:44.720
lot. That being said, most of the first two years of Instagram was literally just trying to make that
link |
00:33:50.000
stack scale. And it wasn't, it was not a Python problem. It was like, literally just like, where
link |
00:33:56.800
do we put the data? Like it's all coming in too fast. Like how do we store it? How do we make sure
we stay up? How do we make sure we're on the right-size boxes, that they have enough
link |
00:34:07.760
memory? Those were the issues. But can you speak to the choices you make at that stage?
link |
00:34:14.160
When you're growing so quickly, do you use something like somebody else's computer infrastructure?
link |
00:34:20.320
Or do you build in house? I'm only laughing because we, when we launched, we had a single
link |
00:34:27.440
computer that we had rented in some colo space in LA, I don't remember what it was called.
link |
00:34:34.480
Because I thought that's what you did when I worked at a company called Odeo that became
link |
00:34:37.840
Twitter. I remember visiting our space in San Francisco, you walked in, you had to wear the
link |
00:34:42.240
ear things. It was cold and fans everywhere, right? And we had to, you know, plug one out,
link |
00:34:48.960
replace one, and I was the intern. So I just like held things. But I thought to myself, oh,
link |
00:34:53.840
this is how it goes. And then I remember being in a VC's office. I think it was Benchmark's office.
link |
00:34:59.920
And I think we ran into another entrepreneur and they were like, oh, how are things going? We're
link |
00:35:03.600
like, you know, try to scale this thing. And they were like, well, I mean, can't you just add more
link |
00:35:09.280
instances? And I was like, what do you mean? And they're like instances on Amazon. I was like,
link |
00:35:13.920
what are those? And it was this moment where we realized how deep in it we were, because we had
link |
00:35:19.680
no idea that AWS existed, nor that we should be using it. Anyway, that night, we went back to the office
link |
00:35:26.800
and we got on AWS, but we did this really dumb thing. We're so sorry to people listening. But
link |
00:35:33.040
we brought up an instance, which was our database. It was going to be a replacement for our database.
link |
00:35:39.440
But we had it talking over the public internet to our little box in LA that was our app server.
link |
00:35:44.960
Nice. Yeah. That's how sophisticated we were. And obviously, that was very, very slow. Didn't
link |
00:35:51.760
work at all. I mean, it worked, but didn't work. Only like later that night did we realize we had
link |
00:35:57.120
to have it all together. But at least like if you're listening right now and you're thinking,
link |
00:36:01.520
you know, I have no chance. I'm going to start a startup, but I have no chance. I don't know. We
link |
00:36:06.080
did it. We made a bunch of really dumb mistakes initially. I think the question is how quickly
link |
00:36:10.720
do you learn that you're making a mistake? And do you do the right thing immediately right after?
link |
00:36:15.120
So you didn't pay for those mistakes by failure. So yeah, how quickly did you fix it? I guess
link |
00:36:22.800
there's a lot of ways to sneak up to this question of how do you scale the thing? Other startups,
link |
00:36:28.320
if you have an idea, how do you scale the thing? Is it just AWS? And you try to write the kind of
link |
00:36:37.200
code that's easy to spread across a large number of instances, and then the rest is just put money
link |
00:36:44.480
into it? Basically, I would say a couple of things. First off, don't even ask the question.
link |
00:36:51.440
Just find product market fit. Duct tape it together, right? Like if you have to. I think
link |
00:36:56.720
there's a big caveat here, which I want to get to. But generally, all that matters is product
link |
00:37:02.480
market fit. That's all that matters. If people like your product. Do not worry about when 50,000
link |
00:37:09.120
people use your product because you will be happy that you have that problem when you get there.
link |
00:37:13.760
I actually can't name many startups where they go from nothing to something overnight and they
link |
00:37:22.320
can't figure out how to scale it. There are some, but I think nowadays it's a, when I say a solved
link |
00:37:29.120
problem, like there are ways of solving it. The base case is typically that startups worry way
link |
00:37:35.440
too much about scaling way too early and forget that they actually have to make something that
link |
00:37:39.760
people like. That's the default mistake case. But what I'll say is once you start scaling,
link |
00:37:46.880
I mean, hiring quickly, people who have seen the game before and just know how to do it, it becomes
link |
00:37:55.600
a bit of like, yeah, just throw instances at the problem, right? But the last thing I'll say on this
link |
00:38:00.640
that I think did save us. We were pretty rigorous about writing tests from the beginning. That helped
link |
00:38:09.120
us move very, very quickly when we wanted to rewrite parts of the product and know that we
link |
00:38:15.200
weren't breaking something else. Tests are one of those things where it's like, you go slow to go fast
link |
00:38:20.560
and they suck when you have to write them because you have to figure it out and
link |
00:38:25.360
there are always those ones that break when you don't want them to break and they're annoying and
link |
00:38:29.040
it feels like you spend all this time. But looking back, I think that was long-term optimal, even
link |
00:38:34.880
with a team of four, it allowed us to move very, very quickly because anyone could touch any part
link |
00:38:40.240
of the product and know that they weren't going to bring down the site, or at least in general.
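A hypothetical example of the kind of small test that pays for itself, in the pytest style; the feed function and its shape are invented for illustration.

    def build_feed(photos, limit=20):
        # Newest photos first, capped at `limit`.
        return sorted(photos, key=lambda p: p["created_at"], reverse=True)[:limit]

    def test_feed_is_newest_first_and_capped():
        photos = [{"id": i, "created_at": i} for i in range(50)]
        feed = build_feed(photos, limit=20)
        assert len(feed) == 20
        assert feed[0]["id"] == 49    # newest on top
        assert feed[-1]["id"] == 30   # oldest item that still makes the cut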
link |
00:38:45.840
At which point do you know product market fit? How many users would you say? Is all it takes,
like, 10 people, or is it 1,000? Is it 50,000? I don't think it is generally a question of absolute
link |
00:38:59.280
numbers. I think it's a question of cohorts and I think it's a question of trends. So,
link |
00:39:04.080
you know, it depends how big your business is trying to be, right? But if I were signing up 1,000
link |
00:39:11.040
people a week and they all retain like the retention curves for those cohorts looked good,
link |
00:39:16.960
healthy, and even like as you started getting more people on the service, maybe those earlier cohorts
link |
00:39:23.360
started curving up again because now there are network effects and their friends are on the
link |
00:39:26.800
service or totally depends what type of business you're in. But I'm talking purely social, right? I
link |
00:39:34.080
don't think it's an absolute number. I think it is a, I guess you could call it a marginal number.
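A sketch of that cohort view with made-up data: group users by signup week, then ask what fraction of each cohort is still active zero, one, two weeks after signing up.

    from collections import defaultdict

    # Hypothetical data: signup week per user, and (user, week) activity.
    signup_week = {"u1": 0, "u2": 0, "u3": 1, "u4": 1}
    active = {("u1", 0), ("u1", 1), ("u1", 2), ("u2", 0),
              ("u3", 1), ("u3", 2), ("u4", 1)}

    cohorts = defaultdict(list)
    for user, week in signup_week.items():
        cohorts[week].append(user)

    for week, users in sorted(cohorts.items()):
        # Retention curve: share of the cohort active N weeks after signup.
        curve = [sum((u, week + n) in active for u in users) / len(users)
                 for n in range(3)]
        print("cohort week", week, curve)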
link |
00:39:39.600
So, I spent a lot of time when I worked with startups asking them like, okay, have you looked
link |
00:39:44.800
at that cohort versus this cohort, whether it's your clients or whether it's people signing up for
link |
00:39:50.560
the service. But a lot of people think you just have to hit some mark like 10,000 people or 50,000
link |
00:39:56.720
people, but really, there are 7-ish billion people in the world. Most people, forever, will not know about your
link |
00:40:04.480
product. There are always more people out there to sign up. It's just a question of how you turn
link |
00:40:09.200
on the spigot. So, at that stage, early stage, yourself, but also by way of advice, should
link |
00:40:16.800
you worry about money at all? How this thing is going to make money? Or do you just try to
link |
00:40:22.240
find product market fit and get a lot of users to enjoy using your thing?
link |
00:40:28.480
I think it totally depends and that's an unsatisfying answer. I was talking with a friend today who,
link |
00:40:36.320
he was one of our earlier investors and he was saying, hey, have you been doing any angel
link |
00:40:40.720
investing lately? I said, not really. I'm just focused on what I want to do next. And
link |
00:40:44.800
he said the number of financings has just gone bonkers. People are throwing money everywhere
link |
00:40:52.320
right now. And I think the question is, do you have an inkling of how you're going to make money?
link |
00:41:02.720
Or are you really just waving your hands? I would not like to be an entrepreneur in the position
link |
00:41:08.560
of, well, I have no idea how this will eventually make money. That's not fun.
link |
00:41:16.960
If you are in an area like, let's say you wanted to start a social network, right? Not saying this
link |
00:41:23.520
is a good idea, but if you did, there are only a handful of ways they've made money and really
link |
00:41:28.160
only one way they've made money in the past and that's ads. So, if you have a service that's amenable
link |
00:41:36.640
to that, and then I wouldn't worry too much about that because if you get to the scale,
link |
00:41:40.800
you can hire some smart people and figure that out. I do think that it is really healthy for a
link |
00:41:47.840
lot of startups these days, especially the ones doing enterprise software, the Slacks of the world,
link |
00:41:53.840
et cetera, to be worried about money from the beginning, but mostly as a way of winning over
link |
00:41:59.680
clients and having stickiness. Of course, you need to be worried about money, but I'm going to
link |
00:42:07.360
also say this again, which is it's like long term profitability. If you have a roadmap to that,
link |
00:42:14.000
then that's great. But if you're just like, I don't know, maybe never, working on this
link |
00:42:19.040
metaverse thing, I think maybe someday, I don't know. That seems harder to me. So,
link |
00:42:25.120
you have to be as big as Facebook to finance that bet, right? Do you think it's possible? You
link |
00:42:30.480
said you're not saying it's necessarily a good idea to launch a social network. Do you think it's
link |
00:42:36.160
possible today, maybe you can put yourself in those shoes, to launch a social network that
link |
00:42:44.160
achieves the scale of a Facebook or a Twitter or an Instagram and maybe even greater scale?
link |
00:42:51.200
Absolutely. How do you do it? Asking for a friend. Yeah, if I knew, I'd probably be doing it right
link |
00:42:58.560
now and not sitting here. So, I mean, there's a lot of ways to ask this question. One is create a
link |
00:43:04.720
totally new product-market fit, create a new market, create something like Instagram did,
link |
00:43:10.160
which is like create something kind of new or literally outcompete Facebook at its own thing
link |
00:43:16.880
or outcompete Twitter at its own thing. The only way to compete now if you want to build a large
link |
00:43:22.880
social network is to look for the cracks, look for the openings. No one competed,
link |
00:43:30.880
I mean, no one competed with the core business of Google. No one competed with the core business
link |
00:43:35.200
of Microsoft. You don't go at the big guys doing exactly what they're doing. Instagram didn't win,
link |
00:43:43.120
quote unquote, because it tried to be a visual Twitter. We spotted things that either Twitter
link |
00:43:49.840
wasn't going to do or refused to do, images in the feed for the longest time, right? Or that Facebook
link |
00:43:56.640
wasn't doing or not paying attention to because they were mostly desktop at the time and we were
link |
00:44:00.960
purely mobile, purely visual. Often there are opportunities sitting there. You just have to
link |
00:44:08.160
figure out, like... I think there's a strategy book, I can't remember the name, but it talks about
moats, and just like the best place to play is where your competitor like literally can't pivot
link |
00:44:22.160
because structurally they're set up not to be there. And that's where you win. And what's
link |
00:44:29.040
fascinating is like, do you know how many people were like, images, Facebook does that, Twitter
link |
00:44:34.400
does that. I mean, how wrong were they? Really wrong. And these are some of the smartest people
link |
00:44:38.480
in Silicon Valley, right? But now Instagram has existed for a while. How is it that Snapchat could then
link |
00:44:44.480
exist? Makes no sense. Like plenty of people would say, well, there's Facebook, no images. Okay,
link |
00:44:50.240
okay. I mean, Instagram, I'll give you that one. But wait, now another image based social network
link |
00:44:55.120
is going to get really big. And then TikTok comes along. Like the prior, so you asked me,
link |
00:45:02.240
is it possible? The only reason I'm answering yes is because my prior is that it's happened
link |
00:45:08.560
once every, I don't know, three, four or five years consistently. And I can't imagine there's
link |
00:45:14.080
anything structurally that would change that. So that's why I answer that way. Not because I know
link |
00:45:19.600
how I just, when you see a pattern, you see a pattern and there's no reason to believe that's
link |
00:45:24.560
going to stop. And it's subtle too. Because like you said, Snapchat and TikTok, they're all doing
link |
00:45:30.160
the same space of things. But there's something fundamentally different about like a three
link |
00:45:36.720
second video and a five second video and a 15 second video and a one minute video and a one hour
link |
00:45:41.600
video, like fundamentally different. Fundamentally different. I mean, I think one of the reasons
link |
00:45:47.200
Snapchat exists is because Instagram was so focused on posting great, beautiful, manicured
link |
00:45:54.080
versions of yourself throughout time. And there was this enormous demand of like, hey,
link |
00:45:58.960
I really like this behavior. I love using Instagram. But man, I just wish I could share
link |
00:46:05.120
something going on in my day. Do I really have to put it on my profile? Do I really have to make
link |
00:46:10.880
it last forever? Do I really? And that opened up a door. It created a market, right? And then
link |
00:46:16.960
what's fascinating is Instagram had an Explore page for the longest time. It was image driven,
right? But there's absolutely a behavior where you open up Instagram and you sit on the Explore
page all day. That is effectively TikTok, but obviously focused on videos. And it's not like
you could just put the Explore page in TikTok form and it works. It had to be video, it had to
link |
00:46:38.240
have music. These are the hard parts about product development that are very hard to predict. But
link |
00:46:46.000
they're all versions of the same thing with varying. If you line them up in a bunch of
link |
00:46:51.360
dimensions, they're just like kind of on, they're different values of the same dimensions,
link |
00:46:56.800
which is, I guess, easy to say in retrospect. But if I were an entrepreneur going after that area,
link |
00:47:02.400
I'd ask myself, where's the opening? What needs to exist because TikTok exists now?
link |
00:47:08.400
So I wonder how much things that don't yet exist and can exist is in the space of algorithms,
link |
00:47:15.760
in the space of recommender systems, so in the space of how the feed is generated.
link |
00:47:21.520
So we kind of talk about the actual elements of the content. That's what we've been talking,
link |
00:47:27.760
the difference between photos, between short videos, longer videos. I wonder how much disruption
link |
00:47:34.000
it's possible in the way the algorithms work. Because a lot of the criticism towards social
link |
00:47:39.040
media is in the way the algorithms work currently. And it feels like, first of all, talking about
link |
00:47:45.200
product market fit, there's certainly a hunger for social media algorithms that do something
link |
00:47:55.760
different. I don't think anyone, everyone's complaining, this is not doing, this is hurting
link |
00:48:02.000
me and this is hurting society. But I keep doing it because I'm addicted to it. And they say,
link |
00:48:08.400
we want something different, but we don't know what. It feels like just different. It feels
link |
00:48:15.600
like there's a hunger for that. But that's in the space of algorithms. I wonder if it's possible
link |
00:48:20.320
to disrupt in that space. Absolutely. I have this thesis that the worst part about social networks
link |
00:48:28.800
is the people. It's a line that sounds funny, right? Because that's why you're
link |
00:48:37.440
called a social network. But what do social networks actually do for you? Just think, imagine
link |
00:48:44.880
you're an alien and you landed and someone says, hey, there's this site. It's a social network.
link |
00:48:49.280
We're not going to tell you what it is, but what does it do? And you have to explain it to them.
link |
00:48:53.040
Just two things. One is that people you know and have social ties with distribute updates
link |
00:49:01.840
through whether it's photos or videos about their lives so that you don't have to physically be with
link |
00:49:08.000
them, but you can keep in touch with them. That's one. That's like a big part of Instagram. That's
link |
00:49:12.320
a big part of Snap. It is not part of TikTok at all. So there's another big part, which is there's
link |
00:49:19.280
all this content out in the world that's entertaining, whether you want to watch it or you want to read
link |
00:49:24.560
it and matchmaking between content that exists in the world and people that want that content.
link |
00:49:33.280
It turns out to be a really big business. Search and discovery, would you say?
link |
00:49:37.200
Search and discovery. But my point is it could be video, it could be text, it could be websites,
link |
00:49:41.440
it could be, I mean, think back to Digg or StumbleUpon. But what did those do? They basically
link |
00:49:52.480
distributed interesting content to you. I think the most interesting part or the future of social
link |
00:50:01.120
networks is going to be making them less social because I think people are part of the root cause
link |
00:50:06.240
of the problem. So for instance, often in recommender systems, we talk about two stages.
link |
00:50:11.360
There's a candidate generation step, which is just like of our vast trove of stuff that you
link |
00:50:16.480
might want to see. What small subset should we pick for you? Typically, that is grabbed from
link |
00:50:25.440
things your friends have shared. Then there's a ranking step, which says, okay, now given these
link |
00:50:31.360
100, 200 things, depends on the network, let's be really good about ranking them and generally
link |
00:50:37.520
rank the things up higher that get the most engagement. So what's the problem with that?
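To make that two-stage setup concrete, here is a minimal sketch in Python. The posts, the follow graph, and the engagement scores are all invented for illustration, and real systems use learned models over far larger candidate pools, but the shape of the pipeline is the same: narrow first, then rank.

```python
# Minimal sketch of a two-stage recommender: candidate generation, then ranking.
# All data here is hypothetical.

posts = [
    {"id": 1, "author": "alice", "predicted_engagement": 0.72},
    {"id": 2, "author": "bob", "predicted_engagement": 0.31},
    {"id": 3, "author": "carol", "predicted_engagement": 0.90},
    {"id": 4, "author": "dave", "predicted_engagement": 0.55},
]
follows = {"lex": {"alice", "bob"}}  # who each user follows

def generate_candidates(user, all_posts, follow_graph, limit=200):
    # Stage 1: restrict the vast pool to things shared by accounts the user follows.
    followed = follow_graph.get(user, set())
    return [p for p in all_posts if p["author"] in followed][:limit]

def rank(candidates):
    # Stage 2: order the shortlist by predicted engagement, highest first.
    return sorted(candidates, key=lambda p: p["predicted_engagement"], reverse=True)

feed = rank(generate_candidates("lex", posts, follows))
print([p["id"] for p in feed])  # -> [1, 2]
```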
link |
00:50:41.760
Step one is we've limited everything you could possibly see to things that your friends have
link |
00:50:47.120
chosen to share, or maybe not friends, but influencers. What things do people generally
link |
00:50:53.280
want to share? They want to share things that are going to get likes, that are going to show up
link |
00:50:56.640
broadly. So they tend to be more emotionally driven. They tend to be more risque or whatever.
link |
00:51:02.880
So why do we have this problem? It's because we show people things people have decided to share
link |
00:51:08.800
and those things self-select to be the things that are most divisive. So how do you fix that?
link |
00:51:15.680
Well, just imagine for a second: why do you have to grab things from things
link |
00:51:21.600
your friends have shared? Why not just like grab things? That's really fascinating to me.
link |
00:51:26.560
And that's something I've been thinking a lot about. And just like, why is it that when you
link |
00:51:32.320
log on to Twitter, you're just sitting there looking at things from accounts that you've
link |
00:51:39.120
followed for whatever reason? And TikTok, I think, has done a wonderful job here, which is like,
link |
00:51:44.640
you can literally be anyone. And if you produce something fascinating, it'll go viral. But like,
link |
00:51:52.720
you don't have to be someone that anyone knows. You don't have to have built up a giant following.
link |
00:51:57.040
You don't have to have paid for followers. You don't have to try to maintain those followers.
link |
00:52:01.600
You literally just have to produce something interesting. That is, I think, the future
link |
00:52:05.920
of social networking. That's the direction things will head. And I think what you'll find is,
link |
00:52:11.520
it's far less about people manipulating distribution and far more about what is like,
link |
00:52:17.440
is this content good? And good is obviously a vague definition that we spend hours on. But
link |
00:52:25.280
different networks, I think, will decide different value functions to decide what is good and what
link |
00:52:29.440
isn't good. And I think that's a fascinating direction. So that's almost like creating an
link |
00:52:33.200
internet. I mean, that's what Google did for web pages. They did PageRank search. So discovery,
link |
00:52:40.240
you don't follow anybody on Google when you use a search engine. You just discover web pages.
link |
00:52:46.160
And so what TikTok does is saying, let's start from scratch. Let's start a new internet and have
link |
00:52:54.720
people discover stuff on that new internet within a particular pool of people. But what's so fascinating
link |
00:53:00.720
about this is the field of information retrieval. As I was studying this stuff,
link |
00:53:08.240
they would always use the words query and document. So I was like, why are they saying query and
link |
00:53:12.320
document? They're literally like, if you just stop thinking about query as literally a search query,
link |
00:53:18.640
and a query could be a person. I mean, a lot of the way, I'm not going to claim to know how Instagram
link |
00:53:23.440
or Facebook machine learning works today. But if you want to find a match for a query,
link |
00:53:30.160
the query is actually the attributes of the person, their age, their gender, where they're from,
link |
00:53:36.640
maybe some kind of summarization of their interests. And that's a query. And that matches
link |
00:53:42.560
against documents. And by the way, documents don't have to be text. They can be videos. They're
link |
00:53:47.440
however long. I don't know what the limit is on TikTok these days. They keep changing it.
link |
00:53:51.200
My point is just, you've got a query, which is someone in search of something that
link |
00:53:55.760
they want to match, and you've got the document, and it doesn't have to be text, it could be anything.
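A toy sketch of that query and document framing: treat the person's attributes and interests as the query, embed the query and every document in the same vector space, and rank by similarity. The feature names and vectors below are made up purely for illustration; this is not a claim about how Instagram or TikTok actually implement it.

```python
# Toy query/document matchmaking via cosine similarity over hypothetical embeddings.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Invented embedding dimensions: [likes_cooking, likes_sports, likes_music]
user_query = [0.9, 0.1, 0.6]
documents = {
    "pasta_video": [0.95, 0.05, 0.2],
    "football_clip": [0.05, 0.9, 0.1],
    "guitar_tutorial": [0.3, 0.0, 0.9],
}

# Rank documents by how well they match this person-as-query.
ranked = sorted(documents, key=lambda d: cosine(user_query, documents[d]), reverse=True)
print(ranked)  # pasta_video first for this hypothetical user
```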
link |
00:54:00.560
And how do you match make? And that's one of these like, I mean, I've spent a lot of time
link |
00:54:04.960
thinking about this, and I don't claim to have mastered it at all. But I think it's so fascinating
link |
00:54:10.080
about where that will go with new social networks. See, what I'm also fascinated by is metrics that
link |
00:54:16.080
are different than engagement. So the other thing from an alien perspective, what social networks
link |
00:54:21.840
are doing is, in the short term, bringing out different aspects of each human being. So
link |
00:54:30.560
first, let me say that an algorithm or a social network for each individual can bring out the best
link |
00:54:40.960
of that person or the worst of that person, or there's a bunch of different parts to us,
link |
00:54:45.520
parts we're proud of, parts we're not so proud of. When we look at the big picture
link |
00:54:52.160
of our lives, when we look back 30 days from now, am I proud that I said those things or not?
link |
00:54:57.440
Am I proud that I felt those things? Am I proud that I experienced or read those things or thought
link |
00:55:03.760
about those things? Just in that kind of self-reflective way. And so coupled with that,
link |
00:55:10.080
I wonder if it's possible to have different metrics that are not just about engagement,
link |
00:55:14.080
but are about long term happiness, growth of a human being, where they look back and say,
link |
00:55:22.160
I'm a better human being for having spent 100 hours on that app. And that feels like it's
link |
00:55:28.640
actually strongly correlated with engagement in the long term. In the short term, it may not be,
link |
00:55:34.640
but in the long term, it's like the same kind of thing where you really fall in love with the
link |
00:55:39.600
product. You fall in love with an iPhone, you fall in love with a car. That's what makes you fall
link |
00:55:44.560
in love is really being proud in a self reflective way, understanding that you're a better human being
link |
00:55:53.520
for having used the thing. And that's what great relationships are made from. It's not just like
link |
00:56:00.080
you're hot and we like being together or something like that. It's more like I'm a better human being
link |
00:56:06.000
because I'm with you. And that feels like a metric that could be optimized for by the algorithms.
link |
00:56:12.960
But anytime I kind of talk about this with anybody, they seem to say, yeah, okay, that's
link |
00:56:18.720
going to get outcompeted immediately by the engagement if it's ad driven, especially.
link |
00:56:24.560
I just don't think so. I mean, a lot of it is just implementation.
link |
00:56:30.080
I'll say a couple of things. One is to pull back the curtain on daily meetings inside of these
link |
00:56:37.440
large social media companies. A lot of what management or at least the people that are
link |
00:56:44.480
tweaking these algorithms spend their time on are trade offs. And there's these things called value
link |
00:56:49.520
functions, which are like, okay, we can predict the probability that you'll click on this thing
link |
00:56:55.360
or the probability that you'll share it or the probability that you will leave a comment on it
link |
00:57:01.040
or the probability you'll dwell on it. Individual actions. And you've got this neural network that
link |
00:57:09.520
basically has a bunch of heads at the end and all of them are between zero and one and great,
link |
00:57:14.720
they all have values or they all have probabilities. And then in these meetings,
link |
00:57:20.640
what they will do is say, well, how much do we value a comment versus a click versus a share
link |
00:57:27.760
versus a, and then maybe even some downstream thing that has nothing to do with the item
link |
00:57:33.600
there but like driving follows or something. And what typically happens is they will say,
link |
00:57:40.000
well, what are our goals for this quarter at the company? Oh, we want to drive sharing up.
link |
00:57:44.000
Okay, well, let's turn down these metrics and turn up these metrics. And they blend them into a
link |
00:57:50.960
single scalar with which they're trying to optimize. That is really hard. Because invariably,
link |
00:57:58.000
you think you're solving for, I don't know, something called meaningful interactions.
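A rough sketch of the kind of blending described there: a model emits several per-item probabilities (click, share, comment, dwell), and weights chosen in those meetings collapse them into the one scalar the ranker optimizes. The probabilities and weights below are invented for illustration only.

```python
# Blending multiple predicted engagement probabilities into a single ranking score.
# All numbers are hypothetical.

predictions = {"p_click": 0.40, "p_share": 0.05, "p_comment": 0.10, "p_dwell": 0.65}

# Say the company goal this quarter is "drive sharing up", so shares get a big weight.
weights = {"p_click": 0.5, "p_share": 3.0, "p_comment": 1.5, "p_dwell": 0.8}

score = sum(weights[k] * predictions[k] for k in predictions)
print(round(score, 3))  # single scalar used to rank this item, here 1.02
```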
link |
00:58:02.560
This was the big Facebook pivot. And I don't actually have any internal knowledge. I wasn't
link |
00:58:07.440
in those meetings. But at least from what we've seen over the last month or so, it seems that by actually
link |
00:58:14.800
trying to optimize for meaningful interactions, it had all these side effects of optimizing for
link |
00:58:20.080
these other things. And I don't claim to fully understand them. But what I will say
link |
00:58:26.320
is that tradeoffs abound. And as much as you'd like to solve for one thing, if you have a network
link |
00:58:32.320
of over a billion people, you're going to have unintended consequences either way. And it gets
link |
00:58:37.120
really hard. So what you're describing is effectively a value model that says, can we capture,
link |
00:58:43.280
this is the thing that I spent a lot of time thinking about, can you capture utility in a
link |
00:58:50.000
way that actually measures someone's happiness that isn't just a, what do they call it, a surrogate
link |
00:58:57.040
problem where you say, well, I think the more you use the product, the happier you are. That was
link |
00:59:03.120
always the argument at Facebook, by the way. It was like, well, people use it more so they must be
link |
00:59:07.920
more happy. Turns out there are like a lot of things you use more that make you less happy in
link |
00:59:12.160
the world, not talking about Facebook, just, you know, let's think about whether it's gambling or
link |
00:59:17.120
whatever, things you can do more of that don't necessarily make you happier. So the idea that
link |
00:59:21.440
time equals happiness, obviously, you can't map utility and time together easily. There are a
link |
00:59:27.120
lot of edge cases. So when you look around the world and you say, well, what are all the ways
link |
00:59:31.440
we can model utility? It's one of those things, please, if you know someone smart doing this,
link |
00:59:36.160
introduce me because I'm fascinated by it. And it seems really tough. But the idea that reinforcement
link |
00:59:42.240
learning, like everyone interesting I know in machine learning, like, I was really interested
link |
00:59:47.600
in recommender systems and supervised learning. And the more I dug into it, I was like, oh,
link |
00:59:52.960
literally everyone smart is working on reinforcement learning, like literally everyone.
link |
00:59:57.520
You just made people at OpenAI and DeepMind very happy, yes.
link |
01:00:00.640
But I mean, but what's interesting is like, it's one thing to train a game. And like,
link |
01:00:06.480
I mean, that paper where they just took Atari and they used a ConvNet to basically
link |
01:00:11.120
just like train simple actions, mind blowing, right? Absolutely mind blowing. But it's a game,
link |
01:00:17.120
great. So now what if you're constructing a feed for a person, right? Like, how can you construct
link |
01:00:27.440
that feed in such a way that optimizes for a diversity of experience, a long term happiness,
link |
01:00:35.200
right? But that reward function, it turns out in reinforcement learning again,
link |
01:00:41.040
as I've learned, like reward design is really hard. And I don't know, like, how do you design a
link |
01:00:48.640
scalar reward for someone's happiness over time? I mean, do you have to measure dopamine levels?
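To illustrate why that reward design is the hard part: even if you wanted a feed-building agent to optimize long-term satisfaction rather than raw engagement, you still have to collapse everything into one scalar reward. A hand-wavy sketch is below; every signal and weight is hypothetical, and the open question in this conversation is precisely that nobody knows how to measure the long-term part well.

```python
# Hypothetical per-session reward mixing a short-term engagement signal,
# a longer-term satisfaction signal, and a diversity bonus.

def session_reward(session, weekly_survey_score, diversity_weight=0.2):
    # Short-term engagement signal (easy to measure, easy to over-optimize).
    engagement = session["minutes_spent"] / 60.0
    # Long-term signal, e.g. a periodic "was this time well spent?" survey, 0.0 to 1.0.
    long_term = weekly_survey_score
    # Encourage a diversity of experience rather than one addictive loop.
    diversity = diversity_weight * session["distinct_topics"]
    return 0.1 * engagement + 1.0 * long_term + diversity

print(session_reward({"minutes_spent": 45, "distinct_topics": 6}, weekly_survey_score=0.7))
```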
link |
01:00:53.120
Like, do you have to? Well, you have to have a lot more signals
link |
01:00:57.920
from the human being. It currently feels like there aren't enough signals coming from the human beings
link |
01:01:03.280
using this algorithm. So for reinforcement learning to work well, you need to have a lot
link |
01:01:09.280
more data; it needs to have a lot of data. And that actually is a challenge for anyone who wants to
link |
01:01:13.680
start something, which is you don't have a lot of data. So how do you compete? But I do think
link |
01:01:18.560
back to your original point, rethinking the algorithm, rethinking reward functions, rethinking
link |
01:01:24.480
utility. That's fascinating. That's cool. And I think that's an open opportunity for
link |
01:01:31.920
a company that figures it out. I have to ask about April 2012, when Instagram,
link |
01:01:39.680
along with its massive employee base of 13 people, was sold to Facebook for $1 billion.
link |
01:01:46.480
What was the process like on a business level, engineering level, human level? What was that
link |
01:01:52.560
process of selling to Facebook like? What did it feel like? So I want to provide some context,
link |
01:01:58.000
which is I worked in corporate development at Google, which not a lot of people know,
link |
01:02:02.480
but corporate development is effectively the group that buys companies, right? You sit there and
link |
01:02:06.880
you acquire companies. And I had sat through so many of these meetings with entrepreneurs.
link |
01:02:11.920
We actually, fun fact, we never acquired a single company when I worked in corporate
link |
01:02:15.280
development. So I can't claim that I had a lot of experience. But I had enough experience to
link |
01:02:22.880
understand, okay, what prices are people getting and what's the process? And as we started to grow,
link |
01:02:32.160
we were trying to keep this thing running and we were exhausted and we were 13 people. And I mean,
link |
01:02:36.400
I was, trying to think back, probably 27, and 37 now. So young, on a relative basis, right?
link |
01:02:48.240
And we're trying to keep the thing running. And then, you know, we go out to raise money and
link |
01:02:54.560
we're kind of like the hot startup at the time. And I remember going into a specific VC and saying,
link |
01:03:00.320
our terms we're looking for are, we're looking for a $500 million valuation.
link |
01:03:04.240
And I've never seen so many jaws drop all in unison, right? And I was like,
link |
01:03:10.880
thanked and walked out the door very kindly after. And then I got a call the next day from someone
link |
01:03:17.600
who was connected to them and said, they said, we just want to let you know that like, it was
link |
01:03:22.080
pretty offensive that you asked for $500 million valuation. And I can't tell if that was like,
link |
01:03:28.560
just negotiating or what, but it's true, like no one offered us more, right? So we were,
link |
01:03:33.600
so can you clarify the number again? We said... how many million? 500. 500 million. 500 million. Yeah,
link |
01:03:39.920
half a billion. Yeah. So in my mind, I'm anchored like, okay, well, literally no one's
link |
01:03:45.600
biting at 500 million. And eventually we would get Sequoia and Greylock and others together at 500
link |
01:03:52.000
million post, basically; it was 450 pre, I think we raised $50 million. But just like no one was
link |
01:03:59.440
used to seeing 500 million dollar companies then like, I don't know if it was because we were
link |
01:04:04.480
just coming out of the hangover of 2008 and things were still in recovery mode. But then along comes
link |
01:04:13.600
Facebook. And after some negotiation, we 2x'd the number from half a billion to a billion.
link |
01:04:21.840
Yeah, it seems pretty good. You know, and I think Mark and I really saw eye to eye that this
link |
01:04:28.640
thing could be big. We thought we could, their resources would help us scale it. And in a lot
link |
01:04:34.080
of ways it de-risked, I mean, a lot of the employees' lives for the rest of their lives,
link |
01:04:39.280
including me, including Mike, right? I think I might have had like 10 grand in my bank account
link |
01:04:44.640
at the time, right? Like, we're working hard. We had nothing. So on a relative basis, it seems
link |
01:04:51.520
very high. And then I think the last company to exit for anywhere close to a billion was YouTube
link |
01:04:56.640
that I could think of. And thus began the giant long bull run of 2012 all the way,
link |
01:05:03.840
all the way to where we are now where I saw some stat yesterday about like how many unicorns exist
link |
01:05:09.680
and it's absurd. But then again, never underestimate technology and like the value it can provide and
link |
01:05:16.960
man, costs have dropped and, man, scale has increased. And you can make businesses make a lot of money
link |
01:05:23.040
now. But on a fundamental level, I don't know, like how do you describe the decision to sell
link |
01:05:30.320
a company with 13 people for a billion dollars? So first of all, it must take a lot of
link |
01:05:36.400
guts to sit at a table and say 500 million or 1 billion with Mark Zuckerberg. It seems like a
link |
01:05:42.720
very large number with 13 people. It doesn't just seem large, it is; they're all large, especially,
link |
01:05:49.760
like you said, before the unicorn parade. I like that. I'm going to use that.
link |
01:05:56.720
The unicorn parade. Yeah, you were at the head of the unicorn parade.
link |
01:06:01.600
It's the yeah, it's a massive unicorn parade. Okay, so
link |
01:06:08.000
no, I mean, we knew we were worth, quote unquote, a lot, but we didn't, I mean, there was no market
link |
01:06:13.600
for Instagram. I mean, you couldn't mark to market this thing
link |
01:06:17.040
in the public markets, you didn't quite understand what it would be worth or was worth at the time.
link |
01:06:22.880
So in a market, an illiquid market where you have one buyer and one seller and you're going back
link |
01:06:26.880
and forth. And well, I guess there were like VC firms who were willing to, you know, invest at
link |
01:06:33.680
a certain valuation. So I don't know, you just go with your gut. And, and at the end of the day,
link |
01:06:40.480
I would say the hardest part of it was not realizing, like when we sold, it was tough,
link |
01:06:52.000
because like literally everywhere I go, restaurants, whatever, like, for a good six months after,
link |
01:06:59.040
it was a lot of attention on the deal, a lot of attention on the product, a lot of attention.
link |
01:07:03.920
And it was kind of miserable, right? And you're like, wait, like I made a lot of money, but like,
link |
01:07:08.480
why is this not great? And it's because it turns out, you know, and I don't, I don't know, like
link |
01:07:14.880
I don't really keep in touch with Mark, but I've got to assume his job right now is not exactly
link |
01:07:18.480
the most happy job in the world. It's really tough when you're on top. And it's really tough
link |
01:07:22.640
when you're in the limelight. So the decision itself was like, Oh, cool, this is great. How
link |
01:07:27.520
lucky are we? Right? So okay, there's a million questions. Yeah, go. First of all,
link |
01:07:32.480
why? Why is it hard to be on top? Why did you not feel good? Like, can you dig into that? It always,
link |
01:07:42.960
I've heard like Olympic athletes say, after they win gold, they get depressed. Is it something like
link |
01:07:50.240
that where it feels like it was kind of like a thing you were working towards? Yeah, some loose
link |
01:07:57.680
definition of success. And this, it feels like, at least according to other startups, this
link |
01:08:03.440
is what success looks like. And now, why don't I feel any better? I'm still human. I still have
link |
01:08:09.600
all the same problems. Is that the nature of it? Or is it just, like, negative attention of some kind?
link |
01:08:14.320
I think it's all of the above. But to be clear, there was a lot of happiness in terms of like,
link |
01:08:19.200
oh, my God, this is great. Like we like won the Super Bowl of startups, right? Anyone who can get to
link |
01:08:27.600
a liquidity event of anything meaningful feels like, wow, this is what we started out to do. Of
link |
01:08:32.880
course, we want to create great things that people love. But like we won in a big way. But yeah,
link |
01:08:38.800
there's this big, like, oh, if we won, what's next? I don't know. So they call it the 'we have
link |
01:08:45.760
arrived' syndrome, which I need to go back and look where I can quote that from. But I remember
link |
01:08:53.120
reading about it at the time, I was like, Oh, yeah, that's that. And I remember we had a product
link |
01:08:57.440
manager leave very early on when we got to Facebook. And he said to me, I just don't believe I can
link |
01:09:02.800
learn anything at this company anymore. It's like it's hit its apex. We sold it great. I just don't
link |
01:09:10.240
have anything else to learn. So from 2012, all the way to the day I left in 2018, like the amount I
link |
01:09:17.120
learned and the humility with which I realized, Oh, we thought we won. Billion dollars is cool.
link |
01:09:23.360
But like, there are a hundred billion dollar companies. And by the way, on top of that,
link |
01:09:28.320
we had no revenue. We had, I mean, we had a cool product, but we didn't scale it yet. And
link |
01:09:33.360
there's so much to learn. And then competitors and how fun was it to fight Snapchat? Oh my God,
link |
01:09:38.800
like it was like Yankees-Red Sox. It's great. Like that's what you live for.
link |
01:09:45.920
You know, you win some, you lose some, but the amount you can learn through that process.
link |
01:09:52.480
What I've realized in life is that there is no end, and there's always someone who has more,
link |
01:09:58.480
there's always more challenge, just at different scales. And it sounds like a little Buddhist,
link |
01:10:04.800
but everything is super challenging, whether you're like a small business or an enormous business.
link |
01:10:12.960
I say like, choose the game you like to play, right? You've got to imagine that if you're an
link |
01:10:18.320
amazing basketball player, you enjoy, to some extent, practicing basketball. It's got to be
link |
01:10:23.280
something you love. It's going to suck. It's going to be hard. You're going to have injuries,
link |
01:10:27.120
right? But you got to love it. And the same thing with Instagram, which is
link |
01:10:30.720
we might have sold, but it was like, great, there's one Super Bowl title. Can we win five?
link |
01:10:38.560
What else can we do? Now I imagine you didn't ask this, but okay, so I left. There's a little bit
link |
01:10:44.560
of like, what do you do next? Right? Like, what do you, how do you top that thing? It's the wrong
link |
01:10:49.600
question. The question is like, when you wake up every day, what is the hardest, most interesting
link |
01:10:54.640
thing you can go work on? Because like, at the end of the day, we all turn into dirt,
link |
01:10:59.200
doesn't matter, right? But what does matter is like, can we really enjoy this life?
link |
01:11:05.360
Not in a hedonistic way, because that's, it's like the reinforcement learning thing,
link |
01:11:10.000
like short term versus long term objectives. Can you wake up every day and truly enjoy what you're
link |
01:11:16.800
doing, knowing that it's going to be painful, knowing that like, no matter what you choose,
link |
01:11:24.000
it's going to be painful. Whether you sit on a beach or whether you manage a thousand people or
link |
01:11:29.280
10,000, it's going to be painful. So choose something that's fun to have pain. But yes,
link |
01:11:35.600
there was a lot of 'we have arrived,' and it's a maturation process you just have to go through.
link |
01:11:44.320
So no matter how much success there is, how much money you make, you have to wake up the next day
link |
01:11:49.360
and choose the hard life, whatever that means next. That's fun. The fun slash hard life.
link |
01:11:55.520
Hard life, that's fun. I guess what I'm trying to say is slightly different,
link |
01:12:00.240
which is just that no one realizes everything's going to be hard. Even chilling out is hard.
link |
01:12:07.600
And then you just start worrying about stupid stuff. Like, I don't know, like,
link |
01:12:11.680
did so and so forget to paint the house today? Or like, did the gardener come or whatever? Like,
link |
01:12:16.320
or, oh, I'm so angry and my shipment of wine didn't show up and I'm sitting here on the beach
link |
01:12:21.040
without my wine. I don't know, I'm making shit up now. But like, it turns out that even chilling
link |
01:12:25.760
aka meditation is hard work. Yeah. And at least meditation is like productive chilling where
link |
01:12:30.960
you're like actually training yourself to calm down and be. But backing up for a moment,
link |
01:12:36.160
everything's hard. You might as well be like playing the game you love to play. I just like
link |
01:12:42.080
playing and winning. And I'm still on the, I think, the first half of life knock on wood.
link |
01:12:52.000
And I've got a lot of years and what am I going to do? Sit around? And the other way of looking
link |
01:12:56.080
at this, by the way, imagine you made one movie and it was great. Would you just like stop making
link |
01:13:03.680
movies? No. Generally, you're like, wow, I really like making movies. Let's make another one. A lot
link |
01:13:09.520
of times, by the way, the second one or the third one, not that great, but the fourth one, awesome.
link |
01:13:13.760
And everyone forgets the second and the third one.
link |
01:13:17.440
So there's just this constant process of like, can I produce? And is that fun? Is that exciting?
link |
01:13:24.320
What else can I learn? So this machine learning stuff for me has been this awesome new chapter
link |
01:13:29.520
of being like, man, that's something I didn't understand at all. And now I feel like I'm one
link |
01:13:34.960
tenth of the way there. And that feels like a big mountain to climb. So I distracted us from the
link |
01:13:41.520
original question. No, well, we'll return to the machine learning, because I'd love to explore
link |
01:13:45.280
your interest there. But I mean, speaking of sort of challenges and hard things, is there a possible
link |
01:13:50.960
world where sitting in a room with Mark Zuckerberg, with a $1 billion deal, you turn it down?
link |
01:13:59.120
Yeah, of course. What does that world look like? Why would you turn it down? Why did you take it?
link |
01:14:05.520
What was the calculation that you were making? Thus enters the world of counterfactuals and
link |
01:14:11.520
not really knowing. And if only we could run that experiment. Well, the universe exists. It's just
link |
01:14:17.040
running in parallel to our own. Yeah. It's so fascinating, right? I mean, we're talking a lot
link |
01:14:23.920
about money. But the real question was, I'm not sure you'll believe me when I say this, but could
link |
01:14:31.680
we strap our little company onto the side of a rocket ship and get out to a lot of people
link |
01:14:38.240
really, really quickly with the support, with the talent of a place like Facebook? I mean,
link |
01:14:45.280
people often ask me what I would do differently at Instagram today. And I say, well, I'd probably
link |
01:14:50.000
hire more carefully because we showed up and, just like that, before I knew it, we had like 100 people on the
link |
01:14:55.360
team and 200, then 300. I don't know where all these people were coming from. I never had to
link |
01:15:00.000
recruit them. I never had to screen them. They were just like internal transfers, right? So it's
link |
01:15:05.280
like relying on the Facebook hiring machine, which is quite sort of, I mean, it's an elaborate
link |
01:15:10.640
machine. It's great, by the way. They have really talented people there. But my point is
link |
01:15:16.480
the choice was like, take this thing, put it on the side of a rocket ship that you know is growing
link |
01:15:23.840
very quickly. Like I had seen what had happened when Ev sold Blogger to Google, and then Google
link |
01:15:30.160
went public. Remember, we sold before Facebook went public. There was a moment at which the
link |
01:15:36.000
stock price was $17, by the way. Facebook stock price was $17. I remember thinking, what the
link |
01:15:41.200
fuck did I just do? Now at $320, I don't know where we are today, but like, okay. The
link |
01:15:51.920
best thing, by the way, is when the stock is down, everyone calls you a dope. And then when it's up,
link |
01:15:57.440
they also call you a dope, but just for a different reason, right? You can't win.
link |
01:16:02.400
Lesson in there somewhere. But the choice is to strap yourself to a rocket ship or to build
link |
01:16:07.760
your own. You know, Mr. Elon built his own, literally, with a rocket ship. That's a difficult
link |
01:16:14.720
choice because there's a world. Actually, I would say something different, which is Elon and others
link |
01:16:21.200
decided to sell PayPal for not that much. I mean, how much was it about a billion dollars? I can't
link |
01:16:27.040
remember. Something like that. Yeah. I mean, it was early, but it's worth a lot more now.
link |
01:16:31.440
To then build a new rocket ship. So this is the cool part, right? If you are an entrepreneur,
link |
01:16:37.200
and you own a controlling stake in the company, not only is it really hard to do something else
link |
01:16:42.560
with your life, because all of the value is tied up in you as a personality attached to this company,
link |
01:16:49.680
right? But if you sell it and you get yourself enough capital and you have enough energy,
link |
01:16:55.680
you could do another thing or 10 other things, or in Elon's case, like a bunch of other things.
link |
01:16:59.840
I don't know. I lost count at this point. And it might have seemed silly at the time. And sure,
link |
01:17:06.160
like if you look back, man, PayPal is worth a lot now, right? But I don't know. Do you think Elon
link |
01:17:12.400
cares about, are we going to buy Pinterest or not? He created a massive capital that allowed
link |
01:17:21.280
him to do what he wants to do. And that's awesome. That's more freeing than anything,
link |
01:17:25.360
because when you are an entrepreneur attached to a company, you got to stay at that company for a
link |
01:17:29.600
really long time. It's really hard to remove yourself. But I'm not sure how much he loved
link |
01:17:34.400
PayPal versus SpaceX and Tesla. I have a sense that you love Instagram.
link |
01:17:41.280
Yeah, I loved it enough to work for six years beyond the deal.
link |
01:17:44.880
Which is rare, which is very rare. You chose.
link |
01:17:47.360
But can I tell you why? There are not a lot of companies that you can be part of where
link |
01:17:56.160
the Pope's like, I would like to sign up for your product. I'm not a religious person at all.
link |
01:18:00.720
I'm really not. But when you go to the Vatican and you're walking among the giant columns and
link |
01:18:06.960
you're hearing the music and everything, and the Pope walks in and he wants to press the
link |
01:18:11.280
sign up button on your product, it's a moment in life. No matter what your persuasion.
link |
01:18:19.280
The number of doors and experiences that that opened up was incredible. The people I got to
link |
01:18:24.880
meet, the places I got to go, I assume maybe a payments company is slightly different, right?
link |
01:18:31.920
But that's why it was so fun. Plus, I truly believed we were building such a great product,
link |
01:18:38.400
and I loved the game. It wasn't about money. It was about the game.
link |
01:18:44.480
Do you think you had the guts to say no? I often think about this. How hard is it for an
link |
01:18:51.280
entrepreneur to say no? Because the peer pressure, basically the sea of entrepreneurs in Silicon
link |
01:18:58.160
Valley are going to tell you, this is their dream. The thing sitting before you was the dream.
link |
01:19:05.040
To walk away from that is really, it seems like nearly impossible. Because Instagram could,
link |
01:19:13.360
in 10 years, become like Google. You could be making self-driving cars and building
link |
01:19:20.640
rockets that go to Mars and compete with SpaceX. Totally. That's an interesting decision to say,
link |
01:19:28.080
am I willing to risk it? The reason I also say it's an interesting decision, because
link |
01:19:35.440
it feels like, per our previous discussion, if you're launching a social network company,
link |
01:19:42.000
there's going to be that meeting, whatever that number is, if you're successful. If you're on
link |
01:19:47.280
this rocket ship of success, there's going to be a meeting with one of the social
link |
01:19:52.800
network companies that want to buy you, whether it's Facebook or Twitter, but it could also very
link |
01:19:59.040
well be Google who seems to have a graveyard of failed social networks. I think about that,
link |
01:20:10.560
how difficult it is for an entrepreneur to make that decision. How many have successfully made
link |
01:20:15.040
that decision, I guess. This is a big question. It's sad to me, to be honest, that too many make
link |
01:20:21.120
that decision perhaps for the wrong reason. Sorry, when you say make the decision, you mean to the
link |
01:20:25.920
affirmative. To the affirmative, yeah. You got it. Yeah. There are also companies that don't sell
link |
01:20:31.360
and take the path and say, we're going to be independent, and then you never hear of
link |
01:20:36.160
them again. I remember Path was one of our competitors early on. There was a big moment
link |
01:20:44.640
when they had, I can't remember what it was, $110 million offer from Google or something.
link |
01:20:50.640
It might have been larger. I don't know. I remember there was this big TechCrunch article
link |
01:20:56.480
that was like, they turned it down after talking deeply about their values and everything. I don't
link |
01:21:04.000
know the inner workings of Foursquare, but I'm certain there were many conversations over time
link |
01:21:09.040
where there were companies that wanted Foursquare as well. Recently, what other companies,
link |
01:21:15.840
there's Clubhouse. I don't know. Maybe people were really interested in them, too. There are
link |
01:21:21.920
plenty of moments where people say no, and we just forget that those things happen. We only
link |
01:21:28.720
focus on the ones where they said yes and like, wow, what if they had stayed independent? I don't
link |
01:21:36.560
know. I used to think a lot about this. Now, I just don't because I'm like, whatever. Things
link |
01:21:42.880
have gone pretty well. I'm ready for the next game. Think about an athlete where, I don't know,
link |
01:21:50.720
maybe they do something wrong in the World Series or whatever. I'm like, if you let it
link |
01:21:55.200
haunt you for the rest of your career, why not just be like, I don't know, it was a game.
link |
01:22:00.240
Next game. Next shot. If you just move to that world, at least I have a next shot, right?
link |
01:22:07.360
No, that's beautiful. I mean, just insights. It's funny you brought up Clubhouse. It is
link |
01:22:12.960
very true. It seems like Clubhouse is on a downward path, and it's very possible they could have seen
link |
01:22:21.200
a billion-plus dollar deal at some stage, maybe like a year ago or half a year ago, from Facebook,
link |
01:22:27.840
from Google. I think Facebook was flirting with that idea too, and I think a lot of companies
link |
01:22:33.280
probably were. I wish it was more public. You know what? There's not like a badass public story
link |
01:22:41.120
about them making the decision to walk away. We just don't hear about it, and then we get to see
link |
01:22:45.760
the results of that success or failure, more often failure. So a couple of things. One is,
link |
01:22:52.160
I would not assume Clubhouse is down for the count at all. They're young. They have plenty
link |
01:22:56.000
of money. They're run by really smart people. I'd give them a real fighting chance to figure it
link |
01:23:02.160
out. There are a lot of times when people call Twitter down for the count and they figure it
link |
01:23:05.520
out, and they seem to be doing well, right? So just backing up and not knowing anything
link |
01:23:10.640
about their internals, there's a strong chance they will figure it out and that people are just
link |
01:23:16.400
down because they like being down about companies. They like assuming that they're going to fail.
link |
01:23:19.920
So who knows, right? But let's take the ones in the past where we know how it played out.
link |
01:23:23.920
There are plenty of examples where people have turned down big offers and then you've just never
link |
01:23:29.200
heard from them again. But we never focus on the companies because you just forget that those were
link |
01:23:34.000
big. But inside your psyche, I think it's easy for someone with enough money to say money doesn't
link |
01:23:43.040
matter, which I think is bullshit. Of course, money matters to people. But at the moment,
link |
01:23:50.000
you just can't even grasp the number of zeros that you're talking about. It just doesn't make sense,
link |
01:23:55.600
right? So to think rationally in that moment is not something many people are equipped to do,
link |
01:24:01.280
especially not people where I think we had founded the company a year earlier, maybe
link |
01:24:05.840
two years earlier, like a year and a half, we were 13 people. But I will say, I still don't know
link |
01:24:13.200
if it was the right decision because I don't have that counterfactual. I don't know that other world.
link |
01:24:17.280
Yeah. I'm just thankful that by and large, most people love Instagram still do. By and large,
link |
01:24:24.480
people are very happy with the time we had there. And I'm proud of what we built. So I'm cool.
link |
01:24:32.800
Now it's next shot, right? Well, if we could just linger on this Yankees versus Red Sox,
link |
01:24:40.240
the fun of it, the competition over, I would say, over the space of features. So there are a bunch
link |
01:24:47.440
of features, like there's photos, there's one minute videos on Instagram, there's IGTV, there's
link |
01:24:56.000
stories, there's reels, there's live. So that sounds like it's like a long list of too much
link |
01:25:02.480
stuff. But it's not, because while they feel close together, they're somehow,
link |
01:25:08.640
like we were saying, fundamentally distinct, like each of the things I mentioned.
link |
01:25:13.280
Maybe can you describe the philosophies, the design philosophies behind some of these,
link |
01:25:17.760
how you were thinking about it during the historic war between Snapchat and Instagram,
link |
01:25:24.720
or just in general, like the space of features that was discovered?
link |
01:25:30.080
There's this great book by Clay Christensen called Competing Against Luck. It's like a terrible
link |
01:25:37.040
title. But within it, there's effectively an expression of this thing called jobs to be done
link |
01:25:45.120
theory. And it's unclear if he came up with it or some of his colleagues, but there are a bunch
link |
01:25:50.960
of places you can find with people claiming to have come up with this jobs to be done theory.
link |
01:25:54.880
But the idea is if you zoom out and you look at your product, you ask yourself,
link |
01:26:01.600
why are people hiring your product? Imagine every product in your life is effectively an employee,
link |
01:26:10.000
you're the CEO of your life, and you hire products to be employees effectively. They all have roles and
link |
01:26:14.720
jobs. Why are you hiring a product? Why do you want that product to perform something in your
link |
01:26:20.720
life? What are the hidden reasons why you're in love with this product? Instagram was about
link |
01:26:27.600
sharing your life with others visually, period. Why? Because you feel connected with them,
link |
01:26:34.080
you get to show off, you get to feel good and cared about with likes. And it turns out that
link |
01:26:43.840
that will, I think, forever define Instagram and any product that serves that job is going to do
link |
01:26:51.280
very well. Stories, let's take it as an example, is very much serving that job. In fact, it serves
link |
01:26:59.920
it better than the original product because when you're large and have an enormous audience,
link |
01:27:05.920
you're worried about people seeing your stuff or you're worried about being permanent so that a
link |
01:27:09.760
college admissions person is going to see your photo of you doing something. And so it turns out
link |
01:27:15.200
that that is a more efficient way of performing that job than the original product was. The original
link |
01:27:20.080
product still has its value, but at scale, these two things together work really, really well.
link |
01:27:26.320
Now, I will claim that other parts of the product over time didn't perform that job as well. I
link |
01:27:32.720
think IGTV probably didn't. Shopping is completely unrelated to what I just described, but will it
link |
01:27:40.400
work? I don't know. Products that succeed are products that all share this parent node
link |
01:27:48.720
of this job to be done that is in common, and then they're just different ways of doing it.
link |
01:27:55.200
Apple, I think, does a great job with this. It's like managing your digital life,
link |
01:27:59.520
and all the products just work together. They sync. It's beautiful, even if they require
link |
01:28:07.040
silly specific cords to work, but they're all part of a system. It's when you leave that system
link |
01:28:14.080
and you start doing something weird that people start scratching their head, and I think you are
link |
01:28:18.400
less successful. I think one of the challenges Facebook has had throughout its life is that it
link |
01:28:22.880
has never fully, I think, appreciated the job to be done of the main product, and what it's done is
link |
01:28:28.800
said, ooh, there's a shiny object over there. That starts getting some traction. Let's go copy that
link |
01:28:33.600
thing. Then they're confused why it doesn't work. Why doesn't it work? It's because the people who
link |
01:28:38.720
show up for this don't want that. It's different. What's the purpose of Facebook? I remember I
link |
01:28:44.800
was a very early Facebook user. The reason I was personally excited about Facebook is
link |
01:28:52.240
because you can, first of all, use your real name. I can exist in this world. I can formally
link |
01:29:00.080
exist. I like anonymity for certain things, Reddit and so on, but I wanted to also exist
link |
01:29:07.840
not anonymously so that I can connect with other friends of mine, not anonymously,
link |
01:29:13.520
and there's a reliable way to know that I'm real and they're real and that we're connecting.
link |
01:29:21.760
I liked it for the reasons that people like LinkedIn, I guess, but not where everybody is dressed up
link |
01:29:30.800
and being super polite, more like with friends. Then it became something much bigger than that,
link |
01:29:37.440
I suppose. There's a feed. It became a place to discover content,
link |
01:29:48.800
to share content that's not just about connecting directly with friends. I mean,
link |
01:29:53.840
it became something else. I don't even know what it is really. You said Instagram is a place where
link |
01:29:58.880
you visually share your life. What is Facebook? Let's go back to the founding of Facebook and
link |
01:30:06.160
why it worked really well initially at Harvard and then Dartmouth and Stanford. I can't remember.
link |
01:30:11.920
Probably MIT. There were a handful of schools in that first tranche. It worked because there are
link |
01:30:19.120
communities that exist in the world that want to transact. When I say transact, I don't mean
link |
01:30:24.640
commercially. I just mean they want to share. They want to coordinate. They want to communicate.
link |
01:30:29.520
They want a space for themselves. Facebook at its best, I think, is that. If you actually
link |
01:30:37.360
look at the most popular products that Facebook has built over time, if you look at things like
link |
01:30:43.040
groups and marketplace, groups is enormous. Groups is effectively like everyone can found
link |
01:30:50.240
their own little Stanford or Dartmouth or MIT and find each other and share and communicate
link |
01:30:56.960
about something that matters deeply to them. That is the core of what Facebook was built around.
link |
01:31:03.840
I think today is where it stands most strongly. Yeah, it's brilliant. The groups, I wish groups
link |
01:31:12.720
were done better. It feels like it's not a first class citizen. I know I may be saying something
link |
01:31:18.400
without much knowledge, but it feels like it's bolted on while being used a lot.
link |
01:31:26.080
It feels like there needs to be a little bit more structure in terms of discovery.
link |
01:31:32.720
I mean, look at Reddit. Reddit is basically groups, but public and open and a little bit crazy
link |
01:31:38.480
in a good way. But there's clear product market fit for that specific use case,
link |
01:31:44.800
and it doesn't have to be a college. It can be anything. It can be a small group, a big group.
link |
01:31:48.320
It can be group messaging. Facebook shines, I think, when it leans into that. I think when
link |
01:31:54.240
there are other companies that just seem exciting, and now all of a sudden the product shifts in some
link |
01:32:01.520
fundamental way to go try to compete with that other thing, that's when I think consumers get
link |
01:32:07.280
confused. Even if you can be successful, even if you can compete with that other company,
link |
01:32:12.960
even if you can figure out how to bolt it on, eventually you come back and you look at the
link |
01:32:17.680
app and you're like, I just don't know why I opened this app. There are too many things going on,
link |
01:32:23.040
and that was always a worry. I mean, you listed all the things at Instagram, and that almost
link |
01:32:26.640
gave me a heart attack. Way too many things, but I don't know. Entrepreneurs get bored.
link |
01:32:31.760
They want to add things, right? I don't have a good answer for it, except for that,
link |
01:32:39.040
I think, being true to your original use case, and not even original use case, but
link |
01:32:43.760
sorry, actually not use case, original job. There are many use cases under that job.
link |
01:32:48.480
Being true to that and being really good at it over time and morphing as needs change,
link |
01:32:57.040
I think that's how to make a company last forever. Honestly, my main thesis about why
link |
01:33:04.480
Facebook is in the position it is today is this: if they had had a series of product launches
link |
01:33:12.720
that delighted people over time, I think they'd be in a totally different world. Just imagine
link |
01:33:19.680
for a moment, and by the way, Apple's entering this, but Apple for so long, just product after
link |
01:33:24.800
product, you couldn't wait for it. You stood in line for it. You talked about it. You got excited.
link |
01:33:29.840
Amazon makes your life so easy. It's like, wow, I needed this thing, and it showed up at my door
link |
01:33:35.280
two days later, and both of these companies, by the way, Amazon, Apple have issues. There are
link |
01:33:41.680
labor issues, whether it's here in the US or in China. There are environmental issues.
link |
01:33:48.160
But when's the last time you heard a large chorus being like, these companies better pay
link |
01:33:53.840
for what they're doing on these things, right? I think Facebook's main issue today is you need
link |
01:34:00.320
to produce a hit. If you don't produce hits, it's really hard to keep consumers on your side.
link |
01:34:06.480
Then people just start picking on you for a variety of reasons, whether it's right or wrong.
link |
01:34:11.120
I'm not even going to place a judgment right here and right now. I'm just going to say that
link |
01:34:16.080
it is way better to be in a world where you are producing hits and consumers love what you're
link |
01:34:20.640
doing because then they're on your side. I think the past 10 years for Facebook have been
link |
01:34:28.000
fairly hard on this dimension. And by hits, you don't necessarily mean financial hits. It feels
link |
01:34:33.360
like to me what you're saying is something that brings joy, a product that brings joy to some
link |
01:34:38.640
fraction of the population. Yeah. I mean, TikTok isn't just literally an algorithm.
link |
01:34:45.360
In some ways, TikTok's content and algorithm have more sway now over the American psyche
link |
01:34:53.200
than Facebook's algorithm. It's visual, it's video. By the way, it's not defined by who you
link |
01:34:58.960
follow. It's defined by some magical thing that, by the way, if someone wanted to tweak to show you
link |
01:35:02.960
a certain type of content for some reason, they could. People love it.
link |
01:35:10.640
So as a CEO, let me ask you a question because leadership matters.
link |
01:35:17.520
This is a complicated question. Why is Mark Zuckerberg distrusted,
link |
01:35:22.800
disliked, and sometimes even hated by many people in public?
link |
01:35:26.640
Right. That is a complicated question. Well, the premise, I'm not sure I agree with the premise.
link |
01:35:34.560
And I can expand that to include even a more mysterious question for me, Bill Gates.
link |
01:35:44.640
What is the Bill Gates version of the question? Do you think people hate Bill Gates?
link |
01:35:48.480
No, distrust. So take away one. It's a checklist. I think Mark Zuckerberg's distrust is the primary
link |
01:36:00.880
one, but there's also a dislike. Maybe hate is too strong a word, but just if you look at the
link |
01:36:08.400
articles that are being written and so on, there's a dislike. It's confusing to me because
link |
01:36:16.080
it's like the public picks certain individuals and they attach certain kinds of emotions to those
link |
01:36:22.080
individuals. Yeah. So someone just recently said, there's a strong case that founder led
link |
01:36:29.920
companies have this problem and that a lot of Mark's issues today come from the fact that he
link |
01:36:36.320
is a visible founder with this story that people have watched in both a movie and they followed
link |
01:36:43.440
along and he's this boy wonder kid who became one of the world's richest people. And he's no
link |
01:36:49.200
longer Mark the person. He's 'Mark,' this image of a person with enormous wealth and power.
link |
01:36:55.520
And in today's world, we have issues with enormous wealth and power for a variety of reasons. One
link |
01:37:02.240
of which is we've been stuck inside for a year and a half, two years. One of which is a lot of
link |
01:37:09.200
people were really unhappy about not the last election, but the last, last election. And where
link |
01:37:14.080
do you take out that anger? Who do you blame but the people in charge? That's one example or one
link |
01:37:19.760
reason why I think a lot of people express anger, resentment or unhappiness with Mark. At the same
link |
01:37:30.080
time, I don't know, I pointed out to that person, I was like, well, I don't know. I think a lot of
link |
01:37:35.360
people really like Elon. Like, Elon arguably, like he kept his factory open here throughout
link |
01:37:41.520
COVID protocols, which arguably a lot of people would be against. While saying a bunch of crazy,
link |
01:37:49.760
offensive things on the internet, he still, like, basically, you know, gives the middle finger to
link |
01:37:55.760
the SEC like on Twitter and like, I don't know. I'm like, well, there's a founder and like,
link |
01:38:00.880
people kind of like him. So I do think that the founder and slash CEO of a company that's a social
link |
01:38:10.400
network company is like an extra level of difficulty. If life is a video game, you just chose the
link |
01:38:16.160
harder video game. So I mean, that's why it's interesting to ask you because you were the
link |
01:38:21.520
founder and CEO of a social network. I challenge it because exactly. But you're one of the rare
link |
01:38:28.000
examples, even Jack Dorsey's disliked, not to the degree, but it just seems harder when you're
link |
01:38:36.400
running a social media company. It's interesting. I never thought of Jack as just like, I think
link |
01:38:41.600
generally, he's well respected. Yeah, I think so. I think you're right. But like, he's not loved.
link |
01:38:50.400
Yeah. And I feel like you, I mean, to me, Twitter is an incredible thing. Yeah. Again,
link |
01:38:55.920
can I just come back to this point, which seems oversimplistic, but like, I really do think
link |
01:39:01.920
how a product makes someone feel they ascribe that feeling to the founder.
link |
01:39:09.840
So, so make people feel good. So think about it. Like, let's just go with this thesis first.
link |
01:39:14.640
Sure. I like it though. Amazon's pretty utilitarian, right? It delivers brown boxes to
link |
01:39:20.800
your front door. Sure, you can have Alexa and you can have all these things, right? But in general,
link |
01:39:26.240
it delivers stuff quickly to you at a reasonable price, right? I think Jeff Bezos is wonderfully
link |
01:39:33.520
wealthy, thoughtful, smart guy, right? But like, people kind of feel that way about them. They're
link |
01:39:38.800
like, wow, this is really big. We're impressed that this is really big. And, but he's doing the
link |
01:39:44.000
same space stuff Elon's doing, but they don't necessarily ascribe the same sense of wonder,
link |
01:39:48.640
right? Now, let's take Elon. And again, this is pet theory. I don't have much proof other
link |
01:39:54.000
than my own intuition. He is literally about living the future. Mars. It's about wonder. It's
link |
01:40:02.000
about going back to that feeling as a kid when you looked up to the stars and asked,
link |
01:40:05.760
is there life out there? People get behind that because it's a sense of hope and excitement and
link |
01:40:12.800
innovation. And like, you can say whatever you want, but we ascribe that emotion to that person. Now,
link |
01:40:19.280
let's say you're on a social network and people make you kind of angry because they disagree with
link |
01:40:23.920
you or they say something ridiculous or they're living a FOMO type life where you're like, wow,
link |
01:40:28.720
I wish I was doing that thing. I think Instagram, if I were to think back, by and large, when I was
link |
01:40:35.120
there, it was not about FOMO, was not about this influencer economy, although it certainly became
link |
01:40:40.960
that way closer to the end. It was about the sense of wonder and happiness and beautiful things in
link |
01:40:46.720
the world. And I don't know. I mean, like, I don't want to have a blind spot, but I don't think
link |
01:40:50.880
anyone had a strong opinion about FOMO one way or the other. For the longest time, the way people
link |
01:40:54.880
explained to me, I mean, if you want to go for toxicity, you go to Facebook or Twitter. If you
link |
01:40:59.520
want to feel good about life, you go to Instagram to enjoy, to celebrate life, though.
link |
01:41:04.320
And my experience when talking to people is they gave me the benefit of the doubt because of that.
link |
01:41:08.240
But if your experience of the product is that it kind of makes you angry, it's where you argue.
link |
01:41:13.440
I mean, a big part of Jack might be that he wasn't actually the CEO for a very long time and only
link |
01:41:18.000
became recently. So I'm not sure how much of the connection got made. But in general, I mean, if you
link |
01:41:25.440
hate, you know, I'm just thinking about other companies that are in tech companies, if you
link |
01:41:30.960
hate like what a company is doing or it makes you not feel happy, I don't know, like people are
link |
01:41:37.200
really angry about Comcast or whatever. Are they even called Comcast anymore? It's like Xfinity or
link |
01:41:41.520
something, right? They had to rebrand. They became meta, right? It's like, but my point is if it
link |
01:41:48.000
makes you angry. That's beautiful. Yeah. But the thing is, this is me saying this, I think your
link |
01:41:55.360
thesis is very strong and correct, has elements of correctness, but I still personally put some
link |
01:42:04.000
blame on individuals. Of course. I think you said, Elon looking up, there's something about
link |
01:42:11.360
childlike wonder to him, like to his personality, his character, something about, I think more so
link |
01:42:18.880
than others, where people can trust them. And there's, I don't know, Sundar Pichai is an example
link |
01:42:24.320
of somebody who's like, there's some, it's hard to put into words, but there's something about
link |
01:42:30.560
the human being, where he's trustworthy. Yeah. He's human in a way that connects to us. And the
link |
01:42:38.320
same with Satya Nadella, I mean, some of these folks, something about us is drawn to them,
link |
01:42:47.360
even when they're flawed, even like, so like your thesis really holds up for Steve Jobs,
link |
01:42:52.960
because I think people didn't like Steve Jobs, but like he delivered products, and then they
link |
01:42:58.080
fell in love every time. I guess you could say that the CEO, the leader is also a product.
link |
01:43:05.600
And if they keep delivering a product that people like, by being public and saying
link |
01:43:10.240
things that people like, that's also a way to make people happy. But from a social network
link |
01:43:15.600
perspective, it makes me wonder how difficult it is to explain to people why certain things
link |
01:43:21.520
happen, like to explain machine learning, to explain why something like the woke mob effect happens,
link |
01:43:32.320
or why certain kinds of bullying happen, which is like, it's human nature combined with
link |
01:43:39.840
algorithms, and it's very difficult to control how the spread of quote unquote misinformation
link |
01:43:44.560
happens. It's very difficult to control for that. And so you try to decelerate certain parts and
link |
01:43:50.400
you create more problems than you solve. And anything that looks at all like censorship can
link |
01:43:55.840
create huge amounts of problems, as it's a slippery slope. And then you have to inject humans to
link |
01:44:01.200
oversee the machine learning algorithms. And anytime you inject humans into the system,
link |
01:44:05.360
it's gonna create a huge number of problems. And I feel like it's up to the leader to communicate
link |
01:44:09.360
that effectively to be transparent. First of all, design products that don't have those problems.
link |
01:44:15.040
And second of all, when they have those problems, to be able to communicate with them, I guess that's
link |
01:44:19.520
all to say, when you run a social network company, your job is hard. Yeah, I will say the one element
link |
01:44:27.360
that you haven't named that I think you're getting at is just bedside manner, which Steve Jobs,
link |
01:44:34.400
I never worked for him, I never met him in person, had like an uncanny ability in public to have
link |
01:44:42.720
bedside manner. I mean, some of the best clips of Steve Jobs from like, I would say,
link |
01:44:47.520
maybe the 80s, when he's on the stage and getting questions from the audience about life,
link |
01:44:54.240
and he'll take this question that is like, how are you going to compete with blah? And it's super
link |
01:44:58.960
boring. And I don't even know the name of the company. And his answer is as if you had just
link |
01:45:03.680
asked your grandfather the meaning of life. Yeah. And you sit there and you're just like,
link |
01:45:08.800
what? Like, and there's that bedside manner. And if you lack that, or if that's just not
link |
01:45:14.880
intuitive to you, I think that it can be a lot harder to gain the trust of people and then add
link |
01:45:21.920
on top of that, missteps of companies, right? It's, I don't know if you have any friends from
link |
01:45:28.480
the past where like, maybe they crossed you once or like, maybe you get back together and your
link |
01:45:33.520
friends again, but you just never really forget that thing. It's human nature not to forget.
link |
01:45:38.720
I'm Russian. You crossed me once. We solve the problem. So my point is, there is,
link |
01:45:47.520
humans don't forget. And if there are times in the past where they feel like they don't trust
link |
01:45:51.520
the company or the company hasn't had their back, that is really hard to earn back,
link |
01:45:56.960
especially if you don't have that bedside manner. And again, like, I'm not attributing this
link |
01:46:01.840
specifically to Mark because I think a lot of companies have this issue where one, you have
link |
01:46:09.120
to be trustworthy as a company and live by it and live by those actions. And then two,
link |
01:46:13.280
I think you need to be able to be really relatable in a way that's very difficult if you're worth
link |
01:46:18.640
like what these people are. It's really hard. Yeah. Jack does a pretty good job of this
link |
01:46:24.640
by being a monk. But also, I think Jack eschews attention. Like he's not out there, almost on
link |
01:46:31.440
purpose. He's just working hard, doing Square, right? I literally shared a desk like this
link |
01:46:36.960
with him at Odeo. I mean, he's this normal guy who likes painting. I remember he would leave
link |
01:46:42.400
early on like Wednesdays or something to go to like a painting class. And he's creative. He's
link |
01:46:48.640
thoughtful. I mean, money makes people like more creative and more thoughtful, like extreme versions
link |
01:46:54.000
of themselves, right? And this was a long, long time ago. You mentioned that he asked you to do
link |
01:47:00.560
some kind of JavaScript thing. We were working on some JavaScript together. That's hilarious.
link |
01:47:06.080
Like pre Twitter, early Twitter days, you and Jack Dorsey are in a room together talking about
link |
01:47:11.600
JavaScript, solving some kind of menial problem. Terrible problems. Yeah. I mean, not terrible,
link |
01:47:16.480
just like boring, boring widget problem. I think it was the Odeo widget we were working on at the
link |
01:47:20.560
time. I'm surprised anyone paid me to be in the room as an intern because I didn't really provide
link |
01:47:25.760
any value. I'm very thankful to anyone who included me back in the day. It was very helpful. So thank
link |
01:47:32.480
you for listening. I mean, this is Odeo, the precursor to Twitter. First of all, did you have
link |
01:47:39.680
any anticipation that this Jack Dorsey guy could be also head of a major social network?
link |
01:47:45.280
And second, did you learn anything from the guy that like, do you think it's a coincidence that
link |
01:47:52.480
you two were in the room together? And is it a coincidence, meaning like, why does the world play
link |
01:48:00.480
its game in a certain way where these two founders of social networks? I don't know. It's so weird,
link |
01:48:05.360
right? Like, I mean, it's also weird that Mark showed up, you know, in our fraternity, my sophomore
link |
01:48:13.360
year, and we got to know each other then like long before Instagram, it's, it's a small world. But
link |
01:48:21.840
let me tell a fun story about Jack. We're at Odeo. And I don't know, I think Ev was feeling like
link |
01:48:28.480
people weren't working hard enough or something. And I can't remember exactly what he created this
link |
01:48:35.840
thing where every Friday, I don't know if it was every Friday, I only remember this happening once.
link |
01:48:42.080
But he had this statuette. It's like of Mary. And on the bottom, it's hollow, right? And I
link |
01:48:51.120
remember on a Friday, he decided he was going to let everyone vote for who had worked the hardest
link |
01:48:57.440
that week. We all voted closed ballot, right? We all put in a bucket. And he tallied the votes.
link |
01:49:03.280
And then whoever got the most votes, as I recall, got the statuette. And in the statuette was a
link |
01:49:10.400
thousand bucks. Or as I recall, there was a thousand bucks. It might have been a hundred bucks,
link |
01:49:14.480
but let's call it a thousand. It's more exciting that way. It felt like a thousand. It did to me
link |
01:49:18.080
for sure. I actually got two votes. I was very happy. We were a small company, but as the intern,
link |
01:49:22.480
I got at least two votes. So everybody knew how many votes they got individually? Yeah. Yeah. And
link |
01:49:26.560
I think it was one of these self accountability things. Anyway, I remember Jack just getting like
link |
01:49:30.960
the vast majority of votes from everyone. And I remember just thinking like, like, I couldn't
link |
01:49:37.360
imagine he would become what he'd become and do what he would do. But I had a profound respect that
link |
01:49:43.520
the new guy, who I really liked, worked that hard. And you could see his dedication, even then,
link |
01:49:50.880
and that people respected him. That's the one story that I remember of him, like working with him
link |
01:49:55.920
specifically from that summer. Can I take a small tangent on that? Of course. There's kind of a
link |
01:49:59.920
pushback in Silicon Valley a little bit against hard work. Can you speak to the sort of the
link |
01:50:06.880
thing you admired, seeing the new guy working so hard? What is the value of that thing
link |
01:50:12.640
in a company? See, this is like, just to be very frank, it drives me nuts. I saw this really funny
link |
01:50:20.320
video on TikTok. Was it on TikTok? It was like, I'm taking a break from my mental health to work
link |
01:50:25.040
on my career. I thought that was funny. So it's like, oh, it's usually kind of phrased the opposite way
link |
01:50:32.080
often, right? Okay, so a couple of things. I have worked so hard to do the things that I did,
link |
01:50:43.600
like Mike and I lost years off of our lives, staying up late, figuring things out, the stress
link |
01:50:50.720
that comes with the job. I have a lot more gray hair now than I did back then. It requires an
link |
01:50:55.520
enormous amount of work. And most people aren't successful, right? But even the ones that do
link |
01:51:00.480
don't skate by. I am okay if people choose not to work hard, because I don't actually think
link |
01:51:08.240
there's anything in this world that says you have to work hard. But I do think that great
link |
01:51:14.160
things require a lot of hard work. So there's no way you can expect to change the world without
link |
01:51:18.800
working really hard. And by the way, even changing the world, the folks that I respect the most have
link |
01:51:24.320
nudged the world in a slight direction, slight, very, very slight. Even if Elon accomplishes
link |
01:51:33.280
all the things he wants to accomplish, he will have nudged the world in a slight direction.
link |
01:51:38.320
But it requires an enormous amount. There was an interview with him where he was just like,
link |
01:51:42.960
he was interviewed, I think, at the Tesla factory, and he was like, work is really hard. This is
link |
01:51:47.760
actually unhealthy. And I can't recall the exact words, but he was visibly shaken about how hard he had
link |
01:51:53.600
been working. And he was like, this is bad. And unfortunately, I think to have great outcomes,
link |
01:51:57.520
you actually do need to work at three standard deviations above the mean. But there's nothing
link |
01:52:02.320
saying that people have to go for that. See, the thing is, but what I would argue,
link |
01:52:06.320
this is my personal opinion, is nobody has to do anything first of all. Exactly.
link |
01:52:10.320
They certainly don't have to work hard. Exactly. But I think hard work in a company should be
link |
01:52:17.600
admired. I do too. And, like, you shouldn't feel good about yourself for not
link |
01:52:27.520
working hard. So for example, I don't have to work out. I don't have to run. I hate running.
link |
01:52:35.760
But I certainly don't feel good if I don't run because I know for my health, there's certain
link |
01:52:40.640
values, I guess, is what I'm trying to get at. There are certain values that you have in life. It feels
link |
01:52:44.800
like there's certain values that companies should have. And hard work is one of the things
link |
01:52:49.840
I think that should be admired. I often ask this kind of silly question,
link |
01:52:54.400
just to get a sense of people, like if I'm hiring and so on. I just ask if they think it's better
link |
01:53:00.720
to work hard or work smart. It was helpful for me to get a sense of people from that.
link |
01:53:07.200
Because you think like the right... The answer is both.
link |
01:53:09.600
What's that? The answer is both.
link |
01:53:10.720
The answer is both. I usually try not to give them that, but sometimes I'll say both if that's
link |
01:53:15.760
an option. But a lot of people kind of... A surprising number will say work smart. And they
link |
01:53:22.240
are usually people who don't know how to work smart and they're literally just lazy. Not just
link |
01:53:30.080
that. There are two effects behind that. One is laziness and the other is ego. When you're younger
link |
01:53:38.080
and you say it's better to work smart, it means you think you know what it means to work smart
link |
01:53:45.040
at this early stage. To me, people that say work hard or both, they have the humility to understand
link |
01:53:51.600
like I'm going to have to work my ass off because I'm too dumb to know how to work smart. And people
link |
01:53:57.120
who are self critical in this way in some small amount. You have to have some confidence. But
link |
01:54:02.800
if you have humility, that means you're going to actually eventually figure out what it means to
link |
01:54:06.800
work smart. And then to actually be successful, you should do both.
link |
01:54:11.120
So I have a very particular take on this, which is that
link |
01:54:17.360
no one's forcing you to do anything; all choices have consequences. So if you major in,
link |
01:54:24.720
I don't know, theoretical literature. I don't even know if that's a major. I'm just making
link |
01:54:29.280
something up. As opposed to regular literature? Applied literature.
link |
01:54:33.360
Yeah. Think about like theoretical Spanish lit from the 14th century. Like just make up your
link |
01:54:40.400
esoteric thing. And then the number of people I went to Stanford with who get out in the world
link |
01:54:44.880
and they're like, wait, what? I can't find a job. Like no one wants a theoretical, like
link |
01:54:50.240
there are plenty of counterexamples of people who have majored in esoteric things and gone on to
link |
01:54:53.760
be very successful. So I just want to be clear. It's not about the major, but every choice you make,
link |
01:54:58.400
whether it's to have kids, like I love my children. It's so awesome to have two kids. And it is so
link |
01:55:05.200
hard to work really hard and also have kids. It's really hard. And there's a reason why
link |
01:55:10.800
certain very successful people like don't have or not successful, but people who run very, very
link |
01:55:15.600
large companies or startups have chosen not to have kids for a while or chosen not to like
link |
01:55:20.000
prioritize them. Everything's a choice. And like I choose to prioritize my children because like
link |
01:55:25.280
I want to do that. So everything's a choice. Now, once you've made that choice, I think it's
link |
01:55:33.440
important that the contract is clear, which is to say, let's imagine you were joining a new startup.
link |
01:55:40.640
It's important that that startup communicate that like the expectation is like, we're all
link |
01:55:44.960
working really, really hard right now. You don't have to join the startup. But like if you do,
link |
01:55:49.520
just know like you're, it's almost as if you join, I don't know, pick your, pick your,
link |
01:55:56.160
pick your like sports team, like let's go back to the Yankees for a second. You want to join the
link |
01:56:01.440
Yankees, but you don't really want to work that hard. You don't really want to do batting practice
link |
01:56:05.600
or pitching practice or whatever for your position, right? That to me is wacko. And that's actually
link |
01:56:12.000
the world that it feels like we live in and tech sometimes where people both want to work for the
link |
01:56:16.640
Yankees because it pays a lot, but like don't actually want to work that hard. That I don't
link |
01:56:22.080
fully understand because if you sign up for some of these things, just sign up for it,
link |
01:56:26.720
but it's okay if you don't want to sign up for it. There's so many wonderful careers in this world
link |
01:56:31.600
that don't require 80 hours a week. But when I read about companies going to like four day work
link |
01:56:35.760
weeks and stuff, I just like, I chuckle because I can't get enough done with a seven day week.
link |
01:56:40.960
I don't know how, and people will say, oh, you're just not working smart. And it's like,
link |
01:56:45.120
no, I work pretty smart, I think in general, like I wouldn't have gotten to this point
link |
01:56:49.200
if I hadn't done some amount of working smart. And there is balance, though. So I used to be
link |
01:56:54.480
like a pretty big cyclist. I don't do it much anymore just because of kids and like prioritizing
link |
01:56:59.040
other things, right? But one of the most important things to learn as a cyclist is to take a rest
link |
01:57:04.880
day. But to me and to cyclists, like resting is a function of optimizing for the long run.
link |
01:57:11.760
It's not like a thing that you do for its own merits. It's actually like, if you don't rest,
link |
01:57:17.120
your muscles don't recover, and then you're just not as like, you're not training as efficiently.
link |
01:57:20.720
You should probably, the successful people I've known in terms of athletes, they hate rest days,
link |
01:57:26.320
but they know they have to do it for the long term. Absolutely. They think their opposition
link |
01:57:30.560
is getting stronger and stronger. And that's the feeling, but you know, it's the right thing.
link |
01:57:35.120
And usually you need a coach to help you. Yeah, totally. So I mean, I use this thing called
link |
01:57:39.760
Training Peaks. And it's interesting because it actually mathematically shows like where you are
link |
01:57:44.640
on the curve and all this stuff. But you have to, like you have to have that rest, but it's a function
link |
01:57:50.720
of going harder for longer. Again, it's this reinforcement learning, like planning for the aggregate
link |
01:57:55.760
and the long run. But a lot of people will hide behind laziness by saying that they're trying
link |
01:57:59.440
to optimize for the long run and they're not, they're just not working very hard. But again,
link |
01:58:03.520
you don't have to sign up for it. It's totally cool. Like I don't think less of people for like
link |
01:58:07.680
not working super hard, just like don't sign up for things that require working super hard.
link |
01:58:11.680
And some of that requires the leadership to have the guts, the boldness, to communicate
link |
01:58:16.400
effectively at the very beginning. I mean, sometimes I think most of the problems arise
link |
01:58:20.640
in the fact that the leadership is kind of hesitant to communicate this socially difficult
link |
01:58:30.480
truth of what it takes to be at this company. And so they kind of say, hey, come with us,
link |
01:58:36.320
there's free snacks, unlimited vacation. And Ray at Bridgewater is always fascinating
link |
01:58:44.080
because people, it's been called like a cult on the outside or cult ish. But what's fascinating is
link |
01:58:50.560
like they just don't give on their principles. They're like, listen, this is what it's like
link |
01:58:54.400
to work here. We record every meeting. We're brutally honest. And that's not going to feel
link |
01:58:59.920
right to everyone. And if it doesn't feel right to you, totally cool. Just go work somewhere else.
link |
01:59:04.640
But if you work here, you are signing up for this. And that's that's been fascinating to me
link |
01:59:10.320
because it's honesty up front. It's a system in which you operate. And if it's not for you,
link |
01:59:16.560
like no one's forcing you to work there, right? I actually did. So I did a conversation with him
link |
01:59:22.400
and kind of got stuck in a funny moment, which is at the end, I asked him to give me honest
link |
01:59:28.880
feedback of how I did on the interview. And I was like, I don't think so. He was super nice.
link |
01:59:36.160
He asked me, he's like, well, tell me, did you accomplish what you're hoping to accomplish?
link |
01:59:41.840
I was like, that's not, that's not, I'm asking you as an objective observer of two people talking,
link |
01:59:49.440
how did we do today? And then he's like, well, he gave me this politician's answer. Well, I feel
link |
01:59:56.000
like we've accomplished successful communication of ideas, which is, I'd love to spread
link |
02:00:01.840
some of the ideas that are in Principles and so on. Back to my original point, it's really hard
link |
02:00:09.600
to get, even for Dalio. It's really hard to give feedback. And one of the other things I learned
link |
02:00:14.880
from him and just people in that world is like, man, humans really like to pretend like they've
link |
02:00:22.480
come to some kind of meeting of the minds. Like if there's conflict, if you and
link |
02:00:28.240
I have conflict, it's always better to meet face to face, right? Or on the phone, Slack is not great,
link |
02:00:34.800
right? Email is not great, but face to face, what's crazy is you and I get together and we
link |
02:00:39.040
actively try to even if we're not actually solving the conflict, we actively try to paper over the
link |
02:00:44.800
conflict. Oh, yeah, it didn't really bother me that much. Oh, yeah, I'm sure you didn't mean
link |
02:00:49.840
it. I'm sure. But like, no, in our minds, we're still there. Yeah. So this is one of the things
link |
02:00:55.280
that as a leader, you always have to be digging, especially as you said straight to the conflict.
link |
02:01:00.720
Yeah, as you ascend, no one wants to tell you you're crazy. No one wants to tell you your
link |
02:01:05.120
idea's bad. And you're like, oh, I'm going to be a leader, and the idea is, well, I'm just
link |
02:01:10.800
going to ask people. But they won't tell you. So like you have to look for the markers, knowing that
link |
02:01:17.120
literally just people aren't going to tell you along the way and be paranoid. I mean, you asked
link |
02:01:22.000
about selling, you know, the company, I think one of the biggest differences between me and a lot of
link |
02:01:26.320
other entrepreneurs is like, I wasn't completely confident we could do it. Like we could be alone
link |
02:01:32.640
and actually be great. And if any entrepreneur is honest with you, they also feel
link |
02:01:38.320
that way. But a lot of people are like, well, I have to be cocky and just say, I can do this on
link |
02:01:43.200
my own. We're going to be fine. We're going to crush everyone. Some people do say that. And then
link |
02:01:48.400
it's not right, and they fail. But being honest in that moment with yourself,
link |
02:01:55.360
with those close to you. And also, you talked about the personality of leaders and who resonates
link |
02:02:01.920
and who doesn't. It's rare that I see leaders be vulnerable, rare. And one thing I tried to do
link |
02:02:11.840
at Instagram, at least internally, was like, say when I screwed up, and like, point out how I was
link |
02:02:18.240
wrong about things and point out where my judgment was off. Everyone thinks they have to bat 1000.
link |
02:02:24.960
Right? Like, that's crazy. The best quant hedge funds in the world bat 50.001%.
link |
02:02:31.600
They just take a lot of bets, right? Renaissance. They might bat 51%, right? But
link |
02:02:37.840
holy hell, like, the question isn't, are you right every single time and you have to seem invincible?
link |
02:02:46.240
The question is, how many at bats do you get? And are you better on average?
link |
02:02:52.720
Right? With enough bets and enough at bats, your aggregate can be very high.
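As a rough illustration of that point about edge and volume (the 51% hit rate and the bet counts below are made-up numbers for the sketch, not Renaissance's actual figures), a small Python example shows how a barely-better-than-coin-flip win rate turns into a reliably positive aggregate once the number of bets is large:

```python
import random

def net_wins(hit_rate=0.51, num_bets=100_000, seed=0):
    """Wins minus losses over many independent unit bets at a given hit rate."""
    rng = random.Random(seed)
    wins = sum(1 for _ in range(num_bets) if rng.random() < hit_rate)
    return wins - (num_bets - wins)

# A 51% edge is indistinguishable from luck over 100 bets,
# but compounds into a clearly positive aggregate over 100,000.
for n in (100, 10_000, 100_000):
    print(n, net_wins(num_bets=n))
```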
link |
02:02:58.480
I mean, Steve Jobs was wrong about a lot of stuff. The Newton was too early, right? NeXT, not quite
link |
02:03:04.560
right. There was even a time when he said like, no one will ever want to watch a video on the iPod.
link |
02:03:13.280
Totally wrong. But who cares if you come around and realize your mistake and fix it?
link |
02:03:18.400
It becomes just like you said, harder and harder when your ego grows and the number of people
link |
02:03:22.320
around you that say positive things towards you grows. And I actually think it's really valuable
link |
02:03:27.920
that, like, let's imagine a counterfactual where Instagram became worth like $300 billion or something
link |
02:03:35.040
crazy, right? I kind of like that my life is relatively normal now. When I say relatively,
link |
02:03:41.200
you get what I mean. I'm not making a claim that I live a normal life. But like, I certainly don't
link |
02:03:45.280
live in a world where there are like 15 Sherpas following me, like fetching me water or whatever.
link |
02:03:51.040
Like, that's not how it works. I actually like that I have a sense of humility of like,
link |
02:03:56.480
I may not found another thing that's nearly as big. So I have to work twice as hard.
link |
02:04:01.040
Or I have to like learn twice as much. I have to read my, we haven't talked about machine
link |
02:04:06.800
learning yet. But my favorite thing is all these like famous, you know, tech guys who have worked
link |
02:04:14.240
in the industry, pontificating about the future of machine learning and how it's going to kill us
link |
02:04:18.160
all. And like, I'm pretty sure they've never tried to build anything with machine learning
link |
02:04:23.680
themselves. Yes. So there's a nice line between people that actually build stuff with machine learning,
link |
02:04:29.520
like actually program something, or at least understand some of those fundamentals and the
link |
02:04:33.840
people that are just saying philosophical stuff, or journalists and so on. It's an
link |
02:04:39.680
interesting line to walk because the people who program are often not philosophers.
link |
02:04:44.720
Or don't have the attention. They can't write an op ed for the Wall Street Journal like it
link |
02:04:48.160
doesn't work. So like, it's nice to be both a little bit, like to have elements of both.
link |
02:04:52.720
My point is the fact that I have to learn stuff from scratch or that I choose to or like,
link |
02:04:58.000
it's humbling. Yeah. I mean, again, I have a lot of advantages. I like,
link |
02:05:04.240
but my point is it's awesome to be back in a game where you have to fight. That is, that's fun.
link |
02:05:13.040
So being humble, being vulnerable, it's an important aspect of a leader. And I hope it
link |
02:05:18.480
serves me well, but like, I can't fast forward 10 years to know. I've just, that's my game plan.
link |
02:05:24.000
Before I forget, I have to ask you one last thing on Instagram.
link |
02:05:28.800
What do you think about the whistleblower Frances Haugen recently coming out and saying that
link |
02:05:33.840
Facebook is aware of Instagram's harmful effect on teenage girls as per their own
link |
02:05:41.120
internal research studies on the matter? What do you think about this
link |
02:05:44.160
baby of yours, Instagram, being under fire now, as we've been talking about under the leadership of Facebook?
link |
02:05:52.880
You know, I often question, where does the blame lie? Is the blame at the people that originated the
link |
02:06:02.400
network, me, right? Is the blame at like the decision to combine the network with another
link |
02:06:10.880
network with a certain set of values? Is the blame at how it gets run after I left? Like,
link |
02:06:19.920
is it the driver? Is it the car? Right? Is it that someone enabled these devices in the first place
link |
02:06:27.840
if you go to an extreme, right? Or is it the users themselves, just human nature? Is it just the way
link |
02:06:36.080
of human nature? Sure. And the idea that we're going to find a mutually exclusive answer here
link |
02:06:41.360
is crazy. There's not one place; it's a combination of a lot of these things. And then the question
link |
02:06:46.240
is, is it true at all? I'm not actually saying that it's not true or that it's true, but there's
link |
02:06:53.440
always more nuance here. Do I believe that social media has an effect on young people? Well, it's
link |
02:07:00.400
got to, they use it a lot. And I bet you there are a lot of positive effects and I bet you there
link |
02:07:04.720
are negative effects, just like any technology. And where I've come to in my thinking on this is
link |
02:07:10.240
that I think any technology has negative side effects. The question is, as a leader, what do
link |
02:07:15.600
you do about them? And are you actively working on them or do you just like not really believe in
link |
02:07:19.600
them? If you're a leader that sits there and says, well, we're going to put an enormous amount of
link |
02:07:24.000
resources against this, we're going to acknowledge when there are true criticisms, we're going to
link |
02:07:29.920
be vulnerable and that we're not perfect. And we're going to go fix them and we're going to be held
link |
02:07:34.400
accountable along the way. I think that people generally really respect that. But I think that
link |
02:07:42.400
where Facebook, I think, has had issues in the past is where they say things like,
link |
02:07:46.640
I can't remember what Mark said about misinformation during the election. There was
link |
02:07:50.240
that famous quote where he's like, it's pretty crazy to think that Facebook had anything to
link |
02:07:54.720
do with this election. That was something like that quote. And I don't remember what stage he was
link |
02:07:58.400
on. But that did not age well. You have to be willing to say, well, maybe there's something
link |
02:08:07.760
there. And wow, I want to go look into it and truly believe it in your gut. But if people look at
link |
02:08:13.520
you and how you act and what you say, and don't believe you truly feel that way...
link |
02:08:18.560
It's not just the words you say, but how you say them, and that people believe you actually feel
link |
02:08:22.720
the pain of having caused any suffering in the world. So to me, it's much more about your
link |
02:08:28.080
actions and your posture post event than it is about debugging the why. I don't know this research.
link |
02:08:36.640
It was written well after I left. Is it the algorithm? Is it the explore page? Is it the
link |
02:08:42.880
people you might know unit connecting you to ideas that are dangerous? I really don't know.
link |
02:08:51.120
So we'd have to have a much deeper dive to understand where the blame lies.
link |
02:08:56.400
What's very unpleasant to me to consider now, I don't know if this is true, but
link |
02:09:00.560
to consider the very fact that there might be some complicated games being played here.
link |
02:09:06.880
For example, you know, as somebody, I really love psychology. And I love it enough to know
link |
02:09:12.800
that the field is pretty broken. In the following way, it's very difficult to study human beings
link |
02:09:18.080
well at scale. Because the questions you ask affect the results, you can basically
link |
02:09:23.520
get any results you want. And so you have an internal Facebook study that asks some question
link |
02:09:28.720
of which we don't know the full details. And there's some kind of analysis. But that's just
link |
02:09:33.280
the one little tiny slice into some much bigger picture. And so you can have thousands of employees
link |
02:09:39.840
at Facebook, one of them comes out and picks whatever narrative, knowing that they'll become famous,
link |
02:09:46.320
coupled with the other really uncomfortable thing I see in the world, which is journalists seem to
link |
02:09:52.560
understand they get a lot of clickbait attention from saying something negative about social
link |
02:09:57.520
networks, certain companies. Like, they even get some clickbait stuff about Tesla or about,
link |
02:10:06.000
especially when it's like, when there's a public famous CEO type of person, if they get a lot of
link |
02:10:12.160
views on the negative, not the positive, the positive they'll get. I mean, it actually goes to
link |
02:10:16.880
the thing you were saying before, if there's a hot sexy new product, that's great to look forward
link |
02:10:21.840
to they get positive on that. But absent a product, it's nice to have like the CEO messing up with
link |
02:10:29.120
some kind of way. And so couple that with the whistleblower, and with this whole dynamic
link |
02:10:36.720
of journalism and so on, you know, with The Social Dilemma being a really popular documentary, it's
link |
02:10:42.160
like, all right, my concern is there's deep flaws in human nature here, in terms of things we need
link |
02:10:49.840
to deal with, like the nature of hate, bullying, all those kinds of things. And then there's people
link |
02:10:57.200
who are trying to use that potentially to become famous and make money off of blaming others for
link |
02:11:05.680
causing more of the problem as opposed to helping solve the problem. So I don't know what to think.
link |
02:11:10.080
I'm not saying this is like, I'm just uncomfortable with, I guess, not knowing what to think about
link |
02:11:15.200
any of this. Because a bunch of folks I know that work at Facebook, on the machine learning side,
link |
02:11:20.480
so Yann LeCun, I mean, they're quite upset by what's happening, because there's a lot of
link |
02:11:25.920
really brilliant good people inside Facebook, they're trying to do good. And so like, all of
link |
02:11:31.520
this press... Yann is one of them. And he has an amazing team of machine learning researchers,
link |
02:11:35.600
like, he's really upset with the fact that people don't seem to understand that this,
link |
02:11:41.600
the portrayal, does not represent the full nature of the efforts going on at Facebook.
link |
02:11:46.400
So I don't know what to think about that. Well, you just, I think, very
link |
02:11:50.800
helpfully explained the nuance of the situation and why it's so hard to understand. But a couple
link |
02:11:56.640
things. One is, I think I have been surprised at the scale with which some product manager can do
link |
02:12:12.720
an enormous amount of harm to a very, very large company by releasing a trove of documents. Like,
link |
02:12:19.360
I think I read a couple of them when they got published, and I haven't even spent any time
link |
02:12:23.120
going deep. Part of it's like, I don't really feel like reliving a previous life. But
link |
02:12:29.680
wow, like talk about challenging the idea of open culture and like what that does to
link |
02:12:35.920
Facebook internally, if Facebook was built, like I remember, like my office, we had this,
link |
02:12:43.680
like, no visitors rule around my office, because we always had like confidential stuff up on the
link |
02:12:47.600
walls, and everyone was super angry, because they're like, that goes against our culture of transparency.
link |
02:12:52.720
And like Mark's in the fish cube or whatever they call it, the aquarium, I think they called it,
link |
02:12:57.680
where like literally anyone could see what he was doing at any point. And I don't know. I mean,
link |
02:13:03.440
other companies like Apple have been quiet slash locked down, Snapchat's the same way, for a reason.
link |
02:13:10.160
And I don't know what this does to transparency on the inside of startups that value that. I think
link |
02:13:17.040
that it's a seminal moment. And you can say, well, you should have nothing to hide, right?
link |
02:13:22.160
But to your point, you can pick out documents that show anything, right? But I don't know. So
link |
02:13:28.960
what happens to transparency inside of startups and the culture that
link |
02:13:34.400
startups or companies in the future will grow? Like the startup of the future that becomes
link |
02:13:38.240
the next Facebook will be locked down. And what does that do? Right? So that's part one. Part two.
link |
02:13:44.640
Like, I don't think that you could design a more like well orchestrated handful of
link |
02:13:55.040
events, from the 60 Minutes interview to releasing the documents in the way that they were released
link |
02:14:01.920
at the right time. That takes a lot of planning and partnership. And it seems like she has a
link |
02:14:07.520
partner at some firm, right, that probably helped a lot with this. But man, on a personal level,
link |
02:14:15.360
if you're her, you'd have to really believe in what you are doing, really believe in it,
link |
02:14:22.160
because you are personally putting your ass on the line, right? Like, you've got a very large
link |
02:14:27.920
company that doesn't like enemies, right? It takes a lot of guts. And I don't love these
link |
02:14:38.240
conspiracy theories about, like, oh, she's being financed by some person or people. Like, I don't
link |
02:14:42.720
love them because that's like the easy thing to say. I think that the Occam's razor here is like
link |
02:14:48.560
someone thought they were doing something wrong, and was like very, very courageous. And I don't
link |
02:14:56.160
know if courageous is the word, but like, so without getting into like, is she a martyr? Is she
link |
02:15:01.920
courageous? Is she right? Like, let's put that aside for a second. Then there are the documents
link |
02:15:06.800
themselves. They say what they say. To your point, a lot of the things that like people have been
link |
02:15:12.480
worried about are already in the documents, or they've already been said externally. And
link |
02:15:19.280
I don't know. I'm just like, I'm thankful that I am focused on new things with my life.
link |
02:15:24.160
Well, let me just say, I just think it's a really hard problem that probably Facebook and Twitter
link |
02:15:30.720
are trying to solve. I'm actually just fascinated by how hard this problem is.
link |
02:15:35.200
There are fundamental issues at Facebook in tone and in an approach of how product gets built and
link |
02:15:41.680
the objective functions. And organizations are not people. So Yann and FAIR,
link |
02:15:49.600
right? Like, there are a lot of really great people who like literally just want to push
link |
02:15:53.200
reinforcement learning forward. They literally just want to teach a robot to touch, feel, lift,
link |
02:15:59.040
right? Like, they're not thinking about political misinformation, right? But there's a strong
link |
02:16:05.600
connection between what funds that research and an enormously profitable machine that has tradeoffs.
link |
02:16:13.280
And one cannot separate the two. You are not completely separate from the system.
link |
02:16:20.640
So, I agree. It can feel really frustrating to feel if you're internal there, that you're working
link |
02:16:26.960
on something completely unrelated and you feel like your group's good. I can understand that.
link |
02:16:31.840
But there's some responsibility still. You have to acknowledge, it's like the right
link |
02:16:35.200
value thing. You have to look in the mirror and see if there's problems and you have to fix those
link |
02:16:39.040
problems. Yeah. You've mentioned machine learning and reinforcement learning quite a bit. I mean, to me,
link |
02:16:46.640
social networks are one of the exciting places, with recommender systems, where machine learning is
link |
02:16:52.000
applied. Where else in the world, in the space of possibilities over the next 5, 10, 20 years,
link |
02:16:58.000
do you think we're going to see the impact of machine learning when you apply it? On the philosophical
link |
02:17:04.000
level, on a technical level, what do you think? Or within social networks themselves?
link |
02:17:09.040
Well, I think the obvious answer is climate change. Think about how much fuel
link |
02:17:19.680
or just waste there is in energy consumption today because we don't plan accordingly. Because we
link |
02:17:28.320
take the least efficient route. The logistics and stuff, the supply chain and all that kind
link |
02:17:32.880
of stuff. Yeah. Listen, if we're going to fight climate change, one awesome way to do it is figure
link |
02:17:40.240
out how to optimize how we operate as a species and minimize the amount of energy we consume to
link |
02:17:47.280
maximize whatever economic impact we want to have. Because right now, those two are very much tied
link |
02:17:53.040
together. And I don't believe that has to be the case. There's this really interesting paper, you've read
link |
02:17:58.240
it, for people who are listening, on reinforcement learning
link |
02:18:03.920
and energy consumption inside buildings. It's one of the seminal ones. But imagine that at
link |
02:18:08.960
massive scale. That's super interesting. They've done resource planning for servers for peak load
link |
02:18:16.160
using reinforcement learning. I don't know if that was at Google or somewhere else, but like,
link |
02:18:20.240
okay, great, you do it for servers. But what if you could do it for just capacity and general
link |
02:18:24.400
energy capacity for cities and planning for traffic. And of course, there's all the self
link |
02:18:29.840
driving cars. And I don't know, I'm not going to pontificate about crazy ideas using reinforcement
link |
02:18:38.880
learning or machine learning. It's just so clear to me that humans don't think quickly enough.
link |
02:18:43.520
So it's interesting to think about machine learning helping a little bit at scale. So a
link |
02:18:50.000
little bit to a large number of people that has a huge impact. So if you optimize, it's like Google
link |
02:18:55.920
Maps, something like that, trajectory planning. Or what, MapQuest first? Getting here, I looked
link |
02:19:02.240
and it was like, here's the most energy efficient route. And I was like, I'm going to be late. I
link |
02:19:05.680
need to take the fastest route as opposed to unrolling the map. Yeah. Yeah. Like, and that's
link |
02:19:11.520
going to be very inefficient no matter what. I was definitely the other day, like part of the
link |
02:19:15.440
epsilon of epsilon-greedy with Waze, where like I was sent on like a weird route that I could tell
link |
02:19:22.400
they're like, we just need to collect data of this road. Kevin's definitely going to be the
link |
02:19:29.600
guinea pig. And great, now we have the data. Did you feel pride, going through it? I was like,
link |
02:19:35.040
Oh, this is fun. Like now they get data about this weird shortcut. And actually I hit all the
link |
02:19:39.040
green lights, and I'm like, this is a problem. This is bad data. Bad data. They're just
link |
02:19:43.440
going to imagine. I could see you slowing down and stopping at a green light just to give them
link |
02:19:47.840
the right kind of data. But to answer your question, like I feel like that was fairly
link |
02:19:52.320
unsatisfying and it's easy to say climate change. But what I would say is at Instagram,
link |
02:19:58.640
everything we applied machine learning to got better for users and it got better for the company.
link |
02:20:04.640
And I saw the power. I didn't fully understand it as an executive. And I think that's actually one
link |
02:20:09.200
of the issues. And when I say understand, I mean the mathematics of it. Like I understand
link |
02:20:14.640
what it does. I understand that it helps. But there are a lot of executives now that talk about it
link |
02:20:21.680
in the way that they talk about the internet or they talked about the internet like 10 years ago,
link |
02:20:25.440
they're like, we're going to build mobile. And you're like, what does that mean? They're like,
link |
02:20:28.080
we're just going to do mobile. And you're like, okay. So my sense is the next generation of leaders
link |
02:20:33.760
will have grown up having had classes in reinforcement learning, supervised learning, whatever.
link |
02:20:40.080
And they will be able to thoughtfully apply it to their companies and the places that it is needed
link |
02:20:45.040
most. And that's really cool. Because I mean, talk about efficiency gains. That's what excites me
link |
02:20:53.680
the most about it. Yeah. So it's interesting just to get a fundamental, first-principles
link |
02:20:58.800
understanding of certain concepts of machine learning. So supervised learning from an executive
link |
02:21:03.760
perspective, supervised learning, you have to have a lot of humans label a lot of data.
link |
02:21:08.320
So the question there is, okay, can we gather a large amount of data that can be labeled well?
link |
02:21:14.240
And that's the question Tesla asked, like, can we create a data engine that keeps
link |
02:21:20.080
sending an imperfect machine learning system out there? Whenever it fails, it gives us data back,
link |
02:21:24.880
we label it by hand and we send it back, back and forth this way. Then there's Yann
link |
02:21:29.600
LeCun, who's excited about self-supervised learning, where you do much less human labeling.
link |
02:21:36.000
And there's some kind of mechanism for the system to learn it by itself on the human
link |
02:21:41.600
generated data. And then there's the reinforcement learning, which is like,
link |
02:21:46.400
basically applying the AlphaZero technology that allows it, through self-play,
link |
02:21:54.400
to learn how to solve the game of Go and achieve incredible levels at the game of chess.
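A minimal sketch of the supervised-learning "data engine" loop described a moment ago (ship an imperfect model, collect the cases it gets wrong, have humans label them, retrain). Every name here is a hypothetical placeholder, not Tesla's or anyone else's actual API:

```python
def data_engine(model, deploy, collect_failures, human_label, retrain, rounds=5):
    """Hypothetical fleet-learning loop: grow the labeled set from the model's own failures.

    `deploy`, `collect_failures`, `human_label`, and `retrain` are placeholder callables
    standing in for whatever infrastructure actually ships models and gathers data.
    """
    dataset = []
    for _ in range(rounds):
        deploy(model)                          # send the imperfect model out into the world
        hard_cases = collect_failures(model)   # inputs it handled badly come back
        dataset += human_label(hard_cases)     # humans attach ground-truth labels
        model = retrain(model, dataset)        # fit on the growing dataset and repeat
    return model
```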
link |
02:22:02.400
Can you formulate the problem you're trying to solve in a way that's amenable to reinforcement
link |
02:22:06.720
learning? And can you get the right kind of signal at scale? Because you need a lot, a lot of signal.
link |
02:22:11.680
And that's kind of fascinating, to see which part of a social network you can convert
link |
02:22:17.520
into a reinforcement learning problem. The fascinating thing about reinforcement learning,
link |
02:22:22.160
I think, is that we now have learned to apply neural networks to guess the Q function,
link |
02:22:34.160
basically the values for any state-action pair. And that is fascinating because we used to just
link |
02:22:39.840
like, I don't know, have like a linear regression and hope it worked. And that was the fanciest
link |
02:22:43.760
version of it. But now you look at it, I'm like trying to learn this stuff. And I look at it,
link |
02:22:47.920
I'm like, there are like 17 different acronyms of different ways you can try to apply this.
link |
02:22:52.720
No one quite agrees. Like, what's the best? Generally, if you're trying to like build a
link |
02:22:57.520
neural network, there are pretty well-trodden ways of doing that. You use Adam, you use ReLU, you
link |
02:23:04.400
like, there's just like general good ideas. And in reinforcement learning, I feel like the
link |
02:23:09.360
consensus is like, it totally depends. And by the way, it's really hard to get it to converge. And
link |
02:23:16.400
it's noisy. And it like, so there are all these really interesting ideas around building simulators.
link |
02:23:23.440
You know, like, for instance, in self driving, right? Like, you don't want to like, actually
link |
02:23:28.640
have someone getting in an accident to learn that an accident is bad. So you start simulating
link |
02:23:33.360
accidents, simulating aggressive drivers or simulating crazy dogs that run into the street.
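To make the Q-function, epsilon-greedy exploration, and simulator ideas from the last few exchanges concrete, here is a toy tabular Q-learning sketch on a made-up five-state "corridor" simulator. In a real system a neural network (trained with Adam, ReLU activations, and so on) would approximate Q instead of a lookup table; nothing below is anyone's production algorithm, just an illustrative toy:

```python
import random

def toy_q_learning(episodes=500, epsilon=0.1, alpha=0.5, gamma=0.9, seed=0):
    """Tabular Q-learning on a toy corridor: states 0..4, reward 1 for reaching state 4."""
    rng = random.Random(seed)
    n_states, actions = 5, (-1, 1)                  # step left or step right
    q = {(s, a): 0.0 for s in range(n_states) for a in actions}

    for _ in range(episodes):
        state = 0
        while state != n_states - 1:
            # Epsilon-greedy: mostly exploit the best-known action, occasionally explore.
            if rng.random() < epsilon:
                action = rng.choice(actions)
            else:
                action = max(actions, key=lambda a: q[(state, a)])
            next_state = min(max(state + action, 0), n_states - 1)
            reward = 1.0 if next_state == n_states - 1 else 0.0
            # Q-update: nudge the estimate toward reward plus discounted best future value.
            best_next = max(q[(next_state, a)] for a in actions)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q

print(toy_q_learning()[(0, 1)])  # learned value of stepping right from the start state
```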
link |
02:23:39.040
Wow, fascinating, right? Like my mind starts racing. And then the question is, okay, forget
link |
02:23:44.000
about self driving cars, let's talk about social networks. How can you produce a better, more
link |
02:23:51.680
thoughtful experience using these types of algorithms? And honestly, in talking to some
link |
02:23:57.680
of the people that work at Facebook and old Instagrammers, most people are like, yeah,
link |
02:24:02.880
we tried a lot of things, didn't quite ever make it work. And for the longest time, Facebook ads
link |
02:24:07.280
was effectively a logistic regression. Okay, I don't know what it is now. But like, if you look at
link |
02:24:12.320
this paper that they published back in the day, it was literally just a logistic regression.
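For a sense of what "ads as a logistic regression" means in practice, here is a tiny sketch; the feature names and weights are invented toy values, not anything from the actual Facebook paper:

```python
import math

def click_probability(weights, bias, features):
    """Logistic regression: squash a weighted sum of features into a 0-1 click probability."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical feature vector: (ad relevance score, user's historical CTR, hour of day / 24).
weights, bias = [1.5, 3.0, -0.2], -2.0
print(click_probability(weights, bias, [0.7, 0.05, 0.5]))
```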
link |
02:24:15.920
They had a lot of money. So even at these like, extremely large scales, if we are not yet touching
link |
02:24:23.200
what reinforcement learning can truly do, imagine what the next 10 years looks like.
link |
02:24:27.280
Yeah. How cool is that? It's amazing. So I really like the use of reinforcement learning as part
link |
02:24:32.400
of the simulation, for example, like, with self driving cars, it's modeling pedestrians. So
link |
02:24:37.920
the nice thing about reinforcement learning, it can be used to learn agents within the world.
link |
02:24:44.880
So they can learn to behave properly. Like you can teach pedestrians that,
link |
02:24:49.200
like you don't hard code the way they behave. They learn how to behave. And in that same way,
link |
02:24:54.000
I do have a hope. Like, Jack Dorsey talks about healthy conversations. You talked about
link |
02:24:59.840
meaningful interactions, I believe. Yeah.
link |
02:25:01.600
Yeah. Like, simulating interactions. So you can learn how to manage that. It's fascinating. So
link |
02:25:10.000
a world where most of your algorithm development happens in virtual worlds. And then you can really learn
link |
02:25:16.400
how to design the interface, how you design many aspects of the experience, in terms of
link |
02:25:22.480
how you select what's shown in the feed, all those kinds of things that it feels like if you
link |
02:25:27.040
can connect reinforcement learning to that, that's super exciting. Yep. And I think if you have a
link |
02:25:33.520
company and leadership that believe in doing the right things and can apply this technology in the
link |
02:25:37.840
right way, some really special stuff can happen. It is most likely going to be a group of people
link |
02:25:44.240
we've never heard of, starting up from scratch, right? And you asked if like new social networks
link |
02:25:51.360
could be built. I've got to imagine they will be. And whoever starts it, it might be some kids in a
link |
02:25:57.840
garage that took these classes from these people, you, right? Like, and they're building all of
link |
02:26:03.840
these things with this tech at the core. So I'm trying not to be someone who just like throws
link |
02:26:08.320
around reinforcement learning as a buzzword. I truly believe that it is the most cutting edge
link |
02:26:15.440
in what can happen in social networks. And I also believe it's super hard. Like, it's super hard to
link |
02:26:21.120
make it work. It's super hard to do it at scale. It's super hard to find people that truly understand
link |
02:26:25.360
it. So I'm not going to say that like, I think it'll be applied in social networks before we have
link |
02:26:31.680
true self driving. Let me put it that way. We could argue about this for a long time. But yes,
link |
02:26:36.480
I agree with you. I think self driving is way harder than people realize. Oh, absolutely. Let me
link |
02:26:41.360
ask you, in terms of that kid in the garage or those couple of kids in the garage, what advice
link |
02:26:46.000
would you give to them if they want to start a new social network or a business? What advice would
link |
02:26:51.360
you give to somebody with a big dream and a young startup? To me, you have to choose to do something
link |
02:26:59.360
that even if it fails, like it was so fun, right? Like, we never started Instagram knowing it was
link |
02:27:06.800
going to be big. We started Instagram because we loved photography. We loved social networks. I had
link |
02:27:12.640
seen what other social networks had done. And I thought, hmm, maybe we could put a spin on this.
link |
02:27:17.520
But like, nowhere was our fate predestined. Like, it wasn't like, it wasn't written out
link |
02:27:23.120
anywhere that everything was going to go great. And I often think about the counterfactual, like,
link |
02:27:27.040
what if it had not gone well? I would be like, I don't know, that was fun. We raised some money.
link |
02:27:31.120
We learned some stuff. And does it position you well for the next experience? That's the advice
link |
02:27:38.480
that I would give to anyone wanting to start something today, which is like, does this meet with
link |
02:27:43.920
your ultimate goals, not wealth, not fame, none of that, because all of that, by the way, is bullshit.
link |
02:27:49.840
You can get super famous and super wealthy. And I think generally, those are not things that,
link |
02:27:55.520
again, it's easy to say with a lot of money that somehow it's not good to have a lot of money.
link |
02:28:01.680
It's just, I think that complicates life enormously in a way that people don't fully
link |
02:28:06.240
comprehend. So I think it is way more interesting to shoot for, can I make something that people love
link |
02:28:11.520
that provides value in the world, that I love building, that I love working on, right?
link |
02:28:20.000
That's what I would do if I were starting from scratch. And by the way, in some ways,
link |
02:28:22.800
that's what I will do personally, which is like, choose the thing where you get up every morning,
link |
02:28:27.360
you're like, I love this, even when it's painful. Even when it's painful. What about a social network
link |
02:28:35.760
specifically? If you were to imagine, put yourself in the mind, compete against myself. I can't give
link |
02:28:42.640
out ideas. Okay, I got you. No, but it's like high level. You like focus on community. Yeah.
link |
02:28:47.760
Yeah. I said that as a half joke. In all honesty, I think these things are so hard to
link |
02:28:56.560
build that, like, ideas are a dime a dozen. But you talked about keeping it simple.
link |
02:29:02.800
Can I tell you something, which is a liberating idea? Yes. It's three circles and they overlap.
link |
02:29:08.160
One circle is, what do I have experience at slash what am I good at? I don't like saying
link |
02:29:13.280
what am I good at? Because it just like seems like, what do I have experience in? Right? What
link |
02:29:18.240
can I bring to the table? The other circle is, what am I excited about? What gets it? Well,
link |
02:29:22.560
it's just super cool, right? That I want to work on because even when this is hard,
link |
02:29:29.040
I think it's so cool. I want to stick with it. And the last circle is like, what does the world need?
link |
02:29:34.720
And if that circle ain't there, it doesn't matter what you work on, because there are a lot of
link |
02:29:38.240
startups that exist that just no one needs or very small markets need. But if you want to be
link |
02:29:43.840
successful, I think it's if you're good at it,
link |
02:29:49.520
you're passionate about it and the world needs it. I mean, this sounds simple,
link |
02:29:52.880
but not enough people sit down and just think about those circles and think,
link |
02:29:57.040
do these things overlap? And then can I get that middle section? It's small, but can I get that
link |
02:30:00.640
middle section? I think a lot about that personally. And then you have to be really
link |
02:30:07.200
honest about the circle that you're good at and really honest about the circle that the world needs.
link |
02:30:16.960
And also really honest about the passion, what do you actually love? As opposed to some
link |
02:30:23.280
kind of dream of making money, all those kinds of stuff. I literally love doing it.
link |
02:30:26.480
I had a former engineer who decided to start a startup and I was like, are you sure you want
link |
02:30:31.200
to start a company versus like join something else? Because being a coach of an NBA team and playing
link |
02:30:39.040
basketball are two very, very different things. And like not everyone fully understands the
link |
02:30:44.480
difference. I think you can kind of do both. And I don't know, the jury's out on that one because
link |
02:30:52.320
like they're in the middle of it now. So but it's really important to figure out what you're good
link |
02:30:58.000
at, not be full of yourself, like truly look at your track record. What's the saying? It ain't
link |
02:31:04.720
bragging if you can do it. But too many people are delusional and, like, think they're
link |
02:31:13.600
better at things than they actually are or think there's a bigger market than there actually is.
link |
02:31:18.480
When you confuse your passion for things with a big market, that's really scary,
link |
02:31:23.120
right? Like just because you think it's cool doesn't mean that it's a big business opportunity.
link |
02:31:26.960
So like, what evidence do you have? Again, I'm a fairly strict rationalist on this.
link |
02:31:32.400
And like sometimes people don't like working with me because I'm pretty pragmatic about things.
link |
02:31:37.040
Like I'm not, I'm not Elon. Like I don't sit and make bold proclamations about visiting Mars.
link |
02:31:44.000
Like that's just not how I work. I'm like, okay, I want to build this really cool thing that's
link |
02:31:48.480
fairly practical. And I think we could do it and it's in this way. And what's cool though is like
link |
02:31:53.840
that's just my sweet spot. I'm not like, I just, I can't, I can't with a straight face talk about
link |
02:31:59.040
the metaverse. I can't, I just, it's not me. What do you think about Facebook renaming
link |
02:32:05.040
itself to Meta? I didn't mean that as a dig. I just literally mean like I'm fairly,
link |
02:32:09.280
I like to live in the next five years. And like what things can I get out in a year that people
link |
02:32:13.840
will use at scale? And so it's just, again, those circles I think are different for different
link |
02:32:20.640
people, but it's important to realize that like market matters, you being good at it matters
link |
02:32:25.920
and having passion for it matters. Your question, sorry. Well, lastly, on this topic, in terms of
link |
02:32:32.720
funding, is there by way of advice, was funding in your own journey helpful, unhelpful? Like,
link |
02:32:44.880
is there a right time to get funding? Venture funding. Venture funding or anything,
link |
02:32:49.360
borrow some money from your parents? I don't know. But like, is money getting in the way?
link |
02:32:54.720
Does it help? Is the timing important? Is there some kind of wisdom you can
link |
02:32:59.680
give there? Because you were exceptionally successful very quickly.
link |
02:33:06.480
Funding helps as long as it's from the right people. That includes yourself. And I'll talk
link |
02:33:10.880
about myself funding myself in a second, which is like, because I can fund myself doing whatever
link |
02:33:16.400
projects I can do, I don't really have another person putting pressure on me except for myself.
link |
02:33:21.440
And that creates strange dynamics, right? But let's talk about people getting funding from
link |
02:33:27.200
a venture capitalist initially. We raised money from Matt Kohler at Benchmark. He's brilliant,
link |
02:33:34.160
amazing guy, very thoughtful. And he was very helpful early on. But I have stories from entrepreneurs
link |
02:33:40.960
where they raised money from the wrong person or the wrong firm where incentives weren't aligned.
link |
02:33:46.640
They didn't think in the same way. And bad things happened because of that. The boardroom
link |
02:33:51.520
was always noisy. There were fights. We just never had that. Matt was great.
link |
02:33:56.960
I think capital these days is kind of a dime a dozen, right? Like, as long as you're
link |
02:34:02.480
fundable, it seems like there's money out there is what I'm hearing. It's really important that
link |
02:34:09.520
you are aligned and that you think of raising money as hiring someone for your team rather than
link |
02:34:14.160
just taking money, if capital is plentiful, right? It provides a certain amount of pressure
link |
02:34:20.800
to do the right thing that I think is healthy for any startup. And it keeps you real and honest
link |
02:34:25.600
because they don't want to lose their money. They're paid to not lose their money. The problem,
link |
02:34:31.280
maybe I could depersonalize it, but I remember having lunch with Elon. It only happened once.
link |
02:34:36.640
And I asked him, I was trying to figure out what I was doing after Instagram, right? And I asked
link |
02:34:42.880
him something about angel investing. And he looked at me with a straight face and was like,
link |
02:34:46.560
why the F would I do that? I was like, I don't know. You're connected. Seems like,
link |
02:34:52.400
he's like, I only invest in myself. I was like, okay. Not the confidence. I was just like,
link |
02:34:59.600
what a novel idea. It's like, yeah, if you have money, why not just put it against your own bet and
link |
02:35:06.000
like enable your visiting Mars or something, right? Like, that's awesome. Great. But I had
link |
02:35:12.960
never really thought of it that way. But also with that comes an interesting dynamic where
link |
02:35:19.520
you don't actually have people who are going to lose that money telling you, hey, don't do this or
link |
02:35:25.520
hey, you need to face this reality. So you need to create other versions of that truth teller.
link |
02:35:30.640
And whatever I do next, that's going to be one of the interesting challenges is how do you create
link |
02:35:37.680
that truth telling situation. And that's part of why by the way, I think someone like Jack,
link |
02:35:42.800
when you start Square, you have money, but you still bring on partners because I think it
link |
02:35:47.440
creates a truth telling type environment. I'm still trying to figure this out. It's an interesting
link |
02:35:55.680
dynamic. So you're thinking of perhaps launching some kind of venture where you're investing
link |
02:36:00.480
in yourself. I mean, I'm 37 going on 38 next month. I have a long life to live. I'm
link |
02:36:08.080
definitely not going to sit on the beach, right? So I'm going to do something at some point. And
link |
02:36:14.880
I got to imagine I will help fund it, right? So the other way of thinking about this is
link |
02:36:20.080
you could park your money in the S&P, and that's not bad because the S&P has done wonderfully well
link |
02:36:24.320
the last year, right? Or you can invest in yourself. And if you're not going to invest in
link |
02:36:30.400
yourself, you probably shouldn't do a startup. It's kind of the way of thinking about it.
link |
02:36:35.680
And you can invest in yourself the way Elon does, which is basically go all in on this investment.
link |
02:36:41.200
Maybe that's one way to achieve accountability, like, you're kind of screwed if you
link |
02:36:45.360
fail. Yeah, that's yeah. I personally like that. I like burning bridges behind me so that I'm
link |
02:36:53.120
fucked if it fails. Yeah. Yeah. Yeah. It's really important, though. One of the things I think
link |
02:37:01.200
Mark said to me early on that sticks with me that I think is true. We were talking about people who
link |
02:37:08.640
had left like operating roles and started doing venture or something. It was like a lot of people
link |
02:37:12.400
convinced themselves they work really hard, like they think they work really hard and they put on
link |
02:37:16.880
the show and in their minds, they work really hard, but they don't work very hard. There is
link |
02:37:22.560
something about lighting a fire underneath you and burning bridges such that you can't turn back.
link |
02:37:28.160
That I think we didn't talk about this specifically, but I think you're right. You need to have that
link |
02:37:34.880
because there's this self delusion at a certain scale. Oh, I have so many board calls. Oh, we have
link |
02:37:41.840
all these things to figure out. It's like, this is one of the hard parts about being an operator.
link |
02:37:46.960
It's like, there are so many people that have made a lot of money not operating,
link |
02:37:52.640
but operating is just one of the hardest things on earth. It is just so effing hard.
link |
02:37:58.080
It is stressful. It is you're dealing with real humans, not just like throwing capital in and
link |
02:38:02.960
hoping it grows. I'm not undermining the VC mindset. I think it's a wonderful thing and needed and
link |
02:38:08.960
so many wonderful VCs I've worked with. But yeah, like when your ass is on the line and it's your
link |
02:38:15.120
money, it's talk to me in 10 years. We'll see how it goes. Yeah, but like you were saying,
link |
02:38:22.640
that is a source. When you wake up in the morning and you look forward to the day full of challenges,
link |
02:38:28.720
that's also where you can find happiness. Let me ask you about love and friendship. Sure.
link |
02:38:33.200
What's the role in this heck of a difficult journey you have been on of love, of friendship?
link |
02:38:41.440
What's the role of love in the human condition? Well, first things first, the woman I married,
link |
02:38:48.000
my wife, Nicole, there's no way I could do what I do if we weren't together.
link |
02:38:53.680
She had the filter idea. Yeah, exactly. We didn't go over that story.
link |
02:38:59.280
Everything is a partnership, right? To achieve great things, it's not about someone
link |
02:39:04.640
pulling their weight in places. It's not like someone supporting you so that you could do
link |
02:39:09.600
this other thing. It's literally like Mike and I and our partnership as cofounders is fascinating
link |
02:39:18.800
because I don't think Instagram would have happened without that partnership. Either him
link |
02:39:23.520
or me alone, no way. We pushed and pulled each other in a way that allowed us to build a better
link |
02:39:30.800
thing because of it. Nicole, she pushed me to work on the filters early on. And yes,
link |
02:39:36.080
that's exciting. It's a fun story, right? But the truth of it is being able to level with someone
link |
02:39:42.320
about how hard the process is and have someone see you for who you are before Instagram and know
link |
02:39:49.920
that there's a constant you throughout all of this and be able to call you when you're drifting
link |
02:39:54.800
from that, but also support you when you're trying to stick with that. That's true friendship slash
link |
02:40:02.480
love, whatever you want to call it. But also for someone not to care, I remember Nicole saying,
link |
02:40:08.480
hey, I know you're going to do this Instagram thing. I guess it was Bourbon at the time.
link |
02:40:12.720
You should do it because even if it doesn't work, we can move to a smaller apartment
link |
02:40:18.560
and it'll be fine. We'll make it work. How beautiful is that, right?
link |
02:40:23.440
Yeah. That's almost like a superpower. It gives you permission to fail and somehow
link |
02:40:28.000
that actually leads to success. But also, she's the least impressed about Instagram of anyone.
link |
02:40:33.360
She's like, yeah, it's great, but I love you for you. I like that you're a decent cook.
link |
02:40:37.680
That's beautiful. It's beautiful with the Gantt chart and thanks again,
link |
02:40:42.560
which I still think is a brilliant idea. Thank you.
link |
02:40:46.240
Big ridiculous question. Have you, you're old and wise at this stage,
link |
02:40:52.240
so have you discovered meaning to this whole thing? Why the hell are we descendants of
link |
02:40:58.400
apes here on earth? What's the meaning of it? What's the meaning of life?
link |
02:41:01.840
I haven't. The best learning for me has been no matter what level of success you achieve,
link |
02:41:12.480
you're still worried about similar things, maybe on a slightly different scale. You're still
link |
02:41:16.560
concerned about the same things. You're still self conscious about the same things. Actually,
link |
02:41:24.320
that moment going through that is what makes you believe there's got to be more machinery
link |
02:41:30.080
to life or purpose to life. We're all chasing these materialistic things, but you start
link |
02:41:36.800
realizing it's almost like the Truman Show when he gets to the edge and he knocks against it.
link |
02:41:41.920
It's like, what? There's this awakening that happens when you get to that edge that you
link |
02:41:47.120
realize, oh, sure, it's great that we all chase money and fame and success,
link |
02:41:53.520
but you hit the edge and I'm not even claiming I hit an edge like Elon's hit an edge. There's
link |
02:41:58.480
clearly larger scales, but what's cool is you learn that it doesn't actually matter and that
link |
02:42:04.000
there are all these other things that truly matter. That's not a case for working less hard,
link |
02:42:09.120
that's not a case for taking it easy, that's not a case for the four-day workweek. What it is a case
link |
02:42:15.040
for is designing your life exactly the way you want to design it because I don't know. I think
link |
02:42:21.040
we go around the Sun a certain number of times and then we die and then that's it. That's me.
link |
02:42:28.000
Are you afraid of that moment?
link |
02:42:29.680
No, not at all. In fact, at least not yet. Listen, I'm like a pilot. I do crazy things and I'm like,
link |
02:42:41.200
no, if anything, I'm like, oh, I got to choose mindfully and purposefully the thing I am doing
link |
02:42:50.560
right now and not just fall into it because you're going to wake up one day and ask yourself,
link |
02:42:55.680
why the hell you spent the last 10 years doing X, Y, or Z? I guess my shorter answer to this is
link |
02:43:03.760
doing things on purpose because you choose to do them is so important in life, and not just floating
link |
02:43:11.520
down the river of life, hitting branches along the way because you will hit branches,
link |
02:43:16.960
but rather literally plotting a course and not having a 10 year plan, but just choosing every
link |
02:43:22.240
day to opt in. I haven't figured out the meaning of life by any stretch of the imagination,
link |
02:43:31.440
but it certainly isn't money and it certainly isn't fame and it certainly isn't travel and
link |
02:43:36.080
it's way more of opting into the game you love playing.
link |
02:43:39.040
Every day, opting in.
link |
02:43:41.520
Just opting in. Don't let it happen. You opt in.
link |
02:43:45.440
Kevin, it's great to end on love and the meaning of life. This is an amazing
link |
02:43:53.280
conversation. You gave me insight into some fascinating aspects of this technical world
link |
02:44:00.080
and I honestly can't wait to see what you do next. Thank you so much.
link |
02:44:04.640
Thanks for having me.
link |
02:44:05.920
Thanks for listening to this conversation with Kevin Systrom. To support this podcast,
link |
02:44:11.760
please check out our sponsors in the description. And now, let me leave you with some words from Kevin
link |
02:44:17.440
Systrom himself. Focusing on one thing and doing it really, really well can get you very far.
link |
02:44:25.280
Thank you for listening and hope to see you next time.