
Erik Brynjolfsson: Economics of AI, Social Networks, and Technology | Lex Fridman Podcast #141



link |
00:00:00.000
The following is a conversation with Erik Brynjolfsson.
link |
00:00:03.200
He's an economics professor at Stanford
link |
00:00:05.800
and the director of Stanford's Digital Economy Lab.
link |
00:00:09.360
Previously, he was a long, long time professor at MIT
link |
00:00:13.440
where he did groundbreaking work
link |
00:00:15.160
on the economics of information.
link |
00:00:17.720
He's the author of many books,
link |
00:00:19.760
including The Second Machine Age
link |
00:00:21.960
and Machine, Platform, Crowd,
link |
00:00:24.520
coauthored with Andrew McAfee.
link |
00:00:27.520
Quick mention of each sponsor,
link |
00:00:29.080
followed by some thoughts related to the episode.
link |
00:00:31.560
Ventura Watches, the maker of classy,
link |
00:00:34.040
well-performing watches.
link |
00:00:35.960
Four Sigmatic, the maker of delicious mushroom coffee.
link |
00:00:39.720
ExpressVPN, the VPN I've used for many years
link |
00:00:42.760
to protect my privacy on the internet.
link |
00:00:44.920
And CashApp, the app I use to send money to friends.
link |
00:00:48.920
Please check out these sponsors in the description
link |
00:00:50.840
to get a discount and to support this podcast.
link |
00:00:54.480
As a side note, let me say that the impact
link |
00:00:56.800
of artificial intelligence and automation
link |
00:00:59.120
on our economy and our world
link |
00:01:01.640
is something worth thinking deeply about.
link |
00:01:04.360
Like with many topics that are linked
link |
00:01:06.280
to predicting the future evolution of technology,
link |
00:01:09.040
it is often too easy to fall into one of two camps.
link |
00:01:12.560
The fear mongering camp
link |
00:01:14.680
or the technological utopianism camp.
link |
00:01:18.160
As always, the future will land us somewhere in between.
link |
00:01:21.480
I prefer to wear two hats in these discussions
link |
00:01:24.240
and alternate between them often.
link |
00:01:26.400
The hat of a pragmatic engineer
link |
00:01:29.360
and the hat of a futurist.
link |
00:01:31.760
This is probably a good time to mention Andrew Yang,
link |
00:01:34.920
the presidential candidate who has been
link |
00:01:37.920
one of the high profile thinkers on this topic.
link |
00:01:41.040
And I'm sure I will speak with him
link |
00:01:42.680
on this podcast eventually.
link |
00:01:44.600
A conversation with Andrew has been on the table many times.
link |
00:01:48.520
Our schedules just haven't aligned,
link |
00:01:50.440
especially because I have a strongly held preference
link |
00:01:54.280
for long form, two, three, four hours or more,
link |
00:01:58.040
and in person.
link |
00:02:00.000
I work hard to not compromise on this.
link |
00:02:02.880
Trust me, it's not easy.
link |
00:02:04.800
Even more so in the times of COVID,
link |
00:02:07.080
which requires getting tested nonstop,
link |
00:02:09.640
staying isolated and doing a lot of costly
link |
00:02:12.440
and uncomfortable things that minimize risk for the guest.
link |
00:02:15.760
The reason I do this is because to me,
link |
00:02:17.760
something is lost in remote conversation.
link |
00:02:20.720
That something, that magic,
link |
00:02:23.360
I think is worth the effort,
link |
00:02:25.120
even if it ultimately leads to a failed conversation.
link |
00:02:29.360
This is how I approach life,
link |
00:02:31.280
treasuring the possibility of a rare moment of magic.
link |
00:02:35.840
I'm willing to go to the ends of the world
link |
00:02:38.240
for just such a moment.
link |
00:02:40.680
If you enjoy this thing, subscribe on YouTube,
link |
00:02:43.080
review it with five stars on Apple Podcast,
link |
00:02:45.320
follow on Spotify, support on Patreon,
link |
00:02:47.960
connect with me on Twitter at Lex Fridman.
link |
00:02:51.080
And now here's my conversation with Erik Brynjolfsson.
link |
00:02:56.080
You posted a quote on Twitter by Albert Bartlett
link |
00:02:59.800
saying that the greatest shortcoming of the human race
link |
00:03:03.240
is our inability to understand the exponential function.
link |
00:03:07.760
Why would you say the exponential growth
link |
00:03:09.680
is important to understand?
link |
00:03:12.120
Yeah, that quote, I remember posting that.
link |
00:03:15.040
It's actually a reprise of something Andy McAfee and I said
link |
00:03:17.720
in The Second Machine Age,
link |
00:03:19.240
but I posted it in early March
link |
00:03:21.240
when COVID was really just beginning to take off
link |
00:03:23.680
and I was really scared.
link |
00:03:25.600
There were actually only a couple dozen cases,
link |
00:03:28.080
maybe less at that time,
link |
00:03:29.800
but they were doubling every like two or three days
link |
00:03:32.040
and I could see, oh my God, this is gonna be a catastrophe
link |
00:03:35.280
and it's gonna happen soon,
link |
00:03:36.840
but nobody was taking it very seriously
link |
00:03:38.760
or not a lot of people were taking it very seriously.
link |
00:03:40.760
In fact, I remember I did my last in person conference
link |
00:03:45.000
that week, I was flying back from Las Vegas
link |
00:03:47.680
and I was the only person on the plane wearing a mask
link |
00:03:50.640
and the flight attendant came over to me.
link |
00:03:52.160
She looked very concerned.
link |
00:03:53.080
She kind of put her hands on my shoulder.
link |
00:03:54.240
She was touching me all over, which I wasn't thrilled about
link |
00:03:56.440
and she goes, do you have some kind of anxiety disorder?
link |
00:03:59.280
Are you okay?
link |
00:04:00.360
And I was like, no, it's because of COVID.
link |
00:04:02.760
This is early March.
link |
00:04:03.920
Early March, but I was worried
link |
00:04:06.720
because I knew I could see or I suspected, I guess,
link |
00:04:10.680
that that doubling would continue and it did
link |
00:04:13.200
and pretty soon we had thousands of times more cases.
link |
00:04:17.000
Most of the time when I use that quote,
link |
00:04:18.480
I try to, it's motivated by more optimistic things
link |
00:04:21.120
like Moore's law and the wonders
link |
00:04:23.040
of having more computer power,
link |
00:04:25.400
but in either case, it can be very counterintuitive.
link |
00:04:28.480
I mean, if you walk for 10 minutes,
link |
00:04:31.440
you get about 10 times as far away
link |
00:04:32.920
as if you walk for one minute.
link |
00:04:34.720
That's the way our physical world works.
link |
00:04:35.920
That's the way our brains are wired,
link |
00:04:38.040
but if something doubles for 10 times as long,
link |
00:04:41.480
you don't get 10 times as much.
link |
00:04:43.080
You get a thousand times as much
link |
00:04:45.240
and after 20, it's a million.
link |
00:04:50.720
After 30, it's a billion.
link |
00:04:53.400
And pretty soon after that,
link |
00:04:54.480
it just gets to these numbers that you can barely grasp.
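A quick sketch of that doubling arithmetic, with the round numbers here purely illustrative:

```python
# Repeated doubling vs. linear growth: after n doublings you have
# 2**n times as much, not n times as much.
for doublings in (10, 20, 30):
    print(f"{doublings} doublings -> {2 ** doublings:,}x")
# 10 doublings -> 1,024x          (about a thousand)
# 20 doublings -> 1,048,576x     (about a million)
# 30 doublings -> 1,073,741,824x (about a billion)
```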
link |
00:04:57.600
Our world is becoming more and more exponential,
link |
00:05:00.640
mainly because of digital technologies.
link |
00:05:03.440
So more and more often our intuitions are out of whack
link |
00:05:06.200
and that can be good in the case of things creating wonders,
link |
00:05:10.760
but it can be dangerous in the case of viruses
link |
00:05:13.520
and other things.
link |
00:05:14.440
Do you think it generally applies,
link |
00:05:16.280
like is there spaces where it does apply
link |
00:05:18.120
and where it doesn't?
link |
00:05:19.320
How are we supposed to build an intuition
link |
00:05:21.600
about in which aspects of our society
link |
00:05:25.160
does exponential growth apply?
link |
00:05:27.280
Well, you can learn the math,
link |
00:05:29.480
but the truth is our brains, I think,
link |
00:05:32.000
tend to learn more from experiences.
link |
00:05:35.280
So we just start seeing it more and more often.
link |
00:05:37.440
So hanging around Silicon Valley,
link |
00:05:39.320
hanging around AI and computer researchers,
link |
00:05:41.560
I see this kind of exponential growth a lot more frequently
link |
00:05:44.800
and I'm getting used to it, but I still make mistakes.
link |
00:05:46.760
I still underestimate some of the progress,
link |
00:05:48.680
like just talking to someone about GPT-3
link |
00:05:50.920
and how rapidly natural language has improved.
link |
00:05:54.040
But I think that as the world becomes more exponential,
link |
00:05:58.280
we'll all start experiencing it more frequently.
link |
00:06:01.760
The danger is that we may make some mistakes in the meantime
link |
00:06:05.360
using our old kind of caveman intuitions
link |
00:06:07.680
about how the world works.
link |
00:06:09.280
Well, the weird thing is it always kind of looks linear
link |
00:06:11.760
in the moment.
link |
00:06:12.640
Like it's hard to feel,
link |
00:06:16.920
it's hard to like introspect
link |
00:06:19.920
and really acknowledge how much has changed
link |
00:06:22.960
in just a couple of years or five years or 10 years
link |
00:06:26.320
with the internet.
link |
00:06:27.200
If we just look at advancements of AI
link |
00:06:29.600
or even just social media,
link |
00:06:31.680
all the various technologies
link |
00:06:33.680
that go into the digital umbrella,
link |
00:06:36.240
it feels pretty calm and normal and gradual.
link |
00:06:39.440
Well, a lot of stuff,
link |
00:06:40.760
I think there are parts of the world,
link |
00:06:42.000
most of the world that is not exponential.
link |
00:06:45.600
The way humans learn,
link |
00:06:47.000
the way organizations change,
link |
00:06:49.200
the way our whole institutions adapt and evolve,
link |
00:06:52.000
those don't improve at exponential paces.
link |
00:06:54.520
And that leads to a mismatch oftentimes
link |
00:06:56.320
between these exponentially improving technologies
link |
00:06:58.920
or let's say changing technologies
link |
00:07:00.440
because some of them are exponentially more dangerous
link |
00:07:03.040
and our intuitions and our human skills
link |
00:07:06.720
and our institutions that just don't change very fast at all.
link |
00:07:11.720
And that mismatch I think is at the root
link |
00:07:13.760
of a lot of the problems in our society,
link |
00:07:15.520
the growing inequality
link |
00:07:18.160
and other dysfunctions in our political
link |
00:07:22.560
and economic systems.
link |
00:07:24.240
So one guy that talks about exponential functions
link |
00:07:28.240
a lot is Elon Musk.
link |
00:07:29.440
He seems to internalize this kind of way
link |
00:07:32.040
of exponential thinking.
link |
00:07:34.680
He calls it first principles thinking,
link |
00:07:36.320
sort of the kind of going to the basics,
link |
00:07:39.480
asking the question,
link |
00:07:41.560
like what were the assumptions of the past?
link |
00:07:43.800
How can we throw them out the window?
link |
00:07:46.720
How can we do this 10X much more efficiently
link |
00:07:49.160
and constantly practicing that process?
link |
00:07:51.360
And also using that kind of thinking
link |
00:07:54.200
to estimate sort of when, you know, create deadlines
link |
00:08:01.200
and estimate when you'll be able to deliver
link |
00:08:04.040
on some of these technologies.
link |
00:08:06.520
Now, it often gets him in trouble
link |
00:08:09.560
because he overestimates,
link |
00:08:12.520
like he doesn't meet the initial estimates of the deadlines,
link |
00:08:17.360
but he seems to deliver late but deliver.
link |
00:08:22.360
And which is kind of interesting.
link |
00:08:25.080
Like, what are your thoughts about this whole thing?
link |
00:08:26.920
I think we can all learn from Elon.
link |
00:08:28.840
I think going to first principles,
link |
00:08:30.120
I talked about two ways of getting more of a grip
link |
00:08:32.840
on the exponential function.
link |
00:08:34.560
And one of them just comes from first principles.
link |
00:08:36.280
You know, if you understand the math of it,
link |
00:08:37.800
you can see what's gonna happen.
link |
00:08:39.040
And even if it seems counterintuitive
link |
00:08:41.040
that a couple of dozen of COVID cases
link |
00:08:42.960
can become thousands or tens or hundreds of thousands
link |
00:08:46.200
of them in a month,
link |
00:08:48.080
it makes sense once you just do the math.
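Doing that math with illustrative numbers (two dozen cases and the two-to-three-day doubling time he describes):

```python
# Illustrative only: two dozen cases doubling every 2-3 days for a month.
initial_cases = 24
for doubling_time in (2, 3):
    doublings = 30 / doubling_time
    cases = initial_cases * 2 ** doublings
    print(f"doubling every {doubling_time} days -> ~{cases:,.0f} cases in a month")
# doubling every 2 days -> ~786,432 cases in a month
# doubling every 3 days -> ~24,576 cases in a month
```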
link |
00:08:51.200
And I think Elon tries to do that a lot.
link |
00:08:53.680
You know, in fairness, I think he also benefits
link |
00:08:55.120
from hanging out in Silicon Valley
link |
00:08:56.960
and he's experienced it in a lot of different applications.
link |
00:09:00.600
So, you know, it's not as much of a shock to him anymore,
link |
00:09:04.080
but that's something we can all learn from.
link |
00:09:07.160
In my own life, I remember one of my first experiences
link |
00:09:10.400
really seeing it was when I was a grad student
link |
00:09:12.960
and my advisor asked me to plot the growth of computer power
link |
00:09:17.560
in the US economy in different industries.
link |
00:09:20.000
And there are all these, you know,
link |
00:09:21.560
exponentially growing curves.
link |
00:09:23.000
And I was like, holy shit, look at this.
link |
00:09:24.560
In each industry, it was just taking off.
link |
00:09:26.880
And, you know, you didn't have to be a rocket scientist
link |
00:09:29.240
to extend that and say, wow,
link |
00:09:30.640
this means that this was in the late 80s and early 90s
link |
00:09:33.600
that, you know, if it goes anything like that,
link |
00:09:35.880
we're gonna have orders of magnitude more computer power
link |
00:09:38.160
than we did at that time.
link |
00:09:39.480
And of course we do.
link |
00:09:41.320
So, you know, when people look at Moore's law,
link |
00:09:45.040
they often talk about it as just,
link |
00:09:46.880
so the exponential function is actually
link |
00:09:49.240
a stack of S curves.
link |
00:09:51.360
So basically you milk, so to speak,
link |
00:09:57.240
each particular little revolution for all it's worth
link |
00:10:01.240
and then you search for another revolution.
link |
00:10:03.000
And it's basically revolutions stacked on top of revolutions.
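One way to see that claim is to sum a sequence of logistic S-curves, each new one taking off as the last saturates; the total tracks an exponential far longer than any single curve does. A minimal sketch, with all parameters made up for illustration:

```python
import math

def logistic(t, midpoint, ceiling, steepness=1.0):
    """A single S-curve: slow start, rapid middle, saturation."""
    return ceiling / (1 + math.exp(-steepness * (t - midpoint)))

def stacked_s_curves(t, n_waves=5):
    """Sum of S-curves, each wave ~10x bigger and arriving 10 units later."""
    return sum(logistic(t, midpoint=10 * k, ceiling=10 ** k) for k in range(n_waves))

# The total grows roughly 10x every 10 time units, i.e. exponentially,
# even though every individual curve flattens out.
for t in range(0, 50, 10):
    print(t, round(stacked_s_curves(t), 1))
```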
link |
00:10:06.720
Do you have any intuition about how the heck humans
link |
00:10:08.960
keep finding ways to revolutionize things?
link |
00:10:12.280
Well, first, let me just unpack that first point
link |
00:10:14.200
that I talked about exponential curves,
link |
00:10:17.160
but no exponential curve continues forever.
link |
00:10:21.480
It's been said that if anything can't go on forever,
link |
00:10:24.960
eventually it will stop.
link |
00:10:26.680
It's very profound, but
link |
00:10:29.840
it seems that a lot of people don't appreciate
link |
00:10:32.440
that half of it either.
link |
00:10:33.960
And that's why all exponential functions eventually turn
link |
00:10:36.560
into some kind of S curve or stop in some other way,
link |
00:10:39.800
maybe catastrophically.
link |
00:10:41.240
And that's the case with COVID as well.
link |
00:10:42.800
I mean, it went up and then, you know,
link |
00:10:44.560
at some point it starts saturating the pool of people
link |
00:10:47.640
to be infected.
link |
00:10:49.040
There's a standard epidemiological model
link |
00:10:51.320
that's based on that.
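The standard model he's referring to is presumably the SIR model, in which growth is roughly exponential early on and saturates as the susceptible pool shrinks. A minimal sketch with made-up parameters, not a forecast:

```python
# Minimal SIR epidemic sketch (made-up parameters, illustrative only).
def sir(population=1_000_000, infected=24.0, beta=0.4, gamma=0.1, days=120):
    s, i, r = population - infected, infected, 0.0
    for day in range(days + 1):
        if day % 20 == 0:
            print(f"day {day:3d}: currently infected ~{i:,.0f}")
        new_infections = beta * s * i / population  # ~exponential while s is large
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries

sir()  # infections explode early, then saturate as susceptibles are depleted
```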
link |
00:10:52.960
And it's beginning to happen with Moore's law
link |
00:10:55.040
or different generations of computer power.
link |
00:10:56.920
It happens with all exponential curves.
link |
00:10:59.280
The remarkable thing, as you allude to in
link |
00:11:03.560
the second part of your question, is that we've been able
link |
00:11:03.560
to come up with a new S curve on top of the previous one
link |
00:11:06.840
and do that generation after generation with new materials,
link |
00:11:10.640
new processes, and just extend it further and further.
link |
00:11:15.680
I don't think anyone has a really good theory
link |
00:11:17.400
about why we've been so successful in doing that.
link |
00:11:21.400
It's great that we have been,
link |
00:11:23.160
and I hope it continues for some time,
link |
00:11:26.400
but it's, you know, one beginning of a theory
link |
00:11:31.480
is that there's huge incentives when other parts
link |
00:11:34.440
of the system are going on that clock speed
link |
00:11:36.880
of doubling every two to three years.
link |
00:11:39.280
If there's one component of it that's not keeping up,
link |
00:11:42.120
then the economic incentives become really large
link |
00:11:44.800
to improve that one part.
link |
00:11:46.200
It becomes a bottleneck and anyone who can do improvements
link |
00:11:49.720
in that part can reap huge returns
link |
00:11:51.640
so that the resources automatically get focused
link |
00:11:54.000
on whatever part of the system isn't keeping up.
link |
00:11:56.560
Do you think some version of the Moore's law will continue?
link |
00:11:59.680
Some version, yes, it is.
link |
00:12:01.360
I mean, one version that has become more important
link |
00:12:04.560
is something I call Koomey's law,
link |
00:12:06.280
which is named after Jonathan Koomey,
link |
00:12:08.440
who I should mention was also my college roommate,
link |
00:12:10.280
but he identified the fact that the energy needed for a given amount of computation
link |
00:12:14.360
has been declining by a factor of two roughly every year and a half.
link |
00:12:17.280
And for most of us, that's more important.
link |
00:12:18.960
The new iPhones came out today as we're recording this.
link |
00:12:21.320
I'm not sure when you're gonna make it available.
link |
00:12:23.120
Very soon after this, yeah.
link |
00:12:24.920
And for most of us, having the iPhone be twice as fast,
link |
00:12:30.000
it's nice, but having the battery life be longer,
link |
00:12:33.160
that would be much more valuable.
link |
00:12:35.440
And the fact that a lot of the progress in chips now
link |
00:12:38.200
is reducing energy consumption is probably more important
link |
00:12:42.800
for many applications than just the raw speed.
link |
00:12:46.040
Other dimensions of Moore's law
link |
00:12:47.480
are in AI and machine learning.
link |
00:12:51.560
Those tend to be very parallelizable functions,
link |
00:12:55.160
especially deep neural nets.
link |
00:12:58.440
And so instead of having one chip,
link |
00:13:01.280
you can have multiple chips or you can have a GPU,
link |
00:13:05.000
a graphics processing unit, that goes faster.
link |
00:13:07.000
Now, special chips designed for machine learning
link |
00:13:09.600
like tensor processing units,
link |
00:13:11.160
each time you switch, there's another 10X
link |
00:13:13.000
or 100X improvement above and beyond Moore's law.
link |
00:13:16.720
So I think that the raw silicon
link |
00:13:18.800
isn't improving as much as it used to,
link |
00:13:20.720
but these other dimensions are becoming important,
link |
00:13:23.880
more important, and we're seeing progress in them.
link |
00:13:26.240
I don't know if you've seen the work by OpenAI
link |
00:13:28.200
where they show the exponential improvement
link |
00:13:31.880
of the training of neural networks
link |
00:13:34.320
just literally in the techniques used.
link |
00:13:36.920
So that's almost like the algorithm.
link |
00:13:40.520
It's fascinating to think like, can that actually continue?
link |
00:13:43.640
Us figuring out more and more tricks
link |
00:13:45.160
on how to train networks faster and faster.
link |
00:13:47.000
The progress has been staggering.
link |
00:13:49.520
If you look at image recognition, as you mentioned,
link |
00:13:51.720
I think it's a function of at least three things
link |
00:13:53.440
that are coming together.
link |
00:13:54.520
One, we just talked about faster chips,
link |
00:13:56.560
not just Moore's law, but GPUs, TPUs and other technologies.
link |
00:14:00.400
The second is just a lot more data.
link |
00:14:02.760
I mean, we are awash in digital data today
link |
00:14:05.600
in a way we weren't 20 years ago.
link |
00:14:08.080
Photography, I'm old enough to remember,
link |
00:14:09.960
it used to be chemical, and now everything is digital.
link |
00:14:12.800
I took probably 50 digital photos yesterday.
link |
00:14:16.440
I wouldn't have done that if it was chemical.
link |
00:14:17.880
And we have the internet of things
link |
00:14:20.680
and all sorts of other types of data.
link |
00:14:22.920
When we walk around with our phone,
link |
00:14:24.120
it's just broadcasting huge amounts of digital data
link |
00:14:27.240
that can be used as training sets.
link |
00:14:29.240
And then last but not least, as you mentioned with OpenAI,
link |
00:14:34.240
there've been significant improvements in the techniques.
link |
00:14:37.160
The core idea of deep neural nets
link |
00:14:39.240
has been around for a few decades,
link |
00:14:41.160
but the advances in making it work more efficiently
link |
00:14:44.200
have also improved a couple of orders of magnitude or more.
link |
00:14:48.160
So you multiply together
link |
00:14:49.640
a hundred fold improvement in computer power,
link |
00:14:52.480
a hundred fold or more improvement in data,
link |
00:14:55.320
a hundred fold improvement in techniques
link |
00:14:59.000
of software and algorithms,
link |
00:15:00.560
and soon you're getting into a million fold improvements.
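Multiplying that out, using the round 100x factors from the conversation as illustrative stand-ins:

```python
compute_gain = 100     # faster chips, GPUs, TPUs
data_gain = 100        # far more digital training data
algorithm_gain = 100   # better techniques, software, and algorithms
total = compute_gain * data_gain * algorithm_gain
print(f"{total:,}x")   # 1,000,000x: three ~100x gains compound to a million-fold
```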
link |
00:15:03.840
So somebody brought this up, this idea with GPT-3 that,
link |
00:15:09.920
so it's trained in a self-supervised way
link |
00:15:11.960
on basically internet data.
link |
00:15:15.080
And that's one of the, I've seen arguments made
link |
00:15:18.920
and they seem to be pretty convincing
link |
00:15:21.120
that the bottleneck there is going to be
link |
00:15:23.440
how much data there is on the internet,
link |
00:15:25.640
which is a fascinating idea that it literally
link |
00:15:29.120
will just run out of human generated data to train on.
link |
00:15:33.280
Right, once we make it to the point where it's consumed
link |
00:15:35.720
basically all of human knowledge
link |
00:15:37.480
or all digitized human knowledge, yeah.
link |
00:15:39.120
And that will be the bottleneck.
link |
00:15:40.880
But the interesting thing with bottlenecks
link |
00:15:44.520
is people often use bottlenecks
link |
00:15:47.360
as a way to argue against exponential growth.
link |
00:15:49.920
They say, well, there's no way
link |
00:15:51.400
you can overcome this bottleneck,
link |
00:15:53.840
but we seem to somehow keep coming up with new ways
link |
00:15:56.960
to like overcome whatever bottlenecks
link |
00:15:59.200
the critics come up with, which is fascinating.
link |
00:16:01.920
I don't know how you overcome the data bottleneck,
link |
00:16:04.600
but probably more efficient training algorithms.
link |
00:16:07.000
Yeah, well, you already mentioned that,
link |
00:16:08.240
that these training algorithms are getting much better
link |
00:16:10.240
at using smaller amounts of data.
link |
00:16:12.440
We also are just capturing a lot more data than we used to,
link |
00:16:15.880
especially in China, but all around us.
link |
00:16:18.880
So those are both important.
link |
00:16:20.680
In some applications, you can simulate the data,
link |
00:16:24.160
video games, some of the self driving car systems
link |
00:16:28.920
are simulating driving, and of course,
link |
00:16:32.520
that has some risks and weaknesses,
link |
00:16:34.240
but you can also, if you want to exhaust
link |
00:16:38.080
all the different ways you could beat a video game,
link |
00:16:39.840
you could just simulate all the options.
link |
00:16:42.280
Can we take a step in that direction of autonomous vehicles?
link |
00:16:44.920
I'm talking to the CTO of Waymo tomorrow, next.
link |
00:16:48.640
And obviously, I'm talking to Elon again in a couple of weeks.
link |
00:16:53.920
What's your thoughts on autonomous vehicles?
link |
00:16:57.040
Like where do we stand as a problem
link |
00:17:01.800
that has the potential of revolutionizing the world?
link |
00:17:04.480
Well, I'm really excited about that,
link |
00:17:06.760
but it's become much clearer
link |
00:17:09.000
that the original way that I thought about it,
link |
00:17:10.680
most people thought about like,
link |
00:17:11.520
you know, will we have a self driving car or not
link |
00:17:13.440
is way too simple.
link |
00:17:15.640
The better way to think about it
link |
00:17:17.400
is that there's a whole continuum
link |
00:17:19.360
of how much driving and assisting the car can do.
link |
00:17:22.320
I noticed that you're right next door
link |
00:17:24.760
to the Toyota Research Institute.
link |
00:17:25.960
That is a total accident.
link |
00:17:27.600
I love the TRI folks, but yeah.
link |
00:17:29.320
Have you talked to Gil Pratt?
link |
00:17:30.800
Yeah, we're supposed to talk.
link |
00:17:34.080
It's kind of hilarious.
link |
00:17:34.960
So there's kind of the,
link |
00:17:35.800
I think it's a good counterpart to what Elon is doing.
link |
00:17:38.720
And hopefully they can be frank
link |
00:17:40.040
in what they think about each other,
link |
00:17:41.440
because I've heard both of them talk about it.
link |
00:17:43.920
But they're much more, you know,
link |
00:17:45.400
this is an assistive, a guardian angel
link |
00:17:47.440
that watches over you, as opposed to trying to do everything.
link |
00:17:50.440
I think there's some things like driving on a highway,
link |
00:17:53.320
you know, from LA to Phoenix,
link |
00:17:55.320
where it's mostly good weather, straight roads.
link |
00:17:58.120
That's close to a solved problem, let's face it.
link |
00:18:01.240
In other situations, you know,
link |
00:18:02.560
driving through the snow in Boston
link |
00:18:04.640
where the roads are kind of crazy.
link |
00:18:06.160
And most importantly, you have to make a lot of judgments
link |
00:18:08.200
about what the other driver is gonna do
link |
00:18:09.440
at these intersections that aren't really right angles
link |
00:18:11.680
and aren't very well described.
link |
00:18:13.400
It's more like game theory.
link |
00:18:15.320
That's a much harder problem
link |
00:18:17.080
and requires understanding human motivations.
link |
00:18:22.080
So there's a continuum there of some places
link |
00:18:24.320
where the cars will work very well
link |
00:18:27.560
and others where it could probably take decades.
link |
00:18:30.920
What do you think about the Waymo?
link |
00:18:33.480
So you mentioned two companies
link |
00:18:36.000
that actually have cars on the road.
link |
00:18:38.040
There's the Waymo approach, which is more like
link |
00:18:40.640
we're not going to release anything until it's perfect
link |
00:18:42.800
and we're gonna be very strict
link |
00:18:45.320
about the streets that we travel on,
link |
00:18:47.680
but it better be perfect.
link |
00:18:49.160
Yeah.
link |
00:18:50.200
Well, I'm smart enough to be humble
link |
00:18:53.920
and not try to get between them.
link |
00:18:55.000
I know there's very bright people
link |
00:18:56.600
on both sides of the argument.
link |
00:18:57.440
I've talked to them and they make convincing arguments to me
link |
00:19:00.000
about how careful they need to be and the social acceptance.
link |
00:19:04.000
Some people thought that when the first few people died
link |
00:19:07.400
from self driving cars, that would shut down the industry,
link |
00:19:09.840
but it was more of a blip actually.
link |
00:19:11.800
And, you know, so that was interesting.
link |
00:19:14.440
Of course, there's still a concern
link |
00:19:16.040
that there could be setbacks if we do this wrong.
link |
00:19:20.560
you know, your listeners may be familiar
link |
00:19:22.600
with the different levels of self driving,
link |
00:19:24.160
you know, level one, two, three, four, five.
link |
00:19:26.800
I think Andrew Ng has convinced me that this idea
link |
00:19:29.640
of really focusing on level four,
link |
00:19:32.600
where you only go in areas that are well mapped
link |
00:19:35.000
rather than just going out in the wild
link |
00:19:37.480
is the way things are gonna evolve.
link |
00:19:39.720
But you can just keep expanding those areas
link |
00:19:42.600
where you've mapped things really well,
link |
00:19:44.040
where you really understand them
link |
00:19:45.040
and eventually they all become kind of interconnected.
link |
00:19:47.960
And that could be a kind of another way of progressing
link |
00:19:51.160
to make it more feasible over time.
link |
00:19:55.240
I mean, that's kind of like the Waymo approach,
link |
00:19:57.400
which is they just now released,
link |
00:19:59.520
I think just like a day or two ago,
link |
00:20:01.960
to the public, like anyone from the public
link |
00:20:05.480
in the Phoenix, Arizona area, you know,
link |
00:20:12.160
you can get a ride in a Waymo car
link |
00:20:14.360
with no person, no driver.
link |
00:20:16.120
Oh, they've taken away the safety driver?
link |
00:20:17.640
Oh yeah, for a while now there's been no safety driver.
link |
00:20:21.080
Okay, because I mean, I've been following that one
link |
00:20:22.760
in particular, but I thought it was kind of funny
link |
00:20:24.840
about a year ago when they had the safety driver
link |
00:20:26.960
and then they added a second safety driver
link |
00:20:28.400
because the first safety driver would fall asleep.
link |
00:20:30.880
It's like, I'm not sure they're going
link |
00:20:32.120
in the right direction with that.
link |
00:20:33.400
No, they have, Waymo in particular,
link |
00:20:38.200
done a really good job of that.
link |
00:20:39.480
They actually have a very interesting infrastructure
link |
00:20:44.360
of remote observation.
link |
00:20:47.480
So they're not controlling the vehicles remotely,
link |
00:20:49.840
but they're able to, it's like a customer service.
link |
00:20:52.440
They can anytime tune into the car.
link |
00:20:55.160
I bet they can probably remotely control it as well,
link |
00:20:58.160
but that's officially not the function that they use.
link |
00:21:00.920
Yeah, I can see that being really,
link |
00:21:02.760
because I think the thing that's proven harder
link |
00:21:06.280
than maybe some of the early people expected
link |
00:21:08.040
was there's a long tail of weird exceptions.
link |
00:21:10.840
So you can deal with 90, 99, 99.99% of the cases,
link |
00:21:15.440
but then there's something that just never been seen before
link |
00:21:17.720
in the training data.
link |
00:21:18.840
And humans more or less can work around that.
link |
00:21:21.080
Although let me be clear and note,
link |
00:21:22.840
there are about 30,000 human driving fatalities a year
link |
00:21:25.640
just in the United States and maybe a million worldwide.
link |
00:21:28.360
So they're far from perfect.
link |
00:21:30.000
But I think people have higher expectations of machines.
link |
00:21:33.440
They wouldn't tolerate that level of death
link |
00:21:36.280
and damage from a machine.
link |
00:21:40.000
And so we have to do a lot better
link |
00:21:41.800
at dealing with those edge cases.
link |
00:21:43.480
And also the tricky thing is that, if I have a criticism
link |
00:21:46.960
for the Waymo folks, there's such a huge focus on safety
link |
00:21:51.800
where people don't talk enough about creating products
link |
00:21:55.160
that people, that customers love,
link |
00:21:57.240
that human beings love using.
link |
00:22:00.680
It's very easy to create a thing that's safe
link |
00:22:03.320
at the extremes, but then nobody wants to get into it.
link |
00:22:06.880
Yeah, well, back to Elon, I think one of,
link |
00:22:09.640
part of his genius was with the electric cars.
link |
00:22:11.440
Before he came along, electric cars were all kind of
link |
00:22:13.960
underpowered, really light,
link |
00:22:15.680
and they were sort of wimpy cars that weren't fun.
link |
00:22:20.680
And the first thing he did was he made a roadster
link |
00:22:23.640
that went zero to 60 faster than just about any other car
link |
00:22:27.440
and went to the other extreme.
link |
00:22:28.480
And I think that was a really wise marketing move
link |
00:22:30.800
as well as a wise technology move.
link |
00:22:33.160
Yeah, it's difficult to figure out
link |
00:22:34.840
what the right marketing move is for AI systems.
link |
00:22:37.960
That's always been, I think it requires guts and risk taking
link |
00:22:42.960
which is what Elon practices.
link |
00:22:46.320
I mean, to the chagrin of perhaps investors or whatever,
link |
00:22:50.480
but it also requires rethinking what you're doing.
link |
00:22:54.320
I think way too many people are unimaginative,
link |
00:22:57.520
intellectually lazy, and when they take AI,
link |
00:22:59.760
they basically say, what are we doing now?
link |
00:23:01.640
How can we make a machine do the same thing?
link |
00:23:04.000
Maybe we'll save some costs, we'll have less labor.
link |
00:23:06.720
And yeah, it's not necessarily the worst thing
link |
00:23:08.560
in the world to do, but it's really not leading
link |
00:23:10.640
to a quantum change in the way you do things.
link |
00:23:12.840
When Jeff Bezos said, hey, we're gonna use the internet
link |
00:23:16.640
to change how bookstores work and we're gonna use technology,
link |
00:23:19.320
he didn't go and say, okay, let's put a robot cashier
link |
00:23:22.680
where the human cashier is and leave everything else alone.
link |
00:23:25.640
That would have been a very lame way to automate a bookstore.
link |
00:23:28.360
He went from soup to nuts, like, let's just rethink it.
link |
00:23:31.400
We get rid of the physical bookstore.
link |
00:23:33.040
We have a warehouse, we have delivery,
link |
00:23:34.520
we have people order on a screen
link |
00:23:36.360
and everything was reinvented.
link |
00:23:38.480
And that's been the story
link |
00:23:39.720
of these general purpose technologies all through history.
link |
00:23:43.560
And in my books, I write about like electricity
link |
00:23:46.320
and how for 30 years, there was almost no productivity gain
link |
00:23:50.360
from the electrification of factories a century ago.
link |
00:23:53.040
Now it's not because electricity
link |
00:23:54.160
is a wimpy useless technology.
link |
00:23:55.800
We all know how awesome electricity is.
link |
00:23:57.560
It's cause at first,
link |
00:23:58.400
they really didn't rethink the factories.
link |
00:24:00.560
It was only after they reinvented them
link |
00:24:02.160
and we describe how in the book,
link |
00:24:04.040
then you suddenly got a doubling and tripling
link |
00:24:05.920
of productivity growth.
link |
00:24:07.560
But it's the combination of the technology
link |
00:24:09.920
with the new business models, new business organization.
link |
00:24:12.960
That just takes a long time
link |
00:24:14.000
and it takes more creativity than most people have.
link |
00:24:16.920
Can you maybe linger on electricity?
link |
00:24:19.000
Cause that's a fun one.
link |
00:24:20.080
Yeah, well, sure, I'll tell you what happened.
link |
00:24:22.480
Before electricity, there were basically steam engines
link |
00:24:25.760
or sometimes water wheels and to power the machinery,
link |
00:24:28.400
you had to have pulleys and crankshafts
link |
00:24:30.560
and you really can't make them too long
link |
00:24:32.120
cause they'll break from the torsion.
link |
00:24:34.040
So all the equipment was kind of clustered
link |
00:24:35.960
around this one giant steam engine.
link |
00:24:37.960
You can't make small steam engines either
link |
00:24:39.480
cause of thermodynamics.
link |
00:24:40.800
So you have one giant steam engine,
link |
00:24:42.360
all the equipment clustered around it, multi story.
link |
00:24:44.320
They have it vertical to minimize the distance
link |
00:24:46.080
as well as horizontal.
link |
00:24:47.760
And then when they did electricity,
link |
00:24:48.960
they took out the steam engine.
link |
00:24:50.080
They got the biggest electric motor
link |
00:24:51.360
they could buy from General Electric or someone like that.
link |
00:24:54.200
And nothing much else changed.
link |
00:24:57.920
It took until a generation of managers retired
link |
00:25:00.760
or died, 30 years later,
link |
00:25:03.200
that people started thinking,
link |
00:25:04.440
wait, we don't have to do it that way.
link |
00:25:05.840
You can make electric motors, big, small, medium.
link |
00:25:09.480
You can put one with each piece of equipment.
link |
00:25:11.480
There's this big debate
link |
00:25:12.320
if you read the management literature
link |
00:25:13.360
between what they call a group drive versus unit drive
link |
00:25:16.160
where every machine would have its own motor.
link |
00:25:18.960
Well, once they did that, once they went to unit drive,
link |
00:25:21.040
those guys won the debate.
link |
00:25:23.040
Then you started having a new kind of factory
link |
00:25:25.080
which is sometimes spread out over acres, single story
link |
00:25:29.400
and each piece of equipment has its own motor.
link |
00:25:31.360
And most importantly, they weren't laid out based on
link |
00:25:33.360
who needed the most power.
link |
00:25:35.000
They were laid out based on
link |
00:25:37.600
what is the workflow of materials?
link |
00:25:40.000
Assembly line, let's have it go from this machine
link |
00:25:41.720
to that machine, to that machine.
link |
00:25:43.240
Once they rethought the factory that way,
link |
00:25:46.040
huge increases in productivity.
link |
00:25:47.680
It was just staggering.
link |
00:25:48.520
People like Paul David have documented this
link |
00:25:50.080
in their research papers.
link |
00:25:51.760
And I think that that is a lesson you see over and over.
link |
00:25:55.920
It happened when the steam engine changed manual production.
link |
00:25:58.560
It's happened with the computerization.
link |
00:26:00.240
People like Michael Hammer said, don't automate, obliterate.
link |
00:26:03.600
In each case, the big gains only came once
link |
00:26:08.440
smart entrepreneurs and managers
link |
00:26:10.400
basically reinvented their industries.
link |
00:26:13.040
I mean, one other interesting point about all that
link |
00:26:14.680
is that during that reinvention period,
link |
00:26:18.920
you often actually not only don't see productivity growth,
link |
00:26:22.200
you can actually see a slipping back.
link |
00:26:24.320
Measured productivity actually falls.
link |
00:26:26.560
I just wrote a paper with Chad Syverson and Daniel Rock
link |
00:26:29.040
called the Productivity J Curve,
link |
00:26:31.520
which basically shows that in a lot of these cases,
link |
00:26:33.840
you have a downward dip before it goes up.
link |
00:26:36.520
And that downward dip is when everyone's trying
link |
00:26:38.320
to like reinvent things.
link |
00:26:40.400
And you could say that they're creating knowledge
link |
00:26:43.120
and intangible assets,
link |
00:26:44.640
but that doesn't show up on anyone's balance sheet.
link |
00:26:46.760
It doesn't show up in GDP.
link |
00:26:48.320
So it's as if they're doing nothing.
link |
00:26:50.080
Like take self driving cars, we were just talking about it.
link |
00:26:52.480
There have been hundreds of billions of dollars
link |
00:26:55.040
spent developing self driving cars.
link |
00:26:57.880
And basically no chauffeur has lost his job, no taxi driver.
link |
00:27:02.360
I guess I got to check out the ones that...
link |
00:27:03.200
It's a big J curve.
link |
00:27:04.440
Yeah, so there's a bunch of spending
link |
00:27:06.080
and no real consumer benefit.
link |
00:27:08.120
Now they're doing that in the belief,
link |
00:27:11.440
I think the justified belief
link |
00:27:13.240
that they will get the upward part of the J curve
link |
00:27:15.160
and there will be some big returns,
link |
00:27:16.920
but in the short run, you're not seeing it.
link |
00:27:19.320
That's happening with a lot of other AI technologies,
link |
00:27:21.840
just as it happened
link |
00:27:22.680
with earlier general purpose technologies.
link |
00:27:25.040
And it's one of the reasons
link |
00:27:25.880
we're having relatively low productivity growth lately.
link |
00:27:29.280
As an economist, one of the things that disappoints me
link |
00:27:31.400
is that as eye popping as these technologies are,
link |
00:27:34.440
you and I are both excited
link |
00:27:35.360
about some of the things they can do.
link |
00:27:36.880
The economic productivity statistics are kind of dismal.
link |
00:27:40.280
We actually, believe it or not,
link |
00:27:42.200
have had lower productivity growth
link |
00:27:44.560
in the past about 15 years
link |
00:27:47.000
than we did in the previous 15 years,
link |
00:27:48.840
in the 90s and early 2000s.
link |
00:27:51.360
And so that's not what you would have expected
link |
00:27:53.200
if these technologies were that much better.
link |
00:27:55.520
But I think we're in kind of a long J curve there.
link |
00:27:59.400
Personally, I'm optimistic.
link |
00:28:00.560
We'll start seeing the upward tick,
link |
00:28:02.120
maybe as soon as next year.
link |
00:28:04.520
But the past decade has been a bit disappointing
link |
00:28:08.520
if you thought there's a one to one relationship
link |
00:28:10.000
between cool technology and higher productivity.
link |
00:28:12.760
Well, what would you place your biggest hope
link |
00:28:15.240
for productivity increases on?
link |
00:28:17.240
Because you kind of said at a high level AI,
link |
00:28:19.880
but if I were to think about
link |
00:28:22.840
what has been so revolutionary in the last 10 years,
link |
00:28:28.200
or maybe 15 years, and thinking about the internet,
link |
00:28:32.240
I would say things like,
link |
00:28:35.800
hopefully I'm not saying anything ridiculous,
link |
00:28:37.160
but everything from Wikipedia to Twitter.
link |
00:28:41.040
So like these kind of websites,
link |
00:28:43.600
not so much AI,
link |
00:28:46.080
but like I would expect to see some kind
link |
00:28:48.160
of big productivity increases
link |
00:28:50.520
from just the connectivity between people
link |
00:28:54.160
and the access to more information.
link |
00:28:58.040
Yeah, well, so that's another area
link |
00:29:00.040
I've done quite a bit of research on actually,
link |
00:29:01.840
is these free goods like Wikipedia, Facebook, Twitter, Zoom.
link |
00:29:06.840
We're actually doing this in person,
link |
00:29:08.120
but almost everything else I do these days is online.
link |
00:29:12.400
The interesting thing about all those
link |
00:29:13.760
is most of them have a price of zero.
link |
00:29:18.880
What do you pay for Wikipedia?
link |
00:29:19.960
Maybe like a little bit for the electrons
link |
00:29:21.680
to come to your house.
link |
00:29:22.960
Basically zero, right?
link |
00:29:25.600
Let me take a small pause and say, I donate to Wikipedia
link |
00:29:28.040
often. You should too.
link |
00:29:28.880
It's good for you, yeah.
link |
00:29:30.080
So, but what does that mean for GDP?
link |
00:29:32.480
GDP is based on the price and quantity
link |
00:29:36.120
of all the goods, things bought and sold.
link |
00:29:37.760
If something has zero price,
link |
00:29:39.560
you know how much it contributes to GDP?
link |
00:29:42.280
To a first approximation, zero.
link |
00:29:44.520
So these digital goods that we're getting more and more of,
link |
00:29:47.560
we're spending more and more hours a day
link |
00:29:50.240
consuming stuff off of screens,
link |
00:29:52.080
little screens, big screens,
link |
00:29:54.520
that doesn't get priced into GDP.
link |
00:29:56.160
It's like they don't exist.
link |
00:29:58.640
That doesn't mean they don't create value.
link |
00:30:00.000
I get a lot of value from watching cat videos
link |
00:30:03.360
and reading Wikipedia articles and listening to podcasts,
link |
00:30:06.160
even if I don't pay for them.
link |
00:30:08.440
So we've got a mismatch there.
link |
00:30:10.440
Now, in fairness, economists,
link |
00:30:12.520
since Simon Kuznets invented GDP and productivity,
link |
00:30:15.320
all those statistics back in the 1930s,
link |
00:30:17.760
he recognized, he in fact said,
link |
00:30:19.680
this is not a measure of wellbeing.
link |
00:30:21.520
This is not a measure of welfare.
link |
00:30:23.200
It's a measure of production.
link |
00:30:25.120
But almost everybody has kind of forgotten
link |
00:30:28.960
that he said that and they just use it.
link |
00:30:31.000
It's like, how well off are we?
link |
00:30:32.200
What was GDP last year?
link |
00:30:33.240
It was 2.3% growth or whatever.
link |
00:30:35.800
That is how much physical production,
link |
00:30:39.440
but it's not the value we're getting.
link |
00:30:42.360
We need a new set of statistics
link |
00:30:43.840
and I'm working with some colleagues.
link |
00:30:45.280
Avi Collis and others to develop something
link |
00:30:48.360
we call GDP-B.
link |
00:30:50.440
GDP-B measures the benefits you get, not the cost.
link |
00:30:55.440
If you get benefit from Zoom or Wikipedia or Facebook,
link |
00:31:00.360
then that gets counted in GDP-B,
link |
00:31:02.520
even if you pay zero for it.
link |
00:31:04.560
So, you know, back to your original point,
link |
00:31:07.360
I think there is a lot of gain over the past decade
link |
00:31:10.480
in these digital goods that doesn't show up in GDP,
link |
00:31:15.280
doesn't show up in productivity.
link |
00:31:16.440
By the way, productivity is just defined
link |
00:31:17.920
as GDP divided by hours worked.
link |
00:31:20.080
So if you mismeasure GDP,
link |
00:31:22.080
you mismeasure productivity by the exact same amount.
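Since the ratio is exact, an understated numerator passes straight through. A tiny sketch with invented numbers:

```python
# Productivity = GDP / hours worked, so understating GDP understates
# productivity by the same proportion (all numbers invented).
hours_worked = 100.0
measured_gdp = 1_000.0
free_goods_value = 100.0  # hypothetical uncounted zero-price benefits

measured = measured_gdp / hours_worked                             # 10.0
with_benefits = (measured_gdp + free_goods_value) / hours_worked   # 11.0
print(measured, with_benefits)  # both GDP and productivity understated by ~9%
```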
link |
00:31:25.360
That's something we need to fix.
link |
00:31:26.480
I'm working with the statistical agencies
link |
00:31:28.440
to come up with a new set of metrics.
link |
00:31:30.200
And, you know, over the coming years,
link |
00:31:32.200
I think we'll see, we're not gonna do away with GDP.
link |
00:31:34.480
It's very useful, but we'll see a parallel set of accounts
link |
00:31:37.240
that measure the benefits.
link |
00:31:38.400
How difficult is it to get that B in the GDP-B?
link |
00:31:41.080
It's pretty hard.
link |
00:31:41.920
I mean, one of the reasons it hasn't been done before
link |
00:31:44.360
is that, you know, you can measure it,
link |
00:31:46.720
the cash register, what people pay for stuff,
link |
00:31:49.000
but how do you measure what they would have paid,
link |
00:31:51.040
like what the value is?
link |
00:31:52.040
That's a lot harder, you know?
link |
00:31:54.040
How much is Wikipedia worth to you?
link |
00:31:56.040
That's what we have to answer.
link |
00:31:57.480
And to do that, what we do is we can use online experiments.
link |
00:32:00.720
We do massive online choice experiments.
link |
00:32:03.120
We ask hundreds of thousands, now millions of people
link |
00:32:05.720
to do lots of sort of A/B tests.
link |
00:32:07.760
How much would I have to pay you
link |
00:32:09.080
to give up Wikipedia for a month?
link |
00:32:10.840
How much would I have to pay you to stop using your phone?
link |
00:32:14.120
And in some cases, it's hypothetical.
link |
00:32:15.960
In other cases, we actually enforce it,
link |
00:32:17.520
which is kind of expensive.
link |
00:32:18.920
Like we pay somebody $30 to stop using Facebook
link |
00:32:22.440
and we see if they'll do it.
link |
00:32:23.440
And some people will give it up for $10.
link |
00:32:26.280
Some people won't give it up even if you give them $100.
link |
00:32:28.880
And then you get a whole demand curve.
link |
00:32:31.080
You get to see what all the different prices are
link |
00:32:33.600
and how much value different people get.
link |
00:32:36.000
And not surprisingly,
link |
00:32:36.880
different people have different values.
link |
00:32:38.240
We find that women tend to value Facebook more than men.
link |
00:32:41.520
Old people tend to value it a little bit more
link |
00:32:43.200
than young people.
link |
00:32:44.040
That was interesting.
link |
00:32:44.880
I think young people maybe know about other networks
link |
00:32:46.600
that I don't know the name of that are better than Facebook.
link |
00:32:50.280
And so you get to see these patterns,
link |
00:32:53.480
but every person's individual.
link |
00:32:55.440
And then if you add up all those numbers,
link |
00:32:57.280
you start getting an estimate of the value.
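A toy sketch of turning those accept/reject responses into a demand-like curve and an aggregate benefit number; all the data here are invented, and the actual methodology is in the PNAS paper he mentions shortly after:

```python
# Toy version: each person's minimum payment to give up a free good
# for a month (willingness to accept). All numbers invented.
willingness_to_accept = [5, 10, 10, 25, 30, 30, 50, 80, 100, 150]

def share_keeping(offer):
    """Fraction who refuse the payment and keep using the good."""
    return sum(1 for w in willingness_to_accept if w > offer) / len(willingness_to_accept)

# Sweeping the offered payment traces out a demand-like curve...
for offer in (0, 10, 30, 100):
    print(f"offer ${offer}: {share_keeping(offer):.0%} keep the good")

# ...and summing individual valuations gives a benefit estimate that GDP,
# which only sees the zero price, misses entirely.
print(f"total benefit estimate: ${sum(willingness_to_accept)}")
```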
link |
00:33:00.040
Okay, first of all, that's brilliant.
link |
00:33:01.240
Is this a work that will soon eventually be published?
link |
00:33:05.720
Yeah, well, there's a version of it
link |
00:33:07.040
in the Proceedings of the National Academy of Sciences
link |
00:33:09.520
I think we call it massive online choice experiments.
link |
00:33:11.880
I should remember the title, but it's on my website.
link |
00:33:14.920
So yeah, we have some more papers coming out on it,
link |
00:33:17.520
but the first one is already out.
link |
00:33:20.160
You know, it's kind of a fascinating mystery
link |
00:33:22.320
that Twitter, Facebook,
link |
00:33:24.360
like all these social networks are free.
link |
00:33:26.800
And it seems like almost none of them except for YouTube
link |
00:33:31.440
have experimented with removing ads for money.
link |
00:33:35.200
Can you like, do you understand that
link |
00:33:37.160
from both economics and the product perspective?
link |
00:33:39.800
Yeah, it's something that, you know,
link |
00:33:41.000
so I teach a course on digital business models.
link |
00:33:43.240
So I used to at MIT, at Stanford, I'm not quite sure.
link |
00:33:45.800
I'm not teaching until next spring.
link |
00:33:47.360
I'm still thinking what my course is gonna be.
link |
00:33:50.040
But there are a lot of different business models.
link |
00:33:52.200
And when you have something that has zero marginal cost,
link |
00:33:54.880
there's a lot of forces,
link |
00:33:56.400
especially if there's any kind of competition
link |
00:33:57.880
that push prices down to zero.
link |
00:33:59.960
But you can have ad supported systems,
link |
00:34:03.360
you can bundle things together.
link |
00:34:05.520
You can have volunteer, you mentioned Wikipedia,
link |
00:34:07.360
there's donations.
link |
00:34:08.760
And I think economists underestimate
link |
00:34:11.120
the power of volunteerism and donations.
link |
00:34:14.560
You know, National Public Radio.
link |
00:34:16.040
Actually, how do you, this podcast, how is this,
link |
00:34:18.560
what's the revenue model?
link |
00:34:19.480
There's sponsors at the beginning.
link |
00:34:22.240
And then, the funny thing is,
link |
00:34:24.640
I tell people they can skip them,
link |
00:34:26.640
I tell them the timestamp.
link |
00:34:27.880
So if you wanna skip the sponsors, you're free.
link |
00:34:30.960
But it's funny that a bunch of people,
link |
00:34:33.560
so I read the advertisement
link |
00:34:36.200
and then a bunch of people enjoy reading it.
link |
00:34:38.400
And it's...
link |
00:34:39.240
Well, they may learn something from it.
link |
00:34:40.080
And also from the advertiser's perspective,
link |
00:34:42.920
those are people who are actually interested.
link |
00:34:45.400
I mean, the example I sometimes get is like,
link |
00:34:46.960
I bought a car recently and all of a sudden,
link |
00:34:49.840
all the car ads were like interesting to me.
link |
00:34:52.400
Exactly.
link |
00:34:53.240
And then like, now that I have the car,
link |
00:34:54.360
like I sort of zone out on them, but that's fine.
link |
00:34:56.280
The car companies, they don't really wanna be advertising
link |
00:34:58.720
to me if I'm not gonna buy their product.
link |
00:35:01.320
So there are a lot of these different revenue models
link |
00:35:03.560
and it's a little complicated,
link |
00:35:06.880
but the economic theory has to do
link |
00:35:08.000
with what the shape of the demand curve is,
link |
00:35:09.480
when it's better to monetize it with charging people
link |
00:35:13.160
versus when you're better off doing advertising.
link |
00:35:15.640
I mean, in short, when the demand curve
link |
00:35:18.280
is relatively flat and wide,
link |
00:35:20.600
like generic news and things like that,
link |
00:35:22.760
then you tend to do better with advertising.
link |
00:35:25.920
If it's a good that's only useful to a small number
link |
00:35:28.840
of people, but they're willing to pay a lot,
link |
00:35:30.320
they have a very high value for it,
link |
00:35:32.720
then advertising isn't gonna work as well
link |
00:35:34.560
and you're better off charging for it.
link |
00:35:36.080
Both of them have some inefficiencies.
link |
00:35:38.080
And then when you get into targeting
link |
00:35:39.480
and you get into these other revenue models,
link |
00:35:40.600
it gets more complicated,
link |
00:35:41.960
but there's some economic theory on it.
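A toy version of that comparison, with all numbers invented: a flat, wide demand curve favors ads, while a narrow, high-value one favors charging.

```python
# Toy comparison of ad-supported vs. paid revenue models (numbers invented).
def best_single_price_revenue(valuations):
    """Revenue from the best single price: price x buyers at that price."""
    return max(p * sum(1 for v in valuations if v >= p) for p in valuations)

def ad_revenue(valuations, ad_value_per_user=4.0):
    """At a zero price everyone joins; revenue comes from ads."""
    return ad_value_per_user * len(valuations)

flat_wide = [3, 3, 4, 4, 5, 5, 6, 6, 7, 7]       # generic news: many users, low value
narrow_high = [0, 0, 0, 0, 0, 0, 0, 0, 90, 110]  # niche good: few users, high value

for name, demand in (("flat/wide", flat_wide), ("narrow/high", narrow_high)):
    print(name, "ads:", ad_revenue(demand),
          "best price:", best_single_price_revenue(demand))
# flat/wide:   ads 40  > pricing 32  -> advertise
# narrow/high: ads 40  < pricing 180 -> charge
```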
link |
00:35:45.320
I also think to be frank,
link |
00:35:47.560
there's just a lot of experimentation that's needed
link |
00:35:49.560
because sometimes things are a little counterintuitive,
link |
00:35:53.200
especially when you get into what are called
link |
00:35:55.160
two sided networks or platform effects,
link |
00:35:57.640
where you may grow the market on one side
link |
00:36:01.840
and harvest the revenue on the other side.
link |
00:36:04.120
Facebook tries to get more and more users
link |
00:36:06.080
and then they harvest the revenue from advertising.
link |
00:36:08.960
So that's another way of kind of thinking about it.
link |
00:36:12.040
Is it strange to you that they haven't experimented?
link |
00:36:14.400
Well, they are experimenting.
link |
00:36:15.360
So they are doing some experiments
link |
00:36:17.600
about what the willingness is for people to pay.
link |
00:36:22.040
I think that when they do the math,
link |
00:36:23.560
it's gonna work out that they still are better off
link |
00:36:26.400
with an advertising driven model, but...
link |
00:36:29.440
What about a mix?
link |
00:36:30.400
Like this is what YouTube is, right?
link |
00:36:32.400
It's you allow the person to decide,
link |
00:36:36.360
the customer to decide exactly which model they prefer.
link |
00:36:39.360
No, that can work really well.
link |
00:36:40.920
And newspapers, of course,
link |
00:36:41.760
have known this for a long time.
link |
00:36:42.760
The Wall Street Journal, the New York Times,
link |
00:36:44.560
they have subscription revenue.
link |
00:36:45.840
They also have advertising revenue.
link |
00:36:48.080
And that can definitely work.
link |
00:36:52.200
Online, it's a lot easier to have a dial
link |
00:36:54.080
that's much more personalized
link |
00:36:55.240
and everybody can kind of roll their own mix.
link |
00:36:57.720
And I could imagine having a little slider
link |
00:37:00.320
about how much advertising you want or are willing to take.
link |
00:37:05.040
And if it's done right and it's incentive compatible,
link |
00:37:07.400
it could be a win win where both the content provider
link |
00:37:10.960
and the consumer are better off
link |
00:37:12.560
than they would have been before.
link |
00:37:14.480
Yeah, the done right part is a really good point.
link |
00:37:17.960
Like with the Jeff Bezos
link |
00:37:19.600
and the single click purchase on Amazon,
link |
00:37:22.000
the frictionless effort there,
link |
00:37:23.880
if I could just rant for a second
link |
00:37:25.760
about the Wall Street Journal,
link |
00:37:27.240
all the newspapers you mentioned,
link |
00:37:29.280
is I have to click so many times to subscribe to them
link |
00:37:34.800
that I literally don't subscribe
link |
00:37:37.400
just because of the number of times I have to click.
link |
00:37:39.520
I'm totally with you.
link |
00:37:40.360
I don't understand why so many companies make it so hard.
link |
00:37:44.560
I mean, another example is when you buy a new iPhone
link |
00:37:47.240
or a new computer, whatever,
link |
00:37:48.900
I feel like, okay, I'm gonna lose an afternoon
link |
00:37:51.440
just like loading up and getting all my stuff back.
link |
00:37:53.800
And for a lot of us,
link |
00:37:56.080
that's more of a deterrent than the price.
link |
00:37:58.600
And if they could make it painless,
link |
00:38:01.800
we'd give them a lot more money.
link |
00:38:03.680
So I'm hoping somebody listening is working
link |
00:38:06.440
on making it more painless for us to buy your products.
link |
00:38:10.000
If we could just like linger a little bit
link |
00:38:12.280
on the social network thing,
link |
00:38:13.680
because there's this Netflix social dilemma.
link |
00:38:18.200
Yeah, no, I saw that.
link |
00:38:19.280
And Tristan Harris and company, yeah.
link |
00:38:24.000
And people's data,
link |
00:38:29.500
it's really sensitive and social networks
link |
00:38:31.560
are arguably at the core of many societal tensions

link |
00:38:37.440
and some of the most important things happening in society.
link |
00:38:39.640
So it feels like it's important to get this right,
link |
00:38:42.040
both from a business model perspective
link |
00:38:43.960
and just like a trust perspective.
link |
00:38:46.340
I still gotta say, I mean,
link |
00:38:49.840
I know there's experimentation going on.
link |
00:38:52.140
It still feels like everyone is afraid
link |
00:38:54.740
to try different business models, like really try.
link |
00:38:57.520
Well, I'm worried that people are afraid
link |
00:38:59.600
to try different business models.
link |
00:39:01.220
I'm also worried that some of the business models
link |
00:39:03.480
may lead them to bad choices.
link |
00:39:06.280
And Danny Kahneman talks about system one and system two,
link |
00:39:10.980
sort of like a reptilian brain
link |
00:39:12.280
that reacts quickly to what we see,
link |
00:39:14.360
see something interesting, we click on it,
link |
00:39:16.160
we retweet it versus our system two,
link |
00:39:20.800
our frontal cortex that's supposed to be more careful
link |
00:39:24.080
and rational that really doesn't make
link |
00:39:26.240
as many decisions as it should.
link |
00:39:28.840
I think there's a tendency for a lot of these social networks
link |
00:39:32.680
to really exploit system one, our quick instant reaction,
link |
00:39:37.680
make it so we just click on stuff and pass it on
link |
00:39:40.960
and not really think carefully about it.
link |
00:39:42.320
And that system, it tends to be driven
link |
00:39:45.160
by sex, violence, disgust, anger, fear,
link |
00:39:51.320
these relatively primitive kinds of emotions.
link |
00:39:53.800
Maybe they're important for a lot of purposes,
link |
00:39:55.960
but they're not a great way to organize a society.
link |
00:39:58.920
And most importantly, when you think about this huge,
link |
00:40:01.920
amazing information infrastructure we've had
link |
00:40:04.320
that's connected billions of brains across the globe,
link |
00:40:08.000
not just so we can all access information,
link |
00:40:09.640
but we can all contribute to it and share it.
link |
00:40:12.640
Arguably the most important thing
link |
00:40:14.100
that that network should do is favor truth over falsehoods.
link |
00:40:19.360
And the way it's been designed,
link |
00:40:21.640
not necessarily intentionally, is exactly the opposite.
link |
00:40:24.660
My MIT colleagues Sinan Aral and Deb Roy and others,
link |
00:40:29.440
did a terrific paper on the cover of Science.
link |
00:40:31.760
And they documented what we all feared,
link |
00:40:33.460
which is that lies spread faster than truth
link |
00:40:37.740
on social networks.
link |
00:40:39.760
They looked at a bunch of tweets and retweets,
link |
00:40:42.760
and they found that false information
link |
00:40:44.560
was more likely to spread further, faster, to more people.
link |
00:40:48.960
And why was that?
link |
00:40:49.920
It's not because people like lies.
link |
00:40:53.360
It's because people like things that are shocking,
link |
00:40:55.880
amazing, can you believe this?
link |
00:40:57.840
Something that is not mundane,
link |
00:41:00.280
not something that everybody else already knew.
link |
00:41:02.460
And what are the most unbelievable things?
link |
00:41:05.400
Well, lies.
link |
00:41:07.320
And so if you wanna find something unbelievable,
link |
00:41:09.820
it's a lot easier to do that
link |
00:41:10.660
if you're not constrained by the truth.
link |
00:41:12.440
So they found that the emotional valence
link |
00:41:15.640
of false information was just much higher.
link |
00:41:17.960
It was more likely to be shocking,
link |
00:41:19.680
and therefore more likely to be spread.
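A toy cascade simulation of that mechanism (the parameters are invented; the actual study measured real Twitter cascades): the only assumption encoded is that more shocking content gets reshared with higher probability, and even a modest difference in that probability compounds into much larger reach.
```python
import random

def cascade_size(share_prob, followers=5, max_depth=8):
    """People reached when each newly exposed person reshares
    independently with probability share_prob to `followers` others."""
    reached, sharers = 1, 1          # the original poster always shares
    for _ in range(max_depth):
        exposed = sharers * followers
        reached += exposed
        sharers = sum(random.random() < share_prob for _ in range(exposed))
        if sharers == 0:
            break
    return reached

random.seed(0)
trials = 2000
mundane = sum(cascade_size(0.15) for _ in range(trials)) / trials
shocking = sum(cascade_size(0.25) for _ in range(trials)) / trials
print(f"average reach, mundane claim  (reshare prob 0.15): {mundane:7.1f}")
print(f"average reach, shocking claim (reshare prob 0.25): {shocking:7.1f}")
```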
link |
00:41:22.880
Another interesting thing was that
link |
00:41:24.040
that wasn't necessarily driven by the algorithms.
link |
00:41:27.580
I know that there is some evidence,
link |
00:41:29.680
Zeynep Tufekci and others have pointed out on YouTube,
link |
00:41:32.400
some of the algorithms unintentionally were tuned
link |
00:41:34.680
to amplify more extremist content.
link |
00:41:37.840
But in the study of Twitter that Sinan and Deb and others did,
link |
00:41:42.420
they found that even if you took out all the bots
link |
00:41:44.480
and all the automated tweets,
link |
00:41:47.880
you still had lies spreading significantly faster.
link |
00:41:50.760
It's just the problems with ourselves
link |
00:41:52.560
that we just can't resist passing on the salacious content.
link |
00:41:58.480
But I also blame the platforms
link |
00:41:59.920
because there's different ways you can design a platform.
link |
00:42:03.160
You can design a platform in a way
link |
00:42:05.400
that makes it easy to spread lies
link |
00:42:07.280
and to retweet and spread things on,
link |
00:42:09.520
or you can kind of put some friction on that
link |
00:42:11.520
and try to favor truth.
link |
00:42:13.960
I had dinner with Jimmy Wales once,
link |
00:42:15.520
the guy who helped found Wikipedia.
link |
00:42:19.720
And he convinced me that, look,
link |
00:42:22.580
you can make some design choices,
link |
00:42:24.500
whether it's at Facebook, at Twitter,
link |
00:42:26.360
at Wikipedia, or Reddit, whatever,
link |
00:42:29.240
and depending on how you make those choices,
link |
00:42:32.340
you're more likely or less likely to have false news.
link |
00:42:35.080
Create a little bit of friction, like you said.
link |
00:42:37.160
Yeah.
link |
00:42:38.000
You know, that's the, and so if I'm...
link |
00:42:39.560
It could be friction, it could be speeding the truth,
link |
00:42:41.560
either way, but, and I don't totally understand...
link |
00:42:44.400
Speeding the truth, I love it.
link |
00:42:45.520
Yeah, yeah.
link |
00:42:47.040
Amplifying it and giving it more credit.
link |
00:42:48.900
And in academia, which is far, far from perfect,
link |
00:42:52.520
but when someone has an important discovery,
link |
00:42:55.640
it tends to get more cited
link |
00:42:56.880
and people kind of look to it more
link |
00:42:58.160
and sort of, it tends to get amplified a little bit.
link |
00:43:00.760
So you could try to do that too.
link |
00:43:03.320
I don't know what the silver bullet is,
link |
00:43:04.680
but the meta point is that if we spend time
link |
00:43:07.440
thinking about it, we can amplify truth over falsehoods.
link |
00:43:10.800
And I'm disappointed in the heads of these social networks
link |
00:43:14.920
that they haven't been as successful
link |
00:43:16.680
or maybe haven't tried as hard to amplify truth.
link |
00:43:19.540
And part of it, going back to what we said earlier,
link |
00:43:21.560
is these revenue models may push them
link |
00:43:25.140
more towards growing fast, spreading information rapidly,
link |
00:43:29.880
getting lots of users,
link |
00:43:31.440
which isn't the same thing as finding truth.
link |
00:43:34.560
Yeah, I mean, implicit in what you're saying now
link |
00:43:38.840
is a hopeful message that with platforms,
link |
00:43:42.240
we can take a step towards a greater
link |
00:43:47.440
and greater popularity of truth.
link |
00:43:51.120
But the more cynical view is that
link |
00:43:54.000
what the last few years have revealed
link |
00:43:56.800
is that there's a lot of money to be made
link |
00:43:59.020
in dismantling even the idea of truth,
link |
00:44:03.120
that nothing is true.
link |
00:44:05.000
And as a thought experiment,
link |
00:44:07.020
I've been thinking about if it's possible
link |
00:44:09.320
that in our future,
link |
00:44:11.200
like the idea of truth is something we won't even have.
link |
00:44:14.360
Do you think it's possible in the future
link |
00:44:17.800
that everything is on the table in terms of truth,
link |
00:44:20.980
and we're just swimming in this kind of digital economy
link |
00:44:24.720
where ideas are just little toys
link |
00:44:29.720
that are not at all connected to reality?
link |
00:44:33.080
Yeah, I think that's definitely possible.
link |
00:44:35.760
I'm not a technological determinist,
link |
00:44:37.960
so I don't think that's inevitable.
link |
00:44:40.280
I don't think it's inevitable that it doesn't happen.
link |
00:44:42.300
I mean, the thing that I've come away with
link |
00:44:43.960
every time I do these studies,
link |
00:44:45.320
and I emphasize it in my books and elsewhere,
link |
00:44:47.200
is that technology doesn't shape our destiny,
link |
00:44:50.040
we shape our destiny.
link |
00:44:51.680
So just by us having this conversation,
link |
00:44:54.640
I hope that your audience is gonna take it upon themselves
link |
00:44:58.440
as they design their products,
link |
00:44:59.880
and they think about, they use products,
link |
00:45:01.280
as they manage companies,
link |
00:45:02.760
how can they make conscious decisions
link |
00:45:05.300
to favor truth over falsehoods,
link |
00:45:08.840
favor the better kinds of societies,
link |
00:45:10.880
and not abdicate and say, well, we just build the tools.
link |
00:45:13.720
I think there was a saying that,
link |
00:45:16.940
was it the German scientist
link |
00:45:18.300
when they were working on the missiles in late World War II?
link |
00:45:23.000
They said, well, our job is to make the missiles go up.
link |
00:45:25.680
Where they come down, that's someone else's department.
link |
00:45:28.400
And I think it's obvious
link |
00:45:31.840
that's not the right attitude
link |
00:45:32.840
that technologists should have,
link |
00:45:33.980
that engineers should have.
link |
00:45:35.680
They should be very conscious
link |
00:45:36.920
about what the implications are.
link |
00:45:38.800
And if we think carefully about it,
link |
00:45:40.600
we can avoid the kind of world that you just described,
link |
00:45:42.920
where truth is all relative.
link |
00:45:45.040
There are going to be people who benefit from a world
link |
00:45:47.840
where people don't check facts,
link |
00:45:51.320
and where truth is relative,
link |
00:45:52.680
and popularity or fame or money is orthogonal to truth.
link |
00:45:59.880
But one of the reasons I suspect
link |
00:46:01.880
that we've had so much progress over the past few hundred
link |
00:46:04.540
years is the invention of the scientific method,
link |
00:46:07.600
which is a really powerful tool or meta tool
link |
00:46:10.200
for finding truth and favoring things that are true
link |
00:46:15.400
versus things that are false.
link |
00:46:16.600
If they don't pass the scientific method,
link |
00:46:18.560
they're less likely to be true.
link |
00:46:20.640
And the societies and the people
link |
00:46:25.560
and the organizations that embrace that
link |
00:46:27.760
have done a lot better than the ones who haven't.
link |
00:46:30.520
And so I'm hoping that people keep that in mind
link |
00:46:32.800
and continue to try to embrace not just the truth,
link |
00:46:35.460
but methods that lead to the truth.
link |
00:46:37.640
So maybe on a more personal question,
link |
00:46:41.400
if one were to try to build a competitor to Twitter,
link |
00:46:45.480
what would you advise?
link |
00:46:47.360
Is there, I mean, the bigger, the meta question,
link |
00:46:53.360
is that the right way to improve systems?
link |
00:46:55.680
Yeah, no, I think that the underlying premise
link |
00:46:59.380
behind Twitter and all these networks is amazing,
link |
00:47:01.380
that we can communicate with each other.
link |
00:47:02.800
And I use it a lot.
link |
00:47:04.000
There's a subpart of Twitter called Econ Twitter,
link |
00:47:05.920
where we economists tweet to each other
link |
00:47:08.640
and talk about new papers.
link |
00:47:10.560
Something came out in the NBER,
link |
00:47:11.960
the National Bureau of Economic Research,
link |
00:47:13.320
and we share about it.
link |
00:47:14.160
People critique it.
link |
00:47:15.360
I think it's been a godsend
link |
00:47:16.880
because it's really sped up the scientific process,
link |
00:47:20.040
if you can call economics scientific.
link |
00:47:21.880
Does it get divisive in that little?
link |
00:47:23.560
Sometimes, yeah, sure.
link |
00:47:24.500
Sometimes it does.
link |
00:47:25.340
It can also be done in nasty ways and there's the bad parts.
link |
00:47:28.360
But the good parts are great
link |
00:47:29.680
because you just speed up that clock speed
link |
00:47:31.640
of learning about things.
link |
00:47:33.320
Instead of like in the old, old days,
link |
00:47:35.480
waiting to read it in a journal,
link |
00:47:36.800
or the not so old days when you'd see it posted
link |
00:47:39.520
on a website and you'd read it.
link |
00:47:41.600
Now on Twitter, people will distill it down
link |
00:47:44.000
and it's a real art to getting to the essence of things.
link |
00:47:47.160
So that's been great.
link |
00:47:49.080
But it certainly, we all know that Twitter
link |
00:47:52.320
can be a cesspool of misinformation.
link |
00:47:55.560
And like I just said,
link |
00:47:57.360
unfortunately misinformation tends to spread faster
link |
00:48:00.240
on Twitter than truth.
link |
00:48:02.320
And there are a lot of people
link |
00:48:03.160
who are very vulnerable to it.
link |
00:48:04.200
I'm sure I've been fooled at times.
link |
00:48:06.000
There are agents, whether from Russia
link |
00:48:09.120
or from political groups or others
link |
00:48:11.680
that explicitly create efforts at misinformation
link |
00:48:15.640
and efforts at getting people to hate each other.
link |
00:48:17.900
Or even more important, lately I've discovered,
link |
00:48:19.720
is nut picking.
link |
00:48:21.200
You know the idea of nut picking?
link |
00:48:22.320
No, what's that?
link |
00:48:23.160
It's a good term.
link |
00:48:24.320
Nut picking is when you find like an extreme nut case
link |
00:48:27.800
on the other side and then you amplify them
link |
00:48:30.700
and make it seem like that's typical of the other side.
link |
00:48:34.000
So you're not literally lying.
link |
00:48:35.480
You're taking some idiot, you know,
link |
00:48:37.760
ranting on the subway or just, you know,
link |
00:48:39.920
whether they're in the KKK or Antifa or whatever,
link |
00:48:42.800
and, you know,
link |
00:48:44.360
normally nobody would pay attention to this guy.
link |
00:48:46.040
Like 12 people would see him and it'd be the end.
link |
00:48:48.080
Instead with video or whatever,
link |
00:48:51.120
you get tens of millions of people seeing it.
link |
00:48:54.520
And I've seen this, you know, I look at it,
link |
00:48:56.320
I'm like, I get angry.
link |
00:48:57.160
I'm like, I can't believe that person
link |
00:48:58.280
did something so terrible.
link |
00:48:59.720
Let me tell all my friends about this terrible person.
link |
00:49:02.880
And it's a great way to generate division.
link |
00:49:06.640
I talked to a friend who studied Russian misinformation
link |
00:49:10.520
campaigns, and they're very clever about literally
link |
00:49:13.840
being on both sides of some of these debates.
link |
00:49:15.880
They would have some people pretend to be part of BLM.
link |
00:49:18.620
Some people pretend to be white nationalists
link |
00:49:21.040
and they would be throwing epithets at each other,
link |
00:49:22.960
saying crazy things at each other.
link |
00:49:25.100
And they're literally playing both sides of it,
link |
00:49:26.600
but their goal wasn't for one or the other to win.
link |
00:49:28.600
It was for everybody to end up hating
link |
00:49:30.120
and distrusting everyone else.
link |
00:49:32.000
So these tools can definitely be used for that.
link |
00:49:34.500
And they are being used for that.
link |
00:49:36.580
It's been super destructive for our democracy
link |
00:49:39.680
and our society.
link |
00:49:41.080
And the people who run these platforms,
link |
00:49:43.540
I think have a social responsibility,
link |
00:49:46.100
a moral and ethical, personal responsibility
link |
00:49:48.680
to do a better job and to shut that stuff down.
link |
00:49:51.800
Well, I don't know if you can shut it down,
link |
00:49:52.960
but to design them in a way that, you know,
link |
00:49:55.800
as I said earlier, favors truth over falsehoods
link |
00:49:58.620
and favors positive types of
link |
00:50:03.200
communication versus destructive ones.
link |
00:50:06.060
And just like you said, it's also on us.
link |
00:50:09.600
I try to be all about love and compassion,
link |
00:50:12.400
empathy on Twitter.
link |
00:50:13.280
I mean, one of the things,
link |
00:50:14.820
nut picking is a fascinating term.
link |
00:50:16.600
One of the things that people do,
link |
00:50:18.940
that I think is even more dangerous,
link |
00:50:21.800
is nut picking applied to individual statements
link |
00:50:26.760
of good people.
link |
00:50:28.440
So basically, like worst case analysis in computer science,
link |
00:50:32.180
is taking sometimes out of context,
link |
00:50:35.360
but sometimes in context,
link |
00:50:38.480
a statement, one statement by a person,
link |
00:50:42.320
like I've been, because I've been reading
link |
00:50:43.740
The Rise and Fall of the Third Reich,
link |
00:50:45.360
I often talk about Hitler on this podcast with folks
link |
00:50:48.960
and it is so easy.
link |
00:50:50.640
That's really dangerous.
link |
00:50:52.060
But I'm all leaning in, I'm 100%.
link |
00:50:54.560
Because, well, it's actually a safer place
link |
00:50:56.960
than people realize because it's history
link |
00:50:59.200
and history in long form is actually very fascinating
link |
00:51:04.120
to think about and it's,
link |
00:51:06.300
but I could see how that could be taken
link |
00:51:09.600
totally out of context and it's very worrying.
link |
00:51:11.320
You know, these digital infrastructures,
link |
00:51:12.800
not just they disseminate things,
link |
00:51:14.040
but they're sort of permanent.
link |
00:51:14.880
So anything you say at some point,
link |
00:51:16.540
someone can go back and find something you said
link |
00:51:18.160
three years ago, perhaps jokingly, perhaps not,
link |
00:51:21.080
maybe you're just wrong and you made a mistake, you know,
link |
00:51:22.800
and like that becomes, they can use that to define you
link |
00:51:25.600
if they have ill intent.
link |
00:51:26.840
And we all need to be a little more forgiving.
link |
00:51:29.080
I mean, somewhere in my 20s, I told myself,
link |
00:51:32.240
I was going through all my different friends
link |
00:51:33.820
and I was like, you know, every one of them
link |
00:51:37.300
has at least like one nutty opinion.
link |
00:51:39.400
And I was like, there's like nobody
link |
00:51:42.040
who's like completely, except me, of course,
link |
00:51:44.160
but I'm sure they thought that about me too.
link |
00:51:45.700
And so you just kind of like learned
link |
00:51:47.760
to be a little bit tolerant that like, okay,
link |
00:51:49.420
there's just, you know.
link |
00:51:51.140
Yeah, I wonder where the responsibility lies there.
link |
00:51:55.240
Like, I think ultimately it's about leadership.
link |
00:51:59.680
Like the previous president, Barack Obama,
link |
00:52:02.760
has been, I think, quite eloquent
link |
00:52:06.040
at walking this very difficult line
link |
00:52:07.680
of talking about cancel culture, but it's a difficult,
link |
00:52:10.640
it takes skill.
link |
00:52:12.160
Because you say the wrong thing
link |
00:52:13.800
and you piss off a lot of people.
link |
00:52:15.320
And so you have to do it well.
link |
00:52:17.440
But then also the platform, the technology,
link |
00:52:21.220
should slow down, create friction,
link |
00:52:23.600
in spreading this kind of nut picking in all its forms.
link |
00:52:26.440
Absolutely.
link |
00:52:27.280
No, and your point that we have to like learn over time,
link |
00:52:29.780
how to manage it.
link |
00:52:30.620
I mean, we can't put it all on the platform
link |
00:52:31.800
and say, you guys design it.
link |
00:52:33.240
Because if we're idiots about using it,
link |
00:52:35.200
nobody can design a platform that withstands that.
link |
00:52:38.480
And every new technology people learn its dangers.
link |
00:52:41.720
You know, when someone invented fire,
link |
00:52:43.960
it's great cooking and everything,
link |
00:52:44.960
but then somebody burned themself.
link |
00:52:46.160
And then you had to learn how to avoid that, and
link |
00:52:48.200
maybe somebody invented a fire extinguisher later.
link |
00:52:50.640
So you kind of like figure out ways
link |
00:52:52.840
of working around these technologies.
link |
00:52:54.640
Someone invented seat belts, et cetera.
link |
00:52:57.440
And that's certainly true
link |
00:52:58.640
with all the new digital technologies
link |
00:53:00.620
that we have to figure out,
link |
00:53:02.320
not just technologies that protect us,
link |
00:53:05.280
but ways of using them
link |
00:53:08.640
that are more likely to be successful than dangerous.
link |
00:53:11.520
So you've written quite a bit
link |
00:53:12.560
about how artificial intelligence might change our world.
link |
00:53:19.000
How do you think if we look forward,
link |
00:53:21.240
again, it's impossible to predict the future,
link |
00:53:23.200
but if we look at trends from the past
link |
00:53:26.440
and we tried to predict what's gonna happen
link |
00:53:28.200
in the rest of the 21st century,
link |
00:53:29.720
how do you think AI will change our world?
link |
00:53:33.080
That's a big question.
link |
00:53:34.200
You know, I'm mostly a techno optimist.
link |
00:53:37.440
I'm not at the extreme, you know,
link |
00:53:38.660
the "singularity is near" end of the spectrum,
link |
00:53:41.080
but I do think that we're likely in
link |
00:53:44.560
for some significantly improved living standards,
link |
00:53:47.480
some really important progress,
link |
00:53:49.260
even just the technologies that are already kind of like
link |
00:53:51.240
in the can that haven't diffused.
link |
00:53:53.080
You know, when I talked earlier about the J curve,
link |
00:53:54.880
it could take 10, 20, 30 years for an existing technology
link |
00:53:58.760
to have those kinds of profound effects.
link |
00:54:00.780
And when I look at whether it's, you know,
link |
00:54:03.760
vision systems, voice recognition, problem solving systems,
link |
00:54:07.840
even if nothing new got invented,
link |
00:54:09.400
we would have a few decades of progress.
link |
00:54:11.800
So I'm excited about that.
link |
00:54:13.440
And I think that's gonna lead to us being wealthier,
link |
00:54:16.840
healthier, I mean,
link |
00:54:17.800
the healthcare is probably one of the applications
link |
00:54:19.520
that I'm most excited about.
link |
00:54:22.520
So that's good news.
link |
00:54:23.760
I don't think we're gonna have the end of work anytime soon.
link |
00:54:26.760
There's just too many things that machines still can't do.
link |
00:54:30.960
When I look around the world
link |
00:54:32.000
and think of whether it's childcare or healthcare,
link |
00:54:34.640
cleaning the environment, interacting with people,
link |
00:54:37.740
scientific work, artistic creativity,
link |
00:54:40.900
these are things that for now,
link |
00:54:42.560
machines aren't able to do nearly as well as humans,
link |
00:54:45.640
even just something as mundane as, you know,
link |
00:54:47.160
folding laundry or whatever.
link |
00:54:48.720
And many of these, I think are gonna be years or decades
link |
00:54:52.920
before machines catch up.
link |
00:54:54.720
You know, I may be surprised on some of them,
link |
00:54:56.120
but overall, I think there's plenty of work
link |
00:54:58.760
for humans to do.
link |
00:54:59.760
There's plenty of problems in society
link |
00:55:01.320
that need the human touch.
link |
00:55:02.560
So we'll have to repurpose.
link |
00:55:04.180
We'll have to, as machines are able to do some tasks,
link |
00:55:07.880
people are gonna have to reskill and move into other areas.
link |
00:55:11.040
And that's probably what's gonna be going on
link |
00:55:12.740
for the next, you know, 10, 20, 30 years or more,
link |
00:55:16.240
kind of big restructuring of society.
link |
00:55:18.920
We'll get wealthier and people will have to do new skills.
link |
00:55:22.420
Now, if you turn the dial further, I don't know,
link |
00:55:24.360
50 or a hundred years into the future,
link |
00:55:26.960
then, you know, maybe all bets are off.
link |
00:55:29.640
Then it's possible that machines will be able to do
link |
00:55:32.880
most of what people do.
link |
00:55:34.240
You know, say 100 or 200 years, I think it's even likely.
link |
00:55:37.360
And at that point,
link |
00:55:38.400
then we're more in the sort of abundance economy.
link |
00:55:41.040
Then we're in a world where there's really little
link |
00:55:44.040
that humans can do economically better than machines,
link |
00:55:48.000
other than be human.
link |
00:55:49.900
And, you know, that will take a transition as well,
link |
00:55:53.640
kind of more of a transition of how we get meaning in life
link |
00:55:56.480
and what our values are.
link |
00:55:58.220
But shame on us if we screw that up.
link |
00:56:00.400
I mean, that should be like great, great news.
link |
00:56:02.720
And it kind of saddens me that some people see that
link |
00:56:04.520
as like a big problem.
link |
00:56:05.540
I think that would be, should be wonderful
link |
00:56:07.640
if people have all the health and material things
link |
00:56:10.420
that they need and can focus on loving each other
link |
00:56:14.180
and discussing philosophy and playing
link |
00:56:16.840
and doing all the other things that don't require work.
link |
00:56:19.440
Do you think you'd be surprised to see,
link |
00:56:23.960
if we were to travel in time, 100 years into the future,
link |
00:56:27.420
do you think you'll be able to,
link |
00:56:29.560
like if I gave you a month to like talk to people,
link |
00:56:32.300
no, like let's say a week,
link |
00:56:34.120
would you be able to understand what the hell's going on?
link |
00:56:37.800
You mean if I was there for a week?
link |
00:56:39.200
Yeah, if you were there for a week.
link |
00:56:40.840
A hundred years in the future?
link |
00:56:42.120
Yeah.
link |
00:56:43.000
So like, so I'll give you one thought experiment is like,
link |
00:56:46.600
isn't it possible that we're all living in virtual reality
link |
00:56:49.640
by then?
link |
00:56:50.480
Yeah, no, I think that's very possible.
link |
00:56:52.620
I've played around with some of those VR headsets
link |
00:56:54.640
and they're not great,
link |
00:56:55.480
but I mean the average person spends many waking hours
link |
00:57:00.960
staring at screens right now.
link |
00:57:03.320
They're kind of low res compared to what they could be
link |
00:57:05.720
in 30 or 50 years, but certainly games
link |
00:57:10.680
and why not any other interactions could be done with VR?
link |
00:57:15.360
And that would be a pretty different world
link |
00:57:16.320
and we'd all, in some ways be as rich as we wanted.
link |
00:57:19.520
We could have castles and we could be traveling
link |
00:57:21.360
anywhere we want and it could obviously be multisensory.
link |
00:57:25.960
So that would be possible and of course there's people,
link |
00:57:30.880
you've had Elon Musk on and others, there are people,
link |
00:57:33.360
Nick Bostrom makes the simulation argument
link |
00:57:35.380
that maybe we're already there.
link |
00:57:36.760
We're already there.
link |
00:57:37.720
So, but in general, or do you not even think about
link |
00:57:41.200
in this kind of way, you're self critically thinking,
link |
00:57:45.080
how good are you as an economist at predicting
link |
00:57:48.560
what the future looks like?
link |
00:57:50.340
Do you have a?
link |
00:57:51.180
Well, it starts getting harder, I mean,
link |
00:57:52.000
I feel reasonably comfortable about the next five, 10, 20 years
link |
00:57:55.960
in terms of that path.
link |
00:57:58.720
When you start getting truly superhuman
link |
00:58:01.720
artificial intelligence, kind of by definition,
link |
00:58:06.000
it'll be able to think of a lot of things
link |
00:58:07.040
that I couldn't have thought of and create a world
link |
00:58:09.080
that I couldn't even imagine.
link |
00:58:10.960
And so I'm not sure I can predict what that world
link |
00:58:15.240
is going to be like.
link |
00:58:16.520
One thing that AI researchers, AI safety researchers
link |
00:58:19.840
worry about is what's called the alignment problem.
link |
00:58:22.540
When an AI is that powerful,
link |
00:58:25.080
then they can do all sorts of things.
link |
00:58:27.960
And you really hope that their values
link |
00:58:30.560
are aligned with our values.
link |
00:58:32.440
And it's even tricky defining what our values are.
link |
00:58:34.480
I mean, first off, we all have different values.
link |
00:58:37.220
And secondly, maybe if we were smarter,
link |
00:58:40.440
we would have better values.
link |
00:58:41.620
Like, I like to think that we have better values
link |
00:58:44.200
than we did in 1860, or in the year 200 BC
link |
00:58:50.320
on a lot of dimensions,
link |
00:58:51.360
things that we consider barbaric today.
link |
00:58:53.440
And it may be that if I thought about it more deeply,
link |
00:58:56.080
I would also be more morally evolved.
link |
00:58:57.400
Maybe I'd be a vegetarian or do other things
link |
00:59:00.120
where what I do right now, my future self
link |
00:59:02.980
would consider kind of immoral.
link |
00:59:05.240
So that's a tricky problem,
link |
00:59:07.740
getting the AI to do what we want,
link |
00:59:11.120
assuming it's even a friendly AI.
link |
00:59:12.960
I mean, I should probably mention
link |
00:59:14.780
there's a nontrivial other branch
link |
00:59:17.100
where we destroy ourselves, right?
link |
00:59:18.720
I mean, there's a lot of exponentially improving
link |
00:59:22.040
technologies that could be ferociously destructive,
link |
00:59:26.640
whether it's in nanotechnology or biotech
link |
00:59:29.480
and weaponized viruses, AI and other things.
link |
00:59:34.280
Nuclear weapons.
link |
00:59:35.120
Nuclear weapons, of course.
link |
00:59:36.240
The old school technology.
link |
00:59:37.320
Yeah, good old nuclear weapons that could be devastating
link |
00:59:42.040
or even existential and new things yet to be invented.
link |
00:59:45.240
So that's a branch that I think is pretty significant.
link |
00:59:52.200
And there are those who think that one of the reasons
link |
00:59:54.260
we haven't been contacted by other civilizations, right?
link |
00:59:57.480
Is that once you get to a certain level of complexity
link |
01:00:01.560
in technology, there's just too many ways to go wrong.
link |
01:00:04.640
There's a lot of ways to blow yourself up.
link |
01:00:06.200
And people, or I should say species,
link |
01:00:09.640
end up falling into one of those traps.
link |
01:00:12.520
The great filter.
link |
01:00:13.580
The great filter.
link |
01:00:14.960
I mean, there's an optimistic view of that.
link |
01:00:16.720
If there is literally no intelligent life out there
link |
01:00:19.380
in the universe, or at least in our galaxy,
link |
01:00:22.340
that means that we've passed at least one
link |
01:00:25.140
of the great filters or some of the great filters
link |
01:00:27.840
that we survived.
link |
01:00:30.040
Yeah, no, I think Robin Hanson has a good way of,
link |
01:00:32.240
maybe others have a good way of thinking about this,
link |
01:00:33.920
that if there are no other intelligent creatures out there
link |
01:00:38.920
that we've been able to detect,
link |
01:00:40.640
one possibility is that there's a filter ahead of us.
link |
01:00:43.440
And when you get a little more advanced,
link |
01:00:44.780
maybe in a hundred or a thousand or 10,000 years,
link |
01:00:47.600
things just get destroyed for some reason.
link |
01:00:50.560
The other one is the great filters behind us.
link |
01:00:53.000
That would be good: most planets don't even evolve life,
link |
01:00:57.700
or if they do evolve life,
link |
01:00:58.920
they don't evolve intelligent life.
link |
01:01:00.280
Maybe we've gotten past that.
link |
01:01:02.040
And so now maybe we're on the good side
link |
01:01:03.960
of the great filter.
link |
01:01:05.680
So if we sort of rewind back and look at the thing
link |
01:01:10.480
where we could say something a little bit more comfortably
link |
01:01:12.760
at five years and 10 years out,
link |
01:01:15.860
you've written about jobs
link |
01:01:20.200
and the impact on sort of our economy and the jobs
link |
01:01:24.680
in terms of artificial intelligence that it might have.
link |
01:01:28.240
It's a fascinating question of what kind of jobs are safe,
link |
01:01:30.560
what kind of jobs are not.
link |
01:01:32.520
Can you maybe speak to your intuition
link |
01:01:34.560
about how we should think about AI changing
link |
01:01:38.320
the landscape of work?
link |
01:01:39.940
Sure, absolutely.
link |
01:01:40.880
Well, this is a really important question
link |
01:01:42.600
because I think we're very far
link |
01:01:43.900
from artificial general intelligence,
link |
01:01:45.720
which is AI that can just do the full breadth
link |
01:01:48.120
of what humans can do.
link |
01:01:49.520
But we do have human level or superhuman level
link |
01:01:52.980
narrow intelligence, narrow artificial intelligence.
link |
01:01:56.800
And obviously my calculator can do math a lot better
link |
01:01:59.880
than I can.
link |
01:02:00.720
And there's a lot of other things
link |
01:02:01.560
that machines can do better than I can.
link |
01:02:03.160
So which is which?
link |
01:02:04.440
We actually set out to address that question
link |
01:02:06.860
with Tom Mitchell.
link |
01:02:08.160
We wrote a paper called "What Can Machine Learning Do?"
link |
01:02:12.160
that was in Science.
link |
01:02:13.440
And we went and interviewed a whole bunch of AI experts
link |
01:02:16.840
and kind of synthesized what they thought machine learning
link |
01:02:20.440
was good at and wasn't good at.
link |
01:02:22.220
And we came up with what we called a rubric,
link |
01:02:25.540
basically a set of questions you can ask about any task
link |
01:02:28.160
that will tell you whether it's likely to score high or low
link |
01:02:30.960
on suitability for machine learning.
link |
01:02:33.720
And then we've applied that
link |
01:02:34.760
to a bunch of tasks in the economy.
link |
01:02:36.940
In fact, there's a data set of all the tasks
link |
01:02:39.080
in the US economy, believe it or not, it's called O*NET.
link |
01:02:41.600
The US government put it together,
link |
01:02:43.120
part of the Bureau of Labor Statistics.
link |
01:02:45.000
They divide the economy into about 970 occupations
link |
01:02:48.680
like bus driver, economist, primary school teacher,
link |
01:02:52.140
radiologist, and then for each one of them,
link |
01:02:54.800
they describe which tasks need to be done.
link |
01:02:57.580
Like for radiologists, there are 27 distinct tasks.
link |
01:03:00.720
So we went through all those tasks
link |
01:03:02.160
to see whether or not a machine could do them.
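A sketch of how such rubric-based scoring could look (the questions, scores, and aggregation below are invented for illustration, not the actual rubric from the paper): each task gets scored on a few criteria, and an occupation's suitability-for-machine-learning (SML) profile is just the collection of its task scores.
```python
from statistics import mean

# Hypothetical rubric questions, scored 0 (bad fit for ML) to 1 (good fit).
# Each score list below lines up with these questions, in order.
RUBRIC = [
    "maps well-defined inputs to well-defined outputs",
    "large digital datasets of input-output pairs exist",
    "no dexterous physical manipulation required",
    "tolerant of occasional errors",
]

# Invented per-question scores for a few tasks (occupation -> task -> scores).
OCCUPATIONS = {
    "radiologist": {
        "read medical images":           [1.0, 1.0, 1.0, 0.3],
        "administer conscious sedation": [0.2, 0.1, 0.0, 0.0],
        "explain results to patients":   [0.3, 0.2, 1.0, 0.2],
    },
    "bus driver": {
        "drive fixed route":             [0.8, 0.7, 0.4, 0.1],
        "assist disabled passengers":    [0.2, 0.1, 0.0, 0.1],
    },
}

for occupation, tasks in OCCUPATIONS.items():
    task_scores = {task: mean(scores) for task, scores in tasks.items()}
    print(f"{occupation}: occupation-level SML = {mean(task_scores.values()):.2f}")
    for task, score in sorted(task_scores.items(), key=lambda kv: -kv[1]):
        print(f"  {score:.2f}  {task}")
```
Even in this toy version the pattern described next shows up: no occupation scores high on every task, and almost none scores low on every task.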
link |
01:03:04.960
And what we found interestingly was...
link |
01:03:06.680
Brilliant study by the way, that's so awesome.
link |
01:03:08.880
Yeah, thank you.
link |
01:03:10.240
So what we found was that there was no occupation
link |
01:03:13.760
in our data set where machine learning just ran the table
link |
01:03:16.240
and did everything.
link |
01:03:17.520
And there was almost no occupation
link |
01:03:18.980
where machine learning didn't have
link |
01:03:19.900
like a significant ability to do things.
link |
01:03:22.120
Like take radiology, a lot of people I hear saying,
link |
01:03:24.360
you know, it's the end of radiology.
link |
01:03:26.680
And one of the 27 tasks is read medical images.
link |
01:03:29.880
Really important one, like it's kind of a core job.
link |
01:03:31.960
And machines have basically gotten as good
link |
01:03:34.640
or better than radiologists.
link |
01:03:35.880
There was just an article in Nature last week,
link |
01:03:38.360
but they've been publishing them for the past few years
link |
01:03:42.440
showing that machine learning can do as well as humans
link |
01:03:46.480
on many kinds of diagnostic imaging tasks.
link |
01:03:49.600
But other things that radiologists do,
link |
01:03:51.120
they sometimes administer conscious sedation.
link |
01:03:54.440
They sometimes do physical exams.
link |
01:03:55.940
They have to synthesize the results
link |
01:03:57.320
and explain it to the other doctors or to the patients.
link |
01:04:01.680
In all those categories,
link |
01:04:02.520
machine learning isn't really up to snuff yet.
link |
01:04:05.560
So that job, we're gonna see a lot of restructuring.
link |
01:04:09.300
Parts of the job, they'll hand over to machines.
link |
01:04:11.400
Others, humans will do more of.
link |
01:04:13.160
That's been more or less the pattern in all of them.
link |
01:04:15.080
So, you know, to oversimplify a bit,
link |
01:04:17.080
we're gonna see a lot of restructuring,
link |
01:04:19.080
reorganization of work.
link |
01:04:20.400
And it's really gonna be a great time.
link |
01:04:22.300
It is a great time for smart entrepreneurs and managers
link |
01:04:24.720
to do that reinvention of work.
link |
01:04:27.280
We're not gonna see mass unemployment.
link |
01:04:30.600
To get more specifically to your question,
link |
01:04:33.120
the kinds of tasks that machines tend to be good at
link |
01:04:36.560
are a lot of routine problem solving,
link |
01:04:39.040
mapping inputs X into outputs Y.
link |
01:04:42.560
If you have a lot of data on the Xs and the Ys,
link |
01:04:44.840
the inputs and the outputs,
link |
01:04:45.680
you can do that kind of mapping and find the relationships.
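A minimal illustration of that input-output mapping: a one-variable least-squares fit in plain Python, standing in for what large-scale machine learning does with far richer inputs (the data points are made up).
```python
xs = [1.0, 2.0, 3.0, 4.0, 5.0]      # inputs X (think: features of past cases)
ys = [2.1, 3.9, 6.2, 7.8, 10.1]     # observed outputs Y (the recorded outcomes)

# Fit y ~ slope * x + intercept by ordinary least squares.
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"learned mapping: y ~ {slope:.2f} * x + {intercept:.2f}")
print(f"prediction for a new input x = 6: {slope * 6 + intercept:.2f}")
```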
link |
01:04:48.520
They tend to not be very good at,
link |
01:04:50.660
even now, fine motor control and dexterity.
link |
01:04:53.680
Emotional intelligence and human interactions
link |
01:04:58.960
and thinking outside the box, creative work.
link |
01:05:01.700
If you give it a well structured task,
link |
01:05:03.220
machines can be very good at it.
link |
01:05:05.040
But even asking the right questions, that's hard.
link |
01:05:08.680
There's a quote that Andrew McAfee and I use
link |
01:05:10.680
in our book, Second Machine Age.
link |
01:05:12.980
Apparently Pablo Picasso was shown an early computer
link |
01:05:16.840
and he came away kind of unimpressed.
link |
01:05:18.460
He goes, well, I don't see what all the fuss is about.
link |
01:05:20.660
All that does is answer questions.
link |
01:05:23.900
And to him, the interesting thing was asking the questions.
link |
01:05:26.740
Yeah, try to replace me, GPT-3, I dare you.
link |
01:05:31.260
Although some people think I'm a robot.
link |
01:05:33.160
You have this cool plot that shows,
link |
01:05:37.020
I just remember where economists land,
link |
01:05:39.640
where I think the X axis is the income.
link |
01:05:43.380
And then the Y axis is, I guess,
link |
01:05:46.220
aggregating the information of how replaceable the job is.
link |
01:05:49.380
Or I think there's an index.
link |
01:05:50.780
There's a suitability for machine learning index.
link |
01:05:51.620
Exactly.
link |
01:05:52.460
So we have all 970 occupations on that chart.
link |
01:05:55.300
It's a cool plot.
link |
01:05:56.500
And there's scatters in all four corners
link |
01:05:59.200
have some occupations.
link |
01:06:01.040
But there is a definite pattern,
link |
01:06:02.700
which is the lower wage occupations tend to have more tasks
link |
01:06:05.660
that are suitable for machine learning, like cashiers.
link |
01:06:07.960
I mean, anyone who's gone to a supermarket or CVS
link |
01:06:10.400
knows that they not only read barcodes,
link |
01:06:12.380
but they can recognize an apple and an orange
link |
01:06:14.520
and a lot of things that human cashiers used to be needed for.
link |
01:06:19.520
At the other end of the spectrum,
link |
01:06:21.020
there are some jobs like airline pilot
link |
01:06:23.580
that are among the highest paid in our economy,
link |
01:06:26.640
but also a lot of them are suitable for machine learning.
link |
01:06:28.780
A lot of those tasks are.
link |
01:06:30.940
And then, yeah, you mentioned economists.
link |
01:06:32.500
I couldn't help peeking at those
link |
01:06:33.820
and they're paid a fair amount,
link |
01:06:36.100
maybe not as much as some of us think they should be.
link |
01:06:39.120
But they have some tasks that are suitable
link |
01:06:43.620
for machine learning, but for now at least,
link |
01:06:45.540
most of the tasks of economists
link |
01:06:47.180
didn't end up being in that category.
link |
01:06:48.540
And I should say, I didn't like create that data.
link |
01:06:50.640
We just took the analysis and that's what came out of it.
link |
01:06:54.480
And over time, that scatter plot will be updated
link |
01:06:57.320
as the technology improves.
link |
01:06:59.940
But it was just interesting to see the pattern there.
link |
01:07:02.860
And it is a little troubling in so far
link |
01:07:05.140
as if you just take the technology as it is today,
link |
01:07:08.100
it's likely to worsen income inequality
link |
01:07:10.520
on a lot of dimensions.
link |
01:07:12.260
So on this topic of the effect of AI
link |
01:07:16.480
on our landscape of work,
link |
01:07:21.060
one of the people that have been speaking about it
link |
01:07:23.660
in the public domain, public discourse
link |
01:07:25.800
is the presidential candidate, Andrew Yang.
link |
01:07:28.100
Yeah.
link |
01:07:29.040
What are your thoughts about Andrew?
link |
01:07:31.900
What are your thoughts about UBI,
link |
01:07:34.340
that universal basic income
link |
01:07:36.700
that he made one of the core ideas,
link |
01:07:39.100
by the way, he has like hundreds of ideas
link |
01:07:40.780
about like everything, it's kind of interesting.
link |
01:07:44.020
But what are your thoughts about him
link |
01:07:45.380
and what are your thoughts about UBI?
link |
01:07:46.740
Let me answer the question about his broader approach first.
link |
01:07:52.060
I mean, I just love that.
link |
01:07:52.900
He's really thoughtful, analytical.
link |
01:07:56.460
I agree with his values.
link |
01:07:58.220
So that's awesome.
link |
01:07:59.420
And he read my book and mentions it sometimes,
link |
01:08:02.220
so it makes me even more excited.
link |
01:08:04.820
And the thing that he really made the centerpiece
link |
01:08:07.660
of his campaign was UBI.
link |
01:08:09.940
And I was originally kind of a fan of it.
link |
01:08:13.260
And then as I studied it more, I became less of a fan,
link |
01:08:15.980
although I'm beginning to come back a little bit.
link |
01:08:17.420
So let me tell you a little bit of my evolution.
link |
01:08:19.300
As economists, we look at the problem
link |
01:08:23.060
of people not having enough income and the simplest thing
link |
01:08:25.180
is, well, why don't we write them a check?
link |
01:08:26.860
Problem solved.
link |
01:08:28.040
But then I talked to my sociologist friends
link |
01:08:30.460
and they really convinced me that just writing a check
link |
01:08:34.420
doesn't really get at the core values.
link |
01:08:36.940
Voltaire once said that work solves three great ills,
link |
01:08:40.660
boredom, vice, and need.
link |
01:08:43.380
And you can deal with the need thing by writing a check,
link |
01:08:46.680
but people need a sense of meaning,
link |
01:08:49.300
they need something to do.
link |
01:08:50.820
And when, say, steel workers or coal miners lost their jobs
link |
01:08:57.980
and were just given checks, alcoholism, depression, divorce,
link |
01:09:03.820
all those social indicators, drug use, all went way up.
link |
01:09:06.540
People just weren't happy
link |
01:09:08.020
just sitting around collecting a check.
link |
01:09:11.380
Maybe it's part of the way they were raised.
link |
01:09:13.220
Maybe it's something innate in people
link |
01:09:14.740
that they need to feel wanted and needed.
link |
01:09:17.220
So it's not as simple as just writing people a check.
link |
01:09:19.540
You need to also give them a way to have a sense of purpose.
link |
01:09:23.980
And that was important to me.
link |
01:09:25.380
And the second thing is that, as I mentioned earlier,
link |
01:09:28.740
we are far from the end of work.
link |
01:09:31.160
I don't buy the idea that there's just like
link |
01:09:32.800
not enough work to be done.
link |
01:09:34.140
I see like our cities need to be cleaned up.
link |
01:09:37.100
And robots can't do most of that.
link |
01:09:39.580
We need to have better childcare.
link |
01:09:40.780
We need better healthcare.
link |
01:09:41.640
We need to take care of people who are mentally ill or older.
link |
01:09:44.940
We need to repair our roads.
link |
01:09:46.500
There's so much work that requires at least partly,
link |
01:09:49.940
maybe entirely a human component.
link |
01:09:52.300
So rather than like write all these people off,
link |
01:09:54.660
let's find a way to repurpose them and keep them engaged.
link |
01:09:58.240
Now that said, I would like to see more buying power
link |
01:10:04.640
from people who are sort of at the bottom end
link |
01:10:06.400
of the spectrum.
link |
01:10:07.320
The economy has been designed and evolved in a way
link |
01:10:12.540
that's I think very unfair to a lot of hardworking people.
link |
01:10:15.600
I see super hardworking people who aren't really seeing
link |
01:10:18.100
their wages grow over the past 20, 30 years,
link |
01:10:20.720
while some other people who have been super smart
link |
01:10:24.080
and or super lucky have made billions
link |
01:10:29.480
or hundreds of billions.
link |
01:10:30.920
And I don't think they need those hundreds of billions
link |
01:10:33.800
to have the right incentives to invent things.
link |
01:10:35.740
I think if you talk to almost any of them as I have,
link |
01:10:39.440
they don't think that they need an extra $10 billion
link |
01:10:42.440
to do what they're doing.
link |
01:10:43.560
Most of them probably would love to do it for only a billion
link |
01:10:48.120
or maybe for nothing.
link |
01:10:49.360
For nothing, many of them, yeah.
link |
01:10:50.800
I mean, an interesting point to make is,
link |
01:10:54.200
do we think that Bill Gates would have founded Microsoft
link |
01:10:56.640
if tax rates were 70%?
link |
01:10:58.720
Well, we know he would have because there were tax rates
link |
01:11:01.380
of 70% when he founded it.
link |
01:11:03.680
So I don't think that's as big a deterrent
link |
01:11:06.200
and we could provide more buying power to people.
link |
01:11:09.100
My own favorite tool is the Earned Income Tax Credit,
link |
01:11:12.800
which is basically a way of supplementing income
link |
01:11:16.240
of people who have jobs and giving employers
link |
01:11:18.160
an incentive to hire even more people.
link |
01:11:20.300
The minimum wage can discourage employment,
link |
01:11:22.400
but the Earned Income Tax Credit encourages employment
link |
01:11:25.160
by supplementing people's wages.
link |
01:11:27.960
If the employer can only afford to pay them $10 for a task,
link |
01:11:32.680
the rest of us kick in another five or $10
link |
01:11:35.200
and bring their wages up to 15 or 20 total.
link |
01:11:37.640
And then they have more buying power.
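A worked sketch of that arithmetic with a simplified, invented EITC-style schedule (the real credit's parameters vary by year and family size): the credit phases in with earnings, plateaus, then phases out, so it tops up low wages without discouraging work.
```python
def eitc(earnings, phase_in_rate=0.40, max_credit=6000.0,
         phase_out_start=20000.0, phase_out_rate=0.21):
    """Credit as a function of annual earnings; all parameters assumed."""
    credit = min(phase_in_rate * earnings, max_credit)   # phase-in, then plateau
    if earnings > phase_out_start:                       # then phase-out
        credit -= phase_out_rate * (earnings - phase_out_start)
    return max(credit, 0.0)

hours = 2000  # a full-time year
for wage in (7.5, 10.0, 15.0, 25.0):
    earnings = wage * hours
    credit = eitc(earnings)
    effective = (earnings + credit) / hours
    print(f"employer pays ${wage:5.2f}/hr -> worker takes home "
          f"${effective:5.2f}/hr (credit ${credit:,.0f}/yr)")
```
With these assumed numbers, an employer paying $10 an hour leaves the worker with an effective $13 an hour, which is the "rest of us kick in" mechanism in miniature.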
link |
01:11:39.360
Then entrepreneurs are thinking, how can we cater to them?
link |
01:11:42.320
How can we make products for them?
link |
01:11:44.080
And it becomes a self reinforcing system
link |
01:11:47.220
where people are better off.
link |
01:11:49.840
Andrew Ng and I had a good discussion
link |
01:11:51.840
where he suggested instead of a universal basic income,
link |
01:11:55.940
he suggested, or instead of an unconditional basic income,
link |
01:11:59.080
how about a conditional basic income
link |
01:12:00.600
where the condition is you learn some new skills,
link |
01:12:03.040
we need to reskill our workforce.
link |
01:12:05.040
So let's make it easier for people to find ways
link |
01:12:09.120
to get those skills and get rewarded for doing them.
link |
01:12:11.280
And that's kind of a neat idea as well.
link |
01:12:13.080
That's really interesting.
link |
01:12:13.900
So, I mean, one of the questions,
link |
01:12:16.160
one of the dreams of UBI is that you provide
link |
01:12:19.680
some little safety net while you retrain,
link |
01:12:24.280
while you learn a new skill.
link |
01:12:26.040
But like, I think, I guess you're speaking
link |
01:12:28.360
to the intuition that that doesn't always,
link |
01:12:31.280
like there needs to be some incentive to reskill,
link |
01:12:33.760
to train, to learn a new thing.
link |
01:12:35.280
I think it helps.
link |
01:12:36.120
I mean, there are lots of self motivated people,
link |
01:12:37.960
but there are also people that maybe need a little guidance
link |
01:12:40.600
or help and I think it's a really hard question
link |
01:12:44.960
for someone who is losing a job in one area to know
link |
01:12:48.280
what is the new area I should be learning skills in.
link |
01:12:50.600
And we could provide a much better set of tools
link |
01:12:52.600
and platforms that maps it.
link |
01:12:54.480
Okay, here's a set of skills you already have.
link |
01:12:56.400
Here's something that's in demand.
link |
01:12:58.120
Let's create a path for you to go from where you are
link |
01:13:00.440
to where you need to be.
link |
01:13:03.120
So I'm a total, how do I put it nicely about myself?
link |
01:13:07.080
I'm totally clueless about the economy.
link |
01:13:09.640
It's not totally true, but pretty good approximation.
link |
01:13:12.760
If you were to try to fix our tax system
link |
01:13:20.480
and, or maybe from another side,
link |
01:13:23.240
if there's fundamental problems in taxation
link |
01:13:26.680
or some fundamental problems about our economy,
link |
01:13:29.720
what would you try to fix?
link |
01:13:31.320
What would you try to speak to?
link |
01:13:33.440
You know, I definitely think our whole tax system,
link |
01:13:36.320
our political and economic system has gotten more
link |
01:13:40.080
and more screwed up over the past 20, 30 years.
link |
01:13:43.520
I don't think it's that hard to make headway
link |
01:13:46.520
in improving it.
link |
01:13:47.360
I don't think we need to totally reinvent stuff.
link |
01:13:49.880
A lot of it is what I've elsewhere, with Andy
link |
01:13:52.400
and others called economics 101.
link |
01:13:54.680
You know, there's just some basic principles
link |
01:13:56.400
that have worked really well in the 20th century
link |
01:14:00.640
that we sort of forgot, you know,
link |
01:14:01.880
in terms of investing in education,
link |
01:14:03.960
investing in infrastructure, welcoming immigrants,
link |
01:14:07.560
having a tax system that was more progressive and fair.
link |
01:14:13.280
At one point, tax rates on top incomes
link |
01:14:16.560
were significantly higher.
link |
01:14:18.080
And they've come down a lot to the point where
link |
01:14:19.880
in many cases they're lower now
link |
01:14:21.440
than they are for poorer people.
link |
01:14:24.760
So, and we could do things like the earned income tax credit
link |
01:14:27.960
to get a little more wonky.
link |
01:14:29.240
I'd like to see more Pigouvian taxes.
link |
01:14:31.440
What that means is you tax things that are bad
link |
01:14:35.720
instead of things that are good.
link |
01:14:36.960
So right now we tax labor, we tax capital
link |
01:14:40.640
which is unfortunate,
link |
01:14:42.200
because one of the basic principles of economics is that
link |
01:14:44.080
if you tax something, you tend to get less of it.
link |
01:14:46.400
So, you know, right now there's still work to be done
link |
01:14:48.800
and still capital to be invested in.
link |
01:14:51.220
But instead we should be taxing things like pollution
link |
01:14:54.600
and congestion.
link |
01:14:57.200
And if we did that, we would have less pollution.
link |
01:15:00.000
So a carbon tax is, you know,
link |
01:15:02.120
almost every economist would say it's a no brainer
link |
01:15:04.120
whether they're Republican or Democrat,
link |
01:15:07.560
Greg Mankiw, who was head of George Bush's
link |
01:15:09.680
Council of Economic Advisers, or Dick Schmalensee,
link |
01:15:13.000
who is another Republican economist, agree.
link |
01:15:16.080
And of course a lot of Democratic economists agree as well.
link |
01:15:21.600
If we taxed carbon,
link |
01:15:22.800
we could raise hundreds of billions of dollars.
link |
01:15:26.040
We could take that money and redistribute it
link |
01:15:28.600
through an earned income tax credit or other things
link |
01:15:31.200
so that overall our tax system would become more progressive.
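A back-of-the-envelope version of that tax-and-rebate logic, with invented households and an assumed $50-per-ton rate: because higher-income households tend to emit more, taxing carbon and returning the revenue as an equal per-household dividend leaves lower-income households ahead on net.
```python
CARBON_TAX = 50.0  # $/ton of CO2, an assumed rate

# Hypothetical households: (label, annual tons of CO2 emitted)
households = [
    ("low income",     8.0),
    ("middle income", 16.0),
    ("high income",   32.0),
]

revenue = sum(CARBON_TAX * tons for _, tons in households)
dividend = revenue / len(households)   # equal per-household rebate

print(f"total revenue ${revenue:,.0f}, dividend ${dividend:,.0f} per household")
for label, tons in households:
    net = dividend - CARBON_TAX * tons   # rebate minus tax paid
    print(f"  {label:>13}: pays ${CARBON_TAX * tons:,.0f}, nets ${net:+,.0f}")
```
In this toy version the low-income household comes out several hundred dollars ahead while the heaviest emitter pays the most, so the overall package is progressive even before choosing how else to spend the revenue.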
link |
01:15:35.280
We could tax congestion.
link |
01:15:36.960
One of the things that kills me as an economist
link |
01:15:39.040
is every time I sit in a traffic jam,
link |
01:15:41.080
I know that it's completely unnecessary.
link |
01:15:43.280
This is complete wasted time.
link |
01:15:44.840
You just visualize the cost and productivity.
link |
01:15:47.560
Exactly, because they're creating costs for me
link |
01:15:51.260
and all the people around me.
link |
01:15:52.700
And if they charged a congestion tax,
link |
01:15:54.840
they would take that same amount of money
link |
01:15:57.080
and people would, it would streamline the roads.
link |
01:15:59.720
Like when you're in Singapore, the traffic just flows
link |
01:16:01.640
because they have a congestion tax.
link |
01:16:02.640
They listened to economists.
link |
01:16:03.640
They invited me and others to go talk to them.
link |
01:16:06.480
And then I'd still be paying,
link |
01:16:09.240
I'd be paying a congestion tax instead of paying in my time,
link |
01:16:11.740
but that money would now be available for healthcare,
link |
01:16:14.240
be available for infrastructure,
link |
01:16:15.520
or be available just to give to people
link |
01:16:16.880
so they could buy food or whatever.
link |
01:16:18.660
So it saddens me,
link |
01:16:22.280
when you're sitting in a traffic jam,
link |
01:16:23.320
it's like taxing me and then taking that money
link |
01:16:25.060
and dumping it in the ocean, just like destroying it.
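The arithmetic behind that point, with assumed numbers: time burned in a jam is value destroyed outright, while a toll of the same size is revenue transferred, not destroyed.
```python
COMMUTERS = 100_000             # drivers in a congested corridor (assumed)
MINUTES_LOST_PER_DAY = 20       # delay per driver per day (assumed)
VALUE_OF_TIME = 25.0 / 60       # $/minute, assuming time is worth $25/hour

daily_time_cost = COMMUTERS * MINUTES_LOST_PER_DAY * VALUE_OF_TIME
print(f"congestion: ${daily_time_cost:,.0f}/day destroyed as wasted time")

# The same per-driver burden collected as a toll: drivers pay an equal
# amount, but it becomes usable revenue and the road actually flows.
toll = MINUTES_LOST_PER_DAY * VALUE_OF_TIME
print(f"equivalent toll: ${toll:.2f}/trip -> "
      f"${daily_time_cost:,.0f}/day available for healthcare or rebates")
```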
link |
01:16:27.820
So there are a lot of things like that
link |
01:16:29.500
that economists, and I'm not,
link |
01:16:32.520
I'm not like doing anything radical here.
link |
01:16:33.940
Most, you know, good economists would,
link |
01:16:36.680
probably agree with me point by point on these things.
link |
01:16:39.440
And we could do those things
link |
01:16:41.000
and our whole economy would become much more efficient.
link |
01:16:43.760
It'd become fairer. Invest in R&D and research,
link |
01:16:47.000
which is as close to a free lunch as we have.
link |
01:16:50.060
My erstwhile MIT colleague, Bob Solow,
link |
01:16:53.160
got the Nobel Prize, not yesterday, but 30 years ago,
link |
01:16:57.360
for describing that most improvements
link |
01:17:00.560
in living standards come from tech progress.
link |
01:17:02.880
And Paul Romer later got a Nobel Prize
link |
01:17:04.560
for noting that investments in R&D and human capital
link |
01:17:08.040
can speed the rate of tech progress.
link |
01:17:11.040
So if we do that, then we'll be healthier and wealthier.
link |
01:17:14.680
Yeah, from an economics perspective,
link |
01:17:16.200
I remember taking undergrad econ,
link |
01:17:18.440
you mentioned Econ 101.
link |
01:17:20.380
It seemed from all the plots I saw
link |
01:17:23.660
that R&D is about as close to a free lunch as we have;
link |
01:17:29.040
it seemed obvious that we should do more research.
link |
01:17:32.340
It is.
link |
01:17:33.180
Like what, what, like, there's no.
link |
01:17:36.620
Well, we should do basic research.
link |
01:17:38.000
I mean, so let me just be clear.
link |
01:17:39.440
It'd be great if everybody did more research
link |
01:17:41.420
and I would make this distinction
link |
01:17:42.260
between applied development and basic research.
link |
01:17:46.080
So applied development, like, you know,
link |
01:17:48.120
how do we get this self driving car, you know,
link |
01:17:52.120
feature to work better in the Tesla?
link |
01:17:53.960
That's great for private companies
link |
01:17:55.240
because they can capture the value from that.
link |
01:17:57.080
If they make a better self driving car system,
link |
01:17:59.700
they can sell cars that are more valuable
link |
01:18:02.240
and then make money.
link |
01:18:03.080
So there's an incentive, so there's not a big problem there,
link |
01:18:05.720
and smart companies, Amazon, Tesla,
link |
01:18:08.200
and others are investing in it.
link |
01:18:09.440
The problem is with basic research,
link |
01:18:11.260
like coming up with core basic ideas,
link |
01:18:14.420
whether it's in nuclear fusion
link |
01:18:16.120
or artificial intelligence or biotech.
link |
01:18:19.000
There, if someone invents something,
link |
01:18:21.640
it's very hard for them to capture the benefits from it.
link |
01:18:23.920
It's shared by everybody, which is great in a way,
link |
01:18:26.740
but it means that they're not gonna have the incentives
link |
01:18:28.640
to put as much effort into it.
link |
01:18:30.680
It's a classic public good.
link |
01:18:32.960
There you need the government to be involved in it.
link |
01:18:35.120
And the US government used to be investing much more in R&D,
link |
01:18:39.360
but we have slashed that part of the government
link |
01:18:42.940
really foolishly and we're all poorer,
link |
01:18:46.900
significantly poorer as a result.
link |
01:18:48.440
Growth rates are down.
link |
01:18:50.000
We're not having the kind of scientific progress
link |
01:18:51.680
we used to have.
link |
01:18:53.260
It's been sort of a short-term eating of the seed corn,
link |
01:18:57.800
whatever metaphor you wanna use
link |
01:19:00.120
where people grab some money, put it in their pockets today,
link |
01:19:03.320
but five, 10, 20 years later,
link |
01:19:07.120
they're a lot poorer than they otherwise would have been.
link |
01:19:10.140
So we're living through a pandemic right now,
link |
01:19:12.320
globally and in the United States.
link |
01:19:16.580
From an economics perspective,
link |
01:19:18.840
how do you think this pandemic will change the world?
link |
01:19:23.040
It's been remarkable.
link |
01:19:24.640
And it's horrible how many people have suffered,
link |
01:19:27.760
the amount of death, the economic destruction.
link |
01:19:31.240
It's also striking just the amount of change in work
link |
01:19:34.300
that I've seen.
link |
01:19:35.840
In the last 20 weeks, I've seen more change
link |
01:19:38.440
than there was in the previous 20 years.
link |
01:19:41.200
There's been nothing like it
link |
01:19:42.400
since probably the World War II mobilization
link |
01:19:44.700
in terms of reorganizing our economy.
link |
01:19:47.040
The most obvious one is the shift to remote work.
link |
01:19:50.200
I, like many other people, stopped going into the office
link |
01:19:54.280
and teaching my students in person.
link |
01:19:56.160
I did a study on this with a bunch of colleagues
link |
01:19:57.760
at MIT and elsewhere.
link |
01:19:59.180
And what we found was that before the pandemic,
link |
01:20:02.440
in the beginning of 2020, about one in six,
link |
01:20:05.400
a little over 15% of Americans were working remotely.
link |
01:20:09.840
When the pandemic hit, that grew steadily and hit 50%,
link |
01:20:13.560
roughly half of Americans working at home.
link |
01:20:16.080
So a complete transformation.
link |
01:20:17.840
And of course, it wasn't even,
link |
01:20:19.160
it wasn't like everybody did it.
link |
01:20:20.520
If you're an information worker, professional,
link |
01:20:22.760
if you work mainly with data,
link |
01:20:24.400
then you're much more likely to work at home.
link |
01:20:26.880
If you're a manufacturing worker,
link |
01:20:28.800
working with other people or physical things,
link |
01:20:32.320
then it wasn't so easy to work at home.
link |
01:20:34.520
And instead, those people were much more likely
link |
01:20:36.480
to become laid off or unemployed.
link |
01:20:39.280
So it's been something that's had very disparate effects
link |
01:20:41.840
on different parts of the workforce.
link |
01:20:44.520
Do you think it's gonna be sticky in a sense
link |
01:20:46.720
that after a vaccine comes out and the economy reopens,
link |
01:20:51.060
do you think remote work will continue?
link |
01:20:55.180
That's a great question.
link |
01:20:57.080
My hypothesis is yes, a lot of it will.
link |
01:20:59.360
Of course, some of it will go back,
link |
01:21:00.800
but a surprising amount of it will stay.
link |
01:21:03.480
I personally, for instance, I moved my seminars,
link |
01:21:06.620
my academic seminars to Zoom,
link |
01:21:08.840
and I was surprised how well it worked.
link |
01:21:10.800
So it works?
link |
01:21:11.640
Yeah, I mean, obviously we were able to reach
link |
01:21:13.600
a much broader audience.
link |
01:21:14.760
So we have people tuning in from Europe
link |
01:21:16.600
and other countries,
link |
01:21:18.520
just all over the United States for that matter.
link |
01:21:20.320
I also actually found that it is,
link |
01:21:21.760
in many ways, more egalitarian.
link |
01:21:23.520
We use the chat feature and other tools,
link |
01:21:25.920
and grad students and others who might've been
link |
01:21:27.600
a little shy about speaking up,
link |
01:21:29.400
we now kind of have more of an ability for lots of voices to be heard.
link |
01:21:32.680
And they're answering each other's questions,
link |
01:21:34.360
so you kind of get parallel conversations.
link |
01:21:35.960
Like if someone had some question about some of the data
link |
01:21:39.040
or a reference or whatever,
link |
01:21:40.660
then someone else in the chat would answer it.
link |
01:21:42.480
And the whole thing just became like a higher bandwidth,
link |
01:21:44.480
higher quality thing.
link |
01:21:46.600
So I thought that was kind of interesting.
link |
01:21:48.440
I think a lot of people are discovering that these tools
link |
01:21:51.280
that, thanks to technologists, have been developed
link |
01:21:54.480
over the past decade,
link |
01:21:56.440
are a lot more powerful than we thought.
link |
01:21:57.920
I mean, for all the terrible things we've seen with COVID
link |
01:22:00.120
and the real failure of many of our institutions
link |
01:22:03.400
that I thought would work better,
link |
01:22:04.960
one area that's been a bright spot is our technologies.
link |
01:22:09.420
Bandwidth has held up pretty well,
link |
01:22:11.840
and all of our email and other tools
link |
01:22:14.200
have just scaled up kind of gracefully.
link |
01:22:18.000
So that's been a plus.
link |
01:22:20.280
Economists call this question
link |
01:22:21.680
of whether it'll go back hysteresis.
link |
01:22:23.920
The idea is like when you boil an egg:
link |
01:22:25.880
after it gets cold again, it stays hard.
link |
01:22:29.020
And I think that we're gonna have a fair amount
link |
01:22:30.860
of hysteresis in the economy.
link |
01:22:32.160
We're gonna move to this new,
link |
01:22:33.440
we have moved to a new remote work system,
link |
01:22:35.520
and it's not gonna snap all the way back
link |
01:22:37.260
to where it was before.
link |
01:22:38.720
One of the things that worries me is that the people
link |
01:22:44.160
with lots of followers on Twitter, people with voices
link |
01:22:51.380
that can be magnified by reporters
link |
01:22:56.380
and all that kind of stuff are the people
link |
01:22:57.900
that fall into this category
link |
01:22:59.240
that we were referring to just now
link |
01:23:01.600
where they can still function
link |
01:23:03.000
and be successful with remote work.
link |
01:23:06.240
And then there is a kind of quiet suffering
link |
01:23:11.240
of what feels like millions of people
link |
01:23:14.800
whose jobs are disturbed profoundly by this pandemic,
link |
01:23:21.200
but they don't have many followers on Twitter.
link |
01:23:26.320
What do we, and again, I apologize,
link |
01:23:31.840
but I've been reading The Rise and Fall of the Third Reich
link |
01:23:35.840
and there's a connection to the Depression
link |
01:23:38.080
on the American side.
link |
01:23:39.580
There's a deep, complicated connection
link |
01:23:42.320
to how suffering can turn into forces
link |
01:23:46.400
that potentially change the world in destructive ways.
link |
01:23:51.960
So, like, something I worry about is,
link |
01:23:53.840
what is this suffering going to materialize itself into
link |
01:23:56.600
in five, 10 years?
link |
01:23:58.080
Is that something you worry about, think about?
link |
01:24:01.020
It's like the center of what I worry about.
link |
01:24:03.320
And let me break it down to two parts.
link |
01:24:05.400
There's a moral and ethical aspect to it.
link |
01:24:07.280
We need to relieve this suffering.
link |
01:24:09.340
I mean, I'm sure the values of most Americans,
link |
01:24:13.280
or most people on the planet for that matter,
link |
01:24:15.000
are that we like to see shared prosperity.
link |
01:24:16.620
And we would like to see people not falling behind
link |
01:24:20.220
and they have fallen behind, not just due to COVID,
link |
01:24:23.080
but in the previous couple of decades,
link |
01:24:25.760
median income has barely moved,
link |
01:24:27.920
depending on how you measure it.
link |
01:24:29.900
And the incomes of the top 1% have skyrocketed.
link |
01:24:33.360
And part of that is due to the ways technology has been used.
link |
01:24:36.460
Part of it has been due to, frankly, our political system,
link |
01:24:38.840
which has continually shifted more wealth to those people
link |
01:24:43.680
who have the most powerful interests.
link |
01:24:45.120
So there's just, I think, a moral imperative
link |
01:24:48.720
to do a better job.
link |
01:24:49.800
And ultimately, we're all gonna be wealthier
link |
01:24:51.900
if more people can contribute,
link |
01:24:53.320
more people have the wherewithal.
link |
01:24:55.040
But the second thing is that there's a real political risk.
link |
01:24:58.640
I'm not a political scientist,
link |
01:24:59.960
but you don't have to be one, I think,
link |
01:25:02.560
to see how a lot of people are really upset
link |
01:25:05.660
that they're getting a raw deal
link |
01:25:07.380
and they wanted to smash the system in different ways,
link |
01:25:13.680
in 2016 and 2018.
link |
01:25:15.960
And now I think there are a lot of people
link |
01:25:18.280
who are looking at the political system
link |
01:25:19.600
and they feel like it's not working for them
link |
01:25:21.120
and they just wanna do something radical.
link |
01:25:24.720
Unfortunately, demagogues have harnessed that
link |
01:25:28.140
in a way that is pretty destructive to the country.
link |
01:25:33.140
And an analogy I see is what happened with trade.
link |
01:25:37.240
Almost every economist thinks that free trade
link |
01:25:39.440
is a good thing, that when two people voluntarily exchange
link |
01:25:42.440
almost by definition, they're both better off
link |
01:25:44.940
if it's voluntary.
link |
01:25:47.320
And so generally, trade is a good thing.
link |
01:25:49.800
But they also recognize that trade can lead
link |
01:25:52.480
to uneven effects, that there can be winners and losers
link |
01:25:56.260
and the losers can be people who didn't have the skills
link |
01:25:59.280
to compete with somebody else or didn't have other assets.
link |
01:26:02.880
And so trade can shift prices
link |
01:26:04.920
in ways that are adverse to some people.
link |
01:26:08.460
So there's a formula that economists have,
link |
01:26:11.340
which is that you have free trade,
link |
01:26:13.440
but then you compensate the people who are hurt
link |
01:26:15.920
and free trade makes the pie bigger.
link |
01:26:18.400
And since the pie is bigger,
link |
01:26:19.460
it's possible for everyone to be better off.
link |
01:26:21.920
You can make the winners better off,
link |
01:26:23.200
but you can also compensate those who don't win.
link |
01:26:25.440
And so they end up being better off as well.
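A minimal worked example, with entirely hypothetical incomes, of the compensation logic just described:

```python
# Hypothetical illustration: free trade grows the total pie unevenly,
# and a transfer from winners to losers can leave both groups better off.

before = {"winners": 100.0, "losers": 100.0}   # incomes without the policy
after = {"winners": 160.0, "losers": 80.0}     # free trade: +60 / -20
assert sum(after.values()) > sum(before.values())  # the pie is bigger

transfer = 30.0  # shift part of the winners' gain to the losers
compensated = {
    "winners": after["winners"] - transfer,  # 130, still above 100
    "losers": after["losers"] + transfer,    # 110, now above 100
}
for group in before:
    print(f"{group}: {before[group]:.0f} -> {compensated[group]:.0f}")
```

Without the transfer, the losers end up at 80, worse off than before, which is the broken promise described next.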
link |
01:26:28.460
What happened was that we didn't fulfill that promise.
link |
01:26:33.160
We did have some more increased free trade
link |
01:26:36.040
in the 80s and 90s, but we didn't compensate the people
link |
01:26:39.480
who were hurt.
link |
01:26:40.640
And so they felt like the people in power
link |
01:26:43.800
reneged on the bargain, and I think they did.
link |
01:26:45.900
And so then there's a backlash against trade.
link |
01:26:48.760
And now both political parties,
link |
01:26:50.840
but especially Trump and company,
link |
01:26:53.640
have really pushed back against free trade.
link |
01:26:58.200
Ultimately, that's bad for the country.
link |
01:27:00.680
Ultimately, that's bad for living standards.
link |
01:27:02.720
But in a way I can understand
link |
01:27:04.400
that people felt they were betrayed.
link |
01:27:07.080
Technology has a lot of similar characteristics.
link |
01:27:10.680
Technology can make us all better off.
link |
01:27:14.920
It makes the pie bigger.
link |
01:27:16.120
It creates wealth and health, but it can also be uneven.
link |
01:27:18.920
Not everyone automatically benefits.
link |
01:27:21.280
It's possible for some people,
link |
01:27:22.880
even a majority of people to get left behind
link |
01:27:25.080
while a small group benefits.
link |
01:27:28.200
What most economists would say is,
link |
01:27:29.560
well, let's make the pie bigger,
link |
01:27:30.880
but let's make sure we adjust the system
link |
01:27:33.000
so we compensate the people who are hurt.
link |
01:27:35.200
And since the pie is bigger,
link |
01:27:36.920
we can make the rich richer,
link |
01:27:38.000
we can make the middle class richer,
link |
01:27:39.200
we can make the poor richer.
link |
01:27:40.980
Mathematically, everyone could be better off.
link |
01:27:43.640
But again, we're not doing that.
link |
01:27:45.400
And again, people are saying this isn't working for us.
link |
01:27:48.940
And again, instead of fixing the distribution,
link |
01:27:52.540
a lot of people are beginning to say,
link |
01:27:54.280
hey, technology sucks, we've got to stop it.
link |
01:27:57.280
Let's throw rocks at the Google bus.
link |
01:27:59.040
Let's blow it up.
link |
01:27:59.980
Let's blow it up.
link |
01:28:01.240
And there were the Luddites almost exactly 200 years ago
link |
01:28:04.760
who smashed the looms and the spinning machines
link |
01:28:08.040
because they felt like those machines weren't helping them.
link |
01:28:11.320
We have a real imperative,
link |
01:28:12.720
not just to do the morally right thing,
link |
01:28:14.700
but to do the thing that is gonna save the country,
link |
01:28:17.520
which is make sure that we create
link |
01:28:19.440
not just prosperity, but shared prosperity.
link |
01:28:22.680
So you've been at MIT for over 30 years, I think.
link |
01:28:27.600
Don't tell anyone how old I am.
link |
01:28:28.440
Yeah, no, that's true, that's true.
link |
01:28:30.280
And you've now moved to Stanford.
link |
01:28:34.000
I'm gonna try not to say anything
link |
01:28:37.240
about how great MIT is.
link |
01:28:39.760
What's that move been like?
link |
01:28:41.520
It's East Coast to West Coast?
link |
01:28:44.960
Well, MIT is great.
link |
01:28:46.160
MIT has been very good to me.
link |
01:28:48.080
It continues to be very good to me.
link |
01:28:49.560
It's an amazing place.
link |
01:28:51.440
I continue to have so many amazing friends
link |
01:28:53.200
and colleagues there.
link |
01:28:54.600
I'm very fortunate to have been able
link |
01:28:56.120
to spend a lot of time at MIT.
link |
01:28:58.480
Stanford's also amazing.
link |
01:29:00.200
And part of what attracted me out here
link |
01:29:01.980
was not just the weather, but also Silicon Valley,
link |
01:29:04.960
let's face it, is really more of the epicenter
link |
01:29:07.360
of the technological revolution.
link |
01:29:09.000
And I wanna be close to the people
link |
01:29:10.400
who are inventing AI here and elsewhere.
link |
01:29:12.320
A lot of it is being invented at MIT, for that matter,
link |
01:29:14.920
and in Europe and China and elsewhere.
link |
01:29:18.940
But being a little closer to some of the key technologists
link |
01:29:23.800
was something that was important to me.
link |
01:29:25.920
And it may be shallow,
link |
01:29:28.600
but I also do enjoy the good weather.
link |
01:29:30.180
And I felt a little ripped off
link |
01:29:33.120
when I came here a couple of months ago.
link |
01:29:35.040
And immediately there are the fires
link |
01:29:36.640
and my eyes were burning, the sky was orange
link |
01:29:39.840
and there's the heat waves.
link |
01:29:41.320
And so it wasn't exactly what I've been promised,
link |
01:29:44.460
but fingers crossed it'll get back to better.
link |
01:29:47.960
But maybe on a brief aside,
link |
01:29:50.720
there's been some criticism of academia
link |
01:29:52.720
and universities and different avenues.
link |
01:29:55.760
And I, as a person who's gotten to enjoy universities
link |
01:30:00.760
as the pure playground of ideas that they can be,
link |
01:30:06.380
always kind of try to find the words
link |
01:30:08.840
to tell people that these are magical places.
link |
01:30:13.160
Is there something that you can speak to
link |
01:30:17.000
that is beautiful or powerful about universities?
link |
01:30:22.440
Well, sure.
link |
01:30:23.280
I mean, first off, I mean,
link |
01:30:24.500
economists have this concept called revealed preference.
link |
01:30:26.660
You can ask people what they say
link |
01:30:28.300
or you can watch what they do.
link |
01:30:29.940
And so obviously, by revealed preference, I love academia.
link |
01:30:33.960
I could be doing lots of other things,
link |
01:30:35.540
but it's something I enjoy a lot.
link |
01:30:37.600
And I think the word magical is exactly right.
link |
01:30:39.640
At least it is for me.
link |
01:30:41.480
I do what I love, you know,
link |
01:30:43.120
hopefully my Dean won't be listening,
link |
01:30:44.320
but I would do this for free.
link |
01:30:45.640
You know, it's just what I like to do.
link |
01:30:49.060
I like to do research.
link |
01:30:50.160
I love to have conversations like this with you
link |
01:30:51.840
and with my students, with my fellow colleagues.
link |
01:30:53.740
I love being around the smartest people I can find
link |
01:30:55.760
and learning something from them
link |
01:30:57.220
and having them challenge me.
link |
01:30:58.640
And that just gives me joy.
link |
01:31:02.480
And every day I find something new and exciting to work on.
link |
01:31:05.500
And a university environment is really filled
link |
01:31:08.040
with other people who feel that way.
link |
01:31:09.820
And so I feel very fortunate to be part of it.
link |
01:31:12.960
And I'm lucky that I'm in a society
link |
01:31:14.840
where I can actually get paid for it
link |
01:31:16.200
and put food on the table
link |
01:31:17.240
while doing the stuff that I really love.
link |
01:31:19.260
And I hope someday everybody can have jobs
link |
01:31:21.560
that are like that.
link |
01:31:22.800
And I appreciate that it's not necessarily easy
link |
01:31:25.340
for everybody to have a job that they both love
link |
01:31:27.400
and also they get paid for.
link |
01:31:30.660
So there are things that don't go well in academia,
link |
01:31:34.000
but by and large, I think it's a kind of, you know,
link |
01:31:36.000
kinder, gentler version of a lot of the world.
link |
01:31:37.960
You know, we sort of cut each other a little slack
link |
01:31:41.280
on things like, you know, on just a lot of things.
link |
01:31:45.800
You know, of course there's harsh debates
link |
01:31:48.320
and discussions about things
link |
01:31:49.900
and some petty politics here and there.
link |
01:31:52.060
I personally, I try to stay away
link |
01:31:53.520
from most of that sort of politics.
link |
01:31:55.600
It's not my thing.
link |
01:31:56.560
And so it doesn't affect me most of the time,
link |
01:31:58.320
sometimes a little bit, maybe.
link |
01:32:00.480
But, you know, being able to pull together something,
link |
01:32:03.200
we have the Digital Economy Lab.
link |
01:32:04.860
We've got all these brilliant grad students
link |
01:32:07.480
and undergraduates and postdocs
link |
01:32:09.280
that are just doing stuff that I learn from.
link |
01:32:12.320
And every one of them has some aspect
link |
01:32:14.760
of what they're doing that
link |
01:32:16.640
I couldn't even understand.
link |
01:32:17.600
It's like way, way more brilliant.
link |
01:32:19.340
And to me, actually, I really enjoy that,
link |
01:32:23.040
being in a room with lots of other smart people.
link |
01:32:25.120
And Stanford has made it very easy to attract,
link |
01:32:29.440
you know, those people.
link |
01:32:31.260
I just, you know, say I'm gonna do a seminar, whatever,
link |
01:32:33.680
and the people come, they come and wanna work with me.
link |
01:32:36.820
We get funding, we get data sets,
link |
01:32:38.880
and it's come together real nicely.
link |
01:32:41.440
And the rest is just fun.
link |
01:32:44.220
It's fun, yeah.
link |
01:32:45.840
And we feel like we're working on important problems,
link |
01:32:47.480
you know, and we're doing things that, you know,
link |
01:32:50.320
I think are first order in terms of what's important
link |
01:32:53.680
in the world, and that's very satisfying to me.
link |
01:32:56.320
Maybe a bit of a fun question.
link |
01:32:58.080
What three books, technical, fiction, philosophical,
link |
01:33:02.180
have you enjoyed, that had a big impact on your life?
link |
01:33:07.380
Well, I guess I go back to like my teen years,
link |
01:33:09.980
and, you know, I read Siddhartha,
link |
01:33:12.300
which is a philosophical book,
link |
01:33:13.420
and kind of helps keep me centered.
link |
01:33:15.260
By Herman Hesse.
link |
01:33:16.180
Yeah, by Herman Hesse, exactly.
link |
01:33:17.340
Don't get too wrapped up in material things
link |
01:33:20.380
or other things, and just sort of, you know,
link |
01:33:21.980
try to find peace on things.
link |
01:33:24.780
A book that actually influenced me a lot
link |
01:33:26.340
in terms of my career was called
link |
01:33:27.620
The Worldly Philosophers by Robert Heilbroner.
link |
01:33:30.460
It's actually about economists.
link |
01:33:31.660
It goes through a series of different economists,
link |
01:33:33.500
it's written in a very lively form,
link |
01:33:34.900
and it probably sounds boring,
link |
01:33:36.220
but it did describe whether it's Adam Smith
link |
01:33:38.820
or Karl Marx or John Maynard Keynes,
link |
01:33:40.820
and, for each of them, sort of what their key insights were,
link |
01:33:43.340
but also kind of their personalities,
link |
01:33:45.340
and I think that's one of the reasons
link |
01:33:46.520
I became an economist was just understanding
link |
01:33:50.600
how they grapple with the big questions of the world.
link |
01:33:53.100
So would you recommend it as a good whirlwind overview
link |
01:33:56.340
of the history of economics?
link |
01:33:57.540
Yeah, yeah, I think that's exactly right.
link |
01:33:59.060
It kind of takes you through the different things,
link |
01:34:00.940
and so you can understand how they reached their conclusions,
link |
01:34:04.020
thinking through some of the strengths and weaknesses.
link |
01:34:06.380
I mean, it probably is a little out of date now.
link |
01:34:07.900
It needs to be updated a bit,
link |
01:34:08.980
but you could at least look through
link |
01:34:10.380
the first couple hundred years of economics,
link |
01:34:12.940
which is not a bad place to start.
link |
01:34:15.020
More recently, I mean, a book I really enjoyed
link |
01:34:17.580
is by my friend and colleague, Max Tegmark,
link |
01:34:20.260
called Life 3.0.
link |
01:34:21.340
You should have him on your podcast if you haven't already.
link |
01:34:23.260
He was episode number one.
link |
01:34:25.460
Oh my God.
link |
01:34:26.500
And he's back, he'll be back, he'll be back soon.
link |
01:34:30.220
Yeah, no, he's terrific.
link |
01:34:31.460
I love the way his brain works,
link |
01:34:33.460
and he makes you think about profound things.
link |
01:34:35.780
He's got such a joyful approach to life,
link |
01:34:38.540
and so that's been a great book,
link |
01:34:41.060
and I learned a lot from it, I think everybody would,
link |
01:34:43.180
but he explains it in a way, even though he's so brilliant,
link |
01:34:45.580
that everyone can understand, that I can understand.
link |
01:34:50.020
That's three, but let me mention maybe one or two others.
link |
01:34:52.920
I mean, I recently read More From Less
link |
01:34:55.340
by my sometimes coauthor, Andrew McAfee.
link |
01:34:58.620
It made me optimistic about how we can continue
link |
01:35:01.940
to have rising living standards
link |
01:35:04.580
while living more lightly on the planet.
link |
01:35:06.140
In fact, because of higher living standards,
link |
01:35:07.860
because of technology,
link |
01:35:09.140
because of digitization that I mentioned,
link |
01:35:11.500
we don't have to have as big an impact on the planet,
link |
01:35:13.580
and that's a great story to tell,
link |
01:35:15.740
and he documents it very carefully.
link |
01:35:19.740
You know, a personal kind of self help book
link |
01:35:21.380
that I found kind of useful is Atomic Habits.
link |
01:35:24.140
I think it's, what's his name, James Clear.
link |
01:35:26.180
Yeah, James Clear.
link |
01:35:27.540
He's just, yeah, it's a good name,
link |
01:35:29.100
because he writes very clearly,
link |
01:35:30.460
and you know, most of the sentences I read in that book,
link |
01:35:33.620
I was like, yeah, I know that,
link |
01:35:34.500
but it just really helps to have somebody like remind you
link |
01:35:37.220
and tell you and kind of just reinforce it, and it's helpful.
link |
01:35:40.860
So build habits in your life that you hope to have,
link |
01:35:45.020
that have a positive impact,
link |
01:35:46.140
and they don't have to be big things.
link |
01:35:48.100
It could be just tiny little things.
link |
01:35:49.220
Exactly, I mean, the word atomic,
link |
01:35:50.720
it's a little bit of a pun, I think he says.
link |
01:35:52.540
You know, one, atomic means they're really small.
link |
01:35:54.020
You take these little things, but also like atomic power,
link |
01:35:56.860
they can have, like, you know, a big impact.
link |
01:35:59.460
That's funny, yeah.
link |
01:36:01.460
The biggest ridiculous question,
link |
01:36:04.180
especially to ask an economist, but also a human being,
link |
01:36:06.860
what's the meaning of life?
link |
01:36:08.260
I hope you've gotten the answer to that from somebody else.
link |
01:36:11.460
I think we're all still working on that one, but what is it?
link |
01:36:14.740
You know, I actually learned a lot from my son, Luke,
link |
01:36:18.120
and he's 19 now, but he's always loved philosophy,
link |
01:36:22.100
and he reads way more sophisticated philosophy than I do.
link |
01:36:24.900
I went and took him to Oxford,
link |
01:36:25.860
and he spent the whole time like pulling
link |
01:36:27.060
all these obscure books down and reading them.
link |
01:36:29.020
And a couple of years ago, we had this argument,
link |
01:36:32.600
and he was trying to convince me that hedonism
link |
01:36:34.500
was the ultimate, you know, meaning of life,
link |
01:36:37.480
just pleasure seeking, and...
link |
01:36:40.380
Well, how old was he at the time?
link |
01:36:41.580
17, so...
link |
01:36:42.420
Okay.
link |
01:36:43.260
But he made a really good like intellectual argument
link |
01:36:46.700
for it too, and you know,
link |
01:36:47.540
but you know, it just didn't strike me as right.
link |
01:36:50.180
And I think that, you know, while I am kind of a utilitarian,
link |
01:36:54.540
like, you know, I do think we should do the greatest
link |
01:36:55.940
good for the greatest number, that's just too shallow.
link |
01:36:58.740
And I think I've convinced myself that real happiness
link |
01:37:02.820
doesn't come from seeking pleasure.
link |
01:37:04.260
It's a little ironic.
link |
01:37:05.700
Like if you really focus on being happy,
link |
01:37:07.700
I think it doesn't work.
link |
01:37:09.740
You gotta like be doing something bigger.
link |
01:37:12.420
I think the analogy I sometimes use is, you know,
link |
01:37:14.900
when you look at a dim star in the sky,
link |
01:37:17.580
if you look right at it, it kind of disappears,
link |
01:37:19.460
but you have to look a little to the side,
link |
01:37:20.740
and then the parts of your retina
link |
01:37:23.180
that are better at absorbing light,
link |
01:37:24.940
you know, can pick it up better.
link |
01:37:26.340
It's the same thing with happiness.
link |
01:37:27.420
I think you need to sort of find some other goal,
link |
01:37:32.500
some meaning in life,
link |
01:37:33.980
and that ultimately makes you happier
link |
01:37:36.180
than if you go squarely at just pleasure.
link |
01:37:39.060
And so for me, you know, the kind of research I do
link |
01:37:42.260
that I think is trying to change the world,
link |
01:37:44.220
make the world a better place,
link |
01:37:46.140
and I'm not like an evolutionary psychologist,
link |
01:37:47.980
but my guess is that our brains are wired,
link |
01:37:50.860
not just for pleasure, but we're social animals,
link |
01:37:53.860
and we're wired to like help others.
link |
01:37:57.220
And ultimately, you know,
link |
01:37:58.860
that's something that's really deeply rooted in our psyche.
link |
01:38:02.060
And if we do help others, if we do,
link |
01:38:04.500
or at least feel like we're helping others,
link |
01:38:06.660
you know, our reward systems kick in,
link |
01:38:08.220
and we end up being more deeply satisfied
link |
01:38:10.460
than if we just do something selfish and shallow.
link |
01:38:13.620
Beautifully put.
link |
01:38:14.460
I don't think there's a better way to end it, Erik.
link |
01:38:16.980
You were one of the people when I first showed up at MIT,
link |
01:38:20.500
that made me proud to be at MIT.
link |
01:38:22.420
So it's so sad that you're now at Stanford,
link |
01:38:24.540
but I'm sure you'll do wonderful things at Stanford as well.
link |
01:38:28.980
I can't wait till future books,
link |
01:38:30.900
and people should definitely read your other books.
link |
01:38:32.260
Well, thank you so much.
link |
01:38:33.180
And I think we're all part of the invisible college,
link |
01:38:35.580
as we call it.
link |
01:38:36.420
You know, we're all part of this intellectual
link |
01:38:38.700
and human community where we all can learn from each other.
link |
01:38:41.660
It doesn't really matter physically
link |
01:38:43.100
where we are so much anymore.
link |
01:38:44.860
Beautiful.
link |
01:38:45.700
Thanks for talking today.
link |
01:38:46.540
My pleasure.
link |
01:38:48.060
Thanks for listening to this conversation
link |
01:38:49.460
with Erik Brynjolfsson.
link |
01:38:50.860
And thank you to our sponsors.
link |
01:38:52.620
Vincero Watches, the maker of classy,
link |
01:38:55.060
well performing watches.
link |
01:38:56.860
Four Sigmatic, the maker of delicious mushroom coffee.
link |
01:39:00.060
ExpressVPN, the VPN I've used for many years
link |
01:39:03.140
to protect my privacy on the internet.
link |
01:39:05.260
And CashApp, the app I use to send money to friends.
link |
01:39:09.140
Please check out these sponsors in the description
link |
01:39:11.180
to get a discount and to support this podcast.
link |
01:39:14.900
If you enjoy this thing, subscribe on YouTube.
link |
01:39:17.280
Review it with five stars on Apple Podcasts,
link |
01:39:19.500
follow on Spotify, support on Patreon,
link |
01:39:22.100
or connect with me on Twitter at Lex Fridman.
link |
01:39:25.380
And now, let me leave you with some words
link |
01:39:27.700
from Albert Einstein.
link |
01:39:29.980
It has become appallingly obvious
link |
01:39:32.860
that our technology has exceeded our humanity.
link |
01:39:36.600
Thank you for listening and hope to see you next time.