
Erik Brynjolfsson: Economics of AI, Social Networks, and Technology | Lex Fridman Podcast #141



link |
00:00:00.000
The following is a conversation with Erik Brynjolfsson.
link |
00:00:03.240
He's an economics professor at Stanford
link |
00:00:05.840
and the director of Stanford's Digital Economy Lab.
link |
00:00:09.400
Previously, he was a long, long time professor at MIT
link |
00:00:13.480
where he did groundbreaking work
link |
00:00:15.200
on the economics of information.
link |
00:00:17.760
He's the author of many books,
link |
00:00:19.800
including The Second Machine Age
link |
00:00:22.000
and Machine Platform Crowd, coauthored with Andrew McAfee.
link |
00:00:27.560
Quick mention of each sponsor,
link |
00:00:29.160
followed by some thoughts related to the episode.
link |
00:00:31.680
Ventura Watches, the maker of classy,
link |
00:00:34.200
well performing watches.
link |
00:00:36.080
Four Sigmatic, the maker of delicious mushroom coffee.
link |
00:00:39.840
ExpressVPN, the VPN I've used for many years
link |
00:00:42.920
to protect my privacy on the internet.
link |
00:00:45.040
And Cash App, the app I use to send money to friends.
link |
00:00:49.040
Please check out these sponsors in the description
link |
00:00:51.000
to get a discount and to support this podcast.
link |
00:00:54.600
As a side note, let me say that the impact
link |
00:00:56.920
of artificial intelligence and automation
link |
00:00:59.240
on our economy and our world
link |
00:01:01.800
is something worth thinking deeply about.
link |
00:01:04.480
Like with many topics that are linked
link |
00:01:06.400
to predicting the future evolution of technology,
link |
00:01:09.160
it is often too easy to fall into one of two camps,
link |
00:01:12.720
the fear mongering camp
link |
00:01:14.800
or the technological utopianism camp.
link |
00:01:18.320
As always, the future will land somewhere in between.
link |
00:01:21.640
I prefer to wear two hats in these discussions
link |
00:01:24.400
and alternate between them often.
link |
00:01:26.560
The hat of a pragmatic engineer
link |
00:01:29.520
and the hat of a futurist.
link |
00:01:31.960
This is probably a good time to mention Andrew Yang,
link |
00:01:35.120
the presidential candidate who has been
link |
00:01:38.080
one of the high profile thinkers on this topic.
link |
00:01:41.160
And I'm sure I will speak with him
link |
00:01:42.840
on this podcast eventually.
link |
00:01:44.760
A conversation with Andrew has been on the table many times.
link |
00:01:48.680
Our schedules just haven't aligned,
link |
00:01:50.600
especially because I have a strongly held preference
link |
00:01:54.440
for long form, two, three, four hours or more, and in person.
link |
00:02:00.160
I work hard to not compromise on this.
link |
00:02:03.080
Trust me, it's not easy.
link |
00:02:04.960
Even more so in the times of COVID,
link |
00:02:07.280
which requires getting tested nonstop,
link |
00:02:09.800
staying isolated and doing a lot of costly
link |
00:02:12.640
and uncomfortable things that minimize risk for the guests.
link |
00:02:15.920
The reason I do this is because to me,
link |
00:02:17.960
something is lost in remote conversation.
link |
00:02:20.920
That something, that magic,
link |
00:02:23.560
I think is worth the effort,
link |
00:02:25.320
even if it ultimately leads to a failed conversation.
link |
00:02:29.560
This is how I approach life,
link |
00:02:31.480
treasuring the possibility of a rare moment of magic.
link |
00:02:36.000
I'm willing to go to the ends of the world
link |
00:02:38.440
for just such a moment.
link |
00:02:40.840
If you enjoy this thing, subscribe on YouTube,
link |
00:02:43.240
review it with five stars on Apple Podcasts,
link |
00:02:45.520
follow on Spotify, support on Patreon,
link |
00:02:48.120
connect with me on Twitter at Lex Fridman.
link |
00:02:51.280
And now, here's my conversation with Erik Brynjolfsson.
link |
00:02:56.280
You posted a quote on Twitter by Albert Bartlett
link |
00:03:00.000
saying that the greatest shortcoming of the human race
link |
00:03:03.440
is our inability to understand the exponential function.
link |
00:03:07.960
Why would you say the exponential growth
link |
00:03:09.920
is important to understand?
link |
00:03:12.320
Yeah, that quote, I remember posting that.
link |
00:03:15.200
It's actually a reprise of something Andy McAfee
link |
00:03:17.400
and I said in The Second Machine Age,
link |
00:03:19.440
but I posted it in early March
link |
00:03:21.440
when COVID was really just beginning to take off
link |
00:03:23.900
and I was really scared.
link |
00:03:25.840
There were actually only a couple dozen cases,
link |
00:03:28.280
maybe less at that time,
link |
00:03:30.000
but they were doubling every two or three days
link |
00:03:32.240
and I could see, oh my God, this is gonna be a catastrophe
link |
00:03:35.480
and it's gonna happen soon,
link |
00:03:37.040
but nobody was taking it very seriously
link |
00:03:38.960
or not a lot of people were taking it very seriously.
link |
00:03:41.000
In fact, I remember I did my last in person conference
link |
00:03:45.240
that week, I was flying back from Las Vegas
link |
00:03:47.880
and I was the only person on the plane wearing a mask
link |
00:03:50.840
and the flight attendant came over to me,
link |
00:03:52.320
she looked very concerned and she kind of put her hands
link |
00:03:54.080
on my shoulder, she was touching me all over,
link |
00:03:55.320
which I wasn't thrilled about
link |
00:03:56.640
and she goes, do you have some kind of anxiety disorder?
link |
00:03:59.480
Are you okay?
link |
00:04:00.560
And I was like, no, it's because of COVID.
link |
00:04:03.000
This is early March.
link |
00:04:04.120
Early March, but I was worried because I knew I could see,
link |
00:04:09.720
or I suspected, I guess,
link |
00:04:10.880
that that doubling would continue and it did
link |
00:04:13.400
and pretty soon we had thousands of times more cases.
link |
00:04:17.240
Most of the time when I use that quote, I try to,
link |
00:04:19.760
it's motivated by more optimistic things like Moore's law
link |
00:04:22.200
and the wonders of having more computer power,
link |
00:04:25.640
but in either case, it can be very counterintuitive.
link |
00:04:28.680
I mean, if you walk for 10 minutes,
link |
00:04:31.640
you get about 10 times as far away
link |
00:04:33.160
as if you walk for one minute.
link |
00:04:34.920
That's the way our physical world works.
link |
00:04:36.160
That's the way our brains are wired,
link |
00:04:38.240
but if something doubles for 10 times as long,
link |
00:04:41.720
you don't get 10 times as much,
link |
00:04:43.320
you get 1,000 times as much
link |
00:04:45.480
and after 20, it's a million, after 30, it's a billion.
link |
00:04:53.640
And pretty soon after that,
link |
00:04:54.720
it just gets to these numbers that you can barely grasp.
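To make the doubling arithmetic concrete, here's a minimal sketch (my illustration, using the same round numbers from the conversation):

```python
# Walking is linear: 10x the time gives ~10x the distance.
# Doubling is exponential: n doublings multiply the total by 2**n.
for doublings in (10, 20, 30):
    print(f"{doublings} doublings -> {2**doublings:,}x")
# 10 doublings -> 1,024x (about a thousand)
# 20 doublings -> 1,048,576x (about a million)
# 30 doublings -> 1,073,741,824x (about a billion)
```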
link |
00:04:57.840
Our world is becoming more and more exponential,
link |
00:05:00.880
mainly because of digital technologies.
link |
00:05:03.680
So more and more often our intuitions are out of whack
link |
00:05:06.440
and that can be good in the case of things creating wonders,
link |
00:05:11.000
but it can be dangerous in the case of viruses
link |
00:05:13.760
and other things.
link |
00:05:14.680
Do you think it generally applies?
link |
00:05:16.520
Like, is there spaces where it does apply
link |
00:05:18.360
and where it doesn't?
link |
00:05:19.560
How are we supposed to build an intuition
link |
00:05:21.880
about in which aspects of our society
link |
00:05:25.440
does exponential growth apply?
link |
00:05:27.560
Well, you can learn the math,
link |
00:05:29.760
but the truth is our brains,
link |
00:05:31.240
I think, tend to learn more from experiences.
link |
00:05:35.560
So we just start seeing it more and more often.
link |
00:05:37.680
So hanging around Silicon Valley,
link |
00:05:39.600
hanging around AI and computer researchers,
link |
00:05:41.800
I see this kind of exponential growth a lot more frequently.
link |
00:05:45.040
And I'm getting used to it, but I still make mistakes.
link |
00:05:47.000
I still underestimate some of the progress
link |
00:05:48.920
in just talking to someone about GPT-3
link |
00:05:51.160
and how rapidly natural language has improved.
link |
00:05:54.280
But I think that as the world becomes more exponential,
link |
00:05:58.520
we'll all start experiencing it more frequently.
link |
00:06:02.000
The danger is that we may make some mistakes in the meantime
link |
00:06:05.600
using our old kind of caveman intuitions
link |
00:06:07.920
about how the world works.
link |
00:06:09.520
Well, the weird thing is it always kind of looks linear
link |
00:06:12.000
in the moment.
link |
00:06:12.920
Like the, you know, it's hard to feel,
link |
00:06:17.160
it's hard to retrospect and really acknowledge
link |
00:06:22.320
how much has changed in just a couple of years
link |
00:06:24.240
or five years or 10 years with the internet.
link |
00:06:27.440
If we just look at investments of AI
link |
00:06:29.880
or even just social media,
link |
00:06:31.920
all the various technologies
link |
00:06:33.920
that go into the digital umbrella.
link |
00:06:35.840
Yeah.
link |
00:06:36.680
It feels pretty calm and normal and gradual.
link |
00:06:39.680
Well, a lot of stuff, you know,
link |
00:06:40.960
I think there are parts of the world,
link |
00:06:42.200
most of the world is not exponential.
link |
00:06:44.480
You know, the way humans learn,
link |
00:06:47.200
the way organizations change,
link |
00:06:49.440
the way our whole institutions adapt and evolve,
link |
00:06:52.240
those don't improve at exponential paces.
link |
00:06:54.760
And that leads to a mismatch oftentimes
link |
00:06:56.560
between these exponentially improving technologies
link |
00:06:59.160
or let's say changing technologies
link |
00:07:00.640
because some of them are exponentially more dangerous
link |
00:07:03.240
and our intuitions and our human skills
link |
00:07:06.920
and our institutions that just don't change very fast at all.
link |
00:07:11.920
And that mismatch I think is at the root
link |
00:07:13.960
of a lot of the problems in our society,
link |
00:07:15.680
the growing inequality and other dysfunctions
link |
00:07:21.880
in our political and economic systems.
link |
00:07:24.440
So one guy that talks about exponential functions
link |
00:07:28.440
a lot is Elon Musk.
link |
00:07:29.640
He seems to internalize this kind of way
link |
00:07:33.000
of exponential thinking.
link |
00:07:34.800
He calls it first principles thinking,
link |
00:07:37.400
sort of the kind of going to the basics,
link |
00:07:40.600
asking the question like what were the assumptions
link |
00:07:44.280
of the past, how can we throw them out the window?
link |
00:07:47.760
How can we do this 10x much more efficiently
link |
00:07:50.200
and constantly practicing that process?
link |
00:07:52.440
And also using that kind of thinking to, sort of,
link |
00:07:57.440
create deadlines and estimate
link |
00:08:02.960
when you'll be able to deliver on some of these technologies.
link |
00:08:07.200
Now, it often gets him in trouble
link |
00:08:10.240
because he overestimates,
link |
00:08:13.200
like he doesn't meet the initial estimates of the deadlines
link |
00:08:18.040
but he seems to deliver late but deliver.
link |
00:08:23.440
And which is kind of an interesting,
link |
00:08:25.040
like what are your thoughts about this whole thing?
link |
00:08:26.880
I think we can all learn from Elon.
link |
00:08:28.840
I think going to first principles,
link |
00:08:30.120
I talked about two ways of getting more of a grip
link |
00:08:32.840
on the exponential function.
link |
00:08:34.560
And one of them just comes from first principles.
link |
00:08:36.440
If you understand the math of it,
link |
00:08:37.800
you can see what's going to happen.
link |
00:08:39.040
And even if it seems counterintuitive
link |
00:08:41.040
that a couple of dozen of COVID cases
link |
00:08:42.960
could become thousands or tens or hundreds
link |
00:08:45.760
of thousands of them in a month,
link |
00:08:48.080
it makes sense once you just do the math.
link |
00:08:51.200
And I think Elon tries to do that a lot.
link |
00:08:53.920
I think he also benefits from hanging out in Silicon Valley
link |
00:08:57.040
and he's experienced it in a lot of different applications.
link |
00:09:00.640
So, it's not as much of a shock to him anymore
link |
00:09:04.120
but that's something we can all learn from.
link |
00:09:07.200
In my own life, I remember one of my first experiences
link |
00:09:10.440
really seeing it was when I was a grad student
link |
00:09:13.000
and my advisor asked me to plot the growth of computer power
link |
00:09:17.600
in the US economy in different industries.
link |
00:09:20.040
And there are all these exponentially growing curves
link |
00:09:23.040
and I was like, holy shit, look at this.
link |
00:09:24.560
In each industry, it was just taking off.
link |
00:09:26.880
And you didn't have to be a rocket scientist
link |
00:09:29.200
to extend that and say, wow, this means that
link |
00:09:31.880
this was in the late 80s and early 90s
link |
00:09:33.600
that if it goes anything like that,
link |
00:09:35.880
we're gonna have orders of magnitude
link |
00:09:37.320
more computer power than we did at that time.
link |
00:09:39.480
And of course we do.
link |
00:09:41.320
So, when people look at Moore's law,
link |
00:09:45.000
they often talk about it as just,
link |
00:09:46.880
so the exponential function is actually a stack of S curves.
link |
00:09:51.360
So basically you milk, or whatever,
link |
00:09:57.240
take the most advantage of a particular little revolution
link |
00:10:01.240
and then you search for another revolution.
link |
00:10:03.000
And it's basically revolutions stack on top of revolutions.
link |
00:10:06.720
Do you have any intuition about how the heck humans
link |
00:10:08.960
keep finding ways to revolutionize things?
link |
00:10:12.280
Well, first let me just unpack that first point
link |
00:10:14.200
that I talked about exponential curves
link |
00:10:17.160
but no exponential curve continues forever.
link |
00:10:21.520
It's been said that if anything can't go on forever,
link |
00:10:25.000
eventually it will stop.
link |
00:10:26.720
And...
link |
00:10:27.560
That's very profound.
link |
00:10:28.400
It's very profound, but it seems that a lot of people
link |
00:10:31.960
don't appreciate that half of it either.
link |
00:10:34.000
And that's why all exponential functions
link |
00:10:35.840
eventually turn into some kind of S curve
link |
00:10:38.280
or stop in some other way, maybe catastrophically.
link |
00:10:41.280
And that's happened with COVID as well.
link |
00:10:42.840
I mean, it was, it went up and then it sort of,
link |
00:10:44.600
at some point it starts saturating the pool
link |
00:10:47.200
of people to be infected.
link |
00:10:49.080
There's a standard epidemiological model
link |
00:10:51.240
that's based on that.
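The simplest version of that saturation effect is logistic growth; here's a minimal sketch (the standard SIR model adds a recovery compartment, and every parameter below is invented for illustration):

```python
# Logistic growth: exponential at first, then an S-curve as the
# susceptible pool is depleted: dI/dt = r * I * (1 - I / N).
r = 0.3          # growth rate per day (~doubling every 2-3 days early on)
N = 1_000_000    # total susceptible population (illustrative)
I = 25.0         # a couple dozen initial cases
for day in range(101):
    if day % 20 == 0:
        print(f"day {day:3d}: {I:12,.0f} cases")
    I += r * I * (1 - I / N)   # Euler step, dt = 1 day
```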
link |
00:10:53.000
And it's beginning to happen with Moore's law
link |
00:10:55.080
or different generations of computer power.
link |
00:10:56.960
It happens with all exponential curves.
link |
00:10:59.320
The remarkable thing, as you alluded to in
link |
00:11:01.080
the second part of your question, is that we've been able
link |
00:11:03.600
to come up with a new S curve on top of the previous one
link |
00:11:06.880
and do that generation after generation
link |
00:11:09.120
with new materials, new processes
link |
00:11:12.120
and just extend it further and further.
link |
00:11:15.720
I don't think anyone has a really good theory
link |
00:11:17.440
about why we've been so successful in doing that.
link |
00:11:21.440
It's great that we have been
link |
00:11:23.200
and I hope it continues for some time.
link |
00:11:26.440
But it's, you know, one beginning of a theory
link |
00:11:31.520
is that there's huge incentives when other parts
link |
00:11:34.480
of the system are going on that clock speed
link |
00:11:36.920
of doubling every two to three years.
link |
00:11:39.320
If there's one component of it that's not keeping up,
link |
00:11:42.160
then the economic incentives become really large
link |
00:11:44.840
to improve that one part.
link |
00:11:46.240
It becomes a bottleneck and anyone who can do
link |
00:11:49.240
improvements in that part can reap huge returns
link |
00:11:51.680
so that the resources automatically get focused
link |
00:11:54.040
on whatever part of the system isn't keeping up.
link |
00:11:56.600
Do you think some version of the Moore's law will continue?
link |
00:11:59.720
Some version, yes, it is.
link |
00:12:01.400
I mean, one version that has become more important
link |
00:12:04.640
is something I call Koomey's law,
link |
00:12:06.280
which is named after John Koomey,
link |
00:12:08.480
which I should mention was also my college roommate,
link |
00:12:10.280
but he identified the fact that the energy used per computation
link |
00:12:14.360
has been declining by a factor of two every couple of years.
link |
00:12:17.240
And for most of us, that's more important, you know,
link |
00:12:18.960
the new iPhones came out today as we're recording this.
link |
00:12:21.320
I'm not sure when you're gonna make it available.
link |
00:12:23.080
Very soon after this, yeah.
link |
00:12:24.920
And for most of us, you know, having the iPhone
link |
00:12:27.360
be twice as fast, you know, it's nice,
link |
00:12:30.840
but having the battery life be longer,
link |
00:12:33.160
that would be much more valuable.
link |
00:12:35.440
And the fact that a lot of the progress in chips now
link |
00:12:38.200
is reducing energy consumption is probably more important
link |
00:12:42.800
for many applications than just the raw speed.
link |
00:12:46.040
Other dimensions of Moore's law are in AI and machine learning.
link |
00:12:51.560
Those tend to be very parallelizable functions,
link |
00:12:55.160
especially deep neural nets.
link |
00:12:58.440
And so instead of having one chip,
link |
00:13:01.280
you can have multiple chips or you can have a GPU,
link |
00:13:05.000
a graphic processing unit that goes faster
link |
00:13:07.000
and now special chips designed for machine learning
link |
00:13:09.640
like tensor processing units.
link |
00:13:11.240
Each time you switch, there's another 10X
link |
00:13:13.040
or 100X improvement above and beyond Moore's law.
link |
00:13:16.760
So I think that the raw silicon
link |
00:13:18.880
isn't improving as much as it used to,
link |
00:13:20.800
but these other dimensions are becoming more important,
link |
00:13:23.960
and we're seeing progress in them.
link |
00:13:26.320
I don't know if you've seen the work by OpenAI
link |
00:13:28.240
where they show the exponential improvement
link |
00:13:31.960
of the training of neural networks,
link |
00:13:34.400
just literally in the techniques used.
link |
00:13:36.960
So that's almost like the algorithm.
link |
00:13:40.520
It's fascinating to think, like, can that actually continue?
link |
00:13:43.640
Us figuring out more and more tricks
link |
00:13:45.160
on how to train networks faster.
link |
00:13:47.000
Well, the progress has been staggering.
link |
00:13:49.520
If you look at image recognition, as you mentioned,
link |
00:13:51.720
I think it's a function of at least three things
link |
00:13:53.440
that are coming together.
link |
00:13:54.520
One, we just talked about faster chips,
link |
00:13:56.560
not just Moore's law, but GPUs, TPUs and other technologies.
link |
00:14:00.400
The second is just a lot more data.
link |
00:14:02.760
I mean, we are awash in digital data today
link |
00:14:05.600
in a way we weren't 20 years ago.
link |
00:14:08.080
Photography, I'm old enough to remember,
link |
00:14:09.960
used to be chemical and now everything is digital.
link |
00:14:12.800
I took probably 50 digital photos yesterday.
link |
00:14:16.440
I wouldn't have done that if it was chemical.
link |
00:14:17.880
And we have the internet of things
link |
00:14:20.680
and all sorts of other types of data.
link |
00:14:22.920
When we walk around with our phone,
link |
00:14:24.120
it's just broadcasting a huge amount of digital data
link |
00:14:27.240
that can be used as training sets.
link |
00:14:29.240
And then last but not least,
link |
00:14:30.800
as you mentioned, at OpenAI,
link |
00:14:34.280
there have been significant improvements in the techniques.
link |
00:14:37.320
The core idea of deep neural nets
link |
00:14:39.240
has been around for a few decades,
link |
00:14:41.200
but the advances in making it work more efficiently
link |
00:14:44.200
have also improved a couple of orders of magnitude or more.
link |
00:14:48.200
So you multiply together 100 fold improvement
link |
00:14:50.760
in computer power, 100 fold or more improvement in data,
link |
00:14:55.360
100 fold improvement in techniques of software
link |
00:14:59.680
and algorithms, and soon you're getting
link |
00:15:01.480
into a million fold improvements.
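The compounding there is plain multiplication across roughly independent factors; a back-of-the-envelope check using the round figures he cites:

```python
# Independent gains multiply rather than add.
compute_gain = 100    # faster chips: GPUs, TPUs, and so on
data_gain = 100       # far more digital training data
algorithm_gain = 100  # better techniques and software
print(f"{compute_gain * data_gain * algorithm_gain:,}x")  # 1,000,000x
```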
link |
00:15:03.920
Somebody brought up this idea with GPT-3,
link |
00:15:09.920
which is trained in a self supervised way
link |
00:15:11.960
on basically internet data.
link |
00:15:15.000
And that's one of the, I've seen arguments made
link |
00:15:18.920
and they seem to be pretty convincing
link |
00:15:21.120
that the bottleneck there is going to be
link |
00:15:23.440
how much data there is on the internet,
link |
00:15:25.640
which is a fascinating idea that it literally
link |
00:15:29.160
will just run out of human generated data to train on.
link |
00:15:33.280
It's a...
link |
00:15:34.120
Once we get to the point where it's consumed
link |
00:15:35.760
basically all of human knowledge
link |
00:15:37.520
or all digitized human knowledge, yeah.
link |
00:15:39.160
And that would be the bottleneck.
link |
00:15:40.880
But the interesting thing with bottlenecks
link |
00:15:44.560
is people often use bottlenecks as a way
link |
00:15:47.680
to argue against exponential growth.
link |
00:15:49.960
They say, well, there's no way you can
link |
00:15:52.480
overcome this bottleneck,
link |
00:15:53.880
but we seem to somehow keep coming up with new ways
link |
00:15:57.000
to overcome whatever bottlenecks the critics come up with.
link |
00:16:01.200
Which is fascinating.
link |
00:16:02.040
I don't know how you overcome the data bottleneck,
link |
00:16:04.640
but probably more efficient training algorithms.
link |
00:16:07.040
Yeah, well, you already mentioned that,
link |
00:16:08.280
that these training algorithms are getting much better
link |
00:16:10.280
at using smaller amounts of data.
link |
00:16:12.480
We also are just capturing a lot more data
link |
00:16:15.320
than we used to, especially in China,
link |
00:16:17.680
but all around us.
link |
00:16:18.920
So those are both important.
link |
00:16:20.560
In some applications, you can simulate the data,
link |
00:16:24.200
video games, some of the self driving car systems
link |
00:16:28.960
are, you know, simulating driving.
link |
00:16:30.720
And of course that has some risks and weaknesses,
link |
00:16:34.320
but you can also, and if you want to, you know,
link |
00:16:37.560
exhaust all the different ways you could beat a video game,
link |
00:16:39.920
you could just simulate all the options.
link |
00:16:42.360
Can we take a step in that direction of autonomous vehicles?
link |
00:16:45.000
So, I'm talking to the CTO of Waymo tomorrow.
link |
00:16:48.680
And obviously, I'm talking to Elon again in a couple of weeks.
link |
00:16:52.440
What's your thoughts on autonomous vehicles?
link |
00:16:57.080
Like, where do we stand as a problem
link |
00:17:01.880
that has the potential of revolutionizing the world?
link |
00:17:04.560
Well, you know, I'm really excited about that,
link |
00:17:06.840
but it's become much clearer that the original way
link |
00:17:10.000
that I thought about it, and most people thought about it,
link |
00:17:11.440
like, you know, will we have a self driving car or not,
link |
00:17:13.520
is way too simple.
link |
00:17:15.720
The better way to think about it is that there's a whole
link |
00:17:18.360
continuum of how much driving and assisting
link |
00:17:21.880
a car can do.
link |
00:17:22.720
I noticed that you're right next door
link |
00:17:24.800
to the Toyota Research Institute.
link |
00:17:26.000
That's a total accident.
link |
00:17:27.640
I love the TRI folks, but yeah.
link |
00:17:29.360
Have you talked to Gill Pratt?
link |
00:17:30.840
Yeah, we're going to, we're supposed to talk.
link |
00:17:33.800
It's kind of hilarious.
link |
00:17:35.000
So there's kind of the, I think it's a good counterpart
link |
00:17:37.000
to see what Elon is doing, and hopefully they can be frank
link |
00:17:40.080
in how they think about each other,
link |
00:17:41.480
because I've heard both of them talk about it.
link |
00:17:43.920
But they're much more, you know, this is an assistive,
link |
00:17:46.360
a guardian angel that watches over you,
link |
00:17:48.600
as opposed to try to do everything.
link |
00:17:50.480
I think there's some things like driving on a highway,
link |
00:17:53.360
you know, from LA to Phoenix,
link |
00:17:55.360
where it's mostly good weather, straight roads.
link |
00:17:58.160
That's close to a solved problem, let's face it.
link |
00:18:01.280
In other situations, you know, driving through the snow
link |
00:18:03.680
and in Boston, where the roads are kind of crazy.
link |
00:18:06.240
And most importantly, you have to make a lot of judgments
link |
00:18:08.240
about what the other driver's going to do
link |
00:18:09.480
at these intersections that aren't really right angles
link |
00:18:11.720
and aren't very well described.
link |
00:18:13.440
It's more like game theory.
link |
00:18:15.360
That's a much harder problem and requires understanding
link |
00:18:19.120
of human motivations, and so there's a continuum there
link |
00:18:23.400
of some places where the cars will work very well
link |
00:18:27.600
and others where it could probably take decades.
link |
00:18:30.960
What do you think about the Waymo?
link |
00:18:33.520
So as you mentioned, two companies
link |
00:18:36.040
that actually have cars on the road,
link |
00:18:38.080
there's the Waymo approach that it's more like,
link |
00:18:40.680
we're not going to release anything until it's perfect
link |
00:18:42.840
and we're going to be very strict about the streets
link |
00:18:46.720
that we travel on, but it'll be perfect.
link |
00:18:49.200
Yeah.
link |
00:18:50.240
Well, I'm smart enough to be humble
link |
00:18:53.960
and not try to get in between them.
link |
00:18:55.080
I know there's very bright people on both sides
link |
00:18:57.080
of the argument, I've talked to them
link |
00:18:58.080
and they make convincing arguments to me
link |
00:19:00.040
about how careful they need to be
link |
00:19:01.720
and the social acceptance.
link |
00:19:04.040
Some people thought that when the first few people died
link |
00:19:07.440
from self driving cars, it would shut down the industry,
link |
00:19:09.880
but it was more of a blip actually.
link |
00:19:11.840
And you know, so that was interesting.
link |
00:19:14.480
Of course, there's still a concern
link |
00:19:16.080
that there could be setbacks
link |
00:19:19.040
if we do this wrong, you know,
link |
00:19:21.480
your listeners may be familiar
link |
00:19:22.640
with the different levels of self driving,
link |
00:19:24.160
you know, level one, two, three, four, five.
link |
00:19:26.800
I think Andrew Ng has convinced me
link |
00:19:28.760
that this idea of really focusing on level four,
link |
00:19:32.600
where you only go in areas that are well mapped,
link |
00:19:35.000
rather than just going out in the wild,
link |
00:19:37.480
is the way things are going to evolve.
link |
00:19:39.720
But you can just keep expanding those areas
link |
00:19:42.600
where you've mapped things really well,
link |
00:19:44.040
where you really understand them
link |
00:19:45.000
and eventually they all become kind of interconnected.
link |
00:19:47.960
And that could be a kind of another way of progressing
link |
00:19:52.120
to make it more feasible over time.
link |
00:19:55.280
I mean, that's kind of like the Waymo approach,
link |
00:19:57.440
which is they just now released,
link |
00:19:59.560
I think just like a day or two ago,
link |
00:20:04.080
anyone from the public in the Phoenix, Arizona area
link |
00:20:11.160
can, you know, get a ride in a Waymo car
link |
00:20:14.400
with no person, no driver.
link |
00:20:16.160
Oh, they've taken away the safety driver?
link |
00:20:17.680
Oh yeah, for a while now there's been no safety driver.
link |
00:20:21.080
Okay, because I mean, I've been following
link |
00:20:22.520
that one in particular, but I thought it was kind of funny
link |
00:20:24.880
about a year ago when they had the safety driver
link |
00:20:27.000
and then they added a second safety driver
link |
00:20:28.440
because the first safety driver would fall asleep.
link |
00:20:30.920
I'm not sure they're going in the right direction with that.
link |
00:20:33.400
No, Waymo in particular has
link |
00:20:38.200
done a really good job with that.
link |
00:20:39.520
They actually have a very interesting infrastructure
link |
00:20:44.360
of remote observation.
link |
00:20:47.480
So they're not controlling the vehicles remotely,
link |
00:20:49.840
but they're able to, it's like a customer service.
link |
00:20:52.440
They can anytime tune into the car.
link |
00:20:55.520
I bet they can probably remotely control it as well,
link |
00:20:58.160
but that's officially not the function that they...
link |
00:21:00.920
Yeah, I can see that being really,
link |
00:21:02.760
because I think the thing that's proven harder
link |
00:21:06.280
than maybe some of the early people expected was
link |
00:21:08.200
there's a long tail of weird exceptions.
link |
00:21:10.840
So you can deal with 90, 99, 99.99% of the cases,
link |
00:21:15.440
but then there's something that just never been seen
link |
00:21:17.440
before in the training data.
link |
00:21:18.840
And humans more or less can work around that,
link |
00:21:21.120
although let me be clear and note,
link |
00:21:22.880
there are about 30,000 human fatalities a year
link |
00:21:25.640
just in the United States and maybe a million worldwide.
link |
00:21:28.400
So they're far from perfect,
link |
00:21:30.040
but I think people have higher expectations of machines.
link |
00:21:33.480
They wouldn't tolerate that level of death
link |
00:21:36.320
and damage from a machine.
link |
00:21:40.080
And so we have to do a lot better
link |
00:21:41.840
at dealing with those edge cases.
link |
00:21:43.520
And also the tricky thing that,
link |
00:21:45.920
if I have a criticism for the Waymo folks,
link |
00:21:47.920
there's such a huge focus on safety,
link |
00:21:51.840
where people don't talk enough about creating products
link |
00:21:55.200
that customers love, that human beings love using.
link |
00:22:00.360
You know, it's very easy to create a thing that's safe
link |
00:22:03.360
at the extremes, but then nobody wants to get into it.
link |
00:22:06.960
Yeah, well, back to Elon.
link |
00:22:08.760
I think one of, part of his genius
link |
00:22:10.560
was with the electric cars, before he came along,
link |
00:22:12.840
electric cars were all kind of underpowered, really light,
link |
00:22:15.720
and they were sort of wimpy cars that weren't fun.
link |
00:22:20.720
And the first thing he did was,
link |
00:22:22.680
he made a roadster that went zero to 60 faster
link |
00:22:25.680
than just about any other car, and went to the other end.
link |
00:22:28.520
And I think that was a really wise marketing move
link |
00:22:30.840
as well as a wise technology move.
link |
00:22:33.200
Yeah, it's difficult to figure out
link |
00:22:34.880
what the right marketing move is for AI systems.
link |
00:22:38.000
That's always been, I think it requires guts
link |
00:22:42.000
and risk taking, which is what Elon practices,
link |
00:22:46.440
I mean, to the chagrin of perhaps investors or whatever,
link |
00:22:50.600
but it also requires rethinking what you're doing.
link |
00:22:54.440
I think way too many people are unimaginative,
link |
00:22:57.640
intellectually lazy, and when they take AI,
link |
00:22:59.880
they basically say, what are we doing now?
link |
00:23:01.760
How can we make a machine do the same thing?
link |
00:23:04.120
Maybe we'll save some costs, we'll have less labor.
link |
00:23:06.840
And yeah, it's not necessarily the worst thing
link |
00:23:08.640
in the world to do, but it's really not leading
link |
00:23:10.720
to a quantum change in the way you do things.
link |
00:23:13.360
When Jeff Bezos said, hey, we're gonna use the internet
link |
00:23:16.640
to change how bookstores work, and we're gonna use technology,
link |
00:23:19.280
he didn't go and say, okay, let's put a robot cashier
link |
00:23:22.720
where the human cashier is and leave everything else alone.
link |
00:23:25.640
That would have been a very lame way
link |
00:23:27.080
to automate a bookstore.
link |
00:23:28.360
He went soup to nuts and said,
link |
00:23:30.000
let's just rethink it, we get rid of the physical bookstore,
link |
00:23:33.040
we have a warehouse, we have delivery,
link |
00:23:34.520
we have people order on a screen,
link |
00:23:36.360
and everything was reinvented.
link |
00:23:38.440
And that's been the story of these general purpose
link |
00:23:41.640
technologies all through history.
link |
00:23:43.520
In my books, I write about electricity
link |
00:23:46.280
and how for 30 years, there was almost no productivity gain
link |
00:23:50.320
from the electrification of factories a century ago.
link |
00:23:52.960
And that's not because electricity
link |
00:23:54.240
is a wimpy, useless technology,
link |
00:23:55.760
we all know how awesome electricity is.
link |
00:23:57.520
It's because at first, they really didn't rethink the factories.
link |
00:24:00.480
It was only after they reinvented them,
link |
00:24:02.080
and we describe how in the book,
link |
00:24:04.000
then you suddenly got a doubling
link |
00:24:05.400
and tripling of productivity growth.
link |
00:24:07.520
But it's the combination of the technology
link |
00:24:09.880
with the new business models, new business organization.
link |
00:24:12.920
That just takes a long time
link |
00:24:13.960
and it takes more creativity than most people have.
link |
00:24:16.880
Can you maybe linger on electricity?
link |
00:24:18.960
Because that's a fun one.
link |
00:24:19.920
Yeah, I'll tell you what happened.
link |
00:24:22.440
Before electricity, there were basically steam engines
link |
00:24:25.720
or sometimes water wheels.
link |
00:24:27.000
And to power the machinery,
link |
00:24:28.320
you had to have pulleys and crankshafts.
link |
00:24:30.520
And you really can't make them too long
link |
00:24:32.080
because they'll break from the torsion.
link |
00:24:34.000
So all the equipment was kind of clustered around this,
link |
00:24:36.440
one giant steam engine.
link |
00:24:37.920
You can't make small steam engines either
link |
00:24:39.440
because of thermodynamics.
link |
00:24:40.760
So you have one giant steam engine,
link |
00:24:42.320
all the equipment clustered around it, multi story,
link |
00:24:44.320
they have it vertical to minimize the distance
link |
00:24:46.080
as well as horizontal.
link |
00:24:47.720
And then when they did electricity,
link |
00:24:48.920
they took out the steam engine,
link |
00:24:50.040
they got the biggest electric motor
link |
00:24:51.320
they could buy from General Electric
link |
00:24:52.920
or someone like that.
link |
00:24:54.160
And nothing much else changed.
link |
00:24:57.880
It took until a generation of managers retired or died
link |
00:25:01.960
30 years later that people started thinking,
link |
00:25:04.400
wait, we don't have to do it that way.
link |
00:25:05.800
You can make electric motors,
link |
00:25:07.680
big, small, medium.
link |
00:25:09.440
You can put one with each piece of equipment.
link |
00:25:11.480
There's this big debate,
link |
00:25:12.320
if you read the management literature
link |
00:25:13.360
between what they call group drive versus unit drive,
link |
00:25:16.120
where every machine would have its own motor.
link |
00:25:18.920
Well, once they did that,
link |
00:25:19.800
once they went to unit drive,
link |
00:25:21.000
those guys won the debate,
link |
00:25:23.000
then you started having a new kind of factory,
link |
00:25:25.040
which is sometimes spread out over acres,
link |
00:25:28.240
single story and each piece of equipment had its own motor.
link |
00:25:31.320
And most importantly,
link |
00:25:32.320
they weren't laid out based on who needed the most power.
link |
00:25:35.000
They were laid out based on
link |
00:25:37.600
what is the workflow of materials, assembly line.
link |
00:25:40.560
Let's have it go from this machine
link |
00:25:41.720
to that machine to that machine.
link |
00:25:43.240
Once they rethought the factory that way,
link |
00:25:46.040
huge increases in productivity, it was just staggering.
link |
00:25:48.360
People like Paul David have documented this
link |
00:25:50.080
in their research papers.
link |
00:25:51.760
And I think that there's a lesson you see over and over.
link |
00:25:55.920
It happened when the steam engine
link |
00:25:57.080
changed manual production.
link |
00:25:58.560
It's happened with the computerization.
link |
00:26:00.240
People like Michael Hammer said,
link |
00:26:01.520
don't automate, obliterate.
link |
00:26:03.600
In each case,
link |
00:26:05.160
the big gains only came once smart entrepreneurs
link |
00:26:09.320
and managers basically reinvented their industries.
link |
00:26:13.000
I mean, one other interesting point about all that
link |
00:26:14.640
is that during that reinvention period,
link |
00:26:18.880
you often actually,
link |
00:26:20.640
not only don't see productivity growth,
link |
00:26:22.200
you can actually see a slipping back,
link |
00:26:24.320
measured productivity actually falls.
link |
00:26:26.520
I just wrote a paper with Chad Syverson and Daniel Rock
link |
00:26:29.040
called The Productivity J-Curve,
link |
00:26:31.520
which basically shows that in a lot of these cases,
link |
00:26:33.840
you have a downward dip before it goes up.
link |
00:26:36.520
And that downward dip is when everyone's trying
link |
00:26:38.320
to reinvent things.
link |
00:26:40.400
And you could say that they're creating knowledge
link |
00:26:43.160
and intangible assets,
link |
00:26:44.640
but that doesn't show up on anyone's balance sheet.
link |
00:26:46.800
It doesn't show up in GDP.
link |
00:26:48.320
So it's as if they're doing nothing.
link |
00:26:50.080
Like take self driving cars, we're just talking about it.
link |
00:26:52.480
There have been hundreds of billions of dollars spent
link |
00:26:55.760
developing self driving cars.
link |
00:26:57.880
And basically no chauffeur has lost his job,
link |
00:27:01.560
no taxi driver.
link |
00:27:02.400
I guess I gotta check on the one thing.
link |
00:27:03.560
It's a big J curve.
link |
00:27:04.480
Yeah, so there's a bunch of spending
link |
00:27:06.080
and no real consumer benefit.
link |
00:27:08.120
Now they're doing that in the belief,
link |
00:27:11.440
I think the justified belief
link |
00:27:13.240
that they will get the upward part of the J curve
link |
00:27:15.160
and they will be some big returns,
link |
00:27:16.920
but in the short run, you're not seeing it.
link |
00:27:19.320
That's happening with a lot of other AI technologies,
link |
00:27:21.880
just as it happened with earlier general purpose technologies.
link |
00:27:25.080
And it's one of the reasons we're having
link |
00:27:26.520
relatively low productivity growth lately.
link |
00:27:29.280
As an economist, one of the things that disappoints me
link |
00:27:31.400
is that as eye popping as these technologies are,
link |
00:27:34.440
you and I are both excited about some things they can do.
link |
00:27:36.880
The economic productivity statistics are kind of dismal.
link |
00:27:40.280
We actually believe it or not,
link |
00:27:42.200
have had lower productivity growth in the past
link |
00:27:45.440
15 years than we did in the previous 15 years,
link |
00:27:48.840
in the 90s and early 2000s.
link |
00:27:51.360
And so that's not what you would have expected
link |
00:27:53.200
if these technologies were that much better.
link |
00:27:55.520
But I think we're in kind of a long J curve there.
link |
00:27:59.400
Personally, I'm optimistic.
link |
00:28:00.560
We'll start seeing the upward tick,
link |
00:28:02.080
maybe as soon as next year.
link |
00:28:04.520
But the past decade has been a bit disappointing
link |
00:28:08.520
if you thought there's a one to one relationship
link |
00:28:10.000
between cool technology and higher productivity.
link |
00:28:12.760
What would you place your biggest hope
link |
00:28:15.240
for productivity increases on?
link |
00:28:17.240
Like you kind of said at a high level AI,
link |
00:28:19.840
but if I were to think about
link |
00:28:22.840
what has been so revolutionary in the last 10 years,
link |
00:28:28.200
or maybe 15 years, and thinking about the internet,
link |
00:28:32.240
I would say things like,
link |
00:28:35.800
hope I'm not saying anything ridiculous,
link |
00:28:37.160
but everything from Wikipedia to Twitter.
link |
00:28:41.040
So like these kind of websites, not so much AI,
link |
00:28:46.080
but like I would expect to see some kind of big productivity
link |
00:28:49.760
increases from just the connectivity
link |
00:28:52.680
between people and the access to more information.
link |
00:28:58.040
Yeah, well, that's another area
link |
00:29:00.040
I've done quite a bit of research on actually
link |
00:29:01.840
is these free goods like Wikipedia, Facebook, Twitter,
link |
00:29:06.040
Zoom, we're actually doing this in person,
link |
00:29:08.120
but almost everything else I do these days is online.
link |
00:29:12.360
The interesting thing about all those is
link |
00:29:14.680
most of them have a price of zero.
link |
00:29:17.640
You know, what do you pay for Wikipedia?
link |
00:29:19.920
Maybe like a little bit for the electrons
link |
00:29:21.680
to come to your house.
link |
00:29:22.960
Basically zero, right?
link |
00:29:25.600
Let me take a small pause and say,
link |
00:29:26.880
I donate to Wikipedia often, you should too.
link |
00:29:28.880
It's good for you, yeah.
link |
00:29:30.040
So, but what does that mean for GDP?
link |
00:29:32.440
GDP is based on the price and quantity
link |
00:29:36.080
of all the goods bought and sold.
link |
00:29:37.720
If something has zero price,
link |
00:29:39.520
you know how much it contributes to GDP?
link |
00:29:42.240
To a first approximation, zero.
link |
00:29:44.520
So these digital goods that we're getting more and more
link |
00:29:47.440
of we're spending more and more hours a day
link |
00:29:50.200
consuming stuff off the screens, little screens,
link |
00:29:52.600
big screens, that doesn't get priced into GDP.
link |
00:29:56.120
It's like they don't exist.
link |
00:29:58.600
That doesn't mean they don't create value.
link |
00:29:59.920
I get a lot of value from watching cat videos
link |
00:30:03.320
and reading Wikipedia articles and listening to podcasts,
link |
00:30:06.080
even if I don't pay for them.
link |
00:30:08.400
So we've got a mismatch there.
link |
00:30:10.360
Now, in fairness, economists since Simon Kuznets
link |
00:30:13.400
invented GDP and productivity, all those statistics
link |
00:30:16.200
back in the 1930s, he recognized, he in fact said,
link |
00:30:19.640
this is not a measure of wellbeing.
link |
00:30:21.480
This is not a measure of welfare.
link |
00:30:23.160
It's a measure of production.
link |
00:30:25.080
But almost everybody has kind of forgotten
link |
00:30:28.960
that he said that and they just use it.
link |
00:30:30.960
It's like, how well off are we?
link |
00:30:32.120
What was GDP last year?
link |
00:30:33.240
It was 2.3% growth or whatever.
link |
00:30:35.760
That is how much physical production,
link |
00:30:39.440
but it's not the value we're getting.
link |
00:30:42.320
We need a new set of statistics.
link |
00:30:43.800
And I'm working with some colleagues,
link |
00:30:45.280
Avi Collis and others to develop something
link |
00:30:48.360
we call GDP-B.
link |
00:30:50.440
GDP-B measures the benefits you get, not the cost.
link |
00:30:55.440
If you get benefit from Zoom or Wikipedia or Facebook,
link |
00:31:00.400
then that gets counted in GDP-B, even if you pay zero for it.
link |
00:31:04.560
So, you know, back to the original point,
link |
00:31:07.360
I think there is a lot of gain over the past decade
link |
00:31:10.480
in these digital goods that doesn't show up in GDP,
link |
00:31:15.280
doesn't show up in productivity.
link |
00:31:16.440
By the way, productivity is just defined as GDP
link |
00:31:18.560
divided by hours worked.
link |
00:31:20.080
So, if you mismeasure GDP, you mismeasure productivity
link |
00:31:23.200
by the exact same amount.
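In symbols, restating that definition:

$$\text{productivity} = \frac{\text{GDP}}{\text{hours worked}}$$

so if measured GDP is understated by some fraction, measured productivity is understated by exactly that fraction, since hours worked are unaffected.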
link |
00:31:25.360
That's something we need to fix.
link |
00:31:26.480
I'm working with the statistical agencies
link |
00:31:28.440
to come up with a new set of metrics.
link |
00:31:30.200
And over the coming years, I think we'll see,
link |
00:31:33.240
we're not going to do away with GDP.
link |
00:31:34.480
It's very useful, but we'll see a parallel set of accounts
link |
00:31:37.240
that measure the benefits.
link |
00:31:38.400
How difficult is it to get that B in the GDP-B?
link |
00:31:41.080
It's pretty hard.
link |
00:31:41.800
I mean, one of the reasons it hasn't been done before
link |
00:31:44.360
is that, you know, you can measure it,
link |
00:31:46.680
the cash register, what people pay for stuff.
link |
00:31:49.000
But how do you measure what they would have paid?
link |
00:31:51.000
Like what the value is, that's a lot harder.
link |
00:31:53.360
You know, how much is Wikipedia worth to you?
link |
00:31:56.000
That's what we have to answer.
link |
00:31:57.480
And to do that, what we do is,
link |
00:31:59.280
we can use online experiments.
link |
00:32:00.680
We do massive online choice experiments.
link |
00:32:03.080
We ask hundreds of thousands, if not millions, of people
link |
00:32:05.680
to do lots of sort of AB tests.
link |
00:32:07.720
How much would I have to pay you
link |
00:32:09.080
to give up Wikipedia for a month?
link |
00:32:10.840
How much would I have to pay you
link |
00:32:12.000
to stop using your phone?
link |
00:32:14.120
And in some cases, it's hypothetical.
link |
00:32:15.960
In other cases, we actually enforce it,
link |
00:32:17.520
which is kind of expensive.
link |
00:32:18.920
Like we pay somebody $30 to stop using Facebook
link |
00:32:22.440
and we see if they'll do it.
link |
00:32:23.440
And some people will give it up for $10.
link |
00:32:26.280
Some people won't give it up
link |
00:32:27.120
even if you give them $100.
link |
00:32:28.880
And then you get a whole demand curve.
link |
00:32:31.080
You get to see what all the different prices are
link |
00:32:33.600
and how much value different people get.
link |
00:32:36.000
And not surprisingly,
link |
00:32:36.920
different people have different values.
link |
00:32:38.280
We find that women tend to value Facebook more than men.
link |
00:32:41.560
Old people tend to value it a little bit more
link |
00:32:43.240
than young people, which I thought was interesting.
link |
00:32:44.480
I think young people maybe know about other networks
link |
00:32:46.640
that I don't know the name of,
link |
00:32:47.720
that are better than Facebook.
link |
00:32:50.320
And so you get to see these patterns,
link |
00:32:53.480
but every person's individual.
link |
00:32:55.480
And then if you add up all those numbers,
link |
00:32:57.280
you start getting an estimate of the value.
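A minimal sketch of that aggregation step (all numbers below are invented for illustration; the actual studies use far larger samples and careful statistical weighting):

```python
# Hypothetical willingness-to-accept (WTA) answers, in dollars, to
# "How much would I have to pay you to give up this service for a month?"
wta = [10, 25, 30, 30, 50, 75, 100, 100, 150, 500]

# The answers trace out a demand curve: at an offered payment p,
# everyone whose WTA is at or below p accepts and gives up the service.
for p in (10, 50, 100):
    accepters = sum(1 for v in wta if v <= p)
    print(f"offer ${p}: {accepters}/{len(wta)} would give it up")

# Summing individual valuations gives a consumer-surplus-style estimate
# of total value created, even though the price paid is zero.
print(f"estimated total monthly value: ${sum(wta):,}")
```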
link |
00:33:00.080
Okay, first of all, that's brilliant.
link |
00:33:01.280
Is this work that will eventually be published?
link |
00:33:05.760
Yeah, well, there's a version of it
link |
00:33:07.080
in the proceedings of the National Academy of Sciences
link |
00:33:09.560
about, I think we call it massive online choice experiments.
link |
00:33:11.880
I should remember the title, but it's on my website.
link |
00:33:14.920
So yeah, we have some more papers coming out on it,
link |
00:33:17.520
but the first one is already out.
link |
00:33:20.160
You know, it's kind of a fascinating mystery
link |
00:33:22.320
that Twitter, Facebook,
link |
00:33:24.320
like all these social networks are free.
link |
00:33:26.800
And it seems like almost none of them,
link |
00:33:29.800
except for YouTube,
link |
00:33:31.440
have experimented with removing ads for money.
link |
00:33:35.200
Can you like, do you understand that
link |
00:33:37.160
from both economics and the product perspective?
link |
00:33:39.760
Yeah, it's something that, so I teach a course
link |
00:33:41.920
on digital business models.
link |
00:33:43.280
I used to at MIT, now at Stanford.
link |
00:33:45.040
I'm not quite sure.
link |
00:33:45.880
I'm not teaching until next spring.
link |
00:33:47.400
I'm still thinking what my course is gonna be.
link |
00:33:50.080
But there are a lot of different business models.
link |
00:33:52.240
And when you have something that's zero marginal cost,
link |
00:33:54.920
there's a lot of forces,
link |
00:33:56.440
especially if there's any kind of competition
link |
00:33:57.920
that push prices down to zero.
link |
00:33:59.960
But you can have ad supported systems.
link |
00:34:03.360
You can bundle things together.
link |
00:34:05.560
You can have volunteer, you mentioned Wikipedia.
link |
00:34:07.360
There's donations.
link |
00:34:08.800
And I think economists underestimate the power
link |
00:34:11.680
of volunteerism and donations.
link |
00:34:14.600
You know, National Public Radio.
link |
00:34:16.080
Actually, how do you, this podcast,
link |
00:34:17.640
how is this, what's the revenue model?
link |
00:34:19.480
There's sponsors at the beginning.
link |
00:34:22.280
And the funny thing is,
link |
00:34:24.680
I tell people they can skip them,
link |
00:34:26.680
I tell them the timestamp.
link |
00:34:27.880
So if you wanna skip the sponsors, you're free to.
link |
00:34:31.000
But it's funny,
link |
00:34:33.600
I read the advertisement.
link |
00:34:36.240
And a bunch of people enjoy reading it.
link |
00:34:38.440
Well, they may learn something from it.
link |
00:34:39.800
And also, from the advertiser's perspective,
link |
00:34:42.960
those are people who are actually interested, you know?
link |
00:34:45.080
Like, I mean, the example I sometimes get is like,
link |
00:34:46.960
I bought a car recently.
link |
00:34:48.360
And all of a sudden, all the car ads were like,
link |
00:34:50.760
interesting to me.
link |
00:34:51.600
And then like, now that I have the car,
link |
00:34:54.360
like I sort of zone out on, okay, but that's fine.
link |
00:34:56.280
The car companies, they don't really wanna be advertising
link |
00:34:58.720
to me if I'm not gonna buy their product.
link |
00:35:01.360
So there are a lot of these different revenue models.
link |
00:35:03.560
And, you know, it's a little complicated,
link |
00:35:06.880
but the economic theory has to do with what the shape
link |
00:35:08.680
of the demand curve is,
link |
00:35:09.520
when it's better to monetize it with charging people
link |
00:35:13.200
versus when you're better off doing advertising.
link |
00:35:15.640
I mean, in short, when the demand curve
link |
00:35:18.320
is relatively flat and wide,
link |
00:35:20.640
like generic news and things like that,
link |
00:35:22.800
then you tend to do better with advertising.
link |
00:35:25.960
If it's a good that's only useful to a small number
link |
00:35:28.880
of people, but they're willing to pay a lot.
link |
00:35:30.360
They have a very high value for it.
link |
00:35:32.800
Then advertising isn't gonna work as well.
link |
00:35:34.600
You're better off charging for it.
link |
00:35:36.120
Both of them have some inefficiencies.
link |
00:35:38.080
And then when you get into targeting
link |
00:35:39.480
and you get into these other revenue models,
link |
00:35:40.600
it gets more complicated.
link |
00:35:41.960
But there's some economic theory on it.
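A toy version of that flat-versus-steep intuition (the valuations and the per-user ad revenue are made up; real models also weigh ad-load costs, targeting, and two-sided platform effects):

```python
# Compare the best single subscription price against ad support (price 0).
def best_subscription_revenue(valuations):
    # At price p, only users who value the product at p or more subscribe.
    return max(p * sum(1 for v in valuations if v >= p)
               for p in set(valuations))

def ad_revenue(valuations, per_user=3.0):
    # At a price of zero, everyone uses it and every user sees ads.
    return per_user * len(valuations)

flat_and_wide = [v / 10 for v in range(1, 101)]  # many users, modest values
steep_and_narrow = [100.0] * 5                   # few users, high values

for name, vals in (("flat & wide", flat_and_wide),
                   ("steep & narrow", steep_and_narrow)):
    print(f"{name}: best subscription ${best_subscription_revenue(vals):.0f}"
          f" vs ads ${ad_revenue(vals):.0f}")
# flat & wide: best subscription $255 vs ads $300   -> ads win
# steep & narrow: best subscription $500 vs ads $15 -> charging wins
```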
link |
00:35:45.320
I also think, to be frank,
link |
00:35:47.560
there's just a lot of experimentation that's needed
link |
00:35:49.560
because sometimes things are a little counterintuitive,
link |
00:35:53.200
especially when you get into what are called
link |
00:35:55.160
two sided networks or platform effects,
link |
00:35:57.680
where you may grow the market on one side
link |
00:36:01.840
and harvest the revenue on the other side.
link |
00:36:03.920
You know, Facebook tries to get more and more users
link |
00:36:06.080
and then they harvest the revenue from advertising.
link |
00:36:08.960
So that's another way of kind of thinking about it.
link |
00:36:12.040
Is it strange to you that they haven't experimented?
link |
00:36:14.400
Well, they are experimenting.
link |
00:36:15.360
So, you know, they are doing some experiments
link |
00:36:17.600
about what the willingness is for people to pay.
link |
00:36:21.000
I think that when they do the math,
link |
00:36:23.560
it's gonna work out that they still are better off
link |
00:36:26.360
with an advertising driven model, but...
link |
00:36:29.400
What about a mix?
link |
00:36:30.360
Like this is what YouTube is, right?
link |
00:36:32.360
Yeah, you allow the person to decide,
link |
00:36:36.360
the customer to decide exactly which model they prefer.
link |
00:36:39.040
Yeah, no, that can work really well, you know.
link |
00:36:40.920
And newspapers, of course,
link |
00:36:41.760
have done this for a long time,
link |
00:36:42.760
the Wall Street Journal, the New York Times,
link |
00:36:44.560
they had subscription revenue,
link |
00:36:45.840
they also have advertising revenue.
link |
00:36:48.080
And that can definitely work.
link |
00:36:51.960
Online, it's a lot easier to have a dial
link |
00:36:54.080
that's much more personalized
link |
00:36:55.240
and everybody can kind of roll their own mix.
link |
00:36:57.720
And I could imagine, you know,
link |
00:36:59.360
having a little slider about how much advertising
link |
00:37:03.080
you want or are willing to take.
link |
00:37:05.040
And if it's done right and it's incentive compatible,
link |
00:37:07.400
it could be a win win where both the content provider
link |
00:37:10.960
and the consumer are better off
link |
00:37:12.560
than they would have been before.
link |
00:37:14.480
Yeah, you know, the done right part is a really good point.
link |
00:37:17.920
Like with Jeff Bezos and the single click purchase
link |
00:37:21.080
on Amazon, the frictionless effort there,
link |
00:37:23.880
if I could just rant for a second
link |
00:37:25.760
about the Wall Street Journal,
link |
00:37:27.240
all the newspapers you mentioned
link |
00:37:29.280
is that I have to click so many times to subscribe to them
link |
00:37:34.800
that I literally don't subscribe
link |
00:37:37.400
just because of the number of times I have to click.
link |
00:37:39.520
I'm totally with you.
link |
00:37:40.360
I don't understand why so many companies make it so hard.
link |
00:37:44.560
I mean, another example is when you buy a new iPhone
link |
00:37:47.240
or a new computer, whatever,
link |
00:37:48.880
I feel like, okay, I'm gonna like lose an afternoon,
link |
00:37:51.440
just like loading up and getting all my stuff back.
link |
00:37:53.840
And for a lot of us, that's more of a deterrent
link |
00:37:57.360
than the price.
link |
00:37:58.640
And if they could make it painless,
link |
00:38:01.840
we'd give them a lot more money.
link |
00:38:03.720
So I'm hoping somebody listening is working
link |
00:38:06.480
on making it more painless for us to buy your products.
link |
00:38:10.040
If we could just like linger a little bit
link |
00:38:12.320
on the social network thing,
link |
00:38:13.720
because there's this Netflix documentary, The Social Dilemma.
link |
00:38:18.240
Yeah, I saw that.
link |
00:38:19.320
And Tristan Harris and company, yeah.
link |
00:38:22.680
And people's data, it's really sensitive
link |
00:38:30.440
and social networks are at the core,
link |
00:38:32.440
arguably, of many societal tensions
link |
00:38:37.480
and some of the most important things happening in society.
link |
00:38:39.680
So it feels like it's important to get this right,
link |
00:38:42.080
both from a business model perspective
link |
00:38:44.000
and just like a trust perspective.
link |
00:38:46.400
I still gotta, I mean, it just still feels like,
link |
00:38:49.880
I know there's experimentation going on.
link |
00:38:52.160
It still feels like everyone is afraid
link |
00:38:54.720
to try different business models, like really try.
link |
00:38:57.520
Well, I'm worried that people are afraid
link |
00:38:59.600
to try different business models.
link |
00:39:01.200
I'm also worried that some of the business models
link |
00:39:03.480
may lead them to bad choices.
link |
00:39:06.280
And Danny Kahneman talks about system one
link |
00:39:10.280
and system two, sort of like our reptilian brain
link |
00:39:12.280
that reacts quickly to what we see,
link |
00:39:14.360
see something interesting, we click on it,
link |
00:39:16.160
we retweet it versus our system two,
link |
00:39:20.800
our frontal cortex that's supposed to be more careful
link |
00:39:24.120
and rational that really doesn't make as many decisions
link |
00:39:27.000
as it should.
link |
00:39:28.880
I think there's a tendency for a lot of these social networks
link |
00:39:32.720
to really exploit system one, our quick instant reaction,
link |
00:39:37.760
make it, so we just click on stuff and pass it on
link |
00:39:41.000
and not really think carefully about it.
link |
00:39:42.360
And in that system, it tends to be driven by sex, violence,
link |
00:39:47.360
disgust, anger, fear, these relatively primitive kinds
link |
00:39:52.880
of emotions, maybe they're important for a lot of purposes,
link |
00:39:56.000
but they're not a great way to organize a society.
link |
00:39:58.960
And most importantly, when you think about this huge,
link |
00:40:01.960
amazing information infrastructure we've had
link |
00:40:04.360
that's connected billions of brains across the globe,
link |
00:40:08.040
not just we can all access information,
link |
00:40:09.680
but we can all contribute to it and share it.
link |
00:40:12.720
Arguably the most important thing
link |
00:40:14.160
that that network should do is favor truth over falsehoods.
link |
00:40:19.400
And the way it's been designed,
link |
00:40:21.680
not necessarily intentionally, is exactly the opposite.
link |
00:40:24.680
My MIT colleagues Sinan Aral and Deb Roy and others at MIT
link |
00:40:29.480
did a terrific paper on the cover of Science
link |
00:40:31.800
and they document what we all feared,
link |
00:40:33.520
which is that lies spread faster than truth
link |
00:40:37.800
on social networks.
link |
00:40:39.800
They looked at a bunch of tweets and retweets
link |
00:40:42.800
and they found that false information was more likely
link |
00:40:45.360
to spread further, faster to more people.
link |
00:40:49.040
And why was that?
link |
00:40:50.000
It's not because people like lies,
link |
00:40:53.440
it's because people like things that are shocking, amazing.
link |
00:40:56.640
Can you believe this?
link |
00:40:57.920
Something that is not mundane,
link |
00:41:00.360
not something everybody else already knew.
link |
00:41:02.560
And what are the most unbelievable things?
link |
00:41:05.480
Well, lies, and so if you wanna find something unbelievable,
link |
00:41:09.920
it's a lot easier to do that
link |
00:41:10.760
if you're not constrained by the truth.
link |
00:41:12.520
So they found that the emotional valence
link |
00:41:15.680
of false information was just much higher,
link |
00:41:18.000
it was more likely to be shocking
link |
00:41:19.720
and therefore more likely to be spread.
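The mechanism is easy to see in a toy cascade model; this is not the methodology of the Science study, just an illustration with invented share probabilities of how a slightly higher per-viewer reshare rate compounds into much larger spread:

```python
import random

def cascade_size(share_prob, fanout=5, max_depth=8):
    """Toy retweet cascade: everyone who shares shows the item to `fanout`
    followers, each of whom reshares with probability `share_prob`.
    Returns the total number of reshares."""
    total, frontier = 0, 1
    for _ in range(max_depth):
        frontier = sum(1 for _ in range(frontier * fanout)
                       if random.random() < share_prob)
        total += frontier
        if frontier == 0:
            break
    return total

random.seed(0)
trials = 500
mundane = sum(cascade_size(0.15) for _ in range(trials)) / trials
shocking = sum(cascade_size(0.22) for _ in range(trials)) / trials
print(f"avg reshares, mundane item:  {mundane:.1f}")   # subcritical, dies out
print(f"avg reshares, shocking item: {shocking:.1f}")  # supercritical, keeps going
```

With five followers per sharer, a 15% reshare rate means each sharer spawns fewer than one new sharer and the cascade fizzles, while 22% pushes it past the critical threshold, so a modest bump in "shock value" yields disproportionately deeper, broader spread.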
link |
00:41:22.920
Another interesting thing was that
link |
00:41:24.040
that wasn't necessarily driven by the algorithms.
link |
00:41:27.600
I know that there is some evidence,
link |
00:41:29.720
Zeynep Tufekci and others have pointed out in YouTube,
link |
00:41:32.440
some of the algorithms unintentionally were tuned
link |
00:41:34.720
to amplify more extremist content.
link |
00:41:37.880
But in the study of Twitter that Sinan and Deb and others did,
link |
00:41:42.480
they found that even if you took out all the bots
link |
00:41:44.440
and all the automated tweets,
link |
00:41:47.840
you still had lies spreading significantly faster.
link |
00:41:50.720
It's just a problem with ourselves
link |
00:41:52.560
that we just can't resist passing on the salacious content.
link |
00:41:58.440
But I also blame the platforms
link |
00:41:59.920
because there's different ways you can design a platform.
link |
00:42:03.120
You can design a platform in a way
link |
00:42:05.360
that makes it easy to spread lies
link |
00:42:07.240
and to retweet and spread things on.
link |
00:42:09.480
Or you can kind of put some friction on that
link |
00:42:11.480
and try to favor truth.
link |
00:42:13.920
I had dinner with Jimmy Wales once,
link |
00:42:15.440
the guy who helped found Wikipedia.
link |
00:42:19.680
And he convinced me that,
link |
00:42:22.000
look, you can make some design choices,
link |
00:42:24.440
whether it's at Facebook, at Twitter, at Wikipedia,
link |
00:42:27.600
or Reddit, whatever.
link |
00:42:29.200
And depending on how you make those choices,
link |
00:42:32.280
you're more likely or less likely to have false news.
link |
00:42:35.040
Create a little bit of friction, like you said.
link |
00:42:37.080
Yeah.
link |
00:42:37.920
You know, that's the...
link |
00:42:39.240
It could be friction or it could be speeding the truth.
link |
00:42:41.280
You know, either way.
link |
00:42:42.240
But I don't totally understand...
link |
00:42:44.080
Speeding the truth, I love it.
link |
00:42:45.440
Yeah, yeah.
link |
00:42:46.960
Amplifying it and giving it more credit.
link |
00:42:48.800
And you know, like in academia,
link |
00:42:50.760
which is far, far from perfect,
link |
00:42:52.440
but you know, when someone has an important discovery,
link |
00:42:55.560
it tends to get more cited
link |
00:42:56.800
and people kind of look to it more
link |
00:42:58.080
and sort of it tends to get amplified a little bit.
link |
00:43:00.680
So you could try to do that too.
link |
00:43:03.240
I don't know what the silver bullet is,
link |
00:43:04.600
but the meta point is that if we spend time thinking about it,
link |
00:43:08.360
we can amplify truth over falsehoods.
link |
00:43:10.720
And I'm disappointed in the heads of these social networks
link |
00:43:14.840
that they haven't been as successful
link |
00:43:16.600
or maybe haven't tried as hard to amplify truth.
link |
00:43:19.480
And part of it, going back to what we said earlier,
link |
00:43:21.520
is you know, these revenue models
link |
00:43:23.320
may push them more towards growing fast,
link |
00:43:28.080
spreading information rapidly, getting lots of users,
link |
00:43:31.400
which isn't the same thing as finding truth.
link |
00:43:35.400
Yeah.
link |
00:43:36.400
I mean, implicit in what you're saying now
link |
00:43:38.800
is a hopeful message that with platforms,
link |
00:43:42.200
we can take a step towards greater and greater popularity
link |
00:43:49.920
of truth, but the more cynical view
link |
00:43:53.120
is that what the last few years have revealed
link |
00:43:56.800
is that there's a lot of money to be made
link |
00:43:59.000
in dismantling even the idea of truth,
link |
00:44:03.080
that nothing is true.
link |
00:44:04.960
And as a thought experiment,
link |
00:44:07.000
I've been thinking about if it's possible
link |
00:44:09.320
that our future will have,
link |
00:44:11.200
like the idea of truth is something we won't even have.
link |
00:44:14.360
Do you think it's possible like in the future
link |
00:44:17.800
that everything is on the table in terms of truth
link |
00:44:20.960
and we're just swimming in this kind of digital economy
link |
00:44:24.880
where ideas are just little toys
link |
00:44:29.880
that are not at all connected to reality?
link |
00:44:33.200
Yeah, I think that's definitely possible.
link |
00:44:35.880
I'm not a technological determinist.
link |
00:44:38.080
So I don't think that's inevitable.
link |
00:44:40.400
I don't think it's inevitable that it doesn't happen.
link |
00:44:42.440
I mean, the thing that I've come away with
link |
00:44:44.080
every time I do these studies
link |
00:44:45.400
and I emphasize it in my books and elsewhere
link |
00:44:47.320
is that technology doesn't shape our destiny,
link |
00:44:50.160
we shape our destiny.
link |
00:44:51.800
So just by us having this conversation,
link |
00:44:54.760
I hope that your audience is gonna take it upon themselves
link |
00:44:58.560
as they design their products
link |
00:44:59.960
as they think about and use products,
link |
00:45:01.360
as they manage companies.
link |
00:45:02.880
How can they make conscious decisions
link |
00:45:05.400
to favor truth over falsehood?
link |
00:45:08.520
So it's to favor the better kinds of societies
link |
00:45:11.000
and not abdicate and say, well, we just build the tools.
link |
00:45:13.800
I think there was a saying, maybe from the German scientists,
link |
00:45:18.400
when they were working on the missiles in late World War II.
link |
00:45:23.120
They said, well, our job is to make the missiles go up
link |
00:45:25.760
where they come down, that's someone else's department.
link |
00:45:28.760
And that's obviously not the,
link |
00:45:31.040
I think it's obvious that's not the right attitude
link |
00:45:32.800
that technologists should have,
link |
00:45:33.920
that engineers should have,
link |
00:45:35.640
they should be very conscious about
link |
00:45:37.320
what the implications are.
link |
00:45:38.760
And if we think carefully about it,
link |
00:45:40.560
we can avoid the kind of world that you just described
link |
00:45:42.880
where truth is all relative.
link |
00:45:45.040
There are going to be people who benefit from a world
link |
00:45:47.800
of where people don't check facts
link |
00:45:51.280
and where truth is relative and popularity
link |
00:45:54.200
or fame or money is orthogonal to truth.
link |
00:45:59.840
But one of the reasons I suspect
link |
00:46:01.840
that we've had so much progress over the past few hundred years
link |
00:46:05.360
is the invention of the scientific method,
link |
00:46:07.560
which is a really powerful tool or meta tool
link |
00:46:10.160
for finding truth and favoring things that are true
link |
00:46:15.360
versus things that are false.
link |
00:46:16.560
If they don't pass the scientific method,
link |
00:46:18.520
they're less likely to be true.
link |
00:46:20.600
And the societies and the people
link |
00:46:25.520
and the organizations that embrace that
link |
00:46:27.720
have done a lot better than the ones who haven't.
link |
00:46:30.480
And so I'm hoping that people keep that in mind
link |
00:46:32.760
and continue to try to embrace not just the truth,
link |
00:46:35.400
but methods that lead to the truth.
link |
00:46:37.600
So maybe on a more personal question,
link |
00:46:41.360
if one were to try to build a competitor to Twitter,
link |
00:46:45.440
what would you advise?
link |
00:46:48.600
Is there, I mean, the meta question is,
link |
00:46:53.360
is that the right way to improve systems?
link |
00:46:55.680
Yeah, no, I think that the underlying premise
link |
00:46:59.360
behind Twitter and all these networks is amazing
link |
00:47:01.360
that we can communicate with each other.
link |
00:47:02.800
And I use it a lot.
link |
00:47:04.000
There's a subpart of Twitter called econ Twitter,
link |
00:47:05.920
where we economists tweet to each other
link |
00:47:08.640
and talk about new papers.
link |
00:47:10.560
Something came out in the NBER,
link |
00:47:11.960
the National Bureau of Economic Research,
link |
00:47:13.320
and we share it.
link |
00:47:14.160
People critique it.
link |
00:47:15.360
I think it's been a godsend
link |
00:47:16.880
because it's really sped up the scientific process,
link |
00:47:20.000
if you can call economics scientific.
link |
00:47:21.800
Does it get divisive in that little world?
link |
00:47:23.520
Sometimes, yeah, sure.
link |
00:47:24.480
Sometimes it does.
link |
00:47:25.320
It can also be done in nasty ways.
link |
00:47:27.000
And there's the bad parts.
link |
00:47:28.320
But the good parts are great
link |
00:47:29.640
because you just speed up that clock speed
link |
00:47:31.600
of learning about things.
link |
00:47:33.280
Instead of like in the old, old days,
link |
00:47:35.440
waiting to read it in a journal,
link |
00:47:36.760
or the not so old days when you'd see it posted
link |
00:47:39.480
on a website and you'd read it.
link |
00:47:41.560
Now on Twitter, people will distill it down
link |
00:47:43.960
and there's a real art to getting to the essence of things.
link |
00:47:47.160
So that's been great.
link |
00:47:49.040
But it certainly, we all know that Twitter
link |
00:47:52.320
can be a cesspool of misinformation.
link |
00:47:55.520
And like I just said, unfortunately misinformation
link |
00:47:59.000
tends to spread faster on Twitter than truth.
link |
00:48:02.320
And there are a lot of people who are very vulnerable to it.
link |
00:48:04.200
I'm sure I've been fooled at times.
link |
00:48:06.000
There are agents, whether from Russia
link |
00:48:09.120
or from political groups or others
link |
00:48:11.680
that explicitly create efforts at misinformation
link |
00:48:15.640
and efforts at getting people to hate each other.
link |
00:48:17.920
Or even more importantly, I've discovered, is nut picking.
link |
00:48:21.200
You know the idea of nut picking?
link |
00:48:22.320
No, what's that?
link |
00:48:23.160
It's a good term.
link |
00:48:24.320
Nut picking is when you find like an extreme nut case
link |
00:48:27.800
on the other side and then you amplify them
link |
00:48:30.680
and make it seem like that's typical of the other side.
link |
00:48:33.960
So you're not literally lying.
link |
00:48:35.440
You're taking some idiot, you know,
link |
00:48:37.720
ranting on the subway or just, you know,
link |
00:48:39.880
whether they're in the KKK or Antifa or whatever,
link |
00:48:42.800
and normally,
link |
00:48:44.800
nobody would pay attention to this guy.
link |
00:48:46.000
Like 12 people would see him and it'd be the end.
link |
00:48:48.080
Instead, with video or whatever,
link |
00:48:51.120
you get tens of millions of people to see it.
link |
00:48:54.480
And I've seen this, you know, I look at him, I get angry.
link |
00:48:57.120
I'm like, I can't believe that person did something so terrible.
link |
00:48:59.720
Let me tell all my friends about this terrible person.
link |
00:49:02.840
And it's a great way to generate division.
link |
00:49:06.640
I talked to a friend who studied
link |
00:49:09.400
Russian misinformation campaigns.
link |
00:49:11.520
And they're very clever about literally being
link |
00:49:13.960
on both sides of some of these debates.
link |
00:49:15.840
They would have some people pretend to be part of BLM,
link |
00:49:18.600
some people pretend to be white nationalists,
link |
00:49:21.000
and they would be throwing epithets at each other,
link |
00:49:22.920
saying crazy things at each other.
link |
00:49:25.040
And they're literally playing both sides of it,
link |
00:49:26.560
but their goal wasn't for one or the other to win.
link |
00:49:28.560
It was for everybody to be hating
link |
00:49:30.080
and distrusting everyone else.
link |
00:49:31.960
So these tools can definitely be used for that,
link |
00:49:34.440
and they are being used for that.
link |
00:49:36.560
It's been super destructive for our democracy
link |
00:49:39.640
and our society.
link |
00:49:41.080
And the people who run these platforms,
link |
00:49:43.520
I think have a social responsibility,
link |
00:49:46.080
a moral and ethical personal responsibility,
link |
00:49:48.640
to do a better job and to shut that stuff down.
link |
00:49:51.720
Well, I don't know if you can shut it down,
link |
00:49:52.920
but to design them in a way that, as I said earlier,
link |
00:49:56.440
favors truth over falsehoods and favors
link |
00:49:59.720
positive types of communication versus destructive ones.
link |
00:50:06.000
And just like you said, it's also on us.
link |
00:50:09.600
I try to be all about love and compassion and empathy
link |
00:50:12.800
on Twitter.
link |
00:50:13.640
I mean, one of the things,
link |
00:50:14.840
nut picking is a fascinating term,
link |
00:50:16.600
one of the things that people do
link |
00:50:18.960
that's I think even more dangerous
link |
00:50:21.800
is nut picking applied to individual statements
link |
00:50:26.800
of good people.
link |
00:50:28.440
So basically, worst case analysis in computer science
link |
00:50:32.200
is taking, sometimes out of context,
link |
00:50:35.360
but sometimes in context,
link |
00:50:38.480
a statement, one statement by a person.
link |
00:50:42.320
Like I've been,
link |
00:50:43.160
because I've been reading The Rise and Fall of the Third Reich,
link |
00:50:45.360
I often talk about Hitler on this podcast with folks,
link |
00:50:48.960
and it is so easy.
link |
00:50:50.600
That's really dangerous.
link |
00:50:52.040
But I'm all leaning in.
link |
00:50:53.200
I'm 100% because, well, it's actually a safer place
link |
00:50:56.960
than people realize,
link |
00:50:58.040
because it's history and history in long form
link |
00:51:01.920
is actually very fascinating to think about.
link |
00:51:05.000
But I could see how that could be taken
link |
00:51:09.560
totally out of context and it's very worrying.
link |
00:51:11.320
I think about these digital infrastructures,
link |
00:51:12.800
not just that they disseminate things,
link |
00:51:14.040
but they're sort of permanent.
link |
00:51:14.880
Anything you say at some point,
link |
00:51:16.560
someone can go back and find something you said three years ago,
link |
00:51:19.680
perhaps jokingly, perhaps not.
link |
00:51:21.080
Maybe you were just wrong and made a mistake,
link |
00:51:22.800
and like that becomes,
link |
00:51:24.000
they can use that to define you if they have an intent.
link |
00:51:26.840
And we all need to be a little more forgiving.
link |
00:51:29.080
I mean, somewhere in my 20s,
link |
00:51:31.200
I told myself, I was going through all my different friends
link |
00:51:33.840
and I was like, you know,
link |
00:51:36.600
every one of them has at least like one nutty opinion.
link |
00:51:39.440
Yeah, exactly.
link |
00:51:40.680
And I was like, there's like nobody who's like completely,
link |
00:51:43.120
except me, of course,
link |
00:51:44.160
but I'm sure they thought that about me too.
link |
00:51:45.720
And so you just kind of like learned
link |
00:51:47.760
to be a little bit tolerant that like, okay, there's just,
link |
00:51:51.120
Yeah, I wonder where the responsibility lies there.
link |
00:51:55.240
Like, I think ultimately it's about leadership,
link |
00:51:59.680
like the previous president, Barack Obama has been,
link |
00:52:04.240
I think quite eloquent at walking this very difficult line
link |
00:52:07.640
of talking about cancel culture,
link |
00:52:09.840
but it's a difficult, it takes skill.
link |
00:52:11.520
Yeah.
link |
00:52:12.360
You say the wrong thing and you piss off a lot of people.
link |
00:52:15.440
And so you have to do it well,
link |
00:52:17.400
but then also the platforms, the technology,
link |
00:52:21.160
should slow down,
link |
00:52:22.440
create friction in spreading this kind of nut picking
link |
00:52:25.400
in all its forms.
link |
00:52:26.400
Absolutely.
link |
00:52:27.240
And your point that we have to like learn over time
link |
00:52:29.760
how to manage it.
link |
00:52:30.600
I mean, we can't put it all on the platform
link |
00:52:31.760
and say you guys design it.
link |
00:52:32.960
Because if we're idiots about using it, you know,
link |
00:52:35.160
nobody can design a platform that withstands that.
link |
00:52:37.840
And every new technology people learn it's dangerous.
link |
00:52:41.680
You know, when someone invented fire,
link |
00:52:43.920
it's great cooking and everything,
link |
00:52:44.960
but then somebody burned himself.
link |
00:52:46.160
And then you had to like learn how to avoid that,
link |
00:52:48.200
maybe somebody invented a fire extinguisher later
link |
00:52:50.000
and whatnot.
link |
00:52:50.840
So you kind of like figure out ways
link |
00:52:52.840
of working around these technologies.
link |
00:52:54.680
Someone invented seat belts, et cetera.
link |
00:52:57.440
And that's certainly true
link |
00:52:58.640
with all the new digital technologies
link |
00:53:00.640
that we have to figure out,
link |
00:53:02.360
not just technologies that protect us,
link |
00:53:05.320
but ways of using them
link |
00:53:08.680
that are more likely to be successful than dangerous.
link |
00:53:11.560
So you've written quite a bit about
link |
00:53:13.240
how artificial intelligence might change our world.
link |
00:53:19.040
How do you think, if we look forward again,
link |
00:53:21.760
it's impossible to predict the future.
link |
00:53:23.240
But if we look at trends from the past
link |
00:53:26.480
and we try to predict what's gonna happen
link |
00:53:28.240
in the rest of the 21st century,
link |
00:53:29.760
how do you think AI will change our world?
link |
00:53:32.120
That's a big question.
link |
00:53:34.200
You know, I'm mostly a techno optimist.
link |
00:53:37.480
I'm not at the extreme, you know,
link |
00:53:38.680
the "singularity is near" end of the spectrum.
link |
00:53:41.120
But I do think that we are likely in
link |
00:53:44.600
for some significantly improved living standards,
link |
00:53:47.520
some really important progress,
link |
00:53:49.320
even just the technologies that are already kind of like
link |
00:53:51.280
in the can that haven't diffused.
link |
00:53:53.120
You know, when I talked earlier about the J curve,
link |
00:53:54.920
it could take 10, 20, 30 years for an existing technology
link |
00:53:58.800
to have that kind of profound effect.
link |
00:54:00.800
And when I look at whether it's, you know,
link |
00:54:03.800
vision systems, voice recognition, problem solving systems,
link |
00:54:07.880
even if nothing new got invented,
link |
00:54:09.440
we would have a few decades of progress.
link |
00:54:11.840
So I'm excited about that.
link |
00:54:13.480
And I think that's gonna lead to us being wealthier,
link |
00:54:16.880
healthier, I mean, the healthcare is probably
link |
00:54:18.840
one of the applications I'm most excited about.
link |
00:54:22.560
So that's good news.
link |
00:54:23.800
I don't think we're gonna have the end of work anytime soon.
link |
00:54:27.640
There's just too many things that machines still can't do.
link |
00:54:31.000
When I look around the world
link |
00:54:32.040
and think of whether it's childcare or healthcare,
link |
00:54:34.680
clean the environment, interacting with people,
link |
00:54:37.800
scientific work, artistic creativity.
link |
00:54:40.920
These are things that for now,
link |
00:54:42.600
machines aren't able to do nearly as well as humans,
link |
00:54:45.680
even just something as mundane as, you know,
link |
00:54:47.200
folding laundry or whatever.
link |
00:54:48.760
And many of these, I think are gonna be years or decades
link |
00:54:52.960
before machines catch up.
link |
00:54:54.760
You know, I may be surprised on some of them,
link |
00:54:56.160
but overall, I think there's plenty of work for humans to do.
link |
00:54:59.760
There's plenty of problems in society
link |
00:55:01.360
that need the human touch.
link |
00:55:02.600
So we'll have to repurpose.
link |
00:55:04.200
We'll have to, as machines are able to do some tasks,
link |
00:55:07.880
people are gonna have to reskill and move into other areas.
link |
00:55:11.080
And that's probably what's gonna be going on
link |
00:55:12.760
for the next, you know, 10, 20, 30 years or more,
link |
00:55:16.280
kind of big restructuring of society.
link |
00:55:18.960
We'll get wealthier and people will have to do new skills.
link |
00:55:22.440
Now, if you turn the dial further,
link |
00:55:23.920
I don't know, 50 or 100 years into the future,
link |
00:55:26.960
then, you know, maybe all bets are off.
link |
00:55:29.640
Then it's possible that machines
link |
00:55:32.200
will be able to do most of what people do.
link |
00:55:34.240
You know, say 100 or 200 years, I think it's even likely.
link |
00:55:37.360
And at that point,
link |
00:55:38.400
then we're more in the sort of abundance economy.
link |
00:55:41.040
Then we're in a world where there's really little
link |
00:55:44.040
that humans can do economically better than machines
link |
00:55:48.000
other than be human.
link |
00:55:49.920
And, you know, that will take a transition as well,
link |
00:55:53.640
kind of more of a transition of how we get meaning in life
link |
00:55:56.520
and what our values are.
link |
00:55:58.240
But shame on us if we screw that up.
link |
00:56:00.440
I mean, that should be like great, great news.
link |
00:56:02.760
And it kind of saddens me
link |
00:56:03.720
that some people see that as like a big problem.
link |
00:56:05.560
You know, I think it should be wonderful
link |
00:56:07.640
if people have all the health and material things
link |
00:56:10.400
that they need and can focus on loving each other
link |
00:56:14.200
and discussing philosophy and playing
link |
00:56:16.840
and doing all the other things that don't require work.
link |
00:56:19.440
Do you think you'll be surprised to see what the world is like,
link |
00:56:23.480
like if we were to travel in time,
link |
00:56:25.400
100 years into the future,
link |
00:56:27.400
do you think you'll be able to,
link |
00:56:29.560
like if I gave you a month to like talk to people,
link |
00:56:32.320
no, like let's say a week,
link |
00:56:34.160
would you be able to understand what the hell is going on?
link |
00:56:37.800
You mean if I was there for a week?
link |
00:56:39.200
Yeah, if you were there for a week.
link |
00:56:40.880
100 years in the future?
link |
00:56:42.120
Yeah.
link |
00:56:43.040
So like, so I'll give you one thought experiment.
link |
00:56:45.200
It's like, isn't it possible
link |
00:56:47.600
that we're all living in virtual reality by then?
link |
00:56:50.400
Yeah.
link |
00:56:51.240
No, I think that's very possible.
link |
00:56:52.480
You know, I've played around with some of those VR headsets
link |
00:56:54.680
and they're not great,
link |
00:56:55.520
but I mean, the average person spends
link |
00:56:59.000
many waking hours staring at screens right now.
link |
00:57:03.160
You know, they're kind of low res
link |
00:57:04.440
compared to what they could be in 30 or 50 years,
link |
00:57:08.880
but certainly games and why not any other interactions
link |
00:57:13.960
could be done with VR.
link |
00:57:15.360
And that would be a pretty different world
link |
00:57:16.360
that we'd all, you know, in some ways be as rich as we wanted.
link |
00:57:19.400
You know, we could have castles
link |
00:57:20.520
and it could be traveling anywhere we want.
link |
00:57:23.680
And it could obviously be multi sensory.
link |
00:57:26.000
So that would be possible.
link |
00:57:28.000
You know, of course, there's people, you know,
link |
00:57:30.880
you've had Elon Musk on and others, you know,
link |
00:57:32.720
there are people, Nick Bostrom, you know,
link |
00:57:34.120
makes the simulation argument that maybe we're already there.
link |
00:57:36.880
We're already there.
link |
00:57:37.720
So, but, but in general,
link |
00:57:39.560
or do you not even think about it in this kind of way?
link |
00:57:42.480
You're self critically thinking,
link |
00:57:45.080
how good are you as an economist
link |
00:57:47.880
at predicting what the future looks like?
link |
00:57:50.360
Well, it starts getting, I mean,
link |
00:57:52.000
I feel reasonably comfortable for the next, you know,
link |
00:57:54.160
five, 10, 20 years in terms of that path.
link |
00:57:58.720
When you start getting truly superhuman
link |
00:58:01.720
artificial intelligence, it will, kind of by definition,
link |
00:58:06.000
be able to think of a lot of things
link |
00:58:07.040
that I couldn't have thought of
link |
00:58:08.280
and create a world that I couldn't even imagine.
link |
00:58:10.960
And so I'm not sure
link |
00:58:14.080
I can predict what that world is going to be like.
link |
00:58:16.520
One thing that AI researchers,
link |
00:58:18.840
AI safety researchers worry about
link |
00:58:20.360
is what's called the alignment problem.
link |
00:58:22.520
When an AI is that powerful,
link |
00:58:25.080
then they can do all sorts of things.
link |
00:58:27.960
And you really hope that their values
link |
00:58:30.560
are aligned with our values.
link |
00:58:32.440
And it's even tricky defining what our values are.
link |
00:58:34.480
I mean, first off, we all have different values.
link |
00:58:37.240
And secondly, maybe if we were smarter,
link |
00:58:40.440
we would have better values.
link |
00:58:41.640
Like, you know, I like to think
link |
00:58:43.240
that we have better values than we did in 1860.
link |
00:58:46.880
And, or in, you know, the year 200 BC on a lot of dimensions,
link |
00:58:51.360
things that we consider barbaric today.
link |
00:58:53.440
And it may be that if I thought about it more deeply,
link |
00:58:56.080
I would also be morally evolved.
link |
00:58:57.400
Maybe I'd be a vegetarian or do other things
link |
00:59:00.120
that, right now, my future self
link |
00:59:03.000
would consider kind of immoral.
link |
00:59:05.240
So that's a tricky problem,
link |
00:59:07.760
getting the AI to do what we want.
link |
00:59:11.120
Assuming it's even a friendly AI.
link |
00:59:12.960
I mean, I should probably mention,
link |
00:59:14.760
there's a non trivial other branch
link |
00:59:17.120
where we destroy ourselves, right?
link |
00:59:18.720
I mean, there's a lot of exponentially improving technologies
link |
00:59:22.960
that could be ferociously destructive,
link |
00:59:26.640
whether it's in nanotechnology or biotech
link |
00:59:29.480
and weaponized viruses, AI, and other things that...
link |
00:59:34.280
Nuclear weapons.
link |
00:59:35.120
Nuclear weapons, of course.
link |
00:59:36.240
The old school technology.
link |
00:59:37.320
Yeah, good old nuclear weapons
link |
00:59:39.280
that could be devastating or even existential.
link |
00:59:43.520
And new things yet to be invented.
link |
00:59:45.200
So that's a branch that I think is pretty significant.
link |
00:59:52.200
And there are those who think that one of the reasons
link |
00:59:54.240
we haven't been contacted by other civilizations, right?
link |
00:59:57.480
Is that once you get to a certain level
link |
01:00:00.600
of complexity in technology,
link |
01:00:03.040
there's just too many ways to go wrong.
link |
01:00:04.600
There's a lot of ways to blow yourself up and people,
link |
01:00:08.440
or I should say species end up falling into
link |
01:00:10.560
one of those traps, the great filter.
link |
01:00:13.600
The great filter.
link |
01:00:14.960
I mean, there's an optimistic view of that.
link |
01:00:16.720
If there is literally no intelligent life out there
link |
01:00:19.400
in the universe, or at least in our galaxy,
link |
01:00:22.360
that means that we've passed at least one of the great filters
link |
01:00:26.120
or some of the great filters that we survived.
link |
01:00:30.040
Yeah, no, I think it's Robin Hanson
link |
01:00:31.680
has a good way of, maybe others,
link |
01:00:32.800
they have a good way of thinking about this,
link |
01:00:33.920
that if there are no other intelligent creatures out there
link |
01:00:38.640
that we've been able to detect,
link |
01:00:40.640
one possibility is that there's a filter ahead of us.
link |
01:00:43.440
And when you get a little more advanced,
link |
01:00:44.760
maybe in a hundred or a thousand or 10,000 years,
link |
01:00:47.600
things just get destroyed for some reason.
link |
01:00:50.560
The other one is the great filters behind us.
link |
01:00:53.000
That would be good: most planets don't even evolve life,
link |
01:00:57.680
or if they do evolve life,
link |
01:00:58.920
they don't evolve intelligent life.
link |
01:01:00.280
Maybe we've gotten past that.
link |
01:01:02.000
And so now maybe we're on the good side of the great filter.
link |
01:01:05.680
So if we sort of rewind back and look at the thing
link |
01:01:10.480
where we could say something a little bit more comfortably
link |
01:01:12.760
at five years and 10 years out,
link |
01:01:15.920
you've written about jobs
link |
01:01:20.240
and the impact on sort of our economy and on jobs
link |
01:01:24.760
that artificial intelligence might have.
link |
01:01:28.280
It's a fascinating question of what kind of jobs are safe,
link |
01:01:30.600
what kind of jobs are not.
link |
01:01:32.560
Maybe speak to your intuition
link |
01:01:34.640
about how we should think about AI
link |
01:01:37.720
changing the landscape of work.
link |
01:01:39.960
Sure, absolutely.
link |
01:01:40.920
Well, this is a really important question
link |
01:01:42.640
because I think we're very far
link |
01:01:43.880
from artificial general intelligence,
link |
01:01:45.720
which is AI that can just do the full breadth
link |
01:01:48.120
of what humans can do.
link |
01:01:49.520
But we do have human level or super human level,
link |
01:01:53.000
narrow intelligence, narrow artificial intelligence.
link |
01:01:56.800
And obviously my calculator can do math a lot better
link |
01:01:59.840
than I can.
link |
01:02:00.680
And there's a lot of other things
link |
01:02:01.520
that machines can do better than I can.
link |
01:02:03.160
So which is which?
link |
01:02:04.440
We actually set out to address that question.
link |
01:02:06.840
With Tom Mitchell, I wrote a paper called
link |
01:02:10.760
What Can Machine Learning Do? That was in Science.
link |
01:02:13.440
And we went and interviewed a whole bunch of AI experts
link |
01:02:16.840
and kind of synthesized what they thought
link |
01:02:19.920
machine learning was good at and wasn't good at.
link |
01:02:22.240
And we came up with what we called a rubric,
link |
01:02:25.560
basically a set of questions you can ask about any task
link |
01:02:28.160
that will tell you whether it's likely to score high or low
link |
01:02:30.960
on suitability for machine learning.
link |
01:02:33.760
And then we've applied that to a bunch of tasks
link |
01:02:35.520
in the economy.
link |
01:02:36.960
In fact, there's a data set of all the tasks
link |
01:02:39.120
in the US economy, believe it or not.
link |
01:02:40.160
It's called O*NET.
link |
01:02:41.640
The US government put it together,
link |
01:02:43.160
part of Bureau of Labor Statistics.
link |
01:02:45.040
They divide the economy into about 970 occupations
link |
01:02:48.720
like bus driver, economist, primary school teacher,
link |
01:02:52.200
radiologist.
link |
01:02:53.440
And then for each one of them,
link |
01:02:54.840
they describe which tasks need to be done.
link |
01:02:57.640
Like for radiologists, there are 27 distinct tasks.
link |
01:03:00.760
So we went through all those tasks
link |
01:03:02.200
to see whether or not a machine could do them.
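As a rough illustration of how a rubric like that rolls up from tasks to occupations, here is a sketch where the rubric questions, the task list, and every answer are invented for illustration; the real rubric and O*NET task ratings are in the published work:

```python
# Hypothetical roll-up of a suitability-for-machine-learning (SML) rubric:
# score each task on a few questions, average to a task score, then average
# task scores into an occupation-level index. All content below is invented.

RUBRIC = [
    "maps well-defined inputs to well-defined outputs",
    "large digital datasets of examples exist",
    "requires no dexterity or fine motor control",
    "requires no long chains of reasoning or common sense",
]

def task_sml(answers):
    """Fraction of rubric questions answered 'yes' for one task (0 to 1)."""
    return sum(answers) / len(answers)

radiologist_tasks = {
    "read medical images":           [True, True, True, True],
    "administer conscious sedation": [False, False, False, False],
    "explain results to patients":   [True, False, True, False],
}

scores = {name: task_sml(ans) for name, ans in radiologist_tasks.items()}
for name, s in scores.items():
    print(f"{name:31s} SML = {s:.2f}")
print(f"occupation-level SML index = {sum(scores.values()) / len(scores):.2f}")
```

Even in this toy version, the scores vary widely across the tasks within one occupation, which previews the finding Erik describes next: machine learning rarely runs the table on a whole job.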
link |
01:03:05.000
And what we found interestingly was
link |
01:03:06.760
Brilliant study, by the way.
link |
01:03:07.680
That's so awesome.
link |
01:03:08.880
Yeah, thank you.
link |
01:03:10.240
So what we found was that there was no occupation
link |
01:03:13.800
in our data set where machine learning just ran the table
link |
01:03:16.280
and did everything.
link |
01:03:17.560
And there was almost no occupation where machine learning
link |
01:03:19.560
didn't have like a significant ability to do things.
link |
01:03:22.160
Like take radiology, a lot of people,
link |
01:03:23.760
I hear saying, you know, it's the end of radiology.
link |
01:03:26.720
And one of the 27 tasks is read medical images.
link |
01:03:29.920
Really important one, like it's kind of a core job.
link |
01:03:32.000
And machines have basically gotten as good as
link |
01:03:34.680
or better than radiologists.
link |
01:03:35.920
There was just an article in Nature last week,
link |
01:03:38.400
but you know, they've been publishing them
link |
01:03:39.640
for the past few years showing that machine learning
link |
01:03:45.000
can do as well as humans
link |
01:03:46.520
on many kinds of diagnostic imaging tasks.
link |
01:03:49.640
But other things radiologists do, you know,
link |
01:03:51.160
they sometimes administer conscious sedation.
link |
01:03:54.480
They sometimes do physical exams.
link |
01:03:56.000
They have to synthesize the results
link |
01:03:57.360
and explain to the other doctors or to the patients.
link |
01:04:01.720
In all those categories,
link |
01:04:02.560
machine learning isn't really up to snuff yet.
link |
01:04:05.600
So that job, we're gonna see a lot of restructuring.
link |
01:04:09.320
Parts of the job, they'll hand over to machines,
link |
01:04:11.440
others, humans will do more of.
link |
01:04:13.160
That's been more or less the pattern in all of them.
link |
01:04:15.120
So, you know, to oversimplify,
link |
01:04:16.920
but we see a lot of restructuring, reorganization of work.
link |
01:04:20.440
And it's really gonna be a great time.
link |
01:04:22.320
It is a great time for smart entrepreneurs and managers
link |
01:04:24.760
to do that reinvention of work.
link |
01:04:27.320
We're not gonna see mass unemployment.
link |
01:04:30.640
To get more specifically to your question,
link |
01:04:33.160
The kinds of tasks that machines tend to be good at
link |
01:04:36.600
are a lot of routine problem solving,
link |
01:04:39.080
mapping inputs X into outputs Y.
link |
01:04:42.560
If you have a lot of data on the X's and the Y's,
link |
01:04:44.880
the inputs and the outputs,
link |
01:04:45.720
you can do that kind of mapping
link |
01:04:47.120
and find the relationships.
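That input-to-output mapping is just supervised learning; a minimal sketch, with a toy dataset and scikit-learn chosen purely for illustration:

```python
# Routine prediction as input-to-output mapping: given labeled (X, y)
# examples, fit the relationship, then apply it to new inputs.
# Features and labels are a toy invention (ear pointiness, snout length).
from sklearn.linear_model import LogisticRegression

X = [[0.9, 0.2], [0.8, 0.1], [0.7, 0.3],   # cats: pointy ears, short snout
     [0.2, 0.9], [0.1, 0.8], [0.3, 0.7]]   # dogs: floppy ears, long snout
y = ["cat", "cat", "cat", "dog", "dog", "dog"]

model = LogisticRegression().fit(X, y)
print(model.predict([[0.85, 0.15], [0.2, 0.85]]))  # -> ['cat' 'dog']
```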
link |
01:04:48.560
They tend to not be very good at,
link |
01:04:50.680
even now, fine motor control and dexterity,
link |
01:04:54.560
emotional intelligence and human interactions,
link |
01:04:58.960
and thinking outside the box, creative work.
link |
01:05:01.720
If you give it a well structured task,
link |
01:05:03.200
machines can be very good at it,
link |
01:05:05.040
but even asking the right questions, that's hard.
link |
01:05:08.680
There's a quote that Andrew McAfee and I use
link |
01:05:10.680
in our book, Second Machine Age.
link |
01:05:13.000
Apparently Pablo Picasso was shown an early computer
link |
01:05:16.840
and he came away kind of unimpressed.
link |
01:05:18.480
He goes, well, I don't see what all the fuss is about.
link |
01:05:20.680
All that does is answer questions.
link |
01:05:24.000
And, you know, to him,
link |
01:05:24.840
the interesting thing was asking the questions.
link |
01:05:26.800
Yeah.
link |
01:05:27.640
Try to replace me, GPT-3, I dare you.
link |
01:05:31.360
Although some people think I'm a robot.
link |
01:05:33.200
You have this cool plot that shows,
link |
01:05:37.080
I just remember where economists landed,
link |
01:05:39.680
where I think the X axis is the income.
link |
01:05:42.880
Yes.
link |
01:05:43.720
And then the Y axis, I guess, aggregating the information
link |
01:05:47.600
of how replaceable the job is,
link |
01:05:49.440
or I think there's an index.
link |
01:05:50.280
There's a suitability for machine learning index, exactly.
link |
01:05:52.480
So we have all 970 occupations on that chart.
link |
01:05:55.520
It's a cool plot.
link |
01:05:56.560
And there's scatter, all four corners
link |
01:05:59.240
have some occupations,
link |
01:06:01.080
but there is a definite pattern,
link |
01:06:02.760
which is the lower wage occupations tend to have more tasks
link |
01:06:05.720
that are suitable for machine learning, like cashiers.
link |
01:06:08.000
I mean, anyone who's gone to a supermarket or CVS knows
link |
01:06:10.760
that, you know, they not only read barcodes,
link |
01:06:12.440
but they can recognize, you know, an apple and an orange
link |
01:06:14.560
and a lot of things that cashiers,
link |
01:06:17.120
humans used to be needed for.
link |
01:06:19.480
At the other end of the spectrum,
link |
01:06:21.040
there are some jobs like airline pilot
link |
01:06:23.560
that are among the highest paid in our economy,
link |
01:06:26.640
but also a lot of them are suitable for machine learning.
link |
01:06:28.800
A lot of those tasks are.
link |
01:06:30.920
And then, yeah, you mentioned economists.
link |
01:06:32.520
I couldn't help peeking at those.
link |
01:06:33.840
And they're paid a fair amount,
link |
01:06:36.080
maybe not as much as some of us think they should be.
link |
01:06:39.120
But they have some tasks they're suitable for machine learning,
link |
01:06:44.160
but for now, at least,
link |
01:06:45.520
most of the tasks that economists do
link |
01:06:46.920
didn't end up being in that category.
link |
01:06:48.520
And I should say, I didn't like create that data.
link |
01:06:50.600
We just took the analysis and that's what came out of it.
link |
01:06:54.440
And over time, that scatter plot will be updated
link |
01:06:57.280
as the technology improves.
link |
01:06:59.920
But it was just interesting to see the pattern there.
link |
01:07:02.840
And it is a little troubling insofar
link |
01:07:05.120
as if you just take the technology as it is today,
link |
01:07:08.080
it's likely to worsen income inequality
link |
01:07:10.480
on a lot of dimensions.
link |
01:07:12.200
So on this topic of the effect of AI
link |
01:07:16.440
on our landscape of work,
link |
01:07:21.000
one of the people that have been speaking about it
link |
01:07:23.640
in the public domain, public discourse
link |
01:07:25.760
is the presidential candidate, Andrew Yang.
link |
01:07:28.080
Yeah.
link |
01:07:29.000
What are your thoughts about Andrew?
link |
01:07:31.880
What are your thoughts about UBI,
link |
01:07:34.320
that Universal Basic Income,
link |
01:07:36.680
that he made one of his core ideas.
link |
01:07:39.040
By the way, he has like hundreds of ideas
link |
01:07:40.760
about like everything, it's kind of interesting.
link |
01:07:43.960
But what are your thoughts about him
link |
01:07:45.360
and what are your thoughts about UBI?
link |
01:07:46.720
Let me answer the question about his broader approach first.
link |
01:07:52.040
I mean, I just love that.
link |
01:07:52.880
He's really thoughtful, analytical.
link |
01:07:56.400
I agree with his values.
link |
01:07:58.200
So that's awesome.
link |
01:07:59.360
And he read my book and mentions it sometimes,
link |
01:08:02.160
so that makes me even more excited.
link |
01:08:04.760
And the thing that he really made
link |
01:08:07.040
the centerpiece of his campaign was UBI.
link |
01:08:09.920
And I was originally kind of a fan of it.
link |
01:08:13.240
And then as I studied it more, I became less of a fan,
link |
01:08:15.960
although I'm beginning to come back a little bit.
link |
01:08:17.360
So let me tell you a little bit of my evolution.
link |
01:08:19.120
You know, as an economist,
link |
01:08:20.160
we have a way of looking at the problem
link |
01:08:23.000
of people not having enough income
link |
01:08:24.320
and the simplest thing is, well, why don't we write them
link |
01:08:25.880
a check, problem solved.
link |
01:08:28.000
But then I talked to my sociologist friends
link |
01:08:30.400
and they really convinced me that just writing a check
link |
01:08:34.360
doesn't really get at the core values.
link |
01:08:36.680
You know, Voltaire once said that
link |
01:08:38.720
work solves three great ills, boredom, vice and need.
link |
01:08:43.320
And you know, you can deal with the need thing
link |
01:08:45.440
by writing a check,
link |
01:08:46.640
but people need a sense of meaning,
link |
01:08:49.240
they need something to do.
link |
01:08:50.760
And when, you know, say steelworkers or coal miners
link |
01:08:55.760
lost their jobs and were just given checks,
link |
01:09:00.360
alcoholism, depression, divorce,
link |
01:09:03.760
all those social indicators, drug use all went way up.
link |
01:09:06.520
People just weren't happy just sitting around
link |
01:09:09.360
collecting a check.
link |
01:09:11.400
Maybe it's part of the way they were raised.
link |
01:09:13.200
Maybe it's something innate in people
link |
01:09:14.720
that they need to feel wanted and needed.
link |
01:09:17.200
So it's not as simple as just writing people a check.
link |
01:09:19.560
You need to also give them a way to have a sense of purpose.
link |
01:09:23.960
And that was important to me.
link |
01:09:25.360
And the second thing is that as I mentioned earlier,
link |
01:09:28.560
you know, we are far from the end of work.
link |
01:09:30.800
You know, I don't buy the idea
link |
01:09:32.240
that there's just like not enough work to be done.
link |
01:09:34.160
I see like our cities need to be cleaned up.
link |
01:09:37.120
And I mean, robots can't do most of that.
link |
01:09:39.000
You know, we need to have better childcare,
link |
01:09:40.760
we need better healthcare,
link |
01:09:41.640
we need to take care of people who are mentally ill
link |
01:09:44.000
or older, we need to repair our roads.
link |
01:09:46.480
There's so much work that requires, at least partly,
link |
01:09:49.960
maybe entirely a human component.
link |
01:09:52.280
So rather than like write all these people off,
link |
01:09:54.680
well, let's find a way to repurpose them
link |
01:09:56.880
and keep them engaged.
link |
01:09:59.600
Now that said, I would like to see more buying power
link |
01:10:04.640
from people who are sort of at the bottom end
link |
01:10:06.360
of the spectrum.
link |
01:10:07.320
The economy has been designed and evolved in a way
link |
01:10:12.520
that I think is very unfair to a lot of hardworking people.
link |
01:10:15.600
I see super hardworking people
link |
01:10:17.000
who aren't really seeing their wages grow
link |
01:10:19.040
over the past 20, 30 years,
link |
01:10:20.720
while some other people who have been super smart
link |
01:10:24.080
and or super lucky have made billions
link |
01:10:29.480
or hundreds of billions.
link |
01:10:30.920
And I don't think they need those hundreds of billions
link |
01:10:33.800
to have the right incentives to invent things.
link |
01:10:35.760
I think if you talk to almost any of them, as I have,
link |
01:10:39.440
they don't think that they need an extra $10 billion
link |
01:10:42.440
to do what they're doing.
link |
01:10:43.560
Most of them probably would love to do it
link |
01:10:46.520
for only a billion or maybe for nothing.
link |
01:10:48.960
For nothing, many of them, yeah.
link |
01:10:50.800
I mean, you know, an interesting point to make
link |
01:10:53.120
is like, do we think that Bill Gates
link |
01:10:55.320
would have founded Microsoft if tax rates were 70%?
link |
01:10:58.720
Well, we know he would have
link |
01:10:59.640
because there were tax rates of 70% when he founded it.
link |
01:11:03.680
So I don't think that's as big a deterrent
link |
01:11:06.200
and we could provide more buying power to people.
link |
01:11:09.100
My own favorite tool is the earned income tax credit,
link |
01:11:12.800
which is basically a way of supplementing income
link |
01:11:16.240
of people who have jobs and giving employers
link |
01:11:18.160
an incentive to hire even more people.
link |
01:11:20.280
The minimum wage can discourage employment,
link |
01:11:22.400
but the earned income tax credit encourages employment
link |
01:11:25.160
by supplementing people's wages.
link |
01:11:27.720
You know, if the employer can only afford to pay them $10
link |
01:11:30.800
for a task, the rest of us kick in another $5 or $10
link |
01:11:35.200
and bring their wages up to $15 or $20 total.
link |
01:11:37.640
And then they have more buying power
link |
01:11:39.360
than entrepreneurs are thinking, how can we cater to them?
link |
01:11:42.320
How can we make products for them?
link |
01:11:44.080
And it becomes a self reinforcing system
link |
01:11:47.200
where people are better off.
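The arithmetic of that example, as a tiny sketch; the 50% match rate and the cap are placeholders, not the actual EITC schedule:

```python
# Wage top-up in the spirit of the earned income tax credit. The credit
# only flows when there are earnings, so it rewards work rather than
# discouraging it. The match rate and cap below are placeholders,
# not the real EITC schedule.

def supplemented_wage(employer_wage, match_rate=0.5, max_credit=10.0):
    """Take-home hourly pay: employer wage plus a public top-up."""
    credit = min(employer_wage * match_rate, max_credit)
    return employer_wage + credit

print(supplemented_wage(10.0))  # employer pays $10, worker takes home $15.0
print(supplemented_wage(0.0))   # no work, no credit: 0.0
```

The design choice, in contrast to a minimum wage, is that the employer still pays only what the task is worth to them, so there is no disincentive to hire, while the worker's take-home pay rises.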
link |
01:11:49.840
And I had a good discussion with him where he suggested,
link |
01:11:53.040
instead of a universal basic income,
link |
01:11:55.960
or instead of an unconditional basic income,
link |
01:11:59.080
how about a conditional basic income
link |
01:12:00.600
where the condition is you learn some new skills,
link |
01:12:03.040
we need to reskill our workforce.
link |
01:12:05.040
So let's make it easier for people to find ways
link |
01:12:09.120
to get those skills and get rewarded for doing them.
link |
01:12:11.280
And that's kind of a neat idea as well.
link |
01:12:13.080
That's really interesting.
link |
01:12:13.920
So I mean, one of the questions,
link |
01:12:16.160
one of the dreams of UBI is that
link |
01:12:19.000
you provide some little safety net while you retrain
link |
01:12:24.320
while you're learning new skills.
link |
01:12:26.080
But I think, I guess you're speaking to the intuition
link |
01:12:29.120
that that doesn't always,
link |
01:12:31.320
like there needs to be some incentive to reskill,
link |
01:12:33.800
to train, to learn new things.
link |
01:12:35.360
I think it helps.
link |
01:12:36.200
I mean, there are lots of self motivated people,
link |
01:12:38.000
but they're also people that maybe need a little guidance
link |
01:12:40.640
or help and I think it's a really hard question
link |
01:12:45.000
for someone who is losing a job in one area
link |
01:12:47.280
to know what is the new area I should be learning skills in
link |
01:12:50.600
and we could provide a much better set of tools
link |
01:12:52.600
and platforms that map it out.
link |
01:12:54.480
Okay, here's a set of skills you already have.
link |
01:12:56.360
Here's something that's in demand.
link |
01:12:58.120
Let's create a path for you to go from where you are
link |
01:13:00.440
to where you need to be.
link |
01:13:03.120
So I'm a total, how do I put it nicely about myself?
link |
01:13:07.080
I'm totally clueless about the economy.
link |
01:13:09.640
It's not totally true, but pretty good approximation.
link |
01:13:12.760
If you were to try to fix our tax system
link |
01:13:20.520
and or maybe from another side,
link |
01:13:23.280
if there's fundamental problems in taxation
link |
01:13:26.720
or some fundamental problems about our economy,
link |
01:13:29.760
what would you try to fix?
link |
01:13:31.360
What would you try to speak to?
link |
01:13:33.480
You know, I definitely think our whole tax system,
link |
01:13:36.360
our political and economic system has gotten
link |
01:13:39.920
more and more screwed up over the past 20, 30 years.
link |
01:13:43.520
I don't think it's that hard to make headway
link |
01:13:46.520
in improving it.
link |
01:13:47.360
I don't think we need to totally reinvent stuff.
link |
01:13:49.880
A lot of it is what I've elsewhere with Andy
link |
01:13:52.400
and others called economics 101.
link |
01:13:54.680
You know, there's just some basic principles
link |
01:13:56.400
that have worked really well in the 20th century
link |
01:14:00.640
that we sort of forgot, you know,
link |
01:14:01.880
in terms of investing in education,
link |
01:14:03.960
investing in infrastructure, welcoming immigrants,
link |
01:14:07.560
having a tax system that was more progressive and fair.
link |
01:14:13.280
At one point, tax rates on top incomes
link |
01:14:16.560
were significantly higher and they've come down a lot
link |
01:14:19.080
to the point where in many cases,
link |
01:14:20.680
they're lower now than they are for poorer people.
link |
01:14:24.760
And we could do things like an earned income tax credit.
link |
01:14:27.960
To get a little more wonky,
link |
01:14:29.240
I'd like to see more Pigouvian taxes.
link |
01:14:31.440
What that means is you tax things that are bad
link |
01:14:35.720
instead of things that are good.
link |
01:14:37.000
So right now we tax labor, we tax capital,
link |
01:14:40.680
which is unfortunate because one of the basic principles
link |
01:14:43.640
of economics, if you tax something,
link |
01:14:44.800
you tend to get less of it.
link |
01:14:46.440
So, you know, right now there's still work to be done
link |
01:14:48.840
and still capital to be invested in,
link |
01:14:51.240
but instead we should be taxing things like pollution
link |
01:14:54.640
and congestion.
link |
01:14:57.240
And if we did that, we would have less pollution.
link |
01:15:00.040
So a carbon tax is, you know, almost every economist
link |
01:15:03.080
would say it's a no brainer,
link |
01:15:04.120
whether they're Republican or Democrat.
link |
01:15:07.560
Greg Mankiw, who was head of George Bush's
link |
01:15:09.680
Council of Economic Advisers, or Dick Schmalensee,
link |
01:15:13.000
who is another Republican economist, agrees,
link |
01:15:16.080
and of course a lot of Democratic economists agree
link |
01:15:20.800
as well, if we taxed carbon,
link |
01:15:22.800
we could raise hundreds of billions of dollars.
link |
01:15:26.040
We could take that money and redistribute it
link |
01:15:28.680
through an earned income tax credit or other things
link |
01:15:31.200
so that overall our tax system would become more progressive.
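Back-of-the-envelope arithmetic for that tax-and-redistribute idea; the emissions figure is approximate, and the tax rate, flat-dividend design, and household footprints are assumptions. Since bigger footprints generally track bigger incomes, a flat rebate tends to make the overall package more progressive, which is the effect described above:

```python
# Back-of-the-envelope Pigouvian tax: tax the "bad" (carbon) and return
# the revenue as a flat per-person dividend. All figures are rough
# assumptions for illustration.

US_EMISSIONS_TONNES = 5.0e9  # roughly 5 billion tonnes of CO2 per year
TAX_PER_TONNE = 40.0         # assumed tax rate, dollars per tonne
POPULATION = 330e6           # roughly 330 million people

revenue = US_EMISSIONS_TONNES * TAX_PER_TONNE
dividend = revenue / POPULATION
print(f"revenue:  ${revenue / 1e9:.0f}B per year")  # hundreds of billions
print(f"dividend: ${dividend:.0f} per person per year")

# Net effect for two hypothetical three-person households
# (total footprints in tonnes per year are invented):
for name, footprint in [("smaller-footprint household", 15),
                        ("larger-footprint household", 60)]:
    net = 3 * dividend - footprint * TAX_PER_TONNE
    print(f"{name}: net ${net:+,.0f} per year")
```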
link |
01:15:35.280
We could tax congestion.
link |
01:15:37.000
One of the things that kills me as an economist
link |
01:15:39.040
is every time I sit in a traffic jam,
link |
01:15:41.120
I know that it's completely unnecessary.
link |
01:15:43.040
This is a complete waste of time.
link |
01:15:44.840
You could just visualize the cost in productivity
link |
01:15:47.000
that this is creating.
link |
01:15:47.840
Exactly, because they are imposing costs on me
link |
01:15:51.280
and all the people around me.
link |
01:15:52.720
And if they charged a congestion tax,
link |
01:15:54.880
they would take that same amount of money
link |
01:15:57.120
and it would streamline the roads,
link |
01:15:59.760
like when you're in Singapore, the traffic just flows
link |
01:16:01.680
because they have a congestion tax.
link |
01:16:02.680
They listen to economists.
link |
01:16:03.680
They invited me and others to go talk to them.
link |
01:16:06.520
And then I'd still be paying,
link |
01:16:09.280
I'd be paying a congestion tax instead of paying in my time,
link |
01:16:11.760
but that money would now be available for healthcare,
link |
01:16:14.280
be available for infrastructure,
link |
01:16:15.560
or be available just to give to people
link |
01:16:16.920
so they could buy food or whatever.
link |
01:16:18.680
So it saddens me when I sit in a traffic jam,
link |
01:16:23.360
it's like taxing me and then taking that money
link |
01:16:25.080
and dumping it in the ocean, just like destroying it.
link |
01:16:27.840
So there are a lot of things like that
link |
01:16:29.520
that economists, and I'm not,
link |
01:16:32.520
I'm not like doing anything radical here.
link |
01:16:33.920
Most good economists would,
link |
01:16:36.680
probably agree with me point by point on these things.
link |
01:16:39.440
And if we could do those things, our whole economy
link |
01:16:42.200
would become much more efficient
link |
01:16:43.760
and fairer. Invest in R&D and research,
link |
01:16:47.000
which is as close to a free lunch as we have.
link |
01:16:50.080
My erstwhile MIT colleague, Bob Solow,
link |
01:16:53.160
got the Nobel Prize, not yesterday, but 30 years ago,
link |
01:16:57.360
for describing that most improvements
link |
01:17:00.560
in living standards come from tech progress.
link |
01:17:02.840
And Paul Romer later got a Nobel Prize
link |
01:17:04.560
for noting that investments in R&D and human capital
link |
01:17:08.040
can speed the rate of tech progress.
link |
01:17:11.040
So if we do that, then we'll be healthier and wealthier.
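A back-of-the-envelope sketch of the growth accounting behind Solow's result; the growth rates and the 0.3 capital share below are assumptions, not measurements:

```python
# Growth accounting in the spirit of Solow: output growth not explained
# by capital and labor is attributed to technology (the Solow residual).

ALPHA = 0.3        # assumed capital share of income
g_output = 0.030   # 3.0% annual GDP growth (invented)
g_capital = 0.025  # 2.5% capital growth (invented)
g_labor = 0.010    # 1.0% labor growth (invented)

# With Y = A * K**ALPHA * L**(1 - ALPHA), growth rates approximately
# satisfy g_Y = g_A + ALPHA*g_K + (1 - ALPHA)*g_L; solve for g_A:
g_technology = g_output - ALPHA * g_capital - (1 - ALPHA) * g_labor
print(f"Implied tech-progress contribution: {g_technology:.2%} per year")
# 3.0% - 0.3*2.5% - 0.7*1.0% = 1.55%, i.e. roughly half of growth comes
# from the residual, which is the flavor of Solow's finding.
```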
link |
01:17:14.680
Yeah, from an economics perspective,
link |
01:17:16.200
I remember taking undergrad econ,
link |
01:17:18.440
you mentioned econ 101,
link |
01:17:20.360
it seemed from all the plots I saw that R&D is
link |
01:17:24.880
as close to a free lunch as we have.
link |
01:17:29.040
It seemed obvious that we should do more research.
link |
01:17:32.360
It is.
link |
01:17:33.200
Like what?
link |
01:17:34.040
Like, there's no...
link |
01:17:36.640
but we should do basic research.
link |
01:17:38.040
I mean, so, well, let me just be clear,
link |
01:17:39.480
it'd be great if everybody did more research.
link |
01:17:41.440
And I would divide these things into applied development
link |
01:17:44.360
versus basic research.
link |
01:17:46.080
So applied development, like,
link |
01:17:48.120
how do we get this self-driving car feature
link |
01:17:52.600
to work better in the Tesla?
link |
01:17:53.960
That's great for private companies
link |
01:17:55.240
because they can capture the value from that.
link |
01:17:57.080
If they make a better self-driving car system,
link |
01:17:59.680
they can sell cars that are more valuable
link |
01:18:02.240
and then make money.
link |
01:18:03.080
So there's an incentive,
link |
01:18:03.920
there's not a big problem there.
link |
01:18:05.720
And smart companies, Amazon, Tesla and others
link |
01:18:08.480
are investing in it.
link |
01:18:09.440
The problem is with basic research,
link |
01:18:11.240
like coming up with core basic ideas,
link |
01:18:14.400
whether it's in nuclear fusion
link |
01:18:16.120
or artificial intelligence or biotech,
link |
01:18:18.960
there, if someone invents something,
link |
01:18:21.640
it's very hard for them to capture the benefits from it.
link |
01:18:23.920
It's shared by everybody, which is great in a way,
link |
01:18:26.760
but it means that they're not gonna have the incentives
link |
01:18:28.680
to put as much effort into it.
link |
01:18:30.720
There you need, it's a classic public good,
link |
01:18:33.000
there you need the government to be involved in it.
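A toy sketch of the incentive gap just described; the social value, capture rate, and cost are made-up numbers:

```python
# Why basic research is underprovided, per the public-good argument.

SOCIAL_VALUE = 1_000  # hypothetical total benefit an idea creates
CAPTURE_RATE = 0.05   # assumed sliver of value the inventor can keep
COST = 200            # hypothetical cost of doing the research

private_payoff = SOCIAL_VALUE * CAPTURE_RATE - COST  # 50 - 200 = -150
social_payoff = SOCIAL_VALUE - COST                  # 1000 - 200 = +800

# Society would gain 800, but a private actor expecting to capture only
# 5% of the value sees a loss and walks away. That wedge is the case
# for government funding of basic science.
print(f"private payoff: {private_payoff}, social payoff: {social_payoff}")
```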
link |
01:18:35.120
And the US government used to be investing much more in R&D,
link |
01:18:39.400
but we have slashed that part of the government
link |
01:18:43.000
really foolishly and we're all poorer,
link |
01:18:46.960
significantly poorer as a result.
link |
01:18:48.480
Growth rates are down,
link |
01:18:50.040
we're not having the kind of scientific progress
link |
01:18:51.720
we used to have.
link |
01:18:53.280
It's been sort of short-term thinking, eating the seed corn,
link |
01:18:57.840
whatever metaphor you wanna use,
link |
01:19:00.280
where people grab some money,
link |
01:19:01.840
put it in their pockets today,
link |
01:19:03.360
but five, 10, 20 years later,
link |
01:19:07.160
they're a lot poorer than they otherwise would have been.
link |
01:19:10.200
So we're living through a pandemic right now,
link |
01:19:12.360
globally and in the United States.
link |
01:19:16.600
From an economics perspective,
link |
01:19:18.880
how do you think this pandemic will change the world?
link |
01:19:23.120
It's been remarkable.
link |
01:19:24.720
And it's horrible how many people have suffered,
link |
01:19:27.840
the amount of death, the economic destruction.
link |
01:19:31.320
It's also striking just the amount of change in work
link |
01:19:34.360
that I've seen.
link |
01:19:35.920
In the last 20 weeks,
link |
01:19:37.400
I've seen more change than there was in the previous 20 years.
link |
01:19:41.280
There's been nothing like it
link |
01:19:42.440
since probably the World War II mobilization
link |
01:19:44.760
in terms of reorganizing our economy.
link |
01:19:47.200
The most obvious one is the shift to remote work.
link |
01:19:50.240
And I and many other people stopped going into the office
link |
01:19:54.280
and teaching my students in person.
link |
01:19:56.200
I did a study on this with a bunch of colleagues
link |
01:19:57.800
at MIT and elsewhere.
link |
01:19:59.200
And what we found was that before the pandemic,
link |
01:20:02.480
in the beginning of 2020,
link |
01:20:04.120
about one in six, a little over 15% of Americans
link |
01:20:07.240
were working remotely.
link |
01:20:09.960
When the pandemic hit,
link |
01:20:11.120
that grew steadily and hit 50%,
link |
01:20:13.600
roughly half of Americans working at home.
link |
01:20:16.080
So a complete transformation.
link |
01:20:17.760
And of course, it wasn't evenly distributed,
link |
01:20:19.120
it wasn't like everybody did it.
link |
01:20:20.520
If you're an information worker, professional,
link |
01:20:22.760
if you work mainly with data,
link |
01:20:24.400
then you're much more likely to work at home.
link |
01:20:26.880
If you're a manufacturing worker,
link |
01:20:28.800
working with other people or physical things,
link |
01:20:32.320
then it wasn't so easy to work at home.
link |
01:20:34.520
And instead, those people were much more likely
link |
01:20:36.480
to become laid off or unemployed.
link |
01:20:39.280
So it's been something that has had very disparate effects
link |
01:20:41.840
on different parts of the workforce.
link |
01:20:44.080
Do you think it's gonna be sticky
link |
01:20:46.200
in the sense that after a vaccine comes out
link |
01:20:49.320
and the economy reopens,
link |
01:20:51.080
do you think remote work will continue?
link |
01:20:55.200
That's a great question.
link |
01:20:57.080
My hypothesis is yes, a lot of it will.
link |
01:20:59.360
Of course, some of it will go back,
link |
01:21:00.800
but a surprising amount of it will stay.
link |
01:21:03.520
I personally, for instance,
link |
01:21:04.840
I moved my seminars, my academic seminars to Zoom,
link |
01:21:08.840
and I was surprised how well it worked.
link |
01:21:10.840
So it works.
link |
01:21:11.680
Yeah, I mean, obviously,
link |
01:21:12.960
we were able to reach a much broader audience.
link |
01:21:14.800
So we have people tuning in from Europe
link |
01:21:16.600
and other countries,
link |
01:21:18.560
just all over the United States for that matter.
link |
01:21:20.320
I also actually found that it,
link |
01:21:21.800
in many ways, is more egalitarian.
link |
01:21:23.520
We use the chat feature and other tools,
link |
01:21:25.920
and grad students and others
link |
01:21:27.200
who might've been a little shy about speaking up,
link |
01:21:29.400
we now kind of have more of an ability for lots of voices,
link |
01:21:32.680
and they're answering each other's questions
link |
01:21:34.400
so you kind of get parallel conversations.
link |
01:21:35.960
Like if someone had some question about some of the data
link |
01:21:39.080
or a reference or whatever,
link |
01:21:40.640
then someone else in the chat would answer it.
link |
01:21:42.600
And the whole thing just became
link |
01:21:43.680
like a higher bandwidth, higher quality thing.
link |
01:21:46.680
So I thought that was kind of interesting.
link |
01:21:48.480
I think a lot of people are discovering
link |
01:21:50.360
that these tools that, thanks to technology,
link |
01:21:53.960
have been developed over the past decade,
link |
01:21:56.520
they're a lot more powerful than we thought.
link |
01:21:58.000
I mean, all the terrible things we've seen with COVID
link |
01:22:00.200
and the real failure of many of our institutions
link |
01:22:03.460
that I thought would work better.
link |
01:22:05.040
One area that's been a bright spot is our technologies.
link |
01:22:09.480
Bandwidth has held up pretty well,
link |
01:22:11.920
and all of our email and other tools
link |
01:22:14.240
have just scaled up kind of gracefully.
link |
01:22:18.040
So that's been a plus.
link |
01:22:20.320
Economists call this question
link |
01:22:21.720
of whether it'll go back hysteresis.
link |
01:22:24.000
The question is like when you boil an egg,
link |
01:22:25.920
after it gets cold again, it stays hard.
link |
01:22:29.080
And I think that we're gonna have a fair amount
link |
01:22:30.880
of hysteresis in the economy.
link |
01:22:32.200
We're gonna move to this new...
link |
01:22:33.480
we have moved to a new remote work system,
link |
01:22:35.640
and it's not gonna snap all the way back
link |
01:22:37.280
to where it was before.
link |
01:22:38.760
One of the things that worries me is that the people
link |
01:22:44.200
with lots of followers on Twitter and people with voices,
link |
01:22:51.400
people whose voices can be magnified by,
link |
01:22:55.240
you know, reporters and all that kind of stuff
link |
01:22:57.400
are the people that fall into this category
link |
01:22:59.280
that we were referring to just now
link |
01:23:01.640
where they can still function and be successful
link |
01:23:04.480
with remote work.
link |
01:23:06.280
And then there is a kind of quiet suffering
link |
01:23:11.240
of what feels like millions of people
link |
01:23:14.800
whose jobs are disturbed profoundly by this pandemic,
link |
01:23:21.200
but they don't have many followers on Twitter.
link |
01:23:26.320
What do we, and again, I apologize,
link |
01:23:31.840
but I've been reading The Rise and Fall of the Third Reich
link |
01:23:35.840
and there's a connection to the Depression
link |
01:23:38.120
on the American side.
link |
01:23:39.600
There's a deep, complicated connection
link |
01:23:42.360
to how suffering can turn into forces
link |
01:23:46.440
that potentially change the world in destructive ways.
link |
01:23:52.000
So like, something I worry about is like,
link |
01:23:53.880
how is this suffering going to materialize itself
link |
01:23:56.640
in five, 10 years?
link |
01:23:58.120
Is that something you worry about, think about?
link |
01:24:01.040
It's like the center of what I worry about.
link |
01:24:03.360
And let me break it down to two parts.
link |
01:24:05.440
There's a moral and ethical aspect to it.
link |
01:24:07.320
We need to relieve this suffering.
link |
01:24:09.360
I mean, I share the values of, I think, most Americans,
link |
01:24:13.320
or most people on the planet:
link |
01:24:15.040
we like to see shared prosperity.
link |
01:24:16.640
And we would like to see people not falling behind
link |
01:24:20.240
and they have fallen behind, not just due to COVID,
link |
01:24:23.120
but in the previous couple of decades,
link |
01:24:25.800
median income has barely moved,
link |
01:24:28.000
depending on how you measure it.
link |
01:24:29.920
And the incomes of the top 1% have skyrocketed.
link |
01:24:33.400
And part of that is due to the ways technology has been used.
link |
01:24:36.480
Part of this has been due to, frankly,
link |
01:24:37.920
our political system has continually shifted more wealth
link |
01:24:42.520
toward the people who have powerful interests.
link |
01:24:45.160
So there's just, I think, a moral imperative
link |
01:24:48.760
to do a better job.
link |
01:24:49.840
And ultimately, we're all going to be wealthier
link |
01:24:51.920
if more people can contribute,
link |
01:24:53.360
more people have the wherewithal.
link |
01:24:55.080
But the second thing is that there's a real political risk.
link |
01:24:58.680
I'm not a political scientist,
link |
01:24:59.960
but you don't have to be one, I think, to see
link |
01:25:03.320
how a lot of people are really upset
link |
01:25:05.680
that they're getting a raw deal.
link |
01:25:07.400
And they wanted to smash the system in different ways
link |
01:25:13.720
in 2016 and 2018.
link |
01:25:16.000
And now, I think there are a lot of people
link |
01:25:18.280
who are looking at the political system
link |
01:25:19.600
and they feel like it's not working for them
link |
01:25:21.120
and they just want to do something radical.
link |
01:25:24.720
Unfortunately, demagogues have harnessed that
link |
01:25:28.120
in a way that is pretty destructive to the country.
link |
01:25:33.120
And an analogy I see is what happened with trade.
link |
01:25:37.240
Almost every economist thinks that free trade
link |
01:25:39.440
is a good thing, that when two people voluntarily exchange
link |
01:25:42.440
almost by definition, they're both better off
link |
01:25:44.920
if it's voluntary.
link |
01:25:47.320
And so generally, trade is a good thing,
link |
01:25:49.800
but they also recognize that trade can lead
link |
01:25:52.480
to uneven effects, that there can be winners and losers
link |
01:25:56.240
and some of the people who didn't have the skills
link |
01:25:59.280
to compete with somebody else, or didn't have other assets, lose out.
link |
01:26:02.880
And so trade can shift prices in ways
link |
01:26:05.240
that are adverse to some people.
link |
01:26:08.480
So there's a formula that economists have,
link |
01:26:11.360
which is that you have free trade,
link |
01:26:13.440
but then you compensate the people who are hurt.
link |
01:26:15.920
And free trade makes the pie bigger.
link |
01:26:18.400
And since the pie is bigger, it's possible
link |
01:26:20.200
for everyone to be better off.
link |
01:26:21.920
You can make the winners better off,
link |
01:26:23.200
but you can also compensate those who don't win.
link |
01:26:25.440
And so they end up being better off as well.
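A minimal sketch of this compensation argument with invented payoffs; the winner/loser split and the size of the transfer are purely illustrative:

```python
# If a change (trade or technology) grows the pie, winners can
# compensate losers and everyone can come out ahead.

pie_before = {"winners": 60, "losers": 40}  # total: 100
pie_after  = {"winners": 95, "losers": 30}  # total: 125, but losers hurt

loss = pie_before["losers"] - pie_after["losers"]  # losers are down 10

# Transfer enough from winners to repay the loss plus a small dividend.
transfer = loss + 5
pie_after["winners"] -= transfer
pie_after["losers"] += transfer

assert pie_after["winners"] > pie_before["winners"]  # 80 > 60
assert pie_after["losers"] > pie_before["losers"]    # 45 > 40
print(pie_after)  # both groups end up better off than before the change
```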
link |
01:26:28.480
What happened was that we didn't fulfill that promise.
link |
01:26:33.160
We did have increased free trade
link |
01:26:36.040
in the 80s and 90s, but we didn't compensate
link |
01:26:39.120
the people who were hurt.
link |
01:26:40.640
And so they felt like the people in power
link |
01:26:43.760
reneged on the bargain, and I think they did.
link |
01:26:45.920
And so then there's a backlash against trade.
link |
01:26:48.760
And now both political parties,
link |
01:26:50.840
but especially Trump and company,
link |
01:26:53.640
have really pushed back against free trade.
link |
01:26:58.200
Ultimately, that's bad for the country.
link |
01:27:00.680
Ultimately, that's bad for living standards,
link |
01:27:02.760
but in a way I can understand
link |
01:27:04.400
that people felt they were betrayed.
link |
01:27:07.120
Technology has a lot of similar characteristics.
link |
01:27:10.680
Technology can make us all better off.
link |
01:27:14.920
It makes the pie bigger, it creates wealth and health,
link |
01:27:17.680
but it can also be uneven.
link |
01:27:18.920
Not everyone automatically benefits.
link |
01:27:21.280
It's possible for some people,
link |
01:27:22.760
even a majority of people to get left behind,
link |
01:27:25.080
while a small group benefits.
link |
01:27:28.200
What most economists would say is,
link |
01:27:29.560
well, let's make the pie bigger,
link |
01:27:30.880
but let's make sure we adjust the system
link |
01:27:33.000
so we compensate the people who are hurt.
link |
01:27:35.240
And since the pie is bigger, we can make the rich richer,
link |
01:27:38.000
we can make the middle class richer,
link |
01:27:39.200
we can make the poor richer.
link |
01:27:41.000
Mathematically, everyone could be better off.
link |
01:27:43.640
But again, we're not doing that.
link |
01:27:45.400
And again, people are saying, this isn't working for us.
link |
01:27:48.960
And again, instead of fixing the distribution,
link |
01:27:52.560
a lot of people are beginning to say,
link |
01:27:54.280
hey, technology sucks, we've got to stop it.
link |
01:27:57.280
Let's throw rocks at the Google bus.
link |
01:27:59.040
Let's blow it up.
link |
01:28:00.000
Let's blow it up.
link |
01:28:01.240
And there were the Luddites almost exactly 200 years ago
link |
01:28:04.760
who smashed the looms and the spinning machines
link |
01:28:08.040
because they felt like those machines weren't helping them.
link |
01:28:11.320
We have a real imperative,
link |
01:28:12.720
not just to do the morally right thing,
link |
01:28:14.720
but to do the thing that is gonna save the country,
link |
01:28:17.520
which is make sure that we create,
link |
01:28:19.440
not just prosperity, but shared prosperity.
link |
01:28:22.900
So you've been at MIT for over 30 years, I think.
link |
01:28:27.600
Don't tell everyone how old I am.
link |
01:28:28.520
Yeah, no, that's true, that's true.
link |
01:28:30.280
And you've now moved to Stanford.
link |
01:28:34.000
I'm gonna try not to say anything
link |
01:28:37.240
about how great MIT is.
link |
01:28:39.760
What's that move been like?
link |
01:28:41.520
East Coast to the West Coast?
link |
01:28:44.960
Well, MIT is great.
link |
01:28:46.160
MIT has been very good to me.
link |
01:28:48.080
It continues to be very good to me.
link |
01:28:49.560
It's an amazing place.
link |
01:28:51.000
I continue to have so many amazing friends
link |
01:28:53.200
and colleagues there.
link |
01:28:54.600
I'm very fortunate to have been able
link |
01:28:56.120
to spend a lot of time at MIT.
link |
01:28:58.480
Stanford's also amazing.
link |
01:29:00.200
And part of what attracted me out here
link |
01:29:02.000
was not just the weather, but also Silicon Valley,
link |
01:29:04.960
let's face it, is really more of the epicenter
link |
01:29:07.360
of the technological revolution.
link |
01:29:09.000
And I wanna be close to the people
link |
01:29:10.400
who are inventing AI here and elsewhere.
link |
01:29:12.360
A lot of it is being invented at MIT, for that matter,
link |
01:29:14.920
and in Europe and China and elsewhere.
link |
01:29:20.280
But being a little closer to some of the key technologists
link |
01:29:23.800
was something that was important to me.
link |
01:29:25.920
And it may be shallow, but I also do enjoy the good weather.
link |
01:29:30.240
And I felt a little ripped off
link |
01:29:33.120
when I came here a couple of months ago.
link |
01:29:35.040
And immediately there are the fires
link |
01:29:36.640
and my eyes were burning, the sky was orange,
link |
01:29:39.840
and there's the heat waves.
link |
01:29:41.320
And so it wasn't exactly what I'd been promised,
link |
01:29:44.440
but fingers crossed it'll get back to better.
link |
01:29:47.960
But maybe on a brief aside,
link |
01:29:50.720
there's been some criticism of academia
link |
01:29:52.720
and universities along different avenues.
link |
01:29:55.760
And I, as a person who's gotten to enjoy universities
link |
01:30:00.760
for the pure playground of ideas that they can be,
link |
01:30:06.360
always kind of try to find the words
link |
01:30:08.840
to tell people that these are magical places.
link |
01:30:13.160
Is there something that you can speak to
link |
01:30:17.000
that is beautiful or powerful about universities?
link |
01:30:22.440
Well, sure.
link |
01:30:23.280
I mean, first off, I mean,
link |
01:30:24.480
economists have this concept called revealed preference.
link |
01:30:26.680
You can ask people what they say,
link |
01:30:28.320
or you can watch what they do.
link |
01:30:29.960
And so obviously by revealed preference,
link |
01:30:32.200
I love academia, I'm happy here.
link |
01:30:34.000
I could be doing lots of other things,
link |
01:30:35.560
but it's something I enjoy a lot.
link |
01:30:37.600
And I think the word magical is exactly right.
link |
01:30:39.680
At least it is for me.
link |
01:30:41.480
I do what I love, you know,
link |
01:30:43.120
hopefully my Dean won't be listening,
link |
01:30:44.320
but I would do this for free.
link |
01:30:45.440
You know, it's just what I like to do.
link |
01:30:49.080
I like to do research.
link |
01:30:50.160
I love to have conversations like this with you
link |
01:30:51.840
and with my students, with my fellow colleagues.
link |
01:30:53.720
I love being around the smartest people I can find
link |
01:30:55.760
and learning something from them
link |
01:30:57.200
and having them challenge me.
link |
01:30:58.640
And that just gives me joy.
link |
01:31:02.480
And every day I find something new and exciting to work on.
link |
01:31:05.520
And a university environment is really filled
link |
01:31:08.040
with other people who feel that way.
link |
01:31:09.840
And so I feel very fortunate to be part of it.
link |
01:31:13.000
And I'm lucky that I'm in a society
link |
01:31:14.880
where I can actually get paid for it
link |
01:31:16.240
and put food on the table
link |
01:31:17.280
while doing the stuff that I really love.
link |
01:31:19.320
And I hope someday everybody can have jobs
link |
01:31:21.640
that are like that.
link |
01:31:22.840
And I appreciate that it's not necessarily easy
link |
01:31:25.400
for everybody to have a job that they both love
link |
01:31:27.440
and also they get paid for.
link |
01:31:30.720
So there are things that don't go well in academia,
link |
01:31:34.080
but by and large, I think it's a kind of,
link |
01:31:35.720
you know, kinder, gentler version of a lot of the world.
link |
01:31:38.160
You know, we sort of cut each other a little slack
link |
01:31:41.240
on things like, you know, on just a lot of things.
link |
01:31:45.760
You know, of course there's harsh debates
link |
01:31:48.280
and discussions about things
link |
01:31:49.840
and some petty politics here and there.
link |
01:31:52.400
Personally, I try to stay away
link |
01:31:53.480
from most of that sort of politics.
link |
01:31:55.560
It's not my thing.
link |
01:31:56.520
And so it doesn't affect me most of the time,
link |
01:31:58.280
sometimes a little bit maybe.
link |
01:32:00.440
But, you know, being able to pull together
link |
01:32:02.960
something like we have with the Digital Economy Lab, where
link |
01:32:04.840
we get all these brilliant grad students
link |
01:32:07.440
and undergraduates and postdocs
link |
01:32:09.280
that are just doing stuff that I learn from.
link |
01:32:12.280
And every one of them has some aspect
link |
01:32:14.720
of what they're doing that's just,
link |
01:32:16.640
I couldn't even understand.
link |
01:32:17.560
It's like way, way more brilliant.
link |
01:32:19.320
And that's really, to me, actually, I really enjoy that.
link |
01:32:23.000
Being in a room with lots of other smart people.
link |
01:32:25.080
And Stanford has made it very easy to attract,
link |
01:32:29.400
you know, those people.
link |
01:32:31.240
I just, you know, say I'm gonna do a seminar or whatever
link |
01:32:33.640
and the people come, they come and wanna work with me.
link |
01:32:36.800
We get funding, we get data sets
link |
01:32:38.840
and it's come together real nicely.
link |
01:32:41.400
And the rest is just fun.
link |
01:32:44.200
It's fun, yeah.
link |
01:32:45.880
And we feel like we're working on important problems,
link |
01:32:47.480
you know, and we're doing things that, you know,
link |
01:32:50.280
I think are first order in terms of
link |
01:32:52.800
what's important in the world
link |
01:32:54.080
and that's very satisfying to me.
link |
01:32:56.280
Maybe a bit of a fun question.
link |
01:32:58.040
What three books, technical, fiction, philosophical,
link |
01:33:02.120
you've enjoyed, that had a big impact on your life?
link |
01:33:07.400
Well, I guess I go back to like my teen years
link |
01:33:10.000
and, you know, I read Siddhartha,
link |
01:33:12.360
which is a philosophical book
link |
01:33:13.480
and kind of helps keep me centered.
link |
01:33:15.320
Yeah, by Hermann Hesse, exactly.
link |
01:33:17.400
Don't get too wrapped up in material things
link |
01:33:20.400
or other things and just sort of, you know,
link |
01:33:22.000
try to find peace with things.
link |
01:33:24.840
A book that actually influenced me a lot
link |
01:33:26.360
in terms of my career was called
link |
01:33:27.680
The Worldly Philosophers by Robert Heilbroner.
link |
01:33:30.480
It's actually about economists.
link |
01:33:31.720
It goes through a series of different economists,
link |
01:33:33.600
written in a very lively form.
link |
01:33:34.960
And it probably sounds boring,
link |
01:33:36.240
but it did describe whether it's Adam Smith,
link |
01:33:38.880
or Karl Marx, or John Maynard Keynes
link |
01:33:40.800
and each of them sort of what their key insights were,
link |
01:33:43.360
but also kind of their personalities.
link |
01:33:45.360
And I think that's one of the reasons
link |
01:33:46.560
I became an economist was just understanding
link |
01:33:50.640
how they grappled with the big questions of the world.
link |
01:33:53.160
So would you recommend it
link |
01:33:54.000
as a good whirlwind overview of the history of economics?
link |
01:33:57.560
Yeah, yeah, I think that's exactly right.
link |
01:33:59.080
It kind of takes you through the different things
link |
01:34:00.680
and, you know, so you can understand how they reached
link |
01:34:04.040
their thinking, some of the strengths and weaknesses.
link |
01:34:06.400
I mean, it's probably a little out of date now.
link |
01:34:07.920
It needs to be updated a bit, but, you know,
link |
01:34:09.520
you could at least look through
link |
01:34:10.400
the first couple hundred years of economics,
link |
01:34:12.960
which is not a bad place to start.
link |
01:34:15.040
More recently, I mean, a book I really enjoyed
link |
01:34:17.560
is by my friend and colleague, Max Tegmark,
link |
01:34:20.280
called Life 3.0.
link |
01:34:21.320
You should have him on your podcast
link |
01:34:22.400
if you haven't already.
link |
01:34:23.240
He was episode number one.
link |
01:34:25.480
Oh my God.
link |
01:34:26.520
And he's back, he'll be back, he'll be back soon.
link |
01:34:30.240
Yeah, no, he's terrific.
link |
01:34:31.560
I love the way his brain works
link |
01:34:33.440
and he makes you think about profound things.
link |
01:34:35.760
He's got such a joyful approach to life.
link |
01:34:38.560
And so that's been a great book.
link |
01:34:41.080
And, you know, I learned a lot from it, I think everybody does,
link |
01:34:43.160
but he explains it in a way,
link |
01:34:44.360
even though he's so brilliant,
link |
01:34:45.600
that, you know, everyone can understand,
link |
01:34:47.360
that I can understand.
link |
01:34:49.640
You know, that's three, but let me mention
link |
01:34:51.720
maybe one or two others.
link |
01:34:52.920
I mean, I recently read More From Less
link |
01:34:55.360
by my sometimes coauthor, Andrew McAfee.
link |
01:34:58.640
It made me optimistic about how we can continue
link |
01:35:01.920
to have rising living standards
link |
01:35:04.560
while living more lightly on the planet.
link |
01:35:06.120
In fact, because of higher living standards,
link |
01:35:07.840
because of technology,
link |
01:35:09.120
because of digitization that I mentioned,
link |
01:35:11.480
we don't have to have as big an impact on the planet.
link |
01:35:13.600
And that's a great story to tell
link |
01:35:15.720
and he documents it very carefully.
link |
01:35:19.760
You know, a personal kind of self-help book
link |
01:35:21.400
that I found kind of useful is Atomic Habits.
link |
01:35:23.840
I think it's, what's his name?
link |
01:35:25.200
James Clear.
link |
01:35:26.200
Yeah, James Clear.
link |
01:35:27.520
He's just, yeah, it's a good name
link |
01:35:29.080
because he writes very clearly.
link |
01:35:30.440
And, you know, most of the sentences I read in that book,
link |
01:35:33.600
I was like, yeah, I know that,
link |
01:35:34.480
but it just really helps to have somebody
link |
01:35:36.120
like remind you and tell you
link |
01:35:37.720
and kind of just reinforce it and...
link |
01:35:40.840
So build habits in your life
link |
01:35:42.480
that you hope will have a positive impact,
link |
01:35:46.120
and they don't have to be big things.
link |
01:35:48.040
It could be just tiny little...
link |
01:35:49.200
Exactly, I mean, the word atomic,
link |
01:35:50.680
it's a little bit of a pun, I think he says.
link |
01:35:52.520
You know, atomic means a really small thing
link |
01:35:54.080
so take these little things,
link |
01:35:55.480
but also like atomic power,
link |
01:35:56.800
it can have like, you know, big impact.
link |
01:35:59.440
That's funny, yeah.
link |
01:36:01.440
The biggest ridiculous question,
link |
01:36:04.160
especially to ask an economist,
link |
01:36:05.760
but also a human being, what's the meaning of life?
link |
01:36:08.360
I hope you've gotten the answer to that from somebody else.
link |
01:36:11.480
I think we're all still working on that one,
link |
01:36:13.480
but what is it?
link |
01:36:14.720
You know, I actually learned a lot from my son, Luke,
link |
01:36:18.080
and he's 19 now, but he's always loved philosophy
link |
01:36:22.080
and he reads way more sophisticated philosophy than I do.
link |
01:36:24.880
I once took him to Oxford
link |
01:36:25.880
and he spent the whole time like
link |
01:36:26.880
pulling all these obscure books down and reading them.
link |
01:36:29.040
And a couple of years ago, we had this argument
link |
01:36:32.600
and he was trying to convince me that hedonism
link |
01:36:34.520
was the ultimate, you know, meaning of life,
link |
01:36:37.480
just pleasure, yeah, seeking and...
link |
01:36:40.400
Well, how old was he at the time?
link |
01:36:41.600
17.
link |
01:36:44.440
But he made a really good like intellectual argument
link |
01:36:46.680
for it too, and you know...
link |
01:36:47.880
Of course.
link |
01:36:48.720
But it just didn't strike me as right.
link |
01:36:50.200
And I think that, you know,
link |
01:36:52.840
while I am kind of a utilitarian,
link |
01:36:54.520
like, you know, I do think we should do the greatest
link |
01:36:55.920
good for the greatest number, but hedonism is just too shallow.
link |
01:36:58.720
And I think I've convinced myself
link |
01:37:00.520
that real happiness doesn't come from seeking pleasure.
link |
01:37:04.240
It's kind of ironic.
link |
01:37:05.680
Like if you really focus on being happy,
link |
01:37:07.680
I think it doesn't work.
link |
01:37:09.720
You gotta like be doing something bigger.
link |
01:37:11.600
It's, I think the analogy I sometimes use is, you know,
link |
01:37:14.880
when you look at a dim star in the sky,
link |
01:37:17.600
if you look right at it, it kind of disappears,
link |
01:37:19.440
but you have to look a little to the side
link |
01:37:20.720
and then the parts of your retina
link |
01:37:23.160
that are better at absorbing light,
link |
01:37:24.960
you know, can pick it up better.
link |
01:37:26.360
It's the same thing with happiness.
link |
01:37:27.440
I think you need to sort of find some other goal,
link |
01:37:32.520
something, some meaning in life.
link |
01:37:34.000
And that ultimately makes you happier
link |
01:37:36.200
than if you go squarely at just pleasure.
link |
01:37:39.080
And so for me, you know, the kind of research I do
link |
01:37:42.320
is, I think, trying to change the world,
link |
01:37:44.280
make the world a better place.
link |
01:37:46.160
And I'm not like an evolutionary psychologist,
link |
01:37:48.000
but my guess is that our brains are wired,
link |
01:37:50.880
not just for pleasure, but we're social animals
link |
01:37:53.880
and we're wired to like help others.
link |
01:37:57.200
And ultimately, you know, that's something
link |
01:37:59.440
that's really deeply rooted in our psyche.
link |
01:38:02.040
And if we do help others, if we do,
link |
01:38:04.520
or at least feel like we're helping others,
link |
01:38:06.640
you know, our reward systems kick in
link |
01:38:08.240
and we end up being more deeply satisfied
link |
01:38:10.440
than if we just do something selfish and shallow.
link |
01:38:13.600
Beautifully put, I don't think there's a better way
link |
01:38:15.680
to end it, Erik.
link |
01:38:17.000
You're one of the people when I first showed up at MIT
link |
01:38:20.520
that made me proud to be at MIT.
link |
01:38:22.440
So it's so sad that you're now at Stanford,
link |
01:38:24.560
but I'm sure you'll do wonderful things at Stanford as well.
link |
01:38:28.720
I can't wait for future books,
link |
01:38:30.920
and people should definitely read the other books.
link |
01:38:32.280
Well, thank you so much.
link |
01:38:33.120
And I think we're all,
link |
01:38:34.120
we're all part of the invisible college as we call it.
link |
01:38:36.200
You know, we're all part of this intellectual
link |
01:38:38.720
and human community where we all can learn from each other.
link |
01:38:41.680
It doesn't really matter physically
link |
01:38:43.120
where we are so much anymore.
link |
01:38:44.880
Beautiful. Thanks for talking today.
link |
01:38:46.560
My pleasure.
link |
01:38:48.040
Thanks for listening to this conversation
link |
01:38:49.440
with Erik Brynjolfsson and thank you to our sponsors.
link |
01:38:52.640
Vincero Watches, the maker of classy,
link |
01:38:55.080
well performing watches, FourSigmatic,
link |
01:38:57.840
the maker of delicious mushroom coffee, ExpressVPN,
link |
01:39:01.360
the VPN I've used for many years
link |
01:39:03.120
to protect my privacy on the internet, and Cash App,
link |
01:39:06.640
the app I use to send money to friends.
link |
01:39:09.160
Please check out these sponsors in the description
link |
01:39:11.200
to get a discount and to support this podcast.
link |
01:39:14.920
If you enjoy this thing, subscribe on YouTube,
link |
01:39:17.280
review it with five stars on Apple Podcasts,
link |
01:39:19.480
follow on Spotify, support on Patreon,
link |
01:39:22.080
or connect with me on Twitter at Lex Fridman.
link |
01:39:25.400
And now let me leave you with some words
link |
01:39:27.680
from Albert Einstein.
link |
01:39:30.000
It has become appallingly obvious
link |
01:39:32.880
that our technology has exceeded our humanity.
link |
01:39:36.600
Thank you for listening and hope to see you next time.