
Charles Isbell: Computing, Interactive AI, and Race in America | Lex Fridman Podcast #135



link |
00:00:00.000
The following is a conversation with Charles Isbell,
link |
00:00:03.120
Dean of the College of Computing at Georgia Tech,
link |
00:00:06.320
a researcher and educator in the field of artificial intelligence,
link |
00:00:10.720
and someone who deeply thinks about what exactly
link |
00:00:14.320
is the field of computing and how do we teach it.
link |
00:00:18.000
He also has a fascinatingly varied set of interests, including music,
link |
00:00:22.640
books, movies, sports, and history. They make him especially fun to talk with.
link |
00:00:28.160
When I first saw him speak, his charisma immediately took over the room,
link |
00:00:32.880
and I had a stupid, excited smile on my face,
link |
00:00:35.680
and I knew I had to eventually talk to him on this podcast.
link |
00:00:39.360
Quick mention of each sponsor, followed by some thoughts related to the episode.
link |
00:00:44.320
First is Neuro, the maker of functional sugar free gum and mints
link |
00:00:48.800
that I use to give my brain a quick caffeine boost.
link |
00:00:52.320
Second is Decoding Digital, a podcast on tech and entrepreneurship
link |
00:00:56.960
that I listen to and enjoy. Third is Masterclass,
link |
00:01:00.560
online courses that I watch from some of the most amazing humans in history.
link |
00:01:04.960
And finally, Cash App, the app I use to send money to friends
link |
00:01:08.560
for food and drinks. Please check out these sponsors
link |
00:01:12.080
in the description to get a discount and to support this podcast.
link |
00:01:16.240
As a side note, let me say that I'm trying to make it so that the
link |
00:01:19.440
conversations with Charles, Eric Weinstein, and Dan Carlin
link |
00:01:23.280
will be published before Americans vote for president on November 3rd.
link |
00:01:28.160
There's nothing explicitly political in these conversations,
link |
00:01:31.360
but they do touch on something in human nature
link |
00:01:34.400
that I hope can bring context to our difficult time.
link |
00:01:37.840
And maybe, for a moment, allow us to empathize with people we disagree with.
link |
00:01:43.200
With Eric, we talk about the nature of evil.
link |
00:01:45.840
With Charles, besides AI and music, we talk a bit about race in America
link |
00:01:51.440
and how we can bring more love and empathy to
link |
00:01:54.800
our online communication. And with Dan Carlin,
link |
00:01:58.640
well, we talk about Alexander the Great, Genghis Khan,
link |
00:02:02.960
Hitler, Stalin, and all the complicated parts of human history in between,
link |
00:02:07.840
with a hopeful eye toward a brighter future for our humble
link |
00:02:11.440
little civilization here on Earth. The conversation with Dan
link |
00:02:15.280
will hopefully be posted tomorrow on Monday, November 2nd.
link |
00:02:20.160
If you enjoy this thing, subscribe on YouTube,
link |
00:02:22.640
review it with five stars on Apple Podcast, follow on Spotify,
link |
00:02:26.240
support on Patreon, and connect with me on Twitter at Lex Fridman.
link |
00:02:31.120
And now, here's my conversation with Charles Isbell.
link |
00:02:36.080
You've mentioned that you love movies and TV shows.
link |
00:02:40.960
Let's ask an easy question, but you have to be definitively,
link |
00:02:44.640
objectively conclusive. What's your top three movies of all time?
link |
00:02:49.760
So, you're asking me to be definitive and to be conclusive. That's a little hard.
link |
00:02:53.040
I'm going to tell you why. It's very simple. It's because
link |
00:02:56.800
movies is too broad a category. I've got to pick subgenres, but I will tell you
link |
00:03:00.560
that of those genres, I'll pick one or two favorites from each of the genres.
link |
00:03:04.080
That'll get us to three; sometimes I'm going to cheat. So, my favorite
link |
00:03:08.080
comedy of all time, which is probably my favorite movie of all time,
link |
00:03:12.000
is His Girl Friday, which is probably a movie that you've not ever heard of, but
link |
00:03:16.640
it's based on a play called The Front Page from, I don't know, early 1900s.
link |
00:03:22.400
And the movie is a fantastic film. What's the story? Is it an independent film?
link |
00:03:28.160
No, no, no. What are we talking about? This is one of the movies that would
link |
00:03:31.440
have been very popular. It's a screwball comedy. Have you ever
link |
00:03:33.440
seen Moonlighting, the TV show? You know what I'm talking about?
link |
00:03:35.840
So, you've seen these shows where there's a man and a woman,
link |
00:03:38.720
and they clearly are in love with one another, and they're constantly fighting
link |
00:03:41.360
and always talking over each other. Yeah.
link |
00:03:43.120
Banter, banter, banter, banter, banter. This was the movie that started all that,
link |
00:03:47.440
as far as I'm concerned. It's very much of its time, so it's,
link |
00:03:51.680
I don't know, it must have come out sometime between 1934 and 1939. I'm not sure exactly when
link |
00:03:56.400
the movie itself came out. It's black and white. It's just a fantastic
link |
00:04:01.200
film. It is hilarious. It's just mostly conversation.
link |
00:04:04.240
Not entirely, but mostly, mostly. Just a lot of back and forth. There's a story there.
link |
00:04:08.880
Someone's on death row, and they're newspaper men, including her. They're all newspaper men.
link |
00:04:16.240
They were divorced. The editor, the publisher, I guess, and the reporter, they were divorced,
link |
00:04:22.400
but clearly he's trying to get back together with her, and there's this whole other
link |
00:04:26.240
thing that's going on, but none of that matters. The plot doesn't matter.
link |
00:04:28.800
Yeah. There's just total play in conversation.
link |
00:04:31.520
It's fantastic, and I just love everything about the conversation, because at the end of the day,
link |
00:04:35.680
sort of narrative and conversation are the sort of things that drive me, and so I really like
link |
00:04:39.680
that movie for that reason. Similarly, I'm now going to cheat, and I'm going to give you two
link |
00:04:43.680
movies as one, and they're Crouching Tiger, Hidden Dragon, and John Wick, both relatively modern.
link |
00:04:50.800
John Wick, of course. One, two, or three.
link |
00:04:52.640
One. I love them all for different reasons, and they're increasingly more ridiculous,
link |
00:04:57.520
kind of like loving Alien and Aliens, despite the fact they're two completely different movies.
link |
00:05:01.600
But the reason I put Crouching Tiger, Hidden Dragon, and John Wick together is because I
link |
00:05:06.080
actually think they're the same movie, or what I like about them is the same movie,
link |
00:05:09.760
which is both of them create a world that you're coming in the middle of, and they don't explain
link |
00:05:16.480
it to you, but the story is done so well that you pick it up. So anyone who's seen John Wick,
link |
00:05:21.600
you know, you have these little coins, and they're handed out, and there are these
link |
00:05:25.360
rules, and apparently every single person in New York City is an assassin.
link |
00:05:28.240
There's like two people who come through who aren't, but otherwise they are. But there's
link |
00:05:31.680
this complicated world, and everyone knows each other, they don't sit down and explain it to you,
link |
00:05:35.200
but you figure it out. Crouching Tiger, Hidden Dragon is a lot like that. You get the feeling
link |
00:05:38.720
that this is chapter nine of a 10 part story, and you've missed the first eight chapters,
link |
00:05:43.280
and they're not going to explain it to you, but there's this sort of rich world behind you.
link |
00:05:46.000
You get pulled in anyway, like immediately.
link |
00:05:47.280
You get pulled in anyway. So it's just excellent storytelling in both cases,
link |
00:05:50.800
and very, very different.
link |
00:05:51.840
And also you like the outfit, I assume, the John Wick outfit?
link |
00:05:54.480
Oh yeah, of course. Well, of course. Yes. I think John Wick outfit first.
link |
00:05:58.000
And so that's number two. And then...
link |
00:05:59.760
But sorry to pause on that, Martial Arts. You have a long list of hobbies,
link |
00:06:03.920
like it scrolls off the page, but I didn't see Martial Arts as one of them.
link |
00:06:07.680
I do not do Martial Arts, but I certainly watch Martial Arts.
link |
00:06:10.240
Oh, I appreciate it very much. Oh, we could talk about every Jackie Chan movie he's ever made.
link |
00:06:13.680
Okay.
link |
00:06:14.000
And I would be on board with that.
link |
00:06:15.680
Like Rush Hour 2, that kind of cop comedy?
link |
00:06:18.880
Yes, yes. By the way, my favorite Jackie Chan movie would be
link |
00:06:22.160
Drunken Master 2, known in the States usually as Legend of the Drunken Master.
link |
00:06:29.200
Actually, Drunken Master, the first one is the first Kung Fu movie I ever saw,
link |
00:06:33.360
but I did not know that.
link |
00:06:34.320
The first Jackie Chan movie?
link |
00:06:35.920
No, first one ever that I saw and remember.
link |
00:06:38.000
But I had no idea that that's what it was, and I didn't know that was Jackie Chan.
link |
00:06:41.840
That was like his first major movie. I was a kid. It was done in the 70s.
link |
00:06:46.320
I only later rediscovered that it was actually...
link |
00:06:49.600
And he creates his own martial art by drinking.
link |
00:06:52.320
Was he actually drinking or was he playing drunk?
link |
00:06:58.000
You mean as an actor or?
link |
00:06:59.520
No, I'm sure as an actor.
link |
00:07:02.560
No, it was the 70s or whatever.
link |
00:07:04.160
He was definitely drinking, and in the end, he drinks industrial grade alcohol.
link |
00:07:10.000
Yeah.
link |
00:07:10.480
Yeah, and has one of the most fantastic fights ever in that subgenre.
link |
00:07:15.040
Anyway, that's my favorite one of his movies, but I'll tell you the last movie
link |
00:07:18.080
is actually a movie called Nothing but a Man, which is from the 1960s.
link |
00:07:23.680
Starring Ivan Dixon, who you'll know from Hogan's Heroes, and Abbey Lincoln.
link |
00:07:31.840
It's just a really small little drama. It's a beautiful story,
link |
00:07:35.040
but my favorite scene, so I'm cheating.
link |
00:07:36.880
My favorite, one of my favorite movies just for the ending is The Godfather.
link |
00:07:42.560
I think the last scene of that is just fantastic.
link |
00:07:45.520
It's the whole movie all summarized in just eight, nine seconds.
link |
00:07:48.400
Godfather, part one?
link |
00:07:49.440
Part one.
link |
00:07:50.240
How does it end?
link |
00:07:51.280
I don't think you need to worry about spoilers if you haven't seen The Godfather.
link |
00:07:54.880
Spoiler alert.
link |
00:07:55.680
It ends with the wife coming to Michael, and he says,
link |
00:08:01.920
just this once, I'll let you ask me about my business.
link |
00:08:04.000
And she asks him if he did this terrible thing, and he looks her in the eye and he lies,
link |
00:08:07.760
and he says, no.
link |
00:08:09.120
And she says, thank you.
link |
00:08:10.000
And she walks out the door, and you see him, you see her going out of the door,
link |
00:08:17.920
and all these people are coming in, and they're kissing Michael's hands,
link |
00:08:21.680
and Godfather.
link |
00:08:23.280
And then the camera switches perspective.
link |
00:08:25.360
So instead of looking at him, you're looking at her,
link |
00:08:29.120
and the door closes in her face, and that's the end of the movie.
link |
00:08:32.320
And that's the whole movie right there.
link |
00:08:34.480
Do you see parallels between that and your position as Dean at Georgia Tech?
link |
00:08:38.000
I'm just kidding, a trick question.
link |
00:08:42.000
Sometimes, certainly, the door gets closed on me every once in a while.
link |
00:08:45.440
Okay, that was a rhetorical question.
link |
00:08:48.000
You've also mentioned that you, I think, enjoy all kinds of experiments, including on yourself,
link |
00:08:54.000
but I saw a video where you said you did an experiment where you tracked all kinds of
link |
00:08:58.000
information about yourself, and a few others, sort of wiring up your home.
link |
00:09:04.640
And this little idea that you mentioned in that video, which is kind of interesting,
link |
00:09:09.200
that you thought that two days' worth of data is enough to capture the majority of the behavior
link |
00:09:16.560
of the human being.
link |
00:09:18.960
First, can you describe what the heck you did to collect all that data?
link |
00:09:23.200
Because it's fascinating, just like little details of how you collect that data,
link |
00:09:26.720
and also what your intuition behind the two days is.
link |
00:09:29.840
So first off, it has to be the right two days, but I was thinking of a very specific
link |
00:09:33.680
experiment. There's actually a suite of them that I've been a part of, and other people
link |
00:09:36.720
have done this. Of course, I just sort of dabbled in that part of the world.
link |
00:09:39.760
But to be very clear, the specific thing that I was talking about had to do with recording
link |
00:09:44.320
all the IR going on in my, infrared going on in my house.
link |
00:09:47.920
So this is a long time ago.
link |
00:09:49.520
So this is everything being controlled by pressing buttons on remote controls, as opposed to speaking
link |
00:09:54.720
to Alexa or Siri or someone like that.
link |
00:09:57.120
And I was just trying to figure out if you could get enough data on people to figure out
link |
00:10:00.960
what they were going to do with their TVs or their lights.
link |
00:10:03.760
My house was completely wired up at the time, so, you know, I'm about to look at
link |
00:10:08.400
a movie or I'm about to turn on the TV or whatever and just see what I could predict
link |
00:10:11.360
from it. It was kind of surprising.
link |
00:10:15.040
It shouldn't have been, but that's all very easy to do, by the way, just capturing all
link |
00:10:18.320
the little stuff. I mean, it's a bunch of computer systems.
link |
00:10:20.080
It's really easy to capture the data if you know what you're looking for. At Georgia Tech,
link |
00:10:23.200
long before I got there, we had this thing called the Aware Home, where everything was
link |
00:10:26.640
wired up and you captured everything that was going on.
link |
00:10:29.200
Nothing even difficult, not with video or anything like that, just the way that the
link |
00:10:32.800
system was just capturing everything.
link |
00:10:35.440
So it turns out that, and I did this with myself and then I had students and they worked
link |
00:10:40.640
with many other people, and it turns out at the end of the day, people do the same things
link |
00:10:45.680
over and over and over again.
link |
00:10:47.840
So it has to be the right two days, like a weekend, but it turns out not only can you
link |
00:10:51.360
predict what someone's going to do next at the level of what button they're going to
link |
00:10:54.880
press next on a remote control, but you can do it with something really, really simple,
link |
00:11:00.480
like a, you don't even need a hidden Markov model.
link |
00:11:02.480
It's just a Markov model, simply: I pressed this, so this is my prediction of the next thing.
link |
00:11:06.240
It turns out you can get 93% accuracy just by doing something very simple and stupid
link |
00:11:11.680
and just counting statistics.
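A minimal sketch in Python of the counting model Charles describes here: a first-order Markov predictor over button presses. The button names and the event log are invented for illustration.

    from collections import Counter, defaultdict

    def train(events):
        # Count how often each button follows each other button.
        counts = defaultdict(Counter)
        for prev, nxt in zip(events, events[1:]):
            counts[prev][nxt] += 1
        return counts

    def predict(counts, last_button):
        # Predict the most frequent successor of the last button pressed.
        followers = counts.get(last_button)
        return followers.most_common(1)[0][0] if followers else None

    # Hypothetical log of remote-control presses.
    log = ["power", "guide", "down", "down", "ok", "vol_up", "vol_up",
           "power", "guide", "down", "ok"]
    model = train(log)
    print(predict(model, "guide"))  # -> "down": it always follows "guide" here
    print(predict(model, "down"))   # -> "ok": the most common successor of "down"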
link |
00:11:13.520
What was actually more interesting is that you could use that information.
link |
00:11:16.320
This comes up again and again in my work.
link |
00:11:18.320
If you try to represent people or objects by the things they do, the things you can
link |
00:11:24.080
measure about them that have to do with action in the world, so distribution over actions,
link |
00:11:28.880
and you try to represent them by the distribution of actions that are done on them, then you
link |
00:11:34.800
do a pretty good job of sort of understanding how people are and they cluster remarkably
link |
00:11:40.480
well, in fact, irritatingly so.
link |
00:11:43.040
And so by clustering people this way, you can, maybe, you know, I got the 93% accuracy
link |
00:11:49.440
of what's the next button you're going to press, but I can get 99% accuracy or somewhere
link |
00:11:53.680
thereabouts on the collections of things you might press.
link |
00:11:56.400
And it turns out the things that you might press are all related to each other
link |
00:12:00.160
in exactly the way that you would expect.
link |
00:12:01.280
So, for example, all the numbers on the keypad, it turns out all have the same behavior with
link |
00:12:07.440
respect to you as a human being.
link |
00:12:09.040
And so you would naturally cluster them together and you discover that numbers are all related
link |
00:12:14.960
to one another in some way and all these other things.
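A sketch of that representation, again with an invented log: describe each button by the distribution of what gets pressed after it, and buttons that play the same role, like the keypad digits, end up with nearly identical distributions.

    import math
    from collections import Counter, defaultdict

    def successor_distributions(events):
        # Represent each button by the distribution of buttons pressed after it.
        counts = defaultdict(Counter)
        for prev, nxt in zip(events, events[1:]):
            counts[prev][nxt] += 1
        return {b: {k: v / sum(c.values()) for k, v in c.items()}
                for b, c in counts.items()}

    def cosine(p, q):
        # Cosine similarity between two sparse distributions.
        dot = sum(p[k] * q.get(k, 0.0) for k in p)
        norm = (math.sqrt(sum(v * v for v in p.values())) *
                math.sqrt(sum(v * v for v in q.values())))
        return dot / norm if norm else 0.0

    # Invented log: digits tend to be followed by "enter", arrows by "ok".
    log = ["1", "enter", "up", "ok", "4", "enter", "down", "ok", "2", "enter"]
    dists = successor_distributions(log)
    print(cosine(dists["1"], dists["4"]))   # 1.0: the digits behave identically
    print(cosine(dists["1"], dists["up"]))  # 0.0: a digit and an arrow do not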
link |
00:12:16.960
And then, and here's the part that I think is important.
link |
00:12:19.440
I mean, you can see this in all kinds of things.
link |
00:12:21.280
Every individual is different, but any given individual is remarkably predictable,
link |
00:12:28.320
because you keep doing the same things over and over again.
link |
00:12:30.560
And the two things that I've learned in the long time that I've been thinking about this
link |
00:12:34.080
is people are easily predictable and people hate when you tell them that they're easily
link |
00:12:38.320
predictable, but they are.
link |
00:12:40.320
And there you go.
link |
00:12:41.280
And what about, let me play devil's advocate and philosophically speaking,
link |
00:12:46.800
is it possible to say that what defines humans is the outlier?
link |
00:12:51.520
So, even though some large percentage of our behaviors, whatever the signal we measure is
link |
00:12:56.880
the same and it would cluster nicely, but maybe it's the special moments when we
link |
00:13:02.320
break out of the routine that are the definitive things, and the way we break out of that routine
link |
00:13:07.440
for each one of us might be different.
link |
00:13:09.200
It's possible.
link |
00:13:09.680
I would say it a little differently, I think.
link |
00:13:12.560
I would make two points.
link |
00:13:13.600
One is I'm going to disagree with the premise, I think.
link |
00:13:16.400
But that's fine.
link |
00:13:17.360
I think the way I would put it is there are people who are very different from lots of
link |
00:13:23.600
other people, but they're not 0%.
link |
00:13:25.760
They're closer to 10%, right?
link |
00:13:27.360
So, in fact, even if you do this kind of clustering of people,
link |
00:13:29.680
that'll turn out to be the small number of people, they all behave like each other,
link |
00:13:33.280
even if they individually behave very differently from everyone else.
link |
00:13:36.960
So, I think that's kind of important.
link |
00:13:38.560
But what you're really asking, I think, and I think this is really the question, is,
link |
00:13:41.680
what do you do when you're faced with the situation you've never seen before?
link |
00:13:46.400
What do you do when you're faced with an extraordinary situation maybe you've seen
link |
00:13:49.280
others do and you're actually forced to do something and you react to that very
link |
00:13:52.320
differently?
link |
00:13:52.640
And that is the thing that makes you human.
link |
00:13:53.920
I would agree with that, at least at a philosophical level, that it's the times
link |
00:13:58.560
when you are faced with something difficult, a decision that you have to make,
link |
00:14:04.080
where the answer isn't easy, even if you know what the right answer is,
link |
00:14:07.600
that's sort of what defines you as the individual.
link |
00:14:09.440
And I think what defines people broadly, it's the hard problem.
link |
00:14:12.480
It's not the easy problem.
link |
00:14:13.280
It's the thing that's going to hurt you.
link |
00:14:14.400
It's not the thing.
link |
00:14:16.160
It's not even that it's difficult.
link |
00:14:17.360
It's just that you know that the outcome is going to be highly suboptimal for you.
link |
00:14:22.160
And I do think that that's a reasonable place to start for the question of what makes us human.
link |
00:14:28.000
So, before we talk about sort of exploring the different ideas underlying interactive
link |
00:14:32.240
artificial intelligence, which you are working on, let me just go along this thread
link |
00:14:35.920
to skip to kind of our world of social media, which is something that,
link |
00:14:41.680
at least on the artificial intelligence side, you think about. There's a popular narrative.
link |
00:14:46.960
I don't know if it's true, but that we have these silos in social media and we have these
link |
00:14:54.160
clusterings as you're kind of mentioning.
link |
00:14:56.480
And, you know, along that narrative is the idea that we want to break
link |
00:15:04.240
each other out of those silos so we can be empathetic to other people.
link |
00:15:10.480
If you're a Democrat, you'd be empathetic to the Republican.
link |
00:15:14.000
If you're a Republican, empathetic to the Democrat.
link |
00:15:16.240
Those are just two silly bins that we seem to be very excited about.
link |
00:15:20.800
But there's other binnings that we can think about.
link |
00:15:24.080
Is there from an artificial intelligence perspective?
link |
00:15:27.920
Because you're just saying we cluster along the data, but then interactive artificial
link |
00:15:32.400
intelligence is referring to throwing agents into that mix, AI systems in that mix,
link |
00:15:38.480
helping us interacting with us humans and maybe getting us out of those silos.
link |
00:15:43.760
Is that something that you think is possible?
link |
00:15:46.640
Do you see a hopeful possibility for artificial intelligence systems in these large networks
link |
00:15:53.760
of people to get us outside of our habits in at least the idea space?
link |
00:16:00.000
To where we can sort of be empathetic to other people's lived experiences,
link |
00:16:07.760
other people's points of view, you know, all that kind of stuff?
link |
00:16:11.360
Yes, I actually don't think it's that hard.
link |
00:16:13.200
Well, it's not hard in this sense.
link |
00:16:15.200
So imagine that you can, let's make life simple for a minute.
link |
00:16:20.480
Let's assume that you can do a kind of partial ordering over ideas or clusterings of behavior.
link |
00:16:27.040
It doesn't even matter what I mean here.
link |
00:16:28.800
So long as there's some way that this is a cluster, this is a cluster,
link |
00:16:31.440
there's some edge between them, right?
link |
00:16:32.640
And this is kind of, they don't quite touch even or maybe they come very close.
link |
00:16:35.920
If you can imagine that conceptually, then the way you get from here to here is not by going
link |
00:16:40.480
from here to here. The way you get from here to here is you find the edge and you move slowly
link |
00:16:43.760
together, right? And I think that machines are actually very good at that sort of thing
link |
00:16:47.120
once we kind of define the problem, either in terms of behavior or ideas or words or whatever.
link |
00:16:51.280
So it's easy in the sense that if you already have the network and you know the relationships,
link |
00:16:55.680
you know, the edges and sort of the strengths on them, and you kind of have some semantic
link |
00:16:59.520
meaning for them, the machine doesn't have to, you do as the designer, then yeah,
link |
00:17:04.000
I think you can kind of move people along and sort of expand them.
link |
00:17:06.800
But it's harder than that. And the reason it's harder than that
link |
00:17:10.960
or sort of coming up with the network structure itself is hard is because I'm going to tell
link |
00:17:14.000
you a story that someone else told me, and I may get some of the details a little
link |
00:17:18.640
bit wrong, but it roughly goes like this. You take two sets of people
link |
00:17:23.280
from the same backgrounds and you want them to solve a problem. So you separate them up,
link |
00:17:28.880
which we do all the time, right? Oh, you know,
link |
00:17:31.200
we're going to break out into groups. You're going to go over there and you're going to talk about
link |
00:17:33.200
this. You're going to go over there and you're going to talk about this. And then you have them
link |
00:17:36.560
sort of in this big room, but far apart from one another and you have them sort of interact with
link |
00:17:39.600
one another. When they come back to talk about what they learned, you want to merge what they've
link |
00:17:44.080
done together, it can be extremely hard because they don't, they basically don't speak the same
link |
00:17:49.200
language anymore. Like when you create these problems and you dive into them, you create
link |
00:17:52.480
your own language. So the example this one person gave me, which I found kind of interesting because
link |
00:17:57.040
we were in the middle of that at the time, was they're sitting over there and they're talking
link |
00:18:01.440
about this room that you can see, but you're seeing it from different vantage points,
link |
00:18:05.280
depending upon what side of the room you're on. They can see a clock very easily. And so they
link |
00:18:10.000
start referring to the room as the one with the clock. This group over here looking at the same
link |
00:18:14.880
room, they can see the clock, but it's, you know, not in their line of sight or whatever. So they
link |
00:18:18.640
end up referring to it by some other way. When they get back together and they're talking about
link |
00:18:24.400
things, they're referring to the same room and they don't even realize they're referring to the
link |
00:18:28.080
same room. In fact, this group doesn't even see that there's a clock there and this group doesn't
link |
00:18:31.440
see whatever, the clock on the wall is the thing that stuck with me. So if you create these different
link |
00:18:35.120
silos, the problem isn't that the ideologies disagree. It's that you're using the same words
link |
00:18:40.800
and they mean radically different things. The hard part is just getting them to agree on the,
link |
00:18:45.600
well, maybe we'd say the axioms in our world, right? But, you know, just get them to agree on
link |
00:18:50.480
some basic definitions. Because right now, they're talking past each other, just completely
link |
00:18:55.440
talking past each other. That's the hard part. Getting them to meet, getting them to interact,
link |
00:18:59.760
that may not be that difficult. Getting them to see where their language is leading them to
link |
00:19:04.720
talk past one another. That's the hard part. It's a really interesting question to me. It could be
link |
00:19:09.360
on the layer of language, but it feels like there's multiple layers to this. Like it could be
link |
00:19:12.960
worldview. It could be, I mean, it all boils down to empathy, being able to put yourself in the shoes
link |
00:19:18.160
of the other person, to learn the language, to learn like visually how they see the world,
link |
00:19:24.320
to learn like the, I mean, I experienced this now with trolls, the degree of humor in that world.
link |
00:19:33.280
For example, I talk about love a lot. I'm very lucky to have this amazing community of loving
link |
00:19:39.600
people. But whenever I encounter trolls, they always roll their eyes at the idea of love because
link |
00:19:45.200
it's so quote unquote cringe. So they show love by like derision, I would say. And I think about,
link |
00:19:57.280
on the human level, that's a whole another discussion, that's psychology, that's sociology,
link |
00:20:01.040
so on. But I wonder if AI systems can help somehow and bridge the gap of what is this person's life
link |
00:20:11.520
like. Encourage me to just ask that question, to put myself in their shoes, to experience the
link |
00:20:18.160
agitations, the fears, the hopes they have, the, to experience, you know, the, even just to think
link |
00:20:25.840
about what their upbringing was like, like having a single parent home, or a shitty education, or
link |
00:20:34.080
all those kinds of things, just to put myself in that mind space. It feels like that's really
link |
00:20:39.680
important for us to bring those clusters together to find that similar language. But it's unclear
link |
00:20:46.560
how AI can help that, because it seems like AI systems need to understand both parties first.
link |
00:20:51.760
So, you know, the word understand there's doing a lot of work, right? So, do you have to understand
link |
00:20:56.800
it, or do you just simply have to note that there is something similar as a point to touch, right?
link |
00:21:03.200
So, you know, you use the word empathy, and I like that word for a lot of reasons, I think you're
link |
00:21:08.800
right in the way that you're using, in the way that you're describing, but let's separate it from
link |
00:21:11.760
sympathy, right? So, you know, sympathy is feeling sort of for someone, empathy is kind of understanding
link |
00:21:18.320
where they're coming from and how they feel, right? And for most people, those things go hand in
link |
00:21:23.600
hand. For some people, some are very good at empathy and very bad at sympathy. Some people
link |
00:21:28.720
cannot experience, well, my observation would be, I'm not a psychologist, my observation would be
link |
00:21:33.200
that some people seem incapable of feeling sympathy unless they feel empathy first.
link |
00:21:37.920
You can understand someone, understand where they're coming from and still think, no, I can't
link |
00:21:42.400
support that, right? It doesn't mean that the only way, because if that isn't the case, then what
link |
00:21:47.920
it requires is that to understand someone means you must
link |
00:21:54.480
agree with everything that they do, which isn't right, right? And if the only way I can feel for
link |
00:22:00.400
someone is to completely understand them and make them like me in some way, well, then we're lost,
link |
00:22:07.120
right? Because we're not all exactly like each other. I don't have to understand everything
link |
00:22:10.720
that you've gone through. It helps clearly, but they're separable ideas, right? Even though they
link |
00:22:14.560
get clearly, clearly tangled up in one another. So what I think AI could help you do actually is if,
link |
00:22:20.080
and I'm being quite fanciful as it were, but if you think of these as kind of, I understand how
link |
00:22:25.920
you interact, the words that you use, the actions you take, I have some way of doing this. Let's
link |
00:22:29.680
not worry about what that is, but I can see you as a kind of distribution of experiences and actions
link |
00:22:36.080
taken upon you, things you've done and so on. And I can do this with someone else and I can
link |
00:22:40.320
find the places where there's some kind of commonality, a mapping as it were, even if it's
link |
00:22:45.120
not total. If I think of this as distribution, right, then I can take the cosine of the angle
link |
00:22:50.400
between you and if it's zero, you've got nothing in common. If it's one, you're completely the
link |
00:22:54.960
same person, well, you're probably not one. You're almost certainly not zero. I can find the place
link |
00:23:01.280
where there's the overlap, then I might be able to introduce you on that basis or connect you in
link |
00:23:05.200
that way and make it easier for you to take that step of empathy. It's not impossible to do,
link |
00:23:14.400
although I wonder if it requires that everyone involved is at least interested in asking the
link |
00:23:20.320
question. So maybe the hard part is just getting them interested in asking the question. In fact,
link |
00:23:24.000
maybe if you can get them to ask the question, how are we more alike than we are different,
link |
00:23:27.200
they'll solve it themselves. Maybe that's the problem that AI should be working on,
link |
00:23:30.160
not telling you how you're similar or different, but just getting you to decide that it's
link |
00:23:34.320
worthwhile asking the question. That feels like an economist's answer, actually.
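A fanciful sketch of the cosine idea Charles describes, with invented categories and weights: treat two people as distributions over actions and experiences, score the angle between them, and surface the overlap as the place to introduce them.

    import math

    # Hypothetical people as distributions over experience categories.
    alice = {"jazz": 0.4, "ml_papers": 0.3, "hiking": 0.2, "chess": 0.1}
    bob = {"jazz": 0.1, "gaming": 0.5, "hiking": 0.3, "cooking": 0.1}

    def cosine(p, q):
        # Cosine of the angle between two sparse distributions:
        # 0 means nothing in common, 1 means the same person.
        dot = sum(p[k] * q.get(k, 0.0) for k in p)
        norm = (math.sqrt(sum(v * v for v in p.values())) *
                math.sqrt(sum(v * v for v in q.values())))
        return dot / norm if norm else 0.0

    def common_ground(p, q, top=3):
        # Shared categories, weighted by the smaller of the two interests.
        shared = {k: min(p[k], q[k]) for k in p.keys() & q.keys()}
        return sorted(shared, key=shared.get, reverse=True)[:top]

    print(round(cosine(alice, bob), 2))  # ~0.3: almost certainly not zero, probably not one
    print(common_ground(alice, bob))     # ['hiking', 'jazz']: where to introduce them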
link |
00:23:38.960
Well, first of all, people would disagree. So let me disagree slightly, which is, I think
link |
00:23:44.720
everything you said is brilliant, but I tend to believe philosophically speaking that people
link |
00:23:50.320
are interested underneath it all. And I would say that AI, the possibility that an AI system would
link |
00:23:58.640
show the commonality is incredible. That's a really good starting point. I would say
link |
00:24:03.440
that if, on social media, I could discover the common things, deep or shallow, between me
link |
00:24:10.720
and a person there's tension with, I think that my basic human nature would take over from
link |
00:24:17.920
there. And I think I'd enjoy that commonality. And like, there's something sticky about that that
link |
00:24:25.280
my mind will linger on. And that person in my mind will become like warmer and warmer. And
link |
00:24:30.880
like, I'll start to feel more and more compassionate towards them. I think for the majority of the
link |
00:24:35.520
population, that's true. But that might be, that's a hypothesis.
link |
00:24:39.440
Yeah. I mean, it's an empirical question, right? You'd have to figure it out. I mean,
link |
00:24:42.960
I want to believe you're right. And so I'm going to say that I think you're right.
link |
00:24:46.400
Of course, some people come to those things for the purpose of trolling, right? And it
link |
00:24:52.320
doesn't matter. They're playing a different game. But I don't know. My experience is,
link |
00:24:57.280
it requires two things. In fact, maybe this is really at the end what you're saying. And I do
link |
00:25:02.560
agree with this for sure. So it's hard to hold on to that kind of anger or to hold on to just a
link |
00:25:12.960
desire to humiliate someone for that long. It's just difficult to do. It takes a toll on you.
link |
00:25:17.600
But more importantly, we know this, both from people having done studies on it, but also from
link |
00:25:22.320
our own experiences, that it is much easier to be dismissive of a person if they're not in front
link |
00:25:27.680
of you, if they're not real, right? So much of the history of the world is about making people
link |
00:25:34.080
other, right? So if you're in social media, if you're on the web, if you're doing whatever in
link |
00:25:38.400
the internet, being forced to deal with someone as a person, some equivalent to being in the same
link |
00:25:44.640
room, makes a huge difference. Because then you're one, you're forced to deal with their humanity
link |
00:25:49.120
because it's in front of you. The other is, of course, that they might punch you in the face
link |
00:25:52.720
if you go too far. So both of those things kind of work together, I think, to the right end.
link |
00:25:57.200
So I think bringing people together is really a kind of substitute for forcing them to see the
link |
00:26:04.400
humanity in another person and to not be able to treat them as bits. It's hard to troll someone
link |
00:26:08.880
when you're looking them in the eye. This is very difficult to do.
link |
00:26:11.440
Agreed. Your broad set of research interests fall under interactive AI, as I mentioned,
link |
00:26:18.960
which is a fascinating set of ideas. And you have some concrete things that you're particularly
link |
00:26:24.400
interested in. But maybe could you talk about how you think about the field of interactive
link |
00:26:30.480
artificial intelligence? Sure. So let me say up front that if you look at, certainly my early
link |
00:26:35.440
work, but even if you look at most of it, I'm a machine learning guy, right? I do machine
link |
00:26:40.320
learning. First paper I ever published was at NIPS. Back then, it was NIPS. Now, it's NeurIPS.
link |
00:26:45.440
It's a long story there. Anyway, that's another thing. So I'm a machine learning guy. I believe
link |
00:26:49.600
in data. I believe in statistics and all those kinds of things. And the reason I'm bringing it up
link |
00:26:53.120
is even though I'm a newfangled statistical machine learning guy and have been for a very long time,
link |
00:26:57.920
the problem I really care about is AI. I care about artificial intelligence. I care about
link |
00:27:02.880
building some kind of intelligent artifact. However, that gets expressed, that would be
link |
00:27:08.480
at least as intelligent as humans and as interesting as humans, perhaps
link |
00:27:14.240
sort of in their own way. So that's the deep underlying love and dream, is the
link |
00:27:19.200
bigger AI. Yes. Whatever the heck that is. Yeah. The machine learning in some ways is a means to
link |
00:27:25.280
the end. It is not the end. And I don't understand how one could be intelligent without learning.
link |
00:27:30.000
So therefore, I got to figure out how to do that, right? So that's important. But machine learning,
link |
00:27:33.360
by the way, is also a tool. I said statistical because that's what most people think of themselves as.
link |
00:27:38.320
Machine learning people, that's how they think. Though I think Pat Langley might disagree, or at
link |
00:27:41.440
least 1980s Pat Langley might disagree, about what it takes to do machine learning. But I care about
link |
00:27:47.440
the AI problem, which is why it's interactive AI, not just interactive ML. I think it's important
link |
00:27:51.520
to understand that there's a long-term goal here, which I will probably never live to see. But I
link |
00:27:56.400
would love to have been a part of, which is building something truly intelligent
link |
00:28:01.200
outside of ourselves. Can we take a tiny tangent or am I interrupting? Which is,
link |
00:28:05.680
is there something you can say concrete about the mysterious gap between the subset ML and the
link |
00:28:14.400
bigger AI? What's missing? What do you think? I mean, obviously, it's totally unknown,
link |
00:28:20.960
not totally, but in part unknown at this time. But is it something like, with Pat Langley's work,
link |
00:28:25.520
is it knowledge, like expert system reasoning kind of thing? So AI is bigger than ML,
link |
00:28:31.520
but ML is bigger than AI. This is kind of the real problem here is that they're really overlapping
link |
00:28:36.560
things that are really interested in slightly different problems. I tend to think of ML, and
link |
00:28:40.400
there are many people out there who are going to be very upset at me about this, but I tend to think
link |
00:28:43.360
of ML being much more concerned with the engineering of solving a problem and AI about
link |
00:28:47.120
the sort of more philosophical goal of true intelligence. And that's the thing that motivates
link |
00:28:51.360
me, even if I end up finding myself living in this kind of engineeringish space. I've now made
link |
00:28:56.880
Michael Jordan upset. But to me, they just feel very different. You're just measuring them
link |
00:29:02.720
differently. You're sort of goals of where you're trying to be are somewhat different.
link |
00:29:07.200
But to me, AI is about trying to build that intelligent thing. And typically, but not always,
link |
00:29:13.680
for the purpose of understanding ourselves a little bit better. Machine learning is,
link |
00:29:17.520
I think, trying to solve the problem, whatever that problem is. Now, that's my take. Others,
link |
00:29:22.240
of course, would disagree. So on that note, with the interactive AI, do you tend to in your mind
link |
00:29:28.000
visualize AI as a singular system, or as a collective, a huge amount of systems interacting
link |
00:29:33.760
with each other? Is the social interaction of us humans and of AI systems the fundamental
link |
00:29:40.640
to intelligence? Well, it's certainly fundamental to our kind of intelligence,
link |
00:29:44.560
right? And I actually think it matters quite a bit. So the reason the interactive AI part matters
link |
00:29:49.840
to me is because I don't, this is going to sound simple, but I don't care whether a tree makes a
link |
00:29:58.960
sound when it falls and there's no one around because I don't think it matters, right? If there's
link |
00:30:03.920
no observer in some sense. And I think what's interesting about the way that we're intelligent
link |
00:30:08.480
is we're intelligent with other people, right, or other things anyway. And we go out of our way to
link |
00:30:13.760
make other things intelligent. We're hardwired to find intention, even where there is no
link |
00:30:18.640
intention, which is why we anthropomorphize everything. I think, anyway, the interactive
link |
00:30:23.360
AI part matters because being intelligent in and of myself, in isolation, is a meaningless act in some sense.
link |
00:30:30.320
The correct answer is you have to be intelligent in the way that you interact with others. It's also
link |
00:30:33.760
efficient because it allows you to learn faster because you can import from, you know, past history.
link |
00:30:39.600
It also allows you to be efficient in the transmission of that. So we ask ourselves about
link |
00:30:43.680
me. Am I intelligent? Clearly, I think so. But I'm also intelligent as a part of a larger
link |
00:30:49.760
species and group of people and we're trying to move the species forward as well. And so I think
link |
00:30:54.160
that notion of being intelligent with others is kind of the key thing because otherwise you,
link |
00:30:58.880
you come and you go and then it doesn't matter. And so that's why I care about that aspect of it.
link |
00:31:04.400
And it has lots of other implications. One is not just, you know, building something
link |
00:31:09.280
intelligent with others, but understanding that you can't always communicate with those others.
link |
00:31:13.040
They may have been in a room where there's a clock on the wall that you haven't seen, which means you
link |
00:31:16.720
have to spend an enormous amount of time communicating with one another constantly
link |
00:31:20.720
in order to figure out what the other, what each other wants, right? So, I mean, this is why people
link |
00:31:24.960
project, right? You project your own intentions and your own reasons for doing things on the others
link |
00:31:29.760
as a way of understanding them so that you know, you know how to behave. But by the way, you,
link |
00:31:34.560
completely predictable person, I don't know how you're predictable. I don't know you well enough,
link |
00:31:37.840
but you probably eat the same five things over and over again or whatever it is that you do, right?
link |
00:31:42.000
I know I do. If I'm going to a new Chinese restaurant, I will get General Tso's chicken
link |
00:31:45.760
because that's the thing that's easy. I will get hot and sour soup. You know, the people do the
link |
00:31:50.240
things that they do, but other people get the chicken and broccoli. I can push this analogy way
link |
00:31:54.720
too far. The chicken and broccoli. I don't know what's wrong with those people. I don't know what's
link |
00:31:58.160
wrong with them either. They, we have all had our trauma. So they get their chicken and broccoli
link |
00:32:03.600
and their egg drop soup or whatever. We got to communicate and it's going to change, right?
link |
00:32:08.400
So it's not, interactive AI is not just about learning to solve a problem or a task. It's about
link |
00:32:14.720
having to adapt that over time, over a very long period of time and interacting with other people
link |
00:32:19.520
who will themselves change. This is what we mean about things like adaptable models, right?
link |
00:32:22.880
You have to have a model, and that model's going to change. And by the way, it's not just the case
link |
00:32:26.400
that you're different from that person, but you're different from the person you were 15 minutes ago
link |
00:32:30.240
or certainly 15 years ago. And I have to assume that you're at least going to drift, hopefully not
link |
00:32:35.120
too many discontinuities, but you're going to, you're going to drift over time. And I have to
link |
00:32:38.640
have some mechanism for adapting to that, as you, an individual, over time and across individuals
link |
00:32:45.120
over time. On the topic of adaptive modeling, you talk about lifelong learning, which is,
link |
00:32:52.400
I think, a topic that's understudied, maybe because nobody knows what to do with it.
link |
00:32:59.360
But like, you know, if you look at Alexa or most of our artificial intelligence systems that are
link |
00:33:04.480
primarily machine learning based systems or dialogue systems, all those kinds of things,
link |
00:33:08.800
they know very little about you in the lifelong learning sense, in the sense that we learn,
link |
00:33:18.320
as humans, we learn a lot about each other, not in the quantity of facts, but in the
link |
00:33:25.920
temporally rich set of information. It seems like we pick up the crumbs along the way in a way that
link |
00:33:34.400
somehow seems to capture a person pretty well. Do you have any ideas how to do lifelong learning?
link |
00:33:42.000
Because it seems like most of the machine learning community does not.
link |
00:33:45.360
No. Well, by the way, not only does the machine learning community not,
link |
00:33:47.840
spend a lot of time on lifelong learning, I don't think they spend a lot of time on
link |
00:33:51.040
learning period in the sense that they tend to be very task focused. Everybody is overfitting to
link |
00:33:56.640
whatever problem it is they happen to have. They're overengineering their solutions to the task.
link |
00:34:01.360
Even the people, and I think these people do, are trying to solve a hard problem of
link |
00:34:05.040
transfer learning, right? I'm going to learn on one task and learn the other task.
link |
00:34:07.920
It's, you still end up creating the task. You know, it's like looking for your keys where the
link |
00:34:11.200
light is because that's where the light is, right? It's not because the keys have to be there.
link |
00:34:14.720
I mean, one could argue that we tend to do this in general. We tend to kind of do it as a group.
link |
00:34:20.400
We tend to hill climb and get stuck in local optima. And I think we do this in
link |
00:34:25.200
the small as well. I think it's very hard to do because, so, look, here's the hard thing about
link |
00:34:31.600
AI, right? The hard thing about AI is it keeps changing on us, right? You know, what is AI?
link |
00:34:35.280
AI is the, you know, the art and science of making computers act the way they do in the movies,
link |
00:34:39.280
right? That's what it is, right? That's a good definition.
link |
00:34:42.400
But beyond that, it's... And they keep coming up with new movies.
link |
00:34:45.120
Yes. Right, exactly. We are driven by this kind of need to capture the sort of ineffable quality of who we
link |
00:34:52.240
are, which means that the moment you understand something, it's no longer AI, right? Well, we
link |
00:34:57.360
understand this. That's just, you take the derivatives and you divide by two and then you
link |
00:35:00.800
average it out over time in the window. So, therefore, that's no longer AI.
link |
00:35:03.760
So, the problem is unsolvable because it keeps kind of going away. This creates a kind of
link |
00:35:07.520
illusion, which I don't think is an entire illusion, of either there's very simple task
link |
00:35:10.800
based things you can do very well and overengineer, there's all of AI, and there's like
link |
00:35:15.440
nothing in the middle. Like, it's very hard to get from here to here and it's very hard to see
link |
00:35:19.600
how to get from here to here. And I don't think that we've done a very good job of it because
link |
00:35:24.400
we get stuck trying to solve the small problems in front of us, myself included. I'm not going to
link |
00:35:28.400
pretend that I'm better at this than anyone else. And of course, all the incentives in academia
link |
00:35:34.240
and in industry are set to make that very hard because you have to get the next paper out.
link |
00:35:38.800
You have to get the next product out. You have to solve this problem and it's very sort of
link |
00:35:42.400
naturally incremental. And none of the incentives are set up to allow you to take a huge risk
link |
00:35:48.720
unless you're already so well established you can take that big risk. And if you're that well
link |
00:35:54.320
established that you can take that big risk, then you've probably spent much of your career
link |
00:35:57.760
taking these little risks, relatively speaking. And so, you have got a lifetime of experience
link |
00:36:02.640
telling you not to take that particular big risk, right? So, the whole system's set up to make
link |
00:36:06.160
progress very slow. That's fine. It's just the way it is. But it does make this gap seem really big,
link |
00:36:11.040
which is my long way of saying, I don't have a great answer to it, except that stop doing
link |
00:36:16.240
n equals one. At least try to get n equals two and maybe n equals seven so that you can say,
link |
00:36:21.600
I'm going to, or maybe t is a better variable here. I'm going to not just solve this problem,
link |
00:36:25.360
I'm going to solve this problem and another problem. I'm not going to learn just on you.
link |
00:36:28.160
It's going to keep living out there in the world and just seeing what happens, and then we'll learn
link |
00:36:32.080
something as designers and our machine learning algorithms and our AI algorithms can learn as
link |
00:36:36.800
well. But unless you're willing to build a system which is going to have to live for months at a
link |
00:36:41.360
time in an environment that is messy and chaotic, that you cannot control, then you're never going to
link |
00:36:47.040
make progress in that direction. So, I guess my answer to you is yes. My idea is that you should,
link |
00:36:51.440
it's not no, it's yes. You should be deploying these things and making them live for months at
link |
00:36:57.920
a time and be okay with the fact that it's going to take you five years to do this. Not rerunning
link |
00:37:02.480
the same experiment over and over again and refining the machine so it's slightly better
link |
00:37:06.400
at whatever, but actually having it out there and living in the chaos of the world and seeing what
link |
00:37:12.320
its learning algorithms can learn, what data structures it can build, and how it can go from
link |
00:37:16.800
there. Without that, you're going to be stuck where you are at the moment. What do you think about the
link |
00:37:21.360
possibility of n equals one growing? It's probably a crude approximation, but growing, like, if you
link |
00:37:28.240
look at language models like GPT-3, if you just make it big enough, it'll swallow the world,
link |
00:37:34.240
meaning like it'll solve all your t to infinity by just growing in size, taking the small
link |
00:37:42.320
overengineered solution and just pumping it full of steroids in terms of compute, in terms of
link |
00:37:48.240
size of training data, and the Yann LeCun style self-supervised, or OpenAI self-supervised,
link |
00:37:54.160
just throw all of YouTube at it and it will learn how to reason, how to paint, how to create music,
link |
00:38:03.280
how to love all of that by watching YouTube videos. I mean, I can't think of a more terrifying
link |
00:38:08.000
world to live in than a world that is based on YouTube videos, but yeah, I think the answer
link |
00:38:12.640
is that I just kind of don't think that'll quite work. Well, it won't work that easily. You will get
link |
00:38:18.400
somewhere and you will learn something, which means it's probably worth it, but you won't get
link |
00:38:22.800
there. You won't solve the problem. Here's the thing. We build these things and we say we want
link |
00:38:29.440
them to learn, but what actually happens, and let's say they do learn, I mean, certainly every
link |
00:38:34.160
paper I've gotten published, the things learn, I don't know about anyone else, but they actually
link |
00:38:38.320
change us. We react to it differently, so we keep redefining what it means to be successful,
link |
00:38:44.160
both in the negative case, but also in the positive, in that, oh, well, this is an accomplishment.
link |
00:38:50.960
I'll give you an example, which is like the one you just described with YouTube. Let's
link |
00:38:54.000
get completely out of machine learning. Well, not completely, but mostly out of machine learning.
link |
00:38:57.280
Think about Google. People were trying to solve information retrieval, the ad hoc information
link |
00:39:02.880
retrieval problem forever. I mean, first major book I ever read about it was, what, 71? I think
link |
00:39:08.880
it was when it came out. Anyway, we'll treat everything as a vector and we'll do these vector
link |
00:39:14.080
space models and whatever, and that was all great. We made very little progress. I mean,
link |
00:39:18.960
we made some progress. Then Google comes and makes the ad hoc problem seem pretty easy. I mean,
link |
00:39:24.640
it's not. There's lots of computers and databases involved, and there's some brilliant
link |
00:39:29.120
algorithmic stuff behind it, too, and some systems building. But the problem changed, right?
link |
00:39:37.520
If you've got a world that's that connected so that you have, you know,
link |
00:39:41.680
there are 10 million answers quite literally to the question that you're asking,
link |
00:39:46.560
then the problem wasn't, give me the things that are relevant. The problem is, don't give me anything
link |
00:39:51.440
that's irrelevant, at least in the first page, because nothing else matters. So Google is not
link |
00:39:57.760
solving the information retrieval problem, at least not on this web page. Google is minimizing
link |
00:40:04.000
false positives, which is not the same thing as getting an answer. It turns out it's good enough
link |
00:40:09.280
for what it is we want to use Google for, but it also changes what the problem was we thought we
link |
00:40:13.760
were trying to solve in the first place. You thought you were trying to find the answer,
link |
00:40:17.200
but you're not. It turns out you're just trying to find an
link |
00:40:20.400
answer. Now, yes, it is true. It's also very good at finding you exactly that web page. Of course,
link |
00:40:24.400
you trained yourself to figure out what the keywords were to get you that web page,
link |
00:40:28.960
but in the end, by having that much data, you've just changed the problem into something else.
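One way to make that shift concrete, with a toy corpus and invented relevance labels: the game Charles describes is precision at the top of the results, which can be perfect even while recall is negligible.

    def precision_at_k(ranked, relevant, k):
        # Fraction of the first k results that are relevant.
        return sum(1 for doc in ranked[:k] if doc in relevant) / k

    def recall(ranked, relevant):
        # Fraction of all relevant documents that were returned at all.
        return sum(1 for doc in ranked if doc in relevant) / len(relevant)

    relevant = set(range(10_000_000))  # ten million "right answers" out there
    results = list(range(10))          # first page: ten results, all relevant

    print(precision_at_k(results, relevant, 10))  # 1.0: no false positives up top
    print(recall(results, relevant))              # 1e-06: recall barely matters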
link |
00:40:33.680
You haven't actually learned what you set out to learn. Now, the counter to that would be,
link |
00:40:37.040
maybe we're not doing that either. We just think we are because, you know, we're in our own heads.
link |
00:40:41.360
Maybe we're learning the wrong problem in the first place, but I don't think that matters. I
link |
00:40:45.520
think the point is, is that Google has not solved information retrieval. Google has done amazing
link |
00:40:50.720
service. I have nothing bad to say about what they've done. Lord knows my entire life is better,
link |
00:40:54.000
because Google exists, in part for Google Maps. I don't think I'd have ever found this place.
link |
00:40:59.120
Where is this? Like, I'm on 95, I see 110, but where did 95 go? You know, so I'm very grateful
link |
00:41:06.720
for Google, but you know, they just have to make certain the first five things are right.
link |
00:41:11.040
And everything after that is wrong. Look, we're going off in a totally different
link |
00:41:14.560
topic here, but think about the way we hire faculty. Exactly the same thing.
link |
00:41:20.880
Now you're getting controversial. It's
link |
00:41:24.640
exactly the same problem, right? It's minimizing false positives. We say things like we want to
link |
00:41:31.600
find the best person to be an assistant professor at MIT in the new College of Computing, which I
link |
00:41:39.440
will point out was founded 30 years after the College of Computing I'm a part of. Both are my
link |
00:41:45.200
alma mater. Both are fighting words. I'm just saying I appreciate all that they did and all that
link |
00:41:50.480
they're doing. Anyway, so we're going to try to hire the best professor. That's what we say,
link |
00:41:58.320
the best person for this job. But that's not what we do at all, right? Do you know what
link |
00:42:02.880
percentage of faculty in the top four earn their PhDs from the top four? Say in 2017,
link |
00:42:10.240
which is the most recent year for which I have data.
link |
00:42:12.720
Maybe a large percentage. It's about 60%. 60%. 60% of the faculty in the top four earn their
link |
00:42:17.600
PhDs in top four. This is computer science for which there is no top five. There's only a top
link |
00:42:21.760
four, right? Because they're all tied for one. For people who don't know, by the way, that would be
link |
00:42:25.280
MIT, Stanford, Berkeley, CMU. Yep. Georgia Tech? Number eight. Number eight. You're keeping track.
link |
00:42:34.080
Oh, yes. It's a large part of my job. Number five is Illinois. Number six is a tie with
link |
00:42:39.280
UW and Cornell. Princeton and Georgia Tech are tied for eighth, and UT Austin is number 10.
link |
00:42:43.680
Michigan's number 11, by the way. So if you look at the top 10,
link |
00:42:48.880
you know what percentage of faculty in the top 10 earn their PhDs from the top 10?
link |
00:42:54.000
65, roughly. 65%. If you look at the top 55 ranked departments, 50% of the faculty earn
link |
00:43:02.240
their PhDs from the top 10. There is no universe in which all the best faculty, even just for R1
link |
00:43:10.720
universities, the majority of them come from 10 places. There's just no way that's true,
link |
00:43:15.360
especially when you consider how small some of those universities are in terms of the number
link |
00:43:18.880
of PhDs they produce. Yeah. Now, that's not a negative. I mean, it is a negative. It also has
link |
00:43:23.600
a habit of entrenching certain historical inequities and accidents. But what it tells you is, well,
link |
00:43:31.760
ask yourself the question. Why is it like that? Well, because it's easier.
link |
00:43:36.960
If we go all the way back to the 1980s, there was a saying that nobody ever lost his job
link |
00:43:43.840
buying a computer from IBM. It was true. Nobody ever lost their job hiring a PhD from MIT.
link |
00:43:51.680
If the person turned out to be terrible, well, they came from MIT. What did you expect me to
link |
00:43:55.520
know? However, that same person coming from pick whichever is your least favorite place that produces
link |
00:44:02.240
PhDs in, say, computer science, well, you took a risk. So all the incentives, particularly because
link |
00:44:08.000
you're only going to hire one this year, well, now we're hiring 10, but you're only going to hire one
link |
00:44:11.760
or two or three this year. And by the way, when they come in, you're stuck with them for at least
link |
00:44:15.200
seven years in most places because that's before you know whether you're getting tenure or not.
link |
00:44:18.480
And if they get tenure, stuck with them for 30 years unless they decide to leave. That means
link |
00:44:22.320
the pressure to get this right is very high. So what are you going to do? You're going to minimize
link |
00:44:26.160
false positives. You don't care about saying no inappropriately. You only care about saying yes
link |
00:44:32.080
inappropriately. So all the pressure drives you into that particular direction. Google,
link |
00:44:37.840
not to put too fine a point on it, was in exactly the same situation with their search.
link |
00:44:41.840
It turns out you just don't want to give people the wrong page in the first three or four pages.
link |
00:44:46.880
And if there's 10 million right answers and a hundred bazillion wrong answers,
link |
00:44:50.480
just make certain the wrong answers don't get up there. And who cares if, you know,
link |
00:44:53.120
the right answer was actually the 13th page. A right answer, a satisfying answer,
link |
00:44:58.320
is number one, two, three or four. So who cares? Or an answer that will make you discover something
link |
00:45:02.560
beautiful, profound to your question. Well, that's a different problem, right?
link |
00:45:06.800
But isn't that the problem? Can we linger on this topic for a moment?
link |
00:45:15.040
For hiring faculty, how do we get that 13th page with the truly special person?
link |
00:45:24.960
Like there's, I mean, it depends on the department. Computer science probably has
link |
00:45:29.920
those kinds of people, like you have the Russian guy, Grigori Perelman,
link |
00:45:36.480
like just these awkward, strange minds that don't know how to play the little game of etiquette that
link |
00:45:45.840
faculty have all agreed somehow, like converged over the decades, how to play with each other.
link |
00:45:50.480
And also is not, you know, on top of that is not from the top four, top whatever numbers,
link |
00:45:56.800
the schools. And maybe actually just says FU every once in a while to the
link |
00:46:03.200
traditions of old within the computer science community, maybe talks trash, saying machine
link |
00:46:08.480
learning is a total waste of time. And that's there on the resume. So like, how do you allow
link |
00:46:16.080
the system to give those folks a chance? Well, you have to be willing to take a certain kind of,
link |
00:46:20.880
without taking a particular position on any particular person, you'd have to take,
link |
00:46:24.000
you have to be willing to take risk, right? A small amount of, I mean, if we were treating this as a,
link |
00:46:29.440
well, as a machine learning problem, right, as a search problem, which is what it is,
link |
00:46:32.640
it's a search problem. If we were treating it that way, you would say, oh, well, the main thing is,
link |
00:46:36.080
you know, you've got a prior, you want some data because I'm Bayesian, if you don't want to do it
link |
00:46:40.560
that way, we'll just inject some randomness in and it'll be okay. The problem is that feels
link |
00:46:45.840
very, very hard to do with people. All the incentives are wrong there. But it turns out,
link |
00:46:53.040
and let's say, let's say that's the right answer, let's just give for the sake of argument that,
link |
00:46:56.720
you know, injecting randomness in the system at that level for who you hire is just not,
link |
00:47:00.480
not worth doing because the price is too high or the cost is too high. We had infinite resources,
link |
00:47:05.040
sure, but we don't. And also, you've got to teach people. So, you know, you're ruining other
link |
00:47:08.080
people's lives if you get it too wrong. But we've taken that principle, even if I grant it,
link |
00:47:14.240
and pushed it all the way back, right? So, we could have a better pool than we have of people
link |
00:47:22.720
we look at and give an opportunity to. If we do that, then we have a better chance of finding that.
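The "inject some randomness" idea above, and the pool-widening he lands on, both map onto a standard exploration trick. A toy sketch, with entirely hypothetical names and scores, of an epsilon-greedy shortlist: mostly take the top-scored candidate, occasionally pick uniformly from the whole pool.

```python
import random

def shortlist(candidates, score, n=3, epsilon=0.2, rng=random):
    """Pick n candidates: usually the best-scoring, sometimes uniform at random."""
    pool = list(candidates)
    chosen = []
    for _ in range(n):
        if rng.random() < epsilon:
            pick = rng.choice(pool)      # explore: uniform noise in the system
        else:
            pick = max(pool, key=score)  # exploit: the usual ranking
        chosen.append(pick)
        pool.remove(pick)
    return chosen

# Hypothetical pool: widening it raises the chance an unconventional
# candidate is ever seen at all, which is the point being made above.
scores = {"a": 3.0, "b": 2.5, "c": 2.0, "d": 1.5, "e": 1.0, "f": 0.5}
print(shortlist(scores, scores.get))
```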
link |
00:47:28.160
Of course, that just pushes the problem back, back another level. But let me tell you something
link |
00:47:31.920
else, you know, I did a sort of study, I call it a study, I called up my friends and asked
link |
00:47:36.160
them for all of their data for graduate admissions. But then someone else followed up and did an
link |
00:47:39.520
actual study. And it turns out that I can tell you how everybody gets into grad school, more or
link |
00:47:45.200
less, more or less. You basically admit everyone from places higher ranked than you. You admit
link |
00:47:50.960
most people from places ranked around you, and you admit almost no one from places ranked below
link |
00:47:54.160
you, with the exception of the small liberal arts colleges that aren't ranked at all, like Harvey
link |
00:47:57.840
Mudd, because they don't have a PhD program, so they aren't ranked. This is all CS. Which means the
link |
00:48:03.280
decision of whether, you know, you become a professor at Cornell was determined when you
link |
00:48:10.080
were 17, right? By where you knew to go to undergrad, to do whatever, right? So if we can push
link |
00:48:16.800
these things back a little bit and just make the pool a little bit bigger, at least you raise the
link |
00:48:20.000
probability that you will be able to see someone interesting and take the risk. The other answer
link |
00:48:26.000
to that question, by the way, which you could argue is the same: you either adjust the pool
link |
00:48:31.360
so the probabilities go up. That's a way of injecting a little bit of uniform noise in the
link |
00:48:36.800
system as it were. Or you change your loss function; you just let yourself be measured by
link |
00:48:41.920
something other than whatever it is that we're measuring ourselves by now. I mean, US News and
link |
00:48:47.840
World Report, every time they change their formula for determining rankings, they move entire universities
link |
00:48:53.120
to behave differently, because rankings matter. Can you talk trash about those rankings for a
link |
00:48:59.040
second? No, I'm joking about talking trash. I actually, it's so funny how from my perspective,
link |
00:49:05.280
from a very shallow perspective, how dogmatic, like how much I trust those rankings, they're
link |
00:49:11.600
almost ingrained in my head. I mean, at MIT, everybody kind of, it's a propagated,
link |
00:49:20.080
mutually agreed upon, like idea that those rankings matter. And I don't think anyone knows what
link |
00:49:26.160
they're like. Most people don't know what they're based on. And what are they exactly based on?
link |
00:49:32.880
And what are the flaws in that? Well, so it depends on which rankings you're talking about.
link |
00:49:38.000
Do you want to talk about computer science or are we going to talk about university?
link |
00:49:40.160
Computer science, US News is not the main one. The only one that matters is US News,
link |
00:49:45.280
nothing else matters. Sorry, CSRankings.org, but nothing else matters but US News. So,
link |
00:49:50.320
US News has formula that it uses for many things, but not for computer science because computer
link |
00:49:55.680
science is considered a science, which is absurd. So, the rankings for computer science is 100%
link |
00:50:02.960
reputation. So, two people at each department, it's not really department, whatever, at each
link |
00:50:10.560
department basically rank everybody, slightly more complicated than that, but whatever, they
link |
00:50:15.920
rank everyone. And then those things are put together and then somehow rankings come up.
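For concreteness, a toy sketch of what a pure-reputation ranking computes, not US News's actual procedure, which he notes is more complicated, with made-up departments, raters, and 1-to-5 scores:

```python
from statistics import mean

# Hypothetical peer-assessment scores (two raters per department in the
# real survey; the numbers here are invented).
ratings = {
    "Dept A": [5, 5, 4, 5],
    "Dept B": [4, 4, 4, 3],
    "Dept C": [3, 4, 2, 3],
}

# Average the opinions, then sort: "an objective measure of the opinions
# of everybody else," as he puts it later.
for rank, dept in enumerate(
        sorted(ratings, key=lambda d: mean(ratings[d]), reverse=True), start=1):
    print(rank, dept, round(mean(ratings[dept]), 2))
```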
link |
00:50:20.720
How do you improve reputation? How do you move up and down the space of reputation?
link |
00:50:25.600
Yes, that's exactly the question. Twitter? It can help. I can tell you how Georgia Tech did it,
link |
00:50:31.760
or at least how I think Georgia Tech did it, because Georgia Tech is actually the case to look
link |
00:50:35.680
at, not just because I'm at Georgia Tech, but because Georgia Tech is the only computing unit
link |
00:50:40.240
that was not in the top 20 that has made it into the top 10. It's also the only one in the last
link |
00:50:44.560
two decades, I think, that moved up in the top 10 as opposed to having someone else move down.
link |
00:50:53.040
So, we used to be number 10, and then we became number nine because UT Austin went down slightly
link |
00:50:58.320
and now we're tied for ninth because that's how rankings work. And we moved from nine to eight
link |
00:51:02.880
because our raw score moved up a point. So, Georgia, something about Georgia Tech, computer science,
link |
00:51:09.760
or computing, anyway. I think it's because we have shown leadership at every crisis level.
link |
00:51:16.880
So, we created a college, first public university to do it, second university to
link |
00:51:20.880
do it after CMU, which is number one. I also think it's no accident that CMU is the largest and we're
link |
00:51:26.400
depending upon how you count and depending on exactly where MIT ends up with its final
link |
00:51:30.080
college of computing, second or third largest. I don't think that's an accident. We've been
link |
00:51:33.120
doing this for a long time. But in the 2000s, when there was a crisis about undergraduate
link |
00:51:37.840
education, Georgia Tech took a big risk and succeeded at rethinking undergrad education
link |
00:51:43.600
and computing. I think we created these schools at a time when most public universities,
link |
00:51:49.120
in a way, were afraid to do it. We did the online masters and that mattered because
link |
00:51:54.880
people were trying to figure out what to do with MOOCs and so on. I think it's about
link |
00:51:58.240
being observed by your peers and having an impact. So, I mean, that is what reputation is, right?
link |
00:52:04.080
So, the way you move up in the reputation rankings is by doing something that makes people turn
link |
00:52:08.800
and look at you and say, that's good, they're better than I thought. Beyond that, it's just inertia
link |
00:52:14.880
and there's huge hysteresis in the system, right? I mean, there was these, I can't remember this,
link |
00:52:18.880
this is maybe apocryphal, but there's a major or department that like MIT was ranked number one in
link |
00:52:25.440
and they didn't have it, right? It's just about what you, I don't know if that's true, but someone
link |
00:52:29.520
said that to me anyway. But it's a thing, right? It's all about reputation. Of course, MIT is great
link |
00:52:35.360
because MIT is great. It's always been great. By the way, because MIT is great, the best students
link |
00:52:39.760
come, which keeps it being great. I mean, it's just a positive feedback loop. It's not surprising.
link |
00:52:45.520
I don't think it's wrong. Yeah, but it's almost like a narrative. It doesn't actually have to be
link |
00:52:50.560
backed by reality and it's not to say anything about MIT, but it does feel like we're playing
link |
00:52:59.120
in the space of narratives, not the space of something grounded. And one of the surprising
link |
00:53:04.640
things when I showed up at MIT and just all the students I've worked with and all the research
link |
00:53:10.720
I've done is they're the same people as I've met other places.
link |
00:53:18.400
I mean, what MIT has going for it? Well, MIT has many things going for it. One of the things MIT
link |
00:53:22.080
has going for it is a nice logo. It's a lot better than it was when I was here. Nice colors too.
link |
00:53:29.040
Terrible, terrible name for a mascot. But the thing that MIT has going for it is it really
link |
00:53:35.120
does get the best students. It just doesn't get all of the best students. There are many more
link |
00:53:39.520
best students out there, right? And the best students want to be here because it's the best place
link |
00:53:44.240
to be or one of the best places to be. And it just kind of, it's a sort of positive feedback.
link |
00:53:47.680
But you said something earlier, which I think is worth examining for a moment, right? And you said
link |
00:53:53.040
it's, I forget the exact word you said, we're living in the space of narrative as opposed to
link |
00:53:58.720
something objective. Narrative is objective. I mean, one could argue that the only thing that
link |
00:54:03.360
we do as humans is narrative. We just build stories to explain why we do this. Someone
link |
00:54:08.160
once asked me, but wait, there's nothing objective. No, it's completely an objective measure. It's
link |
00:54:12.960
an objective measure of the opinions of everybody else. Now, is that physics? I don't know. But,
link |
00:54:20.800
you know, what, I mean, tell me something you think is actually objective and measurable in a
link |
00:54:24.800
way that makes sense. Like cameras, they don't, do you know that, I mean, you're getting me off
link |
00:54:30.720
on something, but do you know that cameras, which are just reflecting light and putting them on film,
link |
00:54:37.040
like did not work for dark skinned people until like the 1970s? You know why? Because you were
link |
00:54:44.480
building cameras for the people who were going to buy cameras, who all, at least in the United
link |
00:54:48.560
States and Western Europe, were relatively light skinned. Turns out, took terrible pictures of
link |
00:54:53.600
people who look like me that got fixed with better film and whole processes. Do you know why?
link |
00:55:00.160
Because furniture manufacturers wanted to be able to take pictures of mahogany furniture,
link |
00:55:05.360
right? Because candy manufacturers wanted to be able to take pictures of chocolate. Now,
link |
00:55:12.400
the reason I bring that up is because you might think that cameras are objective. They're
link |
00:55:18.560
objective. They're just capturing light. No, they're made, they are doing the things that
link |
00:55:22.880
they are doing based upon decisions by real human beings to privilege, if I may use that word,
link |
00:55:28.480
some physics over others because it's an engineering problem. There are tradeoffs, right?
link |
00:55:32.160
So I can either worry about this part of the spectrum or this part of the spectrum. This
link |
00:55:36.000
costs more, that costs less, this costs the same, but I have more people paying money over here,
link |
00:55:39.760
right? And it turns out that, you know, if a conglomerate
link |
00:55:44.400
demands that you do something different and it's going to involve all kinds of money for you,
link |
00:55:47.600
suddenly the tradeoffs change, right? And so there you go. I actually don't know how I ended
link |
00:55:51.840
up there. Oh, it's because of this notion of objectiveness, right? So even the objective
link |
00:55:56.320
isn't objective because at the end you've got to tell a story, you've got to make decisions,
link |
00:55:59.200
you've got to make tradeoffs. What else is engineering other than that? So I think that
link |
00:56:03.120
the rankings capture something. They just don't necessarily capture what people assume they capture.
link |
00:56:10.480
You know, just to linger on this, this idea, why are there not more people who just, like, play
link |
00:56:19.920
with whatever that narrative is, have fun with it, have like excite the world,
link |
00:56:23.760
whether it's in the Carl Sagan style of like that calm, sexy voice of explaining the stars and all
link |
00:56:30.560
the romantic stuff, or the Elon Musk, dare I even say Donald Trump, where you're like trolling and
link |
00:56:36.320
shaking up the system and just saying controversial things. I talked to Lisa Feldman Barrett, who's
link |
00:56:43.280
a neuroscientist who just enjoys playing the controversy, finds the counterintuitive ideas
link |
00:56:51.600
in the particular science and throws them out there and sees how they play in the public discourse.
link |
00:56:57.520
Like why don't we see more of that? And why doesn't academia attract an Elon Musk type?
link |
00:57:03.520
Well, tenure is a powerful thing that allows you to do whatever you want. But getting tenure
link |
00:57:09.840
typically requires you to be relatively narrow, right? Because people are judging you. Well,
link |
00:57:14.400
I think the answer is we have told ourselves a story, a narrative that that is vulgar,
link |
00:57:22.000
which we just described as vulgar. It's certainly unscientific, right? And it is easy to convince
link |
00:57:30.880
yourself that in some ways, you're the mathematician, right? The fewer there are in your major,
link |
00:57:37.920
the more that proves your purity, right? So once you tell yourself that story,
link |
00:57:44.080
then it is beneath you to do that kind of thing, right? I think that's wrong. I think that... And
link |
00:57:52.160
by the way, everyone doesn't have to do this. Everyone's not good at it. And not everyone,
link |
00:57:54.320
even if they would be good at it, would enjoy it. So it's fine. But I do think you need some diversity
link |
00:57:59.680
in the way that people choose to relate to the world as academics, because I think the great
link |
00:58:06.880
universities are ones that engage with the rest of the world. It is a home for public intellectuals.
link |
00:58:14.800
And in 2020, being a public intellectual probably means being on Twitter. Whereas,
link |
00:58:20.880
of course, that wasn't true 20 years ago, because Twitter wasn't around 20 years ago,
link |
00:58:25.120
and if it was, it wasn't around in a meaningful way. I don't actually know how long Twitter's
link |
00:58:27.840
been around. As I get older, I find that my notion of time has gotten worse and worse,
link |
00:58:32.960
like, has Google really been around that long? Anyway, the point is that I think that we sometimes
link |
00:58:40.880
forget that a part of our job is to impact the people who aren't in the world that we're in,
link |
00:58:46.240
and that that's the point of being at a great place and being a great person, frankly.
link |
00:58:50.160
There's an interesting force in terms of public intellectuals. If we forget Twitter,
link |
00:58:54.240
we can look at just online courses that are public facing in some part. There is a kind of
link |
00:59:01.040
force that pulls you back. Let me just call it out because I don't give a damn at this point.
link |
00:59:09.600
There's a little bit of all of us have this, but certainly faculty have this,
link |
00:59:13.040
which is jealousy. It's whoever is popular at being a good communicator, exciting the world
link |
00:59:21.200
with their science. And of course, when you excite the world with the science, it's not
link |
00:59:27.840
peer reviewed, clean. It all sounds like bullshit. It's like a TED talk. And people roll their eyes
link |
00:59:36.960
and they hate that a TED talk gets millions of views or something like that. And then everybody
link |
00:59:42.080
pulls each other back. There's this force that's just kind of, it's hard to stand out unless you
link |
00:59:47.200
like win a Nobel Prize or whatever. It's only when you get senior enough, we just stop giving a damn.
link |
00:59:54.720
But just like you said, even when you get tenure, that was always the surprising thing to me.
link |
01:00:00.240
I have many colleagues and friends who have gotten tenure, but there's not a switch.
link |
01:00:09.040
There's not an FU money switch where you're like, you know what, now I'm going to be more bold.
link |
01:00:15.120
It doesn't, I don't see it. Well, there's a reason for that. Tenure isn't a test. It's a training
link |
01:00:20.160
process. It teaches you to behave in a certain way, to think in a certain way, to accept certain
link |
01:00:25.520
values and to react accordingly. And the better you are at that, the more likely you are to earn
link |
01:00:30.160
tenure. And by the way, this is not a bad thing. Most things are like that. And I think most of
link |
01:00:35.680
my colleagues are interested in doing great work and they're just having impact in the way that
link |
01:00:39.600
they want to have impact. I do think that as a field, not just as a field, as a profession,
link |
01:00:46.720
but we have a habit of belittling those who are popular, as it were, as if the word itself is a
link |
01:00:57.040
kind of scarlet A, right? I think it's easy to convince yourself and no one is immune to this
link |
01:01:07.680
that the people who are better known are better known for bad reasons. The people who are out
link |
01:01:13.040
there dumbing it down are not being pure to whatever the values and ethos is for your field.
link |
01:01:20.640
And it's just very easy to do. Now, having said that, I think that ultimately, people who are
link |
01:01:27.200
able to be popular and out there and are touching the world and making a difference,
link |
01:01:33.680
our colleagues do, in fact, appreciate that in the long run. It's just you have to be very good
link |
01:01:38.880
at it or you have to be very interested in pursuing it. And once you get past a certain level,
link |
01:01:42.800
I think people accept that for who it is. I mean, I don't know. I'd be really interested
link |
01:01:46.560
in how Rod Brooks felt about how people were interacting with him when he did Fast, Cheap,
link |
01:01:51.280
and Out of Control way, way, way back when.
link |
01:01:55.120
What's Fast, Cheap and Out of Control?
link |
01:01:56.240
It was a documentary that involved four people. I remember nothing about it other than Rod Brooks
link |
01:02:01.440
was in it and something about naked mole rats. Can't remember what the other two things were.
link |
01:02:06.320
It was robots, naked mole rats, and then two others.
link |
01:02:08.400
By the way, Rod Brooks used to be the head of the Artificial Intelligence Laboratory at
link |
01:02:12.480
MIT and then launched, I think, iRobot and then Think Robotics, Rethink Robotics.
link |
01:02:21.120
Think is in the word and also is a little bit of a rock star personality in the AI world,
link |
01:02:28.160
a very opinionated, very intelligent. Anyway, sorry, mole rats and naked.
link |
01:02:32.480
Naked mole rats. Also, he was one of my two advisors for my PhD.
link |
01:02:36.960
This explains a lot.
link |
01:02:37.840
I love Rod. But I also love my other advisor, Paul. Paul, if you're listening,
link |
01:02:44.240
I love you too. Both very, very different people.
link |
01:02:46.000
Paul Viola.
link |
01:02:46.560
Paul Viola. Both very interesting people, very different in many ways.
link |
01:02:50.720
But I don't know what Rod would say to you about what the reaction was. I know that for the
link |
01:02:56.000
students at the time, because I was a student at the time, it was amazing, right? This guy was in
link |
01:03:01.520
a movie being very much himself. Actually, the movie version of him is a little bit more Rod
link |
01:03:08.400
than Rod. I mean, I think they edited it appropriately for him. But it was very much Rod.
link |
01:03:13.840
And he did all this while doing great work, to me. Was he running the AI Lab at that point or
link |
01:03:17.680
not? I don't know. But anyway, he was running the AI Lab or would be soon. He's a giant in the field.
link |
01:03:21.840
He did amazing things, made a lot of his bones by doing the kind of counterintuitive science,
link |
01:03:27.440
right? And saying, no, you're doing this all wrong. Representation is crazy.
link |
01:03:30.640
The world is its own representation. You just react to it. These are amazing things.
link |
01:03:34.160
And continues to do those sorts of things as he's moved on. I have, I think he might tell you,
link |
01:03:40.720
I don't know if he would tell you it was good or bad, but I know that for everyone else out
link |
01:03:44.720
there in the world, it was a good thing. And certainly, he continued to be respected. So
link |
01:03:48.320
it's not as if it destroyed his career by being popular.
link |
01:03:51.920
All right. Let's go into a topic where I'm on thin ice, because I grew up in the Soviet Union,
link |
01:03:57.520
Russia, so my knowledge of music, this American thing you guys do, is quite foreign to me. So your
link |
01:04:08.320
research group is called, as we've talked about, the Lab for Interactive Artificial Intelligence.
link |
01:04:14.480
But also, there's just a bunch of mystery around this. My research fails me. Also called PFUNC.
link |
01:04:20.240
Yep. P stands for probabilistic. And what does FUNC stand for?
link |
01:04:27.760
So a lot of my life is about making acronyms. So if I have one quirk, it's that people will say
link |
01:04:32.400
words and I see if they make acronyms. And if they do, then I'm happy. And then if they don't,
link |
01:04:37.520
I try to change it so that they make acronyms. It's just a thing that I do. So PFUNC is an
link |
01:04:42.320
acronym. It has three or four different meanings. But finally, I decided that the P stands for
link |
01:04:47.680
probabilistic because at the end of the day, it's machine learning and it's randomness and
link |
01:04:51.440
it's uncertainty, which is the important thing here. And the FUNC can be lots of different things,
link |
01:04:56.400
but I decided I should leave it up to the individual to figure out exactly what it is.
link |
01:05:00.480
But I will tell you that when my students graduate, when they get out, as we say at Tech,
link |
01:05:07.200
I hand them, they put on a hat and star glasses and a medallion from the P-Funk era. And we take
link |
01:05:16.560
a picture and I hand them a pair of fuzzy dice, which they get to keep. So there's a sense to it,
link |
01:05:24.160
which is not an acronym, like literally funk. You have a dark, mysterious past.
link |
01:05:32.880
Oh, it's not dark. It's just fun. It's hip hop and funk. So can you educate
link |
01:05:40.880
a Soviet born Russian about this thing called hip hop? Like if you were to give me, you know,
link |
01:05:49.360
if we went on a journey together and you were trying to educate me about, especially the past
link |
01:05:56.480
couple of decades, the 90s, about hip hop or funk, what records or artists would you
link |
01:06:01.440
introduce me to? Would you tell me about or maybe what influenced you in your journey
link |
01:06:12.160
or what you just love? Like when the family's gone and you just sit back and just blast some stuff
link |
01:06:17.840
these days, what do you listen to? Well, so I listen to a lot, but I will tell you, well,
link |
01:06:22.400
first off, all great music was made when I was 14. And that statement is true for all people,
link |
01:06:26.400
no matter how old they are or where they live. But for me, the first thing that's
link |
01:06:30.720
worth pointing out is that hip hop and rap aren't the same thing. So depending on who you talk
link |
01:06:34.480
to about this, and there are people who feel very strongly about this, much more strongly.
link |
01:06:38.480
You're offending everybody in this conversation. So this is great. Let's keep going.
link |
01:06:42.240
Hip hop is a culture. It's a whole set of things of which rap is a part. So tagging is a part of
link |
01:06:47.600
hip hop. I don't know why that's true, but people tell me it's true and I'm willing to go along
link |
01:06:50.720
with it because they get very angry about it. Tagging is like graffiti.
link |
01:06:55.680
And there's all these, including the popping and the locking and all the dancing and all those
link |
01:06:58.640
things. That's all a part of hip hop. It's a way of life, which I think is true. And then there's
link |
01:07:03.280
rap, which is this particular. It's the music part. Yes. A music part. I mean, you wouldn't
link |
01:07:09.360
call the stuff that DJs do the scratching. That's not rap, right? But it's a part of hip hop, right?
link |
01:07:14.160
So given that we understand that hip hop is this whole thing, what are the rap albums that
link |
01:07:19.200
best touched that for me? Well, if I were going to educate you, I would try to figure out what
link |
01:07:22.640
you'd like. And then I would work you there. Skinner. Oh my God. I would probably start with
link |
01:07:32.160
a supplement. There's a fascinating exercise one can do by
link |
01:07:37.360
watching old episodes of I love the 70s. I love the 80s. I love the 90s with a bunch of friends
link |
01:07:44.080
and just see where people come in and out of pop culture. So if you're talking about
link |
01:07:48.480
those people, then I would actually start you with where I would hope to start you with anyway,
link |
01:07:53.760
which is Public Enemy. Particularly It Takes a Nation of Millions to Hold Us Back, which is
link |
01:07:59.200
clearly the best album ever produced and certainly the best hip hop album ever produced, in part,
link |
01:08:05.200
because it was so much of what was great about the time. Fantastic lyrics. Excuse me. It's all
link |
01:08:10.000
about the lyrics. Amazing music that was coming from Rick Rubin, who was the producer of that.
link |
01:08:15.200
And he did a lot of very kind of heavy metal-ish stuff, at least in the 80s sense at the time.
link |
01:08:20.640
And it was focused on politics in the 1980s, which was what made hip hop so great then.
link |
01:08:27.360
I would start you there, then I would move you up through things that have been happening more
link |
01:08:31.600
recently. I'd probably get you to someone like a Mos Def. I would give you a history lesson,
link |
01:08:35.520
basically. Mos Def is amazing. He hosted a poetry jam thing on HBO or something like that.
link |
01:08:39.920
Probably. I don't think I've seen it, but I wouldn't be surprised. Yeah. Spoken poetry.
link |
01:08:43.280
Spoken poetry. He's amazing. He's amazing. And then I would, after I got you there,
link |
01:08:48.800
I'd work you back to EPMD. And eventually, I would take you back to The Last Poets,
link |
01:08:55.840
and particularly the first album, The Last Poets, which was 1970, to give you a sense of history
link |
01:09:01.280
and that it actually has been building up over a very, very long time. So we would start there,
link |
01:09:06.720
because that's where your music aligns. And then we would cycle out and I'd move you to the present
link |
01:09:10.880
and then I'd take you back to the past. Because I think a large part of people who are kind of
link |
01:09:14.720
confused about any kind of music, the truth is this is the same thing we've always been talking
link |
01:09:19.280
about. It's about narrative and being a part of something and being immersed in something,
link |
01:09:22.800
so you understand it, right? Jazz, which I also like, is one of the things that's cool about
link |
01:09:29.440
jazz is that you come and you meet someone who's talking to you about jazz and you have no idea
link |
01:09:32.480
what they're talking about. And then one day it all clicks and you've been so immersed in it,
link |
01:09:36.480
you go, oh yeah, that's a Charlie Parker. You start using words that nobody else
link |
01:09:39.760
understands, right? And it becomes a part of hip hop's the same way. Everything's the same way.
link |
01:09:43.520
They're all cultural artifacts. But I would help you to see that there's a history of it and how
link |
01:09:47.120
it connects to other genres of music that you might like to bring you in so that you could kind
link |
01:09:52.400
of see how it connects to what you already like, including some of the good work that's been done
link |
01:09:58.480
with fusions of hip hop and bluegrass. Oh no. Yes. Some of it's even good. Not all of it,
link |
01:10:08.000
but some of it is good. But I'd start you with It Takes a Nation of Millions to Hold Us Back.
link |
01:10:12.320
There's an interesting tradition in, like, more modern hip hop of integrating almost like classic
link |
01:10:18.800
rock songs or whatever, like integrating into their music, into the beat, into the whatever.
link |
01:10:25.440
It's kind of interesting. It gives a whole new, not just classic rock, but what is it,
link |
01:10:30.960
the Kanye Gold Digger, the old R&B? Taking and pulling old R&B, right?
link |
01:10:37.920
Well, that's been true since the beginning. I mean, in fact, that's in some ways,
link |
01:10:41.120
that's why the DJ used to get top billing, because it was the DJ that brought all the records together
link |
01:10:47.600
and made it worth so that people could dance. If you go back, you go back to those days,
link |
01:10:51.520
mostly in New York, though not exclusively, but mostly in New York where it sort of came out of,
link |
01:10:56.240
as the DJ that brought all the music together in the beats and showed that basically
link |
01:10:59.840
music is itself an instrument, very meta. You can bring it together and then you sort of wrap
link |
01:11:03.920
over it and so on. It moved that way. That's going way, way back. Now, in the period of time
link |
01:11:09.360
where I grew up, when I became really into it, which was most of the 80s, it was more funk,
link |
01:11:16.160
was the back for a lot of the stuff, public enemy at that time notwithstanding,
link |
01:11:22.160
which is very nice because it tied into what my parents listened to and what I vaguely remember
link |
01:11:26.080
listening to when I was very small. And by the way, complete revival of George Clinton and Parliament
link |
01:11:32.480
and Funkadelic and all of those things to bring it sort of back into the 80s and into the 90s.
link |
01:11:37.120
And as we go on, you're going to see the last decade and the decade before that being brought in.
link |
01:11:42.320
And when you don't think that you're hearing something you've heard, it's probably because
link |
01:11:45.600
it's being sampled by someone who, referring to something they remembered when they were young,
link |
01:11:53.920
perhaps from somewhere else altogether. And you just didn't realize what it was because it wasn't
link |
01:11:58.960
a popular song where you happened to grow up. So this stuff's been going on for a long time.
link |
01:12:02.080
It's one of the things that I think is beautiful. Run DMC, Jam Master Jay used to play, he played
link |
01:12:08.480
piano. He would record himself playing piano and then sample that to make it a part of what was
link |
01:12:13.200
going on rather than play the piano. That's how his mind thinks. Well, it's pieces. You're
link |
01:12:17.600
putting pieces together. You're putting pieces of music together to create new music, right?
link |
01:12:20.480
Now, that doesn't mean that The Roots aren't doing their own thing. Yeah. Those are, that's a whole.
link |
01:12:25.920
Yeah. But still, it's the right attitude. And what else is jazz, right? Jazz is about
link |
01:12:32.160
putting pieces together and then putting your own spin on. It's all the same. It's all the same
link |
01:12:35.200
thing. It's all the same thing. Because you mentioned lyrics. It does make me sad. Again,
link |
01:12:40.240
this is me talking trash about modern hip hop, which I haven't invested in. Again, I'm sure people will correct
link |
01:12:46.000
me that there's a lot of great artists. That's part of the reason I'm saying it is they'll
link |
01:12:49.520
leave it in the comments that you should listen to this person. It's that the lyrics went away from
link |
01:12:57.280
talking about maybe not just politics, but life and so on. Like, you know, the kind of like
link |
01:13:02.880
protest songs, even if you look at like a Bob Marley or you said public enemy or rage against
link |
01:13:07.600
the machine more on the rock side. That's the place where we go to those lyrics. Like classic
link |
01:13:14.880
rock is all about like my woman left me, or I'm really happy that she's still with me
link |
01:13:22.080
or the flip side. It's like love songs of different kinds. It's all love, but it's less
link |
01:13:26.240
political, like less interesting, I would say, in terms of like deep, profound knowledge. And
link |
01:13:32.080
it seems like rap is the place where you would find that. And it's sad that for the most part,
link |
01:13:38.000
what I see, like you look at like mumble rap or whatever, they're moving away from lyrics and
link |
01:13:42.480
more towards the beat and the musicality of it. I've always been a fan of the lyrics. In fact,
link |
01:13:47.040
if you go back in, you read my reviews, which I recently was reading. Man, I wrote my last review
link |
01:13:54.560
the month I graduated. I got my PhD, which says something about something. I'm not sure what
link |
01:13:59.200
though. I always would, I don't always, but I often would start with it's all about the lyrics.
link |
01:14:03.040
For me, it's all about the lyrics. Someone has already written in the comments before I've
link |
01:14:08.480
even finished having this conversation that, you know, neither of us knows what we're talking about.
link |
01:14:12.000
And it's all in the underground hip hop. And here's who you should go listen to.
link |
01:14:15.520
And that is true. Every time I despair for popular rap, someone points me to or I discover
link |
01:14:22.640
some underground hip hop song and I'm made happy and whole again. So I know it's out there.
link |
01:14:28.400
I don't listen to as much as I used to because I'm listening to podcasts and old music from
link |
01:14:33.120
the 1980s. Kind of rap. No beat. It's a kind of, no, no beat at all. But you know, there's a little
link |
01:14:37.440
bit of sampling here and there, I'm sure. By the way, James Brown is funk or no? Yes.
link |
01:14:43.040
And so is Junior Wells, by the way. Who's that? Oh, Junior Wells, Chicago Blues.
link |
01:14:48.240
He was James Brown before James Brown was. It's hard to imagine somebody being James Brown.
link |
01:14:52.640
Go look up Hoodoo Man Blues, Junior Wells. And just listen to Snatch It Back and Hold It.
link |
01:15:01.200
And you'll see it. And they were contemporaries. Where do you put like Little Richard or all
link |
01:15:05.360
that kind of stuff like Ray Charles, like when they get like, hit the road, Jack,
link |
01:15:11.040
and don't you come back? Isn't that like, there's a funkiness in it?
link |
01:15:13.920
Oh, that's definitely a funkiness in it. I mean, it's all, I mean, it's all, it's all a line.
link |
01:15:18.080
I mean, it's all, there's all a line that carries it all together. You know, it's,
link |
01:15:22.320
I guess I have an answer question. I'm thinking about it in 2020 or I'm thinking about it in 1960.
link |
01:15:27.520
I'd probably give a different answer. I'm just thinking in terms of, you know, that was rock.
link |
01:15:31.040
But when you look back on it, it's, it was funky, but we didn't use those words. Or maybe we did.
link |
01:15:36.800
I wasn't around. But you know, I don't think we used the word funk in 1960. Certainly not the way
link |
01:15:41.280
we used it in the 70s and the 80s. Do you reject disco? I do not reject disco. I appreciate all the
link |
01:15:46.640
mistakes that we have made to get there. Actually, some of the disco is actually really, really good.
link |
01:15:51.040
John Travolta. Oh boy. He regrets it probably. Maybe not. Well, like it's the mistakes thing.
link |
01:15:56.720
Yes. And it got him to where he's going, where he is. Oh, well, thank you for taking that detour.
link |
01:16:03.760
You've, you've talked about computing where we've already talked about computing a little bit,
link |
01:16:08.320
but can you try to describe how you think about the world of computing and where it fits into the
link |
01:16:14.320
set of different disciplines? We mentioned the College of Computing. What should people,
link |
01:16:20.160
how should they think about computing, especially from an educational perspective
link |
01:16:24.080
of like, what is the perfect curriculum that defines for a young mind what computing is?
link |
01:16:32.640
So I don't know about a perfect curriculum, although that's an important question. Because
link |
01:16:35.440
at the end of the day, without the curriculum, you don't get anywhere. Curriculum, to me,
link |
01:16:39.680
is the fundamental data structure. It's not even the classroom. The world is, right? I,
link |
01:16:45.680
so I think the curriculum is where I like to play. So I spent a lot of time thinking about this.
link |
01:16:50.240
But I will tell you, I'll answer your question by answering a slightly different question first
link |
01:16:53.040
then getting back to this, which is, you know, you talked about disciplines and what does it mean to,
link |
01:16:56.880
to be a discipline. The truth is what we really educate people in from the beginning,
link |
01:17:03.040
but certainly through college, you've sort of failed if you don't think about it this way,
link |
01:17:06.640
I think, is the world, people often think about tools and tool sets. And when you're really trying
link |
01:17:14.000
to be good, you think about skills and skill sets. But disciplines are about mindsets, right?
link |
01:17:18.640
They're about fundamental ways of thinking, not just the, the, the hammer that you pick up whatever
link |
01:17:24.160
that is to hit the nail, not just the, the skill of learning how to hammer well or whatever. It's
link |
01:17:29.840
the mindset of like, what's the fundamental way to think about, to think about the world, right?
link |
01:17:35.920
And disciplines, different disciplines give you different mindsets to give you different ways
link |
01:17:40.240
of sort of thinking through. So with that in mind, I think that computing, even ask the question,
link |
01:17:45.440
whether it's a discipline, is you have to decide, does it have a mindset? Does it have a way of
link |
01:17:48.720
thinking about the world that is different from, you know, the scientist who is doing discovery
link |
01:17:53.280
and using the scientific method as a way of doing it, or the mathematician who builds abstractions
link |
01:17:57.280
and tries to find sort of steady state truths about the abstractions that may be artificial,
link |
01:18:02.400
but whatever. Or is it the engineer who's all about, you know, building demonstrably superior
link |
01:18:07.760
technology with respect to some notion of tradeoffs, whatever that means, right? That's sort of the
link |
01:18:12.320
world that you live in. What is computing? You know, how is computing different? So I've thought
link |
01:18:17.040
about this for a long time, and I've come to a view about what computing actually is, what the
link |
01:18:22.320
mindset is. And it's, you know, it's a little abstract, but that would be appropriate for computing.
link |
01:18:27.200
I think that what distinguishes the computationalist from others is that he or she understands
link |
01:18:35.360
that models, languages, and machines are equivalent. They're the same thing. And because
link |
01:18:42.640
it's not just a model, but it's a machine that is an executable thing that can be described
link |
01:18:47.440
as a language, that means that it's dynamic. So it's not the, it is mathematical in some sense,
link |
01:18:54.000
in the kind of sense of abstraction, but it is fundamentally dynamic and executable. The
link |
01:18:58.320
mathematician is not necessarily worried about either the dynamic part. In fact, whenever I
link |
01:19:03.120
tried to write something for mathematicians, they invariably demand that I make it static.
link |
01:19:08.080
And that's not a bad thing. It's just, it's a way of viewing the world that truth is a thing,
link |
01:19:12.160
right? It's not a process that continually runs, right? So that dynamic thing matters,
link |
01:19:17.920
that self reflection of the system itself matters. And that is what computing brought us. So it is
link |
01:19:23.600
a science because the models fundamentally represent truths in the world. Information is
link |
01:19:28.960
a scientific thing to discover, right? Not just a mathematical conceit that gets created. But,
link |
01:19:33.920
of course, it's engineering because you're actually dealing with constraints in the world
link |
01:19:37.520
and trying to execute machines that actually run. But it's also math because you're actually
link |
01:19:43.680
worrying about these languages that describe what's happening. But the fact that regular
link |
01:19:51.840
expressions and finite state automata, one of which feels like a machine or at least an abstraction
link |
01:19:56.960
of a machine, and the other a language, are actually the equivalent thing. I mean, that is
link |
01:20:00.160
not a small thing. And it permeates everything that we do, even when we're just trying to figure
link |
01:20:04.320
out how to do debugging. So that idea, I think, is fundamental. And we would do better if we made
link |
01:20:10.560
that more explicit. How my life has changed in my thinking about this in the 10 or 15 years,
link |
01:20:17.200
it's been since I tried to put that to paper with some colleagues, is the realization which
link |
01:20:23.360
comes to a question you actually asked me earlier, which has to do with trees falling down and whether
link |
01:20:29.200
it matters, is that this sort of triangle of equivalence only matters because there's a person inside
link |
01:20:36.720
the triangle, right? That what's changed about computing, computer science or whatever you want
link |
01:20:43.360
to call it, is we now have so much data and so much computational power. We're able to do really,
link |
01:20:49.760
really interesting, promising things. But the interesting and the promising kind of only
link |
01:20:55.440
matters with respect to human beings and their relationship to it. So the triangle exists,
link |
01:20:59.760
that is fundamentally computing. What makes it worthwhile and interesting and potentially
link |
01:21:05.360
world-changing, species-changing, is that there are human beings inside of it and intelligence that has
link |
01:21:11.040
to interact with it to change the data, the information that makes sense and gives meaning to
link |
01:21:16.080
the models, the languages and the machines. So if the curriculum can
link |
01:21:21.600
convey that while conveying the tools and the skills that you need in order to succeed,
link |
01:21:26.640
then it is a big win. That's what I think you have to do.
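One concrete instance of the models–languages–machines equivalence he invokes: a regular expression (a language) and a two-state automaton (a machine) that accept exactly the same strings, here binary strings with an even number of 1s. A minimal sketch of my own, not anything from the conversation itself:

```python
import re

# The language: a regular expression for binary strings with an even number of 1s.
EVEN_ONES = re.compile(r"^(0*10*1)*0*$")

# The machine: a two-state DFA whose state is the parity of 1s seen so far.
def dfa_even_ones(s):
    state = "even"
    for ch in s:
        if ch == "1":
            state = "odd" if state == "even" else "even"
    return state == "even"

# The two descriptions agree on every input: same language, different form.
for s in ["", "0", "1", "11", "101", "1001", "111"]:
    assert bool(EVEN_ONES.match(s)) == dfa_even_ones(s)
```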
link |
01:21:30.960
Do you pull psychology, these human things into that, into the idea, into this framework of computing?
link |
01:21:38.960
Do you pull in psychology, neuroscience, like parts of psychology, parts of neuroscience,
link |
01:21:43.120
parts of sociology? What about philosophy, like studies of human nature from different
link |
01:21:48.480
perspectives? Absolutely. And by the way, it works both ways. So let's take biology for a
link |
01:21:52.800
moment. It turns out a cell is basically a bunch of if then statements. If you look at it the right
link |
01:21:56.800
way, which is nice because I understand if then statements. I never really enjoyed biology,
link |
01:22:01.680
but I do understand if then statements. And if you tell the biologists that and they begin to
link |
01:22:05.360
understand that, it actually helps them to think about a bunch of really cool things.
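In that spirit, a playful sketch of "a cell is basically a bunch of if-then statements": condition–action rules updating a toy cell state. The signals, thresholds, and rules are invented for illustration, not real biology.

```python
def cell_step(state):
    """One update of a toy cell: each rule fires iff its condition holds."""
    new = dict(state)
    if state["glucose"] < 0.2 and state["lactose"] > 0.5:
        new["lac_operon"] = True    # switch on lactose metabolism
    if state["damage"] > 0.8:
        new["apoptosis"] = True     # trigger programmed cell death
    if state["growth_signal"] and not new.get("apoptosis", False):
        new["dividing"] = True
    return new

print(cell_step({"glucose": 0.1, "lactose": 0.9,
                 "damage": 0.0, "growth_signal": True}))
```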
link |
01:22:09.520
There'll still be biology involved, but whatever. On the other hand, the fact
link |
01:22:13.440
that biology is, in fact, a bunch of, the cell is a bunch of if-then statements or whatever
link |
01:22:18.080
allows the computationalists to think differently about the language and the way that we, well,
link |
01:22:22.080
certainly the way we would do AI and machine learning, but it's just even the way that we
link |
01:22:25.120
think about, we think about computation. So the important thing to me is, as you know,
link |
01:22:30.240
my engineering colleagues who are not in computer science worry about computer science eating up
link |
01:22:34.160
engineering in colleges where computer science sits. It's not a worry. You shouldn't
link |
01:22:40.560
worry about that at all. Computing, computer science, whatever you call it, is central,
link |
01:22:44.800
but it's not the most important thing in the world. It's not more important. It is just key
link |
01:22:48.960
to helping others do other cool things they're going to do. You're not going to be a historian
link |
01:22:53.840
in 2030. You're not going to get a PhD in history without understanding some data science and
link |
01:22:57.120
computing because the way you're going to get history done in part, and I say done, the way
link |
01:23:01.840
you're going to get it done is you're going to look at data and you're going to let,
link |
01:23:05.120
you're going to have a system that's going to help you to analyze things to help you to think
link |
01:23:08.000
about a better way to describe history and to understand what's going on and what it tells us
link |
01:23:12.000
about where we might be going. The same is true for psychology, same true for all of these things.
link |
01:23:15.920
The reason I brought that up is because the philosopher has a lot to say about computing.
link |
01:23:20.000
The psychologist has a lot to say about the way humans interact with computing, right?
link |
01:23:24.640
And certainly a lot about intelligence, which for me, ultimately is kind of the goal of building
link |
01:23:30.960
these computational devices is to build something intelligent.
link |
01:23:33.440
Do you think computing will eat everything in some certain sense or almost like disappear
link |
01:23:38.480
because it's part of everything? It's so funny you say this. I want to say it's
link |
01:23:41.680
going to metastasize, but there's kind of two ways that fields destroy themselves.
link |
01:23:46.560
One is they become super narrow, and I think we can think of fields that might be that way.
link |
01:23:53.360
They become pure, and we have that instinct. We have that impulse. I'm sure you can think of
link |
01:23:58.000
several people who want computer science to be this pure thing. The other way is you become
link |
01:24:03.360
everywhere and you become everything and nothing. And so everyone says, I'm going to teach Fortran
link |
01:24:08.880
for engineers or whatever. I'm going to do this. And then you lose the thing that makes it worth
link |
01:24:13.200
studying in and of itself. The thing about computing, and this is not unique to computing,
link |
01:24:18.240
though at this point in time it is distinctive about computing where we happen to be in 2020,
link |
01:24:23.520
is we are both a thriving major. In fact, the thriving major almost every place.
link |
01:24:30.480
And we're a service unit because people need to know the things we need to know. And our job,
link |
01:24:37.280
much as the mathematician's job is to help this person over here to think like a mathematician
link |
01:24:41.760
much the way the point of you taking chemistry as a freshman is not to learn chemistry. It's to
link |
01:24:47.120
learn to think like a scientist. Our job is to help them to think like a computationalist.
link |
01:24:51.840
And we have to take both of those things very seriously. And I'm not sure that
link |
01:24:55.920
as a field, we have historically certainly taken the second thing that our job is to
link |
01:25:00.160
help them to think a certain way. People who aren't going to be in our major, I don't think we've
link |
01:25:03.120
taken that very seriously at all. I don't know if you know who Dan Carlin is. He has this podcast
link |
01:25:08.320
called Hardcore History. Yes. I've just did an amazing four hour conversation with him,
link |
01:25:14.240
mostly about Hitler. But I bring him up because he talks about this idea that it's possible that
link |
01:25:21.360
history as a field will become like currently most people study history a little bit, kind of
link |
01:25:30.240
are aware of it. We have a conversation about it, different parts of it. I mean, there's a lot of
link |
01:25:34.640
criticism to say that some parts of history are being ignored, blah, blah, blah, blah, so on.
link |
01:25:38.640
But most people are able to have a curiosity and able to learn it. His thought is it's possible
link |
01:25:47.200
given the way social media works, the current way we communicate, that history becomes a niche
link |
01:25:52.960
field that literally most people just ignore because everything is happening so fast that
link |
01:25:59.440
the history starts losing its meaning. And then it starts being a thing that only,
link |
01:26:05.360
you know, like the theoretical computer science part of computer science, it becomes a niche
link |
01:26:10.240
thing that only, like, the rare scholars hold: the world wars and, you know, all the history,
link |
01:26:17.520
the founding of the United States, all those kinds of things, the civil wars. And it's a kind of
link |
01:26:23.920
profound thing to think about how we can lose track, how we can lose these fields where it's
link |
01:26:32.080
best, like in the case of history, for that to be a pervasive thing that everybody
link |
01:26:37.760
learns and thinks about and so on. And I would say computing is quite obviously similar to history
link |
01:26:45.520
in the sense that it seems like it should be a part of everybody's life to some degree,
link |
01:26:51.360
especially like as we move into the later parts of the 21st century. And it's not obvious that
link |
01:26:57.520
that's the way it'll go. It might be in the hands of the few still, depending, if it's machine
link |
01:27:03.920
learning, you know, it's unclear that computing will win out. It's currently very
link |
01:27:09.760
successful, but it's not, I would say that's something, I mean, you're at the leadership
link |
01:27:14.560
level of this, you're defining the future. So it's in your hands. No pressure. But like,
link |
01:27:19.840
it feels like there's multiple ways this can go. And there's this kind of conversation of everybody
link |
01:27:25.840
should learn to code, right? The changing nature of jobs and so on. Do you have a sense of what
link |
01:27:34.960
your role in education of computing is here? Like, what's the hopeful path forward?
link |
01:27:42.800
There's a lot there. I will say that, well, first off, it would be an absolute shame
link |
01:27:47.360
if no one studied history. On the other hand, as T approaches infinity, the amount of history
link |
01:27:52.160
is presumably also growing at least linearly. And so you have to forget more and more
link |
01:27:59.440
of history. But history needs to always be there. I mean, I can imagine a world where,
link |
01:28:02.960
you know, if you think of your brains as being outside of your head, that you can kind of learn
link |
01:28:07.360
the history you need to know when you need to know it, that seems fanciful. But it's a,
link |
01:28:11.280
it's a kind of way of, you know, asking: is there a sufficient statistic of history? No, there
link |
01:28:16.400
certainly isn't, but there may be for the particular thing you have to care about. But, you know,
link |
01:28:19.600
those who do not remember the past... It's like our objective camera discussion, right?
link |
01:28:23.040
Yeah. Right. And, you know, we've already lost lots of history. And of course,
link |
01:28:26.320
you have your own history, some of which will be, or is already, lost to you, right? You
link |
01:28:29.920
don't even remember whatever it was you were doing 17 years ago. All the ex-girlfriends.
link |
01:28:34.400
Yeah. Gone. Exactly. So, you know, history is being lost anyway, but the big lessons of history
link |
01:28:40.960
shouldn't be. And I think, you know, to take it to the question of computing and sort of
link |
01:28:45.440
education, the point is you have to get across those lessons. You have to get across the way
link |
01:28:49.120
of thinking. And you have to be able to go back and, you know, you don't want to lose the data,
link |
01:28:54.480
even if, you know, you don't necessarily have the information at your fingertips.
link |
01:28:57.680
With computing, I think it's somewhat different. Everyone doesn't have to learn how to code,
link |
01:29:02.640
but everyone needs to learn how to think in the way that you can be precise. And I mean,
link |
01:29:07.680
precise in the sense of repeatable, not, you know, precision in the sense of resolution, in the
link |
01:29:13.600
sense of getting the right number of bits. Precise in saying what it is you want the machine to do,
link |
01:29:19.360
and being able to describe a problem in such a way that it is executable, which we are not,
link |
01:29:25.840
human beings are not very good at that. In fact, I think we spend much of our time talking back
link |
01:29:29.600
and forth just to kind of vaguely understand what the other person means and hope we get
link |
01:29:33.200
it good enough that we can act accordingly. You can't do that with machines, at least not yet.
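A tiny Python sketch of what that kind of precision looks like; the names and the task are made up, and a vague request like "sort these nicely" leaves all of these decisions implicit:

```python
# "Sort these names nicely" is not executable; the version below is.
# A minimal sketch: the data is made up, and every assumption
# (ordering, case handling, tie-breaking) is made explicit.
names = ["alice", "Bob", "carol", "Ada"]

# Explicit choices: ascending order, compare case-insensitively,
# keep the original order for ties (sorted() is stable).
result = sorted(names, key=str.lower)
print(result)  # ['Ada', 'alice', 'Bob', 'carol']
```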
link |
01:29:38.240
And so, you know, having to think that precisely about things is quite important. And that's
link |
01:29:45.520
somewhat different from coding. Coding is a crude means to an end. On the other hand,
link |
01:29:52.000
the idea of coding, what that means, that it's a programming language and it has these sort of
link |
01:29:56.960
things that you fiddle with in these ways that you express, that is an incredibly important
link |
01:30:00.720
point. In fact, I would argue that one of the big holes in machine learning right now in AI is
link |
01:30:05.120
that we forget that we are basically doing software engineering. We forget that we are using programming.
link |
01:30:13.440
We're using languages to express what we're doing. We get just all caught up in the deep
link |
01:30:16.720
network or we get all caught up in whatever that we forget that, you know, we're making decisions
link |
01:30:22.640
based upon a set of parameters that we made up. And if we did slightly different parameters,
link |
01:30:26.880
we'd have completely different outcomes. And so, the lesson of computing, computer science
link |
01:30:31.040
education is to be able to think like that and to be aware of it when you're doing it.
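A tiny Python sketch of that point, with a made-up decision threshold standing in for the parameters: the same scores, slightly different parameter values, completely different outcomes.

```python
# A minimal sketch with hypothetical numbers: the threshold is an
# assumption we chose, not a fact about the data.
scores = [0.48, 0.51, 0.55, 0.62]  # made-up model outputs

def decide(scores, threshold):
    # Surfacing the assumption: the cutoff is explicit and visible.
    return [s >= threshold for s in scores]

print(decide(scores, threshold=0.50))  # [False, True, True, True]
print(decide(scores, threshold=0.52))  # [False, False, True, True]
```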
link |
01:30:36.640
Basically, at the end of the day, it's a way of surfacing your assumptions.
link |
01:30:41.200
I mean, we call them parameters or, you know, we call them if-then statements or whatever,
link |
01:30:45.760
but you're forced to surface those assumptions. That's the key. The key thing that you should
link |
01:30:50.320
get out of a computing education, that and that the models and languages and the machines are
link |
01:30:53.520
equivalent. But it actually follows from that that you have to be explicit about what it is
link |
01:30:59.040
you're trying to do because the model you're building is something you will one day run.
link |
01:31:04.160
So, you better get it right or at least understand it and be able to express roughly what you want
link |
01:31:08.960
to express. So, I think it is key that we figure out how to educate everyone to think that way
link |
01:31:18.960
because at the end, it would not only make them better at whatever it is that they
link |
01:31:23.520
are doing and I emphasize doing, it'll also make them better citizens. It'll help them to
link |
01:31:30.080
understand what others are doing to them so that they can react accordingly because you're not
link |
01:31:35.680
going to solve the problem of social media insofar as you think of social media as a problem
link |
01:31:40.720
by just making slightly better code, right? It only works if people react to it appropriately
link |
01:31:47.520
and know what's happening and therefore take control over what they're doing. I mean, that's
link |
01:31:53.520
my take on it. Okay. Let me try to proceed awkwardly into the topic of race. Okay. One is because
link |
01:32:02.160
it's a fascinating part of your story and you're just eloquent and fun about it and then the second
link |
01:32:06.720
is because we're living through a pretty tense time in terms of race, tensions and discussions
link |
01:32:14.080
and ideas in this time in America. You grew up in Atlanta, though you weren't born in Atlanta. In some
link |
01:32:22.960
southern state somewhere, in Tennessee, something like that? Tennessee. Nice. Okay. But early on,
link |
01:32:28.720
you moved, and you basically identify as an Atlanta native. Yeah. And you've mentioned that
link |
01:32:38.800
you grew up in a predominantly black neighborhood. By the way: black, African American, person of color?
link |
01:32:44.880
I prefer black. Black. With a capital B. With a capital B. And the other letters are?
link |
01:32:50.720
The rest of them. Okay. So, the predominantly black neighborhood. And so you almost didn't see
link |
01:32:58.880
race. Maybe you can correct me on that. And then in the video you talked about when you
link |
01:33:04.720
showed up to Georgia Tech for your undergrad, you were one of the only black folks there. And
link |
01:33:12.000
that was like, oh, that was a new experience. So can you take me from just a human perspective,
link |
01:33:19.280
but also from a race perspective, your journey growing up in Atlanta and then showing up at
link |
01:33:24.000
Georgia Tech. Okay. And by the way, that story continues through MIT as well. In fact, it was
link |
01:33:29.200
quite a bit more stark at MIT and Boston. So maybe just a quick pause. Georgia Tech was undergrad.
link |
01:33:36.240
MIT was graduate school. And I went directly to grad school from undergrad. So I had no distractions
link |
01:33:42.960
in between my bachelor's and my master's and PhD. You didn't go on a backpacking trip in Europe?
link |
01:33:47.760
Didn't do any of that. In fact, I literally went to IBM for three months, got in a car,
link |
01:33:52.640
and drove straight to Boston, or Cambridge, with my mother. Yeah. Moved into an apartment I'd
link |
01:33:57.760
never seen, over by the Royal East. Anyway, that's another story. So let me tell you a little bit
link |
01:34:03.360
about this. You miss MIT? Oh, I loved MIT. I don't miss Boston at all, but I loved MIT.
link |
01:34:09.120
And that was fighting war. So let's back up to this. So as you said, I was born in Chattanooga,
link |
01:34:14.000
Tennessee. My earliest memory is arriving in Atlanta in a moving truck at the age of three
link |
01:34:17.840
and a half. So I think of myself as being from Atlanta. I have a very distinct memory of that.
link |
01:34:21.360
So I grew up in Atlanta. It's the only place I ever knew as a kid. I loved it. Like much
link |
01:34:26.800
of the country and certainly much of Atlanta in the 70s and 80s, it was deeply highly segregated,
link |
01:34:32.960
though not in a way that I think was obvious to you unless you were looking at it or were old enough
link |
01:34:38.240
to have noticed it. But you could divide up Atlanta and Atlanta is hardly unique in this way
link |
01:34:42.480
by highway and you could get race and class that way. So I grew up not only in a predominantly black
link |
01:34:48.160
area to say the very least, I grew up on the poor side of that. But I was very much aware of race
link |
01:34:56.400
for a bunch of reasons, one that people made certain that I was, my family did, but also that
link |
01:35:01.440
it would come up. So in first grade, I had a girlfriend. I say I had a girlfriend. I didn't
link |
01:35:08.320
have a girlfriend. I wasn't even entirely sure what girls were in the first grade. But I do remember
link |
01:35:13.200
she decided I was her boyfriend, a little white girl named Heather. And we had a long discussion
link |
01:35:17.680
about how it was okay for us to be boyfriend and girlfriend, despite the fact that she was white
link |
01:35:21.440
and I was black. Between the two of you? Between the two. Did your parents know about this? Yes,
link |
01:35:26.880
but being a girlfriend and boyfriend in first grade just basically meant that you spent slightly
link |
01:35:31.520
more time together during recess. It had no, I think we Eskimo kissed once. It doesn't mean,
link |
01:35:37.120
it didn't mean anything. It was at the time, it felt very scandalous because everyone was watching.
link |
01:35:41.120
I was like, ah, my life is, now my life has changed in first grade. No one told me elementary
link |
01:35:45.520
school would be like this. Did you write poetry or? Not in first grade. That would come later.
link |
01:35:49.520
Okay. That would come during puberty when I wrote lots and lots of poetry. Anyway, so I was aware
link |
01:35:55.840
of it. I didn't think too much about it, but I was aware of it. But I was surrounded by people like me. It wasn't
link |
01:36:00.800
that I wasn't aware of race. It's that I wasn't aware that I was a minority. It's different. And
link |
01:36:07.120
it's because I wasn't. As far as my world was concerned, I mean, I'm six years old, five years
link |
01:36:11.360
old in first grade. The world is the seven people I see every day. So it didn't feel that way at all.
link |
01:36:16.560
And by the way, this being Atlanta, home of the civil rights movement and all the rest,
link |
01:36:21.840
it meant that when I looked at TV, which back then one did because there were only three,
link |
01:36:25.600
four or five channels, right? And I saw the news, which my mother might make me watch,
link |
01:36:30.720
Monica Kaufman was on TV telling me the news, and they were all black, and the mayor was black
link |
01:36:37.200
and had always been black. And so it just never occurred to me. When I went to Georgia Tech,
link |
01:36:42.080
I remember the first day walking across campus from West Campus to East Campus and realizing
link |
01:36:48.000
along the way that of the hundreds and hundreds and hundreds of students that I was seeing,
link |
01:36:52.240
I was the only black one. That was enlightening and very off-putting because it occurred to me.
link |
01:36:59.440
And then of course, it continued that way for, well, for the rest of my, for much of the rest
link |
01:37:04.480
of my career at Georgia Tech. Of course, I found lots of other students and I met people because
link |
01:37:08.400
in Atlanta, you're either black or you're white, there was nothing else. So I began to meet students
link |
01:37:13.520
of Asian descent and I met students who we would call Hispanic and so on and so forth. And, you
link |
01:37:17.680
know, so my world, this is what college is supposed to do, right? It's supposed to open you up to
link |
01:37:21.120
people and it did. But it was a very strange thing to be in the minority. When I came to Boston,
link |
01:37:30.640
I will tell you a story. I applied to one place as an undergrad, Georgia Tech, because I was
link |
01:37:37.600
stupid. I didn't know any better. I didn't know any better, right? No one told me.
link |
01:37:42.400
When I went to grad school, I applied to three places, Georgia Tech, because that's where I was,
link |
01:37:46.320
MIT and CMU. I got into MIT and I got into CMU, but I had a friend who went to CMU. And so I
link |
01:37:58.000
asked him what he thought about it. He spent his time explaining to me about Pittsburgh,
link |
01:38:01.680
much less about CMU, but more about Pittsburgh, about which I developed a strong opinion based upon
link |
01:38:06.720
his strong opinion, something about the sun coming out two days out of the year. And I
link |
01:38:11.120
didn't get a chance to go there because the timing was wrong. I think it was because the timing was
link |
01:38:15.040
wrong. At MIT, I asked 20 people I knew, either when I visited or I had already known for a variety
link |
01:38:24.000
of reasons, whether they liked Boston. And 10 of them loved it and 10 of them hated it. The 10 who
link |
01:38:29.840
loved it were all white. The 10 who hated it were all black. And they explained to me very much why
link |
01:38:35.040
that was the case. Both sets told me why. And the stories were remarkably the same for the two
link |
01:38:41.280
clusters. And I came up here and I could see it immediately, why people would love it and why
link |
01:38:46.800
people would not. And the people who loved it would tell you about the nice coffee shops. Well, when it wasn't coffee shops,
link |
01:38:52.000
it was used CD places. But yeah, it was that kind of a thing. Nice shops. Oh, there's all these
link |
01:38:57.040
students here. Harvard Square is beautiful. You can do all these things and you can walk
link |
01:39:00.800
and something about the outdoors, which I was a bit interested in. The outdoors is for the bugs.
link |
01:39:04.480
It's not for humans. That should be a t-shirt. Yeah, I mean, it's the way I feel about it.
link |
01:39:12.400
And the black folk told me completely different stories about which part of town
link |
01:39:15.920
you did not want to be caught in after dark. And I heard it all, but that was nothing new.
link |
01:39:22.000
So I decided that MIT was a great place to be as a university. And I believed it then,
link |
01:39:27.600
I believe it now. And that whatever it is I wanted to do, I thought I knew what I wanted to do,
link |
01:39:32.160
but what if I was wrong? Someone there would know how to do it. Of course, then I would pick the
link |
01:39:36.960
one topic that nobody was working on at the time, but that's okay. It was great. And so I thought
link |
01:39:42.320
that I would be fine, and I'd only be there for like four or five years, I told myself,
link |
01:39:46.480
which turned out not to be true at all. But I enjoyed my time. I enjoyed my time there.
link |
01:39:50.320
But I did see a lot of, I ran across a lot of things that were driven by what I look like.
link |
01:39:57.680
While I was here, I got asked a lot of questions. I ran into a lot of cops.
link |
01:40:04.800
I saw a lot of the city, but at the time, I mean, I haven't been here in a long time. These are
link |
01:40:08.640
the things that I remember. So this is 1990. There was not a single black radio station.
link |
01:40:16.160
Now, this was 1990. I don't know if there are any radio stations anymore. I'm sure
link |
01:40:19.920
there are, but you know, I don't listen to the radio anymore and almost no one does.
link |
01:40:23.040
At least if you're under a certain age. But the idea that you could be in a major
link |
01:40:28.000
metropolitan area and there wasn't a single black radio station, by which I mean a radio
link |
01:40:31.120
station that played what we would call black music then, was absurd, but it somehow captured kind of
link |
01:40:38.080
everything about the city. I grew up in Atlanta and, you know, you've heard me tell you about
link |
01:40:42.800
Atlanta. Boston had no economically viable or socially cohesive black middle class.
link |
01:40:50.320
Insofar as it existed, it was uniformly distributed throughout large parts, not all
link |
01:40:56.240
parts, but large parts of the city. You know, where you had concentrations
link |
01:41:00.240
of black Bostonians, they tended to be poor. It was very different from where I grew up.
link |
01:41:06.000
I grew up on the poor side of town, sure. But then in high school, well, in ninth grade,
link |
01:41:10.880
we didn't have middle school. I went to an eighth grade school where there was a lot of,
link |
01:41:15.680
let's just say we had a riot the year that I was there. There was at least one major fight
link |
01:41:19.040
every week. It was an amazing, amazing experience. But when I went to ninth grade, I
link |
01:41:25.760
went to an academy, a math and science academy at Mays High. It was a public school. It was a
link |
01:41:31.920
magnet school. That's why I was able to go there. It was the first school, high school,
link |
01:41:36.240
I think in the state of Georgia, to sweep the state math and science fairs. It was great.
link |
01:41:40.560
It had 385 students, all but four of whom were black. I went to school with the daughter of
link |
01:41:50.400
the former mayor of Atlanta, Michael Jackson's cousin. I mean, you know, it was an
link |
01:41:55.760
upper-middle-class school. Dropping names. You know, I just drop names occasionally. You know,
link |
01:42:00.640
drop the mic, drop some names. Just to let you know, I used to hang out with Michael Jackson's
link |
01:42:03.600
cousin, twelfth cousin, nine times removed, I don't know. The point is, we had a parking
link |
01:42:08.240
problem because the kids had cars. I did not come from a place where you had cars. I had my
link |
01:42:12.480
first car when I came to MIT, actually. So it was just a very, very
link |
01:42:20.480
different experience for me. But I had been to places where whether you were rich or whether
link |
01:42:24.880
you were poor, you know, you could be black and rich or black and poor, and there
link |
01:42:28.560
were places segregated by class as well as by race. But here, at least
link |
01:42:34.880
when I was here, it didn't feel that way at all. And it felt like a really interesting
link |
01:42:39.360
contradiction. It felt like it was the interracial dating capital of the country.
link |
01:42:46.080
Yeah.
link |
01:42:46.800
You really felt that way. But it also felt like the most racist place I ever spent any time.
link |
01:42:54.080
You know, you couldn't go up the Orange Line at that time. I mean, again, that was 30 years ago.
link |
01:42:58.720
I don't know what it's like now, but there were places you couldn't go and you knew it.
link |
01:43:02.960
Everybody knew it. And there were places you couldn't live and everybody knew that.
link |
01:43:09.440
And that was just the greater Boston area in 1992.
link |
01:43:12.720
Subtle racism or explicit racism?
link |
01:43:14.880
Both.
link |
01:43:16.720
In terms of the institutions, did you feel, okay, were there levels at which you
link |
01:43:23.200
were empowered to be first or one of the first black people in a particular discipline in
link |
01:43:29.520
some of these great institutions that you were part of, Georgia Tech or MIT?
link |
01:43:35.520
And was there a part where it felt limiting?
link |
01:43:39.600
I always felt empowered. Some of that was my own delusion, I think, but it worked out.
link |
01:43:46.240
So I never felt that. In fact, quite the opposite. Not only did I not feel as if people were
link |
01:43:52.320
trying to stop me, I had the distinct impression that people wanted me to succeed.
link |
01:43:57.360
By people, I meant the people in power. Not my fellow students. Not that they didn't want me
link |
01:44:02.320
to succeed. But I felt supported, or at least that people were happy to see me succeed at least
link |
01:44:10.560
as much as anyone else. But in 1990, you're dealing with a different set of problems; you're
link |
01:44:16.240
very early, at least in computer science, you're very early in the sort of Jackie Robinson period.
link |
01:44:21.680
There's this thing called the Jackie Robinson syndrome, which is that you have to,
link |
01:44:25.120
you know, the first one has to be perfect, or has to be sure to succeed because if that person
link |
01:44:30.240
fails, no one else comes after for a long time. So, you know, it was kind of in everyone's best
link |
01:44:35.120
interest. But I think it came from a sincere place. I'm completely sure that people went out
link |
01:44:40.000
of their way to try to make certain that the environment would be good, not just for me,
link |
01:44:45.760
but for the other people who, of course, were around. And I was hardly the only person in
link |
01:44:49.200
the AI Lab, and I wasn't the only person at MIT by a long shot. On the other hand, we were, what? At
link |
01:44:55.840
that point, we would have been, what, less than 20 years away from the first black PhD to graduate
link |
01:45:00.720
from MIT, right? Shirley Jackson, right? 1971, something like that, somewhere around then.
link |
01:45:06.880
So, we weren't that far away from the first first. And we were still another eight years away from
link |
01:45:12.800
the first black PhD in computer science, right? So, it was a sort of interesting time. But I
link |
01:45:19.760
did not feel as if the institutions of the university were against any of that. And furthermore,
link |
01:45:27.920
I felt as if there was enough of a critical mass across the institute from students and probably
link |
01:45:33.520
faculty that I didn't know, who wanted to make certain that the right thing happened.
link |
01:45:38.400
That's very different from the institutions of the rest of the city, which I think were
link |
01:45:43.520
designed in such a way that they felt no need to be supportive.
link |
01:45:47.360
Let me ask a touchy question on that. So, you kind of said that you didn't feel,
link |
01:45:58.640
you felt empowered. Is there some lesson or advice in the sense that no matter what,
link |
01:46:05.200
you should feel empowered? You used the word, I think, illusion or delusion. Is there a sense
link |
01:46:12.640
from the individual perspective where you should always kind of ignore the little forces that you
link |
01:46:31.040
are able to observe around you that are trying to mess with you, whether it's jealousy, whether
link |
01:46:37.040
it's hatred in its pure form, whether it's just hatred in its deluded form, all that kind of
link |
01:46:43.840
stuff, and just kind of see yourself as empowered and confident in all those kinds of things.
link |
01:46:49.680
I mean, it certainly helps. But there's a tradeoff, right? You have to be deluded enough to think
link |
01:46:53.280
that you can succeed. I mean, you can't get a PhD unless you're crazy enough to think you can
link |
01:46:56.880
invent something that no one else has come up with. I mean, it takes that kind of massive delusion.
link |
01:47:02.080
So, you have to be deluded enough to believe that you can succeed despite whatever odds you see
link |
01:47:05.600
in front of you. But you can't be so deluded that you don't think that you need to step out of the
link |
01:47:08.960
way of the oncoming train. So, it's all a tradeoff, right? You have to kind of believe in yourself.
link |
01:47:14.000
It helps to have a support group around you in some way or another. I was able to find that.
link |
01:47:19.280
I've been able to find that wherever I've gone, even if it wasn't necessarily on the floor that
link |
01:47:23.520
I was in. I had lots of friends when I was here. Many of them still live here. And I've kept up
link |
01:47:28.240
with many of them. So, I felt supported. And certainly, I had my mother and my family and
link |
01:47:32.560
those people back home that I could always lean back on, even if it were a long distance call
link |
01:47:38.640
that cost money, and the kids today won't even know what I'm talking about.
link |
01:47:42.960
But back then, it mattered; calling my mom was an expensive proposition. But you have that,
link |
01:47:47.680
and it's fine. I think it helps. But you cannot be so deluded that you miss the obvious because
link |
01:47:52.720
it makes things slower. And it makes you think you're doing better than you are. And it will hurt
link |
01:47:57.360
you in the long run. You mentioned cops. You tell the story of being pulled over. Perhaps it happened
link |
01:48:05.920
more than once. More than once, for sure. One, could you tell that story? And in general, can you
link |
01:48:12.640
give me a sense of what the world looks like when the law doesn't always look at you with the blank
link |
01:48:24.080
slate with objective eyes? I don't know how to say it more poetically. Well, I guess I don't
link |
01:48:35.600
either. I guess the answer is it looks exactly the way it looks now because this is the world that
link |
01:48:40.720
we happen to live in, right? It's people clustering and doing the things that they do and making
link |
01:48:46.480
decisions based on one or two bits of information they find relevant, which, by the way, are all
link |
01:48:52.880
positive feedback loops, which makes it easier for you to believe what you believed before because
link |
01:48:57.760
you behave in a certain way that makes it true and it goes on and circles and cycles and cycles
link |
01:49:01.360
and cycles. So it's just about being on edge. I do not, despite having made it over 50 now.
link |
01:49:11.120
Congratulations, brother. Thank you. God, I have a few gray hairs here and there.
link |
01:49:16.080
You did pretty good. I think, you know, I don't imagine I will ever see a police officer and not
link |
01:49:23.680
get very, very tense. Now, everyone gets a little tense because it probably means you're being pulled
link |
01:49:30.160
over for speeding or something or you're going to get a ticket or whatever, right? I mean,
link |
01:49:34.400
the interesting thing about the law in general is that most human beings' experience of it is
link |
01:49:39.280
fundamentally negative, right? You're only dealing with the lawyer if you're in trouble,
link |
01:49:44.400
except in a few very small circumstances, right? So that's just an underlying reality. Now,
link |
01:49:50.320
imagine that that's also at the hands of the police officer. I remember,
link |
01:49:54.400
when I got pulled over that time, halfway between Boston and Wellesley, actually. I remember thinking,
link |
01:50:04.640
when he pulled his gun on me, that if he shot me right now, he'd get away with it.
link |
01:50:11.360
That was the worst thing that I felt about that particular moment is that if he shoots me now,
link |
01:50:16.480
he will get away with it. It would be years later when I realized something actually much worse than that:
link |
01:50:24.800
not just that he'd get away with it, but that if it became a thing that other people knew about,
link |
01:50:30.880
odds would be, of course, that it wouldn't. But if it became a thing that other people knew about,
link |
01:50:34.240
if I was living in today's world as opposed to the world 30 years ago, not only would he get away
link |
01:50:40.000
with it, but I would be painted a villain. I was probably big and scary and I probably moved
link |
01:50:45.920
too fast and if only I'd done what he said and da, da, da, da, da, da, da, which is somehow
link |
01:50:49.760
worse, right? You know, that hurts not just you, you're dead, but your family and the way
link |
01:50:56.160
people look at you and look at your legacy or your history, that's terrible. And it would work. I
link |
01:51:02.320
absolutely believe it would have worked had he done it. Now, he didn't. I don't think he wanted to
link |
01:51:06.080
shoot me. He didn't feel like killing anybody. He did not go out that night expecting to do that or
link |
01:51:10.240
planning on doing it. And I wouldn't be surprised if he never ever did that or ever even pulled
link |
01:51:14.880
his gun again. I don't know the man's name. I don't remember anything about him. I do remember
link |
01:51:18.000
the gun. Guns are very big when they're in your face. I can tell you this much. They're much
link |
01:51:21.280
larger than they seem. And you were basically, like, speeding or something like that? He said I
link |
01:51:26.000
ran a light. I ran a light. I don't think I ran a light. But you know, in fact, I may not have
link |
01:51:30.480
even gotten a ticket. I may have just gotten a warning. I think he was a little... But he pulled
link |
01:51:34.400
a gun. Yeah. Apparently I moved too fast or something. Rolled my window down before I should
link |
01:51:39.120
have. It's unclear. I think he thought I was going to do something or at least that's how he
link |
01:51:43.360
behaved. So, if we can take a little walk around your brain,
link |
01:51:51.360
how do you feel about that guy and how do you feel about cops after that experience?
link |
01:51:58.960
Well, I don't remember that guy. But my view of police officers is the same view I have about
link |
01:52:03.760
lots of things. Fire is an important and necessary thing in the world. But you must respect fire
link |
01:52:12.560
because it will burn you. Fire is a necessary evil: evil in the sense that it can burn you,
link |
01:52:18.800
necessary in the sense that, you know, heat and all the other things that we use fire for. So
link |
01:52:24.720
when I see a cop, I see a giant ball of flame. And I just try to avoid it.
link |
01:52:31.680
And then some people might see a nice place, a nice thing to roast marshmallows with family over.
link |
01:52:37.760
Which is fine. I don't roast marshmallows.
link |
01:52:39.280
Okay. So let me go a little darker. I apologize. Just talked to Dan Carlin about it over four
link |
01:52:44.800
hours. So sorry if I go dark here a little bit. But is it easy for this experience of
link |
01:52:54.160
just being careful with the fire and avoiding it to turn to hatred?
link |
01:52:58.400
Yeah, of course. And one might even argue that it is a
link |
01:53:01.040
logical conclusion, right? On the other hand, you've got to live in the world. And
link |
01:53:09.760
I don't think it's helpful. Hate is something that takes a lot of energy.
link |
01:53:17.840
So one should reserve it for when it is useful and not carry it around with you all the time.
link |
01:53:24.160
Again, there's a big difference between the happy delusion that convinces you that you can
link |
01:53:29.120
actually get out of bed and make it to work today without getting hit by a car. And the
link |
01:53:34.240
sad delusion that means you can not worry about this car that is barreling towards you, right?
link |
01:53:39.600
So we all have to be a little deluded because otherwise we're paralyzed, right? But one should
link |
01:53:44.800
not be ridiculous. We go all the way back to something you said earlier about empathy.
link |
01:53:49.280
I think what I would ask other people to get out of this, one of many, many, many stories,
link |
01:54:00.080
is to recognize that it is real. People would ask me to empathize with the police officer.
link |
01:54:07.120
I would quote back statistics saying that being a police officer isn't even in the top 10 most
link |
01:54:13.440
dangerous jobs in the United States; you're much more likely to be killed driving a taxicab.
link |
01:54:17.440
Half of police officer deaths are actually suicides. But that means their lives are
link |
01:54:24.720
something. Something's going on there with them. And I would be more than happy to be empathetic about
link |
01:54:30.160
what it is they go through and how they see the world. I think, though, that if we step back from
link |
01:54:36.240
what I feel and we step back from what an individual police officer feels,
link |
01:54:40.080
you step up a level, and all of this because all things tie back into interactive AI.
link |
01:54:44.560
The real problem here is that we've built a narrative. We built a big structure that has
link |
01:54:49.600
made it easy for people to put themselves into different pots in the different clusters and to
link |
01:54:57.360
basically forget that the people in the other clusters are ultimately like them.
link |
01:55:02.160
It is a useful exercise to ask yourself sometimes, I think, that if I had grown up in a completely
link |
01:55:07.040
different house and a completely different household, as a completely different person,
link |
01:55:10.240
if I had been a woman, would I see the world differently? Would I believe what that crazy
link |
01:55:14.720
person over there believes? And the answer is probably yes, because after all, they believe it.
link |
01:55:21.680
And fundamentally, they're the same as you. So then what can you possibly do to fix it?
link |
01:55:27.520
How do you fix Twitter? If you think Twitter needs to be broken or Facebook,
link |
01:55:31.120
if you think Facebook is broken, how do you fix racism? How do you fix any of these things?
link |
01:55:35.280
It's all structural. Individual conversations matter a lot, but you have to create structures
link |
01:55:44.400
that allow people to have those individual conversations all the time in a way that is
link |
01:55:48.320
relatively safe and that allows them to understand that other people have had different experiences,
link |
01:55:53.680
but that ultimately we're the same, which sounds very – I don't even know what the right word is.
link |
01:55:59.440
I'm trying to avoid a word like saccharine, but it feels very optimistic. But I think
link |
01:56:07.520
that's okay. I think that's part of the delusion: you want to be a little optimistic
link |
01:56:11.520
and then recognize that the hard problem is actually setting up the structures in the first
link |
01:56:14.560
place, because it's in almost no one's interest to change the infrastructure.
link |
01:56:20.400
Right. I tend to believe that leaders have a big role in that, of selling that optimistic
link |
01:56:25.920
delusion to everybody and that eventually leads to the building of the structures,
link |
01:56:31.200
but that requires a leader that unites everybody on a vision as opposed to divides on a vision,
link |
01:56:38.320
which is, this particular moment in history feels like there's a nonzero probability,
link |
01:56:47.440
if we go to the extreme, of something akin to a violent or a nonviolent civil war.
link |
01:56:53.280
This is one of the most divisive periods of American history in recent memory. You can speak to
link |
01:56:59.680
this from perhaps a more knowledgeable and deeper perspective than me, but from my naive
link |
01:57:06.240
perspective, this seems like a very strange time. There's a lot of anger and it has to do with
link |
01:57:13.040
people – I mean, for many reasons. One, the thing that's not spoken about, I think,
link |
01:57:18.000
much is the quiet economic pain of millions that's growing because of COVID,
link |
01:57:28.400
because of closed businesses, because of lost dreams. That's building; whatever that tension
link |
01:57:34.080
is, it's building. The other is there seems to be an elevated level of emotion. I'm not sure if you
link |
01:57:40.480
can psychoanalyze where that's coming from, but it's sort of that from which the protests and so on
link |
01:57:46.560
percolated. It's like, why now? Why this particular moment in history?
link |
01:57:50.160
Oh, because enough time has passed. I mean, the very first race riots were in Boston,
link |
01:57:55.760
not to draw anything from that. Really? When? Oh.
link |
01:57:58.160
This is before... Going way back. I mean, like the 1700s or whatever, right? I mean, there was a
link |
01:58:02.640
massive one in New York. I mean, I'm talking way, way, way back when. So, Boston used to be the
link |
01:58:06.800
hotbed of riots. It's just what Boston was all about. Or so I'm told from history class.
link |
01:58:12.160
There's an interesting one in New York. I remember when that was.
link |
01:58:15.440
Anyway, the point is, basically, you got to get another generation, old enough to be angry,
link |
01:58:24.240
but not so old as to remember what happened the last time, right? Yeah.
link |
01:58:28.160
And that's sort of what happens. But you said two things there that
link |
01:58:33.360
I think are worth unpacking. One has to do with this sort of moment in time and why. Why has this
link |
01:58:42.000
sort of built up? And the other has to do with the economic reality
link |
01:58:47.360
of COVID. So, I'm actually – I want to separate those things, because, for example,
link |
01:58:52.800
this happened before COVID happened, right? So, let's separate these two things for a moment.
link |
01:58:59.120
Now, let me preface all this by saying that although I am interested in history, one of my
link |
01:59:05.680
three minors as an undergrad was history, specifically history of the 1960s.
link |
01:59:09.120
Interesting. The other was Spanish and –
link |
01:59:12.560
Okay, that's a mistake. Oh, I loved that one.
link |
01:59:14.800
Okay. And history of Spanish. And Spanish history, actually. But Spanish, and the other was what we
link |
01:59:19.920
would now call cognitive science at the time. Oh, that's fascinating. Interesting.
link |
01:59:25.120
I minored in cogsci here in grad school. That was really fascinating.
link |
01:59:30.640
It was a very different experience from all the computer science classes I've been taking,
link |
01:59:34.240
even the cogsci classes I was taking as an undergrad. But anyway,
link |
01:59:41.840
I'm interested in history, but I'm hardly a historian, right?
link |
01:59:44.160
Yes.
link |
01:59:44.960
So, forgive my – I will ask the audience to forgive my simplification. But
link |
01:59:54.400
I think the question that's always worth asking as opposed to – it's the same question,
link |
01:59:59.200
but a little different. Not why now, but why not before, right? So, why the 1950s,
link |
02:00:08.480
60s civil rights movement as opposed to 1930s, 1940s? Well, first off, there was a civil rights
link |
02:00:12.640
movement in the 30s and 40s. It just wasn't of the same character or quite as well known.
link |
02:00:17.280
Post World War II, lots of interesting things were happening. It's not as if a switch was
link |
02:00:22.160
turned on and Brown versus the Board of Education or the Montgomery bus boycott,
link |
02:00:27.360
and that's when it happened. These things have been building up forever and go all the way back
link |
02:00:30.320
and all the way back and all the way back. And Harriet Tubman was not born in 1950, right?
link |
02:00:34.480
So, we can take these things –
link |
02:00:35.840
It could have easily happened right after World War II.
link |
02:00:38.800
Yes. I think – and again, I am not a scholar – I think that the big difference was TV.
link |
02:00:49.200
These things are visible. People can see them. It's hard to avoid, right? The – why not James
link |
02:00:56.400
Farmer? Why Martin Luther King? Because one was born 20 years after the other or whatever.
link |
02:01:02.560
I think it turns out that – you know what King's biggest failure was in the early days? It was in
link |
02:01:08.960
Georgia. You know, they were doing some – doing the usual thing, trying to integrate. And I forget
link |
02:01:17.600
the guy's name, but you can look this up. He was a sheriff, and he made a deal with the
link |
02:01:24.000
whole state of Georgia. We're going to take people and we're going to nonviolently put them in trucks
link |
02:01:28.640
and then we're going to take them and put them in jails very far away from here.
link |
02:01:33.280
And we're going to do that. And we're not going to – there'll be no reason for the press to
link |
02:01:36.800
hang around. And they did that and it worked. And the press left and nothing changed. So,
link |
02:01:43.680
next they went to Birmingham, Alabama, and Bull Connor. And you got to see on TV little boys
link |
02:01:49.600
and girls being hit with fire hoses and being knocked down. And there was outrage and things
link |
02:01:55.040
changed, right? Part of the delusion is pretending that nothing bad is happening that might force
link |
02:02:00.720
you to do something big you don't want to do. But sometimes it gets put in your face and then
link |
02:02:04.480
you kind of can't ignore it. And a large part, in my view, of what happened right was that it
link |
02:02:10.320
was too public to ignore. Now, we created other ways of ignoring it. Lots of change happened in
link |
02:02:15.440
the south. But part of that delusion was that it wasn't going to affect the west or the northeast.
link |
02:02:19.200
And of course, it did. And that caused its own set of problems, which went into the late 60s,
link |
02:02:23.040
into the 70s. And in some ways, we're living with that legacy now and so on. So, why not – what's
link |
02:02:29.920
happening now? Why didn't it happen 10 years ago? I think it's – people have more voices. There's
link |
02:02:35.280
not just more TV. There's social media. It's very easy for these things to kind of build on
link |
02:02:39.360
themselves. And things are just quite visible. And there's demographic change. I mean, the world
link |
02:02:46.320
is changing rapidly, right? And so, it's very difficult. You're now seeing people you could
link |
02:02:50.080
have avoided seeing most of your life growing up in a particular time. And it's happening.
link |
02:02:54.960
It's dispersing at a speed that is fast enough to cause concern for some people, but not so fast
link |
02:03:00.960
as to cause a massive negative reaction. So, that's that. On the other hand – and again, that's a
link |
02:03:07.840
massive oversimplification. But I think there's something there anyway, at least something worth
link |
02:03:11.440
exploring. I'm happy to be yelled at by a real historian. Oh, yeah. I mean, there's just the
link |
02:03:16.720
obvious thing. I mean, I guess you're implying, but not saying this. I mean, it seemed to have
link |
02:03:21.520
percolated the most with just a single video, for example, the George Floyd video.
link |
02:03:25.920
It makes a huge difference. It's fascinating to think that whatever the mechanisms that put
link |
02:03:33.280
injustice in front of our face, not like – like directly in front of our face, those mechanisms
link |
02:03:41.280
are the mechanisms of change. Yeah. On the other hand, Rodney King. So, no one remembers this. I
link |
02:03:46.080
seem to be the only person who remembers this. But sometime before the Rodney King incident,
link |
02:03:50.400
there was a guy who was a police officer who was saying that things were really bad in Southern
link |
02:03:55.840
California. And he was going to prove it by having some news – some camera people follow
link |
02:04:01.760
him around. And he says, I'm going to go into these towns and just follow me for a week and you
link |
02:04:05.520
will see that I'll get harassed. And like the first night, he goes out there and he crosses
link |
02:04:09.920
into the city. Some cops pull him over and he's a police officer, remember? They don't know that,
link |
02:04:14.800
of course. They like shove his face through a glass window. This was on the news. I distinctly
link |
02:04:19.600
remember watching this as a kid. Actually, I guess I wasn't a kid. I was in college
link |
02:04:23.920
or grad school at the time. So, that's not enough. Like just – just –
link |
02:04:27.680
Well, it disappeared. Like, in a day – it didn't go viral.
link |
02:04:31.440
Yeah, whatever that is, whatever that magic thing is.
link |
02:04:33.840
And whatever it was in 92. It was harder to go viral in 92, right? Or 91. Actually,
link |
02:04:38.560
it must have been 90 or 91. But that happened. And like two days later, it's like it never
link |
02:04:42.400
happened. Like nobody – again, nobody remembers this, but I'm like the only person. Sometimes
link |
02:04:45.520
I think I must have dreamed it. Anyway, Rodney King happens. It goes viral or the moral equivalent
link |
02:04:51.120
thereof at the time. And eventually, we get April 29th, right? And I don't know what the
link |
02:04:57.840
difference was between the two things other than one thing caught and one thing didn't.
link |
02:05:01.360
Maybe what's happening now is two things are feeding on one another. One is more people are
link |
02:05:07.360
willing to believe. And the other is there's easier and easier ways to give evidence.
link |
02:05:14.160
Cameras, body cams. But we're still finding ourselves telling the same story. It's the
link |
02:05:17.440
same thing over and over again. I would invite you to go back and read the op-eds, what people
link |
02:05:22.480
were saying about how violence is not the right answer after Rodney King. And then go back to 1980
link |
02:05:28.320
and the big riots that were happening around then and read the same op-eds. It's the same words,
link |
02:05:32.800
over and over and over again. I mean, there's your remembering history right there. I mean,
link |
02:05:37.760
it's like literally the same words. Like they could have just been copied. I'm surprised no one got
link |
02:05:40.960
flagged for plagiarism. It's interesting if you have an opinion on the question of violence
link |
02:05:46.560
and the popular, perhaps caricatured, framing of Malcolm X versus Martin Luther King.
link |
02:05:53.120
You know, Malcolm X was older than Martin Luther King. People kind of have it in their head that
link |
02:05:57.280
he's younger. Well, he died sooner, but only by a few years. People think of King as the older
link |
02:06:05.040
statesman and they think of Malcolm X as the young, angry, whatever. But that's more of a
link |
02:06:10.320
narrative device. It's not true at all. I reject the choice, as I think it's a false choice. I
link |
02:06:19.600
think they're just things that happen, that you just do. As I said, hatred takes a lot of
link |
02:06:24.080
energy. But you know, every once in a while, you have to fight. One thing I will say without
link |
02:06:29.840
taking a moral position, which I will not take on this matter. Violence has worked.
link |
02:06:40.240
Yeah, that's the annoying thing. It seems like over-the-top anger works,
link |
02:06:47.440
outrage works. So you can say being calm and rational and just talking it out is going to
link |
02:06:56.000
lead to progress. But it seems like if you just look through history, being irrationally upset
link |
02:07:04.640
is the way you make progress. Well, it's certainly the way that you get someone to notice you.
link |
02:07:09.680
Yeah. And if they don't notice you, I mean, what's the difference between that and what
link |
02:07:13.600
a... Again, without taking a moral position on this, I'm just trying to observe history here. If you
link |
02:07:18.720
maybe if television didn't exist, the civil rights movement doesn't happen, or it takes longer,
link |
02:07:23.360
or it takes a very different form. Maybe if social media doesn't exist, a whole host of
link |
02:07:27.760
things, positive and negative, don't happen. And what do any of those things do other than
link |
02:07:36.160
expose things to people? Violence is a way of shouting. I mean, many people far more
link |
02:07:41.840
talented and thoughtful than I have said this in one form or another, that violence is the voice
link |
02:07:47.840
of the unheard, right? I mean, it's a thing that people do when they feel as if they have no other
link |
02:07:54.560
option. And sometimes we agree and sometimes we disagree. Sometimes we think they're justified.
link |
02:08:00.320
Sometimes we think they are not. But regardless, it is a way of shouting. And when you shout,
link |
02:08:07.120
people tend to hear you, even if they don't necessarily hear the words that you're saying.
link |
02:08:10.160
They hear that you were shouting. I see. So another way of putting it, which I think is
link |
02:08:15.760
less, let us just say provocative, but I think is true, is that all change, particularly change
link |
02:08:27.760
that impacts power, requires struggle. The struggle doesn't have to be violent,
link |
02:08:32.480
you know, but it's a struggle nonetheless. The powerful don't give up power easily.
link |
02:08:41.120
I mean, why should they? But even so, you still, it has to be a struggle. And by the way,
link |
02:08:45.840
this isn't just about, you know, violent, political, whatever, nonviolent political change,
link |
02:08:49.600
right? This is true for understanding calculus, right? I mean, everything requires a struggle.
link |
02:08:53.680
We're back to talking about faculty hiring. At the end of the day,
link |
02:08:57.040
it all comes down to faculty hiring. That is all a metaphor. Faculty hiring is a metaphor
link |
02:09:02.240
for all of life. Let me ask a strange question. Do you think everything is going to be okay
link |
02:09:11.840
in the next year? Do you have a hope that we're going to be okay?
link |
02:09:16.640
I tend to think that everything's going to be okay, because I just tend to think that everything's
link |
02:09:20.560
going to be okay. My mother says something to me a lot and always has, and I find it quite
link |
02:09:26.240
comforting, which is, this too shall pass. And this too shall pass. Now, this too shall pass is not
link |
02:09:32.160
just that this bad thing is going away. Everything passes. I mean, I have a 16-year-old daughter
link |
02:09:38.480
who's going to go to college in probably about 15 minutes, given how fast she seems to be growing
link |
02:09:43.920
up. And, you know, I get to hang out with her now. But one day I won't. She'll ignore me just as
link |
02:09:48.480
much as I ignored my parents when I was in college and went to grad school. This too shall pass.
link |
02:09:52.160
But I think that, you know, one day, if we're all lucky, you live long enough to look back on
link |
02:09:57.200
something that happened a while ago, even if it was painful and mostly it's a memory. So yes,
link |
02:10:03.680
I think it'll be okay. What about humans? Do you think we'll live into the 22nd century?
link |
02:10:11.440
I certainly hope so. Are you worried that we might destroy ourselves with nuclear weapons,
link |
02:10:17.520
with AGI, with engineered pandemics? I'm not worried about AGI doing it. But I am worried. I mean,
link |
02:10:22.720
at any given moment, right? But, you know, at any given moment, a comet. I mean, you know,
link |
02:10:26.640
whatever. But I do think that, outside of things completely beyond our control,
link |
02:10:33.440
we have a better chance than not of making it. You know, I talked to Alex Filippenko from Berkeley.
link |
02:10:40.240
He was talking about comets and how they can come out of nowhere. And that was the realization
link |
02:10:45.360
to me. Wow, we're just watching this darkness and they can just enter and then we have less than a
link |
02:10:52.240
month. And yet you make it from day to day. That one shall not pass. Well, maybe for Earth
link |
02:11:00.160
they'll pass but not for humans. But I'm just choosing to believe that it's going to be okay.
link |
02:11:07.360
And we're not going to get hit by an asteroid, at least not while I'm around. And if we are,
link |
02:11:11.600
well, there's very little I can do about it. So I might as well assume it's not going to happen.
link |
02:11:16.080
It makes food taste better. It makes food taste better. So you, out of the millions of things
link |
02:11:22.800
you've done in your life, you've also begun the This Week in Black History calendar of facts.
link |
02:11:30.160
There's like a million questions I can ask here. You said you're not a historian.
link |
02:11:34.560
But let's start at the big history question: is there somebody in history,
link |
02:11:44.240
in black history that you draw a lot of philosophical or personal inspiration from?
link |
02:11:51.040
Or you just find interesting or a moment in history you find interesting?
link |
02:11:55.040
Well, I find the entirety of the 40s to the 60s, and the civil rights movement that didn't
link |
02:12:01.120
happen and did happen at the same time during then, quite inspirational. I mean,
link |
02:12:05.520
I've read quite a bit about the time period, at least I did in my younger days when I had
link |
02:12:10.480
more time to read as many things as I wanted to. What was quirky about this week in black history
link |
02:12:16.960
when I started in the 80s was how focused it was. And it was because of the sources I was
link |
02:12:23.760
stealing from. And I was very much stealing from sort of like a calendar, anything I could find,
link |
02:12:27.920
Google didn't exist, right? And I just pulled as much as I could and just put it together in one
link |
02:12:31.120
place for other people. What ended up being quirky about it, and I started getting people
link |
02:12:34.560
sending me information on it, was the inventors. People who, you know, Garrett Morgan, Benjamin
link |
02:12:41.920
Banneker, right? People who were inventing things. At a time when, how in the world did they manage
link |
02:12:52.720
to invent anything? Like all these other things were happening, mother necessity, right? All these
link |
02:12:57.200
other things were happening. And, you know, there were so many terrible things happening around them.
link |
02:13:00.640
And, you know, if they went to the wrong state at the wrong time, I mean, they might never come back.
link |
02:13:03.920
But they were inventing things we use, right? And it was always inspiring to me that people would
link |
02:13:09.840
still create even under those circumstances. I got a lot out of that. I also learned a few lessons.
link |
02:13:16.480
I think, you know, the Charles Richard Drews of the world, you know, you create things
link |
02:13:23.120
that impact people. You don't necessarily get credit for them. And that's not right. But it's
link |
02:13:28.640
also okay. You're okay with that? Up to a point, yeah. I mean, look, in our world, all we really
link |
02:13:37.600
have is credit. I was always bothered by how much value is given to credit. That's the only thing you
link |
02:13:44.480
got. I mean, if you're an academic in some sense, no, it isn't the only thing you've got, but it feels
link |
02:13:48.640
that way sometimes. But you got the actual thing. We're all going to be dead soon. You got the joy of
link |
02:13:55.280
having created it. The credit, you know, the credit with Yann. I talked to Jürgen Schmidhuber, right?
link |
02:14:04.400
The Turing Award given to three people for deep learning. And you could say that a lot of other
link |
02:14:10.080
people should be on that list. It's the Nobel Prize question. Yeah, it's sad. It's sad. And people
link |
02:14:15.440
like talking about it. But I feel like in the long arc of history, the only people who'll be
link |
02:14:21.200
remembered are Einstein, Hitler, maybe Elon Musk. And the rest of us are just like...
link |
02:14:27.360
Well, you know, someone asked me about immortality once. And I said, and I stole this from somebody
link |
02:14:32.160
else. I don't remember who, but it was, you know, I asked them, what's your great grandfather's name?
link |
02:14:37.840
Any of them? Of course, they don't know. Most of us do not know. I mean, I'm not entirely sure. I know
link |
02:14:44.080
my grandparents, all my grandparents' names. I know what I called them. I don't know their middle
link |
02:14:48.240
names, for example. But it's within living memory, so I could find out. Actually, my grandfather
link |
02:14:54.560
didn't know when he was born. Had no idea how old he was. But I definitely don't know
link |
02:15:00.480
who any of my great grandparents were. So in some sense, immortality is doing something, preferably
link |
02:15:06.400
positive so that your great grandchildren know who you are. And that's kind of what you can hope
link |
02:15:11.920
for, which is very depressing in some ways. You can... I could turn it into something uplifting
link |
02:15:16.720
if you need me to, but... Yeah, can you do the work here? Yeah, it's simple, right? It doesn't
link |
02:15:21.520
matter. I don't have to know who my great grandfather was to know that I wouldn't be here
link |
02:15:24.800
without him. Yeah. And I don't know who my great grandchildren are, and certainly who my great,
link |
02:15:29.840
great grandchildren are, and I'll probably never meet them, although I would very much like to.
link |
02:15:34.880
But hopefully, I'll set the world in motion in such a way that their lives will be better
link |
02:15:39.520
than they would have been if I hadn't done that. Well, certainly, they wouldn't have existed
link |
02:15:42.720
if I hadn't done the things that I did. So I think that's a good positive thing. You live on
link |
02:15:47.600
through other people. Are you afraid of death? I don't know if I'm afraid of death, but I don't
link |
02:15:52.720
like it. That's another T-shirt. I mean, do you ponder it? Do you think about the...
link |
02:16:02.560
The inevitability of oblivion? Yes. I do occasionally. It feels like a very Russian
link |
02:16:07.280
conversation, actually. It is, yeah. I will tell you a story about something that happened
link |
02:16:11.840
to me recently. If you look very carefully, you'll see I have a scar, which, by the way,
link |
02:16:18.480
is an interesting story of its own about why people have half of their thyroid taken out.
link |
02:16:21.840
Some people get scars and some don't. But anyway, I had half my thyroid taken out. The way I got
link |
02:16:28.480
there, by the way, is its own interesting story, but I won't go into it. Just suffice it to say,
link |
02:16:31.520
I did what I keep telling people you should never do, which is never go to the doctor unless you
link |
02:16:34.640
have to, because there's nothing good that's ever going to come out of a doctor's visit, right?
link |
02:16:37.760
So I went to the doctor to look at one thing. It's a little bump I had on the side that I
link |
02:16:42.080
thought might be something bad because my mother made me. And I went there and he's like,
link |
02:16:45.840
oh, it's nothing. But by the way, your thyroid is huge. Can you breathe? Yes, I can breathe.
link |
02:16:49.520
Are you sure? Because it's pushing on your windpipe. You should be dead. Right? So I ended up going
link |
02:16:53.360
there to look at my thyroid. It was growing what's called a goiter. And he said,
link |
02:17:01.360
we're going to have to take it out at some point. When? Sometime before you're 85 probably,
link |
02:17:05.280
but if you wait till you're 85, that'll be really bad because you don't want to have
link |
02:17:09.920
surgery when you're 85 years old if you can help it. Certainly not the kind of surgery it takes
link |
02:17:14.480
to take out your thyroid. So I went there and we decided, I decided, I would put it off until
link |
02:17:21.760
December 19th because my birthday is December 18th. And I wanted to be able to say I made it to 49
link |
02:17:27.120
or whatever. So I said, I'll wait till after my birthday. In the first six months of that,
link |
02:17:33.520
nothing changed. Apparently in the next three months, it had grown. I hadn't noticed this at all.
link |
02:17:41.440
I went and had surgery. They took out half of it. The other half is still there and working fine,
link |
02:17:45.440
by the way. I don't have to take a pill or anything like that. It's great. I'm in the hospital room
link |
02:17:50.960
and the doctor comes in. I've got these things in my arm. They're going to do whatever. They're
link |
02:17:58.080
talking to me. And the anesthesiologist says, huh, your blood pressure is through the roof.
link |
02:18:02.480
Are you, do you have high blood pressure? I said, no, but I'm terrified if that helps you at all.
link |
02:18:07.920
And the anesthetist, who's the nurse who supports the anesthesiologist,
link |
02:18:12.160
if I got that right, said, oh, don't worry about it. I just put some stuff in your IV.
link |
02:18:15.920
You're going to be feeling pretty good in a couple of minutes. And I remember turning
link |
02:18:18.640
and saying, well, I'm going to feel pretty good in a couple of minutes. Next thing I know, there's
link |
02:18:23.840
this guy and he's moving my bed. And he's talking to me, and I have this
link |
02:18:28.560
distinct impression that I've met this guy and I should know what he's talking about,
link |
02:18:33.920
but I kind of like just don't remember what just happened. And I look up and I see the
link |
02:18:38.880
tiles going by and I'm like, oh, it's just like in the movies where you see the tiles go by.
link |
02:18:43.840
And then I have this brief thought that I'm in an infinitely long warehouse,
link |
02:18:49.680
and there's someone sitting next to me. And I remember thinking, oh, she's not talking to me.
link |
02:18:54.960
And then I'm back in the hospital bed. And in between the time where the tiles were going by,
link |
02:19:02.160
and I got in the hospital bed, something like five hours had passed. Apparently it had grown
link |
02:19:06.400
so much that it was a four and a half hour procedure instead of an hour long procedure. I lost
link |
02:19:10.640
a neck size and a half. It was pretty big. Apparently it was as big as my heart.
link |
02:19:17.200
Why am I telling you this? I'm telling you this because
link |
02:19:20.240
It's a hell of a story already.
link |
02:19:21.440
So between the tiles going by and me waking up in my hospital bed, no time passed. There was
link |
02:19:29.520
no sensation of time pass. When I go to sleep and I wake up in the morning,
link |
02:19:34.800
I have this feeling that time has passed, this feeling that something has physically changed
link |
02:19:38.800
about me. Nothing happened between the time they put the magic juice in me and the time that I woke
link |
02:19:45.440
up. Nothing. By the way, my wife was there with me talking. Apparently I was also talking. I don't
link |
02:19:51.040
remember any of this, but luckily I didn't say anything I wouldn't normally say. My memory of
link |
02:19:55.600
it is I would talk to her and she would teleport around the room. And then I accused her of witchcraft
link |
02:20:01.760
and that was the end of that. But from her point of view, I would start talking and then I would
link |
02:20:06.400
fall asleep and then I would wake up and leave off where I was before. I had no notion of any
link |
02:20:10.320
time passing. I kind of imagine that that's what death is: the lack of sensation of time passing.
link |
02:20:18.960
And on the one hand, I am, I don't know, soothed by the idea that I won't notice. On the other
link |
02:20:27.280
hand, I am very unhappy at the idea that I won't notice. So I don't know if I'm afraid of death,
link |
02:20:33.840
but I am completely sure that I don't like it. And that I particularly would prefer to discover
link |
02:20:39.680
on my own whether immortality sucks and be able to make a decision about it. That's what I would
link |
02:20:45.200
prefer. You'd like to have a choice in the matter. I would like to have a choice in the matter.
link |
02:20:49.760
Well, again, on the Russian thing, I think the finiteness of it is the thing that gives it
link |
02:20:55.280
a little flavor, a little spice. Well, in reinforcement learning, we believe
link |
02:20:58.960
that. That's why we have discount factors. Otherwise, it doesn't matter what you do.
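To make the discount-factor point concrete, here is a minimal sketch (illustrative Python, not from the conversation): with gamma = 1, an unending stream of rewards grows without bound, so nothing distinguishes one long-lived behavior from another; with gamma < 1, the return converges toward 1 / (1 - gamma), so what you do early carries the most weight.

```python
# Minimal sketch: discounted return over a reward stream.
# Assumption: a constant reward of 1.0 per step, with 10,000 steps
# standing in for an effectively infinite horizon.

def discounted_return(rewards, gamma):
    """Sum of gamma**t * r_t over the reward stream."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

rewards = [1.0] * 10_000

# gamma = 1: the return simply grows with the horizon (diverges in the
# limit), so all long-lived behaviors look equally good.
print(discounted_return(rewards, gamma=1.0))    # 10000.0

# gamma < 1: the return converges toward 1 / (1 - gamma) = 100,
# so early rewards dominate; finiteness gives choices their weight.
print(discounted_return(rewards, gamma=0.99))   # ~100.0
```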
link |
02:21:01.440
Amen. Well, let me ask one last question, sticking on the Russian theme. You
link |
02:21:11.120
talked about your great grandparents, not remembering their names.
link |
02:21:18.000
In this kind of Markov chain that is life, what do you think is the meaning of it all?
link |
02:21:25.920
What's the meaning of life? Well, in a world where eventually you won't
link |
02:21:31.280
know who your great grandchildren are, I am reminded of something I heard once,
link |
02:21:38.960
or I read once that I really like, which is it is well worth remembering
link |
02:21:45.120
that the entire universe, save for one trifling exception, is composed entirely of others.
link |
02:21:57.040
I think that's the meaning of life.
link |
02:22:01.040
Charles, this is one of the best conversations I've ever had, and I get to see you tomorrow again to
link |
02:22:06.720
hang out with who looks to be one of the most, how should I say, interesting personalities
link |
02:22:15.600
that I'll ever get to meet, Michael Littman. I can't wait. I'm excited to have had this
link |
02:22:20.640
opportunity. Thank you for traveling all the way here. It was amazing. I always love Georgia Tech.
link |
02:22:26.640
I'm excited to see, with you being involved there, what the future holds. Thank you for talking
link |
02:22:31.280
to me. Thank you for having me. I enjoyed every minute of it. Thanks for listening to this
link |
02:22:35.120
conversation with Charles Isbell, and thank you to our sponsors: Neuro, the maker of functional
link |
02:22:40.800
sugar free gum and mints that I use to give my brain a quick caffeine boost, Decoding Digital,
link |
02:22:47.600
a podcast on tech and entrepreneurship that I listen to and enjoy, Masterclass, online courses
link |
02:22:54.160
that I watch from some of the most amazing humans in history, and Cash App, the app I use to send
link |
02:23:00.240
money to friends for food and drinks. Please check out these sponsors in the description to get a
link |
02:23:06.160
discount and to support this podcast. If you enjoy this thing, subscribe on YouTube, review it with
link |
02:23:12.000
five stars on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me on Twitter
link |
02:23:17.840
at Lex Fridman. And now, let me leave you with some poetic words from Martin Luther King Jr.
link |
02:23:24.160
There comes a time when people get tired of being pushed out of the glittering sunlight of
link |
02:23:30.880
life's July and left standing amid the piercing chill of an alpine November. Thank you for listening
link |
02:23:39.520
and hope to see you next time.