
Charles Isbell: Computing, Interactive AI, and Race in America | Lex Fridman Podcast #135



link |
00:00:00.000
The following is a conversation with Charles Isbell,
link |
00:00:03.120
Dean of the College of Computing at Georgia Tech,
link |
00:00:06.320
a researcher and educator in the field of artificial intelligence,
link |
00:00:10.640
and someone who deeply thinks about what exactly is the field of computing and how do we teach it.
link |
00:00:18.000
He also has a fascinatingly varied set of interests including music,
link |
00:00:22.640
books, movies, sports, and history that make him especially fun to talk with.
link |
00:00:28.080
When I first saw him speak, his charisma immediately took over the room,
link |
00:00:32.800
and I had a stupid excited smile on my face,
link |
00:00:35.600
and I knew I had to eventually talk to him on this podcast.
link |
00:00:39.280
Quick mention of each sponsor, followed by some thoughts related to the episode.
link |
00:00:44.240
First is Neuro, the maker of functional sugar-free gum
link |
00:00:48.000
and mints that I use to give my brain a quick caffeine boost.
link |
00:00:52.240
Second is Decoding Digital, a podcast on tech and entrepreneurship
link |
00:00:56.880
that I listen to and enjoy.
link |
00:00:59.120
Third is Masterclass, online courses that I watch from some of the most amazing humans in history.
link |
00:01:04.880
And finally, Cash App, the app I use to send money to friends for food and drinks.
link |
00:01:10.560
Please check out these sponsors in the description to get a discount and to support this podcast.
link |
00:01:16.160
As a side note, let me say that I'm trying to make it so that the conversations with Charles,
link |
00:01:21.200
Eric Weinstein, and Dan Carlin will be published before Americans vote for president on November 3rd.
link |
00:01:28.080
There's nothing explicitly political in these conversations,
link |
00:01:31.280
but they do touch on something in human nature that I hope can bring context to our difficult time,
link |
00:01:37.760
and maybe, for a moment, allow us to empathize with people we disagree with.
link |
00:01:43.120
With Eric, we talk about the nature of evil.
link |
00:01:45.760
With Charles, besides AI and music, we talk a bit about race in America,
link |
00:01:51.360
and how we can bring more love and empathy to our online communication.
link |
00:01:56.880
And with Dan Carlin, well, we talk about Alexander the Great,
link |
00:02:01.680
Genghis Khan, Hitler, Stalin, and all the complicated parts of human history in between,
link |
00:02:07.840
with a hopeful eye toward a brighter future for our humble, little civilization here on Earth.
link |
00:02:13.120
The conversation with Dan will hopefully be posted tomorrow, on Monday, November 2nd.
link |
00:02:19.360
If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcasts,
link |
00:02:24.000
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.
link |
00:02:30.080
And now, here's my conversation with Charles Isbell.
link |
00:02:35.280
You've mentioned that you love movies and TV shows.
link |
00:02:39.040
Let's ask an easy question, but you have to be definitive, objective, conclusive.
link |
00:02:44.640
What's your top three movies of all time?
link |
00:02:47.680
So, you're asking me to be definitive and to be conclusive.
link |
00:02:50.320
That's a little hard. I'm going to tell you why.
link |
00:02:51.920
It's very simple. It's because movies is too broad of a category.
link |
00:02:56.160
I got to pick subgenres, but I will tell you that of those genres,
link |
00:02:59.920
I'll pick one or two from each of the genres, and I'll get us to three, so I'm going to cheat.
link |
00:03:03.920
So, my favorite comedy of all times, which is probably my favorite movie of all time,
link |
00:03:10.000
is His Girl Friday, which is probably a movie that you've not ever heard of,
link |
00:03:14.320
but it's based on a play called The Front Page from, I don't know, early 1900s.
link |
00:03:20.400
And the movie is a fantastic film.
link |
00:03:23.840
What's the story? Is it an independent film?
link |
00:03:26.080
No, no, no. What are we talking about?
link |
00:03:27.120
This is one of the movies that would have been very popular. It's a screwball comedy.
link |
00:03:31.120
You ever see Moonlighting, the TV show? You know what I'm talking about?
link |
00:03:33.920
So, you've seen these shows where there's a man and a woman, and they clearly are in love with one another,
link |
00:03:38.320
and they're constantly fighting and always talking over each other.
link |
00:03:40.960
Banter, banter, banter, banter, banter.
link |
00:03:43.280
This was the movie that started all that, as far as I'm concerned.
link |
00:03:46.880
It's very much of its time. So, it's, I don't know, must have come out sometime between 1934 and 1939.
link |
00:03:53.280
I'm not sure exactly when the movie itself came out. It's black and white.
link |
00:03:57.040
It's just a fantastic film, and it's hilarious.
link |
00:04:01.840
So, it's mostly conversation?
link |
00:04:03.840
Not entirely, but mostly, mostly. Just a lot of back and forth.
link |
00:04:07.440
There's a story there. Someone's on death row, and they're newspaper men, including her.
link |
00:04:14.480
They're all newspaper men. They were divorced.
link |
00:04:17.280
The editor, the publisher, I guess, and the reporter, they were divorced.
link |
00:04:22.400
But, you know, they clearly, he's thinking, trying to get back together,
link |
00:04:25.360
and there's this whole other thing that's going on.
link |
00:04:27.120
But none of that matters. The plot doesn't matter.
link |
00:04:28.720
Yeah, it's just a little play in conversation.
link |
00:04:31.440
It's fantastic. And I just love everything about the conversation, because at the end of the day,
link |
00:04:35.680
sort of narrative and conversation are the sort of things that drive me.
link |
00:04:38.160
And so, I really like that movie for that reason.
link |
00:04:41.280
Similarly, I'm now going to cheat, and I'm going to give you two movies as one.
link |
00:04:45.360
And they're Crouching Tiger, Hidden Dragon, and John Wick.
link |
00:04:49.760
Both relatively modern. John Wick, of course.
link |
00:04:51.440
One, two, or three?
link |
00:04:52.560
One. It gets increasingly, I love them all for different reasons,
link |
00:04:56.000
and increasingly more ridiculous. Kind of like loving Alien and Aliens,
link |
00:04:59.440
despite the fact they're two completely different movies.
link |
00:05:01.520
But the reason I put Crouching Tiger, Hidden Dragon, and John Wick together is because I
link |
00:05:06.000
actually think they're the same movie, or what I like about them is the same movie.
link |
00:05:09.680
Which is both of them create a world that you're coming in the middle of,
link |
00:05:15.440
and they don't explain it to you. But the story is done so well that you pick it up.
link |
00:05:20.000
So, anyone who's seen John Wick, you know, you have these little coins,
link |
00:05:23.760
and they're handed out, and there are these rules,
link |
00:05:25.680
and apparently every single person in New York City is an assassin.
link |
00:05:28.960
There's like two people who come through who aren't, but otherwise they are.
link |
00:05:31.360
But there's this complicated world, and everyone knows each other.
link |
00:05:34.080
They don't sit down and explain it to you, but you figure it out.
link |
00:05:35.920
Crouching Tiger, Hidden Dragon is a lot like that.
link |
00:05:38.000
You get the feeling that this is chapter nine of a 10 part story,
link |
00:05:41.280
and you've missed the first eight chapters, and they're not going to explain it to you,
link |
00:05:44.400
but there's this sort of rich world behind you.
link |
00:05:45.920
You get pulled in anyway, like immediately.
link |
00:05:47.280
You get pulled in anyway. So, it's just excellent storytelling in both cases,
link |
00:05:50.880
and very, very different.
link |
00:05:51.840
And also you like the outfit, I assume? The John Wick outfit?
link |
00:05:54.480
Oh yeah, of course. Well, of course. Yes. I think John Wick outfit is perfect.
link |
00:05:58.000
And so that's number two, and then…
link |
00:05:59.760
But sorry to pause on that. Martial arts? You have a long list of hobbies.
link |
00:06:03.920
Like it scrolls off the page, but I didn't see martial arts as one of them.
link |
00:06:07.680
I do not do martial arts, but I certainly watch martial arts.
link |
00:06:10.240
Oh, I appreciate it very much. Oh, we could talk about every Jackie Chan movie ever made,
link |
00:06:14.000
and I would be on board with that.
link |
00:06:15.600
Rush Hour, too? Like that kind of comedy of a cop?
link |
00:06:18.880
Yes, yes. By the way, my favorite Jackie Chan movie would be Drunken Master 2,
link |
00:06:25.120
known in the States usually as Legend of the Drunken Master.
link |
00:06:29.200
Actually, Drunken Master, the first one, is the first kung fu movie I ever saw,
link |
00:06:33.360
but I did not know that.
link |
00:06:34.880
First Jackie Chan movie?
link |
00:06:36.000
No, first one ever that I saw and remember, but I had no idea that that's what it was,
link |
00:06:40.640
and I didn't know that was Jackie Chan. That was like his first major movie.
link |
00:06:43.200
Yeah. I was a kid. It was done in the 70s.
link |
00:06:46.320
I only later rediscovered that that was actually.
link |
00:06:49.680
And he creates his own martial art by drinking. Was he actually drinking or was he played drinking?
link |
00:06:58.000
You mean as an actor or as a character?
link |
00:06:59.600
No. I'm sure as an actor. It was the 70s or whatever.
link |
00:07:04.240
He was definitely drinking, and in the end, he drinks industrial grade alcohol.
link |
00:07:09.440
Ah, yeah.
link |
00:07:10.480
Yeah, and has one of the most fantastic fights ever in that subgenre.
link |
00:07:15.040
Anyway, that's my favorite one of his movies, but I'll tell you the last movie.
link |
00:07:19.280
It's actually a movie called Nothing But a Man, which is the 1960s,
link |
00:07:23.760
starred Ivan Dixon, who you'll know from Hogan's Heroes, and Abbey Lincoln.
link |
00:07:31.840
It's just a really small little drama. It's a beautiful story.
link |
00:07:35.040
But my favorite scenes, I'm cheating, one of my favorite movies just for the ending is The
link |
00:07:41.440
Godfather. I think the last scene of that is just fantastic. It's the whole movie all summarized in
link |
00:07:47.360
just eight, nine seconds.
link |
00:07:48.400
Godfather Part One?
link |
00:07:49.440
Part One.
link |
00:07:50.240
How does it end? I don't think you need to worry about spoilers if you haven't seen The Godfather.
link |
00:07:54.880
Spoiler alert. It ends with the wife coming to Michael, and he says,
link |
00:08:01.920
just this once, I'll let you ask me my business. And she asks him if he did this terrible thing,
link |
00:08:06.400
and he looks her in the eye and he lies, and he says, no. And she says, thank you. And she
link |
00:08:10.800
walks out the door, and you see her going out of the door, and all these people are coming in,
link |
00:08:19.920
and they're kissing Michael's hands, and Godfather. And then the camera switches
link |
00:08:24.720
perspective. So instead of looking at him, you're looking at her, and the door
link |
00:08:29.760
closes in her face, and that's the end of the movie. And that's the whole movie right there.
link |
00:08:33.840
Do you see parallels between that and your position as Dean at Georgia Tech?
link |
00:08:37.440
Just kidding. Trick question.
link |
00:08:39.520
Sometimes, certainly. The door gets closed on me every once in a while.
link |
00:08:44.160
Okay. That was a rhetorical question. You've also mentioned that you, I think, enjoy all kinds of
link |
00:08:51.120
experiments, including on yourself. But I saw a video where you said you did an experiment where
link |
00:08:56.320
you tracked all kinds of information about yourself and a few others sort of wiring up your
link |
00:09:03.440
home. And this little idea that you mentioned in that video, which is kind of interesting, that
link |
00:09:10.240
you thought that two days' worth of data is enough to capture the majority of the behavior of a human
link |
00:09:16.960
being. First, can you describe what the heck you did to collect all the data? Because it's
link |
00:09:23.440
fascinating, just like little details of how you collect that data and also what your intuition
link |
00:09:28.320
behind the two days is. So first off, it has to be the right two days. But I was thinking of a
link |
00:09:32.960
very specific experiment. There's actually a suite of them that I've been a part of, and other people
link |
00:09:36.480
have done this, of course. I just sort of dabbled in that part of the world. But to be very clear,
link |
00:09:40.320
the specific thing that I was talking about had to do with recording all the IR, the
link |
00:09:46.160
infrared, going on in my house. So this is a long time ago, so everything's being controlled
link |
00:09:50.480
by pressing buttons on remote controls, as opposed to speaking to Alexa or Siri or someone like that.
link |
00:09:56.960
And I was just trying to figure out if you could get enough data on people to figure out what they
link |
00:10:01.280
were going to do with their TVs or their lights. My house was completely wired up at the time.
link |
00:10:06.400
But you know, whether I'm about to look at a movie, I'm about to turn on the TV or whatever, and just
link |
00:10:10.240
see what I could predict from it. It was kind of surprising. It shouldn't have been. But that's all
link |
00:10:16.720
very easy to do, by the way, just capturing all the little stuff. I mean, it's a bunch of computer
link |
00:10:19.520
systems. It's really easy to capture today if you know what you're looking for. At Georgia Tech,
link |
00:10:22.880
long before I got there, we had this thing called the Aware Home, where everything was wired up and
link |
00:10:27.200
you captured everything that was going on. Nothing even difficult, not with video or anything like
link |
00:10:31.200
that, just the way that the system was just capturing everything. So it turns out that,
link |
00:10:37.920
and I did this with myself and then I had students and they worked with many other people. And it
link |
00:10:42.000
turns out at the end of the day, people do the same things over and over and over again. So it
link |
00:10:47.360
has to be the right two days, like a weekend. But it turns out not only can you predict what
link |
00:10:51.840
someone's going to do next at the level of what button they're going to press next on a remote
link |
00:10:55.040
control, but you can do it with something really, really simple. You don't even need a hidden Markov
link |
00:11:01.360
model. It's just a Markov chain, simply: I press this, this is my prediction of the next thing.
link |
00:11:05.440
It turns out you can get 93% accuracy just by doing something very simple and stupid and just
link |
00:11:11.120
counting statistics. But what was actually more interesting is that you could use that information.
link |
00:11:15.520
This comes up again and again in my work. If you try to represent people or objects by the things
link |
00:11:21.440
they do, the things you can measure about them that have to do with action in the world. So
link |
00:11:26.560
distribution over actions, and you try to represent them by the distribution of actions that are done
link |
00:11:32.400
on them, then you do a pretty good job of sort of understanding how people are and they cluster
link |
00:11:38.960
remarkably well, in fact, irritatingly so. And so by clustering people this way,
link |
00:11:44.000
you can maybe, you know, I got the 93% accuracy of what's the next button you're going to press,
link |
00:11:49.360
but I can get 99% accuracy, or somewhere thereabouts, on the collections of things you might
link |
00:11:54.240
press. And it turns out the things that you might press are all related to one another, and it's
link |
00:11:58.480
exactly what you would expect. So for example, all the numbers on a keypad, it turns out
link |
00:12:04.480
all have the same behavior with respect to you as a human being. And so you would naturally cluster
link |
00:12:08.640
them together and you discover that numbers are all related to one another in some way and all
link |
00:12:14.640
these other things. And then, and here's the part that I think is important. I mean, you can see
link |
00:12:18.400
this in all kinds of things. Every individual is different, but any given individual is remarkably
link |
00:12:25.280
predictable because you keep doing the same things over and over again. And the two things that I've
link |
00:12:29.680
learned in the long time that I've been thinking about this is people are easily predictable and
link |
00:12:34.960
people hate when you tell them that they're easily predictable, but they are. And there you go.
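The counting predictor described above reduces to a first-order Markov chain over actions: for each action, predict the successor seen most often. A minimal sketch in Python, with an invented event stream standing in for the Aware Home logs (the accuracy it prints is illustrative, not the 93% figure):

```python
from collections import Counter, defaultdict

# Toy stream of remote-control events; the real version would come from
# logging IR signals in an instrumented home.
events = ["power", "guide", "down", "down", "ok", "vol_up", "vol_up",
          "power", "guide", "down", "down", "ok", "vol_up", "mute"]

# counts[a][b] = how many times action b immediately followed action a.
counts = defaultdict(Counter)
for prev, nxt in zip(events, events[1:]):
    counts[prev][nxt] += 1

def predict_next(action):
    """Most frequently observed successor of `action`, or None if unseen."""
    return counts[action].most_common(1)[0][0] if counts[action] else None

# In-sample accuracy of the simple counting predictor.
pairs = list(zip(events, events[1:]))
hits = sum(predict_next(prev) == nxt for prev, nxt in pairs)
print(f"accuracy: {hits / len(pairs):.0%}")  # high, because behavior repeats
```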
link |
00:12:39.280
Yeah. What about, let me play devil's advocate and philosophically speaking, is it possible to
link |
00:12:46.160
say that what defines humans is the outlier? So even though some large percentage of our
link |
00:12:52.480
behaviors, whatever the signal we measure is the same and it would cluster nicely, but maybe it's
link |
00:12:57.760
the special moments when we break out of the routine that are the definitive thing.
link |
00:13:03.200
And the way we break out of that routine
link |
00:13:07.920
for each one of us might be different. It's possible. I would say that I would say it a
link |
00:13:11.520
little differently. I think I would make two points. One is, I'm going to disagree with
link |
00:13:15.920
the premise, I think, but that's fine. I think the way I would put it is there are people who
link |
00:13:22.880
are very different from lots of other people, but they're not 0%, they're closer to 10%, right? So
link |
00:13:28.240
in fact, even if you do this kind of clustering of people, that'll turn out to be the small number
link |
00:13:31.760
they all behave like each other, even if they individually behave very differently from everyone
link |
00:13:36.960
else. So I think that's kind of important. But what you're really asking, I think, and I think
link |
00:13:40.720
this is really a question is, what do you do when you're faced with the situation you've never seen
link |
00:13:46.160
before? What do you do when you're faced with an extraordinary situation maybe you've seen others
link |
00:13:49.600
do and you're actually forced to do something and you react to that very differently. And that is
link |
00:13:53.040
the thing that makes you human. I would agree with that, at least at a philosophical level, that it's
link |
00:13:56.880
the times when you are faced with something difficult, a decision that you have to make
link |
00:14:04.080
where the answer isn't easy, even if you know what the right answer is, that's sort of what defines
link |
00:14:08.560
you as the individual. And I think what defines people broadly, it's the hard problem. It's not
link |
00:14:12.800
the easy problem. It's the thing that's going to hurt you. It's not even that it's difficult. It's
link |
00:14:17.600
just that you know that the outcome is going to be highly suboptimal for you. And I do think that
link |
00:14:22.880
that's a reasonable place to start for the question of what makes us human. So before we talk about
link |
00:14:29.360
sort of explore the different ideas underlying interactive artificial intelligence, which you
link |
00:14:33.440
are working on, let me just go along this thread to skip to kind of our world of social media,
link |
00:14:39.520
which is something that at least on the artificial intelligence side you think about.
link |
00:14:44.080
There's a popular narrative, I don't know if it's true, but that we have these silos in social
link |
00:14:51.840
media and we have these clusterings, as you're kind of mentioning. And the idea is that, you know,
link |
00:14:58.400
along that narrative is that, you know, we want to, we want to break each other out of those silos
link |
00:15:06.320
so we can be empathetic to other people. So if you're a Democrat, you'd be empathetic to the
link |
00:15:12.480
Republican. If you're a Republican, you'd be empathetic to the Democrat. Those are just two silly bins that we
link |
00:15:17.680
seem to be very excited about, but there's other binnings that we can think about. Is there, from
link |
00:15:24.160
an artificial intelligence perspective, because you're just saying we cluster along the data,
link |
00:15:29.840
but then interactive artificial intelligence is referring to throwing agents into that mix,
link |
00:15:35.520
AI systems in that mix, helping us interacting with us humans and maybe getting us out of that
link |
00:15:41.600
mix, maybe getting us out of those silos. Is that something that you think is possible? Do you see
link |
00:15:48.720
a hopeful possibility for artificial intelligence systems in these large networks of people to get
link |
00:15:56.160
us outside of our habits in at least the idea space to where we can sort of be empathetic to
link |
00:16:04.800
other people's lived experiences, other people's points of view, you know, all that kind of stuff?
link |
00:16:11.280
Yes, and I actually don't think it's that hard. Well, it's not hard in this sense. So imagine that
link |
00:16:16.000
you can, let's just, let's make life simple for a minute. Let's assume that you can do a kind of
link |
00:16:22.720
partial ordering over ideas or clusterings of behavior. It doesn't even matter what I mean here,
link |
00:16:28.720
so long as there's some way that this is a cluster, this is a cluster, there's some edge
link |
00:16:31.920
between them, right? And this is kind of, they don't quite touch even, or maybe they come very
link |
00:16:35.280
close. If you can imagine that conceptually, then the way you get from here to here is not by going
link |
00:16:40.480
from here to here. The way you get from here to here is you find the edge and you move slowly
link |
00:16:43.680
together, right? And I think that machines are actually very good at that sort of thing once we
link |
00:16:47.360
can kind of define the problem, either in terms of behavior or ideas or words or whatever. So it's
link |
00:16:51.840
easy in the sense that if you already have the network and you know the relationships, you know,
link |
00:16:55.840
the edges and sort of the strengths on them and you kind of have some semantic meaning for them,
link |
00:17:00.480
the machine doesn't have to, you do as the designer, then yeah, I think you can kind of move
link |
00:17:04.960
people along and sort of expand them. But it's harder than that. And the reason it's harder than
link |
00:17:08.880
that, or sort of coming up with the network structure itself is hard, is because I'm gonna
link |
00:17:13.920
tell you a story that someone else told me and I don't, I may get some of the details a little bit
link |
00:17:18.800
wrong, but it's roughly, it roughly goes like this. You take two sets of people from the same
link |
00:17:24.320
backgrounds and you want them to solve a problem. So you separate them up, which we do all the time,
link |
00:17:29.600
right? Oh, you know, we're gonna break out into groups. You're gonna go
link |
00:17:32.400
over there and you're gonna talk about this. You're gonna go over there and you're gonna talk
link |
00:17:34.080
about this. And then you have them sort of in this big room, but far apart from one another,
link |
00:17:38.560
and you have them sort of interact with one another. When they come back to talk about what
link |
00:17:43.040
they learn, you want to merge what they've done together. It can be extremely hard because they
link |
00:17:47.440
don't, they basically don't speak the same language anymore. Like when you create these problems and
link |
00:17:51.360
you dive into them, you create your own language. So the example this one person gave me, which I
link |
00:17:56.160
found kind of interesting because we were in the middle of that at the time, was
link |
00:17:58.720
they're sitting over there and they're talking about these rooms that you can see, but you're
link |
00:18:03.520
seeing them from different vantage points, depending on what side of the room you're on.
link |
00:18:06.880
They can see a clock very easily. And so they start referring to the room as the one with the clock.
link |
00:18:12.960
This group over here, looking at the same room, they can see the clock, but it's, you know,
link |
00:18:16.720
not in their line of sight or whatever. So they end up referring to it by some other way.
link |
00:18:22.000
When they get back together and they're talking about things, they're referring to the same room
link |
00:18:26.080
and they don't even realize they're referring to the same room. And in fact, this group doesn't
link |
00:18:29.040
even see that there's a clock there, and this group doesn't see whatever the other group sees. The clock on the wall is
link |
00:18:32.480
the thing that stuck with me. So if you create these different silos, the problem isn't that
link |
00:18:36.160
the ideologies disagree. It's that you're using the same words and they mean radically different
link |
00:18:41.680
things. The hard part is just getting them to agree on the, well, maybe we'd say the axioms in
link |
00:18:47.680
our world, right? But you know, just get them to agree on some basic definitions because right now
link |
00:18:52.640
they talk, they're talking past each other, just completely talking past each other. That's the
link |
00:18:56.480
hard part, getting them to meet, getting them to interact. That may not be that difficult. Getting
link |
00:19:01.120
them to see where their language is leading them to talk past one another. That's the hard
link |
00:19:06.400
part. It's a really interesting question to me. It could be on the layer of language, but it feels
link |
00:19:10.240
like there's multiple layers to this. Like it could be worldview. It could be, I mean, it all boils
link |
00:19:14.640
down to empathy, being able to put yourself in the shoes of the other person to learn the language, to
link |
00:19:20.400
learn like visually how they see the world, to learn like the, I mean, I experienced this now
link |
00:19:28.400
with, with trolls, the, the degree of humor in that world. For example, I talk about love a lot.
link |
00:19:33.840
I'm very like, I'm really lucky to have this amazing community of loving people. But whenever I
link |
00:19:39.840
encounter trolls, they always roll their eyes at the idea of love because it's so quote unquote
link |
00:19:45.440
cringe. So, so they, they show love by like derision, I would say. And I think about on the
link |
00:19:56.080
human level, that's a whole nother discussion. That's psychology, that's sociology, so on. But
link |
00:20:00.240
I wonder if AI systems can help somehow to bridge the gap of, what is this person's life like?
link |
00:20:10.400
Encourage me to just ask that question, to put myself in their shoes, to experience the agitations,
link |
00:20:16.800
the fears, the hopes they have, to experience, you know, even just to think about what
link |
00:20:23.920
was their upbringing like, like having a single-parent home or a shitty education or all those
link |
00:20:32.000
kinds of things, just to put myself in that mind space. It feels like that's really important.
link |
00:20:37.760
For us to bring those clusters together, to find that similar language. But it's unclear
link |
00:20:43.600
how AI can help that because it seems like AI systems need to understand both parties first.
link |
00:20:48.880
So the, you know, the word understand there is doing a lot of work, right?
link |
00:20:51.760
Yes.
link |
00:20:52.240
So do you have to understand it or do you just simply have to note that there is something
link |
00:20:57.840
similar as a point to touch, right? So, you know, you use the word empathy and I like that word,
link |
00:21:04.480
for a lot of reasons. I think you're right in the way that you're using it, in the ways you're describing,
link |
00:21:07.600
but let's separate it from sympathy, right? So, you know, sympathy is feeling sort of for someone,
link |
00:21:13.600
empathy is kind of understanding where they're coming from and how they, how they feel, right?
link |
00:21:16.720
And for most people, those things go hand in hand. For some people, some are very good at empathy
link |
00:21:22.480
and very, very bad at sympathy. Some people cannot experience, well, my observation would be,
link |
00:21:28.160
I'm not a psychologist, but my observation would be that some people seem to be very, very,
link |
00:21:32.480
very bad at sympathy. My observation would be that some people seem incapable of feeling sympathy
link |
00:21:37.680
unless they feel empathy first. You can understand someone, understand where they're coming from and
link |
00:21:42.160
still think, no, I can't support that, right? It doesn't mean that the only way I, because if that,
link |
00:21:48.160
if that isn't the case, then what it requires is that the only way
link |
00:21:54.480
to understand someone means you must agree with everything that they do, which isn't right, right?
link |
00:21:59.760
If the only way I can feel for someone is to completely understand them and make them like me in some way, well,
link |
00:22:06.240
then we're lost, right? Because we're not all exactly like each other. I have to understand
link |
00:22:10.480
everything that you've gone through. It helps clearly, but they're separable ideas, right?
link |
00:22:14.240
Even though they get clearly, clearly tangled up in one another. So what I think AI could help you
link |
00:22:18.480
do actually is if, and you know, I'm, I'm being quite fanciful as it were, but if you, if you
link |
00:22:23.920
think of these as kind of, I understand how you interact, the words that you use, the, you know,
link |
00:22:28.000
the actions you take, I have some way of doing this. Let's not worry about what that is.
link |
00:22:31.840
But I can see you as a kind of distribution of experiences and actions taken upon you,
link |
00:22:36.720
things you've done and so on. And I can do this with someone else and I can find the places where
link |
00:22:41.360
there's some kind of commonality, a mapping as it were, even if it's not total, you know, the,
link |
00:22:46.720
if I think of this as distribution, right, then, you know, I can take the cosine of the angle
link |
00:22:50.400
between you and if it's, you know, if it's zero, you've got nothing in common. If it's one,
link |
00:22:54.240
you're completely the same person. Well, you know, you're probably not one. You're almost certainly
link |
00:22:59.760
not zero. I can find the place where there's the overlap, then I might be able to introduce you on
link |
00:23:03.840
that basis, or connect you in that way, and make it easier for you to take that
link |
00:23:09.200
step of empathy. It's not impossible to do.
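A minimal sketch of that overlap computation, assuming each person is summarized as a vector of action counts; the users and action labels below are hypothetical:

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine of the angle between two action-count vectors:
    0.0 means nothing in common, 1.0 means identical behavior."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical users described by distributions over what they do online.
alice = Counter({"posts_music": 12, "replies_kindly": 30, "shares_papers": 5})
bob   = Counter({"posts_music": 8,  "replies_snarky": 25, "shares_papers": 7})

# The nonzero overlap is the basis for an introduction; the shared
# dimensions ("posts_music", "shares_papers") are where to connect them.
print(f"overlap: {cosine(alice, bob):.2f}")
```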
link |
00:23:17.760
Although I wonder if it requires that everyone involved is at least interested in asking the question. So maybe the hard part
link |
00:23:22.000
is just getting them interested in asking the question. In fact, maybe if you can get them to
link |
00:23:24.800
ask the question, how are we more alike than we are different, they'll solve it themselves. Maybe
link |
00:23:28.480
that's the problem that AI should be working on, not telling you how you're similar or different,
link |
00:23:32.640
but just getting you to decide that it's worthwhile asking the question. It feels like an economist's
link |
00:23:37.520
answer actually. Well, people, okay, first of all, people like would disagree. So let me disagree
link |
00:23:43.040
slightly, which is, I think everything you said is brilliant, but I tend to believe philosophically
link |
00:23:49.360
speaking that people are interested underneath it all. And I would say that AI, the, the possibility
link |
00:23:57.600
that an AI system would show the commonality is incredible. That's a really good starting point.
link |
00:24:02.560
I would say if you, if on social media, I could discover the common things deep or shallow between
link |
00:24:10.400
me and a person who there's tension with, I think that my basic human nature would take over from
link |
00:24:17.920
there. And I think enjoy that commonality. And like, there's something sticky about that, that
link |
00:24:25.280
my mind will linger on and that person in my mind will become like warmer and warmer. And like, I'll
link |
00:24:31.520
start to feel more and more compassion towards them. I think for the majority of the
link |
00:24:35.520
population, that's true, but that might be, that's a hypothesis. Yeah. I mean, it's an empirical
link |
00:24:40.800
question, right? You'd have to figure it out. I mean, I want to believe you're right. And so I'm
link |
00:24:44.480
going to say that I think you're right. Of course, some people come to those things for the purpose
link |
00:24:50.800
of trolling, right? And it doesn't matter that they're playing a different game. Yeah. But I
link |
00:24:55.520
don't know. I, you know, my experience is it requires two things. It requires, in fact, maybe
link |
00:25:00.320
this is really at the end what you're saying. And I, and I do agree with this for sure. So
link |
00:25:06.000
you, it's hard to hold onto that kind of anger or to hold onto just the desire to humiliate someone
link |
00:25:14.080
for that long. It's just difficult to do. It takes, it takes a toll on you. But more importantly,
link |
00:25:18.880
we know this both from people having done studies on it, but also from our own experiences,
link |
00:25:23.520
that it is much easier to be dismissive of a person if they're not in front of you,
link |
00:25:27.920
if they're not real, right? So much of the history of the world is about making people other, right?
link |
00:25:35.040
So if you're social media, if you're on the web, if you're doing whatever in the internet,
link |
00:25:38.880
but being forced to deal with someone as a person, some equivalent to being in the same room,
link |
00:25:45.920
makes a huge difference. Because then, one, you're forced to deal with their humanity because
link |
00:25:49.360
it's in front of you. The other is of course that, you know, they might punch you in the face
link |
00:25:52.800
if you go too far. So, you know, both of those things kind of work together, I think to the,
link |
00:25:56.560
to the right end. So I think bringing people together is really a kind of substitute for
link |
00:26:03.360
forcing them to see the humanity in another person and to not be able to treat them as bits,
link |
00:26:07.920
it's hard to troll someone when you're looking them in the eye. This is very difficult to do.
link |
00:26:12.720
Agreed. Your broad set of research interests fall under interactive AI, as I mentioned,
link |
00:26:18.960
which is a fascinating set of ideas and you have some concrete things that you're
link |
00:26:23.760
particularly interested in, but maybe could you talk about how you think about the field of
link |
00:26:30.000
interactive artificial intelligence? Sure. So let me say upfront that if you look at,
link |
00:26:34.880
certainly my early work, but even if you look at most of it, I'm a machine learning guy,
link |
00:26:39.600
right? I do machine learning. First paper ever published, it was in NIPS. Back then it was
link |
00:26:43.760
NIPS. Now it's NeurIPS. It's a long story there. Anyway, that's another thing. But so,
link |
00:26:48.160
so I'm a machine learning guy, right? I believe in data, I believe in statistics and all those
link |
00:26:51.280
kinds of things. And the reason I'm bringing that up is even though I'm a newfangled statistical
link |
00:26:55.360
machine learning guy and have been for a very long time, the problem I really care about is AI,
link |
00:27:00.080
right? I care about artificial intelligence. I care about building some kind of
link |
00:27:04.480
intelligent artifact. However that gets expressed, that would be at least as intelligent as humans
link |
00:27:12.160
and as interesting as humans, perhaps in their own way. So that's the deep underlying love and
link |
00:27:18.320
dream is the bigger AI. Whatever the heck that is. Yeah. The machine learning in some ways is
link |
00:27:24.640
a means to the end. It is not the end. And I don't understand how one could be intelligent
link |
00:27:29.360
without learning. So therefore I got to figure out how to do that, right? So that's important.
link |
00:27:32.720
But machine learning, by the way, is also a tool. I said statistical, because that's what most
link |
00:27:37.200
people think of themselves as machine learning people. That's how they think. I think Pat Langley
link |
00:27:40.720
might disagree, or at least 1980s Pat Langley might disagree with what it takes to do machine
link |
00:27:46.240
learning. But I care about the AI problem, which is why it's interactive AI, not just interactive
link |
00:27:50.480
ML. I think it's important to understand that there's a longterm goal here, which I will
link |
00:27:55.120
probably never live to see, but I would love to have been a part of, which is building something
link |
00:27:59.040
truly intelligent outside of ourselves. Can we take a tiny tangent? Or am I interrupting,
link |
00:28:04.800
which is, is there something you can say concrete about the mysterious gap between
link |
00:28:12.800
the subset ML and the bigger AI? What's missing? What do you think? I mean, obviously it's
link |
00:28:19.600
totally unknown, not totally, but in part unknown at this time, but is it something like with Pat
link |
00:28:25.040
Langley, is it knowledge, like expert system reasoning type of kind of thing?
link |
00:28:29.680
So AI is bigger than ML, but ML is bigger than AI. This is kind of the real problem here,
link |
00:28:35.440
is that they're really overlapping things that are really interested in slightly different problems.
link |
00:28:39.360
I tend to think of ML, and there are many people out there who are going to be very upset at me
link |
00:28:42.240
about this, but I tend to think of ML being much more concerned with the engineering of solving a
link |
00:28:45.600
problem, and AI about the sort of more philosophical goal of true intelligence. And that's the thing
link |
00:28:50.800
that motivates me, even if I end up finding myself living in this kind of engineering ish space,
link |
00:28:56.000
I've now made Michael Jordan upset. But to me, they just feel very different. You're just
link |
00:29:02.160
measuring them differently, your goals of where you're trying to be are somewhat different.
link |
00:29:07.280
But to me, AI is about trying to build that intelligent thing. And typically, but not always,
link |
00:29:13.760
for the purpose of understanding ourselves a little bit better. Machine learning is, I think,
link |
00:29:17.920
trying to solve the problem, whatever that problem is. Now, that's my take. Others, of course,
link |
00:29:22.560
would disagree. So on that note, so with the interactive AI, do you tend to, in your mind,
link |
00:29:28.000
visualize AI as a singular system, or is it as a collective huge amount of systems interacting
link |
00:29:33.760
with each other? Like, is the social interaction of us humans and of AI systems fundamental
link |
00:29:40.640
to intelligence? I think, well, it's certainly fundamental to our kind of intelligence, right?
link |
00:29:44.800
And I actually think it matters quite a bit. So the reason the interactive AI part matters to me
link |
00:29:50.960
is because I don't, this is going to sound simple, but I don't care whether a tree makes a sound
link |
00:30:00.000
when it falls and there's no one around, because I don't think it matters, right? If there's no
link |
00:30:04.080
observer in some sense. And I think what's interesting about the way that we're intelligent
link |
00:30:08.480
is we're intelligent with other people, right? Or other things anyway. And we go out of our way to
link |
00:30:13.760
make other things intelligent. We're hardwired to find intention, even when there is no intention,
link |
00:30:18.960
which is why we anthropomorphize everything. For the interactive AI part, being intelligent in and
link |
00:30:25.360
of myself in isolation is a meaningless act in some sense. The correct answer is you have to
link |
00:30:31.760
be intelligent in the way that you interact with others. It's also efficient because it allows you
link |
00:30:35.360
to learn faster because you can import from past history. It also allows you to be efficient in
link |
00:30:41.120
the transmission of that. So we ask ourselves about me. Am I intelligent? Clearly, I think so.
link |
00:30:47.520
But I'm also intelligent as a part of a larger species and group of people, and we're trying to
link |
00:30:51.680
move the species forward as well. And so I think that notion of being intelligent with others is
link |
00:30:56.960
kind of the key thing because otherwise you come and you go, and then it doesn't matter. And so
link |
00:31:01.920
that's why I care about that aspect of it. And it has lots of other implications. One is not just
link |
00:31:08.720
building something intelligent with others, but understanding that you can't always communicate
link |
00:31:12.240
with those others. They have been in a room where there's a clock on the wall that you haven't seen,
link |
00:31:16.240
which means you have to spend an enormous amount of time communicating with one another constantly
link |
00:31:20.720
in order to figure out what each other wants. So this is why people project, right? You project your
link |
00:31:26.400
own intentions and your own reasons for doing things on the others as a way of understanding
link |
00:31:30.800
them so that you know how to behave. But by the way, you, completely predictable person,
link |
00:31:35.840
I don't know how you're predictable. I don't know you well enough, but you probably eat the same five
link |
00:31:39.280
things over and over again or whatever it is that you do, right? I know I do. If I'm going to a new
link |
00:31:43.920
Chinese restaurant, I will get General Tso's chicken because that's the thing that's easy.
link |
00:31:47.600
I will get hot and sour soup. People do the things that they do, but other people get the chicken and
link |
00:31:52.560
broccoli. I can push this analogy way too far. The chicken and broccoli. I don't know what's
link |
00:31:56.960
wrong with those people. I don't know what's wrong with them either. We have all had our trauma.
link |
00:32:02.320
So they get their chicken and broccoli and their egg drop soup or whatever. We got to communicate and
link |
00:32:06.880
it's going to change, right? So interactive AI is not just about learning to solve a problem or a
link |
00:32:13.200
task. It's about having to adapt that over time, over a very long period of time and interacting
link |
00:32:18.080
with other people who will themselves change. This is what we mean about things like adaptable
link |
00:32:22.400
models, right? That you have to have a model. That model is going to change. And by the way,
link |
00:32:25.600
it's not just the case that you're different from that person, but you're different from the person
link |
00:32:28.960
you were 15 minutes ago or certainly 15 years ago. And I have to assume that you're at least going
link |
00:32:33.760
to drift, hopefully not too many discontinuities, but you're going to drift over time. And I have
link |
00:32:38.480
to have some mechanism for adapting to that as you and an individual over time and across individuals
link |
00:32:45.120
over time. On the topic of adaptive modeling and you talk about lifelong learning, which is
link |
00:32:51.200
a, I think a topic that's understudied or maybe because nobody knows what to do with it. But like,
link |
00:32:59.760
you know, if you look at Alexa or most of our artificial intelligence systems that are primarily
link |
00:33:04.880
machine learning based systems or dialogue systems, all those kinds of things, they know very little
link |
00:33:10.960
about you in the lifelong learning sense, the sense in which we learn as humans. We learn a lot about
link |
00:33:21.360
each other, not in the quantity of facts, but in the temporally rich set of information that seems
link |
00:33:31.280
to like pick up the crumbs along the way that somehow seems to capture a person pretty well.
link |
00:33:36.880
Do you have any ideas how to do lifelong learning? Because it seems like most of the machine learning
link |
00:33:44.000
community does not. No, well, by the way, not only does the machine learning community not spend a
link |
00:33:48.320
lot of time on lifelong learning, I don't think they spend a lot of time on learning period in
link |
00:33:52.720
the sense that they tend to be very task focused. Everybody is overfitting to whatever problem it is
link |
00:33:57.520
they happen to have. They're overengineering their solutions to the task. Even the people,
link |
00:34:01.920
and I think these people too, are trying to solve a hard problem of transfer learning, right? I'm
link |
00:34:06.000
going to learn on one task and learn another task. You still end up creating the task. It's like
link |
00:34:10.000
looking for your keys where the light is because that's where the light is, right? It's not because
link |
00:34:13.280
the keys have to be there. I mean, one could argue that we tend to do this in general. As a group,
link |
00:34:20.400
we tend to hill climb and get stuck in local optima. I think we do this in the small as well.
link |
00:34:26.160
I think it's very hard to do. Here's the hard thing about AI. The hard thing about AI is it
link |
00:34:32.960
keeps changing on us, right? What is AI? AI is the art and science of making computers act the way
link |
00:34:38.480
they do in the movies, right? That's what it is, right? But beyond that. They keep coming up with
link |
00:34:44.560
new movies. Yes. Right, exactly. We are driven by this kind of need to the ineffable quality of who
link |
00:34:52.080
we are, which means that the moment you understand something is no longer AI, right? Well, we
link |
00:34:57.360
understand this. That's just you take the derivative and you divide by two and then you average it out
link |
00:35:01.360
over time in the window. Therefore, that's no longer AI. The problem is unsolvable because
link |
00:35:05.200
it keeps kind of going away. This creates a kind of illusion, which I don't think is an entire
link |
00:35:08.720
illusion, that either there are very simple task-based things you can do very well and overengineer,
link |
00:35:13.600
or there's all of AI, and there's nothing in the middle. It's very hard to get from here to here,
link |
00:35:18.640
and it's very hard to see how to get from here to here. I don't think that we've done
link |
00:35:23.040
a very good job of it because we get stuck trying to solve the small problems in front of it,
link |
00:35:27.360
myself included. I'm not going to pretend that I'm better at this than anyone else. Of course,
link |
00:35:31.680
all the incentives in academia and in industry are set to make that very hard because you have
link |
00:35:37.840
to get the next paper out, you have to get the next product out, you have to solve this problem,
link |
00:35:41.680
and it's very sort of naturally incremental. None of the incentives are set up to allow you to take
link |
00:35:47.600
a huge risk unless you're already so well established you can take that big risk.
link |
00:35:53.600
If you're that well established that you can take that big risk, then you've probably spent
link |
00:35:57.040
much of your career taking these little risks, relatively speaking, and so you have got a
link |
00:36:01.680
lifetime of experience telling you not to take that particular big risk. So the whole system's
link |
00:36:05.520
set up to make progress very slow. That's fine. It's just the way it is, but it does make this
link |
00:36:10.080
gap seem really big, which is my long way of saying I don't have a great answer to it except
link |
00:36:14.400
that stop doing n equals one. At least try to get n equal two and maybe n equal seven so that you
link |
00:36:21.120
can say I'm going to, or maybe t is a better variable here. I'm going to not just solve this
link |
00:36:25.120
problem and solve this problem and another problem. I'm not going to learn just on you.
link |
00:36:28.240
I'm going to keep living out there in the world and just seeing what happens and that we'll learn
link |
00:36:32.080
something as designers and our machine learning algorithms and our AI algorithms can learn as
link |
00:36:36.800
well. But unless you're willing to build a system which you're going to have live for months at a
link |
00:36:41.360
time in an environment that is messy and chaotic, you cannot control, then you're never going to
link |
00:36:47.120
make progress in that direction. So I guess my answer to you is yes. My idea is that you should,
link |
00:36:51.440
it's not no, it's yes. You should be deploying these things and making them live for a month
link |
00:36:57.760
at a time and be okay with the fact that it's going to take you five years to do this. Not
link |
00:37:02.000
rerunning the same experiment over and over again and refining the machine so it's slightly better
link |
00:37:06.320
at whatever, but actually having it out there and living in the chaos of the world and seeing what
link |
00:37:12.320
its learning algorithms can learn, what data structures it can build, and how it can go from
link |
00:37:16.800
there. Without that, you're going to be stuck all the time. What do you think about the possibility
link |
00:37:22.000
of N equals one growing, it's probably crude approximation, but growing like if you look at
link |
00:37:28.720
language models like GPT-3, if you just make it big enough, it'll swallow the world. Meaning like
link |
00:37:35.040
it'll solve all your t to infinity by just growing in size, taking the small overengineered
link |
00:37:43.120
solution and just pumping it full of steroids in terms of compute, in terms of size of training
link |
00:37:49.040
data, and the Yann LeCun style self-supervised or OpenAI self-supervised learning. Just throw all of YouTube
link |
00:37:56.720
at it and it will learn how to reason, how to paint, how to create music, how to love all that
link |
00:38:04.640
by watching YouTube videos. I mean, I can't think of a more terrifying world to live in than a world
link |
00:38:08.960
that is based on YouTube videos, but yeah, I think the answer is that I just kind of don't think that'll
link |
00:38:14.960
quite, well, it won't work that easily. You will get somewhere and you will learn something, which
link |
00:38:20.320
means it's probably worth it, but you won't get there. You won't solve the problem. Here's the
link |
00:38:25.520
thing, we build these things and we say we want them to learn, but what actually happens, and
link |
00:38:31.600
let's say they do learn, I mean, certainly every paper I've gotten published, the things learn,
link |
00:38:35.600
I don't know about anyone else, but they actually change us, right? We react to it differently,
link |
00:38:40.960
right? So we keep redefining what it means to be successful, both in the negative in the case,
link |
00:38:45.600
but also in the positive in that, oh, well, this is an accomplishment. I'll give you an example,
link |
00:38:51.840
which is like the one you just described with YouTube. Let's get completely out of machine
link |
00:38:55.040
learning. Well, not completely, but mostly out of machine learning. Think about Google.
link |
00:38:59.360
People were trying to solve information retrieval, the ad hoc information retrieval
link |
00:39:03.280
problem forever. I mean, first major book I ever read about it was what, 71, I think it was when
link |
00:39:09.280
it came out. Anyway, we'll treat everything as a vector and we'll do these vector space models
link |
00:39:14.640
and whatever. And that was all great. And we made very little progress. I mean, we made some progress
link |
00:39:20.160
and then Google comes and makes the ad hoc problem seem pretty easy. I mean, it's not,
link |
00:39:24.960
there's lots of computers and databases involved, and there's some brilliant algorithmic stuff
link |
00:39:29.920
behind it too, and some systems building. But the problem changed, right?
link |
00:39:37.520
If you've got a world that's that connected so that you have, you know, there are 10 million
link |
00:39:42.640
answers quite literally to the question that you're asking, then the problem wasn't give me
link |
00:39:48.960
the things that are relevant. The problem is don't give me anything that's irrelevant, at least in
link |
00:39:52.800
the first page, because nothing else matters. So Google is not solving the information retrieval
link |
00:39:59.120
problem, at least not on this webpage. Google is minimizing false positives, which is not the same
link |
00:40:06.560
thing as getting an answer. It turns out it's good enough for what it is we want to use Google for,
link |
00:40:11.120
but it also changes what the problem was we thought we were trying to solve in the first place.
link |
00:40:15.440
You thought you were trying to find an answer, but you're not, or you're trying to find the answer,
link |
00:40:18.960
but it turns out you're just trying to find an answer. Now, yes, it is true. It's also very good
link |
00:40:22.480
at finding you exactly that webpage. Of course, you trained yourself to figure out what the keywords
link |
00:40:26.480
were to get you that webpage. But in the end, by having that much data, you've just changed the
link |
00:40:32.640
problem into something else. You haven't actually learned what you set out to learn. Now, the
link |
00:40:35.840
counter to that would be maybe we're not doing that either. We just think we are because, you know,
link |
00:40:40.480
we're in our own heads. Maybe we're learning the wrong problem in the first place, but I don't
link |
00:40:44.720
think that matters. I think the point is that Google has not solved information retrieval.
link |
00:40:49.360
Google has done amazing service. I have nothing bad to say about what they've done. Lord knows
link |
00:40:53.040
my entire life is better because Google exists. But for Google Maps, I don't think I'd have ever found
link |
00:40:57.440
this place. Where is this? I see 110 and I see where did 95 go? So I'm very grateful for Google,
link |
00:41:07.360
but they just have to make certain the first five things are right.
link |
00:41:11.040
And everything after that is wrong. Look, we're going off on a totally different topic here, but
link |
00:41:17.600
think about the way we hire faculty. It's exactly the same thing.
link |
00:41:21.200
We're getting controversial. It's exactly the same problem, right? It's
link |
00:41:27.520
minimizing false positives. We say things like we want to find the best person to be an assistant
link |
00:41:34.880
professor at MIT in the new College of Computing, which I will point out was founded 30 years after
link |
00:41:42.160
the College of Computing I'm a part of, by my alma mater. I'm just saying I appreciate all
link |
00:41:49.120
that they did and all that they're doing. Anyway, so we're going to try to hire the best professor.
link |
00:41:57.760
That's what we say, the best person for this job, but that's not what we do at all, right?
link |
00:42:02.320
Do you know which percentage of faculty in the top four earn their PhDs from the top four,
link |
00:42:08.720
say in 2017, which is the most recent year for which I have data?
link |
00:42:12.720
Maybe a large percentage.
link |
00:42:14.080
About 60%.
link |
00:42:15.040
60.
link |
00:42:15.600
60% of the faculty in the top four earn their PhDs in the top four. This is computer science,
link |
00:42:19.920
for which there is no top five. There's only a top four, right? Because they're all tied for one.
link |
00:42:23.280
For people who don't know, by the way, that would be MIT, Stanford, Berkeley, CMU.
link |
00:42:27.040
Yep.
link |
00:42:29.520
Georgia Tech is number eight.
link |
00:42:31.440
Number eight. You're keeping track.
link |
00:42:34.080
Oh yes. It's a large part of my job. Number five is Illinois. Number six is a tie between
link |
00:42:39.280
UW and Cornell. Princeton and Georgia Tech are tied for eighth, and UT Austin is number 10.
link |
00:42:43.760
Michigan is number 11, by the way. So if you look at the top 10, you know what percentage of
link |
00:42:50.320
faculty in the top 10 earn their PhDs from the top 10? 65, roughly. 65%.
link |
00:42:56.960
If you look at the top 55 ranked departments, 50% of the faculty earn their PhDs from the top 10.
link |
00:43:04.160
There is no universe in which all the best faculty, even just for R1 universities,
link |
00:43:11.360
the majority of them come from 10 places. There's no way that's true, especially when you consider
link |
00:43:16.160
how small some of those universities are in terms of the number of PhDs they produce.
link |
00:43:20.000
Yeah.
link |
00:43:20.480
Now that's not a negative. I mean, it is a negative. It also has a habit of entrenching
link |
00:43:24.880
certain historical inequities and accidents. But what it tells you is, well, ask yourself the
link |
00:43:32.400
question, why is it like that? Well, because it's easier. If we go all the way back to the 1980s,
link |
00:43:40.960
you know, there was a saying that nobody ever lost his job buying a computer from IBM,
link |
00:43:45.440
and it was true. And nobody ever lost their job hiring a PhD from MIT, right? If the person turned
link |
00:43:52.480
out to be terrible, well, you know, they came from MIT; what did you expect? How was I to know?
link |
00:43:55.920
However, that same person coming from pick whichever is your least favorite place that
link |
00:44:01.840
produces PhDs in, say, computer science, well, you took a risk, right? So all the incentives,
link |
00:44:07.280
particularly because you're only going to hire one this year, well, now we're hiring 10,
link |
00:44:10.320
but you know, you're only going to have one or two or three this year. And by the way,
link |
00:44:13.360
when they come in, you're stuck with them for at least seven years at most places,
link |
00:44:16.640
because that's before you know whether they're getting tenure or not. And if they get tenure,
link |
00:44:19.360
stuck with them for a good 30 years, unless they decide to leave. That means the pressure to get
link |
00:44:22.880
this right is very high. So what are you going to do? You're going to minimize false positives.
link |
00:44:27.680
You don't care about saying no inappropriately. You only care about saying yes inappropriately.
link |
00:44:33.120
So all the pressure drives you into that particular direction. Google,
link |
00:44:36.640
not to put too fine a point on it, was in exactly the same situation with their search. It turns out
link |
00:44:41.440
you just don't want to give people the wrong page in the first three or four pages. And if there's
link |
00:44:46.400
10 million right answers and 100 bazillion wrong answers, just make certain the wrong answers
link |
00:44:50.400
don't get up there. And who cares if the right answer was actually on the 13th page? A right
link |
00:44:55.200
answer, a satisficing answer is number one, two, three, or four. So who cares?
link |
00:44:59.200
Or an answer that will make you discover something beautiful, profound to your question.
link |
00:45:04.160
Well, that's a different problem, right?
link |
00:45:06.800
But isn't that the problem? Can we linger on this topic, sort of walking through it with grace?
link |
00:45:15.040
For hiring faculty, how do we get that 13th page with a truly special person? Like
link |
00:45:25.600
there's, I mean, it depends on the department. Computer science probably has
link |
00:45:29.200
those kinds of people. Like you have the Russian guy, Grigori Perelman, like just these awkward,
link |
00:45:38.800
strange minds that don't know how to play the little game of etiquette that faculty have all
link |
00:45:46.640
somehow agreed on, like converged on over the decades, how to play with each other. And also is not,
link |
00:45:51.920
you know, on top of that, is not from the top four, top whatever number of schools. And maybe
link |
00:45:58.640
actually just says a little F-you every once in a while to the traditions of old within the computer science
link |
00:46:05.520
community. Maybe talks trash about how machine learning is a total waste of time. And that's
link |
00:46:11.520
there on their resume. So like how do you allow the system to give those folks a chance?
link |
00:46:19.440
Well, without taking a particular position
link |
00:46:22.080
on any particular person, you have to be willing to take a certain kind of risk, right? A small
link |
00:46:26.800
amount of it. I mean, if we were treating this as a, well, as a machine learning problem, right?
link |
00:46:31.280
There's a search problem, which is what it is. It's a search problem. If we were treating it that
link |
00:46:34.080
way, you would say, oh, well, the main thing is, you know, you've got a prior and
link |
00:46:37.840
you want some data, because I'm Bayesian. If you don't want to do it that way,
link |
00:46:41.120
we'll just inject some randomness in and it'll be okay. The problem is that feels very,
link |
00:46:46.800
very hard to do with people. All the incentives are wrong there. But it turns out, and let's say,
link |
00:46:53.600
let's say that's the right answer. Let's just grant for the sake of argument that, you know,
link |
00:46:57.040
injecting randomness in the system at that level for who you hire is just not worth doing because
link |
00:47:02.560
the price is too high or the cost is too high. If we had infinite resources, sure, but we don't.
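For reference, the "prior plus data, or else inject some randomness" move he is granting and then rejecting has a standard machine-learning form: epsilon-greedy exploration. A minimal sketch, with a made-up pedigree score standing in for the status quo:

```python
import random

def epsilon_greedy_pick(candidates, score, epsilon=0.1):
    """With probability epsilon, explore: pick uniformly at random.
    Otherwise exploit: pick whoever the current model scores highest."""
    if random.random() < epsilon:
        return random.choice(candidates)   # the injected randomness
    return max(candidates, key=score)      # business as usual

# Hypothetical candidates; phd_rank is the pedigree proxy he criticizes.
candidates = [
    {"name": "A", "phd_rank": 2},    # top-four pedigree, the "safe" pick
    {"name": "B", "phd_rank": 40},   # lower-ranked school, maybe brilliant
    {"name": "C", "phd_rank": 8},
]
pedigree_score = lambda c: -c["phd_rank"]  # smaller rank = higher score

print(epsilon_greedy_pick(candidates, pedigree_score))
# Nine times out of ten: candidate A. Occasionally: a genuine risk.
# His point: when one bad hire costs seven-plus years, epsilon gets set to zero.
```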
link |
00:47:05.600
And also you've got to teach people. So, you know, you're ruining other people's lives if you get it
link |
00:47:09.120
too wrong. But we've taken that principle, even if I grant it, and pushed it all the way back, right?
link |
00:47:17.120
So, we could have a better pool than we have of people we look at and give an opportunity to.
link |
00:47:25.840
If we do that, then we have a better chance of finding that. Of course, that just pushes the
link |
00:47:29.120
problem back another level. But let me tell you something else. You know, I did a sort of study,
link |
00:47:34.320
I call it a study. I called up eight of my friends and asked them for all of their data for
link |
00:47:37.360
graduate admissions. But then someone else followed up and did an actual study. And it
link |
00:47:41.440
turns out that I can tell you how everybody gets into grad school more or less, more or less.
link |
00:47:47.920
You basically admit everyone from places higher ranked than you. You admit most people from
link |
00:47:51.520
places ranked around you. And you admit almost no one from places ranked below you, with the
link |
00:47:54.800
exception of the small liberal arts colleges that aren't ranked at all, like Harvey Mudd,
link |
00:47:58.000
because they don't have a PhD, so they aren't ranked. This is all CS. Which means the decision
link |
00:48:04.000
of whether you become a professor at Cornell was determined when you were 17, by where you knew to
link |
00:48:13.520
go to undergrad, to do whatever. So if we can push these things back a little bit and just make the
link |
00:48:18.400
pool a little bit bigger, at least you raise the probability that you will be able to see someone
link |
00:48:22.560
interesting and take the risk. The other answer to that question, by the way, which you could argue
link |
00:48:29.120
is the same thing, you either adjust the pool so the probabilities go up, which is a way of injecting a
link |
00:48:34.080
little bit of uniform noise into the system, as it were, or you change your loss function.
link |
00:48:40.080
You just let yourself be measured by something other than whatever it is that we're measuring
link |
00:48:44.640
ourselves by now. I mean, US News and World Report, every time they change their formula
link |
00:48:50.400
for determining rankings, they move entire universities to behave differently, because rankings matter.
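The admissions pattern from that informal study is simple enough to write down as code. A caricature; the probabilities are invented, but the shape, everyone from above, most from peers, almost no one from below, is his:

```python
def admit_probability(applicant_rank, our_rank):
    """Caricature of the admissions pattern described above.
    Rank 1 is best; None marks an unranked liberal arts college
    (e.g., Harvey Mudd). All probabilities are invented."""
    if applicant_rank is None:
        return 0.5                        # the special-case exception
    if applicant_rank < our_rank:
        return 0.9                        # "admit everyone" from above
    if applicant_rank <= our_rank + 5:
        return 0.6                        # "admit most" from around you
    return 0.05                           # "admit almost no one" from below

# A department ranked 10th, looking at four applicants:
for rank in (4, 12, 40, None):
    print(rank, admit_probability(rank, our_rank=10))
```

Changing the loss function, in these terms, means changing what this scoring rewards in the first place, rather than adding noise to it.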
link |
00:48:55.760
Can you talk trash about those rankings for a second? No, I'm joking about talking trash.
link |
00:49:01.200
I actually, it's so funny how from my perspective, from a very shallow perspective,
link |
00:49:07.280
how dogmatic, like how much I trust those rankings. They're almost ingrained in my head.
link |
00:49:13.920
I mean, at MIT, everybody kind of, it's a propagated, mutually agreed upon
link |
00:49:21.600
idea that those rankings matter. And I don't think anyone knows what they're,
link |
00:49:26.960
like, most people don't know what they're based on. And what are they exactly based on? And what
link |
00:49:33.280
are the flaws in that? Well, so it depends on which rankings you're talking about. Do you
link |
00:49:38.160
want to talk about computer science or talk about universities? Computer science, US News,
link |
00:49:42.320
isn't that the main one? Yeah, the only one that matters is US News. Nothing else matters.
link |
00:49:46.400
Sorry, CSRankings.org, but nothing else matters but US News. So US News has a formula that it uses
link |
00:49:52.960
for many things, but not for computer science because computer science is considered a science,
link |
00:49:57.440
which is absurd. So the ranking for computer science is 100% reputation. So two people
link |
00:50:06.640
at each department, it's not really a department, whatever, at each department,
link |
00:50:11.040
basically rank everybody. It's slightly more complicated than that, but whatever, they rank everyone. And
link |
00:50:16.880
then those things are put together somehow. So that means, how do you improve reputation?
link |
00:50:22.000
How do you move up and down the space of reputation? Yes, that's exactly the
link |
00:50:26.960
question. Twitter? It can help. I can tell you how Georgia Tech did it,
link |
00:50:31.760
or at least how I think Georgia Tech did it, because Georgia Tech is actually the case to
link |
00:50:35.520
look at. Not just because I'm at Georgia Tech, but because Georgia Tech is the only computing unit
link |
00:50:40.240
that was not in the top 20 that has made it into the top 10. It's also the only one in the last
link |
00:50:45.280
two decades, I think, that moved up in the top 10, as opposed to having someone else move down.
link |
00:50:53.040
So we used to be number 10, and then we became number nine because UT Austin went down slightly,
link |
00:50:58.320
and then we were tied for ninth because that's how rankings work. And we moved from nine to eight
link |
00:51:02.960
because our raw score moved up a point. So something about Georgia Tech, computer science,
link |
00:51:10.320
or computing anyway. I think it's because we have shown leadership at every crisis point, right? So
link |
00:51:17.120
we created a college, the first public university to do it, the second university to do it
link |
00:51:21.040
after CMU, which is number one. I also think it's no accident that CMU is the largest and we're,
link |
00:51:26.400
depending upon how you count and depending on exactly where MIT ends up with its final college
link |
00:51:30.320
of computing, second or third largest. I don't think that's an accident. We've been doing this
link |
00:51:33.360
for a long time. But in the 2000s when there was a crisis about undergraduate education,
link |
00:51:39.440
Georgia Tech took a big risk and succeeded at rethinking undergrad education in computing.
link |
00:51:45.040
I think we created these schools at a time when most public universities anyway were afraid to
link |
00:51:49.920
do it. We did the online masters and that mattered because people were trying to figure out what to
link |
00:51:56.000
do with MOOCs and so on. I think it's about being observed by your peers and having an impact. So,
link |
00:52:02.400
I mean, that is what reputation is, right? So the way you move up in the reputation rankings is by
link |
00:52:07.520
doing something that makes people turn and look at you and say, that's good. They're better than I
link |
00:52:12.320
thought. Beyond that, it's just inertia and there's huge hysteresis in the system, right? I mean,
link |
00:52:17.440
there was, I can't remember, this may be apocryphal, but there's a major or department
link |
00:52:23.120
that MIT was ranked number one in and they didn't have it. It's just about what you... I don't know
link |
00:52:28.640
if that's true, but someone said that to me anyway. But it's a thing, right? It's all about
link |
00:52:33.360
reputation. Of course, MIT is great because MIT is great. It's always been great. By the way,
link |
00:52:37.280
because MIT is great, the best students come, which keeps it being great. I mean,
link |
00:52:42.960
it's just a positive feedback loop. It's not surprising. I don't think it's wrong.
link |
00:52:46.560
Yeah. But it's almost like a narrative. It doesn't actually have to be backed by reality. Not to
link |
00:52:53.360
say anything about MIT, but it does feel like we're playing in the space of narratives, not the
link |
00:53:01.040
space of something grounded. One of the surprising things when I showed up at MIT and just all the
link |
00:53:06.880
students I've worked with and all the research I've done is they're the same people as I've met
link |
00:53:14.880
at other places. I mean, what MIT has going for it... Well, MIT has many things going for it. One
link |
00:53:21.520
of the things MIT has going for it is... Nice logo?
link |
00:53:23.840
Is nice logo. It's a lot better than it was when I was here. Nice colors too. Terrible,
link |
00:53:30.080
terrible name for a mascot. But the thing that MIT has going for it is it really does get the
link |
00:53:35.760
best students. It just doesn't get all of the best students. There are many more best students out
link |
00:53:40.400
there. And the best students want to be here because it's the best place to be or one of the
link |
00:53:45.040
best places to be. And it's a sort of positive feedback. But you said something earlier,
link |
00:53:50.240
which I think is worth examining for a moment. I forget the word you used. You said,
link |
00:53:56.240
we're living in the space of narrative as opposed to something objective.
link |
00:54:00.000
Narrative is objective. I mean, one could argue that the only thing that we do as humans is
link |
00:54:04.320
narrative. We just build stories to explain why we do what we do. Someone once asked me,
link |
00:54:08.720
but wait, there's nothing objective. No, it's completely an objective measure.
link |
00:54:12.800
It's an objective measure of the opinions of everybody else. Now, is that physics? I don't
link |
00:54:19.600
know. Tell me something you think is actually objective and measurable in a way that makes
link |
00:54:25.280
sense. Cameras, they don't... You're getting me off on something here, but do you know that
link |
00:54:31.840
cameras, which are just reflecting light and putting it on film, did not work for dark
link |
00:54:39.440
skinned people until the 1970s? You know why? Because you were building cameras for the people who were
link |
00:54:45.840
going to buy cameras, who all, at least in the United States and Western Europe, were relatively
link |
00:54:51.200
light skinned. Turns out it took terrible pictures of people who look like me. That got fixed with
link |
00:54:56.560
better film and whole processes. Do you know why? Because furniture manufacturers wanted to be able
link |
00:55:03.440
to take pictures of mahogany furniture, right? Because candy manufacturers wanted to be able
link |
00:55:10.080
to take pictures of chocolate. Now, the reason I bring that up is because you might think that
link |
00:55:16.480
cameras are objective. They're just capturing light. No, they're doing the things that they're
link |
00:55:23.200
doing based upon decisions by real human beings to privilege, if I may use that word, some physics
link |
00:55:29.200
over others, because it's an engineering problem. There are tradeoffs, right? So I can either worry
link |
00:55:33.280
about this part of the spectrum or this part of the spectrum. This costs more. That costs less.
link |
00:55:37.200
This costs the same, but I have more people paying money over here, right? And it turns out that
link |
00:55:41.600
if a giant conglomerate demands that you do something different and it's going to involve
link |
00:55:46.640
all kinds of money for you, suddenly the tradeoffs change, right? And so there you go. I actually
link |
00:55:51.200
don't know how I ended up there. Oh, it's because of this notion of objectiveness, right? So even
link |
00:55:55.760
the objective isn't objective because at the end you've got to tell a story. You've got to make
link |
00:55:58.800
decisions. You've got to make tradeoffs. What else is engineering other than that? So I think that
link |
00:56:03.120
the rankings capture something. They just don't necessarily capture what people assume they
link |
00:56:09.120
capture. You know, just to linger on this idea, why are there not more people who just, like, play
link |
00:56:19.920
with whatever that narrative is, have fun with it, like, excite the world, whether it's in
link |
00:56:24.400
the Carl Sagan style of like that calm, sexy voice of explaining the stars and all the romantic stuff
link |
00:56:31.600
or the Elon Musk, dare I even say Donald Trump, where you're like trolling and shaking up the
link |
00:56:37.040
system and just saying controversial things. I talked to Lisa Feldman Barrett, who's a
link |
00:56:43.920
neuroscientist who just enjoys playing with controversy. Like, finds the counter
link |
00:56:50.320
intuitive ideas in the particular science and throws them out there and sees how they play in
link |
00:56:56.080
the public discourse. Like why don't we see more of that? And why doesn't academia attract an Elon
link |
00:57:02.400
Musk type? Well, tenure is a powerful thing that allows you to do whatever you want, but getting
link |
00:57:08.480
tenure typically requires you to be relatively narrow, right? Because people are judging you.
link |
00:57:14.240
Well, I think the answer is we have told ourselves a story, a narrative, that
link |
00:57:22.000
what you just described is vulgar. It's certainly unscientific, right? And it is easy to convince
link |
00:57:30.880
yourself that in some ways you're the mathematician, right? The fewer there are in your major,
link |
00:57:37.360
the more that proves your purity, right? So once you tell yourself that story, then it is
link |
00:57:46.240
beneath you to do that kind of thing, right? I think that's wrong. I think that, and by the way,
link |
00:57:51.840
everyone doesn't have to do this. Everyone's not good at it. And not everyone, even if they would be
link |
00:57:54.320
good at it, would enjoy it. So it's fine. But I do think you need some diversity in the way that
link |
00:58:00.080
people choose to relate to the world as academics, because I think the great universities are ones
link |
00:58:09.440
that engage with the rest of the world. It is a home for public intellectuals. And in 2020,
link |
00:58:15.760
being a public intellectual probably means being on Twitter. Whereas of course that wasn't true
link |
00:58:21.520
20 years ago, because Twitter wasn't around 20 years ago. And if it was, it wasn't around in a
link |
00:58:25.280
meaningful way. I don't actually know how long Twitter has been around. As I get older, I find
link |
00:58:28.960
that my notion of time has gotten worse and worse. Like Google really has been around that long?
link |
00:58:33.840
Anyway, the point is that I think that we sometimes forget that a part of our job is to
link |
00:58:43.360
impact the people who aren't in the world that we're in, and that that's the point of being at
link |
00:58:47.040
a great place and being a great person, frankly. There's an interesting force in terms of public
link |
00:58:51.520
intellectuals. Forget Twitter, we could look at just online courses that are public facing in
link |
00:58:57.120
some part. There is a kind of force that pulls you back. Let me just call it out because I don't
link |
00:59:06.960
give a damn at this point. There's a little bit of, all of us have this, but certainly faculty
link |
00:59:12.000
have this, which is jealousy of whoever's popular, of being a good communicator, exciting the world
link |
00:59:20.480
with their science. And of course, when you excite the world with the science, it's not
link |
00:59:27.280
peer reviewed, clean. It all sounds like bullshit. It's like a TED talk and people roll their eyes
link |
00:59:36.160
and they hate that a TED talk gets millions of views or something like that. And then everybody
link |
00:59:41.360
pulls each other back. There's this force that just kind of, it's hard to stand out unless you
link |
00:59:46.400
win a Nobel prize or whatever. It's only when you get senior enough where you just stop giving a
link |
00:59:53.200
damn. But just like you said, even when you get tenure, that was always the surprising thing to
link |
00:59:58.160
me. I have many colleagues and friends who have gotten tenure, but there's not a switch.
link |
01:00:08.000
There's not an F you money switch where you're like, you know what? Now I'm going to be more
link |
01:00:13.520
bold. It doesn't, I don't see it. Well, there's a reason for that. Tenure isn't a test. It's a
link |
01:00:18.880
training process. It teaches you to behave in a certain way, to think in a certain way, to accept
link |
01:00:24.240
certain values and to react accordingly. And the better you are at that, the more likely you are to
link |
01:00:28.880
earn tenure. And by the way, this is not a bad thing. Most things are like that. And I think most
link |
01:00:34.720
of my colleagues are interested in doing great work and they're just having impact in the way that
link |
01:00:38.880
they want to have impact. I do think that as a field, not just as a field, as a profession,
link |
01:00:46.480
we have a habit of belittling those who are popular as it were, as if the word itself is a
link |
01:00:56.000
kind of Scarlet A, right? I think it's easy to convince yourself, and no one is immune to this,
link |
01:01:06.720
no one is immune to this, that the people who are better known are better known for bad reasons.
link |
01:01:11.600
The people who are out there dumbing it down are not being pure to whatever the values and ethos
link |
01:01:19.520
is for your field. And it's just very easy to do. Now, having said that, I think that ultimately,
link |
01:01:26.480
people who are able to be popular and out there and are touching the world and making a difference,
link |
01:01:31.920
you know, our colleagues do, in fact, appreciate that in the long run. It's just, you know,
link |
01:01:36.800
you have to be very good at it or you have to be very interested in pursuing it. And once you get
link |
01:01:40.960
past a certain level, I think people accept that for what it is. I mean, I don't know. I'd be really
link |
01:01:45.200
interested in how Rod Brooks felt about how people were interacting with him when he did Fast,
link |
01:01:50.000
Cheap, and Out of Control way, way, way back when.
link |
01:01:53.120
What's Fast, Cheap, and Out of Control?
link |
01:01:55.200
It was a documentary that involved four people. I remember nothing about it other than Rod Brooks
link |
01:02:00.480
was in it and something about naked mole rats. I can't remember what the other two things were.
link |
01:02:05.200
It was robots, naked mole rats, and then two others.
link |
01:02:07.440
By the way, Rod Brooks used to be the head of the artificial intelligence laboratory at MIT
link |
01:02:12.480
and then launched, I think, iRobot and then Think Robotics, Rethink Robotics.
link |
01:02:18.160
Yes.
link |
01:02:18.720
Think is in the word. And he's also a little bit of a rock star personality in the AI world,
link |
01:02:27.200
very opinionated, very intelligent. Anyway, sorry, mole rats and naked.
link |
01:02:32.480
Naked mole rats. Also, he was one of my two advisors for my PhD.
link |
01:02:36.960
This explains a lot.
link |
01:02:37.840
I don't know how to explain it. I love Rod. But I also love my other advisor, Paul. Paul,
link |
01:02:43.840
if you're listening, I love you too. Both very, very different people.
link |
01:02:46.080
Paul Viola.
link |
01:02:46.640
Paul Viola. Both very interesting people, very different in many ways. But I don't know what
link |
01:02:51.360
Rod would say to you about what the reaction was. I know that for the students at the time,
link |
01:02:56.800
because I was a student at the time, it was amazing. This guy was in a movie, being very
link |
01:03:03.680
much himself. Actually, the movie version of him is a little bit more Rod than Rod. I think they
link |
01:03:10.720
edited it appropriately for him. But it was very much Rod. And he did all this while doing great
link |
01:03:15.200
work. Was he running the AI Lab at that point or not? I don't know. But anyway, he was running
link |
01:03:18.720
the AI Lab, or would be soon. He was a giant in the field. He did amazing things, made a lot of his
link |
01:03:23.280
bones by doing the kind of counterintuitive science. Saying, no, you're doing this all
link |
01:03:28.960
wrong. Representation is crazy. The world is its own representation. You just react to it. I mean,
link |
01:03:32.720
these are amazing things. And he continues to do those sorts of things as he's moved on.
link |
01:03:37.760
I think he might tell you, I don't know if he would tell you it was good or bad, but I know that
link |
01:03:43.280
for everyone else out there in the world, it was a good thing. And certainly,
link |
01:03:46.880
he continued to be respected. So it's not as if being popular destroyed his career.
link |
01:04:21.120
P stands for probabilistic. And what does
link |
01:04:26.000
FUNC stand for? So a lot of my life is about making acronyms. So if I have one quirk,
link |
01:04:31.440
it's that people will say words and I see if they make acronyms. And if they do, then I'm happy. And
link |
01:04:36.880
then if they don't, I try to change it so that they make acronyms. It's just a thing that I do.
link |
01:04:41.440
So PFUNC is an acronym. It has three or four different meanings. But finally, I decided that
link |
01:04:46.960
the P stands for probabilistic because at the end of the day, it's machine learning and it's randomness and
link |
01:05:13.840
it's fun.
link |
01:05:40.720
So there's a sense to it, which is not an acronym, like literally FUNC. You have a dark, mysterious past.
link |
01:06:43.600
It's a whole set of things of which rap is a part. So tagging is a part of hip hop. I don't know why
link |
01:06:48.480
that's true, but people tell me it's true and I'm willing to go along with it because they get very
link |
01:06:51.680
angry about it. Tagging is like graffiti. And there's all these,
link |
01:06:56.400
including the popping and the locking and all the dancing and all those things. That's all a part of
link |
01:06:59.600
hip hop. It's a way of life, which I think is true. And then there's rap, which is this particular.
link |
01:07:04.960
It's the music part.
link |
01:07:05.840
Yes. A music part. I mean, you wouldn't call the stuff that DJs do, the scratching, rap. That's not rap,
link |
01:07:12.560
right? But it's a part of hip hop, right? So given that we understand that hip hop is this whole
link |
01:07:16.640
thing, what are the rap albums that best touch that for me? Well, if I were going to educate you,
link |
01:07:21.840
I would try to figure out what you liked and then I would work you there.
link |
01:07:26.400
Oh my God. I would probably start with, there's a fascinating exercise one can do by watching old
link |
01:07:38.400
episodes of I Love the 70s, I Love the 80s, I Love the 90s with a bunch of
link |
01:07:43.680
friends and just see where people come in and out of pop culture. So if you're talking about
link |
01:07:50.240
those people, then I would actually start you with where I would hope to start you with anyway,
link |
01:07:54.960
which is Public Enemy. Particularly It Takes a Nation of Millions to Hold Us Back, which is
link |
01:08:00.480
clearly the best album ever produced. And certainly the best hip hop album ever produced
link |
01:08:05.440
in part because it was so much of what was great about the time. Fantastic lyrics; for me, it's all
link |
01:08:11.040
about the lyrics. Amazing music that was coming from Rick Rubin, who was the producer of that. And he
link |
01:08:16.160
did a lot of very kind of heavy metal-ish stuff, at least in the eighties sense at the time. And it was
link |
01:08:21.680
focused on politics in the 1980s, which was what made hip hop so great. Then I would start you
link |
01:08:28.160
there. Then I would move you up through things that have been happening more recently. I'd probably
link |
01:08:33.040
get you to someone like a Mos Def. I would give you a history lesson, basically. Mos Def.
link |
01:08:37.280
That's amazing. He hosted a poetry jam thing on HBO or something like that. Probably. I don't
link |
01:08:41.440
think I've seen it, but I wouldn't be surprised. Yeah. Spoken poetry. That's amazing. He's amazing.
link |
01:08:46.720
And then, after I got you there, I'd work you back to EPMD.
link |
01:08:51.520
And eventually I would take you back to The Last Poets, and particularly the first album,
link |
01:08:57.200
The Last Poets, which was 1970, to give you a sense of history and that it actually has been building
link |
01:09:02.720
up over a very, very long time. So we would start there because that's where your music aligns. And
link |
01:09:08.480
then we would cycle out and I'd move you to the present. And then I'd take you back to the past.
link |
01:09:12.480
Because I think a large part of people who are kind of confused about any kind of music,
link |
01:09:16.880
you know, the truth is this is the same thing we've always been talking about, right? It's
link |
01:09:19.680
about narrative and being a part of something and being immersed in something. So you understand it,
link |
01:09:23.520
right? Jazz, which I also like is one of the things that's cool about jazz is that you come
link |
01:09:30.160
and you meet someone who's talking to you about jazz and you have no idea what they're talking
link |
01:09:32.960
about. And then one day it all clicks and you've been so immersed in it. You go, oh yeah, that's a
link |
01:09:37.200
Charlie Parker. You start using words that nobody else understands, right? And it becomes part of you.
link |
01:09:41.440
Hip hop's the same way. Everything's the same way. They're all cultural artifacts, but I would help
link |
01:09:45.200
you to see that there's a history of it and how it connects to other genres of music that you might
link |
01:09:49.120
like to bring you in. So that you could kind of see how it connects to what you already like,
link |
01:09:54.320
including some of the good work that's been done with fusions of hip hop and bluegrass.
link |
01:10:02.480
Oh no.
link |
01:10:03.200
Yes. Some of it's even good. Not all of it, but some of it is good,
link |
01:10:09.520
but I'd start you with It Takes a Nation of Millions to Hold Us Back.
link |
01:10:12.320
There's an interesting tradition in like more modern hip hop of integrating almost like classic
link |
01:10:18.800
rock songs or whatever, like integrating them into their music, into the beat, into the whatever.
link |
01:10:25.440
It's kind of interesting. It gives a whole new... not just classic rock, but what is it?
link |
01:10:32.000
Kanye with Gold Digger.
link |
01:10:33.120
Old R&B.
link |
01:10:35.680
It's taking and pulling old R&B, right?
link |
01:10:38.080
Well, that's been true since the beginning. I mean, in fact, that's in some ways,
link |
01:10:41.200
that's why the DJ used to get top billing because it was the DJ that brought all the records together
link |
01:10:47.600
and made it work so that people could dance. You go back to those days, mostly in New York,
link |
01:10:52.800
though not exclusively, but mostly in New York where it sort of came out of,
link |
01:10:56.400
the DJ that brought all the music together and the beats and showed that basically music is
link |
01:11:00.240
itself an instrument, very meta, and you can bring it together and then you sort of rap over it and
link |
01:11:04.480
so on. And it moved that way. So that's going way, way back. Now, in the period of time where I grew
link |
01:11:10.000
up, when I became really into it, which was mostly the 80s, it was more that funk was the backing for a lot
link |
01:11:17.200
of the stuff, Public Enemy at that time notwithstanding. And so, which is very nice
link |
01:11:22.800
because it tied into what my parents listened to and what I vaguely remember listening to when I
link |
01:11:26.720
was very small. And by the way, complete revival of George Clinton and Parliament and Funkadelic
link |
01:11:33.120
and all of those things to bring it sort of back into the 80s and into the 90s. And as we go on,
link |
01:11:38.000
you're going to see the last decade and the decade before that being brought in.
link |
01:11:42.240
And when you don't think that you're hearing something you've heard, it's probably because
link |
01:11:45.600
it's being sampled by someone who's referring to something they remembered when they were young,
link |
01:11:53.920
perhaps from somewhere else altogether. And you just didn't realize what it was because it wasn't
link |
01:11:59.040
a popular song where you happened to grow up. So this stuff has been going on for a long time.
link |
01:12:02.160
It's one of the things that I think is beautiful. Run DMC, Jam Master Jay used to play, he played
link |
01:12:08.480
piano. He would record himself playing piano and then sample that to make it a part of what was
link |
01:12:13.200
going on rather than play the piano. That's how his mind thinks. Well, it's pieces. You're
link |
01:12:17.600
putting pieces together, you're putting pieces of music together to create new music, right?
link |
01:12:20.480
Now that doesn't mean that the Roots, I mean, the Roots are doing their own thing.
link |
01:12:23.440
Yeah. Right. Those, those are, that's a whole other thing. Yeah. But still it's the right attitude that,
link |
01:12:30.240
you know, and what else is jazz, right? Jazz is about putting pieces together and then putting
link |
01:12:33.280
your own spin on it. It's all the same. It's all the same thing. It's all the same thing.
link |
01:12:36.640
Yeah. Cause you mentioned lyrics. It does make me sad. Again, this is me talking trash about
link |
01:12:41.600
modern hip hop. I haven't, you know, investigated. I'm sure people will correct me, that there's a lot of
link |
01:12:46.880
great artists. That's part of the reason I'm saying it is they'll leave it in the comments that you
link |
01:12:50.640
should listen to this person. It's that the lyrics went away from talking about maybe not just politics,
link |
01:12:59.680
but life and so on. Like, you know, the kind of like protest songs, even if you look at like a
link |
01:13:04.720
Bob Marley, or, you said, Public Enemy, or Rage Against the Machine, more on the rock side,
link |
01:13:09.040
that's the place where we go for those lyrics. Like classic rock is all about like,
link |
01:13:16.800
my woman left me or, or I'm really happy that she's still with me or the flip side. It's like
link |
01:13:23.360
love songs of different kinds. It's all love, but it's less political, like less interesting. I
link |
01:13:27.920
would say in terms of like deep, profound knowledge. And it seems like rap is the place where you
link |
01:13:34.320
would find that. And it's sad that for the most part, what I see, like you look at mumble rap
link |
01:13:40.160
or whatever, they're moving away from lyrics and more towards the beat and the, and the musicality
link |
01:13:45.120
of it. I've always been a fan of the lyrics. In fact, if you go back and you read my reviews,
link |
01:13:49.120
which I recently was rereading, man, I wrote my last review the month I graduated. I got my PhD,
link |
01:13:57.120
which says something about something. I'm not sure what though. I always, well,
link |
01:14:01.120
I often would start with: it's all about the lyrics. For me, it's about the lyrics.
link |
01:14:06.560
Someone has already written in the comments before I've even finished having this conversation
link |
01:14:10.000
that, you know, neither of us knows what we're talking about and it's all in the underground
link |
01:14:13.600
hip hop and here's who you should go listen to. And that is true. Every time I despair for popular
link |
01:14:19.040
rap, someone points me to, or I discover, some underground hip hop song, and I am made
link |
01:14:25.360
happy and whole again. So I know it's out there. I don't listen to it as much as I used to because
link |
01:14:30.720
I'm listening to podcasts and old music from the 1980s. There's kind of no beat at all, but you know,
link |
01:14:37.040
there's a little bit of sampling here and there. I'm sure. By the way, James Brown is funk or no?
link |
01:14:42.480
Yes. And so is Junior Wells, by the way. Who's that? Oh, Junior Wells, Chicago blues. He was
link |
01:14:48.560
James Brown before James Brown was. It's hard to imagine somebody being James Brown. Go look up
link |
01:14:53.440
Hoodoo Man Blues, Junior Wells, and just listen to Snatch It Back and Hold It, and you'll see it.
link |
01:15:01.920
And they were contemporaries. Where do you put like Little Richard or all that kind of stuff?
link |
01:15:06.080
Like Ray Charles, like when they get like, Hit the Road Jack and don't you come back. Isn't that
link |
01:15:12.160
like, there's a funkiness in it. Oh, there's definitely a funkiness in it. I mean, it's all,
link |
01:15:16.160
there's a line that carries it all
link |
01:15:20.320
together. You know, I guess I would answer your question depending on whether I'm thinking
link |
01:15:24.240
about it in 2020 or I'm thinking about it in 1960. Um, I'd probably give a different answer.
link |
01:15:28.800
I'm just thinking in terms of, you know, that was rock, but when you look back on it, it's,
link |
01:15:33.440
it was funky, but we didn't use those words or maybe we did. I wasn't around. Uh, but you know,
link |
01:15:38.240
I don't think we used the word funk in 1960, certainly not the way we used it in the seventies
link |
01:15:42.320
and the eighties. Do you reject disco? I do not reject disco. I appreciate all the mistakes that
link |
01:15:47.120
we have made. Actually, some of the disco is actually really, really good. John Travolta.
link |
01:15:52.320
Oh boy. He regrets it probably. Maybe not. Well, like it's the mistakes thing.
link |
01:15:56.720
Yeah. And it got him to where he is.
link |
01:16:00.880
Oh, well, thank you for taking that detour. You've, you've talked about computing. We've
link |
01:16:05.920
already talked about computing a little bit, but can you try to describe how you think about the
link |
01:16:11.600
world of computing and how it fits into the set of different disciplines? We mentioned College of
link |
01:16:16.880
Computing. What, what should people, how should they think about computing, especially from an
link |
01:16:22.560
educational perspective of like, what is the perfect curriculum that defines for a young mind,
link |
01:16:30.800
uh, what computing is? So I don't know about a perfect curriculum, although that's an important
link |
01:16:34.960
question because at the end of the day, without the curriculum, you don't get anywhere. Curriculum
link |
01:16:39.280
to me is the fundamental data structure. It's not even the classroom. I mean, the world is,
link |
01:16:44.080
right? So I think the curriculum is where I like to play. So I spend a lot of time
link |
01:16:49.200
thinking about this, but I will tell you, I'll answer your question by answering a slightly
link |
01:16:52.400
different question first and then getting back to this, which is, you know, you talked about
link |
01:16:55.520
disciplines and what does it mean to be a discipline? The truth is what we really educate
link |
01:17:01.120
people in from the beginning, but certainly through college, you've sort of failed if you
link |
01:17:05.520
don't think about it this way, I think, is a way of seeing the world. People often think about tools and tool
link |
01:17:12.640
sets, and when you're really trying to be good, you think about skills and skill sets,
link |
01:17:16.560
but disciplines are about mindsets, right? They're about fundamental ways of thinking, not just
link |
01:17:22.400
the hammer that you pick up, whatever that is, to hit the nail, and not just
link |
01:17:27.280
the skill of learning how to hammer well or whatever. It's the mindset of, like,
link |
01:17:30.960
what's the fundamental way to think about the world, right? And
link |
01:17:37.200
different disciplines give you different mindsets, give you different ways of sort of thinking
link |
01:17:40.960
through. So with that in mind, for computing, to even ask the question of whether
link |
01:17:45.680
it's a discipline, you have to decide: does it have a mindset? Does it have a way of thinking
link |
01:17:48.960
about the world that is different from, you know, the scientist who is doing a discovery and using
link |
01:17:53.600
the scientific method as a way of doing it, or the mathematician who builds abstractions and tries
link |
01:17:57.680
to find sort of steady state truth about the abstractions that may be artificial, but whatever,
link |
01:18:03.600
or is it the engineer who's all about, you know, building demonstrably superior technology with
link |
01:18:08.560
respect to some notion of trade offs, whatever that means, right? That's sort of the world that
link |
01:18:12.640
you, the world that you live in. What is computing? You know, how is computing different? So I've
link |
01:18:16.880
thought about this for a long time and I've come to a view about what computing actually is, what
link |
01:18:22.160
the mindset is. And, and it's, you know, it's a little abstract, but that would be appropriate
link |
01:18:26.320
for computing. I think that what distinguishes the computationalist from others is that he or
link |
01:18:34.320
she understands that models, languages and machines are equivalent. They're the same thing. And
link |
01:18:41.600
because it's not just a model, but it's a machine that is an executable thing that can be described
link |
01:18:47.440
as a language. That means that it's dynamic. So it's not static. It is mathematical in some sense,
link |
01:18:54.000
in the kind of sense of abstraction, but it is fundamentally dynamic and executable. The
link |
01:18:58.320
mathematician is not necessarily worried about the dynamic part. In fact, whenever I tried
link |
01:19:03.280
to write something for mathematicians, they invariably demand that I make it static. And
link |
01:19:08.160
that's not a bad thing. It's just, it's a way of viewing the world that truth is a thing, right?
link |
01:19:12.560
It's not a process that continually runs, right? So that dynamic thing matters. That self reflection
link |
01:19:19.440
of the system itself matters. And that is what computing, that is what computing brought us.
link |
01:19:23.280
So it is a science, because the models fundamentally represent truths in the world.
link |
01:19:27.840
Information is a scientific thing to discover, right? Not just a mathematical conceit that
link |
01:19:33.040
gets created. But of course it's engineering, because you're actually dealing with constraints
link |
01:19:37.040
in the world and trying to execute machines that actually run. But it's also math, because
link |
01:19:43.120
you're actually worrying about these languages that describe what's happening. But the fact that
link |
01:19:51.440
regular expressions and finite state automata, one of which feels like a machine or at least
link |
01:19:56.400
an abstract machine, and the other is a language, that they're actually the equivalent thing. I mean,
link |
01:19:59.760
that is not a small thing and it permeates everything that we do, even when we're just
link |
01:20:03.840
trying to figure out how to do debugging. So that idea I think is fundamental, and we would
link |
01:20:09.120
do better if we made that more explicit. How my life has changed and my thinking about this
link |
01:20:15.200
in the 10 or 15 years it's been since I tried to put that to paper with some colleagues is
link |
01:20:21.680
the realization, which comes to a question you actually asked me earlier, which has to do with
link |
01:20:28.000
trees falling down and whether it matters, is this sort of triangle of equivalence.
link |
01:20:34.560
It only matters because there's a person inside the triangle, right? That what's changed about
link |
01:20:40.720
computing, computer science or whatever you want to call it, is we now have so much data
link |
01:20:46.560
and so much computational power. We're able to do really, really interesting, promising things.
link |
01:20:53.040
But the interesting and the promising kind of only matters with respect to human beings and
link |
01:20:56.960
their relationship to it. So the triangle exists, that is fundamentally computing.
link |
01:21:01.680
What makes it worthwhile and interesting and potentially world species changing is that there
link |
01:21:08.880
are human beings inside of it and intelligence that has to interact with it to change the data,
link |
01:21:12.640
the information that makes sense and gives meaning to the models, the languages and the machines.
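To make the models-languages-machines equivalence concrete, a toy of my own, not from the conversation: the same language, strings of a's and b's ending in "ab", written once as a regular expression and once as an explicit three-state machine. They accept exactly the same strings.

```python
import re

# The language: strings over {a, b} that end in "ab".
pattern = re.compile(r"^[ab]*ab$")        # described as a *language*

# The same language as an explicit *machine*: a three-state DFA.
# States: 0 = start, 1 = just saw 'a', 2 = just saw "ab" (accepting).
DFA = {
    0: {"a": 1, "b": 0},
    1: {"a": 1, "b": 2},
    2: {"a": 1, "b": 0},
}

def dfa_accepts(s, start=0, accepting=(2,)):
    state = start
    for ch in s:
        if ch not in "ab":
            return False
        state = DFA[state][ch]
    return state in accepting

for s in ["ab", "aab", "abab", "ba", "abb", ""]:
    assert bool(pattern.match(s)) == dfa_accepts(s)   # model == machine
    print(repr(s), dfa_accepts(s))
```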
link |
01:21:19.120
So if the curriculum can convey that while conveying the tools and the skills that you need
link |
01:21:25.120
in order to succeed, then it is a big win. That's what I think you have to do.
link |
01:21:30.880
Do you pull psychology, like these human things into that, into the idea,
link |
01:21:36.960
into this framework of computing? Do you pull in psychology and neuroscience,
link |
01:21:41.120
like parts of psychology, parts of neuroscience, parts of sociology?
link |
01:21:44.560
What about philosophy, like studies of human nature from different perspectives?
link |
01:21:49.120
Absolutely. And by the way, it works both ways. So let's take biology for a moment.
link |
01:21:53.200
It turns out a cell is basically a bunch of if-then statements, if you look at it the right way,
link |
01:21:57.680
which is nice, because I understand if-then statements. I never really enjoyed biology,
link |
01:22:01.680
but I do understand if-then statements. And if you tell the biologists that and they begin to
link |
01:22:05.360
understand that, it actually helps them to think about a bunch of really cool things.
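A cartoon of what that means, loosely shaped like lac-operon-style gene regulation; the biology is wildly oversimplified on purpose:

```python
# A cell's control logic as if-then statements (hugely simplified cartoon,
# loosely modeled on the lac operon; the shape, not the biology, is the point).
def cell_step(glucose_present, lactose_present):
    if glucose_present:
        return "metabolize glucose"        # preferred food: repress the rest
    if lactose_present:
        return "express lactase genes"     # switch on the backup pathway
    return "starvation response"

for glucose, lactose in [(True, True), (False, True), (False, False)]:
    print(glucose, lactose, "->", cell_step(glucose, lactose))
```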
link |
01:22:09.520
There'll still be biology involved, but whatever. On the other hand, the fact that in biology,
link |
01:22:15.040
in fact, the cell is a bunch of if-then statements or whatever, allows the computationalist to think
link |
01:22:19.520
differently about the language and the way that we, well, certainly the way we would do AI and machine
link |
01:22:23.520
learning, but even just the way that we think about computation. So the important thing
link |
01:22:28.000
to me is as my engineering colleagues who are not in computer science worry about computer science
link |
01:22:33.680
eating up engineering, or colleges where computer science is trapped: it's not a worry. You shouldn't
link |
01:22:40.560
worry about that at all. Computer science, computing, it's central, but it's not the
link |
01:22:45.280
most important thing in the world. It's not more important. It is just key to helping others do
link |
01:22:50.960
other cool things they're going to do. You're not going to be a historian in 2030. You're not going
link |
01:22:54.800
to get your PhD in history without understanding some data science and computing, because the way
link |
01:22:58.480
you're going to get history done in part, and I say done, the way you're going to get it done
link |
01:23:02.960
is you're going to look at data and you're going to let, you're going to have the system that's
link |
01:23:06.160
going to help you to analyze things to help you to think about a better way to describe history
link |
01:23:09.920
and to understand what's going on and what it tells us about where we might be going. The same
link |
01:23:13.600
is true for psychology. Same is true for all of these things. The reason I brought that up is
link |
01:23:17.040
because the philosopher has a lot to say about computing. The psychologist has a lot to say
link |
01:23:21.920
about the way humans interact with computing, right? And certainly a lot about intelligence, which
link |
01:23:27.440
at least for me, ultimately is kind of the goal of building these computational devices is to build
link |
01:23:32.480
something intelligent. Do you think computing will eat everything in a certain sense, or almost
link |
01:23:37.680
like disappear because it's part of everything? It's so funny you say this. I want to say it's
link |
01:23:41.680
going to metastasize, but there's kind of two ways that fields destroy themselves. One is they become
link |
01:23:47.600
super narrow. And I think we can think of fields that might be that way. They become pure. And we
link |
01:23:55.120
have that instinct. We have that impulse. I'm sure you can think of several people who want computer
link |
01:23:59.600
science to be this pure thing. The other way is you become everywhere and you become everything
link |
01:24:05.200
and nothing. And so everyone says, you know, I'm going to teach Fortran for engineers or whatever.
link |
01:24:10.240
I'm going to do this. And then you lose the thing that makes it worth studying in and of itself.
link |
01:24:15.440
The thing about computing, and this is not unique to computing, though at this point in time,
link |
01:24:19.520
it is distinctive about computing where we happen to be in 2020 is we are both a thriving major.
link |
01:24:26.480
In fact, the thriving major, almost every place. And we're a service unit because people need to
link |
01:24:34.320
know the things we know. And our job, much as the mathematician's job is to help,
link |
01:24:39.520
you know, this person over here to think like a mathematician, much the way the
link |
01:24:43.440
point of taking chemistry as a freshman is not to learn chemistry. It's to learn to think
link |
01:24:47.600
like a scientist, right? Our job is to help them to think like a computationalist. And we
link |
01:24:52.080
have to take both of those things very seriously. And I'm not sure that as a field, we have
link |
01:24:56.880
historically certainly taken the second thing, that our job is to help them to think a certain way.
link |
01:25:01.840
People who aren't going to be our major, I don't think we've taken that very seriously at all.
link |
01:25:04.960
I don't know if you know who Dan Carlin is. He has this podcast called Hardcore History.
link |
01:25:09.600
Yes.
link |
01:25:10.000
I just did an amazing four-hour conversation with him, mostly about Hitler. But I bring him
link |
01:25:18.160
up because he talks about this idea that it's possible that history as a field will become,
link |
01:25:24.160
like currently, most people study history a little bit, kind of are aware of it. We have a
link |
01:25:31.440
conversation about it, different parts of it. I mean, there's a lot of criticism to say that some
link |
01:25:35.920
parts of history are being ignored, blah, blah, blah, so on. But most people are able to have a
link |
01:25:41.760
curiosity and able to learn it. His thought is it's possible given the way social media works,
link |
01:25:49.200
the current way we communicate, that history becomes a niche field where literally most people
link |
01:25:55.840
just ignore it because everything is happening so fast that history starts losing its meaning.
link |
01:26:01.680
And then it starts being a thing that, you know, like the theoretical computer science
link |
01:26:07.840
part of computer science, becomes a niche thing, held only by, like, the rare keepers of the
link |
01:26:13.840
world wars and, you know, all the history, the founding of the United States, all those kinds of
link |
01:26:19.920
things, the Civil War. And it's a kind of profound thing to think about how these,
link |
01:26:27.360
how we can lose track, how we can lose these fields when it's best, like in the case of
link |
01:26:33.360
history, for them to be a pervasive thing that everybody learns and thinks about and so on.
link |
01:26:40.480
And I would say computing is quite obviously similar to history in the sense that it seems
link |
01:26:46.880
like it should be a part of everybody's life to some degree, especially like as we move into the
link |
01:26:53.520
later parts of the 21st century. And it's not obvious that that's the way it'll go. It might
link |
01:26:59.840
be in the hands of the few still. Like depending if it's machine learning, you know, it's unclear
link |
01:27:06.640
that computing will win out. It's currently very successful, but it's not, I would say that's
link |
01:27:12.240
something, I mean, you're at the leadership level of this. You're defining the future. So it's in
link |
01:27:16.880
your hands. No pressure. But like, it feels like there's multiple ways this can go. And there's
link |
01:27:23.760
this kind of conversation of everybody should learn to code, right? The changing nature of jobs
link |
01:27:29.200
and so on. Do you have a sense of what your role in education of computing is here? Like what's
link |
01:27:41.200
the hopeful path forward? There's a lot there. I will say that, well, first off, it would be an
link |
01:27:46.000
absolute shame if no one studied history. On the other hand, as t approaches infinity, the amount
link |
01:27:51.360
of history is presumably also growing at least linearly. And so you have to forget more and more
link |
01:27:59.440
of history, but history needs to always be there. I mean, I can imagine a world where,
link |
01:28:02.960
you know, if you think of your brains as being outside of your head, that you can kind of learn
link |
01:28:07.360
the history you need to know when you need to know it. That seems fanciful, but it's a kind of way of,
link |
01:28:12.560
you know, is there a sufficient statistic of history? No, there certainly isn't. But there may
link |
01:28:17.440
be for the particular thing you have to care about. But, you know, those who do not remember...
link |
01:28:20.560
Per our objective camera discussion, right?
link |
01:28:23.040
Yeah. Right. And, you know, we've already lost lots of history. And of course you have your
link |
01:28:26.800
own history, some of which is even lost to you, right? You don't even remember
link |
01:28:31.120
whatever it was you were doing 17 years ago.
link |
01:28:33.280
All the ex girlfriends.
link |
01:28:34.320
Yeah.
link |
01:28:34.640
Gone.
link |
01:28:35.680
Exactly. So, you know, history is being lost anyway, but the big lessons of history shouldn't
link |
01:28:41.280
be. And I think, you know, to take it to the question of computing and sort of education,
link |
01:28:46.080
the point is you have to get across those lessons. You have to get across the way of thinking.
link |
01:28:50.880
And you have to be able to go back and, you know, you don't want to lose the data,
link |
01:28:54.480
even if, you know, you don't necessarily have the information at your fingertips.
link |
01:28:57.680
With computing, I think it's somewhat different. Everyone doesn't have to learn how to code,
link |
01:29:02.640
but everyone needs to learn how to think in the way that you can be precise. And I mean,
link |
01:29:07.680
precise in the sense of repeatable, not, you know, in the sense of resolution, not in the sense
link |
01:29:13.840
of getting the right number of bits, um, but precise in saying what it is you want the machine to do and being
link |
01:29:19.680
able to describe a problem in such a way that it is executable, which we human beings are
link |
01:29:26.480
not very good at. In fact, I think we spend much of our time talking back and forth just to
link |
01:29:30.160
kind of vaguely understand what the other person means and hope we get it good enough that
link |
01:29:34.080
we can act accordingly. Um, you can't do that with machines, at least not yet. And so,
link |
01:29:39.440
you know, having to think that precisely about things is quite important. And that's somewhat
link |
01:29:45.840
different from coding. Coding is a crude means to an end. On the other hand, the idea of coding,
link |
01:29:53.600
what that means, that there's a programming language and it has these sorts of things that you fiddle
link |
01:29:57.760
with in these ways that you express. That is an incredibly important point. In fact, I would argue
link |
01:30:01.920
that one of the big holes in machine learning right now, and in AI, is that we forget that we are
link |
01:30:07.120
basically doing software engineering. We forget that we are, um, using programming,
link |
01:30:13.280
like we're using languages to express what we're doing. We just get so caught up in the deep
link |
01:30:16.720
network or we get all caught up in whatever that we forget that, you know, we're making decisions
link |
01:30:22.560
based upon a set of parameters that we made up. And if we chose slightly different parameters,
link |
01:30:26.880
we'd have completely different outcomes. And so the lesson of computing,
link |
01:30:30.400
of computer science education, is to be able to think like that and to be aware of it when you're doing
link |
01:30:36.480
it. Basically, it's, you know, at the end of the day, it's a way of, um, surfacing your assumptions.
link |
01:30:41.120
I mean, we call them parameters or, you know, we call them if then statements or whatever,
link |
01:30:45.760
but you're forced to surface those assumptions. That's the key thing that
link |
01:30:50.080
you should get out of a computing education. That, and that the models and languages and
link |
01:30:53.040
machines are equivalent. But it actually follows from that that you have to be explicit
link |
01:30:58.400
about what it is you're trying to do because the model you're building is something you will one
link |
01:31:02.800
day run. So you better get it right, or at least understand it and be able to express roughly
link |
01:31:08.480
what you want to express. So I think it is key that we figure out how to educate everyone to
link |
01:31:17.200
think that way, because in the end, it would not only make them better at whatever it is that they
link |
01:31:23.440
are doing, and I emphasize doing, it'll also make them better citizens. It'll help them to understand
link |
01:31:30.480
what others are doing to them so that they can react accordingly. Cause you're not going to
link |
01:31:35.920
solve the problem of social media in so far as you think of social media as a problem
link |
01:31:40.720
by just making slightly better code, right? It only works if people react to it
link |
01:31:46.320
appropriately and know what's happening and therefore take control over what they're doing.
link |
01:31:52.160
I mean, that's my take on it.
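(To make the parameters point concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not something from the conversation: the same tiny gradient descent loop run twice, differing only in one made-up parameter, with qualitatively different outcomes.)

# A minimal sketch (hypothetical, illustrative only): gradient descent on
# f(x) = x^2. The only difference between the two runs is one parameter
# we made up, the learning rate, yet the outcomes are qualitatively
# different. The parameter is an assumption the code forces us to surface.

def gradient_descent(learning_rate, steps=25):
    x = 5.0  # the starting point is another assumption we had to write down
    for _ in range(steps):
        grad = 2.0 * x             # gradient of f(x) = x^2
        x -= learning_rate * grad  # the update rule itself is a choice
    return x

for lr in (0.1, 1.1):  # two choices of the same made-up parameter
    print(f"learning_rate={lr}: final x = {gradient_descent(lr):.4g}")

# learning_rate=0.1 converges toward the minimum at x = 0;
# learning_rate=1.1 diverges, because each step multiplies x by
# (1 - 2 * learning_rate), and |1 - 2.2| > 1.

(Either run is "the" algorithm; which outcome you get was decided entirely by an assumption that writing the program forced us to make explicit.)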
link |
01:31:53.920
Okay. Let me try to proceed awkwardly into the topic of race.
link |
01:32:00.000
Okay.
link |
01:32:00.560
One is because it's a fascinating part of your story and you're just eloquent and fun about it.
link |
01:32:05.440
And then the second is because we're living through a pretty tense time in terms of race,
link |
01:32:12.560
tensions and discussions and ideas in America. You grew up in Atlanta,
link |
01:32:20.880
not born in Atlanta. In some Southern state, somewhere in Tennessee, something like that?
link |
01:32:24.960
Tennessee.
link |
01:32:25.600
Nice. Okay. But early on you moved, and you basically identify as an Atlanta native.
link |
01:32:34.480
Mm hmm. Yeah. And you've mentioned that you grew up in a predominantly black neighborhood,
link |
01:32:42.320
by the way, black, African American, person of color?
link |
01:32:44.960
I prefer black.
link |
01:32:46.000
Black.
link |
01:32:46.400
With a capital B.
link |
01:32:47.520
With a capital B. The other letters are...
link |
01:32:50.720
The rest of them don't matter.
link |
01:32:54.640
Okay. So the predominantly black neighborhood. And so you almost didn't see race. Maybe you
link |
01:32:59.600
can correct me on that. And then just in the video you talked about when you showed up to
link |
01:33:05.200
Georgia Tech for your undergrad, you were one of the only black folks there. And that was like,
link |
01:33:12.640
oh, that was a new experience. So can you take me from just a human perspective,
link |
01:33:19.280
but also from a race perspective, your journey growing up in Atlanta
link |
01:33:23.120
and then showing up at Georgia Tech?
link |
01:33:24.800
Okay. That's easy. And by the way, that story continues through MIT as well.
link |
01:33:28.160
Yeah. In fact, it was quite a bit more stark at MIT and Boston.
link |
01:33:32.960
So maybe just a quick pause, Georgia Tech was undergrad, MIT was graduate school.
link |
01:33:37.920
Mm hmm. And I went directly to grad school from undergrad. So I had no
link |
01:33:42.240
distractions in between my bachelor's and my master's and PhD.
link |
01:33:45.600
You didn't go on a backpacking trip in Europe?
link |
01:33:47.760
Didn't do any of that. In fact, I literally went to IBM for three months, got in a car,
link |
01:33:52.640
and drove straight to Boston, well, Cambridge, with my mother.
link |
01:33:55.600
Yeah.
link |
01:33:55.840
I moved into an apartment I'd never seen over the Royal East. Anyway, that's another story.
link |
01:34:02.080
So let me tell you a little bit about it.
link |
01:34:03.440
You miss MIT?
link |
01:34:04.880
Oh, I loved MIT. I don't miss Boston at all, but I loved MIT.
link |
01:34:09.040
That was fighting words.
link |
01:34:11.120
So let's back up to this. So as you said, I was born in Chattanooga, Tennessee.
link |
01:34:14.800
My earliest memory is arriving in Atlanta in a moving truck at the age of three and a half.
link |
01:34:18.080
So I think of myself as being from Atlanta, very distinct memory of that. So I grew up in Atlanta.
link |
01:34:22.400
It's the only place I ever knew as a kid. I loved it. Like much of the country, and certainly
link |
01:34:28.560
much of Atlanta in the 70s and 80s, it was deeply, highly segregated, though not in a way that I
link |
01:34:34.240
think was obvious to you unless you were looking at it or were old enough to have noticed it.
link |
01:34:39.680
But you could divide up Atlanta, and Atlanta is hardly unique in this way, by highway,
link |
01:34:43.360
and you could get race and class that way. So I grew up not only in a predominantly
link |
01:34:47.920
black area, to say the very least, I grew up on the poor side of that. But I was very much aware
link |
01:34:55.920
of race for a bunch of reasons, one that people made certain that I was, my family did, but also
link |
01:35:01.200
that it would come up. So in first grade, I had a girlfriend. I say I had a girlfriend. I didn't
link |
01:35:08.320
have a girlfriend. I wasn't even entirely sure what girls were in the first grade. But I do remember
link |
01:35:13.120
she decided I was her boyfriend, a little white girl named Heather. And we had a long discussion
link |
01:35:17.680
about how it was okay for us to be boyfriend and girlfriend, despite the fact that she was white
link |
01:35:21.440
and I was black. Between the two of you? Did your parents know about this?
link |
01:35:26.400
Yes. But being a girlfriend and boyfriend in first grade just basically meant that you spent
link |
01:35:31.120
slightly more time together during recess. I think we Eskimo kissed once. It didn't mean anything.
link |
01:35:38.160
It was. At the time, it felt very scandalous because everyone was watching. I was like,
link |
01:35:41.440
ah, my life has now changed in first grade. No one told me elementary school would be
link |
01:35:45.920
like this. Did you write poetry or not in first grade? That would come later. That would come
link |
01:35:50.400
during puberty when I wrote lots and lots of poetry. Anyway, so I was aware of it. I didn't
link |
01:35:56.400
think too much about it, but I was aware of it. But I was surrounded. It wasn't that I wasn't
link |
01:36:01.280
aware of race. It's that I wasn't aware that I was a minority. It's different. And it's because I
link |
01:36:07.920
wasn't, as far as my world was concerned. I mean, I'm six years old, five years old in first grade.
link |
01:36:12.880
The world is the seven people I see every day. So it didn't feel that way at all.
link |
01:36:17.840
And by the way, this being Atlanta, home of the civil rights movement and all the rest,
link |
01:36:21.840
it meant that when I looked at TV, which back then one did because there were only three,
link |
01:36:25.520
four or five channels. And I saw the news, which my mother might make me watch. Monica Kaufman was
link |
01:36:33.520
on TV telling me the news and they were all black and the mayor was black and had always been
link |
01:36:37.680
black. And so it just never occurred to me. When I went to Georgia Tech, I remember the first day
link |
01:36:43.520
walking across campus from West campus to East campus and realizing along the way that of the
link |
01:36:49.360
hundreds and hundreds and hundreds and hundreds of students that I was seeing, I was the only black
link |
01:36:53.200
one. That was enlightening and very off-putting because it occurred to me. And then of course,
link |
01:36:59.920
it continued that way for, well, for much of the rest of my career at Georgia
link |
01:37:05.440
Tech. Of course, I found lots of other students and I met people cause in Atlanta, you're either
link |
01:37:09.440
black or you're white. There was nothing else. So I began to meet students of Asian descent and I
link |
01:37:14.560
met students who we would call Hispanic and so on and so forth. And you know, so my world,
link |
01:37:18.320
this is what college is supposed to do, right? It's supposed to open you up to people. And it
link |
01:37:22.320
did, but it was a very strange thing to be in the minority. When I came to Boston, I will tell you
link |
01:37:31.200
a story. I applied to one place as an undergrad, Georgia Tech, because I was stupid. I didn't know
link |
01:37:38.320
any better. I just didn't know any better, right? No one told me. When I went to grad school,
link |
01:37:43.360
I applied to three places, Georgia Tech, because that's where I was, MIT and CMU.
link |
01:37:49.600
When I got into MIT, I had also gotten into CMU, but I had a friend who went to CMU. And so I asked him what
link |
01:37:58.480
he thought about it. He spent his time explaining to me about Pittsburgh, much less about CMU,
link |
01:38:03.200
but more about Pittsburgh, about which I developed a strong opinion based upon his strong opinion,
link |
01:38:07.760
something about the sun coming out two days out of the year. And I didn't get a chance to go there
link |
01:38:12.240
because the timing was wrong. I think it was because the timing was wrong. At MIT, I asked
link |
01:38:19.120
20 people I knew, either from when I visited or whom I had already known for a variety of reasons,
link |
01:38:24.480
whether they liked Boston. And 10 of them loved it, and 10 of them hated it. The 10 who loved it
link |
01:38:30.160
were all white. The 10 who hated it were all black. And they explained to me very much why
link |
01:38:35.040
that was the case. Both sets told me why. And the stories were remarkably the same for the
link |
01:38:41.040
two clusters. And I came up here, and I could see it immediately, why people would love it
link |
01:38:46.560
and why people would not. And why people tell you about the nice coffee shops.
link |
01:38:50.720
Well, it wasn't coffee shops. It was used CD places. But yeah, it was that kind of a thing.
link |
01:38:55.840
Nice shops. Oh, there's all these students here. Harvard Square is beautiful. You can do all these
link |
01:39:00.000
things, and you can walk. And something about the outdoors, which I wasn't the slightest bit
link |
01:39:02.720
interested in. The outdoors is for the bugs. It's not for humans.
link |
01:39:08.400
That should be a t-shirt.
link |
01:39:09.680
Yeah, that's the way I feel about it. And the black folk told me completely different stories
link |
01:39:14.800
about which part of town you did not want to be caught in after dark. But that was nothing new.
link |
01:39:22.000
So I decided that MIT was a great place to be as a university. And I believed it then,
link |
01:39:27.600
I believe it now. And that whatever it is I wanted to do, I thought I knew what I wanted to do,
link |
01:39:32.160
but what if I was wrong? Someone there would know how to do it. Of course, then I would pick the
link |
01:39:36.960
one topic that nobody was working on at the time, but that's okay. It was great. And so I thought
link |
01:39:42.320
that I would be fine. And I'd only be there for like four or five years. I told myself,
link |
01:39:46.480
which turned out not to be true at all. But I enjoyed my time. I enjoyed my time there.
link |
01:39:50.400
But I did see a lot of... I ran across a lot of things that were driven by what I looked like
link |
01:39:58.400
while I was here. I got asked a lot of questions. I ran into a lot of cops. I saw a lot about the
link |
01:40:05.920
city. But at the time, I mean, I haven't been here in a long time. These are the things that I
link |
01:40:09.120
remember. So this is 1990. There was not a single black radio station. Now this is 1990. I don't
link |
01:40:17.920
know if there are any radio stations anymore. I'm sure there are, but I don't listen to the radio
link |
01:40:21.920
anymore and almost no one does, at least if you're under a certain age. But the idea that you could be
link |
01:40:27.520
in a major metropolitan area and there wasn't a single black radio station, by which I mean,
link |
01:40:30.800
a radio station that played what we would call black music then, was absurd, but it somehow captured kind
link |
01:40:37.840
of everything about the city. I grew up in Atlanta and you've heard me tell you about Atlanta.
link |
01:40:44.960
Boston had no economically viable or socially cohesive black middle class.
link |
01:40:51.840
Insofar as it existed, it was uniformly distributed throughout large parts, not all parts,
link |
01:40:56.480
but large parts of the city. And where you had concentrations of black Bostonians,
link |
01:41:02.160
they tended to be poor. It was very different from where I grew up. I grew up on the poor side of
link |
01:41:07.040
town, sure. But then in high school, well, in ninth grade, we didn't have middle school. I went
link |
01:41:13.040
to an eighth grade school where there was a lot of, let's just say, we had a riot the year that
link |
01:41:17.200
I was there. There was at least one major fight every week. It was an amazing experience. But
link |
01:41:24.800
when I went to ninth grade, I went to an academy, a math and science academy, Mays High. It was a
link |
01:41:30.640
public school. It was a magnet school. That's why I was able to go there. It was the first high school,
link |
01:41:36.240
I think, in the state of Georgia to sweep the state math and science fairs. It was great. It had
link |
01:41:44.080
385 students, all but four of whom were black. I went to school with the daughter of the
link |
01:41:51.200
former mayor of Atlanta, with Michael Jackson's cousin. I mean, it was upper middle class.
link |
01:41:56.560
Dropping names.
link |
01:41:57.120
You know, I just drop names occasionally.
link |
01:41:59.520
You know, drop the mic, drop some names. Just to let you know, I used to hang out with Michael
link |
01:42:03.120
Jackson's cousin, 12th cousin, nine times removed. I don't know. The point is, they had money. We
link |
01:42:07.680
had a parking problem because the kids had cars. I did not come from a place where you had cars.
link |
01:42:12.160
I had my first car when I came to MIT, actually. So, it was just a very different experience for
link |
01:42:21.280
me. But I'd been to places where whether you were rich or whether you were poor, you know,
link |
01:42:26.000
you could be black and rich or black and poor. And it was there and there were places and they
link |
01:42:29.200
were segregated by class as well as by race. But that existed. Here, at least when I was here,
link |
01:42:35.520
didn't feel that way at all. And it felt like a bunch of really interesting contradictions.
link |
01:42:41.280
It felt like it was the interracial dating capital of the country. It really felt that way.
link |
01:42:49.360
But it also felt like the most racist place I'd ever spent any time in. You know, you couldn't go
link |
01:42:55.120
up the Orange Line at that time. I mean, again, that was 30 years ago. I don't know what it's
link |
01:42:59.360
like now. But there were places you couldn't go. And you knew it. Everybody knew it. And there were
link |
01:43:05.520
places you couldn't live. And everybody knew that. And that was just the greater Boston area in 1992.
link |
01:43:12.720
Subtle racism or explicit racism?
link |
01:43:14.880
Both.
link |
01:43:16.720
In terms of within the institutions, did you feel...
link |
01:43:19.440
Were there levels at which you were empowered to be first or one of the first black people in a
link |
01:43:26.080
particular discipline in some of these great institutions that you were a part of? You know,
link |
01:43:31.360
Georgia Tech or MIT? And was there a part where it felt limiting?
link |
01:43:37.760
I always felt empowered. Some of that was my own delusion, I think. But it worked out. So I never
link |
01:43:45.120
felt... In fact, quite the opposite. Not only did I not feel as if anyone was trying to stop me,
link |
01:43:52.240
I had the distinct impression that people wanted me to succeed. By people, I meant the people in
link |
01:43:57.760
power. Not my fellow students. Not that they didn't want me to succeed. But I felt supported,
link |
01:44:04.960
or at least that people were happy to see me succeed at least as much as anyone else. But,
link |
01:44:10.480
you know, 1990, you're dealing with a different set of problems. You're very early, at least in
link |
01:44:15.280
computer science, you're very early in the Jackie Robinson period. There's this thing called the
link |
01:44:20.720
Jackie Robinson syndrome, which is that the first one has to be perfect or has to be sure to
link |
01:44:27.120
succeed because if that person fails, no one else comes after for a long time. So it was kind of in
link |
01:44:32.960
everyone's best interest. But I think it came from a sincere place. I'm completely sure that people
link |
01:44:37.040
went out of their way to try to make certain that the environment would be good. Not just for me,
link |
01:44:43.200
but for the other people who, of course, were around. And I was the only person in the AI Lab,
link |
01:44:47.200
but I wasn't the only person at MIT by a long shot. On the other hand, we're what?
link |
01:44:53.120
At that point, we would have been, what, less than 20 years away from the first black PhD to
link |
01:44:57.760
graduate from MIT, right? Shirley Jackson, right? 1971, something like that? Somewhere around then.
link |
01:45:03.840
So we weren't that far away from the first first, and we were still another eight years away from
link |
01:45:09.680
the first black PhD in computer science, right? So it was a sort of interesting time. But I did
link |
01:45:16.880
not feel as if the institutions of the university were against any of that. And furthermore, I felt
link |
01:45:25.280
as if there was enough of a critical mass across the institute from students and probably faculty
link |
01:45:30.960
that I didn't know, who wanted to make certain that the right thing happened. It was very
link |
01:45:35.680
different from the institutions of the rest of the city, which I think were designed in such a way
link |
01:45:41.600
that they felt no need to be supportive.
link |
01:45:44.000
Let me ask a touchy question on that. So you kind of said that you didn't feel,
link |
01:45:52.640
you felt empowered. Is there some lesson, advice, in the sense that no matter what,
link |
01:46:00.160
you should feel empowered? You said, you used the word, I think, illusion or delusion.
link |
01:46:05.920
Is there a sense from the individual perspective where you should always kind of ignore, you know,
link |
01:46:14.480
the, ignore your own eyes, ignore the little forces that you are able to observe around you,
link |
01:46:27.120
that are like trying to mess with you, whether it's jealousy, whether it's hatred in its pure
link |
01:46:33.040
form, whether it's just hatred in its like deluded form, all that kind of stuff?
link |
01:46:38.960
And just kind of see yourself as empowered and confident and all those kinds of things.
link |
01:46:44.480
I mean, it certainly helps, but there's a trade off, right? You have to be deluded enough
link |
01:46:47.680
to think that you can succeed. I mean, you can't get a PhD unless you're crazy enough to think you
link |
01:46:51.360
can invent something that no one else has come up with. It takes that kind of massive delusion. You
link |
01:46:56.320
have to be deluded enough to believe that you can succeed despite whatever odds you see
link |
01:46:59.840
in front of you, but you can't be so deluded that you don't think that you need to step out of
link |
01:47:03.280
the way of the oncoming train, right? So it's all a trade off, right? You have to kind of believe in
link |
01:47:11.760
yourself. It helps to have a support group around you in some way or another. I was able to find
link |
01:47:16.800
that, I've been able to find that wherever I've gone, even if it wasn't necessarily on the floor
link |
01:47:21.440
that I was in, I had lots of friends when I was here. Many of them still live here. And I've kept
link |
01:47:26.080
up with many of them. So I felt supported. And certainly I had my mother and my family and those
link |
01:47:30.800
people back home that I could always lean back on, even if it were a long distance call that cost
link |
01:47:37.200
money, which is not something that any of the kids today even know what I'm talking about. But
link |
01:47:41.680
back then it mattered, calling my mom was an expensive proposition. But you have that and
link |
01:47:45.920
it's fine. I think it helps. But you cannot be so deluded that you miss the obvious because it makes
link |
01:47:51.280
things slower and it makes you think you're doing better than you are and it will hurt you in the
link |
01:47:55.920
long run. You mentioned cops. You tell a story of being pulled over. Perhaps it happened more than
link |
01:48:04.400
once. More than once, for sure. One, could you tell that story? And in general, can you give me
link |
01:48:11.040
a sense of what the world looks like when the law doesn't always look at you with a blank slate?
link |
01:48:22.640
With a blank slate, with objective eyes? I don't know how to say it more poetically.
link |
01:48:33.840
Well, I don't either. I guess the answer is it looks exactly the way it looks now
link |
01:48:39.760
because this is the world that we happen to live in, right? It's people clustering and doing the
link |
01:48:44.640
things that they do and making decisions based on one or two bits of information they find
link |
01:48:50.880
relevant, which, by the way, are all positive feedback loops, which makes it easier for you
link |
01:48:56.080
to believe what you believed before because you behave in a certain way that makes it true and
link |
01:48:59.520
it goes on in circles and cycles and cycles. So it's just about being on edge.
link |
01:49:06.960
I do not, despite having made it over 50 now.
link |
01:49:11.760
Congratulations, brother.
link |
01:49:13.520
God, I have a few gray hairs here and there.
link |
01:49:16.080
You did pretty good.
link |
01:49:16.800
I think, I don't imagine I will ever see a police officer and not get very, very tense.
link |
01:49:25.600
Now, everyone gets a little tense because it probably means you're being pulled over for
link |
01:49:30.560
speeding or something, or you're going to get a ticket or whatever, right? I mean,
link |
01:49:34.400
the interesting thing about the law in general is that most human beings experience of it is
link |
01:49:39.280
fundamentally negative, right? You're only dealing with a lawyer if you're in trouble,
link |
01:49:43.920
except in a few very small circumstances, right? So that's an underlying reality.
link |
01:49:49.680
Now, imagine that that's also at the hands of the police officer. I remember the time when I got
link |
01:49:55.120
pulled over, halfway between Boston and Wellesley, actually. I remember thinking
link |
01:50:04.160
when he pulled his gun on me that if he shot me right now, he'd get away with it. That was
link |
01:50:11.360
the worst thing that I felt about that particular moment: that if he shoots me now,
link |
01:50:16.480
he will get away with it. It would be years later when I realized that actually much worse than that
link |
01:50:24.480
is that he'd get away with it. And if it became a thing that other people knew about,
link |
01:50:30.880
odds would be, of course, that it wouldn't. But if it became a thing that other people knew about,
link |
01:50:34.240
if I was living in today's world as opposed to the world 30 years ago, not only would he
link |
01:50:39.680
get away with it, but that I would be painted a villain. I was probably big and scary, and I
link |
01:50:45.360
probably moved too fast, and if only I'd done what he said, and da, da, da, da, da, da, da,
link |
01:50:49.040
which is somehow worse, right? You know, that hurts not just you, you're dead, but your family,
link |
01:50:55.760
and the way people look at you, and look at your legacy or your history, that's terrible.
link |
01:51:00.960
And it would work. I absolutely believe it would have worked had he done it. Now, he didn't. I
link |
01:51:05.360
don't think he wanted to shoot me. I don't think he felt like killing anybody. He did not go out
link |
01:51:08.640
that night expecting to do that or planning on doing it, and I wouldn't be surprised if he never,
link |
01:51:12.880
ever did that or ever even pulled his gun again. I don't know the man's name. I don't remember
link |
01:51:16.800
anything about him. I do remember the gun. Guns are very big when they're in your face. I can tell
link |
01:51:20.320
you this much. They're much larger than they seem. But... And you're basically like speeding or
link |
01:51:24.480
something like that? He said I ran a light, I think. You ran a light. I don't think I ran a
link |
01:51:28.400
light, but you know, in fact, I may not have even gotten a ticket. I may have just gotten a warning.
link |
01:51:33.040
I think he was a little... But he pulled a gun. Yeah. Apparently I moved too fast or something.
link |
01:51:37.920
Rolled my window down before I should have. It's unclear. I think he thought I was going to do
link |
01:51:42.240
something, or at least that's how he behaved. So how, if we can take a little walk around your
link |
01:51:48.640
brain, how do you feel about that guy and how do you feel about cops after that experience?
link |
01:51:58.880
Well, I don't remember that guy, but my view on police officers is the same view I have about
link |
01:52:03.680
lots of things. Fire is an important and necessary thing in the world, but you must respect fire
link |
01:52:12.560
because it will burn you. Fire is a necessary evil: evil in the sense that it can burn you, necessary
link |
01:52:19.360
in the sense that, you know, heat and all the other things that we use fire for. So when I see
link |
01:52:25.280
a cop, I see a giant ball of flame and I just try to avoid it. And then some people might see
link |
01:52:33.040
a nice flame, a nice thing to roast marshmallows over with a family.
link |
01:52:37.760
Which is fine, but I don't roast marshmallows.
link |
01:52:40.880
Okay. So let me go a little dark and I apologize. Just talked to Dan Carlin about
link |
01:52:44.160
Hitler for four hours. So sorry if I go dark here a little bit, but
link |
01:52:50.800
is it easy for this experience of just being careful with the fire and avoiding it to turn
link |
01:52:57.200
to hatred? Yeah, of course. And one might even argue that it is a logical conclusion, right?
link |
01:53:05.840
On the other hand, you've got to live in the world and I don't think it's helpful. Hate is something
link |
01:53:12.560
one should, I mean, hate is something that takes a lot of energy. So one should reserve it for
link |
01:53:20.000
when it is useful and not carry it around with you all the time. Again, there's a big difference
link |
01:53:25.360
between the happy delusion that convinces you that you can actually get out of bed and
link |
01:53:30.400
make it to work today without getting hit by a car, and the sad delusion that lets you
link |
01:53:36.720
not worry about this car that is barreling towards you, right? So we all have to be a
link |
01:53:40.640
little deluded because otherwise we're paralyzed, right? But one should not be ridiculous.
link |
01:53:46.240
If we go all the way back to something you said earlier about empathy,
link |
01:53:49.360
I think what I would ask other people to get out of this, one of many, many, many stories,
link |
01:53:57.600
is to recognize that it is real. People would ask me to empathize with the police officer.
link |
01:54:04.560
I would quote back statistics saying that being a police officer isn't even in the top 10 most
link |
01:54:11.040
dangerous jobs in the United States, you're much more likely to get killed in a taxicab.
link |
01:54:14.960
Half of police officer deaths are actually suicides, but that means their lives are such that
link |
01:54:22.960
something's going on there with them, and I would be more than happy to be empathetic about what it is
link |
01:54:28.240
they go through and how they see the world. I think though that if we step back from what I feel,
link |
01:54:34.720
if we step back from what an individual police officer feels, you step up a level and all this,
link |
01:54:39.840
because all things tie back into interactive AI. The real problem here is that we've built a
link |
01:54:44.560
narrative. We built a big structure that has made it easy for people to put themselves into different
link |
01:54:50.240
pots in the different clusters and to basically forget that the people in the other clusters are
link |
01:54:57.440
ultimately like them. It is a useful exercise to ask yourself sometimes, I think: if I had grown
link |
01:55:03.840
up in a completely different house, in a completely different household, as a completely different
link |
01:55:07.760
person, if I had been a woman, would I see the world differently? Would I believe what that crazy
link |
01:55:12.320
person over there believes? And the answer is probably yes, because after all, they believe it.
link |
01:55:19.440
And fundamentally, they're the same as you. So then what can you possibly do to fix it? How do
link |
01:55:25.440
you fix Twitter? If you think Twitter needs to be broken or Facebook, if you think Facebook is
link |
01:55:29.760
broken, how do you fix racism? How do you fix any of these things? That's all structural.
link |
01:55:35.280
I mean, individual conversations matter a lot, but you have to create structures that allow people
link |
01:55:42.320
to have those individual conversations all the time in a way that is relatively safe and that
link |
01:55:47.760
allows them to understand that other people have had different experiences, but that ultimately
link |
01:55:51.600
we're the same, which sounds very, I don't even know what the right word is. I'm trying to avoid
link |
01:55:57.200
a word like saccharine, but it feels very optimistic.
link |
01:56:01.760
But I think that's okay. I think that's a part of the delusion, is you want to be a little
link |
01:56:06.640
optimistic and then recognize that the hard problem is actually setting up the structures
link |
01:56:10.160
in the first place, because it's in almost no one's interest to change the infrastructure.
link |
01:56:16.320
Right. I tend to believe that leaders have a big role in that, of selling that optimistic
link |
01:56:22.560
delusion to everybody, and that eventually leads to the building of the structures. But that
link |
01:56:27.600
requires a leader that unites, sort of unites everybody on a vision, as opposed to dividing
link |
01:56:33.600
on a vision, which is, this particular moment in history feels like there's a nonzero probability,
link |
01:56:43.600
if we compute the probability, of something akin to a violent or a nonviolent civil war. This is one of the
link |
01:56:51.440
most divisive periods of recent American history. You can speak to this from perhaps
link |
01:56:58.800
a more knowledgeable and deeper perspective than me, but from my naive perspective, this seems like
link |
01:57:03.600
a very strange time. There's a lot of anger, and it has to do with people, I mean, for many reasons.
link |
01:57:11.440
One, the thing that's not spoken about, I think, much is the conflict of opinion,
link |
01:57:18.480
is the quiet economic pain of millions that's growing because of COVID, because of closed
link |
01:57:29.040
businesses, because of like lost dreams. So that's building, whatever that tension is building.
link |
01:57:35.440
The other is, there seems to be an elevated level of emotion. I'm not sure if you can psychoanalyze
link |
01:57:41.680
where that's coming from, but this sort of, from which the protests and so on percolated. It's like,
link |
01:57:47.440
why now? Why this particular moment in history? Oh, because time, enough time has passed, right?
link |
01:57:52.080
I mean, you know, the very first race riots were in Boston, not to draw anything from that.
link |
01:57:56.800
Really? When? Oh, this is before like... Going way, I mean, like the 1700s or whatever,
link |
01:58:01.200
right? I mean, there was a massive one in New York. I mean, I'm talking way, way, way back when.
link |
01:58:05.280
So Boston used to be the hotbed of riots. It's just what Boston was all about,
link |
01:58:09.920
or so I'm told from history class. There's an interesting one in New York. I remember when
link |
01:58:15.120
that was. Anyway, the point is, you know, basically you got to get another generation,
link |
01:58:22.960
old enough to be angry, but not so old as to remember what happened the last time, right?
link |
01:58:28.160
And that's sort of what happens. But, you know, you said two things
link |
01:58:33.120
there that I think are worth unpacking. One has to do with this sort of moment in time.
link |
01:58:38.320
And, you know, why? Why has this sort of built up? And the other has to do with a kind of, you know,
link |
01:58:43.760
sort of the economic reality of COVID. So I'm actually, I want to separate those things because,
link |
01:58:47.600
for example, you know, this happened before COVID happened, right? So let's separate these two
link |
01:58:54.160
things for a moment. Now, let me preface all this by saying that although I am interested in history,
link |
01:59:01.920
one of my three minors as an undergrad was history, specifically the history of the 1960s. Interesting. The
link |
01:59:07.520
other was Spanish. And, okay, that's a mistake. Oh, I loved that. And history of Spanish and Spanish
link |
01:59:14.160
history, actually, but Spanish. And the other was what we would now call cognitive science. But at
link |
01:59:17.760
the time, that's fascinating. Interesting. I minored in CogSci here for grad school. That was
link |
01:59:24.960
really fascinating. It was a very different experience. I mean, it was
link |
01:59:29.440
a very different experience from all the computer science classes
link |
01:59:33.600
I'd been taking, even the CogSci classes I was taking as an undergrad. Anyway, I'm interested
link |
01:59:42.400
in history, but I'm hardly a historian, right? So, you know, forgive my, I will ask the audience to
link |
01:59:48.720
forgive my simplification. But I think the question that's always worth asking, as opposed, it's the
link |
01:59:58.640
same question, but a little different. Not why now, but why not before? Right? So why the 1950s,
link |
02:00:08.480
60s civil rights movement as opposed to the 1930s, 1940s? Well, first off, there was a civil
link |
02:00:12.400
rights movement in the 30s and 40s. It just wasn't of the same character or quite as well known. Post
link |
02:00:17.440
World War II, lots of interesting things were happening. It's not as if a switch was turned on
link |
02:00:22.560
with Brown versus Board of Education or the Montgomery bus boycott, and that's when it
link |
02:00:27.840
happened. These things had been building up forever and go all the way back and all the way back and
link |
02:00:30.960
all the way back. And, you know, Harriet Tubman was not born in 1950, right? So, you know, we can
link |
02:00:35.120
take these things. It could have easily happened right after World War II. Yes. I think,
link |
02:00:42.640
and again, I'm not a scholar. I think that the big difference was TV. These things are visible.
link |
02:00:50.800
People can see them. It's hard to avoid, right? Why not James Farmer? Why Martin Luther King? Because
link |
02:00:59.200
one was born 20 years after the other, whatever. I think it turns out that, you know what King's
link |
02:01:06.000
biggest failure was in the early days? It was in Georgia. They were doing the usual thing,
link |
02:01:13.120
trying to integrate. And I forget the guy's name, but you can look this up. He was a cop,
link |
02:01:21.680
a sheriff, and he made a deal with the whole state of Georgia: we're going to take people and we are
link |
02:01:26.080
going to nonviolently put them in trucks. And then we're going to take them and put them in jails
link |
02:01:30.800
very far away from here. And we're going to do that, and there'll be no
link |
02:01:35.760
reason for the press to hang around. And they did that and it worked. And the press left and
link |
02:01:41.440
nothing changed. So next they went to Birmingham, Alabama, and Bull Connor. And you got to see on
link |
02:01:48.320
TV, little boys and girls being hit with fire hoses and being knocked down. And there was
link |
02:01:53.920
outrage and things changed, right? Part of the delusion is pretending that nothing bad is
link |
02:01:59.840
happening that might force you to do something big you don't want to do. But sometimes it gets
link |
02:02:03.360
put in your face and then you kind of can't ignore it. And a large part, in my view, of what happened
link |
02:02:08.640
right was that it was too public to ignore. Now we created other ways of ignoring it.
link |
02:02:14.480
Lots of change happened in the South, but part of that delusion was that it wasn't going to affect
link |
02:02:17.680
the West or the Northeast. And of course it did. And that caused its own set of problems, which
link |
02:02:21.760
went into the late sixties into the seventies. And, you know, in some ways we're living with
link |
02:02:25.200
that legacy now and so on. So why is this happening now? Why didn't it happen 10 years ago?
link |
02:02:32.400
I think it's that people have more voices. There's not just more TV, there's social media. It's very easy
link |
02:02:38.000
for these things to kind of build on themselves and things are just quite visible. And there's
link |
02:02:44.960
demographic change. I mean, the world is changing rapidly, right? And so it's very difficult.
link |
02:02:49.040
You're now seeing people you could have avoided seeing most of your life growing up in a particular
link |
02:02:52.560
time. And it's happening, it's dispersing at a speed that is fast enough to cause
link |
02:02:58.720
concern for some people, but not so fast as to cause massive negative reaction. So that's that.
link |
02:03:06.000
On the other hand, and again, that's a massive oversimplification, but I think there's something
link |
02:03:10.400
there anyway, at least something worth exploring. I'm happy to be yelled at by a real historian.
link |
02:03:13.920
Oh yeah. I mean, there's just the obvious thing. I mean, I guess you're implying, but not
link |
02:03:19.520
saying this. I mean, it seemed to have percolated the most with just a single video, for example,
link |
02:03:24.480
the George Floyd video. It's fascinating to think that whatever the mechanisms that put injustice
link |
02:03:34.000
in front of our face, not like directly in front of our face, those mechanisms are the mechanisms
link |
02:03:42.080
of change. Yeah. On the other hand, Rodney King. So no one remembers this. I seem to be the only
link |
02:03:46.720
person who remembers this, but sometime before the Rodney King incident, there was a guy who
link |
02:03:51.360
was a police officer who was saying that things were really bad in Southern California. And he
link |
02:03:58.080
was going to prove it by having some news, some camera people follow him around. And he says,
link |
02:04:03.040
I'm going to go into these towns and just follow me for a week. And you will see that I'll get
link |
02:04:06.160
harassed. And like the first night he goes out there and he crosses into the city, some cops
link |
02:04:11.120
pull him over and he's a police officer. Remember, they don't know that. Of course they like shove
link |
02:04:15.840
his face through a glass window. This was on the news. Like, I distinctly remember watching this as
link |
02:04:20.640
a kid. Actually, I guess I wasn't a kid. I was in college, I was in grad school at the time.
link |
02:04:25.040
So that's not enough. Well, it disappeared like a day later. It didn't go viral.
link |
02:04:30.400
Yeah. Whatever that is, whatever that magic thing is.
link |
02:04:33.920
And whatever it was in 92, it was harder to go viral in 92, right? Or 91,
link |
02:04:38.320
actually it must've been 90 or 91, but that happened. And like two days later,
link |
02:04:42.000
it's like it never happened. Again, nobody remembers this, but I'm like the only person.
link |
02:04:45.280
Sometimes I think I must've dreamed it. Anyway, Rodney King happens. It goes viral
link |
02:04:50.000
or the moral equivalent thereof at the time. And eventually we get April 29th. And I don't know
link |
02:04:57.520
what the difference was between the two things, other than one thing caught on and one thing
link |
02:05:00.640
didn't. Maybe what's happening now is two things are feeding onto one another. One is more people
link |
02:05:07.280
are willing to believe. And the other is there's easier and easier ways to give evidence. Cameras,
link |
02:05:14.560
body cams or whatever, but we're still finding ourselves telling the same story. It's the same
link |
02:05:17.600
thing over and over again. I would invite you to go back and read the op eds, what people were
link |
02:05:22.640
saying about how violence is not the right answer, after Rodney King. And then go back to 1980 and
link |
02:05:28.400
the big riots that were happening around then and read the same op ed. It's the same words over and
link |
02:05:33.680
over and over again. I mean, there's your remembering history right there. I mean,
link |
02:05:37.760
it's like literally the same words. Like it could have just been copied. I'm surprised no one got
link |
02:05:40.960
flagged for plagiarism. It's interesting... do you have an opinion on the question of violence
link |
02:05:46.560
and the popular, perhaps caricatured, framing of Malcolm X versus Martin Luther King?
link |
02:05:53.120
You know, Malcolm X was older than Martin Luther King. People kind of have it in their head that
link |
02:05:57.280
he's younger. Well, he died sooner, but only by a few years. People think of MLK as the older
link |
02:06:05.040
statesman and they think of Malcolm X as the young, angry, whatever, but that's more of a
link |
02:06:10.320
narrative device. It's not true at all. I just reject the choice, as I think it's a
link |
02:06:18.960
false choice. I think they're just things that happen. You just do. As I said, hatred,
link |
02:06:23.360
it takes a lot of energy, but you know, every once in a while you have to fight.
link |
02:06:28.720
One thing I will say without taking a moral position, which I will not take on this matter,
link |
02:06:34.160
violence has worked.
link |
02:06:38.640
Yeah, that's the annoying thing.
link |
02:06:41.360
That's the annoying thing.
link |
02:06:43.600
It seems like over the top anger works. Outrage works. So you can say like being calm and rational
link |
02:06:53.600
and just talking it out is going to lead to progress. But it seems like if you just look
link |
02:06:59.600
through history, being irrationally upset is the way you make progress.
link |
02:07:06.640
Well, it's certainly the way that you get someone to notice you.
link |
02:07:09.680
Yeah.
link |
02:07:10.320
And if they don't notice you, I mean, what's the difference? And,
link |
02:07:13.760
again, without taking a moral position on this, I'm just trying to observe history here.
link |
02:07:17.040
If you, maybe if television didn't exist, the civil rights movement doesn't happen
link |
02:07:22.240
or it takes longer or it takes a very different form. Maybe if social media doesn't exist,
link |
02:07:27.120
a whole host of things, positive and negative don't happen. And what do any of those things
link |
02:07:33.440
do other than expose things to people? Violence is a way of shouting. I mean,
link |
02:07:40.880
many people far more talented and thoughtful than I have said this in one form or another,
link |
02:07:45.680
right? That violence is the voice of the unheard. It's a thing that people do when they feel as if
link |
02:07:54.160
they have no other option. And sometimes we agree and sometimes we disagree. Sometimes we think
link |
02:08:00.240
they're justified. Sometimes we think they are not, but regardless, it is a way of shouting.
link |
02:08:06.400
And when you shout, people tend to hear you, even if they don't necessarily hear the words
link |
02:08:10.240
that you're saying, they hear that you were shouting. I see no way. So another way of putting
link |
02:08:15.200
it, which is, let us just say, less provocative, but I think is true, is that all
link |
02:08:26.400
change, particularly change that impacts power, requires struggle. The struggle doesn't have to
link |
02:08:32.000
be violent, you know, but it's a struggle nonetheless. The powerful don't give up power
link |
02:08:38.880
easily. I mean, why should they? But even so, it still has to be a struggle. And by the way,
link |
02:08:45.840
this isn't just about, you know, violent political, whatever, nonviolent political
link |
02:08:49.360
change, right? This is true for understanding calculus, right? I mean, everything requires
link |
02:08:53.280
a struggle. We're back to talking about faculty hiring. At the end of the day,
link |
02:08:56.560
it all comes down to faculty hiring. It's all a metaphor. Faculty
link |
02:09:01.600
hiring is a metaphor for all of life. Let me ask a strange question. Do you think everything is
link |
02:09:10.400
going to be okay in the next year? Do you have a hope that we're going to be okay?
link |
02:09:16.640
I tend to think that everything's going to be okay because I just tend to think that everything's
link |
02:09:20.560
going to be okay. My mother says something to me a lot and always has, and I find it quite
link |
02:09:26.240
comforting, which is this too shall pass and this too shall pass. Now, this too shall pass is not
link |
02:09:32.080
just this bad thing is going away. Everything passes. I mean, I have a 16 year old daughter
link |
02:09:38.480
who's going to go to college probably in about 15 minutes, given how fast she seems to be growing
link |
02:09:43.920
up. And you know, I get to hang out with her now, but one day I won't. She'll ignore me just as much
link |
02:09:48.720
as I ignored my parents when I was in college and went to grad school. This too shall pass.
link |
02:09:52.160
But I think that one day, if we're all lucky, you live long enough to look back on something that
link |
02:09:57.600
happened a while ago, even if it was painful and mostly it's a memory. So yes, I think it'll be okay.
link |
02:10:06.080
What about humans? Do you think we'll make it through the 21st century?
link |
02:10:11.360
I certainly hope so.
link |
02:10:12.560
Are you worried that we might destroy ourselves with nuclear weapons, with AGI, with engineering?
link |
02:10:19.920
I'm not worried about AGI doing it, but I am worried. I mean,
link |
02:10:24.000
you know, at any given moment, a comet could, I mean, you know, whatever. I tend to think that
link |
02:10:28.640
outside of things completely beyond our control, we have a better chance than not of making it.
link |
02:10:36.800
You know, I talked to Alex Filippenko from Berkeley. He was talking about comets and
link |
02:10:41.920
that they can come out of nowhere. And that was a realization to me. Wow. We're just watching
link |
02:10:49.040
this darkness and they can just enter. And then we have less than a month.
link |
02:10:53.680
And yet you make it from day to day.
link |
02:10:57.760
That one shall not pass. Well, maybe for Earth it'll pass, but not for humans.
link |
02:11:02.160
But I'm just choosing to believe that it's going to be okay. And we're not going to get hit by
link |
02:11:08.720
an asteroid, at least not while I'm around. And if we are, well, there's very little I can do about
link |
02:11:13.280
it. So I might as well assume it's not going to happen. It makes food taste better.
link |
02:11:17.680
It makes food taste better.
link |
02:11:19.680
So you, out of the millions of things you've done in your life,
link |
02:11:24.240
you also began the This Week in Black History calendar of facts.
link |
02:11:30.640
There's like a million questions I can ask here. You said you're not a historian,
link |
02:11:35.760
but is there, let's start at the big history question of, is there somebody in history,
link |
02:11:44.240
in black history that you draw a lot of philosophical or personal inspiration from,
link |
02:11:50.960
or you just find interesting or a moment in history you find interesting?
link |
02:11:55.040
Well, I find the entirety of the 40s to the 60s and the civil rights movement that didn't happen
link |
02:12:01.360
and did happen at the same time during that period quite inspirational. I mean, I've read quite a bit of the
link |
02:12:07.440
time period, at least I did in my younger days when I had more time to read as many things as I
link |
02:12:12.000
wanted to. What was quirky about This Week in Black History when I started in the 80s was how
link |
02:12:22.000
focused it was. It was because of the sources I was stealing from. And I was very much stealing
link |
02:12:25.280
from sort of like, I'd take calendars, anything I could find, Google didn't exist, right? And I
link |
02:12:29.040
just pulled as much as I could and just put it together in one place for other people.
link |
02:12:32.320
What ended up being quirky about it, and I started getting people sending me information on it,
link |
02:12:36.080
was the inventors. People from, you know, Garrett Morgan to Benjamin Banneker, right? People who
link |
02:12:43.120
were inventing things. At a time when, how in the world did they manage to invent anything?
link |
02:12:54.080
Like, all these other things were happening, mother necessity, right? All these other things
link |
02:12:57.600
were happening. And, you know, there were so many terrible things happening around them. And, you
link |
02:13:00.960
know, if they went to the wrong state at the wrong time, they might never come back, but they
link |
02:13:04.160
were inventing things we use, right? And it was always inspiring to me that people would still
link |
02:13:10.080
create even under those circumstances. I got a lot out of that. I also learned a few lessons. I
link |
02:13:16.560
think, you know, the Charles Richard Drews of the world, you know, you create things that impact
link |
02:13:23.680
people. You don't necessarily get credit for them. And that's not right, but it's also okay.
link |
02:13:29.280
You okay with that?
link |
02:13:31.080
Up to a point, yeah. I mean, look, in our world,
link |
02:13:36.880
all we really have is credit.
link |
02:13:38.320
I was always bothered by how much value credit is given.
link |
02:13:43.520
That's the only thing you got. I mean, if you're an academic in some sense,
link |
02:13:46.960
well, it isn't the only thing you've got, but it feels that way sometimes.
link |
02:13:49.360
But you got the actual, we're all going to be dead soon. You got the joy of having created
link |
02:13:56.160
the, you know, the credit thing with Yann. I've talked to Jürgen Schmidhuber, right? The Turing Award
link |
02:14:05.200
given to three people for deep learning. And you could say that a lot of other people should be on
link |
02:14:10.880
that list. It's the Nobel Prize question. Yeah, it's sad. It's sad. And people like talking about
link |
02:14:16.320
it. But I feel like in the long arc of history, the only people who will be remembered are Einstein,
link |
02:14:22.560
Hitler, maybe Elon Musk. And the rest of us are just like...
link |
02:14:27.040
Well, you know, someone asked me about immortality once and I said,
link |
02:14:31.040
and I stole this from somebody else. I don't remember who, but it was,
link |
02:14:34.400
you know, I asked them, what's your great grandfather's name? Any of them? Of course,
link |
02:14:39.920
they don't know. Most of us do not know. I mean, I'm not entirely sure. I know my grandparents,
link |
02:14:44.960
all my grandparents' names. I know what I called them, right? I don't know their middle names,
link |
02:14:48.560
for example. It's within living memory, so I could find out. Actually, my grandfather
link |
02:14:54.640
didn't know when he was born. I had no idea how old he was, right? But I definitely don't know
link |
02:15:00.480
who any of my great grandparents are. So in some sense, immortality is doing something, preferably
link |
02:15:06.400
positive so that your great grandchildren know who you are, right? And that's kind of what you
link |
02:15:11.600
can hope for, which is very depressing in some ways. I could turn it into something uplifting
link |
02:15:16.720
if you need me to, but it's simple, right? It doesn't matter. I don't have to know who my great
link |
02:15:23.200
grandfather was to know that I wouldn't be here without him. And I don't know who my great
link |
02:15:28.400
grandchildren are, certainly not who my great, great grandchildren are, and I'll probably never meet
link |
02:15:32.240
them. Although I would very much like to, but hopefully I'll set the world in motion in such
link |
02:15:38.080
a way that their lives will be better than they would have been if I hadn't done that. Well,
link |
02:15:41.440
certainly they wouldn't have existed if I hadn't done the things that I did.
link |
02:15:44.320
So I think that's a good positive thing. You live on through other people.
link |
02:15:49.280
Are you afraid of death?
link |
02:15:51.200
I don't know if I'm afraid of death, but I don't like it.
link |
02:15:54.960
That's another t-shirt. I mean, do you ponder it? Do you think about the
link |
02:16:02.720
inevitability of oblivion? I do occasionally. This feels like a very Russian conversation.
link |
02:16:07.760
I will tell you a story, something that happened to me recently. If you look very carefully,
link |
02:16:14.320
you will see I have a scar, which by the way, is an interesting story of its own about why people
link |
02:16:20.560
have half of their thyroid taken out. Some people get scars and some don't. But anyway, I had half
link |
02:16:26.720
my thyroid taken out. The way I got there, by the way, is its own interesting story, but I won't go
link |
02:16:30.320
into it. Just suffice it to say, I did what I keep telling people you should never do, which is go
link |
02:16:33.760
to the doctor when you don't have to, because there's nothing good that's ever going to come
link |
02:16:36.720
out of a doctor's visit. So I went to the doctor to look at one thing. It's a little bump I had on
link |
02:16:41.440
the side that I thought might be something bad, because my mother made me go. And I went there and
link |
02:16:45.600
he's like, oh, it's nothing. But by the way, your thyroid is huge. Can you breathe? Yes,
link |
02:16:49.120
I can breathe. Are you sure? Because it's pushing on your windpipe. You should be dead.
link |
02:16:52.560
So I ended up going there to look at my thyroid. It was growing. I had what's called a
link |
02:16:59.520
goiter. And he said, we're going to have to take it out at some point. When? Sometime before you're
link |
02:17:03.920
85, probably. But if you wait till you're 85, that'll be really bad because you don't want to
link |
02:17:09.280
have surgery when you're 85 years old, if you can help it. Certainly not the kind of surgery it
link |
02:17:14.240
takes to take out your thyroid. So I decided I would put it off until
link |
02:17:21.680
December 19th, because my birthday is December 18th, and I wanted to be able to say I made it to
link |
02:17:26.560
49 or whatever. So I said, I'll wait till after my birthday. In the first six months of that,
link |
02:17:32.640
nothing changed. Apparently in the next three months, it had grown. I hadn't noticed this at
link |
02:17:39.120
all. I went and had surgery. They took out half of it. The other half is still there and working
link |
02:17:44.480
fine, by the way. I don't have to take a pill or anything like that. It's great. I'm in the
link |
02:17:49.520
hospital room and the doctor comes in. I've got these things in my arm. They're going to do
link |
02:17:56.880
whatever. They're talking to me. And the anesthesiologist says, huh, your blood
link |
02:18:00.880
pressure is through the roof. Do you have high blood pressure? I said, no, but I'm terrified if
link |
02:18:04.880
that helps you at all. And the anesthetist, who's the nurse who supports the anesthesiologist,
link |
02:18:11.440
if I got that right, said, oh, don't worry about it. I've just put some stuff in your IV. You're
link |
02:18:15.040
going to be feeling pretty good in a couple of minutes. And I remember turning and saying,
link |
02:18:19.280
well, I'm going to feel pretty good in a couple of minutes. Next thing I know, there's this guy
link |
02:18:23.680
and he's moving my bed. And he's talking to me and I have this distinct impression that I've met
link |
02:18:29.280
this guy and I should know what he's talking about, but I kind of just don't remember what
link |
02:18:35.920
just happened. And I look up and I see the tiles going by and I'm like, oh, it's just like in the
link |
02:18:41.040
movies where you see the tiles go by. And then I have this brief thought that I'm in an infinitely
link |
02:18:47.680
long warehouse and there's someone sitting next to me. And I remember thinking, oh, she's not
link |
02:18:53.280
talking to me. And then I'm back in the hospital bed. And in between the time where the tiles were
link |
02:19:00.960
going by and I got in the hospital bed, something like five hours had passed. Apparently it had
link |
02:19:05.520
grown so much that it was a four and a half hour procedure instead of an hour long procedure. I
link |
02:19:09.760
lost a neck size and a half. It was pretty big. Apparently it was as big as my heart.
link |
02:19:16.560
Why am I telling you this? I'm telling you this because...
link |
02:19:19.120
It's a hell of a story already. Between tiles going by and me waking up in
link |
02:19:25.920
my hospital bed, no time passed. There was no sensation of time passing.
link |
02:19:31.840
When I go to sleep and I wake up in the morning, I have this feeling that time has passed. This
link |
02:19:36.960
feeling that something has physically changed about me. Nothing happened between the time they
link |
02:19:42.320
put the magic juice in me and the time that I woke up. Nothing. By the way, my wife was there
link |
02:19:47.440
with me talking. Apparently I was also talking. I don't remember any of this, but luckily I didn't
link |
02:19:53.120
say anything I wouldn't normally say. My memory of it is I would talk to her and she would teleport
link |
02:19:58.480
around the room. And then I accused her of witchcraft and that was the end of that.
link |
02:20:03.760
Her point of view is I would start talking and then I would fall asleep and then I would wake
link |
02:20:07.760
up and pick up where I had left off before. I had no notion of any time passing.
link |
02:20:10.880
I kind of imagine that that's what death is: the lack of sensation of time passing. And on the one hand,
link |
02:20:20.400
I am, I don't know, soothed by the idea that I won't notice. On the other hand, I'm very unhappy
link |
02:20:28.720
at the idea that I won't notice. So I don't know if I'm afraid of death, but I'm completely sure
link |
02:20:35.680
that I don't like it and that I particularly would prefer to discover on my own whether immortality
link |
02:20:41.520
sucks and be able to make a decision about it. That's what I would prefer. You like to have a
link |
02:20:47.040
choice in the matter. I would like to have a choice in the matter. Well, again, on the Russian thing,
link |
02:20:51.440
I think the finiteness of it is the thing that gives it a little flavor, a little spice. Well,
link |
02:20:57.680
in reinforcement learning, we believe that. That's why we have discount factors. Otherwise,
link |
02:21:00.640
it doesn't matter what you do. Amen.
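To make the discount factor remark concrete, here is the standard discounted return from reinforcement learning, a textbook definition rather than anything the speakers spell out here:

\[
G_t \;=\; \sum_{k=0}^{\infty} \gamma^k \, r_{t+k+1}, \qquad 0 \le \gamma < 1 .
\]

With bounded rewards, \(|r_t| \le R_{\max}\), the return is bounded by \(R_{\max}/(1-\gamma)\), so different behaviors yield finite, comparable values. With \(\gamma = 1\) over an unbounded horizon, the sum can diverge for every behavior, which is the precise sense in which, without discounting, it doesn't matter what you do.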
link |
02:21:09.280
Well, let me ask one last question, sticking on the Russian theme. You talked about not remembering your great grandparents' names.
link |
02:21:18.000
In this kind of Markov chain that is life, what do you think is the meaning of it all?
link |
02:21:26.000
What's the meaning of life? Well, in a world where eventually you won't know who your great
link |
02:21:32.000
grandchildren are, I'm reminded of something I heard once or I read once that I really like,
link |
02:21:42.000
which is, it is well worth remembering that the entire universe, save for one trifling exception,
link |
02:21:51.760
is composed entirely of others. And I think that's the meaning of life.
link |
02:22:01.040
Charles, this is one of the best conversations I've ever had. And I get to see you tomorrow
link |
02:22:06.240
again to hang out with someone who looks to be one of the most, how should I say, interesting personalities
link |
02:22:15.600
that I'll ever get to meet, Michael Littman. So I can't wait. I'm excited to have had this
link |
02:22:20.640
opportunity. Thank you for traveling all the way here. It was amazing. I'm excited. I always love
link |
02:22:25.920
Georgia Tech. I'm excited to see, with you being involved there, what the future holds. So thank you
link |
02:22:30.960
for talking to me. Thank you for having me. I enjoyed every minute of it. Thanks for listening
link |
02:22:34.800
to this conversation with Charles Isbell and thank you to our sponsors, Neuro, the maker of
link |
02:22:40.320
functional sugar free gum and mints that I use to give my brain a quick caffeine boost, Decoding
link |
02:22:46.880
Digital, a podcast on tech and entrepreneurship that I listen to and enjoy, Masterclass, online
link |
02:22:53.360
courses that I watch from some of the most amazing humans in history, and Cash App, the app I use to
link |
02:23:00.080
send money to friends for food and drinks. Please check out these sponsors in the description to get
link |
02:23:06.080
a discount and to support this podcast. If you enjoy this thing, subscribe on YouTube, review it
link |
02:23:11.920
with five stars on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me
link |
02:23:17.040
on Twitter at Lex Friedman. And now let me leave you with some poetic words from Martin Luther
link |
02:23:23.520
King Jr. There comes a time when people get tired of being pushed out of the glittering sunlight
link |
02:23:30.720
of life's July and left standing amid the piercing chill of an alpine November.
link |
02:23:37.280
Thank you for listening and hope to see you next time.