
Judea Pearl: Causal Reasoning, Counterfactuals, and the Path to AGI | Lex Fridman Podcast #56



link |
00:00:00.000
The following is a conversation with Judea Pearl,
link |
00:00:03.240
professor at UCLA and a winner of the Turing Award
link |
00:00:06.720
that's generally recognized as the Nobel Prize of Computing.
link |
00:00:10.640
He's one of the seminal figures
link |
00:00:12.400
in the field of artificial intelligence,
link |
00:00:14.440
computer science, and statistics.
link |
00:00:16.640
He has developed and championed probabilistic approaches
link |
00:00:19.960
to AI, including Bayesian networks,
link |
00:00:22.680
and profound ideas in causality in general.
link |
00:00:26.040
These ideas are important not just to AI,
link |
00:00:29.080
but to our understanding and practice of science.
link |
00:00:32.800
But in the field of AI, the idea of causality,
link |
00:00:35.840
cause and effect, to many, lies at the core
link |
00:00:39.400
of what is currently missing and what must be developed
link |
00:00:42.080
in order to build truly intelligent systems.
link |
00:00:46.080
For this reason, and many others,
link |
00:00:48.000
his work is worth returning to often.
link |
00:00:50.760
I recommend his most recent book called Book of Why
link |
00:00:54.200
that presents key ideas from a lifetime of work
link |
00:00:57.160
in a way that is accessible to the general public.
link |
00:01:00.440
This is the Artificial Intelligence Podcast.
link |
00:01:03.480
If you enjoy it, subscribe on YouTube,
link |
00:01:05.880
give it five stars on Apple Podcasts,
link |
00:01:07.840
support it on Patreon, or simply connect with me on Twitter
link |
00:01:11.360
at Lex Fridman, spelled F R I D M A N.
link |
00:01:15.320
If you leave a review on Apple Podcasts especially,
link |
00:01:18.200
but also on Castbox, or comment on YouTube,
link |
00:01:20.960
consider mentioning topics, people, ideas, questions,
link |
00:01:23.920
quotes, and science, tech, and philosophy
link |
00:01:26.120
you find interesting, and I'll read them on this podcast.
link |
00:01:29.560
I won't call out names, but I love comments
link |
00:01:31.960
with kindness and thoughtfulness in them,
link |
00:01:33.880
so I thought I'd share them with you.
link |
00:01:35.760
Someone on YouTube highlighted a quote
link |
00:01:37.840
from the conversation with Noam Chomsky,
link |
00:01:40.040
where he said that the significance of your life
link |
00:01:42.800
is something you create.
link |
00:01:44.780
I like this line as well.
link |
00:01:46.600
On most days, the existentialist approach to life
link |
00:01:49.880
is one I find liberating and fulfilling.
link |
00:01:53.660
I recently started doing ads
link |
00:01:55.040
at the end of the introduction.
link |
00:01:56.640
I'll do one or two minutes after introducing the episode,
link |
00:01:59.400
and never any ads in the middle
link |
00:02:01.080
that break the flow of the conversation.
link |
00:02:03.280
I hope that works for you
link |
00:02:04.760
and doesn't hurt the listening experience.
link |
00:02:08.280
This show is presented by Cash App,
link |
00:02:10.600
the number one finance app in the App Store.
link |
00:02:13.040
I personally use Cash App to send money to friends,
link |
00:02:15.520
but you can also use it to buy, sell,
link |
00:02:17.420
and deposit Bitcoin in just seconds.
link |
00:02:20.100
Cash App also has a new investing feature.
link |
00:02:22.800
You can buy fractions of a stock, say $1 worth,
link |
00:02:25.860
no matter what the stock price is.
link |
00:02:28.040
Broker services are provided by Cash App Investing,
link |
00:02:30.720
a subsidiary of Square, a member of SIPC.
link |
00:02:34.280
I'm excited to be working with Cash App
link |
00:02:36.600
to support one of my favorite organizations called FIRST,
link |
00:02:39.940
best known for their FIRST Robotics and LEGO competitions.
link |
00:02:43.440
They educate and inspire hundreds of thousands of students
link |
00:02:47.120
in over 110 countries,
link |
00:02:49.000
and have a perfect rating on Charity Navigator,
link |
00:02:51.760
which means the donated money
link |
00:02:53.120
is used to the maximum effectiveness.
link |
00:02:56.000
When you get Cash App from the App Store or Google Play,
link |
00:02:58.680
and use the code LEXPODCAST, you'll get $10,
link |
00:03:02.640
and Cash App will also donate $10 to First,
link |
00:03:05.600
which again, is an organization
link |
00:03:07.560
that I've personally seen inspire girls and boys
link |
00:03:10.120
to dream of engineering a better world.
link |
00:03:12.800
And now, here's my conversation with Judea Pearl.
link |
00:03:18.080
You mentioned in an interview
link |
00:03:19.720
that science is not a collection of facts,
link |
00:03:21.840
but a constant human struggle
link |
00:03:23.960
with the mysteries of nature.
link |
00:03:26.700
What was the first mystery that you can recall
link |
00:03:29.240
that hooked you, that kept you in its grip?
link |
00:03:30.800
Oh, the first mystery, that's a good one.
link |
00:03:34.440
Yeah, I remember that.
link |
00:03:37.840
I had a fever for three days.
link |
00:03:41.440
And when I learned about Descartes, analytic geometry,
link |
00:03:45.680
and I found out that you can do all the construction
link |
00:03:49.880
in geometry using algebra.
link |
00:03:52.880
And I couldn't get over it.
link |
00:03:54.480
I simply couldn't get out of bed.
link |
00:03:58.280
So what kind of world does analytic geometry unlock?
link |
00:04:02.800
Well, it connects algebra with geometry.
link |
00:04:07.320
Okay, so Descartes had the idea
link |
00:04:09.360
that geometrical construction and geometrical theorems
link |
00:04:14.360
and assumptions can be articulated
link |
00:04:17.520
in the language of algebra,
link |
00:04:19.520
which means that all the proof that we did in high school,
link |
00:04:24.840
and trying to prove that the three bisectors
link |
00:04:28.560
meet at one point, and that, okay,
link |
00:04:33.640
all this can be proven by just shuffling around notation.
link |
00:04:39.440
Yeah, that was a traumatic experience.
link |
00:04:43.560
That was a traumatic experience.
link |
00:04:44.400
For me, it was, I'm telling you.
link |
00:04:45.520
So it's the connection
link |
00:04:46.960
between the different mathematical disciplines,
link |
00:04:49.120
that they all.
link |
00:04:49.960
Not in between two different languages.
link |
00:04:52.960
Languages.
link |
00:04:53.800
Yeah.
link |
00:04:54.640
So which mathematic discipline is most beautiful?
link |
00:04:57.200
Is geometry it for you?
link |
00:04:58.560
Both are beautiful.
link |
00:04:59.440
They have almost the same power.
link |
00:05:02.400
But there's a visual element to geometry, being a.
link |
00:05:04.920
Visually, it's more transparent.
link |
00:05:08.080
But once you get over to algebra,
link |
00:05:10.680
then the linear equation is a straight line.
link |
00:05:14.400
This translation is easily absorbed, okay?
link |
00:05:18.160
And to pass a tangent to a circle,
link |
00:05:22.760
you know, you have the basic theorems,
link |
00:05:25.480
and you can do it with algebra.
link |
00:05:27.480
So but the transition from one to another was really,
link |
00:05:31.520
I thought that Descartes was the greatest mathematician
link |
00:05:34.120
of all times.
link |
00:05:35.160
So you have been at the, if you think of engineering
link |
00:05:40.840
and mathematics as a spectrum.
link |
00:05:43.240
Yes.
link |
00:05:44.080
You have been, you have walked casually along this spectrum
link |
00:05:49.200
throughout your life.
link |
00:05:51.520
You know, a little bit of engineering,
link |
00:05:53.000
and then, you know,
link |
00:05:55.920
you've done a little bit of mathematics here and there.
link |
00:05:58.800
Not a little bit.
link |
00:05:59.640
I mean, we got a very solid background in mathematics,
link |
00:06:04.080
because our teachers were geniuses.
link |
00:06:07.080
Our teachers came from Germany in the 1930s,
link |
00:06:09.720
running away from Hitler.
link |
00:06:11.880
They left their careers in Heidelberg and Berlin,
link |
00:06:15.080
and came to teach high school in Israel.
link |
00:06:17.840
And we were the beneficiary of that experiment.
link |
00:06:21.600
So I, and they taught us math the good way.
link |
00:06:25.200
What's the good way to teach math?
link |
00:06:26.720
Chronologically.
link |
00:06:29.040
The people.
link |
00:06:29.880
The people behind the theorems, yeah.
link |
00:06:33.320
Their cousins, and their nieces, and their faces.
link |
00:06:39.040
And how they jumped from the bathtub when they scream,
link |
00:06:41.520
Eureka!
link |
00:06:43.480
And ran naked in town.
link |
00:06:46.000
So you're almost educated as a historian of math.
link |
00:06:49.240
No, we just got a glimpse of that history
link |
00:06:51.920
together with a theorem,
link |
00:06:53.760
so every exercise in math was connected with a person.
link |
00:06:59.600
And the time of the person.
link |
00:07:01.040
The period.
link |
00:07:03.520
The period, also mathematically speaking.
link |
00:07:05.600
Mathematically speaking, yes.
link |
00:07:06.880
Not the politics, no.
link |
00:07:09.040
So, and then in university,
link |
00:07:14.000
you have gone on to do engineering.
link |
00:07:16.240
Yeah.
link |
00:07:17.200
I got a B.S. in engineering at the Technion, right?
link |
00:07:20.680
And then I moved here for graduate work,
link |
00:07:25.600
and I got, I did engineering in addition to physics
link |
00:07:30.280
at Rutgers, and it combined very nicely with my thesis,
link |
00:07:35.840
which I did in RCA Laboratories in superconductivity.
link |
00:07:40.480
And then somehow thought to switch
link |
00:07:43.760
to almost computer science, software,
link |
00:07:46.720
even, not switch, but long to become,
link |
00:07:50.800
to get into software engineering a little bit.
link |
00:07:53.000
Yes.
link |
00:07:53.840
And programming, if you can call it that in the 70s.
link |
00:07:56.200
So there's all these disciplines.
link |
00:07:58.120
Yeah.
link |
00:07:58.960
So to pick a favorite, in terms of engineering
link |
00:08:02.880
and mathematics, which path do you think has more beauty?
link |
00:08:07.120
Which path has more power?
link |
00:08:08.600
It's hard to choose, no.
link |
00:08:10.560
I enjoy doing physics, and even have a vortex
link |
00:08:14.640
named after me.
link |
00:08:16.000
So I have investment in immortality.
link |
00:08:23.400
So what is a vortex?
link |
00:08:25.160
Vortex is in superconductivity.
link |
00:08:27.000
In the superconductivity, yeah.
link |
00:08:27.840
You have permanent current swirling around.
link |
00:08:30.880
One way or the other, so you can store a one or a zero
link |
00:08:34.560
for a computer.
link |
00:08:35.400
That's what we worked on in the 1960s at RCA.
link |
00:08:39.680
And I discovered a few nice phenomena with the vortices.
link |
00:08:44.080
You push current and they move.
link |
00:08:44.920
So that's a Pearl vortex.
link |
00:08:46.600
Pearl vortex, right, you can Google it, right?
link |
00:08:50.240
I didn't know about it, but the physicists,
link |
00:08:52.360
they picked up on my thesis, on my PhD thesis,
link |
00:08:57.200
and it became popular when thin-film superconductors
link |
00:09:03.240
became important for high temperature superconductors.
link |
00:09:06.920
So they called it the Pearl vortex without my knowledge.
link |
00:09:10.840
I discovered it only about 15 years ago.
link |
00:09:14.760
You have footprints in all of the sciences.
link |
00:09:17.560
So let's talk about the universe a little bit.
link |
00:09:20.960
Is the universe at the lowest level deterministic
link |
00:09:23.880
or stochastic in your amateur philosophy view?
link |
00:09:27.400
Put another way, does God play dice?
link |
00:09:30.120
We know it is stochastic, right?
link |
00:09:33.000
Today, today we think it is stochastic.
link |
00:09:35.160
Yes.
link |
00:09:37.000
We think because we have the Heisenberg uncertainty principle
link |
00:09:40.120
and we have some experiments to confirm that.
link |
00:09:45.760
All we have is experiments to confirm it.
link |
00:09:47.960
We don't understand why.
link |
00:09:50.400
"Why" already?
link |
00:09:51.440
You wrote a book about why.
link |
00:09:52.960
Yeah, it's a puzzle.
link |
00:09:57.240
It's a puzzle that you have the dice flipping machine,
link |
00:10:02.400
oh God, and the result of the flipping
link |
00:10:08.560
propagate with the speed faster than the speed of light.
link |
00:10:12.280
We can't explain it, okay?
link |
00:10:14.240
So, but it only governs microscopic phenomena.
link |
00:10:19.240
Microscopic phenomena.
link |
00:10:21.320
So you don't think of quantum mechanics
link |
00:10:23.560
as useful for understanding the nature of reality?
link |
00:10:28.240
No, diversionary.
link |
00:10:30.400
So in your thinking, the world might
link |
00:10:34.520
as well be deterministic.
link |
00:10:36.000
The world is deterministic,
link |
00:10:38.480
and as far as the neuron firing is concerned,
link |
00:10:42.800
it is deterministic to first approximation.
link |
00:10:47.240
What about free will?
link |
00:10:48.920
Free will is also a nice exercise.
link |
00:10:52.920
Free will is an illusion that we AI people are gonna solve.
link |
00:10:59.320
So what do you think once we solve it,
link |
00:11:01.800
that solution will look like?
link |
00:11:03.360
Once we put it in the page.
link |
00:11:04.200
The solution will look like,
link |
00:11:06.240
first of all, it will look like a machine.
link |
00:11:08.920
A machine that act as though it has free will.
link |
00:11:12.560
It communicates with other machines
link |
00:11:14.760
as though they have free will,
link |
00:11:17.160
and you wouldn't be able to tell the difference
link |
00:11:19.480
between a machine that does
link |
00:11:21.560
and a machine that doesn't have free will, okay?
link |
00:11:24.680
So the illusion, it propagates the illusion
link |
00:11:26.840
of free will amongst the other machines.
link |
00:11:28.960
And faking it is having it, okay?
link |
00:11:33.120
That's what Turing test is all about.
link |
00:11:35.120
Faking intelligence is intelligent
link |
00:11:37.160
because it's not easy to fake.
link |
00:11:41.080
It's very hard to fake,
link |
00:11:43.200
and you can only fake if you have it.
link |
00:11:45.200
So that's such a beautiful statement.
link |
00:11:54.080
Yeah, you can't fake it if you don't have it, yeah.
link |
00:11:59.520
So let's begin at the beginning with probability,
link |
00:12:06.800
both philosophically and mathematically.
link |
00:12:09.360
What does it mean to say the probability
link |
00:12:11.560
of something happening is 50%?
link |
00:12:15.160
What is probability?
link |
00:12:16.960
It's a degree of uncertainty
link |
00:12:18.680
that an agent has about the world.
link |
00:12:22.400
You're still expressing some knowledge in that statement.
link |
00:12:24.760
Of course.
link |
00:12:26.000
If the probability is 90%,
link |
00:12:27.840
it's absolutely a different kind of knowledge
link |
00:12:29.800
than if it is 10%.
link |
00:12:32.480
But it's still not solid knowledge, it's...
link |
00:12:36.080
It is solid knowledge, but hey,
link |
00:12:38.520
if you tell me that 90% assurance smoking
link |
00:12:43.520
will give you lung cancer in five years versus 10%,
link |
00:12:48.880
it's a piece of useful knowledge.
link |
00:12:52.400
So the statistical view of the universe,
link |
00:12:56.120
why is it useful?
link |
00:12:57.600
So we're swimming in complete uncertainty,
link |
00:13:00.760
most of everything around us.
link |
00:13:01.600
It allows you to predict things with a certain probability,
link |
00:13:06.120
and computing those probabilities are very useful.
link |
00:13:09.240
That's the whole idea of prediction.
link |
00:13:15.080
And you need prediction to be able to survive.
link |
00:13:18.160
If you can't predict the future,
link |
00:13:19.640
then just crossing the street
link |
00:13:22.320
would be extremely fearful.
link |
00:13:26.080
And so you've done a lot of work in causation,
link |
00:13:28.840
and so let's think about correlation.
link |
00:13:32.120
I started with probability.
link |
00:13:34.320
You started with probability.
link |
00:13:35.640
You've invented the Bayesian networks.
link |
00:13:38.760
Yeah.
link |
00:13:39.600
And so we'll dance back and forth
link |
00:13:43.880
between these levels of uncertainty.
link |
00:13:47.480
But what is correlation?
link |
00:13:49.320
What is it?
link |
00:13:50.360
So probability of something happening is something,
link |
00:13:54.000
but then there's a bunch of things happening.
link |
00:13:56.480
And sometimes they happen together,
link |
00:13:58.680
sometimes not, they're independent or not.
link |
00:14:00.760
So how do you think about correlation of things?
link |
00:14:03.640
Correlation occurs when two things vary together
link |
00:14:06.280
over a very long time is one way of measuring it.
link |
00:14:09.720
Or when you have a bunch of variables
link |
00:14:11.840
that they all vary cohesively,
link |
00:14:15.880
then we say we have a correlation here.
link |
00:14:18.600
And usually when we think about correlation,
link |
00:14:21.720
we really think causally.
link |
00:14:24.400
Things cannot be correlated unless there is a reason
link |
00:14:27.960
for them to vary together.
link |
00:14:30.280
Why should they vary together?
link |
00:14:32.040
If they don't see each other, why should they vary together?
link |
00:14:35.560
So underlying it somewhere is causation.
link |
00:14:38.240
Yes.
link |
00:14:39.240
Hidden in our intuition, there is a notion of causation
link |
00:14:43.160
because we cannot grasp any other logic except causation.
link |
00:14:48.600
And how does conditional probability differ from causation?
link |
00:14:55.600
So what is conditional probability?
link |
00:14:57.920
Conditional probability, how things vary
link |
00:15:00.560
when one of them stays the same.
link |
00:15:05.040
Now staying the same means that I have chosen
link |
00:15:09.320
to look only at those incidents
link |
00:15:11.720
where the guy has the same value as the previous one.
link |
00:15:16.120
It's my choice as an experimenter.
link |
00:15:19.280
So things that are not correlated before
link |
00:15:22.280
could become correlated.
link |
00:15:24.240
Like for instance, if I have two coins
link |
00:15:26.840
which are uncorrelated, okay,
link |
00:15:29.240
and I choose only those flippings, experiments
link |
00:15:33.760
in which a bell rings, and the bell rings
link |
00:15:36.680
when at least one of them is a tail, okay,
link |
00:15:40.760
then suddenly I see correlation between the two coins
link |
00:15:44.360
because I only look at the cases where the bell rang.
link |
00:15:49.280
You see, it's my design, with my ignorance essentially,
link |
00:15:53.680
with my audacity to ignore certain incidents,
link |
00:15:58.680
I suddenly create a correlation
link |
00:16:04.760
where it doesn't exist physically.
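Pearl's two-coin example can be checked with a short simulation; a minimal sketch (the bell rule and sample size are illustrative choices, not from the conversation):

```python
import random

random.seed(0)

# Two fair, independent coins; True means heads. We then keep only the
# trials where a bell rings, and the bell rings when at least one coin
# shows tails, i.e. when they are not both heads.
trials = [(random.random() < 0.5, random.random() < 0.5) for _ in range(100_000)]
selected = [(a, b) for a, b in trials if not (a and b)]

def prob_b_heads(pairs):
    return sum(1 for _, b in pairs if b) / len(pairs)

# Among the selected trials, learning that coin A is heads tells us coin B
# must be tails: conditioning on the bell has manufactured a correlation.
p_b_given_a_heads = prob_b_heads([(a, b) for a, b in selected if a])
p_b_overall = prob_b_heads(selected)
print(p_b_given_a_heads)      # 0.0 exactly, forced by the bell's selection rule
print(round(p_b_overall, 2))  # about 0.33
```

The coins never interact; the dependence appears only because the experimenter's design ignores the trials without a bell.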
link |
00:16:07.760
Right, so that's, you just outlined one of the flaws
link |
00:16:11.400
of observing the world and trying to infer something
link |
00:16:14.960
from the math about the world
link |
00:16:16.080
from looking at the correlation.
link |
00:16:17.520
I don't look at it as a flaw, the world works like that.
link |
00:16:20.960
But the flaw comes if we try to impose
link |
00:16:24.960
causal logic on correlation, it doesn't work too well.
link |
00:16:34.680
I mean, but that's exactly what we do.
link |
00:16:36.320
That's what, that has been the majority of science.
link |
00:16:40.040
The majority of naive science. The statisticians know it.
link |
00:16:45.040
The statisticians know that if you condition
link |
00:16:47.840
on a third variable, then you can destroy
link |
00:16:50.960
or create correlations among two other variables.
link |
00:16:55.640
They know it, it's in their data.
link |
00:16:58.480
It's nothing surprising, that's why they all dismiss
link |
00:17:01.120
Simpson's paradox, ah, we know it.
link |
00:17:04.120
They don't know anything about it.
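Simpson's paradox, mentioned here, is easy to exhibit numerically; this sketch uses counts patterned on the oft-cited kidney-stone illustration (the numbers serve only as an example, not as data from the conversation):

```python
# Treatment beats control inside every subgroup, yet loses overall,
# because severity confounds which treatment tended to be chosen.
groups = {
    # group: (treated_successes, treated_total, control_successes, control_total)
    "mild":   (81, 87, 234, 270),
    "severe": (192, 263, 55, 80),
}

for name, (ts, tn, cs, cn) in groups.items():
    assert ts / tn > cs / cn, name  # treatment wins within each subgroup

total_treated = sum(v[0] for v in groups.values()) / sum(v[1] for v in groups.values())
total_control = sum(v[2] for v in groups.values()) / sum(v[3] for v in groups.values())
print(round(total_treated, 2), round(total_control, 2))  # 0.78 0.83: control "wins" overall
```

Whether to aggregate or to stratify is exactly the kind of question that the correlations alone cannot answer; it depends on the causal story behind the third variable.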
link |
00:17:07.280
Well, there's disciplines like psychology
link |
00:17:09.680
where all the variables are hard to account for.
link |
00:17:12.880
And so oftentimes there's a leap
link |
00:17:15.200
between correlation to causation.
link |
00:17:17.520
You're imposing.
link |
00:17:18.840
You're implying a leap.
link |
00:17:21.120
Who is trying to get causation from correlation?
link |
00:17:26.520
You're not proving causation,
link |
00:17:27.960
but you're sort of discussing it,
link |
00:17:31.680
implying, sort of hypothesizing without the ability to prove.
link |
00:17:35.360
Which discipline you have in mind?
link |
00:17:37.040
I'll tell you if they are obsolete,
link |
00:17:40.440
or if they are outdated, or they are about to get outdated.
link |
00:17:44.240
Yes, yes.
link |
00:17:45.440
Tell me which one you have in mind.
link |
00:17:46.760
Oh, psychology, you know.
link |
00:17:48.160
Psychology, what, is it SEM, structural equation?
link |
00:17:50.760
No, no, I was thinking of applied psychology studying.
link |
00:17:54.280
For example, we work with human behavior
link |
00:17:57.200
in semi-autonomous vehicles, how people behave.
link |
00:18:00.360
And you have to conduct these studies
link |
00:18:02.560
of people driving cars.
link |
00:18:03.960
Everything starts with the question.
link |
00:18:05.520
What is the research question?
link |
00:18:07.800
What is the research question?
link |
00:18:10.400
The research question, do people fall asleep
link |
00:18:14.320
when the car is driving itself?
link |
00:18:18.560
Do they fall asleep, or do they tend to fall asleep
link |
00:18:22.240
more frequently?
link |
00:18:23.160
More frequently.
link |
00:18:24.000
Than the car not driving itself.
link |
00:18:25.960
Not driving itself.
link |
00:18:26.880
That's a good question, okay.
link |
00:18:28.720
And so you measure, you put people in the car
link |
00:18:32.480
because it's real world.
link |
00:18:33.720
You can't conduct an experiment
link |
00:18:35.160
where you control everything.
link |
00:18:36.320
Why can't you control?
link |
00:18:37.840
You could.
link |
00:18:38.680
Why can't you control the automatic module on and off?
link |
00:18:44.760
Because it's on public roads.
link |
00:18:47.720
I mean, there's aspects to it that's unethical.
link |
00:18:52.680
Because it's testing on public roads.
link |
00:18:54.920
So you can only use vehicle.
link |
00:18:56.640
They have to, the people, the drivers themselves
link |
00:19:00.240
have to make that choice themselves.
link |
00:19:02.840
And so they regulate that.
link |
00:19:05.080
And so you just observe when they drive autonomously
link |
00:19:09.040
and when they don't.
link |
00:19:10.320
And then.
link |
00:19:11.160
But maybe they turn it off when they were very tired.
link |
00:19:13.120
Yeah, that kind of thing.
link |
00:19:14.520
But you don't know those variables.
link |
00:19:16.600
Okay, so that you have now uncontrolled experiment.
link |
00:19:19.360
Uncontrolled experiment.
link |
00:19:20.680
We call it observational study.
link |
00:19:23.200
And from the correlation detected,
link |
00:19:27.160
We have to infer causal relationship.
link |
00:19:30.520
Whether it was the automatic piece
link |
00:19:33.440
has caused them to fall asleep, or.
link |
00:19:35.960
So that is an issue that is about 120 years old.
link |
00:19:45.440
I should say only 100 years old, okay.
link |
00:19:51.400
Well, maybe it's not.
link |
00:19:52.840
Actually I should say it's 2,000 years old.
link |
00:19:55.240
Because we have this experiment by Daniel.
link |
00:19:58.520
But the Babylonian king wanted the exiles,
link |
00:20:07.440
the people from Israel who were taken in exile
link |
00:20:12.760
to Babylon to serve the king.
link |
00:20:14.720
He wanted to serve them king's food, which was meat.
link |
00:20:18.080
And Daniel as a good Jew couldn't eat non kosher food.
link |
00:20:22.760
So he asked them to eat vegetarian food.
link |
00:20:26.640
But the king overseer says, I'm sorry,
link |
00:20:29.320
but if the king sees that your performance falls below
link |
00:20:34.760
that of other kids, he's going to kill me.
link |
00:20:39.360
Daniel said, let's make an experiment.
link |
00:20:41.600
Let's take four of us from Jerusalem, okay.
link |
00:20:44.320
Give us vegetarian food.
link |
00:20:46.440
Let's take the other guys to eat the king's food.
link |
00:20:50.280
And in about a week's time, we'll test our performance.
link |
00:20:54.080
And you know the answer.
link |
00:20:55.440
Of course he did the experiment.
link |
00:20:57.840
And they were so much better than the others.
link |
00:21:02.120
And the king nominated them to superior positions in his court.
link |
00:21:08.000
So it was a first experiment, yes.
link |
00:21:10.160
So there was a very simple,
link |
00:21:12.760
it's also the same research question.
link |
00:21:15.520
We want to know if vegetarian food
link |
00:21:18.640
assist or obstruct your mental ability.
link |
00:21:23.480
And okay, so the question is very old one.
link |
00:21:30.440
Even Democritus said, if I could discover one cause
link |
00:21:38.040
of things, I would rather discover one cause
link |
00:21:41.480
than be the King of Persia, okay.
link |
00:21:43.720
The task of discovering causes was in the mind
link |
00:21:48.720
of ancient people from many, many years ago.
link |
00:21:53.480
But the mathematics of doing that was only developed
link |
00:21:58.360
in the 1920s.
link |
00:22:00.480
So science has left us orphan, okay.
link |
00:22:05.120
Science has not provided us with the mathematics
link |
00:22:08.320
to capture the idea of X causes Y
link |
00:22:12.000
and Y does not cause X.
link |
00:22:14.320
Because all the question of physics are symmetrical,
link |
00:22:17.400
algebraic, the equality sign goes both ways.
link |
00:22:21.680
Okay, let's look at machine learning.
link |
00:22:23.120
Machine learning today, if you look at deep neural networks,
link |
00:22:26.880
you can think of it as kind of conditional probability
link |
00:22:32.840
estimators.
link |
00:22:33.680
Correct, beautiful.
link |
00:22:35.520
So where did you say that?
link |
00:22:39.280
Conditional probability estimators.
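One way to see the "conditional probability estimator" point is a minimal sketch that does the estimation by brute-force counting (toy data and variable names are invented for illustration):

```python
from collections import Counter, defaultdict

# Toy observational data: pairs of (x, y). Any purely statistical learner,
# deep networks included, is in the end approximating P(y | x) from such
# samples; here the approximation is just a normalized count.
samples = [
    ("rain", "wet"), ("rain", "wet"), ("rain", "dry"),
    ("sun", "dry"), ("sun", "dry"), ("sun", "wet"),
]

counts = defaultdict(Counter)
for x, y in samples:
    counts[x][y] += 1

def p(y, given):
    return counts[given][y] / sum(counts[given].values())

print(p("wet", given="rain"))  # 2/3
# Note the symmetry: the same counts estimate P(x | y) just as readily.
# Nothing in the data alone says whether rain causes wetness or the reverse.
```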
link |
00:22:41.520
None of the machine learning people clobbered you?
link |
00:22:45.360
Attacked you?
link |
00:22:46.840
Listen, most people, and this is why today's conversation
link |
00:22:52.400
I think is interesting, is most people would agree with you.
link |
00:22:55.800
There's certain aspects that are just effective today,
link |
00:22:58.680
but we're going to hit a wall and there's a lot of ideas.
link |
00:23:02.440
I think you're very right that we're gonna have to return to
link |
00:23:05.400
about causality and it would be, let's try to explore it.
link |
00:23:11.560
Let's even take a step back.
link |
00:23:13.160
You've invented Bayesian networks that look awfully a lot
link |
00:23:19.800
like they express something like causation,
link |
00:23:22.040
but they don't, not necessarily.
link |
00:23:25.280
So how do we turn Bayesian networks
link |
00:23:28.600
into expressing causation?
link |
00:23:30.840
How do we build causal networks?
link |
00:23:33.160
This A causes B, B causes C,
link |
00:23:36.480
how do we start to infer that kind of thing?
link |
00:23:38.840
We start asking ourselves questions:
link |
00:23:41.460
what are the factors that would determine
link |
00:23:44.520
the value of X?
link |
00:23:46.360
X could be blood pressure, death, hunger.
link |
00:23:53.820
But these are hypotheses that we propose for ourselves.
link |
00:23:56.240
Hypothesis, everything which has to do with causality
link |
00:23:59.040
comes from a theory.
link |
00:24:03.400
The difference is only how you interrogate
link |
00:24:06.960
the theory that you have in your mind.
link |
00:24:09.080
So it still needs the human expert to propose.
link |
00:24:14.280
You need the human expert to specify the initial model.
link |
00:24:20.980
Initial model could be very qualitative.
link |
00:24:24.040
Just who listens to whom?
link |
00:24:27.040
By "listens to," I mean one variable listens to the other.
link |
00:24:31.240
So I say, okay, the tide is listening to the moon.
link |
00:24:34.720
And not to the rooster crow.
link |
00:24:42.280
And so forth.
link |
00:24:43.120
This is our understanding of the world in which we live.
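The "who listens to whom" idea can be sketched as a toy structural causal model (the mechanisms and numbers here are purely illustrative assumptions, not anything stated in the conversation):

```python
import random

random.seed(1)

# Each variable is computed from the variables it listens to, plus its own
# randomness. do_rooster lets us intervene by overriding the rooster's
# own mechanism, the way an experimenter would.
def sample(do_rooster=None):
    moon = random.uniform(0, 1)                    # exogenous: listens to nothing
    rooster = (random.random() < 0.5) if do_rooster is None else do_rooster
    tide = 2.0 * moon + random.gauss(0, 0.1)       # the tide listens to the moon only
    return moon, rooster, tide

# Forcing the rooster to crow (or stay silent) leaves the tide's distribution
# unchanged, because no mechanism makes the tide listen to the rooster.
tides_crow = [sample(do_rooster=True)[2] for _ in range(20_000)]
tides_quiet = [sample(do_rooster=False)[2] for _ in range(20_000)]
mean_crow = sum(tides_crow) / len(tides_crow)
mean_quiet = sum(tides_quiet) / len(tides_quiet)
print(round(mean_crow, 2), round(mean_quiet, 2))  # both close to 1.0
```

The qualitative graph (moon → tide, rooster isolated) is exactly the kind of initial model the human expert supplies; the numeric details can then be filled in from data.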
link |
00:24:46.840
Scientific understanding of reality.
link |
00:24:51.540
We have to start there.
link |
00:24:53.480
Because if we don't know how to handle
link |
00:24:56.960
cause-and-effect relationships,
link |
00:24:58.560
when we do have a model,
link |
00:25:01.240
then we certainly do not know how to handle it
link |
00:25:03.720
when we don't have a model.
link |
00:25:05.400
So let's start first.
link |
00:25:07.240
In AI, the slogan is: representation first, discovery second.
link |
00:25:13.440
But if I give you all the information that you need,
link |
00:25:17.240
can you do anything useful with it?
link |
00:25:19.840
That is the first, representation.
link |
00:25:21.520
How do you represent it?
link |
00:25:22.520
I give you all the knowledge in the world.
link |
00:25:24.560
How do you represent it?
link |
00:25:26.920
When you represent it, I ask you,
link |
00:25:30.680
can you infer X or Y or Z?
link |
00:25:33.200
Can you answer certain queries?
link |
00:25:35.240
Is it complex?
link |
00:25:36.920
Is it polynomial?
link |
00:25:39.280
All the computer science exercises we do,
link |
00:25:42.040
once you give me a representation for my knowledge,
link |
00:25:47.260
then you can ask me,
link |
00:25:48.280
now I understand how to represent things.
link |
00:25:51.680
How do I discover them?
link |
00:25:52.800
It's a secondary thing.
link |
00:25:54.920
So first of all, I should echo the statement
link |
00:25:57.040
that mathematics and the current,
link |
00:25:59.840
much of the machine learning world has not considered
link |
00:26:04.280
causation, that A causes B, at all.
link |
00:26:07.880
So that seems like a non obvious thing
link |
00:26:15.360
that you think we would have really acknowledged it,
link |
00:26:18.280
but we haven't.
link |
00:26:19.200
So we have to put that on the table.
link |
00:26:21.080
So, knowledge: how hard is it to create a knowledge base
link |
00:26:26.420
from which to work?
link |
00:26:28.480
In certain area, it's easy,
link |
00:26:31.280
because we have only four or five major variables,
link |
00:26:37.400
and an epidemiologist or an economist can put them down,
link |
00:26:43.440
what, minimum wage, unemployment, policy, X, Y, Z,
link |
00:26:52.040
and start collecting data,
link |
00:26:54.320
and quantify the parameters that were left unquantified
link |
00:27:00.240
with the initial knowledge.
link |
00:27:01.560
That's the routine work that you find
link |
00:27:07.680
in experimental psychology, in economics, everywhere.
link |
00:27:13.000
In the health science, that's a routine thing.
link |
00:27:16.600
But I should emphasize,
link |
00:27:18.800
you should start with the research question.
link |
00:27:21.240
What do you want to estimate?
link |
00:27:24.880
Once you have that, you have to have a language
link |
00:27:27.480
of expressing what you want to estimate.
link |
00:27:30.200
You think it's easy?
link |
00:27:31.680
No.
link |
00:27:32.760
So we can talk about two things, I think.
link |
00:27:35.840
One is how the science of causation is very useful
link |
00:27:42.920
for answering certain questions.
link |
00:27:47.360
And then the other is, how do we create intelligent systems
link |
00:27:51.240
that need to reason with causation?
link |
00:27:53.600
So if my research question is,
link |
00:27:55.200
how do I pick up this water bottle from the table?
link |
00:28:00.920
All of the knowledge that is required
link |
00:28:03.560
to be able to do that,
link |
00:28:05.280
how do we construct that knowledge base?
link |
00:28:07.960
Do we return back to the problem
link |
00:28:11.000
that we didn't solve in the 80s with expert systems?
link |
00:28:13.580
Do we have to solve that problem
link |
00:28:15.440
of automated construction of knowledge?
link |
00:28:19.680
You're talking about the task
link |
00:28:23.560
of eliciting knowledge from an expert.
link |
00:28:26.560
Task of eliciting knowledge from an expert,
link |
00:28:28.520
or the self discovery of more knowledge,
link |
00:28:32.720
more and more knowledge.
link |
00:28:34.240
So automating the building of knowledge as much as possible.
link |
00:28:38.600
It's a different game in the causal domain,
link |
00:28:42.400
because it's essentially the same thing.
link |
00:28:46.480
You have to start with some knowledge,
link |
00:28:48.680
and you're trying to enrich it.
link |
00:28:51.520
But you don't enrich it by asking for more rules.
link |
00:28:56.460
You enrich it by asking for the data,
link |
00:28:58.960
to look at the data and quantifying,
link |
00:29:01.800
and ask queries that you couldn't answer when you started.
link |
00:29:06.480
You couldn't because the question is quite complex,
link |
00:29:11.480
and it's not within the capability
link |
00:29:16.600
of ordinary cognition, of an ordinary person,
link |
00:29:19.840
or even an ordinary expert, to answer.
link |
00:29:23.200
So what kind of questions do you think
link |
00:29:24.920
we can start to answer?
link |
00:29:26.960
Even a simple one.
link |
00:29:27.800
Suppose, yeah, I'll start with easy one.
link |
00:29:31.200
Let's do it.
link |
00:29:32.040
Okay, what's the effect of a drug on recovery?
link |
00:29:37.520
Was it the aspirin that caused my headache
link |
00:29:39.500
to be cured, or was it the television program,
link |
00:29:44.640
or the good news I received?
link |
00:29:47.680
This is already, you see, it's a difficult question,
link |
00:29:49.920
because it's find the cause from effect.
link |
00:29:53.680
The easy one is find the effect from cause.
link |
00:29:56.760
That's right.
link |
00:29:57.720
So first you construct a model,
link |
00:29:59.180
saying that this is an important research question.
link |
00:30:01.280
This is an important question.
link |
00:30:02.840
Then you do.
link |
00:30:04.320
I didn't construct a model yet.
link |
00:30:05.520
I just said it's an important question.
link |
00:30:07.680
And the first exercise is to express it mathematically.
link |
00:30:12.280
What do you want to do?
link |
00:30:13.800
Like, if I tell you what will be the effect
link |
00:30:17.000
of taking this drug, you have to say that in mathematics.
link |
00:30:21.320
How do you say that?
link |
00:30:22.880
Yes.
link |
00:30:23.720
Can you write down the question, not the answer?
link |
00:30:27.720
I want to find the effect of the drug on my headache.
link |
00:30:32.420
Right.
link |
00:30:33.260
Write down, write it down.
link |
00:30:34.680
That's where the do calculus comes in.
link |
00:30:35.920
Yes.
link |
00:30:36.760
The do operator, what is the do operator?
link |
00:30:38.240
Do operator, yeah.
link |
00:30:40.080
Which is nice.
link |
00:30:40.920
It's the difference in association and intervention.
link |
00:30:43.280
Very beautifully sort of constructed.
link |
00:30:45.760
Yeah, so we have a do operator.
link |
00:30:48.880
So the do calculus, built on the do operator itself,
link |
00:30:52.560
connects the operation of doing
link |
00:30:55.560
to something that we can see.
link |
00:30:57.880
Right.
link |
00:30:58.720
So as opposed to the purely observing,
link |
00:31:01.740
you're making the choice to change a variable.
link |
00:31:05.920
That's what it expresses.
link |
00:31:08.240
And then the way that we interpret it,
link |
00:31:11.840
the mechanism by which we take your query
link |
00:31:15.400
and we translate it into something that we can work with
link |
00:31:18.640
is by giving it semantics,
link |
00:31:21.040
saying that you have a model of the world
link |
00:31:23.320
and you cut off all the incoming arrows into X
link |
00:31:26.840
and you're looking now in the modified mutilated model,
link |
00:31:30.680
you ask for the probability of Y.
link |
00:31:33.660
That is interpretation of doing X
link |
00:31:36.360
because by doing things, you've liberated them
link |
00:31:40.240
from all influences that acted upon them earlier
link |
00:31:45.200
and you subject them to the tyranny of your muscles.
link |
00:31:50.080
So you remove all the questions about causality
link |
00:31:54.040
by doing them.
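Pearl's surgery, cutting the arrows into X and reading off the probability of Y in the mutilated model, can be sketched with a toy simulation. The graph, variable names, and probabilities below are invented for illustration, not taken from the conversation:

```python
import random

random.seed(0)

def sample(do_x=None):
    # Toy structural model with a confounder: Z -> X, Z -> Y, X -> Y.
    z = random.random() < 0.5
    if do_x is None:
        x = (random.random() < 0.9) == z               # X normally listens to Z
    else:
        x = do_x                                       # do(X): the arrow Z -> X is cut
    y = random.random() < 0.2 + 0.3 * x + 0.4 * z      # Y listens to X and Z
    return z, x, y

n = 100_000
obs = [y for z, x, y in (sample() for _ in range(n)) if x]
intervened = [y for z, x, y in (sample(do_x=True) for _ in range(n))]
print(round(sum(obs) / len(obs), 2))                # P(Y=1 | X=1), inflated by the confounder
print(round(sum(intervened) / len(intervened), 2))  # P(Y=1 | do(X=1)), analytically 0.7
```

Observing X = 1 also tells us Z was probably 1, so the conditional probability overstates the causal effect; forcing X = 1 liberates it from Z's influence, which is exactly the surgery being described.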
link |
00:31:55.740
So you're now.
link |
00:31:56.580
There's one level of questions.
link |
00:31:58.160
Yeah.
link |
00:31:59.000
Answer questions about what will happen if you do things.
link |
00:32:01.920
If you do, if you drink the coffee,
link |
00:32:03.320
if you take the aspirin.
link |
00:32:04.160
Right.
link |
00:32:05.320
So how do we get the doing data?
link |
00:32:11.360
Now the question is, if we cannot run experiments,
link |
00:32:16.560
then we have to rely on observational study.
link |
00:32:20.960
So first we could, sorry to interrupt,
link |
00:32:22.560
we could run an experiment where we do something,
link |
00:32:25.880
where we drink the coffee and this,
link |
00:32:28.200
the do operator allows you to sort of be systematic
link |
00:32:31.120
about expressing.
link |
00:32:31.960
So imagine how the experiment will look like
link |
00:32:34.560
even though we cannot physically
link |
00:32:36.880
and technologically conduct it.
link |
00:32:38.760
I'll give you an example.
link |
00:32:40.600
What is the effect of blood pressure on mortality?
link |
00:32:44.960
I cannot go down into your vein
link |
00:32:47.360
and change your blood pressure,
link |
00:32:49.400
but I can ask the question,
link |
00:32:52.060
which means I can even have a model of your body.
link |
00:32:55.120
I can imagine the effect of your,
link |
00:32:58.600
how the blood pressure change will affect your mortality.
link |
00:33:04.720
How?
link |
00:33:05.560
I go into the model and I conduct this surgery
link |
00:33:09.760
about the blood pressure,
link |
00:33:12.080
even though physically I cannot do it.
link |
00:33:17.200
Let me ask the quantum mechanics question.
link |
00:33:19.760
Does the doing change the observation?
link |
00:33:23.720
Meaning the surgery of changing the blood pressure is,
link |
00:33:28.240
I mean.
link |
00:33:29.080
No, the surgery is,
link |
00:33:31.160
what I call very delicate.
link |
00:33:35.440
It's very delicate, infinitely delicate.
link |
00:33:37.840
Incisive and delicate, which means,
link |
00:33:40.760
do means, do X means,
link |
00:33:44.720
I'm gonna touch only X.
link |
00:33:46.480
Only X.
link |
00:33:47.320
Directly into X.
link |
00:33:50.160
So that means that I change only things
link |
00:33:52.800
which depends on X by virtue of X changing,
link |
00:33:56.800
but I don't change things which do not depend on X.
link |
00:34:00.400
Like I wouldn't change your sex or your age,
link |
00:34:04.240
I just change your blood pressure.
link |
00:34:06.880
So in the case of blood pressure,
link |
00:34:08.720
it may be difficult or impossible
link |
00:34:11.160
to construct such an experiment.
link |
00:34:12.800
No, physically yes, but hypothetically no.
link |
00:34:16.560
Hypothetically no.
link |
00:34:17.400
If we have a model, that is what the model is for.
link |
00:34:20.720
So you conduct surgeries on a model,
link |
00:34:24.640
you take it apart, put it back,
link |
00:34:26.880
that's the idea of a model.
link |
00:34:28.880
It's the idea of thinking counterfactually, imagining,
link |
00:34:31.640
and that's the idea of creativity.
link |
00:34:35.120
So by constructing that model,
link |
00:34:36.400
you can start to infer if the higher the blood pressure
link |
00:34:43.040
leads to mortality, which increases or decreases by.
link |
00:34:47.320
I construct the model, I still cannot answer it.
link |
00:34:50.800
I have to see if I have enough information in the model
link |
00:34:53.800
that would allow me to find out the effects of intervention
link |
00:34:58.360
from a noninterventional, hands-off study.
link |
00:35:04.520
So what's needed?
link |
00:35:06.240
You need to have assumptions about who affects whom.
link |
00:35:13.480
If the graph had a certain property,
link |
00:35:16.400
the answer is yes, you can get it from observational study.
link |
00:35:20.560
If the graph is too bushy,
link |
00:35:23.760
the answer is no, you cannot.
link |
00:35:25.680
Then you need to find either different kind of observation
link |
00:35:30.680
that you haven't considered, or one experiment.
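Whether the answer is recoverable from observation depends on the graph, as Pearl says. For a toy confounded graph Z -> X, Z -> Y, X -> Y, the back door through Z can be blocked by adjustment, so P(Y | do(X)) is identifiable from purely observational samples. A minimal sketch, with the model and numbers invented for illustration:

```python
import random
from collections import Counter

random.seed(1)

def sample():
    # Toy confounded model: Z -> X, Z -> Y, X -> Y (observational regime only).
    z = random.random() < 0.5
    x = (random.random() < 0.9) == z
    y = random.random() < 0.2 + 0.3 * x + 0.4 * z
    return z, x, y

n = 200_000
counts = Counter(sample() for _ in range(n))

def count(z=None, x=None, y=None):
    # Count samples matching the given settings (None = don't care).
    return sum(c for (zv, xv, yv), c in counts.items()
               if (z is None or zv == z)
               and (x is None or xv == x)
               and (y is None or yv == y))

# Back-door adjustment: P(Y=1 | do(X=1)) = sum_z P(Y=1 | X=1, z) * P(z)
est = sum(count(z=zv, x=True, y=True) / count(z=zv, x=True) * count(z=zv) / n
          for zv in (False, True))
print(round(est, 2))  # close to the true interventional value, analytically 0.7
```

No intervention was simulated here: the causal effect comes out of observational counts plus the graph assumption that Z is the only back-door path.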
link |
00:35:35.120
So basically, that puts a lot of pressure on you
link |
00:35:38.880
to encode wisdom into that graph.
link |
00:35:41.920
Correct.
link |
00:35:42.960
But you don't have to encode more than what you know.
link |
00:35:47.520
God forbid, if you put the,
link |
00:35:49.680
like economists are doing this,
link |
00:35:51.400
they call them identifying assumptions.
link |
00:35:52.880
They put assumptions, even if they don't prevail
link |
00:35:55.120
in the world, they put assumptions
link |
00:35:56.840
so they can identify things.
link |
00:35:59.240
But the problem is, yes, beautifully put,
link |
00:36:01.480
but the problem is you don't know what you don't know.
link |
00:36:04.440
So.
link |
00:36:05.920
You know what you don't know.
link |
00:36:07.560
Because if you don't know, you say it's possible.
link |
00:36:10.600
It's possible that X affect the traffic tomorrow.
link |
00:36:17.720
It's possible.
link |
00:36:18.640
You put down an arrow which says it's possible.
link |
00:36:20.880
Every arrow in the graph says it's possible.
link |
00:36:23.960
So there's not a significant cost to adding arrows that.
link |
00:36:28.000
The more arrow you add, the less likely you are
link |
00:36:32.240
to identify things from purely observational data.
link |
00:36:37.680
So if the whole world is bushy,
link |
00:36:41.360
and everybody affects everybody else,
link |
00:36:45.400
the answer is, you can answer it ahead of time.
link |
00:36:49.160
I cannot answer my query from observational data.
link |
00:36:54.240
I have to go to experiments.
link |
00:36:56.640
So you talk about machine learning
link |
00:36:58.360
is essentially learning by association,
link |
00:37:01.600
or reasoning by association,
link |
00:37:03.120
and this do calculus is allowing for intervention.
link |
00:37:07.120
I like that word.
link |
00:37:09.000
Action.
link |
00:37:09.840
So you also talk about counterfactuals.
link |
00:37:12.400
Yeah.
link |
00:37:13.220
And trying to sort of understand the difference
link |
00:37:15.880
between counterfactuals and intervention.
link |
00:37:19.360
First of all, what is counterfactuals,
link |
00:37:22.320
and why are they useful?
link |
00:37:26.200
Why are they especially useful,
link |
00:37:29.700
as opposed to just reasoning what effect actions have?
link |
00:37:34.880
Well, counterfactual contains
link |
00:37:37.040
what we normally call explanations.
link |
00:37:40.280
Can you give an example of a counterfactual?
link |
00:37:41.600
If I tell you that acting one way affects something else,
link |
00:37:45.240
I didn't explain anything yet.
link |
00:37:47.720
But if I ask you, was it the aspirin that cured my headache?
link |
00:37:55.400
I'm asking for explanation, what cured my headache?
link |
00:37:58.640
And putting a finger on aspirin provide an explanation.
link |
00:38:04.640
It was aspirin that was responsible
link |
00:38:08.160
for your headache going away.
link |
00:38:11.560
If you didn't take the aspirin,
link |
00:38:14.420
you would still have a headache.
link |
00:38:16.000
So by saying if I didn't take aspirin,
link |
00:38:20.280
I would have a headache, you're thereby saying
link |
00:38:22.760
that aspirin is the thing that removes the headache.
link |
00:38:26.360
But you have to have another important information.
link |
00:38:30.440
I took the aspirin, and my headache is gone.
link |
00:38:34.560
It's very important information.
link |
00:38:36.400
Now I'm reasoning backward,
link |
00:38:38.080
and I said, was it the aspirin?
link |
00:38:40.520
Yeah.
link |
00:38:42.080
By considering what would have happened
link |
00:38:44.440
if everything else is the same, but I didn't take aspirin.
link |
00:38:47.000
That's right.
link |
00:38:47.840
So you know that things took place.
link |
00:38:51.040
Joe killed Schmoe, and Schmoe would be alive
link |
00:38:56.360
had Joe not used his gun.
link |
00:38:59.320
Okay, so that is the counterfactual.
link |
00:39:02.080
You have a conflict here, or clash,
link |
00:39:06.640
between observed fact,
link |
00:39:09.600
but he did shoot, okay?
link |
00:39:13.600
And the hypothetical predicate,
link |
00:39:16.600
which says had he not shot,
link |
00:39:18.800
you have a logical clash.
link |
00:39:21.540
They cannot exist together.
link |
00:39:23.820
That's the counterfactual.
link |
00:39:24.920
And that is the source of our explanation
link |
00:39:28.280
of the idea of responsibility, regret, and free will.
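The "was it the aspirin?" explanation can be sketched as a counterfactual query on a tiny invented model, where the headache goes away if either the aspirin or the good news does its job. Hold everything else fixed, retract only the aspirin, and see whether the cure survives:

```python
# Toy mechanism (an assumption for illustration): either cause suffices.
def headache_cured(aspirin, good_news):
    return aspirin or good_news

# Observed facts: I took the aspirin, I also received good news, headache gone.
aspirin, good_news = True, True
assert headache_cured(aspirin, good_news)

# Counterfactual: everything else the same, but no aspirin.
print(headache_cured(False, good_news))  # True: still cured, so in this story
                                         # the aspirin was not a necessary cause
```

This is the clash Pearl describes: the observed fact (aspirin taken) coexists with a hypothetical predicate (aspirin retracted) evaluated in the same world.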
link |
00:39:34.820
Yeah, so it certainly seems
link |
00:39:37.200
that's the highest level of reasoning, right?
link |
00:39:39.760
Yeah, and physicists do it all the time.
link |
00:39:41.880
Who does it all the time?
link |
00:39:42.760
Physicists.
link |
00:39:43.580
Physicists.
link |
00:39:44.920
In every equation of physics,
link |
00:39:47.120
let's say you have a Hooke's law,
link |
00:39:49.560
and you put one kilogram on the spring,
link |
00:39:52.240
and the spring is one meter,
link |
00:39:54.680
and you say, had this weight been two kilogram,
link |
00:39:58.400
the spring would have been twice as long.
link |
00:40:02.060
It's no problem for physicists to say that,
link |
00:40:05.560
except that mathematics is only in the form of equation,
link |
00:40:10.240
okay, equating the weight,
link |
00:40:14.680
proportionality constant, and the length of the spring.
link |
00:40:18.520
So you don't have the asymmetry
link |
00:40:22.080
in the equation of physics,
link |
00:40:23.280
although every physicist thinks counterfactually.
link |
00:40:26.840
Ask the high school kids,
link |
00:40:29.160
had the weight been three kilograms,
link |
00:40:31.120
what would be the length of the spring?
link |
00:40:33.360
They can answer it immediately,
link |
00:40:35.160
because they do the counterfactual processing in their mind,
link |
00:40:38.760
and then they put it into equation,
link |
00:40:41.120
algebraic equation, and they solve it, okay?
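The high-schooler's computation is exactly this: abduce the mechanism from the one observation, then rerun it with a hypothetical weight. A sketch taking the transcript's proportionality at face value:

```python
# Hooke's law as a one-equation structural model: length = k * weight.
observed_weight_kg, observed_length_m = 1.0, 1.0
k = observed_length_m / observed_weight_kg       # abduce the mechanism from the fact

for counterfactual_weight in (2.0, 3.0):
    # "Had the weight been w kilograms, the spring would have been k * w meters."
    print(counterfactual_weight, "kg ->", k * counterfactual_weight, "m")
```

The symmetric equation carries no arrow, but the counterfactual reading does: we vary the weight and rederive the length, never the other way around.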
link |
00:40:44.280
But a robot cannot do that.
link |
00:40:46.720
How do you make a robot learn these relationships?
link |
00:40:51.000
Well, why would you learn?
link |
00:40:53.240
Suppose you tell him, can you do it?
link |
00:40:55.560
So before you go learning,
link |
00:40:57.520
you have to ask yourself,
link |
00:40:59.360
suppose I give you all the information, okay?
link |
00:41:01.820
Can the robot perform the task that I ask him to perform?
link |
00:41:07.820
Can he reason and say, no, it wasn't the aspirin.
link |
00:41:10.980
It was the good news you received on the phone.
link |
00:41:15.500
Right, because, well, unless the robot had a model,
link |
00:41:19.620
a causal model of the world.
link |
00:41:23.620
Right, right.
link |
00:41:24.740
I'm sorry I have to linger on this.
link |
00:41:26.260
But now we have to linger and we have to say,
link |
00:41:27.860
how do we do it?
link |
00:41:29.100
How do we build it?
link |
00:41:29.940
Yes.
link |
00:41:30.760
How do we build a causal model
link |
00:41:32.180
without a team of human experts running around?
link |
00:41:37.260
Why don't you go to learning right away?
link |
00:41:39.560
You're too much involved with learning.
link |
00:41:41.180
Because I like babies.
link |
00:41:42.100
Babies learn fast.
link |
00:41:43.140
I'm trying to figure out how they do it.
link |
00:41:45.140
Good.
link |
00:41:45.980
So that's another question.
link |
00:41:47.660
How do the babies come out with a counterfactual
link |
00:41:50.340
model of the world?
link |
00:41:51.760
And babies do that.
link |
00:41:53.540
They know how to play in the crib.
link |
00:41:56.860
They know which balls hit another one.
link |
00:41:59.500
And they learn it by playful manipulation of the world.
link |
00:42:06.820
Yes.
link |
00:42:07.660
The simple world involves only toys and balls and chimes.
link |
00:42:13.620
But if you think about it, it's a complex world.
link |
00:42:17.260
We take for granted how complicated.
link |
00:42:20.700
And kids do it by playful manipulation
link |
00:42:23.740
plus parents guidance, peer wisdom, and hearsay.
link |
00:42:30.860
They meet each other and they say,
link |
00:42:34.780
you shouldn't have taken my toy.
link |
00:42:38.940
Right.
link |
00:42:40.820
And these multiple sources of information,
link |
00:42:43.540
they're able to integrate.
link |
00:42:45.780
So the challenge is about how to integrate,
link |
00:42:49.260
how to form these causal relationships
link |
00:42:52.620
from different sources of data.
link |
00:42:54.260
Correct.
link |
00:42:55.600
So how much information is it to play,
link |
00:42:59.980
how much causal information is required
link |
00:43:03.060
to be able to play in the crib with different objects?
link |
00:43:06.900
I don't know.
link |
00:43:08.260
I haven't experimented with the crib.
link |
00:43:11.340
Okay, not a crib.
link |
00:43:12.700
Picking up, manipulating physical objects
link |
00:43:15.860
on this very, opening the pages of a book,
link |
00:43:19.800
all the tasks, the physical manipulation tasks.
link |
00:43:23.740
Do you have a sense?
link |
00:43:25.260
Because my sense is the world is extremely complicated.
link |
00:43:27.980
It's extremely complicated.
link |
00:43:29.420
I agree, and I don't know how to organize it
link |
00:43:31.260
because I've been spoiled by easy problems
link |
00:43:34.660
such as cancer and death, okay?
link |
00:43:37.240
And I'm a, but she's a.
link |
00:43:39.820
First we have to start trying to.
link |
00:43:41.060
No, but it's easy.
link |
00:43:42.660
There is in a sense that you have only 20 variables.
link |
00:43:46.980
And they are just variables and not mechanics.
link |
00:43:51.060
Okay, it's easy.
link |
00:43:52.300
You just put them on the graph and they speak to you.
link |
00:43:57.980
Yeah, and you're providing a methodology
link |
00:44:00.540
for letting them speak.
link |
00:44:02.380
Yeah.
link |
00:44:03.220
I'm working only in the abstract.
link |
00:44:05.140
The abstract was knowledge in, knowledge out,
link |
00:44:08.980
data in between.
link |
00:44:11.820
Now, can we take a leap to trying to learn
link |
00:44:15.100
in this very, when it's not 20 variables,
link |
00:44:18.140
but 20 million variables, trying to learn causation
link |
00:44:23.180
in this world?
link |
00:44:24.060
Not learn, but somehow construct models.
link |
00:44:27.180
I mean, it seems like you would only have to be able
link |
00:44:29.620
to learn because constructing it manually
link |
00:44:33.920
would be too difficult.
link |
00:44:35.580
Do you have ideas of?
link |
00:44:37.900
I think it's a matter of combining simple models
link |
00:44:41.220
from many, many sources, from many, many disciplines,
link |
00:44:45.460
and many metaphors.
link |
00:44:48.220
Metaphors are the basis of human intelligence.
link |
00:44:51.940
Yeah, so how do you think about a metaphor
link |
00:44:53.940
in terms of its use in human intelligence?
link |
00:44:56.180
Metaphor is an expert system.
link |
00:45:01.900
It's mapping a problem
link |
00:45:05.700
from a problem with which you are not familiar
link |
00:45:09.580
to a problem with which you are familiar.
link |
00:45:13.800
Like, I'll give you a good example.
link |
00:45:15.980
The Greek believed that the sky is an opaque shell.
link |
00:45:22.240
It's not really infinite space.
link |
00:45:25.940
It's an opaque shell, and the stars are holes
link |
00:45:30.580
poked in the shells through which you see
link |
00:45:32.980
the eternal light.
link |
00:45:34.260
That was a metaphor.
link |
00:45:35.940
Why?
link |
00:45:36.960
Because they understand how you poke holes in the shells.
link |
00:45:42.800
They were not familiar with infinite space.
link |
00:45:47.260
And we are walking on a shell of a turtle,
link |
00:45:52.780
and if you get too close to the edge,
link |
00:45:54.520
you're gonna fall down to Hades or wherever.
link |
00:45:58.940
That's a metaphor.
link |
00:46:00.780
It's not true.
link |
00:46:02.180
But this kind of metaphor enabled Eratosthenes
link |
00:46:07.220
to measure the radius of the Earth,
link |
00:46:10.680
because he said, come on, if we are walking on a turtle shell,
link |
00:46:15.140
then the ray of light coming to this place
link |
00:46:20.700
will be a different angle than coming to this place.
link |
00:46:23.040
I know the distance, I'll measure the two angles,
link |
00:46:26.420
and then I have the radius of the shell of the turtle.
link |
00:46:33.640
And he did, and he found his measurement
link |
00:46:38.460
very close to the measurements we have today,
link |
00:46:43.300
through the, what, 6,700 kilometers of the Earth.
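The measurement the metaphor enables reduces to arc geometry: if sun rays hit two sites a known distance apart at angles differing by d, the radius is distance / d (in radians). The 7.2 degrees and roughly 800 km below are the classically quoted figures for Eratosthenes' measurement, supplied here as assumptions rather than numbers from the conversation:

```python
import math

# Classically quoted inputs (assumed, not from the transcript):
angle_difference = math.radians(7.2)  # difference in the sun's angle at the two sites
distance_km = 800.0                   # distance between the two sites

radius_km = distance_km / angle_difference
print(round(radius_km))  # 6366, close to the modern ~6371 km
```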
link |
00:46:52.020
That's something that would not occur
link |
00:46:55.060
to Babylonian astronomer,
link |
00:46:59.760
even though the Babylonian experimenters
link |
00:47:01.900
were the machine learning people of the time.
link |
00:47:04.600
They fit curves, and they could predict
link |
00:47:07.900
the eclipse of the moon much more accurately
link |
00:47:12.220
than the Greeks, because they fit curves.
link |
00:47:17.160
That's a different metaphor.
link |
00:47:19.220
Something that you're familiar with,
link |
00:47:20.580
a game, a turtle shell.
link |
00:47:21.820
Okay?
link |
00:47:24.380
What does it mean if you are familiar?
link |
00:47:27.520
Familiar means that answers to certain questions
link |
00:47:31.900
are explicit.
link |
00:47:33.460
You don't have to derive them.
link |
00:47:35.580
And they were made explicit because somewhere in the past
link |
00:47:39.700
you've constructed a model of that.
link |
00:47:43.540
Yeah, you're familiar with,
link |
00:47:44.940
so the child is familiar with billiard balls.
link |
00:47:48.340
So the child could predict that if you let loose
link |
00:47:51.420
of one ball, the other one will bounce off.
link |
00:47:55.740
You obtain that by familiarity.
link |
00:48:00.180
Familiarity is answering questions,
link |
00:48:02.920
and you store the answer explicitly.
link |
00:48:05.880
You don't have to derive them.
link |
00:48:08.020
So this is the idea of a metaphor.
link |
00:48:09.620
All our life, all our intelligence
link |
00:48:11.620
is built around metaphors,
link |
00:48:13.420
mapping from the unfamiliar to the familiar.
link |
00:48:16.220
But the marriage between the two is a tough thing,
link |
00:48:20.540
which we haven't yet been able to algorithmatize.
link |
00:48:24.740
So you think of that process of using metaphor
link |
00:48:29.300
to leap from one place to another,
link |
00:48:31.900
we can call it reasoning?
link |
00:48:34.540
Is it a kind of reasoning?
link |
00:48:35.900
It is reasoning by metaphor, metaphorical reasoning.
link |
00:48:39.500
Do you think of that as learning?
link |
00:48:44.220
So learning is a popular terminology today
link |
00:48:46.500
in a narrow sense.
link |
00:48:47.620
It is, it is, it is definitely a form.
link |
00:48:49.780
So you may not, okay, right.
link |
00:48:51.620
It's one of the most important learnings,
link |
00:48:53.780
taking something which theoretically is derivable
link |
00:48:57.620
and storing it in an accessible format.
link |
00:49:01.700
I'll give you an example, chess, okay?
link |
00:49:07.780
Finding the winning starting move in chess is hard.
link |
00:49:12.780
It is hard, but there is an answer.
link |
00:49:21.260
Either there is a winning move for white
link |
00:49:23.780
or there isn't, or there is a draw, okay?
link |
00:49:26.820
So it is, the answer to that
link |
00:49:30.260
is available through the rules of the game.
link |
00:49:33.700
But we don't know the answer.
link |
00:49:35.460
So what does a chess master have that we don't have?
link |
00:49:38.500
He has stored explicitly an evaluation
link |
00:49:41.780
of certain complex pattern of the board.
link |
00:49:45.340
We don't have it.
link |
00:49:46.180
Ordinary people like me, I don't know about you,
link |
00:49:50.940
I'm not a chess master.
link |
00:49:52.820
So for me, I have to derive things that for him is explicit.
link |
00:49:58.620
He has seen it before, or he has seen the pattern before,
link |
00:50:02.500
or similar pattern, you see metaphor, yeah?
link |
00:50:05.100
And he generalizes and says, don't move, it's a dangerous move.
link |
00:50:13.460
It's just that not in the game of chess,
link |
00:50:15.540
but in the game of billiard balls,
link |
00:50:18.940
we humans are able to initially derive very effectively
link |
00:50:22.420
and then reason by metaphor very effectively
link |
00:50:25.100
and make it look so easy that it makes one wonder
link |
00:50:28.700
how hard is it to build it in a machine.
link |
00:50:31.260
So in your sense, how far away are we
link |
00:50:38.260
to be able to construct?
link |
00:50:40.660
I don't know, I'm not a futurist.
link |
00:50:42.780
All I can tell you is that we are making tremendous progress
link |
00:50:48.380
in the causal reasoning domain.
link |
00:50:52.100
Something that I even dare to call it revolution,
link |
00:50:57.100
the causal revolution, because what we have achieved
link |
00:51:02.780
in the past three decades is something that dwarf
link |
00:51:09.660
everything that was derived in the entire history.
link |
00:51:13.660
So there's an excitement about
link |
00:51:15.860
current machine learning methodologies,
link |
00:51:18.700
and there's really important good work you're doing
link |
00:51:22.100
in causal inference.
link |
00:51:24.620
Where does the future, where do these worlds collide
link |
00:51:32.980
and what does that look like?
link |
00:51:35.060
First, they're gonna work without collision.
link |
00:51:37.780
It's gonna work in harmony.
link |
00:51:40.580
Harmony, it's not collision.
link |
00:51:41.740
The human is going to jumpstart the exercise
link |
00:51:46.740
by providing qualitative, noncommitting models
link |
00:51:55.060
of how the universe works, how in reality
link |
00:52:00.300
the domain of discourse works.
link |
00:52:03.780
The machine is gonna take over from that point of view
link |
00:52:06.820
and derive whatever the calculus says can be derived.
link |
00:52:11.820
Namely, quantitative answer to our questions.
link |
00:52:15.100
Now, these are complex questions.
link |
00:52:18.420
I'll give you some example of complex questions
link |
00:52:21.220
that will bug your mind if you think about it.
link |
00:52:27.580
You take result of studies in diverse population
link |
00:52:33.660
under diverse condition, and you infer the cause effect
link |
00:52:38.660
of a new population which doesn't even resemble
link |
00:52:43.140
any of the ones studied, and you do that by do calculus.
link |
00:52:48.300
You do that by generalizing from one study to another.
link |
00:52:52.660
See, what's common with both?
link |
00:52:54.940
What is different?
link |
00:52:57.020
Let's ignore the differences and pull out the commonality,
link |
00:53:01.180
and you do it over maybe 100 hospitals around the world.
link |
00:53:06.140
From that, you can get really mileage from big data.
link |
00:53:11.140
It's not only that you have many samples,
link |
00:53:15.060
you have many sources of data.
link |
00:53:18.740
So that's a really powerful thing, I think,
link |
00:53:21.340
especially for medical applications.
link |
00:53:23.420
I mean, cure cancer, right?
link |
00:53:25.860
That's how from data you can cure cancer.
link |
00:53:28.580
So we're talking about causation,
link |
00:53:30.100
which is the temporal relationships between things.
link |
00:53:35.180
Not only temporal, it's both structural and temporal.
link |
00:53:38.700
Temporal, no. Temporal precedence by itself
link |
00:53:43.020
cannot replace causation.
link |
00:53:46.820
Is temporal precedence the arrow of time in physics?
link |
00:53:50.900
It's important, necessary.
link |
00:53:52.220
It's important.
link |
00:53:53.060
It's efficient, yes.
link |
00:53:54.260
Is it?
link |
00:53:55.780
Yes, I've never seen cause propagate backward.
link |
00:54:00.220
But if we use the word cause,
link |
00:54:03.940
but there's relationships that are timeless.
link |
00:54:07.060
I suppose that's still forward in the arrow of time.
link |
00:54:10.380
But are there relationships, logical relationships,
link |
00:54:14.860
that fit into the structure?
link |
00:54:18.580
Sure, the whole do calculus is logical relationship.
link |
00:54:21.980
That doesn't require a temporal.
link |
00:54:23.780
It has just the condition that
link |
00:54:26.940
you're not traveling back in time.
link |
00:54:28.580
Yes, correct.
link |
00:54:31.180
So it's really a generalization of,
link |
00:54:34.020
a powerful generalization of what?
link |
00:54:39.740
Of Boolean logic.
link |
00:54:40.700
Yeah, Boolean logic.
link |
00:54:41.700
Yes.
link |
00:54:43.460
That is sort of simply put,
link |
00:54:46.260
and allows us to reason about the order of events,
link |
00:54:53.620
the source, the.
link |
00:54:54.460
Not about the order; we're not deriving the order of events.
link |
00:54:58.020
We are given cause effects relationship, okay?
link |
00:55:01.340
They ought to be obeying the temporal precedence relationship.
link |
00:55:08.900
We are given it.
link |
00:55:09.940
And now that we ask questions about
link |
00:55:12.500
other causes of relationship,
link |
00:55:14.260
that could be derived from the initial ones,
link |
00:55:17.860
but were not given to us explicitly.
link |
00:55:20.940
Like the case of the firing squad I gave you
link |
00:55:26.260
in the first chapter.
link |
00:55:28.220
And I ask, what if rifleman A declined to shoot?
link |
00:55:33.860
Would the prisoner still be dead?
link |
00:55:37.940
To decline to shoot means that he disobeyed orders.
link |
00:55:42.020
And the rules of the game were that he is an
link |
00:55:48.180
obedient marksman, okay?
link |
00:55:51.820
That's how you start.
link |
00:55:52.660
That's the initial order.
link |
00:55:53.500
But now you ask question about breaking the rules.
link |
00:55:56.540
What if he decided not to pull the trigger?
link |
00:56:00.420
He just became a pacifist.
link |
00:56:03.620
And you and I can answer that.
link |
00:56:06.020
The other rifleman would have killed him, okay?
link |
00:56:09.220
I want the machine to do that.
link |
00:56:12.300
Is it so hard to ask a machine to do that?
link |
00:56:15.420
It's such a simple task.
link |
00:56:16.740
You have to have a calculus for that.
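The calculus Pearl asks for can be sketched on a tiny structural model of the firing squad (court order -> captain -> riflemen A and B -> death), with the counterfactual computed in the usual three steps: abduction, action, prediction. An illustrative toy with deterministic mechanisms assumed:

```python
def execute(court_order, a_shoots=None):
    # Structural equations: each variable listens to its parents.
    captain_signals = court_order
    a = captain_signals if a_shoots is None else a_shoots  # surgery point for A
    b = captain_signals
    return a or b  # the prisoner dies if either rifleman shoots

# Step 1, abduction: we observed the prisoner dead, so in this deterministic
# model the court order must have been given.
court_order = True
assert execute(court_order)

# Steps 2 and 3, action and prediction: rerun the SAME world,
# but surgically force rifleman A not to pull the trigger.
print(execute(court_order, a_shoots=False))  # True: rifleman B still kills him
```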
link |
00:56:19.380
Yes, yeah.
link |
00:56:21.180
But the curiosity, the natural curiosity for me is
link |
00:56:24.300
that yes, you're absolutely correct and important.
link |
00:56:27.980
And it's hard to believe that we haven't done this
link |
00:56:31.060
seriously extensively already a long time ago.
link |
00:56:35.380
So this is really important work.
link |
00:56:37.020
But I also wanna know, maybe you can philosophize
link |
00:56:41.340
about how hard is it to learn.
link |
00:56:43.340
Okay, let's assume we're learning.
link |
00:56:44.420
We wanna learn it, okay?
link |
00:56:45.620
We wanna learn.
link |
00:56:46.460
So what do we do?
link |
00:56:47.280
We put a learning machine that watches execution trials
link |
00:56:51.980
in many countries and many locations, okay?
link |
00:56:57.220
All the machine can learn is to see shot or not shot.
link |
00:57:01.040
Dead, not dead.
link |
00:57:02.920
A court issued an order or didn't, okay?
link |
00:57:05.620
Just the facts.
link |
00:57:07.340
From the facts you don't know who listens to whom.
link |
00:57:10.140
You don't know that the condemned person
link |
00:57:13.740
listened to the bullets,
link |
00:57:15.280
that the bullets are listening to the captain, okay?
link |
00:57:19.300
All we hear is one command, two shots, dead, okay?
link |
00:57:24.300
A triple of variables.
link |
00:57:27.100
Yes, no, yes, no.
link |
00:57:29.300
Okay, from that, can you learn who listens to whom
link |
00:57:32.140
and answer the question? No.
link |
00:57:33.980
Definitively no.
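Pearl's "definitively no" can be made concrete with a hypothetical illustration: two different "who listens to whom" stories that generate exactly the same observed facts, so no amount of passively recorded trials can distinguish them. The world names below are invented for the sketch.

```python
# Two causal stories for the same firing squad, observationally identical.

def world_captain_commands(order):
    # Both riflemen listen to the captain (who relays the court order).
    a, b = order, order
    return (order, a, b, a or b)   # (order given, A shot, B shot, dead)

def world_b_copies_a(order):
    # Only A listens to the captain; B merely imitates A.
    a = order
    b = a
    return (order, a, b, a or b)

trials = [True, False, True, True, False]
assert [world_captain_commands(o) for o in trials] == \
       [world_b_copies_a(o) for o in trials]
# Identical facts, yet the worlds answer "what if A refrained?" differently:
# in the second world B would refrain too, and the prisoner would survive.
```

The strings behind the facts differ, the facts themselves do not, which is exactly why the question cannot be answered from data alone.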
link |
00:57:35.220
But don't you think you can start proposing ideas
link |
00:57:39.120
for humans to review?
link |
00:57:41.660
You want the machine to learn, you want a robot.
link |
00:57:44.380
So the robot is watching trials like that, 200 trials,
link |
00:57:50.780
and then he has to answer the question,
link |
00:57:52.580
what if rifleman A refrain from shooting?
link |
00:57:56.920
Yeah.
link |
00:57:58.660
How do I do that?
link |
00:58:00.260
That's exactly my point.
link |
00:58:03.620
Looking at the facts
link |
00:58:04.900
doesn't give you the strings behind the facts.
link |
00:58:07.140
Absolutely, but do you think of machine learning
link |
00:58:11.860
as it's currently defined as only something
link |
00:58:15.220
that looks at the facts and tries to do?
link |
00:58:17.620
Right now, they only look at the facts, yeah.
link |
00:58:19.180
So is there a way to modify, in your sense?
link |
00:58:23.500
Playful manipulation.
link |
00:58:25.140
Playful manipulation.
link |
00:58:26.140
Yes, once in a while.
link |
00:58:26.980
Doing the interventionist kind of thing, intervention.
link |
00:58:29.700
But it could be at random.
link |
00:58:31.140
For instance, the rifleman is sick that day
link |
00:58:34.540
or he just vomits or whatever.
link |
00:58:37.260
So the machine can observe this unexpected event,
link |
00:58:41.240
which introduces noise.
link |
00:58:43.580
The noise still has to be random
link |
00:58:46.420
to be able to relate it to a randomized experiment.
link |
00:58:51.660
And then you have observational studies
link |
00:58:55.460
from which to infer the strings behind the facts.
link |
00:58:59.940
It's doable to a certain extent.
link |
00:59:02.980
But now that we are expert in what you can do
link |
00:59:06.260
once you have a model, we can reason back and say,
link |
00:59:09.080
what kind of data you need to build a model.
link |
00:59:13.020
Got it, so I know you're not a futurist,
link |
00:59:17.300
but are you excited?
link |
00:59:19.760
Have you, when you look back at your life,
link |
00:59:22.500
longed for the idea of creating
link |
00:59:24.540
a human level intelligence system?
link |
00:59:25.940
Yeah, I'm driven by that.
link |
00:59:28.340
All my life, I'm driven just by one thing.
link |
00:59:33.780
But I go slowly.
link |
00:59:34.900
I go from what I know to the next step incrementally.
link |
00:59:39.380
So without imagining what the end goal looks like.
link |
00:59:42.440
Do you imagine what the end goal is?
link |
00:59:44.860
The end goal is gonna be a machine
link |
00:59:47.780
that can answer sophisticated questions,
link |
00:59:50.900
counterfactuals of regret, compassion,
link |
00:59:55.500
responsibility, and free will.
link |
00:59:59.380
So what is a good test?
link |
01:00:01.620
Is a Turing test a reasonable test?
link |
01:00:04.980
A test of free will doesn't exist yet.
link |
01:00:08.380
How would you test free will?
link |
01:00:10.940
So far, we know only one thing.
link |
01:00:14.940
If robots can communicate with reward and punishment
link |
01:00:20.500
among themselves and hitting each other on the wrist
link |
01:00:25.420
and saying, you shouldn't have done that, okay?
link |
01:00:27.820
Playing better soccer because they can do that.
link |
01:00:33.900
What do you mean, because they can do that?
link |
01:00:35.940
Because they can communicate among themselves.
link |
01:00:38.100
Because of the communication they can do.
link |
01:00:40.100
Because they communicate like us.
link |
01:00:42.660
Reward and punishment, yes.
link |
01:00:44.580
You didn't pass the ball at the right time,
link |
01:00:47.580
and so therefore you're gonna sit on the bench
link |
01:00:50.060
for the next two.
link |
01:00:51.540
If they start communicating like that,
link |
01:00:53.660
the question is, will they play better soccer?
link |
01:00:56.380
As opposed to what?
link |
01:00:57.660
As opposed to what they do now?
link |
01:00:59.660
Without this ability to reason about reward and punishment.
link |
01:01:04.980
Responsibility.
link |
01:01:06.420
And?
link |
01:01:07.940
Artifactions.
link |
01:01:08.780
So far, I can only think about communication.
link |
01:01:11.700
Communication is, and not necessarily natural language,
link |
01:01:15.380
but just communication.
link |
01:01:16.380
Just communication.
link |
01:01:17.580
And that's important to have a quick and effective means
link |
01:01:21.980
of communicating knowledge.
link |
01:01:24.100
If the coach tells you you should have passed the ball,
link |
01:01:26.460
he conveys so much knowledge to you,
link |
01:01:28.740
as opposed to what?
link |
01:01:30.460
Go down and change your software.
link |
01:01:33.420
That's the alternative.
link |
01:01:35.300
But the coach doesn't know your software.
link |
01:01:37.700
So how can the coach tell you
link |
01:01:39.580
you should have passed the ball?
link |
01:01:41.540
But our language is very effective.
link |
01:01:44.260
You should have passed the ball.
link |
01:01:45.580
You know your software.
link |
01:01:47.020
You tweak the right module, and next time you don't do it.
link |
01:01:52.180
Now that's for playing soccer,
link |
01:01:53.620
where the rules are well defined.
link |
01:01:55.180
No, no, no, no, they're not well defined.
link |
01:01:57.420
When you should pass the ball.
link |
01:01:58.820
Is not well defined.
link |
01:02:00.020
No, it's very soft, very noisy.
link |
01:02:04.420
Yes, you have to do it under pressure.
link |
01:02:06.700
It's art.
link |
01:02:07.820
But in terms of aligning values
link |
01:02:11.340
between computers and humans,
link |
01:02:15.260
do you think this cause and effect type of thinking
link |
01:02:20.220
is important to align the values,
link |
01:02:22.300
values, morals, ethics under which the machines
link |
01:02:25.540
make decisions, is the cause effect
link |
01:02:28.500
where the two can come together?
link |
01:02:32.100
Cause and effect is a necessary component
link |
01:02:34.780
to build an ethical machine.
link |
01:02:38.300
Because the machine has to empathize
link |
01:02:40.460
to understand what's good for you,
link |
01:02:42.660
to build a model of you as a recipient,
link |
01:02:47.180
which should be very much, what is compassion?
link |
01:02:50.900
They imagine that you suffer pain as much as me.
link |
01:02:56.020
As much as me.
link |
01:02:57.020
I do have already a model of myself, right?
link |
01:03:00.300
So it's very easy for me to map you to mine.
link |
01:03:02.780
I don't have to rebuild the model.
link |
01:03:04.620
It's much easier to say, oh, you're like me.
link |
01:03:06.940
Okay, therefore I would not hate you.
link |
01:03:09.620
And the machine has to imagine,
link |
01:03:12.100
has to try to fake to be human essentially
link |
01:03:14.540
so you can imagine that you're like me, right?
link |
01:03:19.420
And moreover, who is me?
link |
01:03:21.660
That's the first, that's consciousness.
link |
01:03:24.180
To have a model of yourself.
link |
01:03:26.860
Where do you get this model?
link |
01:03:28.100
You look at yourself as if you are a part
link |
01:03:30.980
of the environment.
link |
01:03:32.380
If you build a model of yourself
link |
01:03:33.940
versus the environment, then you can say,
link |
01:03:36.100
I need to have a model of myself.
link |
01:03:38.220
I have abilities, I have desires and so forth, okay?
link |
01:03:41.820
I have a blueprint of myself though.
link |
01:03:44.340
Not the full detail because I cannot
link |
01:03:46.140
get the whole thing, that's a problem.
link |
01:03:49.220
But I have a blueprint.
link |
01:03:50.700
So on that level of a blueprint, I can modify things.
link |
01:03:54.220
I can look at myself in the mirror and say,
link |
01:03:56.580
hmm, if I change this model, tweak this model,
link |
01:03:59.180
I'm gonna perform differently.
link |
01:04:02.340
That is what we mean by free will.
link |
01:04:05.740
And consciousness.
link |
01:04:06.940
And consciousness.
link |
01:04:08.260
What do you think is consciousness?
link |
01:04:10.380
Is it simply self awareness?
link |
01:04:11.820
So including yourself into the model of the world?
link |
01:04:14.660
That's right.
link |
01:04:15.500
Some people tell me, no, this is only part of consciousness.
link |
01:04:19.620
And then they start telling me what they really mean
link |
01:04:21.460
by consciousness, and I lose them.
link |
01:04:24.740
For me, consciousness is having a blueprint
link |
01:04:29.420
of your software.
link |
01:04:31.580
Do you have concerns about the future of AI?
link |
01:04:37.020
All the different trajectories of all of our research?
link |
01:04:39.660
Yes.
link |
01:04:40.700
Where's your hope for where the movement is headed,
link |
01:04:43.300
where are your concerns?
link |
01:04:44.380
I'm concerned, because I know we are building a new species
link |
01:04:49.500
that has a capability of exceeding our, exceeding us,
link |
01:04:56.820
exceeding our capabilities, and can breed itself
link |
01:05:01.300
and take over the world.
link |
01:05:02.540
Absolutely.
link |
01:05:03.580
It's a new species that is uncontrolled.
link |
01:05:07.660
We don't know the degree to which we control it.
link |
01:05:10.140
We don't even understand what it means
link |
01:05:12.660
to be able to control this new species.
link |
01:05:16.500
So I'm concerned.
link |
01:05:18.660
I don't have anything to add to that,
link |
01:05:21.100
because it's such a gray area, it's unknown.
link |
01:05:26.100
It never happened in history.
link |
01:05:29.820
The only time it happened in history
link |
01:05:34.260
was evolution with human beings.
link |
01:05:37.580
It wasn't very successful, was it?
link |
01:05:39.420
Some people say it was a great success.
link |
01:05:42.700
For us it was, but a few people along the way,
link |
01:05:46.380
a few creatures along the way would not agree.
link |
01:05:49.340
So it's just because it's such a gray area,
link |
01:05:53.100
there's nothing else to say.
link |
01:05:54.980
We have a sample of one.
link |
01:05:56.820
Sample of one.
link |
01:05:58.100
It's us.
link |
01:05:59.860
But some people would look at you and say,
link |
01:06:04.860
yeah, but we were looking to you to help us
link |
01:06:09.780
make sure that the second sample works out okay.
link |
01:06:13.180
We have more than a sample of one.
link |
01:06:14.860
We have theories, and that's good.
link |
01:06:18.700
We don't need to be statisticians.
link |
01:06:20.740
So sample of one doesn't mean poverty of knowledge.
link |
01:06:25.420
It's not.
link |
01:06:26.460
Sample of one plus theory, conjectural theory,
link |
01:06:30.540
of what could happen.
link |
01:06:32.940
That we do have.
link |
01:06:34.380
But I really feel helpless in contributing
link |
01:06:38.260
to this argument, because I know so little,
link |
01:06:41.980
and my imagination is limited,
link |
01:06:46.640
and I know how much I don't know,
link |
01:06:50.140
and I, but I'm concerned.
link |
01:06:55.540
You were born and raised in Israel.
link |
01:06:57.780
Born and raised in Israel, yes.
link |
01:06:59.260
And later served in Israel military,
link |
01:07:03.900
defense forces.
link |
01:07:05.220
In the Israel Defense Force.
link |
01:07:07.980
Yeah.
link |
01:07:09.720
What did you learn from that experience?
link |
01:07:13.820
From this experience?
link |
01:07:16.540
There's a kibbutz in there as well.
link |
01:07:18.140
Yes, because I was in the Nahal,
link |
01:07:20.540
which is a combination of agricultural work
link |
01:07:26.140
and military service.
link |
01:07:28.540
We were supposed, I was really an idealist.
link |
01:07:31.220
I wanted to be a member of the kibbutz throughout my life,
link |
01:07:36.180
and to live a communal life,
link |
01:07:38.700
and so I prepared myself for that.
link |
01:07:46.060
Slowly, slowly, I wanted a greater challenge.
link |
01:07:51.360
So that's a far world away, both.
link |
01:07:55.340
What I learned from that, what I can add,
link |
01:07:57.780
it was a miracle.
link |
01:08:01.340
It was a miracle that I served in the 1950s.
link |
01:08:07.700
I don't know how we survived.
link |
01:08:11.380
The country was under austerity.
link |
01:08:15.380
It tripled its population from 600,000 to 1.8 million
link |
01:08:21.460
when I finished college.
link |
01:08:23.320
No one went hungry.
link |
01:08:24.860
And austerity, yes.
link |
01:08:29.380
When you wanted to make an omelet in a restaurant,
link |
01:08:34.220
you had to bring your own egg.
link |
01:08:38.140
And they imprisoned people for bringing food
link |
01:08:43.740
from the farms here, from the villages, to the city.
link |
01:08:49.380
But no one went hungry.
link |
01:08:50.780
And I always add to it,
link |
01:08:53.620
and higher education did not suffer any budget cut.
link |
01:09:00.220
They still invested in me, in my wife, in our generation
link |
01:09:05.460
to get the best education that they could, okay?
link |
01:09:09.700
So I'm really grateful for the opportunity,
link |
01:09:15.200
and I'm trying to pay back now, okay?
link |
01:09:18.620
It's a miracle that we survived the war of 1948.
link |
01:09:22.900
We were so close to a second genocide.
link |
01:09:27.260
It was all planned.
link |
01:09:30.160
But we survived it by miracle,
link |
01:09:32.180
and then the second miracle
link |
01:09:33.580
that not many people talk about, the next phase.
link |
01:09:37.780
How no one went hungry,
link |
01:09:40.280
and the country managed to triple its population.
link |
01:09:43.940
You know what it means to triple?
link |
01:09:45.300
Imagine the United States going from what, 350 million
link |
01:09:50.220
to a billion, unbelievable.
link |
01:09:53.860
So it's a really tense part of the world.
link |
01:09:57.020
It's a complicated part of the world,
link |
01:09:59.060
Israel and all around.
link |
01:10:01.820
Religion is at the core of that complexity.
link |
01:10:08.100
One of the components.
link |
01:10:09.460
Religion is a strong motivating cause
link |
01:10:12.420
to many, many people in the Middle East, yes.
link |
01:10:16.500
In your view, looking back, is religion good for society?
link |
01:10:23.180
That's a good question for robotics, you know?
link |
01:10:26.900
There's echoes of that question.
link |
01:10:28.300
Equip a robot with religious belief.
link |
01:10:32.260
Suppose we find out, or we agree
link |
01:10:34.660
that religion is good to you, to keep you in line, okay?
link |
01:10:37.940
Should we give the robot the metaphor of a god?
link |
01:10:43.340
As a matter of fact, the robot will get it without us also.
link |
01:10:47.340
Why?
link |
01:10:48.180
The robot will reason by metaphor.
link |
01:10:51.020
And what is the most primitive metaphor
link |
01:10:56.380
a child grows with?
link |
01:10:58.940
Mother's smile, father's teaching,
link |
01:11:02.940
father image and mother image, that's god.
link |
01:11:05.380
So, whether you want it or not,
link |
01:11:08.800
the robot will, well, assuming that the robot
link |
01:11:12.560
is gonna have a mother and a father,
link |
01:11:14.800
it may only have a programmer,
link |
01:11:16.360
which doesn't supply warmth and discipline.
link |
01:11:20.960
Well, discipline it does.
link |
01:11:22.440
So the robot will have a model of the trainer,
link |
01:11:27.360
and everything that happens in the world,
link |
01:11:29.320
cosmology and so on, is going to be mapped
link |
01:11:32.280
into the programmer, it's god.
link |
01:11:36.520
Man, the thing that represents the origin
link |
01:11:41.380
of everything for that robot.
link |
01:11:43.960
It's the most primitive relationship.
link |
01:11:46.340
So it's gonna arrive there by metaphor.
link |
01:11:48.680
And so the question is if overall
link |
01:11:51.880
that metaphor has served us well as humans.
link |
01:11:56.200
I really don't know.
link |
01:11:58.000
I think it did, but as long as you keep
link |
01:12:01.680
in mind it's only a metaphor.
link |
01:12:05.160
So, if you think we can, can we talk about your son?
link |
01:12:11.580
Yes, yes.
link |
01:12:13.240
Can you tell his story?
link |
01:12:15.080
His story?
link |
01:12:17.180
Daniel?
link |
01:12:18.020
His story is known, he was abducted
link |
01:12:21.200
in Pakistan by an Al Qaeda-driven sect,
link |
01:12:26.200
and under various pretenses.
link |
01:12:32.040
I don't even pay attention to what the pretence was.
link |
01:12:35.200
Originally they wanted to have the United States
link |
01:12:42.720
deliver some promised airplanes.
link |
01:12:47.520
It was all made up, and all these demands were bogus.
link |
01:12:51.520
Bogus, I don't know really, but eventually
link |
01:12:57.000
he was executed in front of a camera.
link |
01:13:03.680
At the core of that is hate and intolerance.
link |
01:13:07.400
At the core, yes, absolutely, yes.
link |
01:13:10.000
We don't really appreciate the depth of the hate
link |
01:13:15.000
at which billions of people are educated.
link |
01:13:26.080
We don't understand it.
link |
01:13:27.560
I just listened recently to what they teach you
link |
01:13:31.360
in Mogadishu.
link |
01:13:32.320
Okay, okay, when the water stopped in the tap,
link |
01:13:45.680
we knew exactly who did it, the Jews.
link |
01:13:49.360
The Jews.
link |
01:13:50.600
We didn't know how, but we knew who did it.
link |
01:13:55.280
We don't appreciate what it means to us.
link |
01:13:58.080
The depth is unbelievable.
link |
01:14:00.440
Do you think all of us are capable of evil?
link |
01:14:06.920
And the education, the indoctrination
link |
01:14:09.840
is really what creates evil.
link |
01:14:10.680
Absolutely we are capable of evil.
link |
01:14:12.240
If you're indoctrinated sufficiently long and in depth,
link |
01:14:18.800
you're capable of ISIS, you're capable of Nazism.
link |
01:14:24.840
Yes, we are, but the question is whether we,
link |
01:14:28.400
after we have gone through some Western education
link |
01:14:32.840
and we learn that everything is really relative.
link |
01:14:35.680
It is not absolute God.
link |
01:14:37.680
It's only a belief in God.
link |
01:14:40.120
Whether we are capable now of being transformed
link |
01:14:43.520
under certain circumstances to become brutal.
link |
01:14:49.920
Yeah.
link |
01:14:51.680
I'm worried about it because some people say yes,
link |
01:14:55.080
given the right circumstances,
link |
01:14:57.360
given a bad economic crisis,
link |
01:15:04.120
you are capable of doing it too.
link |
01:15:06.160
That worries me.
link |
01:15:08.420
I want to believe that I'm not capable.
link |
01:15:12.200
So seven years after Daniel's death,
link |
01:15:14.560
you wrote an article at the Wall Street Journal
link |
01:15:16.800
titled Daniel Pearl and the Normalization of Evil.
link |
01:15:19.760
Yes.
link |
01:15:20.580
What was your message back then
link |
01:15:23.040
and how did it change today over the years?
link |
01:15:27.560
I lost.
link |
01:15:30.600
What was the message?
link |
01:15:31.840
The message was that we are not treating terrorism
link |
01:15:39.960
as a taboo.
link |
01:15:41.840
We are treating it as a bargaining device that is accepted.
link |
01:15:46.840
People have grievance and they go and bomb restaurants.
link |
01:15:54.000
It's normal.
link |
01:15:55.240
Look, you're even not surprised when I tell you that.
link |
01:15:59.600
20 years ago you say, what?
link |
01:16:01.640
For a grievance you go and blow up a restaurant?
link |
01:16:05.000
Today it's becoming normalized.
link |
01:16:07.960
The banalization of evil.
link |
01:16:11.080
And we have created that to ourselves by normalizing,
link |
01:16:16.080
by making it part of political life.
link |
01:16:23.960
It's a political debate.
link |
01:16:27.760
Every terrorist yesterday becomes a freedom fighter today
link |
01:16:34.440
and tomorrow it becomes terrorist again.
link |
01:16:36.600
It's switchable.
link |
01:16:38.600
Right, and so we should call out evil when there's evil.
link |
01:16:42.220
If we don't want to be part of it.
link |
01:16:46.860
Becoming.
link |
01:16:49.020
Yeah, if we want to separate good from evil,
link |
01:16:52.300
that's one of the first things that,
link |
01:16:55.060
what was it, in the Garden of Eden,
link |
01:16:57.540
remember the first thing that God told him was,
link |
01:17:02.900
hey, you want some knowledge, here's a tree of good and evil.
link |
01:17:07.380
Yeah, so this evil touched your life personally.
link |
01:17:12.260
Does your heart have anger, sadness, or is it hope?
link |
01:17:20.300
Look, I see some beautiful people coming from Pakistan.
link |
01:17:26.140
I see beautiful people everywhere.
link |
01:17:29.420
But I see horrible propagation of evil in this country too.
link |
01:17:38.700
It shows you how populist slogans
link |
01:17:42.780
can catch the mind of the best intellectuals.
link |
01:17:48.220
Today is Father's Day.
link |
01:17:50.100
I didn't know that.
link |
01:17:51.060
Yeah, what's a fond memory you have of Daniel?
link |
01:17:58.380
Oh, very good memories, immense.
link |
01:18:02.660
He was my mentor.
link |
01:18:06.100
He had a sense of balance that I didn't have.
link |
01:18:15.340
He saw the beauty in every person.
link |
01:18:19.460
He was not as emotional as I am,
link |
01:18:22.020
more looking at things in perspective.
link |
01:18:26.220
He really liked every person.
link |
01:18:29.340
He really grew up with the idea that a foreigner
link |
01:18:34.740
is a reason for curiosity, not for fear.
link |
01:18:42.780
One time we went to Berkeley,
link |
01:18:45.280
and a homeless man came out from some dark alley,
link |
01:18:49.220
and said, hey, man, can you spare a dime?
link |
01:18:51.180
I retreated back, two feet back,
link |
01:18:54.620
and then he just hugged him and said,
link |
01:18:56.220
here's a dime, enjoy yourself.
link |
01:18:58.420
Maybe you want some money to take a bus or whatever.
link |
01:19:05.340
Where did you get it?
link |
01:19:06.580
Not from me.
link |
01:19:10.460
Do you have advice for young minds today,
link |
01:19:12.400
dreaming about creating as you have dreamt,
link |
01:19:16.200
creating intelligent systems?
link |
01:19:17.880
What is the best way to arrive at new breakthrough ideas
link |
01:19:21.380
and carry them through the fire of criticism
link |
01:19:23.820
and past conventional ideas?
link |
01:19:28.780
Ask your questions freely.
link |
01:19:34.920
Your questions are never dumb.
link |
01:19:37.660
And solve them your own way.
link |
01:19:40.700
And don't take no for an answer.
link |
01:19:42.680
Look, if they are really dumb,
link |
01:19:46.240
you will find out quickly by trial and error
link |
01:19:49.880
to see that they're not leading any place.
link |
01:19:52.420
But follow them and try to understand things your way.
link |
01:19:59.440
That is my advice.
link |
01:20:01.600
I don't know if it's gonna help anyone.
link |
01:20:03.980
Not as brilliantly.
link |
01:20:05.520
There is a lot of inertia in science, in academia.
link |
01:20:15.120
It is slowing down science.
link |
01:20:18.560
Yeah, those two words, your way, that's a powerful thing.
link |
01:20:23.320
It's against inertia, potentially, against the flow.
link |
01:20:27.000
Against your professor.
link |
01:20:28.640
Against your professor.
link |
01:20:30.440
I wrote the Book of Why in order to democratize
link |
01:20:34.840
common sense.
link |
01:20:38.680
In order to instill rebellious spirit in students
link |
01:20:45.460
so they wouldn't wait until the professor gets things right.
link |
01:20:53.720
So you wrote the manifesto of the rebellion
link |
01:20:56.640
against the professor.
link |
01:20:58.360
Against the professor, yes.
link |
01:21:00.360
So looking back at your life of research,
link |
01:21:02.840
what ideas do you hope ripple through the next many decades?
link |
01:21:06.200
What do you hope your legacy will be?
link |
01:21:10.600
I already have a tombstone carved.
link |
01:21:19.880
Oh, boy.
link |
01:21:21.560
The fundamental law of counterfactuals.
link |
01:21:25.840
That's what, it's a simple equation.
link |
01:21:29.800
Counterfactual in terms of a model surgery.
link |
01:21:35.560
That's it, because everything follows from that.
link |
01:21:40.000
If you get that, all the rest, I can die in peace.
link |
01:21:46.600
And my student can derive all my knowledge
link |
01:21:49.640
by mathematical means.
link |
01:21:51.920
The rest follows.
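The simple equation Pearl refers to, as he states it in his books on causality, defines a counterfactual as a computation in a surgically modified model:

```latex
Y_x(u) \;=\; Y_{M_x}(u)
```

Here $M_x$ is the model $M$ after the surgery that removes the equation determining $X$ and fixes $X = x$, and $Y_x(u)$ reads: what $Y$ would have been, in situation $u$, had $X$ been $x$. Everything else, regret, responsibility, attribution, is derived from this one definition.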
link |
01:21:53.520
Yeah.
link |
01:21:55.040
Thank you so much for talking today.
link |
01:21:56.420
I really appreciate it.
link |
01:21:57.260
Thank you for being so attentive and instigating.
link |
01:22:03.920
We did it.
link |
01:22:05.240
We did it.
link |
01:22:06.080
The coffee helped.
link |
01:22:07.520
Thanks for listening to this conversation with Judea Pearl.
link |
01:22:11.280
And thank you to our presenting sponsor, Cash App.
link |
01:22:14.280
Download it, use code LexPodcast, you'll get $10,
link |
01:22:18.320
and $10 will go to FIRST, a STEM education nonprofit
link |
01:22:21.960
that inspires hundreds of thousands of young minds
link |
01:22:24.880
to learn and to dream of engineering our future.
link |
01:22:28.320
If you enjoy this podcast, subscribe on YouTube,
link |
01:22:31.120
give it five stars on Apple Podcast,
link |
01:22:33.040
support on Patreon, or simply connect with me on Twitter.
link |
01:22:36.880
And now, let me leave you with some words of wisdom
link |
01:22:39.320
from Judea Pearl.
link |
01:22:41.080
You cannot answer a question that you cannot ask,
link |
01:22:44.000
and you cannot ask a question that you have no words for.
link |
01:22:47.400
Thank you for listening, and hope to see you next time.