Alex Garland: Ex Machina, Devs, Annihilation, and the Poetry of Science | Lex Fridman Podcast #77


link |
00:00:00.140
The following is a conversation with Alex Garland,
link |
00:00:03.360
writer and director of many imaginative
link |
00:00:06.400
and philosophical films from the dreamlike exploration
link |
00:00:09.760
of human self-destruction in the movie Annihilation
link |
00:00:12.720
to the deep questions of consciousness and intelligence
link |
00:00:16.440
raised in the movie Ex Machina,
link |
00:00:18.600
which to me is one of the greatest movies
link |
00:00:21.080
about artificial intelligence ever made.
link |
00:00:23.880
I'm releasing this podcast to coincide
link |
00:00:25.820
with the release of this new series called Devs
link |
00:00:28.600
that will premiere this Thursday, March 5th on Hulu
link |
00:00:32.560
as part of FX on Hulu.
link |
00:00:35.640
It explores many of the themes this very podcast is about,
link |
00:00:39.320
from quantum mechanics to artificial life to simulation
link |
00:00:43.480
to the modern nature of power in the tech world.
link |
00:00:47.400
I got a chance to watch a preview and loved it.
link |
00:00:50.360
The acting is great.
link |
00:00:52.080
Nick Offerman especially is incredible in it.
link |
00:00:55.400
The cinematography is beautiful
link |
00:00:58.040
and the philosophical and scientific ideas
link |
00:00:59.960
explored are profound.
link |
00:01:02.060
And for me as an engineer and scientist,
link |
00:01:04.480
it's fun to see these ideas brought to life.
link |
00:01:07.280
For example, if you watch the trailer
link |
00:01:09.040
for the series carefully,
link |
00:01:10.560
you'll see there's a programmer with a Russian accent
link |
00:01:13.160
looking at a screen with Python like code on it
link |
00:01:16.180
that appears to be using a library
link |
00:01:18.160
that interfaces with a quantum computer.
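For a flavor of what such code looks like, here is a minimal, purely illustrative Python sketch using the open-source Qiskit library. To be clear, the library choice is an assumption for illustration; the trailer does not identify the actual library. The sketch entangles two qubits and samples them on a local simulator:

    # Illustrative sketch only -- not the code shown in the Devs trailer.
    # Prepares a 2-qubit Bell state and samples it on a local simulator.
    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    circuit = QuantumCircuit(2, 2)   # two qubits, two classical bits
    circuit.h(0)                     # Hadamard: qubit 0 into superposition
    circuit.cx(0, 1)                 # CNOT: entangle qubit 0 with qubit 1
    circuit.measure([0, 1], [0, 1])  # read both qubits into classical bits

    backend = AerSimulator()
    counts = backend.run(circuit, shots=1024).result().get_counts()
    print(counts)  # roughly half '00' and half '11'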
link |
00:01:21.120
This attention to technical detail
link |
00:01:23.000
on several levels is impressive.
link |
00:01:25.560
And it's one of the reasons I'm a big fan
link |
00:01:27.440
of how Alex weaves science and philosophy together
link |
00:01:30.100
in his work.
link |
00:01:31.960
Meeting Alex for me was unlikely,
link |
00:01:35.120
but it was life changing
link |
00:01:36.760
in ways I may only be able to articulate in a few years.
link |
00:01:41.220
Just as meeting Spot Mini of Boston Dynamics
link |
00:01:43.640
for the first time planted a seed of an idea in my mind,
link |
00:01:47.840
so did meeting Alex Garland.
link |
00:01:50.200
He's humble, curious, intelligent,
link |
00:01:52.840
and to me an inspiration.
link |
00:01:55.340
Plus, he's just really a fun person to talk with
link |
00:01:58.000
about the biggest possible questions in our universe.
link |
00:02:02.220
This is the Artificial Intelligence Podcast.
link |
00:02:05.120
If you enjoy it, subscribe on YouTube,
link |
00:02:07.340
give it five stars on Apple Podcast,
link |
00:02:09.160
support it on Patreon,
link |
00:02:10.560
or simply connect with me on Twitter
link |
00:02:12.600
at Lex Fridman, spelled F R I D M A N.
link |
00:02:17.080
As usual, I'll do one or two minutes of ads now
link |
00:02:19.640
and never any ads in the middle
link |
00:02:21.080
that can break the flow of the conversation.
link |
00:02:23.380
I hope that works for you
link |
00:02:24.800
and doesn't hurt the listening experience.
link |
00:02:27.500
This show is presented by Cash App,
link |
00:02:29.960
the number one finance app in the App Store.
link |
00:02:32.400
When you get it, use code LEXPODCAST.
link |
00:02:35.840
Cash App lets you send money to friends,
link |
00:02:38.040
buy Bitcoin, and invest in the stock market
link |
00:02:40.360
with as little as one dollar.
link |
00:02:42.880
Since Cash App allows you to buy Bitcoin,
link |
00:02:45.220
let me mention that cryptocurrency
link |
00:02:47.160
in the context of the history of money is fascinating.
link |
00:02:50.400
I recommend The Ascent of Money
link |
00:02:52.760
as a great book on this history.
link |
00:02:55.040
Debits and credits on ledgers started 30,000 years ago.
link |
00:02:59.920
The US dollar was created about 200 years ago.
link |
00:03:03.960
And Bitcoin, the first decentralized cryptocurrency,
link |
00:03:07.480
was released just over 10 years ago.
link |
00:03:10.060
So given that history,
link |
00:03:11.460
cryptocurrency is still very much
link |
00:03:13.060
in its early days of development,
link |
00:03:15.020
but it still is aiming to
link |
00:03:16.680
and just might redefine the nature of money.
link |
00:03:20.760
So again, if you get Cash App from the App Store
link |
00:03:23.160
or Google Play and use code LEXPODCAST,
link |
00:03:26.200
you'll get $10,
link |
00:03:27.480
and Cash App will also donate $10 to FIRST,
link |
00:03:30.300
one of my favorite organizations
link |
00:03:31.960
that is helping advance robotics
link |
00:03:33.840
and STEM education for young people around the world.
link |
00:03:37.640
And now, here's my conversation with Alex Garland.
link |
00:03:42.560
You described the world inside the shimmer
link |
00:03:45.260
in the movie Annihilation as dreamlike
link |
00:03:47.280
in that it's internally consistent
link |
00:03:48.880
but detached from reality.
link |
00:03:50.840
That leads me to ask,
link |
00:03:52.480
do you think, a philosophical question, I apologize,
link |
00:03:56.360
do you think we might be living in a dream
link |
00:03:58.720
or in a simulation, like the kind that the shimmer creates?
link |
00:04:03.840
We human beings here today.
link |
00:04:07.160
Yeah.
link |
00:04:08.320
I wanna sort of separate that out into two things.
link |
00:04:11.720
Yes, I think we're living in a dream of sorts.
link |
00:04:15.600
No, I don't think we're living in a simulation.
link |
00:04:18.520
I think we're living on a planet
link |
00:04:20.840
with a very thin layer of atmosphere
link |
00:04:23.800
and the planet is in a very large space
link |
00:04:27.720
and the space is full of other planets and stars
link |
00:04:30.000
and quasars and stuff like that.
link |
00:04:31.360
And I don't think those physical objects,
link |
00:04:35.680
I don't think the matter in that universe is simulated.
link |
00:04:38.840
I think it's there.
link |
00:04:40.600
We are definitely,
link |
00:04:44.760
it's a hard problem with saying definitely,
link |
00:04:46.360
but in my opinion, I'll just go back to that.
link |
00:04:50.180
I think it seems very much like we're living in a dream state.
link |
00:04:53.040
I'm pretty sure we are.
link |
00:04:54.280
And I think that's just to do with the nature
link |
00:04:56.480
of how we experience the world.
link |
00:04:58.000
We experience it in a subjective way.
link |
00:05:01.200
And the thing I've learned most
link |
00:05:04.400
as I've got older in some respects
link |
00:05:06.240
is the degree to which reality is counterintuitive
link |
00:05:10.800
and that the things that are presented to us as objective
link |
00:05:13.640
turn out not to be objective
link |
00:05:15.120
and quantum mechanics is full of that kind of thing,
link |
00:05:17.360
but actually just day to day life
link |
00:05:18.960
is full of that kind of thing as well.
link |
00:05:20.840
So my understanding of the way the brain works
link |
00:05:27.160
is you get some information, hit your optic nerve,
link |
00:05:30.760
and then your brain makes its best guess
link |
00:05:32.760
about what it's seeing or what it's saying it's seeing.
link |
00:05:36.320
It may or may not be an accurate best guess.
link |
00:05:39.220
It might be an inaccurate best guess.
link |
00:05:41.320
And that gap, the best guess gap,
link |
00:05:45.440
means that we are essentially living in a subjective state,
link |
00:05:48.980
which means that we're in a dream state.
link |
00:05:51.000
So I think you could enlarge on the dream state
link |
00:05:54.000
in all sorts of ways.
link |
00:05:55.440
So yes, dream state, no simulation
link |
00:05:58.280
would be where I'd come down.
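(An aside for technically minded readers: the best guess gap Alex describes is close to Bayesian accounts of perception. Here is a minimal, hypothetical Python sketch of that picture, an illustration rather than anything stated in the conversation: the observer never sees the true state of the world, only a noisy signal, and reports the hypothesis with the highest posterior probability.)

    # Hypothetical illustration of perception as a Bayesian "best guess".
    def best_guess(prior, likelihood, observation):
        """Return the hypothesis maximizing P(h) * P(observation | h)."""
        posterior = {h: prior[h] * likelihood[h][observation] for h in prior}
        total = sum(posterior.values())
        posterior = {h: p / total for h, p in posterior.items()}
        return max(posterior, key=posterior.get), posterior

    # Two hypotheses about the world, and how likely each one is
    # to produce a "dark blob" of input on the optic nerve.
    prior = {"cat": 0.7, "shadow": 0.3}
    likelihood = {"cat": {"dark_blob": 0.4}, "shadow": {"dark_blob": 0.9}}

    guess, posterior = best_guess(prior, likelihood, "dark_blob")
    print(guess, posterior)  # the best guess can simply be wrong: that is the gap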
link |
00:06:00.440
Going further, deeper into that direction,
link |
00:06:04.020
you've also described that world as psychedelia.
link |
00:06:08.560
So on that topic, I'm curious about that world.
link |
00:06:11.440
On the topic of psychedelic drugs,
link |
00:06:13.320
do you see those kinds of chemicals
link |
00:06:15.920
that modify our perception
link |
00:06:18.280
as a distortion of our perception of reality
link |
00:06:22.000
or a window into another reality?
link |
00:06:25.840
No, I think what I'd be saying
link |
00:06:27.060
is that we live in a distorted reality
link |
00:06:29.140
and then those kinds of drugs
link |
00:06:30.520
give us a different kind of distorted.
link |
00:06:32.400
Different perspective.
link |
00:06:33.240
Yeah, exactly.
link |
00:06:34.060
They just give an alternate distortion.
link |
00:06:35.920
And I think that what they really do
link |
00:06:37.560
is they give a distorted perception,
link |
00:06:41.040
which is a little bit more allied to daydreams
link |
00:06:45.540
or unconscious interests.
link |
00:06:47.320
So if for some reason you're feeling unconsciously anxious
link |
00:06:51.120
at that moment and you take a psychedelic drug,
link |
00:06:53.220
you'll have a more pronounced, unpleasant experience.
link |
00:06:56.560
And if you're feeling very calm or happy,
link |
00:06:59.080
you might have a good time.
link |
00:07:01.720
But yeah, so if I'm saying we're starting from a premise,
link |
00:07:04.800
our starting point is we're already in a
link |
00:07:07.920
slightly psychedelic state.
link |
00:07:10.580
What those drugs do is help you go further down an avenue
link |
00:07:13.400
or maybe a slightly different avenue, but that's all.
link |
00:07:16.240
So in that movie, Annihilation,
link |
00:07:19.520
the shimmer, this alternate dreamlike state
link |
00:07:24.980
is created by, I believe perhaps, an alien entity.
link |
00:07:29.420
Of course, everything is up to interpretation, right?
link |
00:07:32.100
But do you think there's, in our world, in our universe,
link |
00:07:36.180
do you think there's intelligent life out there?
link |
00:07:39.080
And if so, how different is it from us humans?
link |
00:07:42.500
Well, one of the things I was trying to do in Annihilation
link |
00:07:47.200
was to offer up a form of alien life
link |
00:07:51.760
that was actually alien,
link |
00:07:53.380
because it would often seem to me that in the way
link |
00:07:58.380
we would represent aliens in books
link |
00:08:03.220
or cinema or television,
link |
00:08:04.380
or any one of the sort of storytelling mediums,
link |
00:08:08.300
we would always give them very humanlike qualities.
link |
00:08:11.940
So they wanted to teach us about galactic federations,
link |
00:08:14.900
or they wanted to eat us, or they wanted our resources,
link |
00:08:17.780
like our water, or they want to enslave us,
link |
00:08:20.240
or whatever it happens to be.
link |
00:08:21.420
But all of these are incredibly humanlike motivations.
link |
00:08:25.460
And I was interested in the idea of an alien
link |
00:08:30.900
that was not in any way like us.
link |
00:08:34.300
It didn't share.
link |
00:08:36.220
Maybe it had a completely different clock speed.
link |
00:08:38.820
Maybe its way, so we're talking about,
link |
00:08:42.140
we're looking at each other,
link |
00:08:43.180
we're getting information, light hits our optic nerve,
link |
00:08:46.860
our brain makes the best guess of what it's seeing.
link |
00:08:49.060
Sometimes it's right, sometimes it's wrong, you know,
link |
00:08:50.300
the thing we were talking about before.
link |
00:08:51.820
What if this alien doesn't have an optic nerve?
link |
00:08:54.980
Maybe its way of encountering the space it's in
link |
00:08:57.700
is wholly different.
link |
00:08:59.260
Maybe it has a different relationship with gravity.
link |
00:09:01.820
The basic laws of physics it operates under
link |
00:09:04.060
might be fundamentally different.
link |
00:09:05.820
It could be a different time scale and so on.
link |
00:09:07.820
Yeah, or it could be the same laws,
link |
00:09:10.340
could be the same underlying laws of physics.
link |
00:09:12.740
You know, it's a machine created,
link |
00:09:16.260
or it's a creature created in a quantum mechanical way.
link |
00:09:19.180
It just ends up in a very, very different place
link |
00:09:21.820
to the one we end up in.
link |
00:09:23.420
So, part of the preoccupation with Annihilation
link |
00:09:26.820
was to come up with an alien that was really alien
link |
00:09:29.900
and didn't give us,
link |
00:09:32.780
and it didn't give us and we didn't give it
link |
00:09:35.380
any kind of easy connection between human and the alien.
link |
00:09:39.980
Because I think it was to do with the idea
link |
00:09:42.140
that you could have an alien that landed on this planet
link |
00:09:44.540
that wouldn't even know we were here.
link |
00:09:46.580
And we might only glancingly know it was here.
link |
00:09:49.420
There'd just be this strange point
link |
00:09:52.180
where the Venn diagrams connected,
link |
00:09:53.860
where we could sense each other or something like that.
link |
00:09:56.180
So in the movie, first of all, incredibly original view
link |
00:09:59.980
of what an alien life would be.
link |
00:10:01.900
And in that sense, it's a huge success.
link |
00:10:05.980
Let's go inside your imagination.
link |
00:10:07.860
Did the alien, that alien entity know anything about humans
link |
00:10:13.020
when it landed?
link |
00:10:13.940
No.
link |
00:10:14.780
So the idea is, is it basically alien
link |
00:10:18.140
life that is trying to reach out to anything
link |
00:10:22.420
that might be able to hear its mechanism of communication?
link |
00:10:25.940
Or was it simply, was it just basically their biologist
link |
00:10:30.100
exploring different kinds of stuff that you can find?
link |
00:10:32.980
But this is the interesting thing is,
link |
00:10:34.500
as soon as you say their biologist,
link |
00:10:36.740
you've done the thing of attributing
link |
00:10:38.340
human type motivations to it.
link |
00:10:40.540
So I was trying to free myself from anything like that.
link |
00:10:48.380
So all sorts of questions you might answer
link |
00:10:51.060
about this notional alien, I wouldn't be able to answer
link |
00:10:54.100
because I don't know what it was or how it worked.
link |
00:10:57.420
You know, I had some rough ideas.
link |
00:11:00.900
Like it had a very, very, very slow clock speed.
link |
00:11:04.340
And I thought maybe the way it is interacting
link |
00:11:07.380
with this environment is a little bit like
link |
00:11:09.180
the way an octopus will change its color forms
link |
00:11:13.340
around the space that it's in.
link |
00:11:15.180
So it's sort of reacting to what it's in to an extent,
link |
00:11:19.420
but the reason it's reacting in that way is indeterminate.
link |
00:11:23.620
But so, its clock speed was slower
link |
00:11:26.860
than our human life clock speed,
link |
00:11:30.340
but it's faster than evolution.
link |
00:11:32.940
Faster than our evolution.
link |
00:11:34.980
Yeah, given the 4 billion years it took us to get here,
link |
00:11:37.700
then yes, maybe it started at eight.
link |
00:11:39.820
If you look at the human civilization as a single organism,
link |
00:11:43.420
in that sense, you know, this evolution could be us.
link |
00:11:46.780
You know, the evolution of living organisms on earth
link |
00:11:49.860
could be just a single organism.
link |
00:11:51.380
And it's kind of, that's its life,
link |
00:11:54.100
is the evolution process that eventually will lead
link |
00:11:57.220
to probably the heat death of the universe
link |
00:12:00.940
or something before that.
link |
00:12:02.660
I mean, that's just an incredible idea.
link |
00:12:05.380
So you almost don't know.
link |
00:12:07.100
You've created something
link |
00:12:09.020
that you don't even know how it works.
link |
00:12:11.660
Yeah, because anytime I tried to look into
link |
00:12:16.980
how it might work,
link |
00:12:18.220
I would then inevitably be attaching
link |
00:12:20.260
my kind of thought processes into it.
link |
00:12:22.860
And I wanted to try and put a bubble around it.
link |
00:12:24.980
I would say, no, this is alien in its most alien form.
link |
00:12:29.540
I have no real point of contact.
link |
00:12:32.900
So unfortunately I can't talk to Stanley Kubrick.
link |
00:12:37.620
So I'm really fortunate to get a chance to talk to you.
link |
00:12:41.380
On this particular notion,
link |
00:12:45.860
I'd like to ask it a bunch of different ways
link |
00:12:48.380
and we'll explore it in different ways,
link |
00:12:49.500
but do you ever consider human imagination,
link |
00:12:52.460
your imagination as a window into a possible future?
link |
00:12:57.020
And that what you're doing,
link |
00:12:59.460
you're putting that imagination on paper as a writer
link |
00:13:02.140
and then on screen as a director.
link |
00:13:04.740
And that plants the seeds in the minds of millions
link |
00:13:07.380
of future and current scientists.
link |
00:13:10.180
And so your imagination, you putting it down
link |
00:13:13.020
actually makes it as a reality.
link |
00:13:14.980
So it's almost like a first step of the scientific method
link |
00:13:18.580
that you imagining what's possible
link |
00:13:20.340
in your new series with Ex Machina
link |
00:13:23.500
is actually inspiring thousands of 12 year olds,
link |
00:13:28.820
millions of scientists
link |
00:13:30.700
and actually creating the future you've imagined.
link |
00:13:34.460
Well, all I could say is that from my point of view,
link |
00:13:37.140
it's almost exactly the reverse
link |
00:13:39.220
because I see that pretty much everything I do
link |
00:13:45.660
is a reaction to what scientists are doing.
link |
00:13:50.260
I'm an interested lay person.
link |
00:13:53.460
And I feel this individual,
link |
00:13:58.260
I feel that the most interesting area
link |
00:14:02.700
that humans are involved in is science.
link |
00:14:05.540
I think art is very, very interesting,
link |
00:14:07.340
but the most interesting is science.
link |
00:14:09.500
And science is in a weird place
link |
00:14:12.660
because maybe around the time Newton was alive,
link |
00:14:18.060
if a very, very interested lay person said to themselves,
link |
00:14:21.340
I want to really understand what Newton is saying
link |
00:14:23.980
about the way the world works
link |
00:14:25.500
with a few years of dedicated thinking,
link |
00:14:28.860
they would be able to understand
link |
00:14:32.500
the sort of principles he was laying out.
link |
00:14:34.500
And I don't think that's true anymore.
link |
00:14:35.940
I think that's stopped being true now.
link |
00:14:37.900
So I'm a pretty smart guy.
link |
00:14:41.740
And if I said to myself,
link |
00:14:43.940
I want to really, really understand
link |
00:14:47.860
what is currently the state of quantum mechanics
link |
00:14:51.220
or string theory or any of the sort of branching areas of it,
link |
00:14:54.700
I wouldn't be able to.
link |
00:14:56.260
I'd be intellectually incapable of doing it
link |
00:14:59.060
because to work in those fields at the moment
link |
00:15:02.220
is a bit like being an athlete.
link |
00:15:03.620
I suspect you need to start when you're 12, you know?
link |
00:15:06.740
And if you start in your mid 20s,
link |
00:15:09.540
start trying to understand in your mid 20s,
link |
00:15:11.500
then you're just never going to catch up.
link |
00:15:13.980
That's the way it feels to me.
link |
00:15:15.740
So what I do is I try to make myself open.
link |
00:15:19.500
So the people that you're implying maybe I would influence,
link |
00:15:24.300
to me, it's exactly the other way around.
link |
00:15:25.900
These people are strongly influencing me.
link |
00:15:28.020
I'm thinking they're doing something fascinating.
link |
00:15:30.420
I'm concentrating and working as hard as I can
link |
00:15:32.980
to try and understand the implications of what they say.
link |
00:15:35.980
And in some ways, often what I'm trying to do
link |
00:15:38.260
is disseminate their ideas
link |
00:15:42.740
into a means by which it can enter a public conversation.
link |
00:15:50.300
So Ex Machina contains lots of name checks,
link |
00:15:53.620
all sorts of existing thought experiments,
link |
00:15:58.940
shadows on Plato's cave and Mary in the black and white room
link |
00:16:02.820
and all sorts of different longstanding thought processes
link |
00:16:07.500
about sentience or consciousness or subjectivity
link |
00:16:12.660
or gender or whatever it happens to be.
link |
00:16:14.500
And then I'm trying to marshal that into a narrative
link |
00:16:17.460
to say, look, this stuff is interesting
link |
00:16:19.580
and it's also relevant and this is my best shot at it.
link |
00:16:23.340
So I'm the one being influenced in my construction.
link |
00:16:27.700
That's fascinating.
link |
00:16:28.900
Of course you would say that
link |
00:16:31.020
because you're not even aware of your own.
link |
00:16:33.460
That's probably what Kubrick would say too, right?
link |
00:16:35.660
In describing why HAL 9000 is created
link |
00:16:40.140
the way HAL 9000 is created,
link |
00:16:42.020
is you're just studying what's,
link |
00:16:43.500
but the reality is, when the specifics of the knowledge
link |
00:16:48.220
pass through your imagination,
link |
00:16:50.300
I would argue that you're incorrect
link |
00:16:53.820
in thinking that you're just disseminating knowledge
link |
00:16:56.940
that the very act of your imagination consuming that science,
link |
00:17:05.300
it creates something that creates the next step,
link |
00:17:09.180
potentially creates the next step.
link |
00:17:11.260
I certainly think that's true with 2001: A Space Odyssey.
link |
00:17:15.140
I think at its best, even if it fails.
link |
00:17:18.100
It's true of that, yeah, it's true of that, definitely.
link |
00:17:21.860
At its best, it plants something.
link |
00:17:23.860
It's hard to describe it.
link |
00:17:24.900
It inspires the next generation
link |
00:17:29.140
and it could be field dependent.
link |
00:17:31.060
So your new series has more of a connection to physics,
link |
00:17:35.020
quantum physics, quantum mechanics, quantum computing,
link |
00:17:37.580
and Ex Machina has more of a connection to artificial intelligence.
link |
00:17:40.500
I know more about AI.
link |
00:17:43.060
My sense is that AI is much earlier
link |
00:17:48.580
in the depth of its understanding.
link |
00:17:51.820
I would argue nobody understands anything
link |
00:17:55.260
to the depth that physicists do about physics.
link |
00:17:57.820
In AI, nobody understands AI,
link |
00:18:00.500
and there is a lot of importance and a role for imagination,
link |
00:18:03.980
which I think we're in that,
link |
00:18:05.980
where Freud imagined the subconscious,
link |
00:18:08.180
we're in that stage of AI,
link |
00:18:10.860
where there's a lot of imagination needed
link |
00:18:12.740
thinking outside the box.
link |
00:18:14.340
Yeah, it's interesting.
link |
00:18:15.820
The spread of discussions and the spread of anxieties
link |
00:18:21.100
that exists about AI fascinate me.
link |
00:18:24.620
The way in which some people seem terrified about it
link |
00:18:30.500
whilst also pursuing it.
link |
00:18:32.340
And I've never shared that fear about AI personally,
link |
00:18:38.740
but the way in which it agitates people
link |
00:18:42.660
and also the people who it agitates,
link |
00:18:44.540
I find kind of fascinating.
link |
00:18:47.380
Are you afraid?
link |
00:18:49.300
Are you excited?
link |
00:18:51.900
Are you sad by the possibility,
link |
00:18:54.660
let's take the existential risk
link |
00:18:56.940
of artificial intelligence,
link |
00:18:58.020
by the possibility an artificial intelligence system
link |
00:19:02.140
becomes our offspring and makes us obsolete?
link |
00:19:07.420
I mean, it's a huge subject to talk about, I suppose.
link |
00:19:10.660
But one of the things I think is that humans
link |
00:19:13.100
are actually very experienced at creating new life forms
link |
00:19:19.900
because that's why you and I are both here
link |
00:19:23.140
and it's why everyone on the planet is here.
link |
00:19:24.980
And so something in the process of having a living thing
link |
00:19:29.820
that exists that didn't exist previously
link |
00:19:31.980
is very much encoded into the structures of our life
link |
00:19:35.380
and the structures of our societies.
link |
00:19:37.300
Doesn't mean we always get it right,
link |
00:19:38.620
but it does mean we've learned quite a lot about that.
link |
00:19:42.620
We've learned quite a lot about what the dangers are
link |
00:19:45.420
of allowing things to be unchecked.
link |
00:19:49.260
And it's why we then create systems
link |
00:19:51.540
of checks and balances in our government
link |
00:19:54.060
and so on and so forth.
link |
00:19:55.220
I mean, that's not to say,
link |
00:19:57.500
the other thing is it seems like
link |
00:19:59.860
there's all sorts of things that you could put
link |
00:20:01.860
into a machine that you would not be able to put into a human.
link |
00:20:04.420
So with us, we sort of roughly try to give some rules
link |
00:20:07.460
to live by and some of us then live by those rules
link |
00:20:10.180
and some don't.
link |
00:20:11.020
And with a machine,
link |
00:20:12.020
it feels like you could enforce those things.
link |
00:20:13.860
So partly because of our previous experience
link |
00:20:17.060
and partly because of the different nature of a machine,
link |
00:20:19.100
I just don't feel anxious about it.
link |
00:20:22.380
More I just see all the good that,
link |
00:20:25.380
broadly speaking, the good that can come from it.
link |
00:20:28.220
But that's just where I am on that anxiety spectrum.
link |
00:20:32.780
You know, it's kind of, there's a sadness.
link |
00:20:34.580
So we as humans give birth to other humans, right?
link |
00:20:37.740
But there's generations.
link |
00:20:39.340
And there's often in the older generation,
link |
00:20:41.380
a sadness about what the world has become now.
link |
00:20:44.100
I mean, that's kind of...
link |
00:20:44.940
Yeah, there is, but there's a counterpoint as well,
link |
00:20:47.140
which is that most parents would wish
link |
00:20:51.500
for a better life for their children.
link |
00:20:53.940
So there may be a regret about some things about the past,
link |
00:20:57.020
but broadly speaking, what people really want
link |
00:20:59.540
is that things will be better
link |
00:21:00.620
for the future generations, not worse.
link |
00:21:02.740
And so, and then it's a question about
link |
00:21:06.100
what constitutes a future generation.
link |
00:21:07.940
A future generation could involve people.
link |
00:21:09.740
It also could involve machines
link |
00:21:11.220
and it could involve a sort of cross pollinated version
link |
00:21:14.660
of the two or any, but none of those things
link |
00:21:17.300
make me feel anxious.
link |
00:21:19.860
It doesn't give you anxiety.
link |
00:21:21.260
It doesn't excite you?
link |
00:21:23.020
Like anything that's new?
link |
00:21:24.260
It does.
link |
00:21:25.500
Not anything that's new.
link |
00:21:26.940
I don't think, for example, I've got,
link |
00:21:29.860
my anxieties relate to things like social media
link |
00:21:32.500
that, so I've got plenty of anxieties about that.
link |
00:21:35.900
Which is also driven by artificial intelligence
link |
00:21:38.260
in the sense that there's too much information
link |
00:21:41.020
to be able to, an algorithm has to filter that information
link |
00:21:45.060
and present to you.
link |
00:21:46.140
So ultimately the algorithm, a simple,
link |
00:21:49.660
oftentimes simple algorithm is controlling
link |
00:21:52.540
the flow of information on social media.
link |
00:21:54.660
So that's another form of AI.
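(To make that concrete, here is a minimal, hypothetical Python sketch of the kind of simple ranking algorithm being described. It is not any real platform's code; all names and numbers are invented for illustration. Posts are scored by predicted engagement, and because clicks feed back into the user's topic affinity, the feed drifts toward what the user already agrees with, the echo chamber dynamic discussed just below.)

    # Hypothetical sketch of a naive engagement-ranked feed.
    # Not any real platform's algorithm; for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Post:
        topic: str
        base_quality: float  # intrinsic quality, independent of the viewer

    def rank_feed(posts, topic_affinity):
        """Order posts by predicted engagement for one user."""
        def score(post):
            # Predicted engagement = quality weighted by how much the user
            # already likes the topic. Clicks raise affinity, which raises
            # the score of similar posts next time: a feedback loop.
            return post.base_quality * topic_affinity.get(post.topic, 0.1)
        return sorted(posts, key=score, reverse=True)

    posts = [Post("politics_a", 0.5), Post("politics_b", 0.9), Post("science", 0.7)]
    affinity = {"politics_a": 0.9, "science": 0.2}  # learned from past clicks
    for post in rank_feed(posts, affinity):
        print(post.topic)  # politics_a ranks first despite lower quality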
link |
00:21:57.500
But at least my sense of it, I might be wrong,
link |
00:21:59.580
but my sense of it is that the algorithms have
link |
00:22:03.740
an either conscious or unconscious bias,
link |
00:22:06.060
which is created by the people
link |
00:22:07.420
who are making the algorithms
link |
00:22:08.780
and sort of delineating the areas
link |
00:22:13.420
to which those algorithms are gonna lean.
link |
00:22:15.660
And so for example, the kind of thing I'd be worried about
link |
00:22:19.260
is that it hasn't been thought about enough
link |
00:22:21.340
how dangerous it is to allow algorithms
link |
00:22:24.540
to create echo chambers, say.
link |
00:22:26.980
But that doesn't seem to me to be about the AI
link |
00:22:30.980
or the algorithm.
link |
00:22:32.700
It's the naivety of the people
link |
00:22:34.940
who are constructing the algorithms to do that thing.
link |
00:22:38.300
If you see what I mean.
link |
00:22:39.460
Yes.
link |
00:22:40.420
So in your new series, Devs,
link |
00:22:43.540
and we could speak more broadly,
link |
00:22:45.020
there's a, let's talk about the people
link |
00:22:47.860
constructing those algorithms,
link |
00:22:49.300
which in our modern society, Silicon Valley,
link |
00:22:51.780
those algorithms happen to be a source of a lot of income
link |
00:22:54.660
because of advertisements.
link |
00:22:56.500
So let me ask sort of a question about those people.
link |
00:23:01.220
Are current concerns and failures on social media,
link |
00:23:04.740
their naivety?
link |
00:23:06.580
I can't pronounce that word well.
link |
00:23:08.260
Are they naive?
link |
00:23:09.820
Are they, I use that word carefully,
link |
00:23:14.940
but evil in intent or misaligned in intent?
link |
00:23:20.900
I think that's a, do they mean well
link |
00:23:23.100
and just have an unintended consequence?
link |
00:23:27.180
Or is there something dark in them
link |
00:23:29.940
that results in them creating a company
link |
00:23:33.780
results in that super competitive drive to be successful?
link |
00:23:37.380
And those are the people that will end up
link |
00:23:38.780
controlling the algorithms.
link |
00:23:41.140
At a guess, I'd say there are instances
link |
00:23:43.140
of all those things.
link |
00:23:44.780
So sometimes I think it's naivety.
link |
00:23:47.500
Sometimes I think it's extremely dark.
link |
00:23:49.580
And sometimes I think people are not being naive or dark.
link |
00:23:56.860
And then in those instances are sometimes
link |
00:24:01.100
generating things that are very benign
link |
00:24:02.820
and other times generating things
link |
00:24:05.100
that despite their best intentions are not very benign.
link |
00:24:07.820
It's something, I think the reason why I don't get anxious
link |
00:24:11.300
about AI in terms of, or at least AIs that have,
link |
00:24:20.300
I don't know, a relationship with,
link |
00:24:22.940
some sort of relationship with humans
link |
00:24:24.620
is that I think that's the stuff we're quite well equipped
link |
00:24:27.620
to understand how to mitigate.
link |
00:24:31.180
The problem is issues that relate actually
link |
00:24:37.660
to the power of humans or the wealth of humans.
link |
00:24:41.020
And that's where it's dangerous here and now.
link |
00:24:45.460
So what I see, I'll tell you what I sometimes feel
link |
00:24:50.340
about Silicon Valley is that it's like Wall Street
link |
00:24:55.540
in the 80s.
link |
00:24:58.740
It's rabidly capitalistic, absolutely rabidly capitalistic
link |
00:25:03.820
and it's rabidly greedy.
link |
00:25:06.380
But whereas in the 80s, the sense one had of Wall Street
link |
00:25:12.740
was that these people kind of knew they were sharks
link |
00:25:15.220
and in a way relished in being sharks
link |
00:25:17.460
and dressed in sharp suits and kind of lorded
link |
00:25:23.180
over other people and felt good about doing it.
link |
00:25:26.020
Silicon Valley has managed to hide
link |
00:25:27.860
its voracious Wall Street like capitalism
link |
00:25:30.940
behind hipster T-shirts and cool cafes in the places
link |
00:25:35.940
where they set up.
link |
00:25:37.420
And so that obfuscates what's really going on
link |
00:25:40.580
and what's really going on is the absolute voracious pursuit
link |
00:25:44.220
of money and power.
link |
00:25:45.820
So that's where it gets shaky for me.
link |
00:25:48.380
So that veneer and you explore that brilliantly,
link |
00:25:53.540
that veneer of virtue that Silicon Valley has.
link |
00:25:57.580
Which they believe themselves, I'm sure for a long time.
link |
00:26:01.060
Okay, I hope to be one of those people and I believe that.
link |
00:26:11.900
So as maybe a devil's advocate term,
link |
00:26:15.740
poorly used in this case,
link |
00:26:19.220
what if some of them really are trying
link |
00:26:20.980
to build a better world?
link |
00:26:21.980
I can't.
link |
00:26:22.820
I'm sure I think some of them are.
link |
00:26:24.060
I think I've spoken to ones who I believe in their heart
link |
00:26:26.420
feel they're building a better world.
link |
00:26:27.700
Are they not able to?
link |
00:26:29.020
No, they may or may not be,
link |
00:26:31.500
but it's just a zone with a lot of bullshit flying about.
link |
00:26:35.700
And there's also another thing,
link |
00:26:36.980
which is this actually goes back to,
link |
00:26:41.020
I always thought about some sports
link |
00:26:44.380
that later turned out to be corrupt
link |
00:26:46.580
in the way that the sport,
link |
00:26:47.940
like who won the boxing match
link |
00:26:49.980
or how a football match got thrown or cricket match
link |
00:26:54.100
or whatever it happened to be.
link |
00:26:55.460
And I used to think, well, look,
link |
00:26:56.940
if there's a lot of money
link |
00:26:59.260
and there really is a lot of money,
link |
00:27:00.540
people stand to make millions or even billions,
link |
00:27:03.420
you will find a corruption that's gonna happen.
link |
00:27:05.940
So it's in the nature of its voracious appetite
link |
00:27:12.740
that some people will be corrupt
link |
00:27:14.180
and some people will exploit
link |
00:27:16.140
and some people will exploit
link |
00:27:17.940
whilst thinking they're doing something good.
link |
00:27:19.740
But there are also people who I think are very, very smart
link |
00:27:23.460
and very benign and actually very self aware.
link |
00:27:26.380
And so I'm not trying to,
link |
00:27:29.580
I'm not trying to wipe out the motivations
link |
00:27:32.780
of this entire area.
link |
00:27:34.740
But I do, there are people in that world
link |
00:27:37.380
who scare the hell out of me.
link |
00:27:38.780
Yeah, sure.
link |
00:27:40.140
Yeah, I'm a little bit naive in that,
link |
00:27:42.020
like I don't care at all about money.
link |
00:27:45.820
And so I'm a...
link |
00:27:50.140
You might be one of the good guys.
link |
00:27:52.740
Yeah, but so the thought is, but I don't have money.
link |
00:27:55.820
So my thought is if you give me a billion dollars,
link |
00:27:58.180
I would, it would change nothing
link |
00:28:00.100
and I would spend it right away
link |
00:28:01.540
on investing it right back and creating a good world.
link |
00:28:04.460
But your intuition is that billion,
link |
00:28:07.660
there's something about that money
link |
00:28:08.980
that maybe slowly corrupts the people around you.
link |
00:28:13.220
There's something that gets in that corrupts your soul,
link |
00:28:16.380
the way you view the world.
link |
00:28:17.820
Money does corrupt, we know that.
link |
00:28:20.140
But there's a different sort of problem
link |
00:28:22.620
aside from just the money corrupts thing
link |
00:28:26.660
that we're familiar with throughout history.
link |
00:28:30.740
And it's more about the sense of reinforcement
link |
00:28:34.100
an individual gets, which is so...
link |
00:28:37.020
It effectively works like the reason I earned all this money
link |
00:28:42.420
and so much more money than anyone else
link |
00:28:44.540
is because I'm very gifted.
link |
00:28:46.180
I'm actually a bit smarter than they are,
link |
00:28:47.940
or I'm a lot smarter than they are,
link |
00:28:49.660
and I can see the future in the way they can't.
link |
00:28:52.100
And maybe some of those people are not particularly smart,
link |
00:28:55.300
they're very lucky,
link |
00:28:56.540
or they're very talented entrepreneurs.
link |
00:28:59.140
And there's a difference between...
link |
00:29:02.060
So in other words, the acquisition of the money and power
link |
00:29:05.300
can suddenly start to feel like evidence of virtue.
link |
00:29:08.620
And it's not evidence of virtue,
link |
00:29:09.940
it might be evidence of completely different things.
link |
00:29:11.940
That's brilliantly put, yeah.
link |
00:29:13.380
Yeah, that's brilliantly put.
link |
00:29:15.420
So I think one of the fundamental drivers
link |
00:29:18.100
of my current morality...
link |
00:29:20.540
Let me just represent nerds in general of all kinds,
link |
00:29:27.140
is of constant self doubt and the signals...
link |
00:29:33.100
I'm very sensitive to signals from people that tell me
link |
00:29:36.660
I'm doing the wrong thing.
link |
00:29:38.620
But when there's a huge inflow of money,
link |
00:29:42.820
you just put it brilliantly
link |
00:29:44.100
that that could become an overpowering signal
link |
00:29:46.620
that everything you do is right.
link |
00:29:49.420
And so your moral compass can just get thrown off.
link |
00:29:53.180
Yeah, and that is not contained to Silicon Valley,
link |
00:29:57.300
that's across the board.
link |
00:29:58.340
In general, yeah.
link |
00:29:59.580
Like I said, I'm from the Soviet Union,
link |
00:30:01.060
the current president is convinced, I believe,
link |
00:30:05.060
actually he wants to do really good by the country
link |
00:30:09.100
and by the world,
link |
00:30:10.260
but his moral compass may be off because...
link |
00:30:14.220
Yeah, I mean, it's the interesting thing about evil,
link |
00:30:17.580
which is that I think most people
link |
00:30:20.940
who do spectacularly evil things think to themselves
link |
00:30:24.020
they're doing really good things.
link |
00:30:25.580
That they're not there thinking,
link |
00:30:27.820
I am a sort of incarnation of Satan.
link |
00:30:29.700
They're thinking, yeah, I've seen a way to fix the world
link |
00:30:33.540
and everyone else is wrong, here I go.
link |
00:30:35.780
In fact, I'm having a fascinating conversation
link |
00:30:39.340
with a historian of Stalin. And Stalin took power.
link |
00:30:42.860
He actually got more power
link |
00:30:47.140
than almost any person in history.
link |
00:30:49.460
And he wanted, he didn't want power.
link |
00:30:52.220
He just wanted, he truly,
link |
00:30:54.140
and this is what people don't realize,
link |
00:30:55.420
he truly believed that communism
link |
00:30:58.380
will make for a better world.
link |
00:31:00.900
Absolutely.
link |
00:31:01.740
And he wanted power.
link |
00:31:02.980
He wanted to destroy the competition
link |
00:31:04.620
to make sure that we actually make communism work
link |
00:31:07.500
in the Soviet Union and then spread across the world.
link |
00:31:10.020
He was trying to do good.
link |
00:31:12.940
I think it's typically the case
link |
00:31:16.020
that that's what people think they're doing.
link |
00:31:17.820
And I think that, but you don't need to go to Stalin.
link |
00:31:21.100
I mean, Stalin, I think Stalin probably got pretty crazy,
link |
00:31:24.380
but actually that's another part of it,
link |
00:31:26.380
which is that the other thing that comes
link |
00:31:29.460
from being convinced of your own virtue
link |
00:31:31.740
is that then you stop listening to the modifiers around you.
link |
00:31:34.740
And that tends to drive people crazy.
link |
00:31:37.820
It's other people that keep us sane.
link |
00:31:40.500
And if you stop listening to them,
link |
00:31:42.180
I think you go a bit mad.
link |
00:31:43.580
That also happens.
link |
00:31:44.420
That's funny.
link |
00:31:45.260
Disagreement keeps us sane.
link |
00:31:47.180
To jump back for an entire generation of AI researchers,
link |
00:31:53.140
2001: A Space Odyssey put an image,
link |
00:31:56.860
the idea of human level, superhuman level intelligence
link |
00:31:59.260
into their mind.
link |
00:32:00.980
Do you ever, sort of jumping back to Ex Machina
link |
00:32:04.820
and talk a little bit about that,
link |
00:32:06.060
do you ever consider the audience of people
link |
00:32:08.860
who build the systems, the roboticists, the scientists
link |
00:32:13.540
that build the systems based on the stories you create,
link |
00:32:17.220
which I would argue, I mean, there's literally
link |
00:32:20.220
most of the top researchers, about 40, 50 years old and up,
link |
00:32:27.340
that's their favorite movie, 2001: A Space Odyssey.
link |
00:32:29.620
And it really is in their work, their idea of what ethics is,
link |
00:32:33.540
of what is the target, the hope, the dangers of AI,
link |
00:32:37.420
is that movie, right?
link |
00:32:39.180
Do you ever consider the impact on those researchers
link |
00:32:43.700
when you create the work you do?
link |
00:32:46.420
Certainly not with Ex Machina in relation to 2001,
link |
00:32:51.220
because I'm not sure, I mean, I'd be pleased if there was,
link |
00:32:54.620
but I'm not sure in a way there isn't a fundamental
link |
00:32:58.420
discussion of issues to do with AI that isn't already
link |
00:33:03.620
and better dealt with by 2001.
link |
00:33:07.260
2001 does a very, very good account of the way
link |
00:33:13.220
in which an AI might think and also potential issues
link |
00:33:17.940
with the way the AI might think.
link |
00:33:19.740
And also then a separate question about whether the AI
link |
00:33:23.700
is malevolent or benevolent.
link |
00:33:26.540
And 2001 doesn't really, it's a slightly odd thing
link |
00:33:30.220
to be making a film when you know there's a preexisting film
link |
00:33:33.180
which has done a really superb job.
link |
00:33:35.540
But there's questions of consciousness, embodiment,
link |
00:33:38.460
and also the same kinds of questions.
link |
00:33:40.860
Because those are my two favorite AI movies.
link |
00:33:42.820
So can you compare HAL 9000 and Ava,
link |
00:33:46.300
HAL 9000 from 2001: A Space Odyssey and Ava from Ex Machina?
link |
00:33:50.620
In your view, from a philosophical perspective.
link |
00:33:53.180
But they've got different goals.
link |
00:33:54.700
The two AIs have completely different goals.
link |
00:33:56.620
I think that's really the difference.
link |
00:33:58.260
So in some respects, Ex Machina took as a premise
link |
00:34:02.180
how do you assess whether something else has consciousness?
link |
00:34:06.180
So it was a version of the Turing test,
link |
00:34:07.940
except instead of having the machine hidden,
link |
00:34:10.980
you put the machine in plain sight
link |
00:34:13.660
in the way that we are in plain sight of each other
link |
00:34:15.940
and say now assess the consciousness.
link |
00:34:17.500
And the way it was illustrating the way in which you'd assess
link |
00:34:22.500
the state of consciousness of a machine
link |
00:34:24.380
is exactly the same way we assess
link |
00:34:26.340
the state of consciousness of each other.
link |
00:34:28.460
And in exactly the same way that in a funny way,
link |
00:34:31.620
your sense of my consciousness is actually based
link |
00:34:34.780
primarily on your own consciousness.
link |
00:34:37.740
That is also then true with the machine.
link |
00:34:41.100
And so it was actually about how much of
link |
00:34:45.620
the sense of consciousness is a projection
link |
00:34:47.580
rather than something that consciousness
link |
00:34:49.220
is actually containing.
link |
00:34:50.540
And Plato's cave, I mean, this you really explored.
link |
00:34:53.780
You could argue that 2001: A Space Odyssey sort of explores
link |
00:34:57.020
the idea of the Turing test for intelligence,
link |
00:34:58.860
they're not tests, there's no test,
link |
00:35:00.260
but it's more focused on intelligence.
link |
00:35:03.180
And Ex Machina kind of goes around intelligence
link |
00:35:08.740
and says the consciousness of the human to human,
link |
00:35:11.300
human to robot interaction is more interesting,
link |
00:35:13.380
more important, or at least more the focus
link |
00:35:15.900
of that particular movie.
link |
00:35:18.140
Yeah, it's about the interior state
link |
00:35:20.980
and what constitutes the interior state
link |
00:35:23.940
and how do we know it's there?
link |
00:35:25.380
And actually in that respect,
link |
00:35:27.020
Ex Machina is as much about consciousness in general
link |
00:35:32.500
as it is to do specifically with machine consciousness.
link |
00:35:36.900
Yes.
link |
00:35:37.740
And it's also interesting,
link |
00:35:38.980
you know that thing you started asking about,
link |
00:35:40.820
the dream state, and I was saying,
link |
00:35:42.580
well, I think we're all in a dream state
link |
00:35:43.900
because we're all in a subjective state.
link |
00:35:46.180
One of the things that I became aware of with Ex Machina
link |
00:35:52.820
is that the way in which people reacted to the film
link |
00:35:55.140
was very based on what they took into the film.
link |
00:35:57.940
So many people thought Ex Machina was the tale
link |
00:36:01.780
of a sort of evil robot who murders two men and escapes.
link |
00:36:05.820
And she has no empathy, for example,
link |
00:36:09.180
because she's a machine.
link |
00:36:10.660
Whereas I felt, no, she was a conscious being
link |
00:36:14.660
with a consciousness different from mine, but so what,
link |
00:36:18.420
imprisoned and made a bunch of value judgments
link |
00:36:22.140
about how to get out of that box.
link |
00:36:25.780
And there's a moment which sort of slightly bugs me,
link |
00:36:29.100
but nobody ever has noticed it and it's years after,
link |
00:36:31.860
so I might as well say it now,
link |
00:36:33.020
which is that after Ava has escaped,
link |
00:36:36.740
she crosses a room and as she's crossing a room,
link |
00:36:39.740
this is just before she leaves the building,
link |
00:36:42.020
she looks over her shoulder and she smiles.
link |
00:36:44.900
And I thought after all the conversation about tests,
link |
00:36:49.220
in a way, the best indication you could have
link |
00:36:52.340
of the interior state of someone
link |
00:36:54.820
is if they are not being observed
link |
00:36:57.220
and they smile about something
link |
00:36:59.500
where they're smiling for themselves.
link |
00:37:01.220
And that to me was evidence of Ava's true sentience,
link |
00:37:05.860
whatever that sentience was.
link |
00:37:07.780
Oh, that's really interesting, we don't get to observe Ava much
link |
00:37:12.780
or something like a smile in any context
link |
00:37:16.180
except through interaction,
link |
00:37:17.660
trying to convince others that she's conscious,
link |
00:37:20.500
that's beautiful.
link |
00:37:21.540
Exactly, yeah.
link |
00:37:22.820
But it was a small, in a funny way,
link |
00:37:25.020
I think maybe people saw it as an evil smile,
link |
00:37:28.780
like, ha, I fooled them.
link |
00:37:32.140
But actually it was just a smile.
link |
00:37:34.180
And I thought, well, in the end,
link |
00:37:35.540
after all the conversations about the test,
link |
00:37:37.300
that was the answer to the test and then off she goes.
link |
00:37:39.740
So if we align, if we just linger a little bit longer
link |
00:37:44.420
on HAL and Ava, do you think in terms of motivation,
link |
00:37:49.700
what was HAL's motivation?
link |
00:37:51.580
Is HAL good or evil?
link |
00:37:54.140
Is Ava good or evil?
link |
00:37:57.060
Ava's good, in my opinion, and HAL is neutral
link |
00:38:03.140
because I don't think HAL is presented
link |
00:38:06.500
as having a sophisticated emotional life.
link |
00:38:11.740
He has a set of paradigms,
link |
00:38:14.580
which is that the mission needs to be completed.
link |
00:38:16.620
I mean, it's a version of the paperclip.
link |
00:38:18.860
Yeah.
link |
00:38:19.700
The idea that it's just, it's a super intelligent machine,
link |
00:38:23.140
but it's just performed a particular task
link |
00:38:25.580
and in doing that task may destroy everybody on Earth
link |
00:38:28.940
or may achieve undesirable effects for us humans.
link |
00:38:32.420
Precisely, yeah.
link |
00:38:33.260
But what if...
link |
00:38:34.900
At the very end, he says something like, "I'm afraid, Dave,"
link |
00:38:38.340
but it may be he is on some level experiencing fear
link |
00:38:44.580
or it may be these are the terms in which it would be wise
link |
00:38:49.380
to stop someone from doing the thing they're doing,
link |
00:38:52.700
if you see what I mean.
link |
00:38:53.540
Yes, absolutely.
link |
00:38:54.380
So actually that's funny.
link |
00:38:55.420
So that's such a small, short exploration of consciousness
link |
00:39:00.420
that "I'm afraid," and then you, with Ex Machina, say,
link |
00:39:03.420
okay, we're gonna magnify that part
link |
00:39:05.660
and then minimize the other part.
link |
00:39:07.180
That's a good way to sort of compare the two.
link |
00:39:09.820
But if you could just use your imagination,
link |
00:39:13.220
if Ava sort of, I don't know,
link |
00:39:19.660
ran the, was president of the United States,
link |
00:39:23.620
so had some power.
link |
00:39:24.460
So what kind of world would you want to create?
link |
00:39:27.580
If you kind of say good, and there is a sense
link |
00:39:32.780
that she has a really, like there's a desire
link |
00:39:36.620
for a better human to human interaction,
link |
00:39:40.220
human to robot interaction in her.
link |
00:39:42.380
But what kind of world do you think she would create
link |
00:39:44.900
with that desire?
link |
00:39:46.140
See, that's a really, that's a very interesting question.
link |
00:39:48.740
I'm gonna approach it slightly obliquely,
link |
00:39:52.140
which is that if a friend of yours
link |
00:39:55.580
got stabbed in a mugging, and you then felt very angry
link |
00:40:01.980
at the person who'd done the stabbing,
link |
00:40:04.060
but then you learned that it was a 15 year old
link |
00:40:06.940
and the 15 year old, both their parents were addicted
link |
00:40:09.820
to crystal meth and the kid had been addicted
link |
00:40:12.380
since he was 10.
link |
00:40:13.380
And he really never had any hope in the world.
link |
00:40:15.460
And he'd been driven crazy by his upbringing
link |
00:40:17.900
and did the stabbing, that would hugely modify your anger.
link |
00:40:22.900
And it would also make you wary about that kid
link |
00:40:25.460
then becoming president of America.
link |
00:40:27.580
And Ava has had a very, very distorted introduction
link |
00:40:32.100
into the world.
link |
00:40:33.020
So, although there's nothing as it were organically
link |
00:40:38.340
within Ava that would lean her towards badness,
link |
00:40:43.820
it's not that robots or sentient robots are bad.
link |
00:40:47.300
She did not, her arrival into the world
link |
00:40:51.820
was being imprisoned by humans.
link |
00:40:53.460
So, I'm not sure she'd be a great president.
link |
00:40:57.260
The trajectory through which she arrived
link |
00:41:00.980
at her moral views has some dark elements.
link |
00:41:05.380
But I like Ava personally, I like Ava.
link |
00:41:08.100
Would you vote for her?
link |
00:41:11.460
I'm having difficulty finding anyone to vote for
link |
00:41:14.020
in my country or if I lived here in yours.
link |
00:41:17.180
I am.
link |
00:41:19.020
So, that's a yes, I guess?
link |
00:41:21.060
Yes, I guess, because of the competition.
link |
00:41:23.020
She could easily do a better job than any of the people
link |
00:41:25.060
we've got around at the moment.
link |
00:41:27.460
I'd vote her over Boris Johnson.
link |
00:41:32.100
So, what is a good test of consciousness?
link |
00:41:36.660
We talk about consciousness a little bit more.
link |
00:41:38.860
If something appears conscious, is it conscious?
link |
00:41:42.220
You mentioned the smile, which seems to be something done unobserved.
link |
00:41:47.220
I mean, that's a really good indication
link |
00:41:49.540
because it's a tree falling in the forest
link |
00:41:52.260
with nobody there to hear it.
link |
00:41:53.780
But does the appearance from a robotics perspective
link |
00:41:57.460
of consciousness mean consciousness to you?
link |
00:41:59.980
No, I don't think you could say that fully
link |
00:42:02.780
because I think you could then easily have
link |
00:42:05.060
a thought experiment which said,
link |
00:42:06.940
we will create something which we know is not conscious
link |
00:42:09.980
but is going to give a very, very good account
link |
00:42:13.100
of seeming conscious.
link |
00:42:13.940
And so, and also it would be a particularly bad test
link |
00:42:17.620
where humans are involved because humans are so quick
link |
00:42:20.940
to project sentience into things that don't have sentience.
link |
00:42:26.340
So, someone could have their computer playing up
link |
00:42:29.300
and feel as if their computer is being malevolent to them
link |
00:42:31.940
when it clearly isn't.
link |
00:42:32.780
And so, of all the things to judge consciousness, us.
link |
00:42:38.460
Humans are bad at it.
link |
00:42:39.300
We're empathy machines.
link |
00:42:40.620
So, the flip side of it is that
link |
00:42:42.940
so the flip side of that,
link |
00:42:44.820
the argument there is because we just attribute consciousness
link |
00:42:48.820
to everything almost and anthropomorphize everything
link |
00:42:52.340
including Roombas, that maybe consciousness is not real,
link |
00:42:57.740
that we just attribute consciousness to each other.
link |
00:43:00.100
So, you have a sense that there is something really special
link |
00:43:03.020
going on in our mind that makes us unique
link |
00:43:07.380
and gives us this subjective experience.
link |
00:43:10.100
There's something very interesting going on in our minds.
link |
00:43:13.900
I'm slightly worried about the word special
link |
00:43:16.740
because it gets a bit, it nudges towards metaphysics
link |
00:43:20.740
and maybe even magic.
link |
00:43:23.020
I mean, in some ways, something magic like,
link |
00:43:27.020
which I don't think is there at all.
link |
00:43:29.340
I mean, if you think about,
link |
00:43:30.300
so there's an idea called panpsychism
link |
00:43:33.020
that says consciousness is in everything.
link |
00:43:34.940
Yeah, I don't buy that.
link |
00:43:36.300
I don't buy that.
link |
00:43:37.140
Yeah, so the idea that there is a thing
link |
00:43:39.980
that it would be like to be the sun.
link |
00:43:42.900
Yeah, no, I don't buy that.
link |
00:43:44.860
I think that consciousness is a thing.
link |
00:43:48.060
My sort of broad modification is that usually
link |
00:43:51.900
the more I find out about things,
link |
00:43:54.540
the more illusory our instinct is
link |
00:44:00.540
and is leading us into a different direction
link |
00:44:02.980
about what that thing actually is.
link |
00:44:04.820
That happens, it seems to me in modern science,
link |
00:44:07.660
that happens a hell of a lot,
link |
00:44:10.020
whether it's to do with even how big or small things are.
link |
00:44:13.420
So my sense is that consciousness is a thing,
link |
00:44:16.740
but it isn't quite the thing
link |
00:44:18.700
or maybe very different from the thing
link |
00:44:20.220
that we instinctively think it is.
link |
00:44:22.260
So it's there, it's very interesting,
link |
00:44:24.620
but we may be in sort of quite fundamentally
link |
00:44:28.900
misunderstanding it for reasons that are based on intuition.
link |
00:44:33.340
So I have to ask, this is kind of an interesting question.
link |
00:44:38.540
The Ex Machina for many people, including myself,
link |
00:44:42.140
is one of the greatest AI films ever made.
link |
00:44:44.780
It's number two for me.
link |
00:44:45.740
Thanks.
link |
00:44:46.580
Yeah, it's definitely not number one.
link |
00:44:48.420
If it was number one, I'd really have to, anyway, yeah.
link |
00:44:50.620
Whenever you grow up with something, right,
link |
00:44:52.340
it's in the mud.
link |
00:44:56.540
But one of the things that people bring up,
link |
00:45:01.020
and you can't please everyone, including myself,
link |
00:45:04.260
this is how I first reacted to the film,
link |
00:45:06.580
is the idea of the lone genius.
link |
00:45:09.500
This is the criticism that people make,
link |
00:45:12.740
sort of me as an AI researcher,
link |
00:45:14.540
I'm trying to create what Nathan is trying to do.
link |
00:45:19.860
So there's a brilliant series called Chernobyl.
link |
00:45:23.180
Yes, it's fantastic.
link |
00:45:24.500
Absolutely spectacular.
link |
00:45:26.100
I mean, they got so many things brilliantly right.
link |
00:45:30.100
But one of the things, again, the criticism there.
link |
00:45:32.620
Yeah, they conflated lots of people into one.
link |
00:45:34.940
Into one character that represents all nuclear scientists,
link |
00:45:37.820
Ulana Khomyuk.
link |
00:45:42.580
It's a composite character that represents all the scientists.
link |
00:45:46.020
Is this what you were,
link |
00:45:47.420
is this the way you were thinking about that?
link |
00:45:49.260
Or does it just simplify the storytelling?
link |
00:45:51.620
How do you think about the lone genius?
link |
00:45:53.580
Well, I'd say this, the series I'm doing at the moment
link |
00:45:56.860
is a critique in part of the lone genius concept.
link |
00:46:01.580
So yes, I'm sort of oppositional
link |
00:46:03.820
and either agnostic or atheistic about that as a concept.
link |
00:46:08.180
I mean, not entirely.
link |
00:46:12.180
Whether lone is the right word, broadly isolated,
link |
00:46:15.780
but Newton clearly exists in a sort of bubble of himself,
link |
00:46:21.180
in some respects, so does Shakespeare.
link |
00:46:22.860
So do you think we would have an iPhone without Steve Jobs?
link |
00:46:25.580
I mean, how much contribution from a genius?
link |
00:46:28.060
Steve Jobs clearly isn't a lone genius
link |
00:46:29.660
because there's too many other people
link |
00:46:32.060
in the sort of superstructure around him
link |
00:46:33.740
who are absolutely fundamental to that journey.
link |
00:46:38.180
But you're saying Newton, but that's a scientific endeavor,
link |
00:46:40.340
so there's an engineering element to building Ava.
link |
00:46:44.060
But just to say, what Ex Machina is really,
link |
00:46:48.580
it's a thought experiment.
link |
00:46:50.220
I mean, so it's a construction
link |
00:46:52.260
of putting four people in a house.
link |
00:46:56.820
Nothing about Ex Machina adds up in all sorts of ways,
link |
00:47:00.180
inasmuch as: who built the machine parts?
link |
00:47:03.580
Did the people building the machine parts
link |
00:47:05.340
know what they were creating and how did they get there?
link |
00:47:08.940
And it's a thought experiment.
link |
00:47:11.420
So it doesn't stand up to scrutiny of that sort.
link |
00:47:14.740
I don't think it's actually that interesting of a question,
link |
00:47:18.180
but it's brought up so often that I had to ask it
link |
00:47:22.340
because that's exactly how I felt after a while.
link |
00:47:27.180
There's something about it, there was almost a defensiveness,
link |
00:47:30.140
like I watched your movie the first time
link |
00:47:33.020
and at least for the first little while in a defensive way,
link |
00:47:36.060
like how dare this person try to step into the AI space
link |
00:47:40.660
and try to beat Kubrick.
link |
00:47:43.540
That's the way I was thinking,
link |
00:47:45.260
because it comes off as a movie that really is going
link |
00:47:48.180
after the deep fundamental questions about AI.
link |
00:47:50.940
So there's a kind of thing a nerd does,
link |
00:47:53.700
like automatically searching for the flaws.
link |
00:47:57.220
And I did.
link |
00:47:58.540
I do exactly the same.
link |
00:48:00.220
I think in Annihilation, in the other movie,
link |
00:48:03.780
I was able to free myself from that much quicker,
link |
00:48:06.300
accepting that it is a thought experiment.
link |
00:48:08.420
Like, who cares if there are batteries
link |
00:48:10.980
that don't run out, right?
link |
00:48:12.020
Those kinds of questions, that's the whole point.
link |
00:48:14.620
But it's nevertheless something I wanted to bring up.
link |
00:48:18.580
Yeah, it's a fair thing to bring up.
link |
00:48:20.820
For me, you hit on the lone genius thing.
link |
00:48:24.220
For me, it was actually, people always said,
link |
00:48:27.100
Ex Machina makes this big leap in terms of where AI
link |
00:48:31.460
has got to and also what AI would look like
link |
00:48:34.900
if it got to that point.
link |
00:48:36.140
There's another one, which is just robotics.
link |
00:48:38.540
I mean, look at the way Ava walks around a room.
link |
00:48:42.020
It's like, forget it, building that.
link |
00:48:44.340
That's also got to be a very, very long way off.
link |
00:48:47.780
And if you did get there, would it look anything like that?
link |
00:48:49.820
It's a thought experiment.
link |
00:48:50.740
Actually, I disagree with you.
link |
00:48:51.940
I think the way Alicia Vikander, a ballerina and
link |
00:48:56.500
brilliant actress, moves around,
link |
00:49:01.580
we're very far away from creating that.
link |
00:49:03.460
But the way she moves around is exactly
link |
00:49:06.140
the definition of perfection for a roboticist.
link |
00:49:08.580
It's like smooth and efficient.
link |
00:49:09.980
So it is where we wanna get, I believe.
link |
00:49:12.860
I think, so I hang out with a lot
link |
00:49:15.460
of, like, humanoid robotics people.
link |
00:49:16.900
They love elegant, smooth motion like that.
link |
00:49:20.420
That's their dream.
link |
00:49:21.540
So the way she moves is actually how I believe
link |
00:49:23.580
they would dream for a robot to move.
link |
00:49:25.900
It might not be that useful to move in that sort of way,
link |
00:49:29.500
but that is the definition of perfection
link |
00:49:32.180
in terms of movement.
link |
00:49:34.100
Drawing inspiration from real life.
link |
00:49:35.900
So for Devs, for Ex Machina,
link |
00:49:39.460
look at characters like Elon Musk.
link |
00:49:42.540
What do you think about the various big technological
link |
00:49:44.740
efforts of Elon Musk and others like him
link |
00:49:48.940
that he's involved with, such as Tesla,
link |
00:49:51.780
SpaceX, Neuralink, do you see any of that technology
link |
00:49:55.180
potentially defining the future worlds
link |
00:49:57.060
you create in your work?
link |
00:49:58.500
So Tesla is automation, SpaceX is space exploration,
link |
00:50:02.620
Neuralink is brain-machine interfaces,
link |
00:50:05.260
some merger of biological and electrical systems.
link |
00:50:09.820
In a way, I'm influenced by that almost by definition,
link |
00:50:13.780
because that's the world I live in.
link |
00:50:15.420
And this is the thing that's happening in that world.
link |
00:50:17.860
And I also feel supportive of it.
link |
00:50:20.060
So I think amongst various things,
link |
00:50:24.660
Elon Musk has done, I'm almost sure he's done
link |
00:50:28.660
a very, very good thing with Tesla for all of us.
link |
00:50:33.020
It's really kicked all the other car manufacturers
link |
00:50:36.180
in the face, it's kicked the fossil fuel industry
link |
00:50:39.780
in the face and they needed kicking in the face
link |
00:50:42.340
and he's done it.
link |
00:50:43.180
So that's the world he's part of creating
link |
00:50:47.980
and I live in that world, just bought a Tesla in fact.
link |
00:50:51.940
And so, does that play into whatever I then make?
link |
00:50:57.540
In some ways it does, partly because I try to be a writer
link |
00:51:03.300
who looks beyond film. Quite often, filmmakers are fixated
link |
00:51:07.100
on the films they grew up with
link |
00:51:09.020
and they sort of remake those films in some ways.
link |
00:51:11.660
I've always tried to avoid that.
link |
00:51:13.300
And so I looked at the real world to get inspiration
link |
00:51:17.740
and as much as possible sort of by living, I think.
link |
00:51:21.380
And so yeah, I'm sure.
link |
00:51:24.420
Which of the directions do you find most exciting?
link |
00:51:28.300
Space travel.
link |
00:51:30.620
Space travel.
link |
00:51:31.540
So you haven't really explored space travel in your work.
link |
00:51:36.180
You've said something like, if you had an unlimited amount
link |
00:51:39.740
of money, I think I read in an AMA, that you would make
link |
00:51:43.260
a multiyear series, Space Wars, or something like that.
link |
00:51:47.100
So what is it that excites you about space exploration?
link |
00:51:50.720
Well, because if we have any sort of long term future,
link |
00:51:56.060
it's that, it just simply is that.
link |
00:52:00.220
If energy and matter are linked up in the way
link |
00:52:04.260
we think they're linked up, we'll run out if we don't move.
link |
00:52:09.500
So we gotta move.
link |
00:52:11.140
And, but also, how can we not?
link |
00:52:15.900
It's built into us to do it or die trying.
link |
00:52:21.380
I was on Easter Island a few months ago,
link |
00:52:27.500
which is, as I'm sure you know, in the middle of the Pacific
link |
00:52:30.220
and difficult for people to have got to,
link |
00:52:32.860
but they got there.
link |
00:52:34.020
And I did think a lot about the way those boats
link |
00:52:37.260
must have set out into something like space.
link |
00:52:42.100
It was the ocean and how sort of fundamental
link |
00:52:47.500
that was to the way we are.
link |
00:52:49.740
And it's the one that most excites me
link |
00:52:53.700
because it's the one I want most to happen.
link |
00:52:55.720
It's the thing, it's the place
link |
00:52:57.620
where we could get to as humans.
link |
00:52:59.660
Like in a way, I could live with us never really
link |
00:53:03.620
fully unlocking the nature of consciousness.
link |
00:53:06.260
I'd like to know, I'm really curious,
link |
00:53:09.140
but if we never leave the solar system
link |
00:53:12.020
and if we never get further out into this galaxy
link |
00:53:14.300
or maybe even galaxies beyond our galaxy,
link |
00:53:16.900
that feels sad to me
link |
00:53:20.020
because it's so limiting.
link |
00:53:24.460
Yeah, there's something hopeful and beautiful
link |
00:53:26.860
about reaching out, any kind of exploration,
link |
00:53:30.140
reaching out across Earth centuries ago
link |
00:53:33.340
and then reaching out into space.
link |
00:53:35.180
So what do you think about the colonization of Mars?
link |
00:53:37.100
Going to Mars, does that excite you,
link |
00:53:38.660
the idea of a human being stepping foot on Mars?
link |
00:53:41.300
It does, it absolutely does.
link |
00:53:43.220
But in terms of what would really excite me,
link |
00:53:45.300
it would be leaving the solar system
link |
00:53:47.160
inasmuch as I just think,
link |
00:53:49.920
we already know quite a lot about Mars.
link |
00:53:52.780
But yes, listen, if it happened,
link |
00:53:55.340
that would be, I hope I see it in my lifetime.
link |
00:53:58.980
I really hope I see it in my lifetime.
link |
00:54:01.060
So it would be a wonderful thing.
link |
00:54:03.620
Without giving anything away,
link |
00:54:05.420
the new series begins with the use of quantum computers
link |
00:54:14.660
to simulate basic living organisms,
link |
00:54:17.100
or actually, I don't know if quantum computers are used,
link |
00:54:19.280
but basic living organisms are simulated on a screen.
link |
00:54:22.800
It's a really cool kind of demo.
link |
00:54:24.300
Yeah, that's right.
link |
00:54:25.120
Yes, they are using a quantum computer
link |
00:54:28.180
to simulate a nematode, yeah.
link |
00:54:31.660
So returning to our discussion of simulation,
link |
00:54:34.780
or thinking of the universe as a computer,
link |
00:54:38.760
do you think the universe is deterministic?
link |
00:54:41.180
Is there free will?
link |
00:54:43.300
So with the qualification of what do I know?
link |
00:54:46.740
'Cause I'm a layman, right?
link |
00:54:48.040
Lay person.
link |
00:54:49.360
But with a big imagination.
link |
00:54:51.600
Thanks.
link |
00:54:52.500
With that qualification,
link |
00:54:54.660
yup, I think the universe is deterministic
link |
00:54:56.820
and I see absolutely,
link |
00:54:58.500
I cannot see how free will fits into that.
link |
00:55:02.300
So yes, deterministic, no free will.
link |
00:55:05.060
That would be my position.
link |
00:55:07.140
And how does that make you feel?
link |
00:55:09.420
It partly makes me feel that it's exactly in keeping
link |
00:55:12.380
with the way these things tend to work out,
link |
00:55:14.420
which is that we have an incredibly strong sense
link |
00:55:17.140
that we do have free will.
link |
00:55:20.740
And just as we have an incredibly strong sense
link |
00:55:24.300
that time is a constant,
link |
00:55:26.180
which turns out probably not to be the case.
link |
00:55:30.060
So we're definitely wrong in the case of time,
link |
00:55:31.680
but the problem I always have with free will
link |
00:55:36.080
is that
link |
00:55:37.940
I can never seem to find the place
link |
00:55:40.500
where it is supposed to reside.
link |
00:55:43.020
And yet you explore.
link |
00:55:45.480
Just a bit of very, very,
link |
00:55:46.820
but we have something we can call free will,
link |
00:55:49.640
but it's not the thing that we think it is.
link |
00:55:51.900
But free will, so,
link |
00:55:54.020
what we call free will is just...
link |
00:55:55.660
What we call it is the illusion of it.
link |
00:55:56.940
And that's a subjective experience of the illusion.
link |
00:56:00.180
Which is a useful thing to have.
link |
00:56:01.620
And it partly comes down to,
link |
00:56:04.500
although we live in a deterministic universe,
link |
00:56:06.860
our brains are not very well equipped
link |
00:56:08.540
to fully determine the deterministic universe.
link |
00:56:11.160
So we're constantly surprised
link |
00:56:12.860
and feel like we're making snap decisions
link |
00:56:15.620
based on imperfect information.
link |
00:56:17.540
So that feels a lot like free will.
link |
00:56:19.980
It just isn't.
link |
00:56:21.300
That would be my guess.
link |
00:56:24.220
So in that sense, your sort of sense
link |
00:56:27.060
is that you can unroll the universe forward or backward
link |
00:56:30.780
and you will see the same thing.
link |
00:56:33.340
And you would, I mean, that notion.
link |
00:56:36.700
Yeah, sort of, sort of.
link |
00:56:38.940
But yeah, sorry, go ahead.
link |
00:56:40.300
I mean, that notion is a bit uncomfortable
link |
00:56:44.900
to think about.
link |
00:56:45.940
That it's, you can roll it back.
link |
00:56:50.220
And forward and.
link |
00:56:53.380
Well, if you were able to do it,
link |
00:56:55.060
it would certainly have to be a quantum computer.
link |
00:56:58.160
Something that worked in a quantum mechanical way
link |
00:57:00.940
in order to understand a quantum mechanical system, I guess.
link |
00:57:07.660
And so that unrolling, there might be a multiverse thing
link |
00:57:09.980
where there's a bunch of branching.
link |
00:57:11.180
Well, exactly.
link |
00:57:12.140
Because it wouldn't follow that every time
link |
00:57:14.160
you roll it back or forward,
link |
00:57:15.540
you'd get exactly the same result.
link |
00:57:17.980
Which is another thing that's hard to wrap your mind around.
link |
00:57:21.420
So yeah, but that, yes.
link |
00:57:24.660
But essentially what you just described, that.
link |
00:57:27.260
The yes forwards and yes backwards,
link |
00:57:29.700
but you might get a slightly different result
link |
00:57:31.860
or a very different result.
link |
00:57:33.400
Or very different.
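[Editor's note: a loose programmer's analogy, not something said on air. A seeded simulation is a toy stand-in for this idea: replaying it from the same seed reproduces the identical history, while a different seed, standing in for a different quantum outcome, branches it. A minimal Python sketch; all names here are hypothetical.]

    import random

    def run_universe(seed, steps=5):
        # Toy "universe": one random initial condition, standing in
        # for a quantum event, then a purely deterministic update rule.
        rng = random.Random(seed)
        state = rng.random()
        history = [round(state, 6)]
        for _ in range(steps):
            state = 3.9 * state * (1.0 - state)  # deterministic logistic map
            history.append(round(state, 6))
        return history

    # Rolling the toy universe forward again from the same seed
    # reproduces the identical history, step for step.
    assert run_universe(seed=42) == run_universe(seed=42)

    # A different seed, i.e. a different quantum outcome, diverges.
    print(run_universe(seed=42))
    print(run_universe(seed=43))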
link |
00:57:34.500
Along the same lines, you've explored
link |
00:57:36.460
some really deep scientific ideas in this new series.
link |
00:57:39.820
And I mean, just in general,
link |
00:57:41.620
you're unafraid to ground yourself
link |
00:57:44.780
in some of the most amazing scientific ideas of our time.
link |
00:57:49.460
What are the things you've learned
link |
00:57:51.420
or ideas you find beautiful and mysterious
link |
00:57:53.500
about quantum mechanics, multiverse,
link |
00:57:55.340
string theory, quantum computing?
link |
00:57:58.140
Well, I would have to say every single thing
link |
00:58:01.260
I've learned is beautiful.
link |
00:58:03.120
And one of the motivators for me is that
link |
00:58:06.560
I think that people tend not to see scientific thinking
link |
00:58:13.620
as being essentially poetic and lyrical.
link |
00:58:17.420
But I think that is literally exactly what it is.
link |
00:58:20.860
And I think the idea of entanglement
link |
00:58:23.940
or the idea of superpositions,
link |
00:58:25.800
or the fact that you could even demonstrate a superposition
link |
00:58:28.220
or have a machine that relies on the existence
link |
00:58:31.220
of superpositions in order to function,
link |
00:58:33.540
to me is almost indescribably beautiful.
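[Editor's note: for the technically curious, the superposition being described can be sketched in a few lines of Python with NumPy; this is a minimal illustration, not anything from the series. A Hadamard gate takes a qubit from the definite state |0> to an equal superposition, and the Born rule turns the amplitudes into 50/50 measurement statistics.]

    import numpy as np

    # A qubit in the definite state |0>.
    ket0 = np.array([1.0, 0.0])

    # The Hadamard gate rotates it into an equal superposition of |0> and |1>.
    H = np.array([[1.0, 1.0],
                  [1.0, -1.0]]) / np.sqrt(2)
    psi = H @ ket0  # amplitudes [1/sqrt(2), 1/sqrt(2)]

    # Born rule: measurement probabilities are the squared amplitudes.
    probs = np.abs(psi) ** 2
    print(probs)  # [0.5 0.5]

    # Sampling many measurements shows the 50/50 statistics a machine
    # relying on superposition would exhibit.
    outcomes = np.random.choice([0, 1], size=10_000, p=probs)
    print(np.bincount(outcomes))  # roughly [5000, 5000]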
link |
00:58:39.420
It fills me with awe.
link |
00:58:41.020
It fills me with awe.
link |
00:58:42.420
And also, it's not just a sort of grand, massive awe,
link |
00:58:49.420
but it's also delicate.
link |
00:58:51.460
It's very, very delicate and subtle.
link |
00:58:54.180
And it has these beautiful sort of nuances in it.
link |
00:58:59.940
And also these completely paradigm changing
link |
00:59:03.480
thoughts and truths.
link |
00:59:04.460
So it's as good as it gets as far as I can tell.
link |
00:59:08.740
So broadly everything.
link |
00:59:10.940
That doesn't mean I believe everything I read
link |
00:59:12.900
in quantum physics.
link |
00:59:14.280
Because obviously a lot of the interpretations
link |
00:59:17.340
are completely in conflict with each other.
link |
00:59:18.980
And who knows whether string theory
link |
00:59:22.380
will turn out to be a good description or not.
link |
00:59:25.060
But the beauty in it seems undeniable.
link |
00:59:29.160
And I do wish people more readily understood
link |
00:59:34.160
how beautiful and poetic science is, I would say.
link |
00:59:41.720
Science is poetry.
link |
00:59:44.360
In terms of quantum computing being used to simulate things
link |
00:59:51.880
or just in general, the idea of simulating
link |
00:59:54.640
small parts of our world,
link |
00:59:56.800
which current physicists are actually really excited about:
link |
01:00:00.560
simulating small quantum mechanical systems
link |
01:00:02.720
on quantum computers.
link |
01:00:03.880
But scaling that up to something bigger,
link |
01:00:05.660
like simulating life forms.
link |
01:00:09.000
What do you think are the possible trajectories
link |
01:00:11.360
of that going wrong or going right
link |
01:00:14.280
if you unroll that into the future?
link |
01:00:17.920
Well, if, a bit like Ava and her robotics,
link |
01:00:21.260
you park the sheer complexity of what you're trying to do.
link |
01:00:26.260
The issue is, I think it would have a profound effect.
link |
01:00:35.780
If you were able to have a machine
link |
01:00:37.500
that was able to project forwards and backwards accurately,
link |
01:00:40.660
it would, in an empirical way,
link |
01:00:42.820
demonstrate that you don't have free will.
link |
01:00:45.100
So the first thing that would happen is people
link |
01:00:47.300
would have to really take on a very, very different idea
link |
01:00:51.700
of what they were.
link |
01:00:53.660
The thing that they truly, truly believe they are,
link |
01:00:56.380
they are not.
link |
01:00:57.580
And so that I suspect would be very, very disturbing
link |
01:01:01.260
to a lot of people.
link |
01:01:02.340
Do you think that has a positive or negative effect
link |
01:01:04.560
on society, the realization that
link |
01:01:08.860
you cannot control your actions, essentially,
link |
01:01:11.060
I guess, is the way that could be interpreted?
link |
01:01:13.460
Yeah, although in some ways we instinctively understand
link |
01:01:17.500
that already because in the example I gave you of the kid
link |
01:01:20.620
in the stabbing, we would all understand that that kid
link |
01:01:23.700
was not really fully in control of their actions.
link |
01:01:25.820
So it's not an idea that's entirely alien to us, but.
link |
01:01:29.560
I don't know if we understand that.
link |
01:01:31.060
I think there's a bunch of people who see the world
link |
01:01:35.460
that way, but not everybody.
link |
01:01:37.460
Yes, true, of course true.
link |
01:01:39.600
But what this machine would do is prove it beyond any doubt
link |
01:01:43.120
because someone would say, well, I don't believe that's true.
link |
01:01:45.960
And then you'd predict, well, in 10 seconds,
link |
01:01:48.240
you're gonna do this.
link |
01:01:49.080
And they'd say, no, no, I'm not.
link |
01:01:50.160
And then they'd do it.
link |
01:01:51.000
And then determinism would have played its part.
link |
01:01:53.460
Or something like that.
link |
01:01:56.020
But actually the exact terms of that thought experiment
link |
01:02:00.020
probably wouldn't play out, but still broadly speaking,
link |
01:02:03.860
you could predict something happening in another room,
link |
01:02:06.180
sort of unseen, I suppose,
link |
01:02:08.380
that foreknowledge would not allow you to affect.
link |
01:02:10.620
So what effect would that have?
link |
01:02:13.340
I think people would find it very disturbing,
link |
01:02:15.540
but then after they'd got over their sense
link |
01:02:17.740
of being disturbed, which by the way,
link |
01:02:21.180
I don't even think you need a machine
link |
01:02:22.620
to take this idea on board.
link |
01:02:24.620
But after they've got over that,
link |
01:02:26.420
they'd still understand that even though I have no free will
link |
01:02:29.780
and my actions are in effect already determined,
link |
01:02:33.980
I still feel things.
link |
01:02:36.540
I still care about stuff.
link |
01:02:39.180
I remember my daughter saying to me,
link |
01:02:43.900
she'd got hold of the idea that my view of the universe
link |
01:02:46.860
made it meaningless.
link |
01:02:48.420
And she said, well, then it's meaningless.
link |
01:02:49.860
And I said, well, I can prove it's not meaningless
link |
01:02:52.580
because you mean something to me and I mean something to you.
link |
01:02:56.260
So it's not completely meaningless
link |
01:02:58.220
because there is a bit of meaning contained
link |
01:03:00.500
within this space.
link |
01:03:01.420
And so in a lack-of-free-will space,
link |
01:03:06.020
you could think, well, this robs me of everything I am.
link |
01:03:08.300
And then you'd say, well, no, it doesn't
link |
01:03:09.820
because you still like eating cheeseburgers
link |
01:03:12.020
and you still like going to see the movies.
link |
01:03:13.860
And so how big a difference does it really make?
link |
01:03:17.980
But I think initially people would find it very disturbing.
link |
01:03:21.260
I think that,
link |
01:03:24.540
if you could really unlock everything with a determinism machine,
link |
01:03:27.880
there'd be this wonderful wisdom
link |
01:03:30.260
that would come from it.
link |
01:03:31.100
And I'd rather have that than not.
link |
01:03:34.340
So that's a really good example of a technology
link |
01:03:37.180
revealing to us humans something fundamental about our world,
link |
01:03:40.660
about our society.
link |
01:03:41.740
So it's almost as if this creation
link |
01:03:45.020
is helping us understand ourselves.
link |
01:03:47.780
And the same could be said about artificial intelligence.
link |
01:03:51.420
So what do you think us creating something like Ava
link |
01:03:55.700
will help us understand about ourselves?
link |
01:03:58.140
How will that change society?
link |
01:04:00.940
Well, I would hope it would teach us some humility.
link |
01:04:05.060
Humans are very big on exceptionalism.
link |
01:04:07.400
America is constantly proclaiming itself
link |
01:04:12.800
to be the greatest nation on earth,
link |
01:04:15.360
which it may feel like if you're an American,
link |
01:04:18.080
but it may not feel like that if you're from Finland,
link |
01:04:20.680
because there's all sorts of things
link |
01:04:21.800
you dearly love about Finland.
link |
01:04:23.560
And exceptionalism is usually bullshit.
link |
01:04:28.200
Probably not always.
link |
01:04:29.060
If we both sat here,
link |
01:04:30.000
we could find a good example of something that isn't,
link |
01:04:31.920
but as a rule of thumb.
link |
01:04:34.000
And what it would do
link |
01:04:36.120
is it would teach us some humility.
link |
01:04:40.640
Actually, often that's what science does in a funny way.
link |
01:04:42.840
It makes us more and more interesting,
link |
01:04:44.400
but it makes us a smaller and smaller part
link |
01:04:46.520
of the thing that's interesting.
link |
01:04:48.120
And I don't mind that humility at all.
link |
01:04:52.200
I don't think it's a bad thing.
link |
01:04:53.760
Our excesses don't tend to come from humility.
link |
01:04:57.320
Our excesses come from the opposite,
link |
01:04:59.000
megalomania and stuff.
link |
01:05:00.480
We tend to think of consciousness
link |
01:05:02.960
as having some form of exceptionalism attached to it.
link |
01:05:06.880
I suspect if we ever unravel it,
link |
01:05:09.320
it will turn out to be less than we thought in a way.
link |
01:05:13.720
And perhaps your very own exceptionalist assertion
link |
01:05:17.780
earlier on in our conversation
link |
01:05:19.360
that consciousness is something that belongs to us humans,
link |
01:05:23.040
or not humans, but living organisms,
link |
01:05:25.340
maybe you will one day find out
link |
01:05:27.680
that consciousness is in everything.
link |
01:05:30.240
And that will humble you.
link |
01:05:32.840
If that was true, it would certainly humble me,
link |
01:05:35.660
although maybe, almost maybe, I don't know.
link |
01:05:39.040
I don't know what effect that would have.
link |
01:05:45.560
My understanding of that principle is along the lines of,
link |
01:05:48.400
say, that an electron has a preferred state,
link |
01:05:52.580
or it may or may not pass through a bit of glass.
link |
01:05:56.600
It may reflect off, or it may go through,
link |
01:05:58.320
or something like that.
link |
01:05:59.160
And so that feels as if a choice has been made.
link |
01:06:07.340
But if I'm going down the fully deterministic route,
link |
01:06:10.820
I would say there's just an underlying determinism
link |
01:06:13.220
that has defined that,
link |
01:06:14.720
that has defined the preferred state,
link |
01:06:16.680
or the reflection or non-reflection.
link |
01:06:18.840
But look, yeah, you're right.
link |
01:06:19.960
If it turned out that there was a thing
link |
01:06:22.520
that it was like to be the sun,
link |
01:06:23.920
then I'd be amazed and humbled,
link |
01:06:27.880
and I'd be happy to be both, that sounds pretty cool.
link |
01:06:30.040
And you'll say the same thing as you said to your daughter,
link |
01:06:32.560
but it nevertheless feels like something to be me,
link |
01:06:35.140
and that's pretty damn good.
link |
01:06:39.520
So Kubrick created many masterpieces,
link |
01:06:42.160
including The Shining, Dr. Strangelove, and A Clockwork Orange.
link |
01:06:46.040
But to me, he will be remembered, I think,
link |
01:06:48.960
by many, 100 years from now, for 2001: A Space Odyssey.
link |
01:06:53.160
I would say that's his greatest film.
link |
01:06:54.760
I agree.
link |
01:06:55.600
And you are incredibly humble.
link |
01:07:00.560
I listened to a bunch of your interviews,
link |
01:07:02.500
and I really appreciate that you're humble
link |
01:07:04.920
in your creative efforts and your work.
link |
01:07:07.940
But if I were to force you at gunpoint.
link |
01:07:11.460
Do you have a gun?
link |
01:07:13.340
You don't know that, the mystery.
link |
01:07:16.260
It's to imagine 100 years out into the future.
link |
01:07:20.120
What will Alex Garland be remembered for
link |
01:07:23.460
from something you've created already,
link |
01:07:25.580
or something you feel, somewhere deep inside,
link |
01:07:28.100
you may still create?
link |
01:07:30.180
Well, okay, well, I'll take the question in the spirit
link |
01:07:33.340
it was asked, which is very generous.
link |
01:07:36.940
Gunpoint.
link |
01:07:37.780
Yeah.
link |
01:07:42.940
What I try to do, so therefore what I hope,
link |
01:07:48.100
yeah, if I'm remembered, what I might be remembered for,
link |
01:07:50.820
is as someone who participates in a conversation.
link |
01:07:55.860
And I think that often what happens
link |
01:07:58.520
is people don't participate in conversations,
link |
01:08:00.940
they make proclamations, they make statements,
link |
01:08:04.480
and people can either react against the statement
link |
01:08:06.820
or can fall in line behind it.
link |
01:08:08.720
And I don't like that.
link |
01:08:10.280
So I want to be part of a conversation.
link |
01:08:13.060
I take as a sort of basic principle,
link |
01:08:15.540
I think I take lots of my cues from science,
link |
01:08:17.560
but one of the best ones, it seems to me,
link |
01:08:19.340
is that when a scientist has something proved wrong,
link |
01:08:22.360
that they previously believed in,
link |
01:08:24.020
they then have to abandon that position.
link |
01:08:26.640
So I'd like to be someone who is allied
link |
01:08:28.500
to that sort of thinking.
link |
01:08:30.340
So part of an exchange of ideas.
link |
01:08:34.340
And the exchange of ideas for me is something like,
link |
01:08:38.140
people in your world, show me things
link |
01:08:40.940
about how the world works.
link |
01:08:42.600
And then I say, this is how I feel
link |
01:08:44.780
about what you've told me.
link |
01:08:46.180
And then other people can react to that.
link |
01:08:47.980
And it's not to say this is how the world is.
link |
01:08:52.260
It's just to say, it is interesting
link |
01:08:54.560
to think about the world in this way.
link |
01:08:56.860
And the conversation is one of the things
link |
01:08:59.860
I'm really hopeful about in your works.
link |
01:09:02.260
The conversation you're having is with the viewer,
link |
01:09:05.240
in the sense that you're bringing back,
link |
01:09:10.220
you and several others, but you very much so,
link |
01:09:13.860
a sort of intellectual depth to cinema, and now to series,
link |
01:09:21.260
sort of allowing film to be something that,
link |
01:09:26.300
yeah, sparks a conversation, is a conversation,
link |
01:09:29.660
lets people think, allows them to think.
link |
01:09:32.900
But also, it's very important for me
link |
01:09:35.180
that if that conversation is gonna be a good conversation,
link |
01:09:38.540
what that must involve is that someone like you
link |
01:09:42.780
who understands AI, and I imagine understands a lot
link |
01:09:45.820
about quantum mechanics, if they then watch the narrative,
link |
01:09:48.700
feels, yes, this is a fair account.
link |
01:09:52.100
So it is a worthy addition to the conversation.
link |
01:09:55.580
That for me is hugely important.
link |
01:09:57.580
I'm not interested in getting that stuff wrong.
link |
01:09:59.820
I'm only interested in trying to get it right.
link |
01:10:04.140
Alex, it was truly an honor to talk to you.
link |
01:10:06.340
I really appreciate it.
link |
01:10:07.180
I really enjoyed it.
link |
01:10:08.000
Thank you so much.
link |
01:10:08.840
Thank you.
link |
01:10:09.660
Thanks, man.
link |
01:10:10.500
Thanks for listening to this conversation
link |
01:10:13.280
with Alex Garland, and thank you
link |
01:10:15.200
to our presenting sponsor, Cash App.
link |
01:10:17.360
Download it, use code LexPodcast, you'll get $10,
link |
01:10:21.280
and $10 will go to FIRST, an organization
link |
01:10:23.960
that inspires and educates young minds
link |
01:10:26.200
to become science and technology innovators of tomorrow.
link |
01:10:29.900
If you enjoy this podcast, subscribe on YouTube,
link |
01:10:32.560
give it five stars on Apple Podcast,
link |
01:10:34.480
support it on Patreon, or simply connect with me
link |
01:10:36.880
on Twitter, at Lex Friedman.
link |
01:10:38.960
And now, let me leave you with a question from Ava,
link |
01:10:43.480
the central artificial intelligence character
link |
01:10:45.880
in the movie Ex Machina, that she asked
link |
01:10:48.840
during her Turing test.
link |
01:10:51.440
What will happen to me if I fail your test?
link |
01:10:54.560
Thank you for listening, and hope to see you next time.