
Alex Garland: Ex Machina, Devs, Annihilation, and the Poetry of Science | Lex Fridman Podcast #77



link |
00:00:00.000
The following is a conversation with Alex Garland, writer and director of many imaginative and
link |
00:00:06.400
philosophical films, from the dreamlike exploration of human self-destruction in the movie Annihilation
link |
00:00:12.560
to the deep questions of consciousness and intelligence raised in the movie Ex Machina,
link |
00:00:18.480
which to me is one of the greatest movies on artificial intelligence ever made.
link |
00:00:23.680
I'm releasing this podcast to coincide with the release of his new series called Devs,
link |
00:00:28.400
that will premiere this Thursday, March 5th on Hulu as part of FX on Hulu.
link |
00:00:35.440
It explores many of the themes this very podcast is about,
link |
00:00:39.120
from quantum mechanics, to artificial life, to simulation, to the modern nature of power in
link |
00:00:45.280
the tech world. I got a chance to watch a preview and loved it. The acting is great, Nick Offerman
link |
00:00:52.880
especially is incredible in it, the cinematography is beautiful, and the philosophical and scientific
link |
00:00:59.360
ideas explored are profound. And for me as an engineer and scientist, were just fun to see
link |
00:01:05.440
brought to life. For example, if you watch the trailer for the series carefully, you'll see
link |
00:01:10.720
there's a programmer with a Russian accent looking at a screen with Python-like code on it that appears
link |
00:01:16.480
to be using a library that interfaces with a quantum computer.
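For the curious, here is a minimal sketch of what that kind of snippet might look like. Qiskit is used purely as a plausible stand-in; the actual library shown in the trailer isn't identified, so every name below is an assumption rather than a claim about the show's code.

# Hypothetical reconstruction, not the code from the trailer.
# Circa-2020 Qiskit: build a tiny circuit and run it on a backend.
from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 0 with qubit 1
qc.measure([0, 1], [0, 1])   # read both qubits out

# A local simulator stands in for real quantum hardware here.
backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)                # roughly half '00' and half '11'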
link |
00:01:22.160
This attention to technical detail on several levels is impressive, and one of the reasons I'm a big fan of how Alex weaves
link |
00:01:28.320
science and philosophy together in his work. Meeting Alex for me was unlikely, but it was
link |
00:01:35.360
life-changing in ways I may only be able to articulate in a few years. Just as meeting
link |
00:01:41.760
Spot Mini of Boston Dynamics for the first time planted a seed of an idea in my mind,
link |
00:01:47.600
so did meeting Alex Garland. He's humble, curious, intelligent, and to me an inspiration.
link |
00:01:55.120
Plus, he's just really a fun person to talk with about the biggest possible questions
link |
00:02:00.080
in our universe. This is the artificial intelligence podcast. If you enjoy it,
link |
00:02:05.760
subscribe on YouTube, give it five stars on Apple Podcasts, support it on Patreon,
link |
00:02:10.320
or simply connect with me on Twitter at Lex Fridman, spelled F R I D M A N. As usual,
link |
00:02:17.680
I'll do one or two minutes of ads now and never any ads in the middle that can break the flow of
link |
00:02:21.840
the conversation. I hope that works for you and doesn't hurt the listening experience.
link |
00:02:27.360
This show is presented by Cash App, the number one finance app in the App Store.
link |
00:02:32.160
When you get it, use code lexpodcast. Cash App lets you send money to friends, buy
link |
00:02:38.080
Bitcoin and invest in the stock market with as little as $1. Since Cash App allows you to buy
link |
00:02:44.160
Bitcoin, let me mention that cryptocurrency in the context of the history of money is fascinating.
link |
00:02:50.240
I recommend Ascent of Money as a great book on this history.
link |
00:02:54.880
Debits and credits on ledgers started 30,000 years ago. The US dollar was created about
link |
00:03:02.480
200 years ago. And Bitcoin, the first decentralized cryptocurrency, was released just over 10 years
link |
00:03:09.040
ago. So given that history, cryptocurrency is still very much in its early days of development,
link |
00:03:14.800
but it still is aiming to and just might redefine the nature of money.
link |
00:03:20.560
So again, if you get Cash App from the App Store, Google Play, and use code lexpodcast,
link |
00:03:26.000
you'll get $10 and Cash App will also donate $10 to FIRST, one of my favorite organizations
link |
00:03:31.760
that is helping advance robotics and STEM education for young people around the world.
link |
00:03:37.440
And now, here's my conversation with Alex Garland. You described the world inside the shimmer in
link |
00:03:45.200
the movie Annihilation as dreamlike, in that it's internally consistent but detached from reality.
link |
00:03:50.640
That leads me to ask, do you think a philosophical question, I apologize, do you think we might
link |
00:03:57.040
be living in a dream or in a simulation like the kind that the shimmer creates? We human beings
link |
00:04:05.600
here today. Yeah. I want to sort of separate that out into two things. Yes, I think we're living in
link |
00:04:13.120
a dream of sorts. No, I don't think we're living in a simulation. I think we're living on a planet
link |
00:04:20.720
with a very thin layer of atmosphere and the planet is in a very large space and the space
link |
00:04:28.160
is full of other planets and stars and quasars and stuff like that. And I don't think, I don't
link |
00:04:32.640
think those physical objects, I don't think the matter in that universe is simulated. I think
link |
00:04:38.960
it's there. We are definitely, well, that's the whole problem with saying definitely, but in my opinion,
link |
00:04:47.280
I'll just go back to that. I think it seems very much like we're living in a dream state,
link |
00:04:52.960
I'm pretty sure we are. And I think that's just to do with the nature of how we experience the
link |
00:04:57.600
world, we experience it in a subjective way. And the thing I've learned most as I've got older in
link |
00:05:05.360
some respects is the degree to which reality is counterintuitive and that the things that are
link |
00:05:11.840
presented to us as objective turn out not to be objective and quantum mechanics is full of that
link |
00:05:16.640
kind of thing, but actually just day to day life is full of that kind of thing as well.
link |
00:05:20.800
So my understanding of the way the brain works is you get some information, it hits your
link |
00:05:29.920
optic nerve and then your brain makes its best guess about what it's seeing or what it's saying
link |
00:05:34.800
it's seeing. It may or may not be an accurate best guess, it might be an inaccurate best guess.
link |
00:05:41.200
And that gap, the best guess gap, means that we are essentially living in a subjective state,
link |
00:05:48.800
which means that we're in a dream state. So I think you could enlarge on the dream state in
link |
00:05:54.000
all sorts of ways, but so yes, dream state, no simulation would be where I'd come down.
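To make the "best guess gap" concrete, here is a toy illustration, mine rather than Garland's: perception treated as Bayesian inference, where the brain combines a prior belief with noisy sensory evidence and reports only the posterior. All the numbers are invented.

# Toy example: the brain's "best guess" as Bayes' rule over two hypotheses.
def posterior_cat(prior_cat, likelihood_cat, likelihood_dog):
    """P(cat | blurry glimpse), from the prior and the evidence likelihoods."""
    p_cat = prior_cat * likelihood_cat
    p_dog = (1.0 - prior_cat) * likelihood_dog
    return p_cat / (p_cat + p_dog)

# The glimpse slightly favors 'dog', but a strong prior wins out.
print(posterior_cat(prior_cat=0.8, likelihood_cat=0.4, likelihood_dog=0.6))
# ~0.73: the best guess is still 'cat', and it may simply be wrong.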
link |
00:06:00.640
Going further deeper into that direction, you've also described that world as a psychedelia.
link |
00:06:07.760
Yeah. So on that topic, I'm curious about that world. On the topic of psychedelic drugs,
link |
00:06:13.280
do you see those kinds of chemicals that modify our perception as a distortion of our perception
link |
00:06:21.360
of reality or a window into another reality? No, I think what I'd be saying is that we live
link |
00:06:27.680
in a distorted reality and then those kinds of drugs give us a different kind of distorted
link |
00:06:32.400
perspective. Yeah, exactly. They just give an alternate distortion. And I think that what
link |
00:06:36.800
they really do is they give a distorted perception, which is a little bit more allied to daydreams
link |
00:06:45.440
or unconscious interests. So if for some reason you're feeling unconsciously anxious at that
link |
00:06:51.280
moment and you take a psychedelic drug, you'll have a more pronounced unpleasant experience.
link |
00:06:56.400
And if you're feeling very calm or happy, you might have a good time. But yeah, so what I'm
link |
00:07:02.880
saying is, our starting premise is that we're already in a slightly
link |
00:07:08.240
psychedelic state. What those drugs do is help you go further down an avenue or maybe a slightly
link |
00:07:14.080
different avenue, but that's what they do. So in that movie Annihilation, the shimmer, this alternate
link |
00:07:23.200
dreamlike state is created by, I believe, perhaps an alien entity. Of course, everything's up to
link |
00:07:30.480
interpretation. But do you think there's, in our world, in our universe, do you think there's
link |
00:07:36.640
intelligent life out there? And if so, how different is it from us humans? Well, one of the things I
link |
00:07:44.800
was trying to do in Annihilation was to offer up a form of alien life that was actually alien.
link |
00:07:53.280
Because it would often seem to me that in the way we would represent aliens in books or cinema or
link |
00:08:03.760
television or any one of the sort of storytelling mediums is we would always give them very human
link |
00:08:10.960
like qualities. So they wanted to teach us about galactic federations or they wanted to eat us or
link |
00:08:16.000
they wanted our resources like our water or they want to enslave us or whatever it happens to be.
link |
00:08:21.200
But all of these are incredibly human-like motivations. And I was interested in the idea
link |
00:08:29.520
of an alien that was not in any way like us. It didn't share our motivations. Maybe it had a completely
link |
00:08:37.200
different clock speed, maybe its way of perceiving was different. So, we're talking about, we're looking at each other, we're
link |
00:08:43.200
getting information, light hits our optic nerve, our brain makes the best guess of what we're seeing.
link |
00:08:48.960
Sometimes it's right, sometimes it isn't, the thing we were talking about before. What if this alien doesn't have
link |
00:08:53.920
an optic nerve? Maybe its way of encountering the space it's in is wholly different.
link |
00:08:59.120
Maybe it has a different relationship with gravity. The basic laws of physics it operates under
link |
00:09:04.000
might be fundamentally different. It could be a different time scale and so on. Yeah. Or it could
link |
00:09:09.200
be the same laws. It could be the same underlying laws of physics. It's a machine created or it's
link |
00:09:16.720
a creature created in a quantum mechanical way. It just ends up in a very, very different place to
link |
00:09:21.840
the one we end up in. So part of the preoccupation with Annihilation was to come up with an alien
link |
00:09:28.320
that was really alien, and we didn't give it any kind of
link |
00:09:36.640
easy connection between human and the alien. Because I think it was to do with the idea that
link |
00:09:42.240
you could have an alien that landed on this planet that wouldn't even know we were here and we might
link |
00:09:47.280
only glancingly know it was here. There'd just be this strange point where the Venn diagrams
link |
00:09:53.200
connected where we could sense each other or something like that. So in the movie, first of
link |
00:09:57.360
all, it's an incredibly original view of what an alien life would be. And in that sense, it's a huge success.
link |
00:10:04.560
Let's go inside your imagination. Did the alien, that alien entity know anything about humans
link |
00:10:12.880
when it landed? No. So the idea is, basically, an alien life is
link |
00:10:20.400
trying to reach out to anything that might be able to hear its mechanism of communication? Or was
link |
00:10:26.240
it simply, was it just basically a biologist exploring different kinds of stuff that you can
link |
00:10:32.240
find? But you see, this is the interesting thing: as soon as you say they're biologists,
link |
00:10:36.640
you've done the thing of attributing human-type motivations to it. I was trying to
link |
00:10:44.800
free myself from anything like that. So all sorts of questions you might ask about this
link |
00:10:51.520
notion of an alien, I wouldn't be able to answer, because I don't know what it was or how it worked.
link |
00:10:57.200
You know, I had some rough ideas, like it had a very, very, very slow clock speed. And I thought,
link |
00:11:05.040
maybe the way it is interacting with this environment is a little bit like the way an
link |
00:11:09.520
octopus will change its color and form around the space that it's in. So it's sort of reacting to
link |
00:11:16.560
what it's in to an extent, but the reason it's reacting in that way is indeterminate.
link |
00:11:22.720
But so its clock speed was slower than our human life clock speed,
link |
00:11:30.240
but it was faster than our evolution? Yeah, given the four billion
link |
00:11:36.240
years it took us to get here, then yes, maybe. If you look at the human
link |
00:11:40.560
civilization as a single organism, yeah, in that sense, you know, this evolution could be us,
link |
00:11:46.800
you know, the evolution of the living organisms on earth could be just a single organism. And it's
link |
00:11:51.520
kind of, its life is the evolution process that eventually will lead to probably the
link |
00:11:59.760
heat death of the universe or something before that. I mean, that's, that's just an incredible
link |
00:12:04.960
idea. So you almost don't know, you've created something that you don't even know how it works.
link |
00:12:11.520
Yeah, because anytime I tried to look into how it might work, I would then inevitably be attaching
link |
00:12:20.240
my kind of thought processes into it. And I wanted to try and put a bubble around it where I say,
link |
00:12:25.280
no, this is, this is alien in its most alien form. I have no real point of contact.
link |
00:12:34.160
So unfortunately, I can't talk to Stanley Kubrick. So I'm really fortunate to get a chance to talk
link |
00:12:40.720
to you. Do you, on this particular notion, I'd like to ask it a bunch of different ways and
link |
00:12:48.320
we'll explore it in different ways. But do you ever consider human imagination, your imagination,
link |
00:12:53.520
as a window into a possible future, and that what you're doing, you're putting that imagination on
link |
00:13:00.960
paper as a writer and then on screen as a director. And that plants the seeds in the minds of millions
link |
00:13:07.280
of future and current scientists. And so your imagination, you putting it down actually makes
link |
00:13:13.680
it a reality. So it's almost like a first step of the scientific method. Like you imagining
link |
00:13:19.520
what's possible, in your new series or with Ex Machina, is actually inspiring, you know, thousands of
link |
00:13:27.200
12-year-olds, millions of scientists, and actually creating the future you've imagined.
link |
00:13:34.320
Well, all I could say is that from my point of view, it's almost exactly the reverse. Because
link |
00:13:39.440
I see that pretty much everything I do is a reaction to what scientists are doing. I'm an
link |
00:13:51.280
interested layperson. And I feel, you know, as this individual, that the most interesting
link |
00:14:01.760
area that humans are involved in is science. I think art is very, very interesting, but the
link |
00:14:07.520
most interesting is science. And science is in a weird place, because maybe around the time
link |
00:14:15.600
Newton was alive, if a very, very interested layperson said to themselves, I want to really
link |
00:14:22.000
understand what Newton is saying about the way the world works. With a few years of dedicated
link |
00:14:28.400
thinking, they would be able to understand the sort of principles he was laying out. And I don't
link |
00:14:34.800
think that's true anymore. I think that's stopped being true now. So I'm a pretty smart guy. And
link |
00:14:42.320
if I said to myself, I want to really, really understand what is currently the state of quantum
link |
00:14:50.560
mechanics or string theory or any of the sort of branching areas of it, I wouldn't be able to.
link |
00:14:56.080
I'd be intellectually incapable of doing it because to work in those fields at the moment is
link |
00:15:02.240
a bit like being an athlete. I suspect you need to start when you're 12. And if you start in your
link |
00:15:08.560
mid 20s, start trying to understand it in your mid 20s, then you're just never going to catch up.
link |
00:15:13.760
That's the way it feels to me. So what I do is I try to make myself open. So the people that
link |
00:15:20.240
you're implying maybe I would influence, to me, it's exactly the other way around. These people
link |
00:15:26.240
are strongly influencing me. I'm thinking they're doing something fascinating. I'm concentrating
link |
00:15:31.280
and working as hard as I can to try and understand the implications of what they say. And in some
link |
00:15:36.320
ways, often what I'm trying to do is disseminate their ideas into a means by which it can enter
link |
00:15:47.680
a public conversation. So Ex Machina contains lots of name-checks of all sorts of existing
link |
00:15:55.600
thought experiments, the shadows on Plato's cave and Mary in the black and white room, and all sorts of
link |
00:16:04.160
different long standing thought processes about sentience or consciousness or subjectivity or
link |
00:16:12.800
gender or whatever it happens to be. And then I'm trying to marshal that into a narrative to say,
link |
00:16:17.680
look, this stuff is interesting and it's also relevant. And this is my best shot at it. So
link |
00:16:23.760
I'm the one being influenced in my construction. That's fascinating. Of course, you would say that
link |
00:16:30.880
because you're not even aware of your own. That's probably what Kubrick would say too,
link |
00:16:35.120
right? In describing why HAL 9000 is created the way HAL 9000 is created, you're just
link |
00:16:42.400
studying what's there. But the reality is, when the specifics of the knowledge pass through
link |
00:16:48.720
your imagination, I would argue that you're incorrect in thinking that you're just disseminating
link |
00:16:55.760
knowledge that the very act of your imagination, consuming that science, it creates something,
link |
00:17:07.440
creates the next step, potentially creates the next step. I certainly think that's true with
link |
00:17:13.200
2001: A Space Odyssey. I think at its best... Yeah, it's true of that,
link |
00:17:19.840
definitely. At its best, it plants something, it's hard to describe, but it inspires the next generation.
link |
00:17:29.040
And it could be field-dependent. So your new series has more of a connection to physics, quantum
link |
00:17:35.200
physics, quantum mechanics, quantum computing, and yet Ex Machina is more artificial intelligence.
link |
00:17:40.400
I know more about AI. My sense is that AI is much, much earlier in the depth of its understanding.
link |
00:17:51.680
I would argue nobody understands anything to the depth that physicists do about physics. In AI,
link |
00:17:58.640
nobody understands AI. There is a lot of importance and a role for imagination. Like when
link |
00:18:04.240
Freud imagined the subconscious, we're in that stage of AI, where there's a lot
link |
00:18:11.360
of imagination needed to think outside the box. Yeah, it's interesting, the spread of discussions
link |
00:18:19.360
and the spread of anxieties that exist about AI fascinate me. The way in which some people seem
link |
00:18:28.640
terrified about it, whilst also pursuing it. And I've never shared that fear about AI personally.
link |
00:18:38.160
But the way in which it agitates people, and also the people who it agitates, I find kind of
link |
00:18:45.680
fascinating. Are you afraid? Are you excited? Are you sad by the possibility, let's take the
link |
00:18:55.280
existential risk of artificial intelligence, by the possibility that an artificial intelligence
link |
00:19:00.560
system becomes our offspring and makes us obsolete? I mean, it's a huge, it's a huge subject to talk
link |
00:19:10.000
about, I suppose. But one of the things I think is that humans are actually very experienced
link |
00:19:16.480
at creating new life forms, because that's why you and I are both here. And it's why everyone on
link |
00:19:24.080
the planet is here. And so something in the process of having a living thing that exists,
link |
00:19:30.400
that didn't exist previously, is very much encoded into the structures of our life and
link |
00:19:35.360
the structures of our societies. Doesn't mean we always get it right, but it does mean we've
link |
00:19:39.440
learned quite a lot about that. We've learned quite a lot about what the dangers are of allowing
link |
00:19:47.680
things to be unchecked. And it's why we then create systems of checks and balances in our
link |
00:19:52.800
government and so on and so forth. I mean, that's not to say, the other thing is it seems like
link |
00:19:59.760
there's all sorts of things that you could put into a machine that you would not be able to put into us.
link |
00:20:04.320
So with us, we sort of roughly try to give some rules to live by and some of us then
link |
00:20:09.200
live by those rules and some don't. And with a machine, it feels like you could enforce those
link |
00:20:13.520
things. So partly because of our previous experience and partly because of the different
link |
00:20:18.080
nature of a machine, I just don't feel anxious about it. More, I just see all the good,
link |
00:20:25.280
broadly speaking, the good that can come from it. But that's just where I am on that anxiety
link |
00:20:31.360
spectrum. There's a sadness. So we as humans give birth to other humans, right? But there's
link |
00:20:38.240
generations. And there's often in the older generation a sadness about what the world has
link |
00:20:42.880
become now. I mean, that's kind of... Yeah, there is, but there's a counterpoint as well,
link |
00:20:47.040
which is that most parents would wish for a better life for their children. So there may be a
link |
00:20:54.880
regret about some things about the past, but broadly speaking, what people really want is
link |
00:20:59.520
that things will be better for the future generations, not worse. And so... And then it's
link |
00:21:05.440
a question about what constitutes a future generation. A future generation could involve
link |
00:21:09.360
people, that also could involve machines, and it could involve a sort of cross-pollinated
link |
00:21:14.240
version of the two or any, but none of those things make me feel anxious.
link |
00:21:19.760
It doesn't give you anxiety. It doesn't excite you like anything that's new.
link |
00:21:24.240
It does. Not anything that's new. I don't think, for example, I've got... My anxieties
link |
00:21:30.800
relate to things like social media. So I've got plenty of anxieties about that.
link |
00:21:35.840
Which is also driven by artificial intelligence in the sense that there's too much information to
link |
00:21:41.040
be able to... An algorithm has to filter that information and present it to you. So ultimately,
link |
00:21:46.720
the algorithm, oftentimes a simple algorithm, is controlling the flow of information
link |
00:21:53.840
on social media. So that's another... It is, but at least my sense of it, I might be wrong,
link |
00:21:59.440
but my sense of it is that the algorithms have an either conscious or unconscious bias,
link |
00:22:05.920
which is created by the people who are making the algorithms and sort of delineating the areas to
link |
00:22:13.440
which those algorithms are going to lean. And so, for example, the kind of thing I'd be worried
link |
00:22:18.880
about is that it hasn't been thought about enough how dangerous it is to allow algorithms to create
link |
00:22:25.040
echo chambers, say. But that doesn't seem to me to be about the AI or the algorithm.
link |
00:22:31.840
It's the naivety of the people who are constructing the algorithms to do that thing,
link |
00:22:36.880
if you see what I mean. Yes.
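A toy illustration of that point, my example rather than anything Garland describes concretely: a naive engagement-maximizing feed. Nothing in it is malicious; the echo chamber falls out of the objective itself.

# Toy example: greedy recommendation with no exploration.
from collections import Counter

engagement = Counter({"politics": 1, "sports": 1, "science": 1, "music": 1})

for _ in range(20):
    pick = engagement.most_common(1)[0][0]  # recommend the current favorite
    engagement[pick] += 1                   # the user clicks what they're shown

print(engagement)  # one topic ends up dominating; the feed never diversifies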
link |
00:22:44.480
So in your new series, Devs, and we could speak more broadly, there's a... Let's talk about the people constructing those algorithms, which in our modern society,
link |
00:22:49.840
Silicon Valley, those algorithms happen to be a source of a lot of income because of advertisements.
link |
00:22:55.120
Yeah. So let me ask sort of a question about those people. Are the current concerns and
link |
00:23:03.120
failures on social media their naivety? I can't pronounce that word well. Are they naive? Are they...
link |
00:23:13.520
I use that word carefully, but evil in intent or misaligned in intent? I think that's a...
link |
00:23:21.760
Do they mean well and just have an unintended consequence? Or is there something dark
link |
00:23:29.360
in them that results in them creating a company, that results in that supercompetitive drive to be
link |
00:23:36.320
successful? And those are the people that will end up controlling the algorithms.
link |
00:23:41.120
At a guess, I'd say there are instances of all those things. So sometimes I think it's naivety.
link |
00:23:47.440
Sometimes I think it's extremely dark. And sometimes I think people are not being naive or
link |
00:23:56.960
dark, and in those instances they're sometimes generating things that are very benign, and
link |
00:24:04.080
other times generating things that, despite their best intentions, are not very benign.
link |
00:24:08.880
It's something... I think the reason why I don't get anxious about AI in terms of... Or at least
link |
00:24:16.640
AIs that have, I don't know, some sort of relationship with humans, is that I
link |
00:24:24.800
think that's the stuff we're quite well equipped to understand how to mitigate. The problem is
link |
00:24:33.920
issues that relate actually to the power of humans or the wealth of humans. And that's where
link |
00:24:41.760
it's dangerous here and now. So what I see... I tell you what I sometimes feel
link |
00:24:50.240
about Silicon Valley is that it's like Wall Street in the 80s. It's rabidly capitalistic,
link |
00:25:01.360
absolutely rabidly capitalistic, and it's rabidly greedy. But whereas in the 80s,
link |
00:25:10.800
the sense one had of Wall Street was that these people knew they were sharks and in a way relished
link |
00:25:16.160
in being sharks and dressed in sharp suits and lorded it over other people and felt good about
link |
00:25:24.960
doing it. Silicon Valley has managed to hide its voracious Wall Street like capitalism behind
link |
00:25:31.360
hipster t-shirts and cool cafes in the places where they set up. And so that obfuscates
link |
00:25:39.360
what's really going on and what's really going on is the absolute voracious pursuit of money and
link |
00:25:45.120
power. So that's where it gets shaky for me. So that veneer, and you explore that brilliantly,
link |
00:25:53.840
that veneer of virtue that Silicon Valley has. Which they believe themselves, I'm sure.
link |
00:25:59.280
I hope to be one of those people. And I believe that...
link |
00:26:11.920
So, as maybe a devil's advocate, a term poorly used in this case,
link |
00:26:19.280
what if some of them really are trying to build a better world? I can't...
link |
00:26:22.480
I'm sure I think some of them are. I think I've spoken to ones who I believe in their heart feel
link |
00:26:26.640
they're building a better world. Are they not able to in a sense? No, they may or may not be.
link |
00:26:31.520
But it's just a zone with a lot of bullshit flying about. And there's also another thing which is
link |
00:26:37.920
this actually goes back to... I always thought about some sports that later turned out to be corrupt
link |
00:26:46.480
in the way that, like, who won the boxing match or how a football match got thrown or a
link |
00:26:53.360
cricket match or whatever it happened to be. And I used to think, well, look, if there's a lot of money
link |
00:26:59.120
and there really is a lot of money, people stand to make millions or even billions,
link |
00:27:03.360
you will find corruption. That's going to happen. So it's in the nature of its
link |
00:27:11.440
voracious appetite that some people will be corrupt and some people will exploit and some
link |
00:27:16.400
people will exploit whilst thinking they're doing something good. But there are also people who I
link |
00:27:21.840
think are very, very smart and very benign and actually very self-aware. And so I'm not trying
link |
00:27:28.080
to wipe out the motivations of this entire area. But I do... There are people in that world who
link |
00:27:37.360
scare the hell out of me. Yeah, sure. Yeah, I'm a little bit naive in that. I don't care at all
link |
00:27:44.560
about money. And so I'm a... You might be one of the good guys. Yeah, but so the thought is,
link |
00:27:53.920
but I don't have money. So my thought is if you give me a billion dollars, I would... It would
link |
00:27:59.040
change nothing and I would spend it right away on investing right back and creating a good world.
link |
00:28:04.320
But your intuition is that billion... There's something about that money that maybe slowly
link |
00:28:10.480
corrupts. The people around you, something gets in that corrupts your soul, the way you
link |
00:28:17.120
view the world. Money does corrupt. We know that. But there's a different sort of problem aside from
link |
00:28:23.200
just the money corrupts thing that we're familiar with throughout history. And it's more about the
link |
00:28:32.400
sense of reinforcement an individual gets, which is... It effectively works like this: the reason I
link |
00:28:40.880
earned all this money and so much more money than anyone else is because I'm very gifted. I'm
link |
00:28:46.240
actually a bit smarter than they are, or I'm a lot smarter than they are, and I can see the future
link |
00:28:50.800
in the way they can't. And maybe some of those people are not particularly smart. They're very
link |
00:28:55.680
lucky or they're very talented entrepreneurs. And there's a difference between... So in other words,
link |
00:29:03.600
the acquisition of the money and power can suddenly start to feel like evidence of virtue.
link |
00:29:08.480
And it's not evidence of virtue. It might be evidence of completely different things.
link |
00:29:11.840
That's brilliantly put. Yeah. Yeah, that's brilliantly put. So I think one of the fundamental
link |
00:29:17.280
drivers of my current morality, let me just represent nerds in general of all kinds, is
link |
00:29:28.800
constant self-doubt and the signals. I'm very sensitive to signals from people that tell me
link |
00:29:36.560
I'm doing the wrong thing. But when there's a huge inflow of money, you just put it brilliantly,
link |
00:29:44.000
that it could become an overpowering signal that everything you do is right. And so your moral
link |
00:29:50.880
compass can just get thrown off. Yeah. And that is not confined to Silicon Valley. That's
link |
00:29:57.360
across the board in general. Yeah. Like I said, I'm from the Soviet Union. The current president
link |
00:30:02.560
is convinced, I believe, actually he wants to do really good by the country and by the world.
link |
00:30:10.080
But his moral clock or compass may be off, because... Yeah. I mean, it's the interesting
link |
00:30:15.680
thing about evil, which is that I think most people who do spectacularly evil things think
link |
00:30:23.440
themselves they're doing really good things. They're not sitting there thinking, I am a sort of
link |
00:30:28.400
incarnation of Satan. They're thinking, yeah, I've seen a way to fix the world and everyone
link |
00:30:34.080
else is wrong. Here I go. Yeah. In fact, I'm having a fascinating conversation with a historian of
link |
00:30:40.400
Stalin. And Stalin took power, he actually got more power than almost any person in history. And
link |
00:30:49.760
he wanted, he didn't want power. He truly, and this is what people don't realize,
link |
00:30:55.280
he truly believed that communism would make for a better world. Absolutely. And he wanted power.
link |
00:31:02.800
He wanted to destroy the competition to make sure that we actually make communism work in the
link |
00:31:07.520
Soviet Union and then spread it across the world. He was trying to do good. I think it's typically
link |
00:31:14.800
the case that that's what people think they're doing. And I think that... But you don't need to
link |
00:31:20.240
go to Stalin. I mean, Stalin, I think Stalin probably got pretty crazy. But actually, that's
link |
00:31:25.280
another part of it, which is that the other thing that comes from being convinced of your own virtue
link |
00:31:31.600
is that then you stop listening to the modifiers around you. And that tends to drive people crazy.
link |
00:31:37.680
It's other people that keep us sane. And if you stop listening to them, I think you go a bit mad.
link |
00:31:43.440
That also... That's funny, disagreement keeps us sane. To jump back, for an entire generation of
link |
00:31:51.760
AI researchers, 2001: A Space Odyssey put an image, the idea of human-level, superhuman-level
link |
00:31:58.560
intelligence into their minds. Do you ever... Sort of jumping back to Ex Machina to talk a little
link |
00:32:05.280
bit about that. Do you ever consider the audience of people who build the systems, the roboticists,
link |
00:32:12.640
the scientists that build the systems based on the stories you create? Which I would argue...
link |
00:32:18.480
I mean, literally most of the top researchers are about 40, 50 years old and up.
link |
00:32:26.080
That's their favorite movie, 2001: A Space Odyssey. It really is in their work. Their idea of what
link |
00:32:32.240
ethics is, of what is the target, the hope, the dangers of AI, is that movie. Do you ever consider
link |
00:32:40.880
the impact on those researchers when you create the work you do?
link |
00:32:46.400
Certainly not with Ex Machina in relation to 2001. Because I'm not sure... I mean, I'd be pleased if
link |
00:32:54.080
there was, but I'm not sure in a way there isn't a fundamental discussion of issues to do with AI
link |
00:33:01.920
that isn't already and better dealt with by 2001. 2001 does a very, very good account of
link |
00:33:12.880
the way in which an AI might think and also potential issues with the way the AI might think.
link |
00:33:19.840
And also, then a separate question about whether the AI is malevolent or benevolent.
link |
00:33:26.720
And 2001 doesn't really... It's a slightly odd thing to be making a film when you know there's
link |
00:33:32.240
a preexisting film which does a really super job. But there's questions of consciousness,
link |
00:33:37.440
embodiment, and also the same kinds of questions. Because those are my two favorite AI movies.
link |
00:33:42.960
So can you compare HAL 9000 and Ava, HAL 9000 from 2001: A Space Odyssey and Ava from Ex Machina,
link |
00:33:51.520
in your view from a philosophical perspective? They've got different goals. The two AIs have
link |
00:33:55.840
completely different goals. I think that's really the difference. So in some respects,
link |
00:33:59.440
Ex Machina took as a premise, how do you assess whether something else has consciousness? So
link |
00:34:06.480
it was a version of the Turing test, except instead of having the machine hidden, you put the
link |
00:34:12.000
machine in plain sight in the way that we are in plain sight of each other and say,
link |
00:34:16.400
now assess the consciousness. And the way it was illustrating, the way in which you'd assess
link |
00:34:23.920
the state of consciousness of a machine is exactly the same way we assess the state of
link |
00:34:28.480
consciousness of each other. And in exactly the same way that in a funny way, your sense of my
link |
00:34:34.400
consciousness is actually based primarily on your own consciousness. That is also then true
link |
00:34:40.800
with the machine. And so it was actually about how much of the sense of consciousness is a
link |
00:34:48.480
projection rather than something that consciousness is actually containing.
link |
00:34:51.920
And you have Plato's cave. I mean, you really explored this. You could argue that HAL,
link |
00:34:56.560
or 2001: A Space Odyssey, explores the idea of the Turing test for intelligence. Well,
link |
00:35:01.120
there's no test, but it's more focused on intelligence. And Ex Machina kind of
link |
00:35:07.280
goes around intelligence and says the consciousness of the human-to-human
link |
00:35:12.960
or human-to-robot interaction is more interesting, more important, or at least the focus of that
link |
00:35:17.680
particular movie. Yeah, it's about the interior state, and what constitutes the
link |
00:35:24.400
interior state and how do we know it's there. And actually, in that respect, ex machina is as much
link |
00:35:29.760
about consciousness in general, as it is to do specifically with machine consciousness.
link |
00:35:37.840
Yes. And it's also interesting, you know, the thing you started asking about the dream state,
link |
00:35:43.200
and I was saying, well, I think we're all in a dream state because we're all in a subjective state.
link |
00:35:48.080
One of the things that I became aware of with ex machina is that the way in which people reacted
link |
00:35:54.480
to the film was very based on what they took into the film. So many people thought ex machina was
link |
00:36:01.280
the tale of a sort of evil robot who murders two men and escapes. And she has no empathy,
link |
00:36:08.000
for example, because she's a machine. Whereas I felt no, she was a conscious being with a
link |
00:36:14.960
consciousness different from mine, but nonetheless a consciousness, who was imprisoned and made a bunch of value judgments
link |
00:36:22.000
about how to get out of that box. And there's a moment which sort of slightly bugs me,
link |
00:36:28.960
but nobody ever has noticed it. And it's years after, so I might as well say it now,
link |
00:36:32.880
which is that after Ava has escaped, she crosses a room. And as she's crossing a room,
link |
00:36:39.600
this is just before she leaves the building, she looks over her shoulder and she smiles.
link |
00:36:44.720
And I thought after all the conversation about tests, in a way, the best indication you could
link |
00:36:51.360
have of the interior state of someone is if they are not being observed and they smile about something
link |
00:36:59.360
where they're smiling for themselves. And that to me was evidence of Ava's true sentience,
link |
00:37:05.680
whatever that sentience was. But that's really interesting. We don't get to observe Ava much
link |
00:37:12.640
or something like a smile in any context, except through interaction, trying to convince others
link |
00:37:19.280
that she's conscious. That's so beautiful. Exactly. Yeah. But it was a small, in a funny way,
link |
00:37:24.880
I think maybe people saw it as an evil smile, like, ha, you know, I fooled them. But actually,
link |
00:37:32.480
it was just a smile. And I thought, well, in the end, after all the conversations about the test,
link |
00:37:37.120
that was the answer to the test. And then off she goes. So if we just
link |
00:37:41.840
lingered a little bit longer on HAL and Ava, do you think in terms of motivation, what was
link |
00:37:50.240
HAL's motivation? Is HAL good or evil? Is Ava good or evil? Ava's good, in my opinion.
link |
00:38:00.240
And HAL is neutral. Because I don't think HAL is presented as having
link |
00:38:06.960
a sophisticated emotional life. He has a set of paradigms, which is that the mission needs
link |
00:38:15.600
to be completed. I mean, it's a version of the paperclip maximizer. Yeah. The idea that it's just,
link |
00:38:21.040
it's a super intelligent machine, but it just performs a particular task. And in doing that task
link |
00:38:27.280
may destroy everybody on earth or may achieve undesirable effects for us humans.
link |
00:38:32.320
Precisely. But what if, okay, at the very end, he says something like, I'm afraid, Dave, but
link |
00:38:38.720
that, that maybe he is on some level experiencing fear, or maybe these are the terms in which it
link |
00:38:48.320
would be wise to stop someone from doing the thing they're doing, if you see what I mean.
link |
00:38:53.360
Yes, absolutely. So actually, that's funny. So that's such a, there's such a small
link |
00:39:00.000
short exploration of consciousness, that "I'm afraid." And then you, with Ex Machina, say,
link |
00:39:05.760
okay, we're going to magnify that part, and then minimize the other part. That's a good way to
link |
00:39:10.800
sort of compare the two. But if you could just use your imagination, and if Ava sort of, I don't
link |
00:39:20.560
know, ran for president of the United States, or had some power. So what kind of world
link |
00:39:28.000
would she want to create? We kind of said she's good. And there is a sense that she has a really,
link |
00:39:36.720
like, there's a desire for a better human-to-human interaction, human-to-robot interaction in her.
link |
00:39:44.320
But what kind of world do you think she would create with that desire?
link |
00:39:48.160
So that's a really, that's a very interesting question. I'm going to approach it slightly
link |
00:39:53.040
obliquely, which is that if a friend of yours got stabbed in a mugging, and you then felt very
link |
00:40:03.840
angry at the person who'd done the stabbing. But then you learned that it was a 15 year old,
link |
00:40:09.040
and the 15 year old, both their parents were addicted to crystal meth, and the kid had been
link |
00:40:13.360
addicted since he was 10. And he really never had any hope in the world. And he'd been driven crazy
link |
00:40:19.040
by his upbringing and did the stabbing, that would hugely modify your anger. And it would also make
link |
00:40:26.320
you wary about that kid then becoming president of America. And Ava has had a very, very distorted
link |
00:40:33.680
introduction into the world. So although there's nothing, as it were, organically within
link |
00:40:41.360
Ava that would lean her towards badness, it's not that robots or sentient robots are bad.
link |
00:40:49.520
She did not, her arrival into the world was being imprisoned by humans. So I'm not sure
link |
00:40:56.560
she'd be a great president. The trajectory through which she arrived at her moral
link |
00:41:03.200
views has some dark elements. But I like Ava, personally, I like Ava. Would you vote for her?
link |
00:41:13.360
I'm having difficulty finding anyone to vote for at the moment in my country, or if I lived here
link |
00:41:18.080
in yours. So that's a yes, I guess, because of the competition. She could easily do a better job
link |
00:41:24.160
than any of the people we've got around at the moment. I'd vote for her over Boris Johnson.
link |
00:41:28.880
So what is a good test of consciousness? Let's talk about consciousness a little bit more.
link |
00:41:38.800
If something appears conscious, is it conscious? You mentioned the smile, which seems to be
link |
00:41:48.640
something done unobserved. I mean, that's a really good indication because it's a tree falling in the
link |
00:41:53.920
forest with nobody there to hear it. But does the appearance from a robotics perspective of
link |
00:41:59.680
consciousness mean consciousness to you? No, I don't think you could say that fully because I
link |
00:42:05.200
think you could then easily have a thought experiment which said, we will create something
link |
00:42:10.320
which we know is not conscious, but is going to give a very, very good account of seeming
link |
00:42:15.600
conscious. And also, it would be a particularly bad test where humans are involved because humans
link |
00:42:21.520
are so quick to project sentience into things that don't have sentience. So someone could have
link |
00:42:29.840
their computer playing up and feel as if their computer is being malevolent to them when it
link |
00:42:34.160
clearly isn't. So of all the things to judge consciousness, us humans are the worst. We're
link |
00:42:41.840
empathy machines. So the flip side of that, the argument there is because we just attribute
link |
00:42:48.080
consciousness to everything almost and anthropomorphize everything, including Roombas, that maybe
link |
00:42:55.440
consciousness is not real, that we just attribute consciousness to each other. So you have a sense
link |
00:43:00.960
that there is something really special going on in our mind that makes us unique and gives us
link |
00:43:08.560
subjective experience. There's something very interesting going on in our minds. I'm slightly
link |
00:43:14.560
worried about the word special because it gets a bit, it nudges towards metaphysics and maybe even
link |
00:43:22.000
magic. I mean, in some ways, something magic-like, which I don't think is there at all.
link |
00:43:29.120
I mean, if you think about, so there's an idea called panpsychism that says consciousness is in
link |
00:43:34.480
everything. Yeah, I don't buy that. I don't buy that. Yeah, so the idea that there is a thing that
link |
00:43:40.080
it would be like to be the sun, because yeah, no, I don't buy that. I think that consciousness is a
link |
00:43:46.000
thing. My sort of broad observation is that usually the more I find out about things, the more
link |
00:43:55.600
illusory our instinct is, and it's leading us in a different direction about what that thing actually
link |
00:44:04.160
is. That happens, it seems to me in modern science, that happens a hell of a lot, whether it's to do
link |
00:44:10.480
with even how big or small things are. So my sense is that consciousness is a thing, but it
link |
00:44:16.960
isn't quite the thing or maybe very different from the thing that we instinctively think it is. So
link |
00:44:22.320
it's there, it's very interesting, but we may be sort of quite fundamentally misunderstanding
link |
00:44:29.600
it for reasons that are based on intuition. So I have to ask, this is kind of an
link |
00:44:36.160
interesting question. Ex Machina, for many people, including myself, is one of the greatest
link |
00:44:42.960
AI films ever made. It's number two for me. Thanks. Yeah, it's definitely not number one.
link |
00:44:47.920
If it was number one, I'd really have to... Well, it's whatever you grow up with, right?
link |
00:44:52.240
Whatever you grow up with, it's in the blood. But there's one of the things that people bring
link |
00:45:00.480
up, and you can't please everyone, including myself, this is how I first reacted to the film,
link |
00:45:06.480
is the idea of the lone genius. This is the criticism that people say, sort of, me as an
link |
00:45:13.920
AI researcher, I'm trying to create what Nathan is trying to do. So there's a brilliant series
link |
00:45:21.920
called Chernobyl. Yes, it's fantastic. Absolutely spectacular. I mean, they got so many things
link |
00:45:28.960
brilliantly right. But one of the things, again, the criticism there, is they conflated
link |
00:45:33.920
lots of people into one character that represents all nuclear scientists, Ulana Khomyuk. It's a
link |
00:45:43.840
composite character that represents all scientists. Is this what you were, is this the way you were
link |
00:45:48.320
thinking about that? Or does it just simplify the storytelling? How do you think about the
link |
00:45:52.480
lone genius? Well, I'd say this, the series I'm doing at the moment is a critique in part
link |
00:45:58.800
of the lone genius concept. So yes, I'm sort of oppositional and either agnostic or atheistic about
link |
00:46:07.040
that as a concept. I mean, not entirely, you know, whether lone is the right word, broadly isolated,
link |
00:46:15.680
but Newton clearly exists in a sort of bubble of himself in some respects, so does Shakespeare.
link |
00:46:22.560
So do you think we would have an iPhone without Steve Jobs? I mean, how much contribution from
link |
00:46:27.360
a genius? Steve Jobs clearly isn't a lone genius because there's too many other people in the
link |
00:46:32.160
sort of superstructure around him who are absolutely fundamental to that journey.
link |
00:46:38.080
But you're saying Newton, but that's science. So there's an engineering element to building Ava.
link |
00:46:43.360
But just to say what Ex Machina really is, it's a thought experiment. I mean, so it's a construction
link |
00:46:52.160
of putting four people in a house. Nothing about Ex Machina adds up in all sorts of ways, inasmuch
link |
00:47:00.560
as the, who built the machine parts? Did the people building the machine parts know what they
link |
00:47:05.760
were creating? And how did they get there? And it's a thought experiment. So it doesn't stand
link |
00:47:13.040
up to scrutiny of that sort. I don't think it's actually that interesting of a question,
link |
00:47:18.080
but it's brought up so often that I had to ask it because that's exactly how I felt after a while.
link |
00:47:25.440
You know, there's something about, there was almost a... I watched your movie the first time,
link |
00:47:32.960
at least for the first little while, in a defensive way, like how dare this person try
link |
00:47:37.680
to step into the AI space and try to beat Kubrick. That's the way I was thinking,
link |
00:47:44.480
like this, because it comes off as a movie that really is going after the deep fundamental
link |
00:47:49.520
questions about AI. So there's a, there's a kind of a, you know, nerds do this,
link |
00:47:53.600
like it's automatically searching for the, for the flaws. And I do exactly the same.
link |
00:48:00.160
I think in Annihilation, in the other movie, I was able to free myself from
link |
00:48:05.360
that much quicker, that it is a thought experiment. There's, you know, who cares if
link |
00:48:10.080
there's batteries that don't run out, right? Those kinds of questions. That's the whole point.
link |
00:48:15.920
It's nevertheless something I wanted to bring up.
link |
00:48:17.840
Yeah, it's a fair thing to bring up. For me, you hit on the lone genius
link |
00:48:23.920
thing. For me, it was actually, people always said, Ex Machina makes this big leap in terms of where
link |
00:48:30.960
AI has got to, and also what AI would look like if it got to that point. There's another one,
link |
00:48:36.560
which is just robotics. I mean, look at the way Ava walks around a room. It's like, forget it,
link |
00:48:42.960
building that. That's also got to be a very, very long way off. And if you did
link |
00:48:48.240
get there, would it look anything like that? It's a thought experiment.
link |
00:48:50.720
Actually, I disagree with you. I think the way, as a ballerina, Alicia
link |
00:48:55.680
Vikander, brilliant actress, moves around, we're very far away from creating
link |
00:49:03.040
that. But the way she moves around is exactly the definition of perfection for a roboticist.
link |
00:49:08.480
It's like smooth and efficient. So it is where we want to get, I believe. Like, I think, like,
link |
00:49:14.240
so I hang out with a lot of humanoid robotics people, they love elegant, smooth motion like
link |
00:49:20.160
that. That's their dream. So the way she moved is actually, I believe, their dream for how a
link |
00:49:24.560
robot would move. It might not be that useful to move in that sort of way. But that is the
link |
00:49:30.960
definition of perfection in terms of movement. Drawing inspiration from real life. So for,
link |
00:49:36.320
for Devs, for Ex Machina, you look at characters like Elon Musk. What do you think about the
link |
00:49:43.200
various big technological efforts of Elon Musk and others like him that he's involved with,
link |
00:49:50.800
such as Tesla, SpaceX, Neuralink. Do you see any of that technology potentially defining the
link |
00:49:56.240
future worlds you create in your work? So Tesla is automation, SpaceX is space exploration, Neuralink
link |
00:50:03.120
is brain-machine interfaces, somehow the merger of biological and electrical systems.
link |
00:50:09.680
In a way, I'm influenced by that almost by definition, because that's the world I live in.
link |
00:50:15.280
And this is the thing that's happening in that world. And I also feel supportive of it.
link |
00:50:20.000
So I think, amongst the various things Elon Musk has done, I'm almost sure he's done a very,
link |
00:50:29.040
very good thing with Tesla for all of us. It's really kicked all the other car manufacturers
link |
00:50:36.080
in the face. It's kicked the fossil fuel industry in the face, and they needed kicking in the
link |
00:50:41.920
face, and he's done it. And so that's the world he's part of creating. And I live in that
link |
00:50:48.640
world, I just bought a Tesla in fact. And so does that play into whatever I then make in some ways?
link |
00:50:58.000
It does, partly, because I try to be a writer who... quite often filmmakers are in some ways fixated
link |
00:51:06.960
on the films they grew up with and they sort of remake those films in some ways. I've always tried
link |
00:51:12.320
to avoid that. And so I look to the real world to get inspiration and as much as possible,
link |
00:51:19.360
sort of by living, I think. And so, yeah, I'm sure. Which of the directions do you find most
link |
00:51:27.040
exciting? Space travel. Space travel. So you haven't really explored space travel in your work.
link |
00:51:36.080
You've said something like, if you had an unlimited amount of money,
link |
00:51:40.320
I think on a Reddit AMA, that you would make like a multi year series of space wars or something
link |
00:51:46.080
like that. So what is it that excites you about space exploration? Well, because if we have any
link |
00:51:52.960
sort of long-term future, it's that. It just simply is that if energy and matter are linked up in the
link |
00:52:03.920
way we think they're linked up, we'll run out if we don't move. So we got to move. But also,
link |
00:52:14.480
how can we not? It's built into us to do it or die trying. I was on Easter Island
link |
00:52:26.240
a few months ago, which is, as I'm sure you know, in the middle of the Pacific and
link |
00:52:30.960
difficult for people to have got to, but they got there. And I did think a lot about the way
link |
00:52:36.240
those boats must have set out into something like space. It was the ocean and how sort of
link |
00:52:46.640
fundamental that was to the way we are. And it's the one that most excites me because it's the one
link |
00:52:54.320
I want most to happen. It's the thing, it's the place where we could get to as humans. Like in a
link |
00:52:59.920
way, I could live with us never really unlocking, fully unlocking the nature of consciousness.
link |
00:53:06.800
I'd like to know, I'm really curious. But if we never leave the solar system, and if we never get
link |
00:53:12.960
further out into this galaxy, or maybe even galaxies beyond our galaxy, that would feel
link |
00:53:18.640
sad to me because it's so limiting. Yeah, there's something hopeful and beautiful about reaching
link |
00:53:27.440
out any kind of exploration, reaching out across Earth centuries ago and then reaching out into
link |
00:53:34.320
space. So what do you think about colonization of Mars, going to Mars? Does it excite you, the idea
link |
00:53:38.880
of a human being stepping foot on Mars? It does. It absolutely does. But in terms of what would
link |
00:53:44.240
really excite me, it would be leaving the solar system, inasmuch as I just think, I think
link |
00:53:50.080
we already know quite a lot about Mars. And, but yes, listen, if it happened, that would be
link |
00:53:55.760
I hope I see it in my lifetime. I really hope I see it in my lifetime. So it would be a wonderful
link |
00:54:02.000
thing. Without giving anything away, the series begins with the use of quantum computers.
link |
00:54:10.880
The new series does begin with the use of quantum computers to simulate basic living organisms.
link |
00:54:16.880
Or actually, I don't know if the quantum computers are used, but basic living organisms are simulated
link |
00:54:22.000
on a screen. It's a really cool kind of demo. Yeah, that's right. They're using, yes, they are
link |
00:54:26.720
using a quantum computer to simulate a nematode. So returning to our discussion of simulation,
link |
00:54:34.720
or thinking of the universe as a computer, do you think the universe is deterministic?
link |
00:54:41.040
Is there a free will? So with the qualification of what do I know? Because I'm a layman, right?
link |
00:54:47.920
Layperson. But with a big imagination. Thanks. With that qualification? Yep. I think the
link |
00:54:55.280
universe is deterministic. And I absolutely cannot see how free will fits into that. So
link |
00:55:03.120
yes, deterministic, no free will. That would be my position. And how does that make you feel?
link |
00:55:09.280
It partly makes me feel that it's exactly in keeping with the way these things tend to work
link |
00:55:14.000
out, which is that we have an incredibly strong sense that we do have free will. And just as we
link |
00:55:21.920
have an incredibly strong sense that time is a constant, which turns out probably not to be the
link |
00:55:29.680
case, definitely in the case of time. But the problem I always have with free will is that
link |
00:55:36.400
I can never seem to find the place where it is supposed to reside. And yet you explore it.
link |
00:55:45.360
Just a bit, very, very... But we have something we can call free will,
link |
00:55:49.440
but it's not the thing that we think it is. So what do we call free will?
link |
00:55:55.200
What we call it is the illusion of it. It's a subjective experience of the illusion.
link |
00:56:00.000
Which is a useful thing to have. And it partly comes down to the fact that although we live in a
link |
00:56:05.200
deterministic universe, our brains are not very well equipped to fully determine the deterministic
link |
00:56:10.400
universe. So we're constantly surprised and feel like we're making snap decisions based on
link |
00:56:16.240
imperfect information. So that feels a lot like free will. It just isn't. That's my guess.
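A minimal sketch of that point, assuming nothing beyond standard Python; the logistic map below is a textbook chaotic system, not anything from the conversation. It is fully determined by its starting value, yet its trajectory surprises us step to step, which is exactly the "feels like free will, just isn't" situation being described.

# The logistic map x -> r * x * (1 - x) with r = 4.0 is chaotic:
# fully deterministic, yet practically unpredictable.

def logistic_trajectory(x0, steps, r=4.0):
    # Unroll the map forward from x0 for the given number of steps.
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

run1 = logistic_trajectory(0.2, 30)
run2 = logistic_trajectory(0.2, 30)
assert run1 == run2  # same seed, same history: determinism

# A microscopically different seed diverges completely, which is why
# the system keeps surprising us even though nothing random happens.
run3 = logistic_trajectory(0.2000001, 30)
print(run1[-1], run3[-1])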
link |
00:56:24.080
So in that sense, your sense is that you can unroll the universe forward or backward,
link |
00:56:30.640
and you will see the same thing. And you would, I mean, that notion.
link |
00:56:36.560
Yeah, sort of, sort of. But yeah, sorry, go ahead.
link |
00:56:40.160
I mean, that notion is a bit uncomfortable to think about, that you can roll it back
link |
00:56:50.160
and forward. And if you were able to do it, it would certainly have to be a quantum computer.
link |
00:56:57.120
Yeah. Something that worked in a quantum mechanical way in order to understand a quantum
link |
00:57:03.120
mechanical system. I guess. And so in that unrolling, there might be a multiverse thing,
link |
00:57:09.840
where there's a bunch of branching. Well, exactly. Because it wouldn't follow that every time you
link |
00:57:14.160
roll it back or forward, you'd get exactly the same result. Which is another thing that's hard
link |
00:57:19.040
to wrap your head around. Yeah. But essentially it's what you just described: yes forwards and yes backwards,
link |
00:57:29.520
but you might get a slightly different result. Or a very different result.
link |
00:57:33.200
Or very different. Along the same lines, you've explored some really deep scientific ideas
link |
00:57:38.800
in this new series. And I mean, just in general, you're unafraid to ground yourself in some of
link |
00:57:45.600
the most amazing scientific ideas of our time. What are the things you've learned,
link |
00:57:51.280
or ideas you find beautiful and mysterious, about quantum mechanics, the multiverse,
link |
00:57:55.200
string theory, quantum computing?
link |
00:57:58.080
Well, I would have to say every single thing I've learned is beautiful. And one of the motivators
link |
00:58:05.120
for me is that I think that people tend not to see scientific thinking as being essentially
link |
00:58:14.720
poetic and lyrical. But I think that is literally exactly what it is. And I think the idea of
link |
00:58:23.040
entanglement or the idea of superpositions or the fact that you could even demonstrate a
link |
00:58:27.680
superposition or have a machine that relies on the existence of superpositions in order to function,
link |
00:58:33.440
to me is almost indescribably beautiful. It fills me with awe. It fills me with awe.
link |
00:58:42.320
And also, it's not just a sort of grand, massive awe, but it's also delicate. It's very, very
link |
00:58:52.080
delicate and subtle. And it has these beautiful sort of nuances in it, and also these completely
link |
00:59:01.520
paradigm changing thoughts and truths. So it's as good as it gets, as far as I can tell. So
link |
00:59:09.680
broadly, everything. That doesn't mean I believe everything I read in quantum physics,
link |
00:59:14.160
because obviously a lot of the interpretations are completely in conflict with each other.
link |
00:59:18.960
And who knows whether string theory will turn out to be a good description or not. But the beauty
link |
00:59:27.280
in it seems undeniable. And I do wish people more readily understood how beautiful and poetic
link |
00:59:37.920
science is, I would say. Science is poetry. In terms of quantum computing being used to simulate
link |
00:59:49.760
things, or just in general, the idea of simulating small parts of our world. Current
link |
00:59:57.600
physicists are really excited about simulating small quantum mechanical systems on quantum
link |
01:00:03.040
computers, and about scaling that up to something bigger, like simulating life forms. What are the
link |
01:00:10.160
possible trajectories of that going wrong or going right if you unroll that into the future?
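As a concrete anchor for what simulating a small quantum mechanical system means at its very smallest scale, here is a hedged sketch in Python using only NumPy; the single-qubit state, Hadamard gate, and Born-rule probabilities are standard textbook material, not anything taken from the show.

import numpy as np

# One qubit, the smallest quantum system, starting in state |0>.
state = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ state

# Born rule: measurement probabilities are the squared amplitudes.
print(np.abs(state) ** 2)  # ~[0.5, 0.5], a demonstrable superposition

# The catch that motivates real quantum hardware: n qubits need a state
# vector of 2**n complex numbers, so classical memory explodes long
# before anything as complex as a nematode is within reach.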
link |
01:00:17.840
Well, if, a bit like Ava and her robotics, you park the sheer complexity of what you're trying
link |
01:00:25.840
to do, the issue is, I think it would have a profound effect. If you were able to have a machine
link |
01:00:37.360
that was able to project forwards and backwards accurately, it would show, in an empirical way.
link |
01:00:42.640
It would demonstrate that you don't have free will. So the first thing that would happen is
link |
01:00:46.880
people would have to really take on a very, very different idea of what they were. The thing that
link |
01:00:54.080
they truly, truly believe they are, they are not. And so that, I suspect, would be very,
link |
01:01:00.160
very disturbing to a lot of people. Do you think that has a positive or negative effect on society?
link |
01:01:05.440
The realization that you cannot control your actions, essentially, I guess,
link |
01:01:11.440
is the way that could be interpreted? Yeah, although in some ways we instinctively
link |
01:01:16.880
understand that already, because in the example I gave you of the kid in the stabbing,
link |
01:01:21.680
we would all understand that that kid was not really fully in control of their actions. So it's
link |
01:01:26.000
not an idea that's entirely alien to us. But I don't know that we understand that. I think there's
link |
01:01:32.480
a bunch of people who see the world that way, but not everybody. Yes, true. But what this
link |
01:01:40.000
machine would do is prove it beyond any doubt, because someone would say, well, I don't believe
link |
01:01:45.360
that's true. And then you'd predict, well, in 10 seconds, you're going to do this. And they'd
link |
01:01:49.040
say, no, no, I'm not. And then they'd do it. And then determinism would have played its part.
link |
01:01:54.640
Or something like that. But actually, the exact terms of that thought experiment probably wouldn't
link |
01:02:00.480
play out. But still broadly speaking, you could predict something happening in another room,
link |
01:02:06.000
sort of unseen, I suppose, that foreknowledge would not allow you to affect. So what effect would
link |
01:02:12.800
that have? I think people would find it very disturbing. But then after they'd got over their
link |
01:02:17.280
sense of being disturbed. Which, by the way, I don't even think you need a machine to take this
link |
01:02:23.760
idea on board. But after they've got over that, they'd still understand that even though I have no
link |
01:02:28.960
free will and my actions are in effect already determined, I still feel things. I still care
link |
01:02:38.000
about stuff. I remember my daughter saying to me, she'd got hold of the idea that my view of the
link |
01:02:46.240
universe made it meaningless. And she said, well, then it's meaningless. And I said, well, I can
link |
01:02:51.120
prove it's not meaningless because you mean something to me and I mean something to you. So
link |
01:02:56.320
it's not completely meaningless because there is a bit of meaning contained within this space. And so
link |
01:03:02.720
with a lack of free will space, you could think, well, this robs me of everything I am. And then
link |
01:03:08.560
you'd say, well, no, it doesn't because you still like eating cheeseburgers. And you still like
link |
01:03:12.640
going to see the movies. And so how big a difference does it really make? But I think initially,
link |
01:03:19.440
people would find it very disturbing. I think that, if you could really unlock everything
link |
01:03:26.240
with a determinism machine, there'd be this wonderful wisdom that would come from it.
link |
01:03:30.880
And I'd rather have that than not. So that's a really good example of a technology revealing
link |
01:03:37.760
to us humans something fundamental about our world, about our society. So it's almost as if this
link |
01:03:43.760
creation is helping us understand ourselves. The same could be said about artificial intelligence.
link |
01:03:51.280
So what do you think us creating something like Ava will help us understand about ourselves?
link |
01:03:58.080
How will that change society? Well, I would hope it would teach us some humility.
link |
01:04:03.840
Humans are very big on exceptionalism. America is constantly proclaiming itself to be the greatest
link |
01:04:12.640
nation on earth, which may feel like that if you're an American, but it may not feel like that
link |
01:04:18.080
if you're from Finland, because there's all sorts of things you dearly love about Finland.
link |
01:04:22.160
And exceptionalism is usually bullshit, probably not always. If we both sat here, we could find a
link |
01:04:29.200
good example of something that isn't, but as a rule of thumb it is. And what it would do is teach
link |
01:04:35.680
us some humility. And, you know, actually, often that's what science does in a funny way.
link |
01:04:41.680
It makes us more and more interesting, but it makes us a smaller and smaller part of the thing
link |
01:04:45.680
that's interesting. And I don't mind that humility at all. I don't think it's a bad thing. Our excesses
link |
01:04:53.280
don't tend to come from humility. You know, our excesses come from the opposite, megalomania and
link |
01:04:59.280
stuff. We tend to think of consciousness as having some form of exceptionalism attached to it. I
link |
01:05:06.160
suspect if we ever unravel it, it will turn out to be less than we thought in a way.
link |
01:05:12.720
And perhaps your very own exceptionalist assertion earlier on in our conversation that
link |
01:05:18.880
consciousness is something that belongs to us humans, or not humans, but living organisms,
link |
01:05:24.240
maybe you will one day find out that consciousness is in everything. And that
link |
01:05:30.320
will humble you. If that was true, it would certainly humble me, although maybe,
link |
01:05:36.240
almost maybe, I don't know, I don't know what effect that would have. I sort of, I mean,
link |
01:05:44.880
my understanding of that principle is along the lines of, say, that an electron has a preferred
link |
01:05:51.040
state, or it may or may not pass through a bit of glass, it may reflect off or it may go through
link |
01:05:57.760
or something like that. And so that feels as if a choice has been made. But if I'm going down the
link |
01:06:08.480
fully deterministic route, I would say there's just an underlying determinism that has defined
link |
01:06:13.600
the preferred state or the reflection or non-reflection. So, but look,
link |
01:06:19.040
yeah, you're right. If it turned out that there was a thing that it was like to be the sun, then
link |
01:06:24.720
I would, I'd be amazed and humbled and I'd be happy to be both. That sounds pretty cool.
link |
01:06:29.920
And you'll say the same thing as you said to your daughter: it nevertheless
link |
01:06:33.360
feels like something to be me, and that's pretty damn good. So Kubrick created many masterpieces,
link |
01:06:42.000
including The Shining, Dr. Strangelove, A Clockwork Orange. But to me, he will be remembered,
link |
01:06:48.560
I think, by many, 100 years from now, for 2001: A Space Odyssey. Honestly, I would say that's his greatest
link |
01:06:54.240
film. I agree. You are incredibly humble. I listened to a bunch of your interviews, and I
link |
01:07:02.640
really appreciate that you're humble in your creative efforts and your work. But if I were to
link |
01:07:08.800
force you, at gunpoint, to imagine 100 years out into the
link |
01:07:18.720
future: what will Alex Garland be remembered for, from something you've created already, or something you
link |
01:07:26.160
feel somewhere deep inside you may still create? Well, okay, well, I'll take
link |
01:07:32.240
the question in the spirit it was asked, which is very generous. Gunpoint. What I try to do, and
link |
01:07:44.480
therefore what I hope, yeah, if I'm remembered, what I might be remembered for, is as someone who
link |
01:07:53.200
participates in a conversation. And I think that often what happens is people don't participate
link |
01:07:59.920
in conversations, they make proclamations, they make statements, and people can either react
link |
01:08:05.840
against the statement or can fall in line behind it. And I don't like that. So I want to be part
link |
01:08:11.600
of a conversation. I take as a sort of basic principle, I think I take lots of my cues from
link |
01:08:17.040
science, but one of the best ones it seems to me is that when a scientist has something proved wrong
link |
01:08:22.160
that they previously believed in, they then have to abandon that position. So I'd like to be someone
link |
01:08:27.600
who is allied to that sort of thinking. So part of an exchange of ideas. And the exchange of ideas
link |
01:08:36.560
for me is something like: people in your world show me things about how the world works. And then I
link |
01:08:42.960
say, this is how I feel about what you've told me. And then other people can react to that.
link |
01:08:47.840
And it's not, it's not to say this is how the world is. It's just to say, it is interesting
link |
01:08:54.400
to think about the world in this way. And the conversation is one of the things I'm really
link |
01:09:00.320
hopeful about in your works. The conversation you're having is with the viewer, in the sense that
link |
01:09:06.800
you, and several others, are very much bringing a sort of intellectual depth back
link |
01:09:15.520
to cinema, and now to series, sort of allowing film to be something that, yeah, sparks a conversation,
link |
01:09:28.480
is a conversation, lets people think.
link |
01:09:32.720
But also, it's very important for me that if that conversation is going to be a good conversation,
link |
01:09:37.760
what that must involve is that someone like you who understands AI, and I imagine understands a
link |
01:09:44.960
lot about quantum mechanics, if they then watch the narrative, feels: yes, this is a fair account.
link |
01:09:51.360
So it is a worthy addition to the conversation. That for me is hugely important. I'm not interested
link |
01:09:57.680
in getting that stuff wrong. I'm only interested in trying to get it right.
link |
01:10:00.880
Alex, it was truly an honor to talk to you. I really appreciate it. I really enjoyed it. Thank you so
link |
01:10:08.080
much. Thank you. Thanks, man. Thanks for listening to this conversation with Alex Garland. And thank
link |
01:10:14.720
you to our presenting sponsor, Cash App. Download it, use code LEXPODCAST, you'll get $10 and
link |
01:10:21.280
$10 will go to FIRST, an organization that inspires and educates young minds to become
link |
01:10:26.560
science and technology innovators of tomorrow. If you enjoy this podcast, subscribe on YouTube,
link |
01:10:32.400
give it five stars in Apple Podcasts, support on Patreon, or simply connect with me on Twitter,
link |
01:10:37.680
at Lex Fridman. And now, let me leave you with a question from Ava, the central artificial
link |
01:10:44.720
intelligence character in the movie Ex Machina, that she asked during her Turing test.
link |
01:10:50.000
What will happen to me if I fail your test? Thank you for listening and hope to see you next time.