
David Eagleman: Neuroplasticity and the Livewired Brain | Lex Fridman Podcast #119



link |
00:00:00.000
The following is a conversation with David Eagleman,
link |
00:00:02.980
a neuroscientist and one of the great science communicators
link |
00:00:06.360
of our time, exploring the beauty and mystery
link |
00:00:09.260
of the human brain.
link |
00:00:10.780
He's an author of a lot of amazing books
link |
00:00:13.140
about the human mind, and his new one called Livewired.
link |
00:00:18.060
Livewired is a work of 10 years on a topic
link |
00:00:21.200
that is fascinating to me, which is neuroplasticity
link |
00:00:24.700
or the malleability of the human brain.
link |
00:00:27.740
Quick summary of the sponsors.
link |
00:00:29.260
Athletic Greens, BetterHelp, and Cash App.
link |
00:00:32.700
Click the sponsor links in the description
link |
00:00:34.500
to get a discount and to support this podcast.
link |
00:00:37.820
As a side note, let me say that the adaptability
link |
00:00:41.420
of the human mind at the biological, chemical,
link |
00:00:44.380
cognitive, psychological, and even sociological levels
link |
00:00:48.500
is the very thing that captivated me many years ago
link |
00:00:51.860
when I first began to wonder how I would engineer
link |
00:00:54.980
something like it in the machine.
link |
00:00:57.760
The open question today in the 21st century
link |
00:01:00.700
is what are the limits of this adaptability?
link |
00:01:03.700
As new, smarter and smarter devices and AI systems
link |
00:01:07.420
come to life, or as better and better brain computer
link |
00:01:10.420
interfaces are engineered, will our brain be able to adapt,
link |
00:01:13.820
to catch up, to excel?
link |
00:01:16.580
I personally believe yes, that we're far from reaching
link |
00:01:19.540
the limitation of the human mind and the human brain,
link |
00:01:23.100
just as we are far from reaching the limitations
link |
00:01:26.060
of our computational systems.
link |
00:01:28.860
If you enjoy this thing, subscribe on YouTube,
link |
00:01:31.420
review it with five stars on Apple Podcast,
link |
00:01:33.580
follow on Spotify, support on Patreon,
link |
00:01:36.240
or connect with me on Twitter at Lex Fridman.
link |
00:01:40.100
As usual, I'll do a few minutes of ads now
link |
00:01:41.980
and no ads in the middle.
link |
00:01:43.460
I try to make these interesting,
link |
00:01:45.100
but I give you timestamps so you can skip.
link |
00:01:47.940
But please do check out the sponsors
link |
00:01:49.740
by clicking the links in the description.
link |
00:01:51.660
It's the best way to support this podcast.
link |
00:01:55.000
This show is brought to you by Athletic Greens,
link |
00:01:58.300
the all in one daily drink to support better health
link |
00:02:01.140
and peak performance.
link |
00:02:02.680
Even with a balanced diet, it's difficult to cover
link |
00:02:04.800
all of your nutritional bases.
link |
00:02:07.020
That's where Athletic Greens will help.
link |
00:02:09.140
Their daily drink is like nutritional insurance
link |
00:02:11.820
for your body as delivered straight to your door.
link |
00:02:15.380
As you may know, I fast often, sometimes intermittent fasting
link |
00:02:18.580
for 16 hours, sometimes 24 hours,
link |
00:02:21.860
dinner to dinner, sometimes more.
link |
00:02:24.500
I break the fast with Athletic Greens.
link |
00:02:26.900
It's delicious, refreshing, just makes me feel good.
link |
00:02:30.980
I think it's like 50 calories, less than a gram of sugar,
link |
00:02:34.380
but has a ton of nutrients to make sure my body
link |
00:02:36.460
has what it needs despite what I'm eating.
link |
00:02:40.200
Go to athleticgreens.com slash lex
link |
00:02:43.100
to claim a special offer of a free vitamin D3K2 for a year.
link |
00:02:49.140
If you listen to the Joe Rogan Experience,
link |
00:02:51.420
you might've listened to him rant about
link |
00:02:53.340
how awesome vitamin D is for your immune system.
link |
00:02:56.140
So there you have it.
link |
00:02:57.580
So click the athleticgreens.com slash lex
link |
00:03:00.940
in the description to get the free stuff
link |
00:03:03.420
and to support this podcast.
link |
00:03:06.400
This show is sponsored by BetterHelp, spelled H E L P, help.
link |
00:03:11.460
Check it out at betterhelp.com slash lex.
link |
00:03:14.200
They figure out what you need and match you
link |
00:03:15.860
with a licensed professional therapist in under 48 hours.
link |
00:03:19.340
It's not a crisis line, it's not self help,
link |
00:03:21.820
it's professional counseling done securely online.
link |
00:03:25.500
I'm a bit from the David Goggins line of creatures
link |
00:03:28.100
and so have some demons to contend with,
link |
00:03:30.680
usually on long runs or all nighters full of self doubt.
link |
00:03:34.940
I think suffering is essential for creation,
link |
00:03:37.880
but you can suffer beautifully
link |
00:03:39.540
in a way that doesn't destroy you.
link |
00:03:41.660
For most people, I think a good therapist can help on this.
link |
00:03:45.300
So it's at least worth a try.
link |
00:03:47.140
Check out their reviews, they're good.
link |
00:03:49.260
It's easy, private, affordable, available worldwide.
link |
00:03:52.660
You can communicate by text anytime
link |
00:03:54.760
and schedule a weekly audio and video session.
link |
00:03:58.280
Check it out at betterhelp.com slash lex.
link |
00:04:02.420
This show is presented by Cash App,
link |
00:04:04.620
the number one finance app in the App Store.
link |
00:04:06.660
When you get it, use code lexpodcast.
link |
00:04:09.280
Cash App lets you send money to friends, buy Bitcoin,
link |
00:04:11.660
invest in the stock market with as little as $1.
link |
00:04:14.540
Since Cash App allows you to buy Bitcoin,
link |
00:04:16.880
let me mention that cryptocurrency
link |
00:04:18.480
in the context of the history of money is fascinating.
link |
00:04:21.860
I recommend Ascent of Money as a great book on this history.
link |
00:04:25.420
Debits and credits on ledgers started around 30,000 years ago
link |
00:04:29.980
and the first decentralized cryptocurrency
link |
00:04:32.260
released just over 10 years ago.
link |
00:04:34.220
So given that history, cryptocurrency is still very much
link |
00:04:37.420
in its early days of development,
link |
00:04:39.340
but it's still aiming to
link |
00:04:40.700
and just might redefine the nature of money.
link |
00:04:44.880
So again, if you get Cash App from the App Store or Google Play
link |
00:04:47.900
and use code lexpodcast, you get $10
link |
00:04:51.020
and Cash App will also donate $10 to FIRST,
link |
00:04:53.940
an organization that is helping to advance robotics
link |
00:04:56.980
and STEM education for young people around the world.
link |
00:05:00.420
And now here's my conversation with David Eagleman.
link |
00:05:05.780
You have a new book coming out on the changing brain.
link |
00:05:10.360
Can you give a high level overview of the book?
link |
00:05:13.260
It's called Livewired by the way.
link |
00:05:14.620
Yeah, the thing is we typically think about the brain
link |
00:05:17.780
in terms of the metaphors we already have,
link |
00:05:19.500
like hardware and software, that's how we build
link |
00:05:21.820
all our stuff, but what's happening in the brain
link |
00:05:24.420
is fundamentally so different.
link |
00:05:26.860
So I coined this new term liveware,
link |
00:05:29.060
which is a system that's constantly reconfiguring itself
link |
00:05:32.400
physically as it learns and adapts to the world around it.
link |
00:05:37.100
It's physically changing.
link |
00:05:38.660
So it's liveware meaning like hardware but changing.
link |
00:05:43.620
Yeah, exactly.
link |
00:05:44.560
Well, the hardware and the software layers are blended
link |
00:05:47.500
and so typically engineers are praised for their efficiency
link |
00:05:53.660
and making something really clean and clear,
link |
00:05:55.220
like, okay, here's the hardware layer,
link |
00:05:56.380
then I'm gonna run software on top of it.
link |
00:05:57.780
And there's all sorts of universality that you get out
link |
00:06:00.300
of a piece of hardware like that that's useful.
link |
00:06:02.840
But what the brain is doing is completely different.
link |
00:06:05.240
And I am so excited about where this is all going
link |
00:06:08.360
because I feel like this is where our engineering will go.
link |
00:06:13.360
So currently we build all our devices a particular way,
link |
00:06:17.140
but I can't tear half the circuitry out of your cell phone
link |
00:06:20.620
and expect it to still function.
link |
00:06:22.660
But you can do that with the brain.
link |
00:06:26.460
So just as an example, kids who are under
link |
00:06:29.260
about seven years old can get one half of their brain
link |
00:06:32.060
removed, it's called a hemispherectomy, and they're fine.
link |
00:06:35.700
They have a slight limp on the other side of their body,
link |
00:06:37.420
but they can function just fine that way.
link |
00:06:40.540
And this is generally true.
link |
00:06:42.380
You know, sometimes children are born without a hemisphere
link |
00:06:45.220
and their visual system rewires so that everything is
link |
00:06:48.580
on the single remaining hemisphere.
link |
00:06:52.140
What thousands of cases like this teach us
link |
00:06:55.420
is that it's a very malleable system that is simply trying
link |
00:06:59.940
to accomplish the tasks in front of it by rewiring itself
link |
00:07:04.060
with the available real estate.
link |
00:07:06.100
How much of that is a quirk or a feature of evolution?
link |
00:07:09.860
Like, how hard is it to engineer?
link |
00:07:11.860
Because evolution took a lot of work.
link |
00:07:14.620
Trillions of organisms had to die for it to create
link |
00:07:18.500
this thing we have in our skull.
link |
00:07:21.580
Like, because you said you kind of look forward to the idea
link |
00:07:24.540
that we might be engineering our systems like this
link |
00:07:27.940
in the future, like creating liveware systems.
link |
00:07:30.740
How hard do you think is it to create systems like that?
link |
00:07:33.180
Great question.
link |
00:07:34.020
It has proven itself to be a difficult challenge.
link |
00:07:37.060
What I mean by that is even though it's taken evolution
link |
00:07:40.260
a really long time to get where it is now,
link |
00:07:44.260
all we have to do now is peek at the blueprints.
link |
00:07:47.620
It's just three pounds, this organ,
link |
00:07:49.620
and we just figure out how to do it.
link |
00:07:50.860
But that's the part that I mean is a difficult challenge
link |
00:07:52.760
because there are tens of thousands of neuroscientists,
link |
00:07:57.100
we're all poking and prodding and trying to figure this out,
link |
00:07:59.260
but it's an extremely complicated system.
link |
00:08:00.700
But it's only gonna be complicated until we figure out
link |
00:08:03.300
the general principles.
link |
00:08:05.380
Exactly like if you had a magic camera
link |
00:08:09.420
you could look inside the nucleus of a cell
link |
00:08:10.940
and you'd see hundreds of thousands of things
link |
00:08:13.220
moving around or whatever,
link |
00:08:14.180
and then it takes Crick and Watson to say,
link |
00:08:16.060
oh, you know what, you're just trying to maintain
link |
00:08:17.380
the order of the base pairs and all the rest is details.
link |
00:08:20.300
Then it simplifies it and we come to understand something.
link |
00:08:23.500
That was my goal in Livewired,
link |
00:08:25.100
which I've written over 10 years, by the way,
link |
00:08:26.900
is to try to distill things down to the principles
link |
00:08:29.920
of what plastic systems are trying to accomplish.
link |
00:08:34.180
But to even just linger, you said,
link |
00:08:36.260
it's possible to be born with just one hemisphere
link |
00:08:38.820
and you still are able to function.
link |
00:08:41.940
First of all, just to pause on that,
link |
00:08:43.460
I mean, that's kind of, that's amazing.
link |
00:08:47.940
I don't know if people quite,
link |
00:08:50.100
I mean, you kind of hear things here and there.
link |
00:08:51.740
This is why I'm kind of, I'm really excited about your book
link |
00:08:54.300
is I don't know if there's definitive sort of popular sources
link |
00:09:00.020
to think about this stuff.
link |
00:09:01.500
I mean, there's a lot of, I think from my perspective,
link |
00:09:05.060
what I heard is there's like been debates over decades
link |
00:09:07.940
about how much neuroplasticity there is in the brain
link |
00:09:12.060
and so on, and people have learned a lot of things
link |
00:09:14.860
and now it's converging towards people
link |
00:09:16.900
who understand that it's much more plastic
link |
00:09:20.380
than people realize.
link |
00:09:21.440
But just like linger on that topic,
link |
00:09:23.900
like how malleable is the hardware of the human brain?
link |
00:09:28.380
Maybe you said children at each stage of life.
link |
00:09:32.300
Yeah, so here's the whole thing.
link |
00:09:33.680
I think part of the confusion about plasticity
link |
00:09:36.960
has been that there are studies
link |
00:09:38.220
at all sorts of different ages,
link |
00:09:40.340
and then people might read that from a distance
link |
00:09:42.400
and they think, oh, well, Fred didn't recover
link |
00:09:45.360
when half his brain was taken out
link |
00:09:47.060
and so clearly you're not plastic,
link |
00:09:49.020
but then you do it with a child and they are plastic.
link |
00:09:52.180
And so part of my goal here was to pull together
link |
00:09:56.420
the tens of thousands of papers on this,
link |
00:09:59.060
both from clinical work and from all the way down
link |
00:10:02.460
to the molecular and understand
link |
00:10:04.140
what are the principles here?
link |
00:10:04.980
The principles are that plasticity diminishes,
link |
00:10:08.260
that's no surprise.
link |
00:10:09.540
By the way, maybe I should just define plasticity.
link |
00:10:11.540
It's the ability of a system to mold into a new shape
link |
00:10:14.840
and then hold that shape.
link |
00:10:16.260
That's why we make things that we call plastic
link |
00:10:20.740
because they are moldable and they can hold that new shape,
link |
00:10:23.620
like a plastic toy or something.
link |
00:10:25.240
And so maybe we'll use a lot of terms that are synonymous.
link |
00:10:29.480
So something is plastic, something is malleable,
link |
00:10:34.460
changing, livewired, the name of the book, are like synonyms.
link |
00:10:38.740
So I'll tell you, exactly right,
link |
00:10:39.880
but I'll tell you why I chose livewired
link |
00:10:41.740
instead of plasticity.
link |
00:10:42.980
So I use the term plasticity in the book, but sparingly,
link |
00:10:47.140
because that was a term coined by William James
link |
00:10:51.060
over a hundred years ago and he was, of course,
link |
00:10:53.980
very impressed with plastic manufacturing
link |
00:10:55.820
that you could mold something into shape
link |
00:10:57.660
and then it holds that.
link |
00:10:58.540
But that's not what's actually happening in the brain.
link |
00:11:01.420
It's constantly rewiring your entire life.
link |
00:11:03.700
You never hit an end point.
link |
00:11:06.380
The whole point is for it to keep changing.
link |
00:11:08.220
So even in the few minutes of conversation
link |
00:11:11.020
that we've been having, your brain is changing,
link |
00:11:12.560
my brain is changing.
link |
00:11:15.460
Next time I see your face, I will remember,
link |
00:11:18.020
oh yeah, like that time Lex and I sat together
link |
00:11:19.860
and we did these things.
link |
00:11:21.420
I wonder if your brain will have like a Lex thing
link |
00:11:24.020
going on for the next few months.
link |
00:11:25.300
Like it'll stay there until you get rid of it
link |
00:11:27.860
because it was useful for now.
link |
00:11:29.340
Yeah, no, I'll probably never get rid of it.
link |
00:11:30.820
Let's say for some circumstance,
link |
00:11:32.000
you and I don't see each other for the next 35 years.
link |
00:11:34.300
When I run into you, I'll be like, oh yeah.
link |
00:11:36.220
That looks familiar.
link |
00:11:37.460
Yeah, yeah, we sat down for a podcast
link |
00:11:40.420
back when there were podcasts.
link |
00:11:42.460
Exactly.
link |
00:11:43.300
Back when we lived outside virtual reality.
link |
00:11:46.140
Exactly.
link |
00:11:47.020
So you chose livewired over plastic.
link |
00:11:50.180
Exactly, because plastic implies,
link |
00:11:52.820
I mean, it's the term that's used in the field
link |
00:11:54.380
and so that's why we need to use it still for a while.
link |
00:11:57.780
But yeah, it implies something gets molded into shape
link |
00:11:59.620
and then holds that shape forever.
link |
00:12:00.780
But in fact, the whole system is completely changing.
link |
00:12:03.220
Then back to how malleable is the human brain
link |
00:12:07.700
at each stage of life.
link |
00:12:08.820
So what, just at a high level, is it malleable?
link |
00:12:13.940
So yes, and plasticity diminishes.
link |
00:12:17.220
But one of the things that I felt like
link |
00:12:19.900
I was able to put together for myself
link |
00:12:21.740
after reading thousands of papers on this issue
link |
00:12:23.900
is that different parts of the brain
link |
00:12:26.880
have different plasticity windows.
link |
00:12:30.820
So for example, with the visual cortex,
link |
00:12:33.140
that cements itself into place pretty quickly
link |
00:12:35.940
over the course of a few years.
link |
00:12:37.740
And I argue that's because of the stability of the data.
link |
00:12:41.020
In other words, what you're getting in from the world,
link |
00:12:43.940
you've got a certain number of angles, colors, shapes.
link |
00:12:47.220
It's essentially the world is visually stable.
link |
00:12:49.940
So that hardens around that data.
link |
00:12:52.420
As opposed to, let's say, the somatosensory cortex,
link |
00:12:55.100
which is the part that's taking information
link |
00:12:56.540
from your body, or the motor cortex right next to it,
link |
00:12:58.580
which is what drives your body.
link |
00:13:00.380
The fact is, bodies are always changing.
link |
00:13:01.880
You get taller over time, you get fatter, thinner,
link |
00:13:05.100
over time, you might break a leg
link |
00:13:06.980
and have to limp for a while, stuff like that.
link |
00:13:08.560
So because the data there is always changing,
link |
00:13:11.740
by the way, you might get on a bicycle,
link |
00:13:12.900
you might get on a surfboard, things like that.
link |
00:13:15.660
Because the data is always changing,
link |
00:13:16.780
that stays more malleable.
link |
00:13:19.260
And when you look through the brain,
link |
00:13:20.860
you find that it appears to be this,
link |
00:13:23.940
how stable the data is determines how fast
link |
00:13:26.660
something hardens into place.
link |
00:13:28.060
But the point is, different parts of the brain
link |
00:13:30.000
harden into place at different times.
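The idea that input stability determines how fast a region hardens can be sketched as a toy simulation. Everything here, the function name, the constants, and the surprise-driven decay rule, is an illustrative assumption of mine, not a model from the book:

```python
import random

def simulate_hardening(stable: bool, steps: int = 5000, seed: int = 0) -> float:
    """Toy model of a cortical area tracking its input stream.

    Plasticity (the learning rate) decays while predictions match the data
    and is replenished by surprise, so a stable stream 'hardens' the area
    while a drifting stream keeps it malleable.
    """
    rng = random.Random(seed)
    estimate, plasticity = 0.0, 1.0
    for _ in range(steps):
        # Stable data (like the visual world) vs. drifting data (like a body
        # that keeps changing shape).
        target = 1.0 if stable else 1.0 + 0.5 * rng.gauss(0.0, 1.0)
        error = target - estimate
        estimate += 0.1 * plasticity * error
        # Small errors let plasticity decay; surprise replenishes it.
        plasticity = min(1.0, 0.999 * plasticity + 0.05 * abs(error))
    return plasticity
```

Under these assumptions, the residual plasticity after a stable stream ends up far lower than after a drifting one, which is the "hardening around the data" described above.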
link |
00:13:31.940
Do you think it's possible that,
link |
00:13:35.020
depending on how much data you get on different sensors,
link |
00:13:38.700
that it stays more malleable longer?
link |
00:13:41.480
So like, if you look at different cultures
link |
00:13:44.220
that experience, like if you keep your eyes closed,
link |
00:13:47.580
or maybe you're blind, I don't know,
link |
00:13:48.860
but let's say you keep your eyes closed
link |
00:13:51.180
for your entire life, then the visual cortex
link |
00:13:55.820
might be much less malleable.
link |
00:13:58.660
The reason I bring that up is like,
link |
00:14:01.300
well maybe we'll talk about brain computer interfaces
link |
00:14:03.340
a little bit down the line, but is this,
link |
00:14:08.500
is the malleability a genetic thing,
link |
00:14:11.300
or is it more about the data, like you said, that comes in?
link |
00:14:14.380
Ah, so the malleability itself is a genetic thing.
link |
00:14:17.980
The big trick that Mother Nature discovered with humans
link |
00:14:20.840
is make a system that's really flexible,
link |
00:14:24.220
as opposed to most other creatures to different degrees.
link |
00:14:28.100
So if you take an alligator, it's born,
link |
00:14:31.220
its brain does the same thing every generation.
link |
00:14:34.180
If you compare an alligator 100,000 years ago
link |
00:14:36.220
to an alligator now, they're essentially the same.
link |
00:14:39.660
We, on the other hand, as humans,
link |
00:14:41.360
drop into a world with a half baked brain,
link |
00:14:44.220
and what we require is to absorb the culture around us,
link |
00:14:48.180
and the language, and the beliefs, and the customs,
link |
00:14:50.300
and so on, that's what Mother Nature has done with us,
link |
00:14:55.060
and it's been a tremendously successful trick
link |
00:14:57.340
we've taken over the whole planet as a result of this.
link |
00:15:00.120
So that's an interesting point,
link |
00:15:01.220
I mean, just to linger on it, that,
link |
00:15:03.140
I mean, this is a nice feature,
link |
00:15:05.400
like if you were to design a thing
link |
00:15:07.060
to survive in this world, do you put it at age zero
link |
00:15:11.780
already equipped to deal with the world
link |
00:15:14.580
in a hard coded way, or do you put it,
link |
00:15:17.740
do you make it malleable and just throw it in,
link |
00:15:19.580
take the risk that you're maybe going to die,
link |
00:15:23.380
but you're going to learn a lot in the process,
link |
00:15:25.220
and if you don't die, you'll learn a hell of a lot
link |
00:15:27.620
to be able to survive in the environment.
link |
00:15:29.300
So this is the experiment that Mother Nature ran,
link |
00:15:31.340
and it turns out that, for better or worse, we've won.
link |
00:15:34.980
I mean, yeah, we put other animals in the zoos,
link |
00:15:37.500
and we, yeah, that's right.
link |
00:15:38.620
AI might do better.
link |
00:15:39.580
Okay, fair enough, that's true.
link |
00:15:41.320
And maybe what the trick Mother Nature did
link |
00:15:43.620
is just the stepping stone to AI, but.
link |
00:15:46.660
So that's a beautiful feature of the human brain,
link |
00:15:50.220
that it's malleable, but let's,
link |
00:15:52.420
on the topic of Mother Nature, what do we start with?
link |
00:15:56.380
Like, how blank is the slate?
link |
00:15:58.420
Ah, so it's not actually a blank slate.
link |
00:16:01.060
What it is, is terrific engineering that's set up in there,
link |
00:16:05.340
but much of that engineering has to do with,
link |
00:16:07.700
okay, just make sure that things get to the right place.
link |
00:16:10.100
For example, like the fibers from the eyes
link |
00:16:12.260
getting to the visual cortex,
link |
00:16:13.840
or all this very complicated machinery in the ear
link |
00:16:15.640
getting to the auditory cortex, and so on.
link |
00:16:17.700
So things, first of all, there's that.
link |
00:16:19.860
And then what we also come equipped with
link |
00:16:21.520
is the ability to absorb language
link |
00:16:24.580
and culture and beliefs, and so on.
link |
00:16:27.060
So you're already set up for that.
link |
00:16:28.600
So no matter what you're exposed to,
link |
00:16:30.020
you will absorb some sort of language.
link |
00:16:32.940
That's the trick, is how do you engineer something
link |
00:16:34.980
just enough that it's then a sponge
link |
00:16:37.300
that's ready to take in and fill in the blanks?
link |
00:16:40.060
How much of the malleability is hardware?
link |
00:16:42.460
How much is software?
link |
00:16:43.300
Is that useful at all in the brain?
link |
00:16:45.100
So what are we talking about?
link |
00:16:46.980
So there's neurons, there's synapses,
link |
00:16:52.100
and all kinds of different synapses,
link |
00:16:54.060
and there's chemical communication,
link |
00:16:55.900
like electrical signals,
link |
00:16:57.060
and there's chemical communication from the synapses.
link |
00:17:04.020
I would say the software would be the timing
link |
00:17:09.020
and the nature of the electrical signals, I guess,
link |
00:17:11.540
and the hardware would be the actual synapses.
link |
00:17:14.140
So here's the thing, this is why I really, if we can,
link |
00:17:16.860
I wanna get away from the hardware and software metaphor
link |
00:17:19.580
because what happens is,
link |
00:17:21.900
as activity passes through the system, it changes things.
link |
00:17:25.100
Now, the thing that computer engineers
link |
00:17:27.880
are really used to thinking about is synapses,
link |
00:17:30.300
where two neurons connect.
link |
00:17:31.700
Of course, each neuron connects with 10,000 of its neighbors,
link |
00:17:33.700
but at a point where they connect,
link |
00:17:36.860
what we're all used to thinking about
link |
00:17:37.700
is the changing of the strength of that connection,
link |
00:17:40.820
the synaptic weight.
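The synaptic-weight change that engineers usually model can be written as a minimal Hebbian update. This is a generic textbook sketch with arbitrary constants, not Eagleman's model:

```python
def hebbian_update(weight: float, pre: float, post: float,
                   lr: float = 0.01, decay: float = 0.001) -> float:
    """One step of a Hebbian-style weight change: the connection strengthens
    with correlated pre- and postsynaptic activity ('fire together, wire
    together') and passively decays otherwise."""
    return weight + lr * pre * post - decay * weight

# Repeated correlated activity strengthens the synapse.
w = 0.5
for _ in range(100):
    w = hebbian_update(w, pre=1.0, post=1.0)
# With no activity (pre = post = 0), the same synapse slowly weakens instead.
```

The point of the surrounding passage is that this single scalar per connection is only the most measurable sliver of what is actually changing.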
link |
00:17:42.240
But in fact, everything is changing.
link |
00:17:44.560
The receptor distribution inside that neuron
link |
00:17:47.440
so that you're more or less sensitive
link |
00:17:49.600
to the neurotransmitter,
link |
00:17:50.760
then the structure of the neuron itself
link |
00:17:53.440
and what's happening there,
link |
00:17:54.440
all the way down to biochemical cascades inside the cell,
link |
00:17:57.360
all the way down to the nucleus,
link |
00:17:59.200
and for example, the epigenome,
link |
00:18:00.920
which is these little proteins that are attached to the DNA
link |
00:18:05.600
that cause conformational changes,
link |
00:18:07.680
that cause more genes to be expressed or repressed.
link |
00:18:11.200
All of these things are plastic.
link |
00:18:13.640
The reason that most people only talk
link |
00:18:15.960
about the synaptic weights
link |
00:18:17.560
is because that's really all we can measure well.
link |
00:18:20.560
And all this other stuff is really, really hard to see
link |
00:18:22.600
with our current technology.
link |
00:18:23.600
So essentially, that just gets ignored.
link |
00:18:25.840
But in fact, the system is plastic
link |
00:18:27.720
at all these different levels.
link |
00:18:28.880
And my way of thinking about this
link |
00:18:33.640
is an analogy to pace layers.
link |
00:18:37.560
So pace layers is a concept that Stewart Brand
link |
00:18:40.080
suggested about how to think about cities.
link |
00:18:43.000
So you have fashion, which changes rapidly in cities.
link |
00:18:46.520
You have governance, which changes more slowly.
link |
00:18:52.040
You have the structure, the buildings of a city,
link |
00:18:54.440
which changes more slowly, all the way down to nature.
link |
00:18:57.760
You've got all these different layers of things
link |
00:18:59.560
that are changing at different paces, at different speeds.
link |
00:19:02.680
I've taken that idea and mapped it onto the brain,
link |
00:19:05.680
which is to say you have some biochemical cascades
link |
00:19:08.480
that are just changing really rapidly
link |
00:19:10.120
when something happens, all the way down to things
link |
00:19:11.840
that are more and more cemented in there.
link |
00:19:14.520
And this actually allows us to understand a lot
link |
00:19:19.320
about particular kinds of things that happen.
link |
00:19:20.920
For example, one of the oldest,
link |
00:19:22.160
probably the oldest rule in neurology
link |
00:19:24.680
is called Ribot's Law, which is that older memories
link |
00:19:27.520
are more stable than newer memories.
link |
00:19:29.600
So when you get old and demented,
link |
00:19:32.640
you'll be able to remember things from your young life.
link |
00:19:36.160
Maybe you'll remember this podcast,
link |
00:19:37.400
but you won't remember what you did
link |
00:19:39.320
a month ago or a year ago.
link |
00:19:41.520
And this is a very weird structure, right?
link |
00:19:43.440
No other system works this way,
link |
00:19:44.640
where older memories are more stable than newer memories.
link |
00:19:48.320
But it's because through time,
link |
00:19:52.000
things get more and more cemented
link |
00:19:53.520
into deeper layers of the system.
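Ribot's law falls out naturally from a two-timescale sketch: a volatile fast trace that gradually leaks into a slowly decaying consolidated layer. The class, rates, and names below are hypothetical illustrations of that idea, not anything from the conversation:

```python
from dataclasses import dataclass

@dataclass
class MemoryTrace:
    fast: float = 1.0   # volatile layer, written at encoding time
    slow: float = 0.0   # consolidated layer, built up gradually

def tick(m: MemoryTrace, transfer: float = 0.01,
         fast_decay: float = 0.02, slow_decay: float = 0.0001) -> None:
    """One time step of consolidation: the fast trace leaks into the slow
    layer while both decay, the slow layer far more gently."""
    m.slow += transfer * m.fast
    m.slow *= 1 - slow_decay
    m.fast *= 1 - fast_decay

old, recent = MemoryTrace(), MemoryTrace()
for _ in range(1000):   # the old memory has had 1000 steps to consolidate
    tick(old)
for _ in range(10):     # the recent one, only 10
    tick(recent)

# Damage that wipes the volatile layer (the Ribot's law scenario).
old.fast = recent.fast = 0.0
```

After the volatile layer is wiped, the long-ago memory survives in its large consolidated trace while the recent one has almost none, the older-is-more-stable pattern described above.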
link |
00:19:56.280
And so this is, I think, the way we have to think
link |
00:19:59.880
about the brain, not as, okay, you've got neurons,
link |
00:20:03.080
you've got synaptic weights, and that's it.
link |
00:20:04.880
So, yeah, so the idea of liveware and livewired
link |
00:20:08.640
is that it's like a, it's a gradual, yeah,
link |
00:20:14.480
it's a gradual spectrum between software and hardware.
link |
00:20:18.240
And so the metaphor completely doesn't make sense.
link |
00:20:22.000
Cause like when you talk about software and hardware,
link |
00:20:24.480
it's really hard lines.
link |
00:20:26.480
I mean, of course, software is unlike hardware,
link |
00:20:31.480
but even hardware, but like, so there's two groups,
link |
00:20:36.320
but in the software world,
link |
00:20:37.640
there's levels of abstractions, right?
link |
00:20:39.040
There's the operating system, there's machine code,
link |
00:20:42.120
and then it gets higher and higher levels.
link |
00:20:44.520
But somehow that's actually fundamentally different
link |
00:20:46.920
than the layers of abstractions in the hardware.
link |
00:20:50.040
But in the brain, it's all like the same.
link |
00:20:53.160
And I love the city, the city metaphor.
link |
00:20:55.400
I mean, yeah, it's kind of mind blowing
link |
00:20:57.880
cause it's hard to know what to think about that.
link |
00:21:01.440
Like if I were to ask the question,
link |
00:21:04.160
this is an important question for machine learning is,
link |
00:21:07.520
how does the brain learn?
link |
00:21:09.720
So essentially you're saying that,
link |
00:21:13.880
I mean, it just learns on all of these different levels
link |
00:21:17.160
at all different paces.
link |
00:21:19.040
Exactly right.
link |
00:21:19.880
And as a result, what happens is
link |
00:21:21.360
as you practice something, you get good at something,
link |
00:21:24.480
you're physically changing the circuitry,
link |
00:21:26.600
you're adapting your brain around the thing
link |
00:21:30.000
that is relevant to you.
link |
00:21:31.280
So let's say you take up, do you know how to surf?
link |
00:21:34.920
Nope.
link |
00:21:35.760
Okay, great.
link |
00:21:36.600
So let's say you take up surfing now at this age.
link |
00:21:39.200
What happens is you'll be terrible at first,
link |
00:21:41.080
you don't know how to operate your body,
link |
00:21:42.000
you don't know how to read the waves, things like that.
link |
00:21:43.960
And through time you get better and better.
link |
00:21:45.680
What you're doing is you're burning that
link |
00:21:46.960
into the actual circuitry of your brain.
link |
00:21:48.600
You're of course conscious when you're first doing it,
link |
00:21:50.800
you're thinking about, okay, what am I doing?
link |
00:21:52.120
What's my body weight?
link |
00:21:53.600
But eventually when you become a pro at it,
link |
00:21:55.320
you are not conscious of it at all.
link |
00:21:57.160
In fact, you can't even unpack what it is that you did.
link |
00:22:00.200
Think about riding a bicycle.
link |
00:22:01.920
You can't describe how you're doing it,
link |
00:22:04.080
you're just doing it, you're changing your balance
link |
00:22:05.600
when you, you know, you do this to come to a stop.
link |
00:22:08.040
So this is what we're constantly doing
link |
00:22:10.800
is actually shaping our own circuitry
link |
00:22:14.320
based on what is relevant for us.
link |
00:22:16.000
Survival, of course, being the top thing that's relevant.
link |
00:22:18.800
But interestingly, especially with humans,
link |
00:22:21.440
we have these particular goals in our lives,
link |
00:22:23.400
computer science, neuroscience, whatever.
link |
00:22:25.600
And so we actually shape our circuitry around that.
link |
00:22:28.680
I mean, you mentioned this gets slower and slower with age,
link |
00:22:31.400
but is there, like I think I've read and spoken offline,
link |
00:22:36.080
even on this podcast with a developmental neurobiologist,
link |
00:22:41.320
I guess would be the right terminology,
link |
00:22:43.400
is like looking at the very early,
link |
00:22:45.520
like from embryonic stem cells to the creation of the brain.
link |
00:22:50.440
And like, that's mind blowing how much stuff happens there.
link |
00:22:55.320
So it's very malleable at that stage.
link |
00:22:59.280
And then, but after that,
link |
00:23:01.560
at which point does it stop being malleable?
link |
00:23:04.600
So that's the interesting thing
link |
00:23:06.040
is that it remains malleable your whole life.
link |
00:23:08.480
So even when you're an old person,
link |
00:23:10.200
you'll be able to remember new faces and names,
link |
00:23:13.120
you'll be able to learn new sorts of tasks.
link |
00:23:15.680
And thank goodness,
link |
00:23:16.520
cause the world is changing rapidly
link |
00:23:17.960
in terms of technology and so on.
link |
00:23:19.720
I just sent my mother an Alexa
link |
00:23:21.400
and she figured out how to go on the settings
link |
00:23:23.400
and do the thing.
link |
00:23:24.240
And I was really impressed that she was able to do it.
link |
00:23:26.960
So there are parts of the brain
link |
00:23:28.520
that remain malleable their whole life.
link |
00:23:30.360
The interesting part is that really your goal
link |
00:23:34.200
is to make an internal model of the world.
link |
00:23:36.080
Your goal is to say, okay,
link |
00:23:39.120
the brain is trapped in silence and darkness,
link |
00:23:42.280
and it's trying to understand
link |
00:23:43.520
how the world works out there, right?
link |
00:23:46.040
I love that image.
link |
00:23:47.080
Yeah, I guess it is.
link |
00:23:48.240
Yeah.
link |
00:23:49.080
You forget, it's like this lonely thing
link |
00:23:53.040
is sitting in its own container
link |
00:23:54.640
and trying to actually, through a few sensors,
link |
00:23:57.080
figure out what the hell's going on.
link |
00:23:58.760
You know what I sometimes think about
link |
00:23:59.800
is that movie, The Martian with Matt Damon,
link |
00:24:03.120
the, I mean, it was written in a book, of course,
link |
00:24:05.560
but the movie poster shows Matt Damon
link |
00:24:08.400
all alone on the red planet.
link |
00:24:09.800
And I think, God, that's actually what it's like
link |
00:24:12.520
to be inside your head and my head and anybody's head
link |
00:24:16.360
is that you're essentially on your own planet in there.
link |
00:24:20.200
And I'm essentially on my own planet.
link |
00:24:21.600
And everyone's got their own world
link |
00:24:24.080
where you've absorbed all of your experiences
link |
00:24:26.680
up to this moment in your life
link |
00:24:28.000
that have made you exactly who you are
link |
00:24:29.560
and same for me and everyone.
link |
00:24:31.120
And we've got this very thin bandwidth of communication.
link |
00:24:36.680
And I'll say something like,
link |
00:24:38.720
oh yeah, that tastes just like peaches.
link |
00:24:40.720
And you'll say, oh, I know what you mean.
link |
00:24:42.840
But the experience, of course,
link |
00:24:44.280
might be vastly different for us.
link |
00:24:47.680
But anyway, yes.
link |
00:24:48.520
So the brain is trapped in silence and darkness,
link |
00:24:50.520
each one of us, and what it's trying to do,
link |
00:24:53.280
this is the important part,
link |
00:24:54.120
it's trying to make an internal model
link |
00:24:55.720
of what's going on out there,
link |
00:24:56.560
as in how do I function in the world?
link |
00:24:58.800
How do I interact with other people?
link |
00:25:00.800
Do I say something nice and polite?
link |
00:25:02.680
Do I say something aggressive and mean?
link |
00:25:04.040
Do I, you know, all these things
link |
00:25:05.560
that it's putting together about the world.
link |
00:25:08.440
And I think what happens when people get older and older,
link |
00:25:12.320
it may not be that plasticity is diminishing.
link |
00:25:15.480
It may be that their internal model essentially
link |
00:25:18.160
has set itself up in a way where it says,
link |
00:25:20.120
okay, I've pretty much got
link |
00:25:21.160
a really good understanding of the world now,
link |
00:25:22.800
and I don't really need to change, right?
link |
00:25:25.400
So when much older people find themselves
link |
00:25:28.600
in a situation where they need to change,
link |
00:25:30.800
they actually are able to do it.
link |
00:25:33.280
It's just that I think this notion
link |
00:25:34.760
that we all have that plasticity diminishes
link |
00:25:36.960
as we grow older is in part
link |
00:25:38.880
because the motivation isn't there.
link |
00:25:41.480
But if you were 80 and you get fired from your job
link |
00:25:44.120
and suddenly had to figure out
link |
00:25:45.720
how to program a WordPress site or something,
link |
00:25:47.440
you'd figure it out.
link |
00:25:48.920
Got it.
link |
00:25:49.760
So the capability, the possibility of change is there.
link |
00:25:53.720
But then that's the highest challenge,
link |
00:25:57.040
the interesting challenge to this plasticity,
link |
00:26:00.880
to this liveware system.
link |
00:26:03.440
If we could talk about brain computer interfaces
link |
00:26:06.000
and Neuralink, what are your thoughts
link |
00:26:09.040
about the efforts of Elon Musk, Neuralink, BCI in general
link |
00:26:13.680
in this regard, which is adding a machine,
link |
00:26:18.320
a computer, the capability of a computer
link |
00:26:21.720
to communicate with the brain
link |
00:26:22.800
and the brain to communicate with a computer
link |
00:26:24.920
at the very basic applications
link |
00:26:26.800
and then like the futuristic kind of thoughts.
link |
00:26:28.880
Yeah, first of all, it's terrific
link |
00:26:30.920
that people are jumping in and doing that
link |
00:26:32.360
because it's clearly the future.
link |
00:26:34.480
The interesting part is our brains have pretty good methods
link |
00:26:37.280
of interacting with technology.
link |
00:26:38.760
So maybe it's your fat thumbs on a cell phone or something,
link |
00:26:41.520
or maybe it's watching a YouTube video
link |
00:26:44.600
and getting it into your eye that way.
link |
00:26:45.840
But we have pretty rapid ways of communicating
link |
00:26:48.360
with technology and getting data.
link |
00:26:49.960
So if you actually crack open the skull
link |
00:26:52.760
and go into the inner sanctum of the brain,
link |
00:26:56.520
you might be able to get a little bit faster,
link |
00:26:58.200
but I'll tell you, I'm not so sanguine
link |
00:27:03.400
on the future of that as a business.
link |
00:27:06.640
And I'll tell you why.
link |
00:27:07.480
It's because there are various ways
link |
00:27:10.040
of getting data in and out
link |
00:27:11.160
and an open head surgery is a big deal.
link |
00:27:14.480
Neurosurgeons don't wanna do it
link |
00:27:16.560
because there's always risk of death
link |
00:27:18.000
and infection on the table.
link |
00:27:19.640
And also it's not clear how many people would say,
link |
00:27:23.200
I'm gonna volunteer to get something in my head
link |
00:27:26.000
so that I can text faster, 20% faster.
link |
00:27:29.800
So I think Mother Nature surrounds the brain
link |
00:27:33.720
with this armored bunker of the skull
link |
00:27:37.840
because it's a very delicate material.
link |
00:27:39.680
And there's an expression in neurosurgery
link |
00:27:44.200
about the brain, which is that
link |
00:27:46.480
the person is never the same after you open up their skull.
link |
00:27:49.080
Now, whether or not that's true or whatever, who cares?
link |
00:27:51.640
But it's a big deal to do an open head surgery.
link |
00:27:54.080
So what I'm interested in is how can we get information
link |
00:27:57.520
in and out of the brain
link |
00:27:58.360
without having to crack the skull open?
link |
00:28:00.680
Without messing with the biological part,
link |
00:28:03.880
directly connecting or messing
link |
00:28:06.480
with the intricate biological thing that we got going on
link |
00:28:10.440
and it seems to be working.
link |
00:28:11.920
Yeah, exactly.
link |
00:28:12.760
And by the way, where Neuralink is going,
link |
00:28:15.400
which is wonderful, is going to be in patient cases.
link |
00:28:18.240
It really matters for all kinds of surgeries
link |
00:28:20.520
that a person needs,
link |
00:28:21.760
whether for Parkinson's or epilepsy or whatever.
link |
00:28:24.000
It's a terrific new technology
link |
00:28:25.680
for essentially sewing electrodes in there
link |
00:28:27.760
and getting more higher density of electrodes.
link |
00:28:30.000
So that's great.
link |
00:28:30.960
I just don't think as far as the future of BCI goes,
link |
00:28:34.400
I don't suspect that people will go in and say,
link |
00:28:38.400
yeah, drill a hole in my head and do this.
link |
00:28:40.640
Well, it's interesting
link |
00:28:41.680
because I think there's a similar intuition
link |
00:28:44.280
but say in the world of autonomous vehicles
link |
00:28:46.680
that folks know how hard it is
link |
00:28:49.400
and it seems damn impossible.
link |
00:28:51.560
The similar intuition about,
link |
00:28:52.960
I'm sticking on the Elon Musk thing
link |
00:28:54.680
is just a good, easy example.
link |
00:28:57.040
Similar intuition about colonizing Mars,
link |
00:28:59.680
like, if you really think about it,
link |
00:29:01.440
it seems extremely difficult.
link |
00:29:04.760
And almost, I mean, just technically difficult
link |
00:29:08.960
to a degree where you wanna ask,
link |
00:29:12.000
is it really worth doing, worth trying?
link |
00:29:14.840
And then the same is applied with BCI.
link |
00:29:17.880
But the thing about the future
link |
00:29:21.800
is it's hard to predict.
link |
00:29:23.720
So the exciting thing to me is,
link |
00:29:26.680
once it's successful
link |
00:29:29.640
and able to help patients,
link |
00:29:31.640
it may be able to discover something very surprising
link |
00:29:36.680
about our ability to directly communicate with the brain.
link |
00:29:39.560
So exactly what you're interested in is figuring out
link |
00:29:42.840
how to play with this malleable brain,
link |
00:29:46.640
but like help assist it somehow.
link |
00:29:49.480
I mean, it's such a compelling notion to me
link |
00:29:52.240
that we're now working on
link |
00:29:53.640
all these exciting machine learning systems
link |
00:29:55.960
that are able to learn from data.
link |
00:29:59.760
And then if we can have this other brain
link |
00:30:04.040
that's a learning system,
link |
00:30:05.600
that's live wired on the human side
link |
00:30:09.600
and the two of them are able to communicate,
link |
00:30:11.520
it's like the self play mechanism that
link |
00:30:14.120
was able to beat the world champion at Go.
link |
00:30:17.800
So they can play with each other,
link |
00:30:19.560
the computer and the brain, like when you sleep.
link |
00:30:22.440
I mean, there's a lot of futuristic kind of things
link |
00:30:24.520
that it's just exciting possibilities,
link |
00:30:27.640
but I hear you, we understand so little
link |
00:30:30.240
about the actual intricacies of the communication
link |
00:30:34.080
of the brain that it's hard to find the common language.
link |
00:30:38.480
Well, interestingly, the technologies that have been built
link |
00:30:45.800
don't actually require the perfect common language.
link |
00:30:48.320
So for example, hundreds of thousands of people
link |
00:30:51.080
are walking around with artificial ears
link |
00:30:52.440
and artificial eyes,
link |
00:30:53.440
meaning cochlear implants or retinal implants.
link |
00:30:56.920
So this is, you take essentially a digital microphone,
link |
00:31:00.320
you slip an electrode strip into the inner ear
link |
00:31:03.240
and people can learn how to hear that way,
link |
00:31:05.160
or you take an electrode grid
link |
00:31:06.600
and you plug it into the retina at the back of the eye
link |
00:31:09.240
and people can learn how to see that way.
link |
00:31:11.160
The interesting part is those devices
link |
00:31:13.880
don't speak exactly the natural biological language,
link |
00:31:17.120
they speak the dialect of Silicon Valley.
link |
00:31:19.400
And it turns out that as recently as about 25 years ago,
link |
00:31:24.360
a lot of people thought this was never gonna work.
link |
00:31:26.520
They thought it wasn't gonna work for that reason,
link |
00:31:28.960
but the brain figures it out.
link |
00:31:30.360
It's really good at saying, okay, look,
link |
00:31:32.200
there's some correlation between what I can touch
link |
00:31:34.200
and feel and hear and so on,
link |
00:31:35.840
and the data that's coming in,
link |
00:31:37.040
or between I clap my hands and I have signals coming in there
link |
00:31:41.520
and it figures out how to speak any language.
link |
00:31:44.240
Oh, that's fascinating.
link |
00:31:45.080
So like no matter if it's Neuralink,
link |
00:31:50.000
so directly communicating with the brain,
link |
00:31:51.760
or it's a smartphone or Google Glass,
link |
00:31:54.720
the brain figures out the efficient way of communication.
link |
00:31:58.800
Well, exactly, exactly.
link |
00:32:00.200
And what I propose is the potato head theory of evolution,
link |
00:32:03.680
which is that all our eyes and nose and mouth and ears
link |
00:32:08.320
and fingertips, all this stuff is just plug and play.
link |
00:32:10.760
And the brain can figure out
link |
00:32:12.760
what to do with the data that comes in.
link |
00:32:14.160
And part of the reason that I think this is right,
link |
00:32:17.600
and I care so deeply about this,
link |
00:32:18.960
is when you look across the animal kingdom,
link |
00:32:20.360
you find all kinds of weird peripheral devices plugged in,
link |
00:32:23.520
and the brain figures out what to do with the data.
link |
00:32:25.760
And I don't believe that Mother Nature
link |
00:32:27.360
has to reinvent the principles of brain operation each time
link |
00:32:32.360
to say, oh, now I'm gonna have heat pits
link |
00:32:33.800
to detect infrared.
link |
00:32:34.640
Now I'm gonna have something
link |
00:32:36.280
to detect electroreceptors on the body.
link |
00:32:39.200
Now I'm gonna detect something
link |
00:32:40.040
to pick up the magnetic field of the earth
link |
00:32:42.520
with cryptochromes in the eye.
link |
00:32:43.960
And so instead the brain says, oh, I got it.
link |
00:32:45.800
There's data coming in.
link |
00:32:47.240
Is that useful?
link |
00:32:48.080
Can I do something with it?
link |
00:32:48.920
Oh, great, I'm gonna mold myself
link |
00:32:50.440
around the data that's coming in.
link |
00:32:52.640
It's kind of fascinating to think that,
link |
00:32:55.440
we think of smartphones
link |
00:32:56.560
and all this new technology as novel.
link |
00:32:58.760
It's totally novel, outside of what evolution
link |
00:33:02.680
ever intended or like what nature ever intended.
link |
00:33:05.600
It's fascinating to think that like the entirety
link |
00:33:08.400
of the process of evolution is perfectly fine
link |
00:33:10.960
and ready for the smartphone and the internet.
link |
00:33:14.200
Like it's ready.
link |
00:33:15.240
It's ready to be valuable to that.
link |
00:33:17.120
And whatever comes to cyborgs, to virtual reality,
link |
00:33:21.440
we kind of think like, this is, you know,
link |
00:33:23.480
there's all these like books written about what's natural
link |
00:33:27.000
and we're like destroying our natural selves
link |
00:33:29.600
by like embracing all this technology.
link |
00:33:32.560
It's kind of, you know,
link |
00:33:34.520
probably not giving the brain enough credit.
link |
00:33:36.480
Like this thing is just fine with new tech.
link |
00:33:40.240
Oh, exactly, it wraps itself around.
link |
00:33:41.840
And by the way, wait till you have kids.
link |
00:33:43.120
You'll see the ease with which they pick up on stuff.
link |
00:33:46.320
And as Kevin Kelly said,
link |
00:33:50.280
technology is what gets invented after you're born.
link |
00:33:54.480
But the stuff that already exists when you're born,
link |
00:33:56.280
that's not even tech, that's just background furniture.
link |
00:33:58.120
Like the fact that the iPad exists for my son and daughter,
link |
00:34:00.840
like that's just background furniture.
link |
00:34:02.320
So, yeah, it's because we have
link |
00:34:06.240
this incredibly malleable system,
link |
00:34:08.200
that just absorbs whatever is going on in the world
link |
00:34:10.680
and learns what to do with it.
link |
00:34:11.840
So do you think, just to linger for a little bit more,
link |
00:34:15.400
do you think it's possible to co-adjust?
link |
00:34:22.160
Like we're kind of, you know,
link |
00:34:25.400
for the machine to adjust to the brain,
link |
00:34:27.880
for the brain to adjust to the machine.
link |
00:34:29.240
I guess that's what's already happening.
link |
00:34:31.040
Sure, that is what's happening.
link |
00:34:32.360
So for example, when you put electrodes
link |
00:34:34.640
in the motor cortex to control a robotic arm
link |
00:34:37.200
for somebody who's paralyzed,
link |
00:34:39.240
the engineers do a lot of work to figure out,
link |
00:34:41.080
okay, what can we do with the algorithm here
link |
00:34:42.840
so that we can detect what's going on from these cells
link |
00:34:45.640
and figure out how to best program the robotic arm to move
link |
00:34:49.760
given the data that we're measuring from these cells.
link |
00:34:51.920
But also the brain is learning too.
link |
00:34:54.560
So, you know, the paralyzed woman says,
link |
00:34:57.040
wait, I'm trying to grab this thing.
link |
00:34:58.840
And by the way, it's all about relevance.
link |
00:35:00.880
So if there's a piece of food there and she's hungry,
link |
00:35:04.040
she'll figure out how to get this food into her mouth
link |
00:35:08.240
with the robotic arm because that is what matters.
link |
00:35:13.240
Well, that's, okay, first of all,
link |
00:35:15.520
that paints a really promising and beautiful,
link |
00:35:17.640
for some reason, really optimistic picture
link |
00:35:20.160
that, you know, our brain is able to adjust to so much.
link |
00:35:26.320
You know, so many things happened this year, 2020,
link |
00:35:29.640
that you think like, how are we ever going to deal with it?
link |
00:35:32.960
And it's somehow encouraging
link |
00:35:35.680
and inspiring that like we're going to be okay.
link |
00:35:41.120
Well, that's right.
link |
00:35:41.960
I actually think, so 2020 has been an awful year
link |
00:35:45.080
for almost everybody in many ways,
link |
00:35:46.960
but the one silver lining has to do with brain plasticity,
link |
00:35:50.440
which is to say we've all been on our, you know,
link |
00:35:54.080
on our gerbil wheels, we've all been in our routines.
link |
00:35:56.480
And, you know, as I mentioned,
link |
00:35:58.600
our internal models are all about
link |
00:36:00.760
how do you maximally succeed?
link |
00:36:02.320
How do you optimize your operation
link |
00:36:04.560
in this circumstance where you are, right?
link |
00:36:07.240
And then all of a sudden, bang, 2020 comes,
link |
00:36:09.080
we're completely off our wheels.
link |
00:36:10.920
We're having to create new things all the time
link |
00:36:14.880
and figure out how to do it.
link |
00:36:15.960
And that is terrific for brain plasticity because,
link |
00:36:18.720
and we know this because there are very large studies
link |
00:36:23.480
on older people who stay cognitively active
link |
00:36:26.480
their whole lives.
link |
00:36:28.000
Some fraction of them have Alzheimer's disease
link |
00:36:31.760
physically, but nobody knows that when they're alive.
link |
00:36:34.640
Even though their brain is getting chewed up
link |
00:36:36.000
with the ravages of Alzheimer's,
link |
00:36:38.600
cognitively they're doing just fine.
link |
00:36:40.000
Why?
link |
00:36:40.840
It's because they're challenged all the time.
link |
00:36:44.080
They've got all these new things going on,
link |
00:36:45.520
all this novelty, all these responsibilities,
link |
00:36:47.200
chores, social life, all these things happening.
link |
00:36:49.640
And as a result, they're constantly building new roadways,
link |
00:36:52.760
even as parts degrade.
link |
00:36:54.680
And the only good news is that
link |
00:36:57.720
we are in a situation where suddenly
link |
00:36:59.480
we can't just operate like automata anymore.
link |
00:37:02.000
We have to think of completely new ways to do things.
link |
00:37:05.520
And that's wonderful.
link |
00:37:07.880
I don't know why this question popped into my head.
link |
00:37:11.200
It's quite absurd, but are we gonna be okay?
link |
00:37:16.080
Yeah.
link |
00:37:17.200
You said this is the promising silver lining
link |
00:37:19.640
just from your own,
link |
00:37:20.480
cause you've written about this and thought about this
link |
00:37:22.840
outside of maybe even the plasticity of the brain,
link |
00:37:25.240
but just this whole pandemic kind of changed the way
link |
00:37:29.920
it knocked us out of this hamster wheel of habit.
link |
00:37:35.560
A lot of people had to reinvent themselves.
link |
00:37:39.400
Unfortunately, and I have a lot of friends
link |
00:37:42.240
who either already have lost or are going to lose their business,
link |
00:37:48.160
It's basically taking the dreams that people have had
link |
00:37:52.720
and said this dream, this particular dream you've had
link |
00:37:58.160
will no longer be possible.
link |
00:38:00.080
So you have to find something new.
link |
00:38:02.040
What are your, are we gonna be okay?
link |
00:38:06.000
Yeah, we'll be okay in the sense that,
link |
00:38:08.080
I mean, it's gonna be a rough time
link |
00:38:09.560
for many or most people,
link |
00:38:11.800
but in the sense that it is sometimes useful
link |
00:38:18.000
to find that what you thought was your dream
link |
00:38:20.760
was not the thing that you're going to do.
link |
00:38:24.520
This is obviously the plot in lots of Hollywood movies
link |
00:38:26.760
that someone says, I'm gonna do this,
link |
00:38:27.920
and then that gets foiled
link |
00:38:29.280
and they end up doing something better.
link |
00:38:31.080
But this is true in life.
link |
00:38:32.320
I mean, in general, even though we plan our lives
link |
00:38:38.440
as best we can, it's predicated on our notion of,
link |
00:38:42.080
okay, given everything that's around me,
link |
00:38:43.600
this is what's possible for me next.
link |
00:38:47.040
But it takes 2020 to knock you off that
link |
00:38:49.400
where you think, oh, well, actually,
link |
00:38:50.840
maybe there's something I can be doing
link |
00:38:52.680
that's bigger, that's better.
link |
00:38:54.320
Yeah, you know, for me, one exciting thing,
link |
00:38:56.560
and I just talked to Grant Sanderson.
link |
00:38:59.600
I don't know if you know who he is.
link |
00:39:00.680
He's a 3Blue1Brown, it's a YouTube channel.
link |
00:39:03.600
He does, he's a, if you see it, you would recognize it.
link |
00:39:08.200
He's like a really famous math guy,
link |
00:39:11.120
and he's a math educator,
link |
00:39:12.400
and he does these incredible, beautiful videos.
link |
00:39:15.040
And now I see sort of at MIT,
link |
00:39:17.000
folks are struggling to try to figure out,
link |
00:39:19.680
you know, if we do teach remotely,
link |
00:39:21.800
how do we do it effectively?
link |
00:39:23.560
So you have these world class researchers
link |
00:39:27.960
and professors trying to figure out
link |
00:39:30.240
how to put content online that teaches people.
link |
00:39:33.760
And to me, a possible future of that is,
link |
00:39:37.720
you know, Nobel Prize winning faculty become YouTubers.
link |
00:39:42.720
Like that to me is so exciting, like what Grant said,
link |
00:39:47.720
which is like the possibility of creating canonical videos
link |
00:39:52.480
on the thing you're a world expert in.
link |
00:39:55.000
You know, there's so many topics.
link |
00:39:56.880
It just, the world doesn't, you know, there's faculty.
link |
00:40:00.880
I mentioned Russ Tedrick.
link |
00:40:02.120
There's all these people in robotics
link |
00:40:04.240
that are experts in a particular beautiful field
link |
00:40:07.880
on which there's only just papers.
link |
00:40:09.840
There's no popular book.
link |
00:40:12.680
There's no clean canonical video
link |
00:40:16.400
showing the beauty of a subject.
link |
00:40:18.120
And one possibility is they try to create that
link |
00:40:22.360
and share it with the world.
link |
00:40:25.400
This is the beautiful thing.
link |
00:40:26.400
This of course has been happening for a while already.
link |
00:40:28.880
I mean, for example, when I go and I give book talks,
link |
00:40:31.720
often what'll happen is some 13 year old
link |
00:40:33.760
will come up to me afterwards and say something,
link |
00:40:35.360
and I'll say, my God, that was so smart.
link |
00:40:37.160
Like, how did you know that?
link |
00:40:38.840
And they'll say, oh, I saw it on a Ted talk.
link |
00:40:40.320
Well, what an amazing opportunity.
link |
00:40:42.880
Here you got the best person in the world on subject X
link |
00:40:46.440
giving a 15 minute talk as beautifully as he or she can.
link |
00:40:51.880
And the 13 year old just grows up with that.
link |
00:40:53.440
That's just the mother's milk, right?
link |
00:40:55.160
As opposed to when we grew up,
link |
00:40:57.720
you know, I had whatever homeroom teacher I had
link |
00:41:00.320
and, you know, whatever classmates I had.
link |
00:41:03.200
And hopefully that person knew what he or she was teaching
link |
00:41:06.440
and often didn't and, you know, just made things up.
link |
00:41:08.760
So the opportunity has become extraordinary
link |
00:41:12.960
to get the best of the world.
link |
00:41:14.480
And the reason this matters, of course,
link |
00:41:15.720
is because obviously, back to plasticity,
link |
00:41:18.560
the way that we, the way our brain gets molded
link |
00:41:22.000
is by absorbing everything from the world,
link |
00:41:24.320
all of the knowledge and the data and so on that it can get,
link |
00:41:28.520
and then springboarding off of that.
link |
00:41:33.040
And we're in a very lucky time now
link |
00:41:34.340
because we grew up with a lot of just in case learning.
link |
00:41:40.340
So, you know, just in case you ever need to know
link |
00:41:42.000
these dates in Mongolian history, here they are.
link |
00:41:44.720
But what kids are growing up with now, like my kids,
link |
00:41:47.280
is tons of just in time learning.
link |
00:41:48.800
So as soon as they're curious about something,
link |
00:41:50.120
they ask Alexa, they ask Google Home,
link |
00:41:51.800
they get the answer right there
link |
00:41:53.080
in the context of their curiosity.
link |
00:41:54.600
The reason this matters is because for plasticity to happen,
link |
00:41:59.720
you need to care, you need to be curious about something.
link |
00:42:02.480
And this is something, by the way,
link |
00:42:03.760
that the ancient Romans had noted.
link |
00:42:06.320
They had outlined seven different levels of learning
link |
00:42:08.360
and the highest level is when you're curious about a topic.
link |
00:42:10.600
But anyway, so kids now are getting tons
link |
00:42:13.200
of just in time learning, and as a result,
link |
00:42:15.800
they're gonna be so much smarter than we are.
link |
00:42:18.440
They're just, and we can already see that.
link |
00:42:19.760
I mean, my boy is eight years old, my girl is five.
link |
00:42:22.200
But I mean, the things that he knows are amazing
link |
00:42:25.740
because it's not just him having to do
link |
00:42:27.800
the rote memorization stuff that we did.
link |
00:42:29.960
Yeah, it's just fascinating what the brain,
link |
00:42:32.280
what young brains look like now
link |
00:42:33.640
because of all those TED Talks just loaded in there.
link |
00:42:36.880
And there's also, I mean, a lot of people, right,
link |
00:42:39.920
kind of, there's a sense that our attention span
link |
00:42:42.960
is growing shorter, but it's complicated
link |
00:42:46.320
because for example, most people, majority of people,
link |
00:42:50.760
it's the 80 plus percent of people listen
link |
00:42:53.000
to the entirety of these things,
link |
00:42:54.600
two, three hours for the podcast,
link |
00:42:56.800
long form podcasts are becoming more and more popular.
link |
00:43:00.920
So like that's, it's all a really giant, complicated mess.
link |
00:43:04.240
And the point is that the brain is able to adjust to it
link |
00:43:07.440
and somehow like form a worldview
link |
00:43:11.840
within this new medium of like information that we have.
link |
00:43:17.040
You have like these short tweets
link |
00:43:19.920
and you have these three, four hour podcasts
link |
00:43:22.840
and you have Netflix movie.
link |
00:43:24.940
I mean, it's just, it's adjusting to the entirety
link |
00:43:27.200
and just absorbing it and taking it all in
link |
00:43:30.080
and then pops up COVID that forces us all to be home
link |
00:43:34.600
and it all just adjusts and figures it out.
link |
00:43:39.200
Yeah, yeah, exactly.
link |
00:43:40.200
It's fascinating.
link |
00:43:41.880
We've been talking about the brain
link |
00:43:43.880
as if it's something separate from the human
link |
00:43:48.400
that carries it a little bit.
link |
00:43:50.280
Like whenever you talk about the brain,
link |
00:43:52.180
it's easy to forget that that's like, that's us.
link |
00:43:55.340
Like how much do you,
link |
00:43:59.620
how much is the whole thing like predetermined?
link |
00:44:04.860
Like how much is it already encoded in there?
link |
00:44:08.580
And how much is it the, what's the hit?
link |
00:44:17.020
The actions, the decisions, the judgments, the...
link |
00:44:22.220
You mean like who you are?
link |
00:44:23.300
Who you are.
link |
00:44:24.140
Oh, yeah, yeah, okay, great question.
link |
00:44:25.760
Right, so there used to be a big debate
link |
00:44:27.400
about nature versus nurture.
link |
00:44:28.900
And we now know that it's always both.
link |
00:44:31.380
You can't even separate them
link |
00:44:32.720
because you come to the table with a certain amount of nature
link |
00:44:35.500
for example, your whole genome and so on.
link |
00:44:37.740
The experiences you have in the womb,
link |
00:44:39.680
like whether your mother is smoking or drinking,
link |
00:44:41.820
things like that, whether she's stressed, so on.
link |
00:44:43.740
Those all influence how you're gonna pop out of the womb.
link |
00:44:47.260
From there, everything is an interaction
link |
00:44:50.180
between all of your experiences and the nature.
link |
00:44:55.500
What I mean is, I think of it like a space time cone
link |
00:44:59.900
where you have, you drop into the world
link |
00:45:01.820
and depending on the experience that you have,
link |
00:45:03.160
you might go off in this direction
link |
00:45:04.220
or that direction or in that direction
link |
00:45:05.980
because there's interaction on the way.
link |
00:45:08.460
Your experiences determine what happens
link |
00:45:11.080
with the expression of your genes.
link |
00:45:12.380
So some genes get repressed, some get expressed and so on.
link |
00:45:15.980
And you actually become a different person
link |
00:45:17.620
based on your experiences.
link |
00:45:18.940
There's a whole field called epigenomics,
link |
00:45:21.140
which is, or epigenetics I should say,
link |
00:45:24.020
which is about the epigenome.
link |
00:45:26.400
And that is the layer that sits on top of the DNA
link |
00:45:30.380
and causes the genes to express differently.
link |
00:45:32.560
That is directly related to the experiences that you have.
link |
00:45:35.140
So if, just as an example, they take rat pups
link |
00:45:38.700
and one group is placed away from their parents
link |
00:45:41.560
and the other group is groomed and licked
link |
00:45:43.580
and taken good care of,
link |
00:45:44.620
that changes their gene expression
link |
00:45:46.100
for the rest of their life.
link |
00:45:46.940
They go off in different directions
link |
00:45:48.140
in this space time cone.
link |
00:45:50.940
So yeah, this is of course why it matters
link |
00:45:55.820
that we take care of children and pour money
link |
00:45:58.380
into things like education and good childcare
link |
00:46:00.780
and so on for children broadly,
link |
00:46:04.460
because these formative years matter so much.
link |
00:46:08.340
So is there a free will?
link |
00:46:11.620
This is a great question.
link |
00:46:13.940
I apologize for the absurd high level
link |
00:46:16.500
philosophical questions.
link |
00:46:17.340
No, no, these are my favorite kind of questions.
link |
00:46:19.060
Here's the thing, here's the thing.
link |
00:46:20.900
We don't know.
link |
00:46:21.780
If you ask most neuroscientists,
link |
00:46:23.300
they'll say that we can't really think
link |
00:46:26.780
of how you would get free will in there
link |
00:46:28.660
because as far as we can tell, it's a machine.
link |
00:46:30.280
It's a very complicated machine.
link |
00:46:33.420
Enormously sophisticated, 86 billion neurons,
link |
00:46:36.140
about the same number of glial cells.
link |
00:46:38.140
Each of these things is as complicated
link |
00:46:40.140
as the city of San Francisco.
link |
00:46:41.280
Each neuron in your head has the entire human genome in it.
link |
00:46:43.500
It's expressing millions of gene products.
link |
00:46:47.580
These are incredibly complicated biochemical cascades.
link |
00:46:50.000
Each one is connected to 10,000 of its neighbors,
link |
00:46:51.860
which means you have like half a quadrillion connections
link |
00:46:54.960
in the brain.
link |
00:46:55.800
So it's an incredibly complicated thing,
link |
00:46:58.180
but it fundamentally appears to just be a machine.
link |
00:47:02.540
And therefore, if there's nothing in it
link |
00:47:04.980
that's not being driven by something else,
link |
00:47:07.420
then it seems it's hard to understand
link |
00:47:10.160
where free will would come from.
link |
00:47:12.600
So that's the camp that pretty much all of us fall into,
link |
00:47:14.860
but I will say, our science is still quite young.
link |
00:47:18.120
And I'm a fan of the history of science,
link |
00:47:20.860
and the thing that always strikes me as interesting
link |
00:47:22.780
is when you look back at any moment in science,
link |
00:47:26.100
everybody believes something is true,
link |
00:47:28.500
and they simply didn't know about
link |
00:47:31.340
what Einstein revealed or whatever.
link |
00:47:33.180
And so who knows?
link |
00:47:35.140
And at any moment in history,
link |
00:47:38.620
they all feel like we've converged to the final answer.
link |
00:47:40.700
Exactly, exactly.
link |
00:47:41.800
Like all the pieces of the puzzle are there.
link |
00:47:43.780
And I think that's a funny illusion
link |
00:47:45.620
that's worth getting rid of.
link |
00:47:47.180
And in fact, this is what drives good science
link |
00:47:49.540
is recognizing that we don't have most of the puzzle pieces.
link |
00:47:52.660
So as far as the free will question goes, I don't know.
link |
00:47:55.620
At the moment, it seems, wow,
link |
00:47:57.060
it'd be really impossible to figure out
link |
00:47:58.540
how something else could fit in there,
link |
00:48:00.020
but 100 years from now,
link |
00:48:02.720
our textbooks might be very different than they are now.
link |
00:48:05.620
I mean, could I ask you to speculate
link |
00:48:07.620
where do you think free will could be squeezed into there?
link |
00:48:11.060
Like, what's that even,
link |
00:48:14.580
is it possible that our brain just creates
link |
00:48:16.300
kinds of illusions that are useful for us?
link |
00:48:19.880
Or like what, where could it possibly be squeezed in?
link |
00:48:24.200
Well, let me give a speculation answer
link |
00:48:27.140
to your very nice question,
link |
00:48:28.740
but don't, and the listeners of this podcast,
link |
00:48:32.180
don't quote me on this.
link |
00:48:33.140
Yeah, exactly.
link |
00:48:33.980
I'm not saying this is what I believe to be true,
link |
00:48:35.620
but let me just give an example.
link |
00:48:36.740
I give this at the end of my book, Incognito.
link |
00:48:38.940
So the whole book of Incognito is about,
link |
00:48:41.340
all of what's happening in the brain.
link |
00:48:42.900
And essentially I'm saying, look,
link |
00:48:44.060
here's all the reasons to think
link |
00:48:45.460
that free will probably does not exist.
link |
00:48:47.020
But at the very end, I say, look,
link |
00:48:50.580
imagine that you are,
link |
00:48:53.860
imagine that you're a Kalahari Bushman
link |
00:48:56.140
and you find a radio in the sand
link |
00:48:58.900
and you've never seen anything like this.
link |
00:49:01.100
And you look at this radio and you realize
link |
00:49:04.420
that when you turn this knob,
link |
00:49:07.180
there are voices coming from it.
link |
00:49:08.700
So being a radio materialist,
link |
00:49:11.700
you try to figure out like, how does this thing operate?
link |
00:49:14.020
So you take off the back cover
link |
00:49:15.420
and you realize there's all these wires.
link |
00:49:16.780
And when you take out some wires,
link |
00:49:19.740
the voices get garbled or stop or whatever.
link |
00:49:22.140
And so what you end up developing is a whole theory
link |
00:49:24.300
about how this connection, this pattern of wires
link |
00:49:26.820
gives rise to voices.
link |
00:49:29.020
But it would never strike you that in distant cities,
link |
00:49:31.780
there's a radio tower and there's invisible stuff beaming.
link |
00:49:34.460
And that's actually the origin of the voices.
link |
00:49:36.780
And this is just necessary for it.
link |
00:49:38.700
So I mentioned this just as a speculation,
link |
00:49:42.580
to say, look, how would we know?
link |
00:49:44.100
what we know about the brain for absolutely certain
link |
00:49:46.060
is that when you damage pieces and parts of it,
link |
00:49:48.580
things get jumbled up.
link |
00:49:50.580
But how would you know if there's something else going on
link |
00:49:52.680
that we can't see like electromagnetic radiation
link |
00:49:55.140
that is what's actually generating this?
link |
00:49:58.180
Yeah, you paint a beautiful example
link |
00:50:01.420
of how totally,
link |
00:50:06.260
because we don't know most of how our universe works,
link |
00:50:10.300
how totally off base we might be with our science until,
link |
00:50:14.460
I mean, yeah, I mean, that's inspiring, that's beautiful.
link |
00:50:19.100
It's kind of terrifying, it's humbling.
link |
00:50:21.780
It's all of the above.
link |
00:50:23.980
And the important part just to recognize
link |
00:50:26.820
is that of course we're in the position
link |
00:50:28.660
of having massive unknowns.
link |
00:50:31.780
And we have of course the known unknowns
link |
00:50:36.340
and that's all the things we're pursuing in our labs
link |
00:50:38.420
and trying to figure out,
link |
00:50:39.260
but there's this whole space of unknown unknowns.
link |
00:50:41.620
Things we haven't even realized we haven't asked yet.
link |
00:50:44.020
Let me kind of ask a weird, maybe a difficult question,
link |
00:50:47.660
one that has to do with this:
link |
00:50:50.580
I've been recently reading a lot about World War II.
link |
00:50:54.340
I'm currently reading a book I recommend for people,
link |
00:50:56.620
which as a Jew has been difficult to read,
link |
00:51:00.940
The Rise and Fall of the Third Reich.
link |
00:51:04.940
So let me just ask about like the nature of genius,
link |
00:51:08.900
the nature of evil.
link |
00:51:10.460
If we look at somebody like Einstein,
link |
00:51:14.220
we look at Hitler, Stalin, modern day Jeffrey Epstein,
link |
00:51:19.500
just folks who through their lives have done, with Einstein,
link |
00:51:24.500
works of genius, and with the others I mentioned,
link |
00:51:27.540
evil in this world.
link |
00:51:30.580
What do we think about that in a livewired brain?
link |
00:51:34.940
Like how do we think about these extreme people?
link |
00:51:39.580
Here's what I'd say.
link |
00:51:41.700
This is a very big and difficult question,
link |
00:51:43.540
but what I would say briefly on it is,
link |
00:51:47.620
first of all, I saw a cover of Time Magazine some years ago
link |
00:51:51.500
and it was a big sagittal slice of the brain
link |
00:51:55.500
and it said something like, what makes us good and evil?
link |
00:51:59.020
And there was a little spot pointing to it
link |
00:52:00.460
and there was a picture of Gandhi
link |
00:52:01.300
and there was a little spot that was pointing to Hitler.
link |
00:52:03.380
And these Time Magazine covers always make me mad
link |
00:52:05.780
because it's so goofy to think that we're gonna find
link |
00:52:08.660
some spot in the brain or something.
link |
00:52:10.940
Instead, the interesting part is because we're livewired,
link |
00:52:16.860
we are all about the world and the culture around us.
link |
00:52:20.580
So somebody like Adolf Hitler got all this positive feedback
link |
00:52:25.740
about what was going on and the crazier and crazier
link |
00:52:28.780
the ideas he had and he's like, let's set up death camps
link |
00:52:31.980
and murder a bunch of people and so on.
link |
00:52:34.140
Somehow he was getting positive feedback from that
link |
00:52:37.340
and all these other people, they all spun each other up.
link |
00:52:40.300
And you look at anything like, I mean, look at the Cultural
link |
00:52:45.300
Revolution in China or the Russian Revolution
link |
00:52:51.180
or things like this where you look at these things,
link |
00:52:52.620
my God, how do people all behave like this?
link |
00:52:55.300
But it's easy to see groups of people spinning themselves up
link |
00:52:58.860
in particular ways where they all say,
link |
00:53:00.300
well, would I have thought this was right
link |
00:53:03.820
in a different circumstance?
link |
00:53:04.740
I don't know, but Fred thinks it's right
link |
00:53:05.940
and Steve thinks it's right,
link |
00:53:06.780
everyone around me seems to think it's right.
link |
00:53:08.180
And so part of the maybe downside of having a livewired brain
link |
00:53:13.060
is that you can get crowds of people doing things as a group.
link |
00:53:17.500
So it's interesting that we would pinpoint Hitler
link |
00:53:20.420
and say that's the evil guy.
link |
00:53:21.660
But in a sense, I think it was Tolstoy who said
link |
00:53:24.740
the king becomes slave to the people.
link |
00:53:30.860
In other words, Hitler was just a representation
link |
00:53:34.740
of whatever was going on with that huge crowd
link |
00:53:37.420
that he was surrounded with.
link |
00:53:39.420
So I only bring that up to say that it's very difficult
link |
00:53:45.260
to say what it is about this person's brain
link |
00:53:48.340
or that person's brain.
link |
00:53:49.180
He obviously got feedback for what he was doing.
link |
00:53:51.660
The other thing, by the way,
link |
00:53:52.820
about what we often think of as being evil in society
link |
00:53:57.300
is my lab recently published some work
link |
00:54:01.820
on in groups and out groups,
link |
00:54:04.620
which is a very important part of this puzzle.
link |
00:54:08.260
So it turns out that we are very engineered
link |
00:54:13.860
to care about in groups versus out groups.
link |
00:54:16.060
And this seems to be like a really fundamental thing.
link |
00:54:18.700
So we did this experiment in my lab
link |
00:54:20.140
where we brought people in and stuck them in the scanner.
link |
00:54:23.420
And we, I don't know if you noticed,
link |
00:54:25.020
but we show them on the screen six hands
link |
00:54:30.700
and the computer goes around randomly picks a hand.
link |
00:54:33.460
And then you see that hand gets stabbed
link |
00:54:34.980
with a syringe needle.
link |
00:54:36.060
So you actually see a syringe needle enter the hand
link |
00:54:38.260
and come out.
link |
00:54:39.100
And what that does is it triggers
link |
00:54:42.940
parts of the pain matrix,
link |
00:54:44.540
these areas in your brain that are involved
link |
00:54:46.140
in feeling physical pain.
link |
00:54:47.260
Now, the interesting thing is it's not your hand
link |
00:54:48.860
that was stabbed.
link |
00:54:49.940
So what you're seeing is empathy.
link |
00:54:51.620
This is you seeing someone else's hand get stabbed.
link |
00:54:54.140
You feel like, oh God, this is awful, right?
link |
00:54:56.220
Okay.
link |
00:54:57.420
We contrast that by the way,
link |
00:54:58.580
with somebody's hand getting poked with a Q-tip,
link |
00:55:00.820
which is, you know, looks visually the same,
link |
00:55:02.540
but you don't have that same level of response.
link |
00:55:06.100
Now what we do is we label each hand with a one word label,
link |
00:55:10.980
Christian, Jewish, Muslim, atheist, Scientologist, Hindu.
link |
00:55:14.340
And now the computer goes around, picks a hand,
link |
00:55:16.940
stabs the hand.
link |
00:55:17.900
And the question is, how much does your brain care
link |
00:55:21.420
about all the people in your out group
link |
00:55:23.220
versus the one label that happens to match you?
link |
00:55:26.660
And it turns out for everybody across all religions,
link |
00:55:29.140
they care much more about their in group
link |
00:55:31.020
than their out group.
link |
00:55:31.860
And when I say they care, what I mean is
link |
00:55:33.420
you get a bigger response from their brain.
link |
00:55:35.580
Everything's the same.
link |
00:55:36.420
It's the same hands.
link |
00:55:38.820
It's just a one word label.
link |
00:55:40.620
You care much more about your in group than your out group.
link |
00:55:42.900
And I wish this weren't true, but this is how humans are.
link |
00:55:45.700
I wonder how fundamental that is,
link |
00:55:47.820
or if it's an emergent thing about culture.
link |
00:55:53.220
Like if we lived alone with like,
link |
00:55:55.540
if it's genetically built into the brain,
link |
00:55:57.500
like this longing for tribe.
link |
00:56:00.300
So I'll tell you, we addressed that.
link |
00:56:02.300
So here's what we did.
link |
00:56:03.260
There are actually two other things
link |
00:56:06.820
we did as part of this study
link |
00:56:07.860
that I think matter for this point.
link |
00:56:09.580
One is, so okay, so we show that you have
link |
00:56:11.740
a much bigger response.
link |
00:56:13.020
And by the way, this is not a cognitive thing.
link |
00:56:14.460
This is a very low level basic response
link |
00:56:17.460
to seeing pain in somebody, okay.
link |
00:56:19.060
Great study by the way.
link |
00:56:20.060
Thanks, thanks, thanks.
link |
00:56:21.940
What we did next is we have it where we say,
link |
00:56:24.820
okay, the year is 2025 and these three religions
link |
00:56:28.700
are now in a war against these three religions.
link |
00:56:30.780
And it's all randomized, right?
link |
00:56:31.820
But what you see is your thing and you have two allies now
link |
00:56:34.980
against these others.
link |
00:56:37.100
And now it happens over the course of many trials,
link |
00:56:38.700
you see everybody gets stabbed at different times.
link |
00:56:41.620
And the question is, do you care more about your allies?
link |
00:56:43.500
And the answer is yes.
link |
00:56:44.340
Suddenly people who a moment ago,
link |
00:56:45.940
you didn't really care when they got stabbed.
link |
00:56:47.300
Now, simply with this one word thing
link |
00:56:49.740
that they're now your allies, you care more about them.
link |
00:56:52.700
But then what I wanted to do was look at
link |
00:56:55.340
how ingrained is this or how arbitrary is it?
link |
00:56:57.660
So we brought new participants in and we said,
link |
00:57:01.460
here's a coin, toss the coin.
link |
00:57:02.700
If it's heads, you're an Augustinian.
link |
00:57:04.140
If it's tails, you're a Justinian.
link |
00:57:06.140
These are totally made up.
link |
00:57:08.020
Okay, so they toss it, they get whatever.
link |
00:57:10.100
We give them a band that says Augustinian on it,
link |
00:57:13.380
whatever tribe they're in now, and they get in the scanner
link |
00:57:16.740
and they see a thing on the screen that says
link |
00:57:18.500
the Augustinians and Justinians are two warring tribes.
link |
00:57:21.100
Then you see a bunch of hands,
link |
00:57:22.180
some are labeled Augustinians, some are Justinian.
link |
00:57:24.620
And now you care more about whichever team you're on
link |
00:57:27.820
than the other team, even though it's totally arbitrary
link |
00:57:29.740
and you know it's arbitrary
link |
00:57:30.580
because you're the one who tossed the coin.
link |
00:57:32.900
So it's a state that's very easy to find ourselves in.
link |
00:57:36.980
In other words, just before walking in the door,
link |
00:57:39.500
they'd never even heard of Augustinian versus Justinian
link |
00:57:41.700
and now their brain is representing it
link |
00:57:43.940
simply because they're told they're on this team.
link |
00:57:46.380
You know, now I did my own personal study of this.
link |
00:57:49.620
So once you're an Augustinian, that tends to be sticky
link |
00:57:55.500
because I've been a Packers fan,
link |
00:57:57.460
grew to be a Packers fan my whole life.
link |
00:57:59.100
Now when I'm in Boston with like the Patriots,
link |
00:58:03.020
it's been tough going for my livewired brain
link |
00:58:05.420
to switch to the Patriots.
link |
00:58:07.980
So once you become part of a tribe, it's interesting,
link |
00:58:10.020
the tribe is sticky.
link |
00:58:12.140
Yeah, I'll admit that's true.
link |
00:58:14.180
That's it, you know.
link |
00:58:15.020
You know, we never tried saying,
link |
00:58:16.620
okay, now you're a Justinian and you were an Augustinian.
link |
00:58:19.140
We never saw how sticky it is.
link |
00:58:21.820
But there are studies of this,
link |
00:58:24.380
of monkey troops on some island.
link |
00:58:30.060
And what happens is they look at the way monkeys behave
link |
00:58:33.580
when they're part of this tribe
link |
00:58:34.460
and how they treat members of the other tribe of monkeys.
link |
00:58:37.580
And then what they do, I've forgotten how they do that,
link |
00:58:39.500
exactly, but they end up switching a monkey
link |
00:58:42.060
so he ends up in the other troop.
link |
00:58:43.100
And very quickly they end up becoming a part
link |
00:58:45.220
of that other troop and hating and behaving badly
link |
00:58:48.180
towards the original troop.
link |
00:58:50.380
These are fascinating studies, by the way.
link |
00:58:52.500
This is beautiful.
link |
00:58:55.020
In your book, you have a good light bulb joke.
link |
00:59:01.220
How many psychiatrists does it take to change a light bulb?
link |
00:59:04.460
Only one, but the light bulb has to want to change.
link |
00:59:07.900
Sorry.
link |
00:59:09.700
I'm a sucker for a good light bulb joke.
link |
00:59:11.780
Okay, so given, you know, I've been interested
link |
00:59:15.100
in psychiatry my whole life, just maybe tangentially.
link |
00:59:19.140
I kind of early on dreamed of being a psychiatrist
link |
00:59:22.500
until I understood what it entails.
link |
00:59:25.660
But, you know, is there hope for psychiatry
link |
00:59:31.940
for somebody else to help this livewired brain to adjust?
link |
00:59:37.260
Oh yeah, I mean, in the sense that,
link |
00:59:40.180
and this has to do with this issue
link |
00:59:41.180
about us being trapped on our own planet.
link |
00:59:43.220
Forget psychiatrists, just think of like
link |
00:59:45.900
when you're talking with a friend
link |
00:59:47.020
and you say, oh, I'm so upset about this.
link |
00:59:48.900
And your friend says, hey, just look at it this way.
link |
00:59:53.300
You know, all we have access to under normal circumstances
link |
00:59:55.780
is just the way we're seeing something.
link |
00:59:57.420
And so it's super helpful to have friends and communities
link |
01:00:02.180
and psychiatrists and so on to help things change that way.
link |
01:00:05.740
So that's how psychiatrists sort of help us.
link |
01:00:07.220
But more importantly, the role that psychiatrists have played
link |
01:00:10.500
is that there's this sort of naive assumption
link |
01:00:13.940
that we all come to the table with,
link |
01:00:15.140
which is that everyone is fundamentally just like us.
link |
01:00:18.340
And when you're a kid, you believe this entirely,
link |
01:00:21.140
but as you get older and you start realizing,
link |
01:00:22.780
okay, there's something called schizophrenia
link |
01:00:25.260
and that's a real thing.
link |
01:00:26.300
And to be inside that person's head is totally different
link |
01:00:29.100
than what it is to be inside my head. Or there's psychopathy.
link |
01:00:32.140
And to be inside the psychopath's head,
link |
01:00:34.900
he doesn't care about other people.
link |
01:00:36.300
He doesn't care about hurting other people.
link |
01:00:37.460
He's just doing what he needs to do to get what he needs.
link |
01:00:40.900
That's a different head.
link |
01:00:42.100
There's a million different things going on
link |
01:00:45.540
and it is different to be inside those heads.
link |
01:00:48.860
This is where the field of psychiatry comes in.
link |
01:00:51.300
Now, I think it's an interesting question
link |
01:00:53.380
about the degree to which neuroscience is leaking into
link |
01:00:57.020
and taking over psychiatry
link |
01:00:58.540
and what the landscape will look like 50 years from now.
link |
01:01:00.860
It may be that psychiatry as a profession changes a lot
link |
01:01:05.860
or maybe goes away entirely,
link |
01:01:07.220
and neuroscience will essentially be able
link |
01:01:09.220
to take over some of these functions,
link |
01:01:10.420
but it has been extremely useful to understand
link |
01:01:14.220
the differences between how people behave and why
link |
01:01:18.220
and what you can tell about what's going on
link |
01:01:19.580
inside their brain just based on observation
link |
01:01:22.380
of their behavior.
link |
01:01:25.220
This might be from years ago, but I'm not sure.
link |
01:01:28.340
There's an Atlantic article you've written
link |
01:01:32.300
about moving away from a distinction
link |
01:01:34.460
between neurological disorders,
link |
01:01:36.900
quote unquote, brain problems,
link |
01:01:39.100
and psychiatric disorders or quote unquote, mind problems.
link |
01:01:43.820
So on that topic, how do you think about this gray area?
link |
01:01:47.220
Yeah, this is exactly the evolution that things are undergoing:
link |
01:01:50.580
there was psychiatry and then there were guys and gals
link |
01:01:54.140
in labs poking cells and so on.
link |
01:01:55.940
Those were the neuroscientists.
link |
01:01:57.020
But yeah, I think these are moving together
link |
01:01:58.860
for exactly the reason you just cited.
link |
01:02:00.580
And where this matters a lot,
link |
01:02:02.540
the Atlantic article that I wrote
link |
01:02:04.660
was called The Brain on Trial,
link |
01:02:06.780
where this matters a lot is the legal system
link |
01:02:09.580
because the way we run our legal system now,
link |
01:02:12.740
and this is true everywhere in the world,
link |
01:02:13.900
is someone shows up in front of the judge's bench,
link |
01:02:17.420
or let's say there's five people
link |
01:02:18.540
in front of the judge's bench,
link |
01:02:20.100
and they've all committed the same crime.
link |
01:02:21.380
What we do, because we feel like, hey, this is fair,
link |
01:02:23.820
is we say, all right, you're gonna get the same sentence.
link |
01:02:25.580
You'll all get three years in prison or whatever it is.
link |
01:02:27.780
But in fact, brains can be so different.
link |
01:02:29.780
This guy's got schizophrenia, this guy's a psychopath,
link |
01:02:31.580
this guy's tweaked out on drugs, and so on and so on,
link |
01:02:33.620
that it actually doesn't make sense to keep doing that.
link |
01:02:37.620
And what we do in this country more than anywhere
link |
01:02:41.220
in the world is we imagine that incarceration
link |
01:02:44.100
is a one size fits all solution.
link |
01:02:45.500
And you may know,
link |
01:02:47.300
America has the highest incarceration rate
link |
01:02:49.140
in the whole world in terms of the percentage
link |
01:02:50.860
of our population we put behind bars.
link |
01:02:52.860
So there's a much more refined thing we can do
link |
01:02:56.580
as neuroscience comes in and changes,
link |
01:02:59.180
and has the opportunity to change the legal system.
link |
01:03:01.900
Which is to say, this doesn't let anybody off the hook.
link |
01:03:03.980
It doesn't say, oh, it's not your fault, and so on.
link |
01:03:06.100
But what it does is it changes the equation
link |
01:03:09.380
so it's not about, hey, how blameworthy are you?
link |
01:03:12.940
But instead is about, hey, what do we do from here?
link |
01:03:15.220
What's the best thing to do from here?
link |
01:03:16.260
So if you take somebody with schizophrenia
link |
01:03:17.740
and you have them break rocks in the hot summer sun
link |
01:03:21.100
in a chain gang, that doesn't help their schizophrenia.
link |
01:03:24.500
That doesn't fix the problem.
link |
01:03:25.900
If you take somebody with a drug addiction
link |
01:03:28.020
who's in jail for being caught with two ounces
link |
01:03:30.660
of some illegal substance, and you put them in prison,
link |
01:03:34.500
it doesn't actually fix the addiction.
link |
01:03:36.020
It doesn't help anything.
link |
01:03:38.580
Happily, what neuroscience and psychiatry
link |
01:03:40.580
bring to the table is lots of really useful things
link |
01:03:43.260
you can do with schizophrenia, with drug addiction,
link |
01:03:45.620
things like this.
link |
01:03:46.940
And that's why, I don't know if you know,
link |
01:03:49.140
but I run a national nonprofit
link |
01:03:50.540
called the Center for Science and Law.
link |
01:03:52.140
And it's all about this intersection
link |
01:03:53.660
of neuroscience and psychiatry.
link |
01:03:55.140
It's the intersection of neuroscience and the legal system.
link |
01:03:57.340
And we're trying to implement changes
link |
01:03:59.060
in every county, in every state.
link |
01:04:02.540
I'll just, without going down that rabbit hole,
link |
01:04:04.700
I'll just say one of the very simplest things to do
link |
01:04:07.540
is to set up specialized court systems
link |
01:04:09.140
where you have a mental health court
link |
01:04:12.660
that has judges and juries with expertise
link |
01:04:14.980
in mental illness.
link |
01:04:15.820
Because if you go, by the way, to a regular court
link |
01:04:17.780
and the person says, or the defense lawyer says,
link |
01:04:21.220
this person has schizophrenia, most of the jury will say,
link |
01:04:24.300
man, I call bullshit on that.
link |
01:04:25.780
Why?
link |
01:04:26.620
Because they don't know about schizophrenia.
link |
01:04:28.660
They don't know what it's about.
link |
01:04:30.780
And it turns out people who know about schizophrenia
link |
01:04:34.500
feel very differently as a juror
link |
01:04:35.900
than someone who happens not to know anybody with
link |
01:04:37.740
schizophrenia, who think it's an excuse.
link |
01:04:39.540
So you have judges and juries with expertise
link |
01:04:42.380
in mental illness and they know the rehabilitative
link |
01:04:44.180
strategies that are available.
link |
01:04:45.780
That's one thing.
link |
01:04:46.620
Having a drug court where you have judges and juries
link |
01:04:48.500
with expertise in rehabilitative strategies
link |
01:04:50.540
and what can be done and so on.
link |
01:04:51.860
A specialized prostitution court and so on.
link |
01:04:54.140
All these different things.
link |
01:04:55.700
By the way, this is very easy for counties
link |
01:04:57.620
to implement this sort of thing.
link |
01:04:59.260
And this is, I think, where this matters
link |
01:05:01.820
to get neuroscience into public policy.
link |
01:05:05.020
What's the process of injecting expertise into this?
link |
01:05:08.620
Yeah, I'll tell you exactly what it is.
link |
01:05:10.540
A county needs to run out of money first.
link |
01:05:12.420
I've seen this happen over and over.
link |
01:05:14.460
So what happens is a county has a completely full jail
link |
01:05:17.220
and they say, you know what?
link |
01:05:18.540
We need to build another jail.
link |
01:05:19.780
And then they realize, God, we don't have any money.
link |
01:05:21.220
We can't afford this.
link |
01:05:22.260
We've got too many people in jail.
link |
01:05:23.260
And that's when they turn to,
link |
01:05:25.500
God, we need something smarter.
link |
01:05:26.700
And that's when they set up specialized court systems.
link |
01:05:28.700
Yeah.
link |
01:05:30.940
We all function best when our backs are against the wall.
link |
01:05:34.300
And that's what COVID is good for.
link |
01:05:36.180
It's because we've all had our routines
link |
01:05:38.380
and we are optimized for the things we do.
link |
01:05:40.820
And suddenly our backs are against the wall, all of us.
link |
01:05:43.020
Yeah, it's really, I mean,
link |
01:05:44.620
one of the exciting things about COVID.
link |
01:05:47.580
I mean, I'm a big believer in the possibility
link |
01:05:51.940
of what government can do for the people.
link |
01:05:56.140
And when it becomes too big of a bureaucracy,
link |
01:05:59.180
it starts functioning poorly, it starts wasting money.
link |
01:06:02.580
It's nice to, I mean, COVID reveals that nicely.
link |
01:06:07.140
And lessons to be learned about who gets elected
link |
01:06:11.700
and who goes into government.
link |
01:06:14.180
Hopefully this inspires talented
link |
01:06:18.860
young people to go into government
link |
01:06:20.780
to revolutionize different aspects of it.
link |
01:06:23.580
Yeah, so that's the positive silver lining of COVID.
link |
01:06:28.660
I mean, I thought it'd be fun to ask you,
link |
01:06:30.740
I don't know if you're paying attention
link |
01:06:31.900
to the machine learning world and GPT3.
link |
01:06:37.100
So GPT3 is this language model,
link |
01:06:39.260
this neural network that
link |
01:06:41.180
has 175 billion parameters.
link |
01:06:44.860
So it's very large and it's trained
link |
01:06:47.820
in an unsupervised way on the internet.
link |
01:06:51.540
It just reads a lot of unstructured texts
link |
01:06:55.780
and it's able to generate some pretty impressive things.
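The autoregressive idea behind a model like GPT3 — predict each next token from the ones before it, then sample — can be sketched in miniature. The toy bigram sampler below is an illustration only, nothing like GPT3's actual transformer architecture; the corpus and vocabulary are invented:

```python
import random
from collections import Counter, defaultdict

# Toy stand-in for a language model: a bigram table built from a tiny corpus.
corpus = "the brain adapts and the brain rewires and the brain learns"
tokens = corpus.split()

bigrams = defaultdict(Counter)
for a, b in zip(tokens, tokens[1:]):
    bigrams[a][b] += 1  # count how often token b follows token a

def generate(start, n, seed=0):
    """Autoregressive sampling: draw each token conditioned on the previous one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        followers = bigrams.get(out[-1])
        if not followers:
            break  # dead end: no observed continuation
        words, weights = zip(*followers.items())
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the", 6))
```

A real model conditions on a long context with billions of learned parameters rather than a one-token lookup table, but the generation loop has the same shape.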
link |
01:06:59.420
The human brain compared to that has about,
link |
01:07:02.020
you know, a thousand times more synapses.
link |
01:07:05.980
People get so upset when machine learning people
link |
01:07:10.060
compare the brain and we know synapses are different.
link |
01:07:14.380
It was very different, very different.
link |
01:07:16.860
But like, do you, what do you think about GPT3?
link |
01:07:20.580
Here's what I think, here's what I think, a few things.
link |
01:07:22.660
What GPT3 is doing is extremely impressive,
link |
01:07:25.580
but it's very different from what the brain does.
link |
01:07:27.620
So it's a good impersonator, but just as one example,
link |
01:07:33.060
everybody takes a passage that GPT3 has written
link |
01:07:37.500
and they say, wow, look at this, and it's pretty good, right?
link |
01:07:40.420
But it's already gone through a filtering process
link |
01:07:42.420
of humans looking at it and saying,
link |
01:07:43.740
okay, well that's crap, that's crap, okay.
link |
01:07:45.340
Oh, here's a sentence that's pretty cool.
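That generate-then-cherry-pick loop is easy to sketch. Here the "model" is just random word salad and the "human judge" is a crude heuristic; both are stand-ins, not anyone's real pipeline:

```python
import random

def generate_candidates(n, rng):
    """Stand-in for a language model: emit n strings of random words."""
    vocab = ["the", "brain", "wires", "itself", "banana", "adapts"]
    return [" ".join(rng.choices(vocab, k=4)) for _ in range(n)]

def looks_coherent(s):
    """Toy stand-in for the human judge saying 'that's crap' or 'that's cool'."""
    return s.startswith("the ") and "banana" not in s

rng = random.Random(42)
candidates = generate_candidates(3000, rng)
kept = [c for c in candidates if looks_coherent(c)]
print(f"{len(kept)} of {len(candidates)} survived the filter")
```

The point of the sketch: the impressive passages we see are the small `kept` list, not the full `candidates` list.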
link |
01:07:47.500
Now here's the thing, human creativity
link |
01:07:49.940
is about absorbing everything around it
link |
01:07:51.740
and remixing that and coming up with stuff.
link |
01:07:53.420
So in that sense, we're sort of like GPT3,
link |
01:07:55.420
you know, we're remixing what we've gotten in before.
link |
01:07:59.700
But we also know, we also have very good models
link |
01:08:03.140
of what it is to be another human.
link |
01:08:04.780
And so, you know, I don't know if you speak French
link |
01:08:08.220
or something, but I'm not gonna start speaking in French
link |
01:08:09.820
because then you'll say, wait, what are you doing?
link |
01:08:11.500
I don't understand it.
link |
01:08:12.420
Instead, everything coming out of my mouth
link |
01:08:14.700
is meant for your ears.
link |
01:08:16.260
I know what you'll understand.
link |
01:08:18.140
I know the vocabulary that you know and don't know.
link |
01:08:20.140
I know what parts you care about.
link |
01:08:23.740
That's a huge part of it.
link |
01:08:25.140
And so of all the possible sentences I could say,
link |
01:08:29.420
I'm navigating this thin bandwidth
link |
01:08:31.700
so that it's something useful for our conversation.
link |
01:08:34.540
Yeah, in real time, but also throughout your life.
link |
01:08:36.740
I mean, we're co evolving together.
link |
01:08:39.740
We're learning how to communicate together.
link |
01:08:42.980
Exactly, but this is what GPT3 does not do.
link |
01:08:46.220
All it's doing is saying, okay,
link |
01:08:47.300
I'm gonna take all these sentences and remix stuff
link |
01:08:49.500
and pop some stuff out.
link |
01:08:51.100
But it doesn't know how to make it
link |
01:08:52.540
so that you, Lex, will feel like,
link |
01:08:54.340
oh yeah, that's exactly what I needed to hear.
link |
01:08:56.820
That's the next sentence that I needed to know about
link |
01:08:59.700
for something.
link |
01:09:00.540
Well, of course, it could be,
link |
01:09:02.860
all the impressive results we see.
link |
01:09:04.220
The question is, if you raise the number of parameters,
link |
01:09:07.700
whether it's going to be after some...
link |
01:09:09.980
It will not be.
link |
01:09:11.020
It will not be.
link |
01:09:11.860
Raising more parameters won't...
link |
01:09:14.060
Here's the thing.
link |
01:09:15.180
It's not that I think neural networks
link |
01:09:16.740
can't be like the human brain,
link |
01:09:18.100
because I suspect they will be at some point, 50 years.
link |
01:09:20.420
Who knows?
link |
01:09:21.260
But what we are missing in artificial neural networks
link |
01:09:26.020
is we've got this basic structure where you've got units
link |
01:09:29.300
and you've got synapses that are connected.
link |
01:09:32.420
And that's great.
link |
01:09:33.380
And it's done incredibly mind blowing, impressive things,
link |
01:09:35.820
but it's not doing the same algorithms as the human brain.
link |
01:09:40.260
So when I look at my children, as little kids,
link |
01:09:43.620
as infants, they can do things that no GPT3 can do.
link |
01:09:47.660
They can navigate a complex room.
link |
01:09:50.620
They can navigate social conversation with an adult.
link |
01:09:54.900
They can lie.
link |
01:09:55.740
They can do a million things.
link |
01:09:58.180
They are active thinkers in our world and doing things.
link |
01:10:03.180
And this, of course, I mean, look,
link |
01:10:04.580
we totally agree on how incredibly awesome
link |
01:10:07.820
artificial neural networks are right now,
link |
01:10:09.060
but we also know the things that they can't do well,
link |
01:10:12.780
like be generally intelligent,
link |
01:10:14.900
do all these different things.
link |
01:10:16.380
The reason about the world,
link |
01:10:17.620
efficiently learn, efficiently adapt.
link |
01:10:19.900
Exactly.
link |
01:10:20.740
But it's still the rate of improvement.
link |
01:10:23.540
It's, to me, it's possible that we'll be surprised.
link |
01:10:28.100
I agree, possible we'll be surprised.
link |
01:10:30.020
But what I would assert,
link |
01:10:33.060
and I'm glad I'm getting to say this on your podcast,
link |
01:10:36.180
we can look back at this in two years and 10 years,
link |
01:10:38.340
is that we've got to be much more sophisticated
link |
01:10:41.580
than units and synapses between them.
link |
01:10:44.740
Let me give you an example,
link |
01:10:45.660
and this is something I talk about in Livewired,
link |
01:10:47.220
is despite the amazing impressiveness,
link |
01:10:50.540
mind blowing impressiveness,
link |
01:10:52.340
computers don't have some basic things,
link |
01:10:54.580
artificial neural networks don't have some basic things
link |
01:10:56.580
that we have, like caring about relevance, for example.
link |
01:10:59.460
So as humans, we are confronted
link |
01:11:02.380
with tons of data all the time,
link |
01:11:03.900
and we only encode particular things
link |
01:11:05.620
that are relevant to us.
link |
01:11:07.740
We have this very deep sense of relevance
link |
01:11:10.060
that I mentioned earlier is based on survival
link |
01:11:11.900
at the most basic level,
link |
01:11:12.740
but then all the things about my life and your life,
link |
01:11:16.420
what's relevant to you, that we encode.
link |
01:11:19.660
This is very useful.
link |
01:11:20.500
Computers at the moment don't have that.
link |
01:11:21.980
They don't even have a yen to survive
link |
01:11:24.020
and things like that.
link |
01:11:24.860
So we filter out a bunch of the junk we don't need.
link |
01:11:27.500
We're really good at efficiently
link |
01:11:29.900
zooming in on things we need.
link |
01:11:32.580
Again, could be argued, you know,
link |
01:11:34.500
let me put on my Freud hat.
link |
01:11:35.940
Maybe it's, I mean, that's our conscious mind.
link |
01:11:42.540
There's no reason that neural networks
link |
01:11:44.060
aren't doing the same kind of filtration.
link |
01:11:46.140
I mean, in the sense of what GPT3 is doing,
link |
01:11:48.500
so there's a priming step.
link |
01:11:50.820
It's doing an essential kind of filtration
link |
01:11:53.420
when you ask it to generate tweets from,
link |
01:11:58.100
I don't know, from an Elon Musk or something like that.
link |
01:12:00.780
It's doing a filtration of it's throwing away
link |
01:12:04.060
all the parameters it doesn't need for this task.
link |
01:12:06.900
And it's figuring out how to do that successfully.
link |
01:12:09.700
And then ultimately it's not doing a very good job
link |
01:12:12.100
right now, but it's doing a lot better job
link |
01:12:14.340
than we expected.
link |
01:12:15.460
But it won't ever do a really good job.
link |
01:12:17.500
And I'll tell you why.
link |
01:12:18.340
I mean, so let's say we say,
link |
01:12:20.100
hey, produce an Elon Musk tweet.
link |
01:12:21.660
And we see like, oh, wow, it produced these three.
link |
01:12:23.980
That's great.
link |
01:12:24.820
But again, we're not seeing the 3000 produced
link |
01:12:27.500
that didn't really make any sense.
link |
01:12:28.900
It's because it has no idea what it is like to be a human.
link |
01:12:32.700
And all the things that you might want to say
link |
01:12:34.540
and all the reasons you wouldn't,
link |
01:12:35.380
like when you go to write a tweet,
link |
01:12:37.100
you might write something you think,
link |
01:12:38.140
ah, it's not gonna come off quite right
link |
01:12:39.860
in this modern political climate or whatever.
link |
01:12:41.580
Like, you know, you can change things.
link |
01:12:43.340
So.
link |
01:12:44.180
And it somehow boils down to fear of mortality
link |
01:12:46.700
and all of these human things at the end of the day,
link |
01:12:49.940
all contained with that tweeting experience.
link |
01:12:52.500
Well, interestingly, the fear of mortality
link |
01:12:55.540
is at the bottom of this,
link |
01:12:56.860
but you've got all these more things like,
link |
01:12:58.700
you know, oh, I want to,
link |
01:13:01.220
just in case the chairman of my department reads this,
link |
01:13:03.180
I want it to come off well there.
link |
01:13:04.260
Just in case my mom looks at this tweet,
link |
01:13:05.740
I want to make sure she, you know, and so on.
link |
01:13:08.220
So those are all the things that humans are able
link |
01:13:10.260
to sort of throw into the calculation.
link |
01:13:13.260
I mean.
link |
01:13:14.820
What it required, what it requires though,
link |
01:13:16.540
is having a model of your chairman,
link |
01:13:18.740
having a model of your mother,
link |
01:13:19.820
having a model of, you know,
link |
01:13:22.060
the person you want to go on a date with
link |
01:13:24.500
who might look at your tweet and so on.
link |
01:13:26.020
All these things are,
link |
01:13:27.820
you're running models of what it is like to be them.
link |
01:13:30.980
So in terms of the structure of the brain,
link |
01:13:34.540
again, this may be going into speculation land.
link |
01:13:37.140
I hope you go along with me.
link |
01:13:39.060
Yeah, of course.
link |
01:13:39.900
Yep.
link |
01:13:40.740
Is, okay, so the brain seems to be intelligent
link |
01:13:45.220
and our AI systems aren't very intelligent currently.
link |
01:13:48.380
So where do you think intelligence arises in the brain?
link |
01:13:52.860
Like what is it about the brain?
link |
01:13:55.660
So if you mean where location wise,
link |
01:13:58.780
it's no single spot.
link |
01:13:59.900
It would be equivalent to asking,
link |
01:14:01.980
I'm looking at New York city,
link |
01:14:04.820
where is the economy?
link |
01:14:06.500
The answer is you can't point to anywhere.
link |
01:14:08.180
The economy is all about the interaction
link |
01:14:09.980
of all of the pieces and parts of the city.
link |
01:14:12.180
And that's what, you know, intelligence,
link |
01:14:14.140
whatever we mean by that in the brain
link |
01:14:15.540
is interacting from everything going on at once.
link |
01:14:18.220
In terms of a structure.
link |
01:14:19.500
So we look humans are much smarter than fish,
link |
01:14:23.620
maybe not dolphins, but dolphins are mammals, right?
link |
01:14:26.700
I assert that what we mean by smarter
link |
01:14:28.580
has to do with live wiring.
link |
01:14:30.020
So what we mean when we say, oh, we're smart
link |
01:14:32.220
is, oh, we can figure out a new thing
link |
01:14:33.700
and figure out a new pathway to get where we need to go.
link |
01:14:36.740
And that's because fish are essentially coming to the table
link |
01:14:39.700
with, you know, okay, here's the hardware, go swim, mate.
link |
01:14:43.740
But we have the capacity to say,
link |
01:14:46.100
okay, look, I'm gonna absorb, oh, oh,
link |
01:14:47.580
but you know, I saw someone else do this thing
link |
01:14:49.260
and I read once that you could do this other thing
link |
01:14:51.980
and so on.
link |
01:14:52.820
So do you think there's, is there something,
link |
01:14:54.620
I know these are mysteries,
link |
01:14:56.700
but like architecturally speaking,
link |
01:15:00.140
what feature of the brain of the live wire aspect of it
link |
01:15:06.100
that is really useful for intelligence?
link |
01:15:08.140
So like, is it the ability of neurons to reconnect?
link |
01:15:15.180
Like, is there something,
link |
01:15:16.220
is there any lessons about the human brain
link |
01:15:18.180
you think might be inspiring for us
link |
01:15:21.020
to take into the artificial, into the machine learning world?
link |
01:15:26.860
Yeah, I'm actually just trying to write something up on this now
link |
01:15:29.620
called, you know, if you wanna build a robot,
link |
01:15:31.300
start with the stomach.
link |
01:15:32.460
And what I mean by that, what I mean by that is
link |
01:15:35.380
a robot has to care, it has to have hunger,
link |
01:15:37.220
it has to care about surviving, that kind of thing.
link |
01:15:40.340
Here's an example.
link |
01:15:41.260
So the penultimate chapter of my book,
link |
01:15:44.180
I titled The Wolf and the Mars Rover.
link |
01:15:46.620
And I just look at this simple comparison
link |
01:15:48.860
of you look at a wolf, it gets its leg caught in a trap.
link |
01:15:52.580
What does it do?
link |
01:15:53.420
It gnaws its leg off,
link |
01:15:55.460
and then it figures out how to walk on three legs.
link |
01:15:58.220
No problem.
link |
01:15:59.060
Now, the Mars Rover Curiosity got its front wheel stuck
link |
01:16:02.180
in some Martian soil, and it died.
link |
01:16:05.220
This project that cost billions of dollars died
link |
01:16:09.100
because it got its wheel stuck.
link |
01:16:09.980
Wouldn't it be terrific if we could build a robot
link |
01:16:12.740
that chewed off its front wheel and figured out
link |
01:16:15.260
how to operate with a slightly different body plan?
link |
01:16:17.780
That's the kind of thing that we wanna be able to build.
link |
01:16:21.020
And to get there, what we need,
link |
01:16:23.300
the whole reason the wolf is able to do that
link |
01:16:25.140
is because its motor and somatosensory systems
link |
01:16:27.940
are live wired.
link |
01:16:28.780
So it says, oh, you know what?
link |
01:16:29.980
Turns out we've got a body plan that's different
link |
01:16:31.620
than what I thought a few minutes ago,
link |
01:16:34.140
but I have a yen to survive and I care about relevance,
link |
01:16:38.820
which in this case is getting to food,
link |
01:16:40.660
getting back to my pack and so on.
link |
01:16:42.580
So I'm just gonna figure out how to operate with this.
link |
01:16:44.660
Oh, whoops, that didn't work.
link |
01:16:46.020
Oh, okay, I'm kind of getting it to work.
link |
01:16:48.500
But the Mars Rover doesn't do that.
link |
01:16:49.900
It just says, oh geez, I was preprogrammed.
link |
01:16:51.860
Four wheels, now I have three, I'm screwed.
link |
01:16:53.980
Yeah, you know, I don't know if you're familiar
link |
01:16:55.900
with a philosopher named Ernest Becker.
link |
01:16:58.180
He wrote a book called Denial of Death.
link |
01:17:00.780
And there's a few psychologists, Sheldon Solomon,
link |
01:17:03.500
I think I just spoke with him on his podcast
link |
01:17:07.260
who developed terror management theory,
link |
01:17:09.900
which is like Ernest Becker is a philosopher
link |
01:17:12.940
that basically said that fear of mortality
link |
01:17:17.220
is at the core of it.
link |
01:17:18.220
Yeah.
link |
01:17:19.060
And so I don't know if it sounds compelling as an idea
link |
01:17:23.140
that all of the civilization we've constructed
link |
01:17:26.980
is based on this, but it's.
link |
01:17:29.100
I'm familiar with his work.
link |
01:17:30.300
Here's what I think.
link |
01:17:31.220
I think that yes, fundamentally this desire to survive
link |
01:17:35.180
is at the core of it, I would agree with that.
link |
01:17:37.420
But how that expresses itself in your life
link |
01:17:40.660
ends up being very different.
link |
01:17:41.700
The reason you do what you do is, I mean,
link |
01:17:45.340
you could list the 100 reasons why you chose
link |
01:17:47.900
to write your tweet this way and that way.
link |
01:17:49.300
And it really has nothing to do with the survival part.
link |
01:17:51.340
It has to do with, you know, trying to impress fellow humans
link |
01:17:53.540
and surprise them and say something.
link |
01:17:55.260
Yeah, so many things built on top of each other,
link |
01:17:56.940
but it's fascinating to think
link |
01:17:58.820
that in artificial intelligence systems,
link |
01:18:00.900
we wanna be able to somehow engineer this drive
link |
01:18:05.420
for survival, for immortality.
link |
01:18:08.300
I mean, because as humans, we're not just about survival,
link |
01:18:11.460
we're aware of the fact that we're going to die,
link |
01:18:14.740
which is a very kind of, we're aware of like space time.
link |
01:18:17.420
Most people aren't, by the way.
link |
01:18:18.500
Aren't?
link |
01:18:19.340
Aren't.
link |
01:18:20.340
Confucius said, he said, each person has two lives.
link |
01:18:25.980
The second one begins when you realize
link |
01:18:28.060
that you have just one.
link |
01:18:29.500
Yeah.
link |
01:18:30.340
But most people, it takes a long time
link |
01:18:31.340
for most people to get there.
link |
01:18:32.700
I mean, you could argue this kind of Freudian thing,
link |
01:18:34.660
which Ernest Becker argues is they actually figured it out
link |
01:18:41.660
early on and the terror they felt
link |
01:18:44.860
was like the reason it's been suppressed.
link |
01:18:47.580
And the reason most people, when I ask them
link |
01:18:49.300
about whether they're afraid of death,
link |
01:18:50.700
they basically say no.
link |
01:18:53.180
They basically say like, I'm afraid I won't get,
link |
01:18:56.980
like submit the paper before I die.
link |
01:18:59.820
Like they kind of see, they see death
link |
01:19:01.900
as a kind of an inconvenient deadline
link |
01:19:04.780
for a particular set of, like a book you're writing.
link |
01:19:08.180
As opposed to like, what the hell?
link |
01:19:10.700
This thing ends at any moment.
link |
01:19:14.780
Like most people, as I've encountered,
link |
01:19:17.260
do not meditate on the idea that like right now
link |
01:19:20.700
you could die.
link |
01:19:21.740
Like right now, like in the next five minutes,
link |
01:19:26.380
it could be all over and, you know, meditate on that idea.
link |
01:19:29.940
I think that somehow brings you closer
link |
01:19:32.580
to like the core of the motivations
link |
01:19:36.500
and the core of the human condition.
link |
01:19:40.140
I think it might be the core, but like I said,
link |
01:19:41.300
it is not what drives us day to day.
link |
01:19:43.820
Yeah, there's so many things on top of it,
link |
01:19:45.540
but it is interesting.
link |
01:19:46.380
I mean, as the ancient poet said,
link |
01:19:49.660
death whispers at my ear, live for I come.
link |
01:19:53.340
So it's, it is certainly motivating
link |
01:19:56.100
when we think about that.
link |
01:19:58.060
Okay, I've got some deadline.
link |
01:19:59.220
I don't know exactly when it is,
link |
01:20:00.340
but I better make stuff happen.
link |
01:20:02.180
It is motivating, but I don't think,
link |
01:20:04.300
I mean, I know for just speaking for me personally,
link |
01:20:06.780
that's not what motivates me day to day.
link |
01:20:08.900
It's instead, oh, I want to get this, you know,
link |
01:20:13.380
program up and running before this,
link |
01:20:14.780
or I want to make sure my coauthor isn't mad at me
link |
01:20:17.300
because I haven't gotten this in,
link |
01:20:18.220
or I don't want to miss this grant deadline,
link |
01:20:19.580
or, you know, whatever the thing is.
link |
01:20:21.180
Yeah, it's too distant in a sense.
link |
01:20:24.060
Nevertheless, it is good to reconnect.
link |
01:20:26.540
But for the AI systems, none of that is there.
link |
01:20:31.220
Like a neural network does not fear its mortality.
link |
01:20:34.940
And that seems to be somehow
link |
01:20:37.180
fundamentally missing the point.
link |
01:20:39.660
I think that's missing the point,
link |
01:20:40.700
but I wonder, it's an interesting speculation
link |
01:20:42.420
about whether you can build an AI system
link |
01:20:43.700
that is much closer to being a human
link |
01:20:45.780
without the mortality and survival piece,
link |
01:20:48.380
but just the thing of relevance,
link |
01:20:51.140
just I care about this versus that.
link |
01:20:52.740
Right now, if you have a robot roll into the room,
link |
01:20:54.780
it's going to be frozen
link |
01:20:55.660
because it doesn't have any reason to go there versus there.
link |
01:20:57.700
It doesn't have any particular set of things
link |
01:21:02.580
about this is how I should navigate my next move
link |
01:21:05.620
because I want something.
link |
01:21:07.700
Yeah, the thing about humans
link |
01:21:10.900
is they seem to generate goals.
link |
01:21:13.740
They're like, you said livewired.
link |
01:21:15.740
I mean, it's very flexible in terms of the goals
link |
01:21:19.980
and creative in terms of the goals we generate
link |
01:21:21.780
when we enter a room.
link |
01:21:23.700
You show up to a party without a goal,
link |
01:21:25.540
usually, and then you figure it out along the way.
link |
01:21:27.900
Yes, but this goes back to the question about free will,
link |
01:21:29.980
which is when I walk into the party,
link |
01:21:33.340
if you rewound it 10,000 times,
link |
01:21:35.620
would I go and talk to that couple over there
link |
01:21:37.980
versus that person?
link |
01:21:38.980
Like, I might do this exact same thing every time
link |
01:21:41.700
because I've got some goal stack and I think,
link |
01:21:44.380
okay, well, at this party,
link |
01:21:45.820
I really want to meet these kind of people
link |
01:21:47.700
or I feel awkward or whatever my goals are.
link |
01:21:51.340
By the way, so there was something
link |
01:21:52.620
that I meant to mention earlier.
link |
01:21:54.580
If you don't mind going back,
link |
01:21:56.140
which is this, when we were talking about BCI.
link |
01:21:59.380
So I don't know if you know this,
link |
01:22:00.260
but what I'm spending 90% of my time doing now
link |
01:22:02.620
is running a company.
link |
01:22:03.820
Do you know about this?
link |
01:22:04.660
Yes, I wasn't sure what the company is involved in.
link |
01:22:08.100
Right, so. Can you talk about it?
link |
01:22:09.420
Yeah, yeah.
link |
01:22:10.820
So when it comes to the future of BCI,
link |
01:22:15.020
you can put stuff into the brain invasively,
link |
01:22:18.860
but my interest has been how you can get data streams
link |
01:22:22.060
into the brain noninvasively.
link |
01:22:24.180
So I run a company called Neosensory
link |
01:22:26.340
and what we build is this little wristband.
link |
01:22:29.620
We've built this in many different form factors.
link |
01:22:30.580
Oh, wow, that's it?
link |
01:22:31.940
Yeah, this is it.
link |
01:22:32.780
And it's got these vibratory motors in it.
link |
01:22:35.420
So these things, as I'm speaking, for example,
link |
01:22:38.220
it's capturing my voice and running algorithms
link |
01:22:41.060
and then turning that into patterns of vibration here.
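The company's actual signal-processing isn't spelled out here, but the general sensory-substitution recipe — split incoming sound into frequency bands and drive one vibration motor per band — can be sketched. The band edges, sample rate, and naive DFT below are all illustrative assumptions, not Neosensory's algorithm:

```python
import math

def band_energies(samples, sample_rate, bands):
    """Naive DFT: return the signal's energy inside each (lo, hi) Hz band."""
    n = len(samples)
    energies = []
    for lo, hi in bands:
        e = 0.0
        for k in range(n // 2):
            freq = k * sample_rate / n
            if lo <= freq < hi:
                re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(-samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                e += re * re + im * im
        energies.append(e)
    return energies

def motor_pattern(energies, levels=255):
    """Scale band energies to vibration-motor intensities, 0..levels."""
    peak = max(energies) or 1.0
    return [round(levels * e / peak) for e in energies]

# Example: a 300 Hz tone should light up the low-band motor, not the high ones.
rate = 8000
tone = [math.sin(2 * math.pi * 300 * t / rate) for t in range(256)]
bands = [(0, 1000), (1000, 2000), (2000, 4000)]
print(motor_pattern(band_energies(tone, rate, bands)))
```

A real device would use an FFT and many more bands, but the mapping from sound to skin is the same idea.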
link |
01:22:44.460
So people who are deaf, for example,
link |
01:22:48.740
learn to hear through their skin.
link |
01:22:50.780
So the information is getting up to their brain this way
link |
01:22:54.260
and they learn how to hear.
link |
01:22:55.820
So it turns out on day one, people are pretty good,
link |
01:22:58.020
like better than you'd expect at being able to say,
link |
01:23:00.580
oh, that's weird, was that a dog barking?
link |
01:23:02.540
Was that a baby crying?
link |
01:23:03.380
Was that a door knock, a doorbell?
link |
01:23:04.740
Like people are pretty good at it,
link |
01:23:06.500
but with time they get better and better
link |
01:23:09.260
and what it becomes is a new qualia.
link |
01:23:12.380
In other words, a new subjective internal experience.
link |
01:23:15.380
So on day one, they say, whoa, what was that?
link |
01:23:18.380
Oh, oh, that was the dog barking.
link |
01:23:20.620
But by three months later, they say,
link |
01:23:23.420
oh, there's a dog barking somewhere.
link |
01:23:24.540
Oh, there's the dog.
link |
01:23:25.540
That's fascinating.
link |
01:23:26.380
And by the way, that's exactly how you learn
link |
01:23:27.700
how to use your ears.
link |
01:23:29.380
So of course you don't remember this,
link |
01:23:30.540
but when you were an infant, all you have is
link |
01:25:33.100
your eardrum vibrating, which causes spikes to go down
link |
01:25:36.540
your auditory nerve, impinging on your auditory cortex.
link |
01:23:40.940
Your brain doesn't know what those mean automatically,
link |
01:23:43.060
but what happens is you learn how to hear
link |
01:23:44.820
by looking for correlations.
link |
01:23:46.420
You clap your hands as a baby,
link |
01:23:48.900
you look at your mother's mouth moving
link |
01:23:50.860
and that correlates with what's going on there.
link |
01:23:53.180
And eventually your brain says, all right,
link |
01:23:54.860
I'm just gonna summarize this as an internal experience,
link |
01:23:57.940
as a conscious experience.
link |
01:23:59.300
And that's exactly what happens here.
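The correlation-hunting idea can be shown with a toy calculation: two hypothetical event streams that line up perfectly, and one that doesn't. The streams below are invented for illustration:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equally long event streams."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

# 1 = event present in that time slot.
claps      = [1, 0, 1, 0, 1, 1, 0, 0]  # baby claps its hands
clap_sound = [1, 0, 1, 0, 1, 1, 0, 0]  # auditory spikes arrive with the claps
lamp_light = [0, 1, 1, 0, 0, 1, 0, 1]  # unrelated visual stream

print(pearson(claps, clap_sound), pearson(claps, lamp_light))
```

A stream that reliably co-varies with your own actions is worth encoding; an uncorrelated one is noise — that is the filter the infant brain is running, on a vastly larger scale.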
link |
01:24:01.260
The weird part is that you can feed data into the brain,
link |
01:24:04.140
not through the ears, but through any channel
link |
01:24:06.260
that gets there.
link |
01:24:07.180
As long as the information gets there,
link |
01:24:08.500
your brain figures out what to do with it.
link |
01:24:10.260
That's fascinating.
link |
01:24:11.340
Like expanding the set of sensors,
link |
01:24:14.740
it could be arbitrarily, yeah,
link |
01:24:19.060
it could expand arbitrarily, which is fascinating.
link |
01:24:21.220
Well, exactly.
link |
01:24:22.060
And by the way, the reason I use this skin,
link |
01:24:24.500
there's all kinds of cool stuff going on
link |
01:24:26.300
in the AR world with glasses.
link |
01:24:28.140
But the fact is your eyes are overtaxed
link |
01:24:29.940
and your ears are overtaxed
link |
01:24:30.900
and you need to be able to see and hear other stuff.
link |
01:24:33.420
But you're covered with the skin,
link |
01:24:34.660
which is this incredible computational material
link |
01:24:38.140
with which you can feed information.
link |
01:24:39.700
And we don't use our skin for much of anything nowadays.
link |
01:24:42.540
My joke in the lab is that I say,
link |
01:24:44.260
we don't call this the waist for nothing.
link |
01:24:45.820
Because originally we built this as the vest
link |
01:24:47.380
and you're passing in all this information that way.
link |
01:24:51.020
And what I'm doing here with the deaf community
link |
01:24:56.380
is what's called sensory substitution,
link |
01:24:59.020
where I'm capturing sound and I'm just replacing the ears
link |
01:25:02.420
with the skin and that works.
link |
01:25:05.900
One of the things I talk about in Livewired
link |
01:25:07.140
is sensory expansion.
link |
01:25:09.500
So what if you took something like your visual system,
link |
01:25:11.980
which picks up on a very thin slice
link |
01:25:13.500
of the electromagnetic spectrum,
link |
01:25:15.300
and you could see infrared or ultraviolet.
link |
01:25:18.460
So we've hooked that up, infrared and ultraviolet detectors,
link |
01:25:21.100
and I can feel what's going on.
link |
01:25:22.780
So just as an example, the first night I built the infrared,
link |
01:25:25.220
one of my engineers built it, the infrared detector,
link |
01:25:27.460
I was walking in the dark between two houses
link |
01:25:29.420
and suddenly I felt all this infrared radiation.
link |
01:25:31.780
I was like, where's that come from?
link |
01:25:32.860
And I just followed my wrist and I found an infrared camera,
link |
01:25:35.900
a night vision camera that was,
link |
01:25:37.780
but I immediately, oh, there's that thing there.
link |
01:25:40.420
Of course, I would have never seen it,
link |
01:25:42.060
but now it's just part of my reality.
link |
01:25:45.260
That's fascinating.
link |
01:25:46.100
Yeah, and then of course,
link |
01:25:46.940
what I'm really interested in is sensory addition.
link |
01:25:49.860
What if you could pick up on stuff
link |
01:25:51.620
that isn't even part of what we normally pick up on,
link |
01:25:55.060
like the magnetic field of the earth
link |
01:25:57.620
or Twitter or stock market or things like that.
link |
01:26:00.660
Or the, I don't know, some weird stuff
link |
01:26:02.300
like the moods of other people or something like that.
link |
01:26:04.260
Sure, now what you need is a way to measure this.
link |
01:26:06.540
So as long as there's a machine that can measure it,
link |
01:26:08.180
it's easy, it's trivial to feed this in here
link |
01:26:09.980
and you come to be, it comes to be part of your reality.
link |
01:26:14.300
It's like you have another sensor.
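Mechanically, feeding any measurable stream in really is simple: the only required step is squashing the readings into the motor's intensity range. The price ticks and range below are invented for illustration:

```python
def stream_to_intensity(value, lo, hi, levels=255):
    """Clamp a scalar reading into [lo, hi], then scale it to a motor intensity 0..levels."""
    value = max(lo, min(hi, value))
    return round(levels * (value - lo) / (hi - lo))

# Hypothetical stock ticks mapped onto a single vibration motor.
ticks = [101.2, 99.8, 107.5, 95.0, 112.3]
print([stream_to_intensity(v, 95.0, 110.0) for v in ticks])
```

Whatever the source — magnetometer, market feed, sentiment score — once it is a number per time step, the skin interface treats it the same way.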
link |
01:26:16.220
And that kind of thing is without doing like,
link |
01:26:19.140
if you look in Neuralink,
link |
01:26:20.860
I forgot how you put it, but it was eloquent,
link |
01:26:23.700
without getting, cutting into the brain, basically.
link |
01:26:26.140
Yeah, exactly, exactly.
link |
01:26:27.300
So this costs, at the moment, $399.
link |
01:26:30.660
That's not gonna kill you.
link |
01:26:32.060
Yeah, it's not gonna kill you.
link |
01:26:33.380
You just put it on and when you're done, you take it off.
link |
01:26:36.540
Yeah, and so, and the name of the company, by the way,
link |
01:26:39.180
is Neosensory for new senses, because the whole idea is.
link |
01:26:42.700
Beautiful, that's.
link |
01:26:43.540
You can, as I said, you come to the table
link |
01:26:45.940
with certain plug and play devices and then that's it.
link |
01:26:48.060
Like I can pick up on this little bit
link |
01:26:49.140
of the electromagnetic radiation,
link |
01:26:50.180
you can pick up on this little frequency band
link |
01:26:53.420
for hearing and so on, but I'm stuck there
link |
01:26:56.220
and there's no reason we have to be stuck there.
link |
01:26:57.820
We can expand our umwelt by adding new senses, yeah.
link |
01:27:01.740
What's umwelt?
link |
01:27:02.660
Oh, I'm sorry, the umwelt is the slice of reality
link |
01:27:05.740
that you pick up on.
link |
01:27:06.580
So each animal has its own umwelt.
link |
01:27:09.420
Yeah, exactly.
link |
01:27:10.780
Nice.
link |
01:27:11.620
I'm sorry, I forgot to define it before.
link |
01:27:12.460
It's such an important concept, which is to say,
link |
01:27:17.340
for example, if you are a tick,
link |
01:27:19.780
you pick up on the odor of butyric acid
link |
01:27:22.820
and you pick up on temperature, that's it.
link |
01:27:24.100
That's how you construct your reality
link |
01:27:25.740
is with those two sensors.
link |
01:27:26.860
If you are a blind echolocating bat,
link |
01:27:28.820
you're picking up on air compression waves coming back,
link |
01:27:31.500
you know, echolocation.
link |
01:27:32.540
If you are the black ghost knife fish,
link |
01:27:34.820
you're picking up on changes in the electrical field
link |
01:27:37.980
around you with electroreception.
link |
01:27:40.420
That's how they swim around
link |
01:27:41.380
and tell there's a rock there and so on.
link |
01:27:43.020
But that's all they pick up on.
link |
01:27:45.060
That's their umwelt.
link |
01:27:47.020
That's the signals they get from the world
link |
01:27:49.820
from which to construct their reality.
link |
01:27:51.300
And they can be totally different umwelts.
link |
01:27:53.540
That's fantastic.
link |
01:27:54.380
And so our human umwelt is, you know,
link |
01:27:57.460
we've got little bits that we can pick up on.
link |
01:27:59.380
One of the things I like to do with my students
link |
01:28:01.100
is talk about, imagine that you are a bloodhound dog, right?
link |
01:28:05.780
You are a bloodhound dog with a huge snout
link |
01:28:07.500
with 200 million scent receptors in it.
link |
01:28:09.380
And your whole world is about smelling.
link |
01:28:11.260
You know, you've got slits in your nostrils,
link |
01:28:13.740
you take big nosefuls of air and so on.
link |
01:28:15.300
Do you have a dog?
link |
01:28:16.460
Nope, used to.
link |
01:28:17.300
Used to, okay, right.
link |
01:28:18.140
So you know, you walk your dog around
link |
01:28:19.340
and your dog is smelling everything.
link |
01:28:21.020
The whole world is full of signals
link |
01:28:22.260
that you do not pick up on.
link |
01:28:23.820
And so imagine if you were that dog
link |
01:28:25.380
and you looked at your human master and thought,
link |
01:28:26.940
my God, what is it like to have
link |
01:28:28.100
the pitiful little nose of a human?
link |
01:28:30.420
How could you not know that there's a cat 100 yards away
link |
01:28:32.740
or that your friend was here six hours ago?
link |
01:28:34.500
And so the idea is, because we're stuck in our own umwelt,
link |
01:28:37.980
because we have these little pitiful noses,
link |
01:28:39.180
we think, okay, well, yeah, we're seeing reality,
link |
01:28:41.500
but you can have very different sorts of realities
link |
01:28:44.540
depending on the peripheral plug and play devices
link |
01:28:46.940
you're equipped with.
link |
01:28:47.780
It's fascinating to think that like,
link |
01:28:49.660
if we're being honest, probably our own umwelt
link |
01:28:52.100
is, you know, some infinitely tiny percent
link |
01:28:57.340
of the possibilities of how you can sense,
link |
01:29:01.420
quote unquote, reality, even if you could,
link |
01:29:04.380
I mean, there's a guy named Donald Hoffman, yeah,
link |
01:29:08.780
who basically says we're really far away from reality
link |
01:29:13.860
in terms of our ability to sense anything.
link |
01:29:15.980
Like we're very, we're almost like we're floating out there
link |
01:29:20.700
that's almost like completely detached
link |
01:29:22.420
to the actual physical reality.
link |
01:29:24.220
It's fascinating that we can have extra senses
link |
01:29:27.060
that could help us get a little bit closer.
link |
01:29:29.660
Exactly, and by the way, this has been the fruits
link |
01:29:33.340
of science is realizing, like, for example,
link |
01:29:36.100
you know, you open your eyes
link |
01:29:36.940
and there's the world around you, right?
link |
01:29:38.220
But of course, depending on how you calculate it,
link |
01:29:40.100
it's less than a 10 trillionth of the electromagnetic
link |
01:29:42.900
spectrum that we call visible light.
link |
01:29:45.780
The reason I say it depends,
link |
01:29:46.620
because, you know, it's actually infinite
link |
01:29:47.820
in all directions presumably.
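The "depends on how you calculate it" caveat can be made concrete: the visible fraction of the electromagnetic spectrum is a ratio whose denominator is whatever cutoffs you choose, and since the spectrum is unbounded there is no canonical choice. The bounds below are assumptions for illustration only:

```python
# Toy calculation: what fraction of an (arbitrarily bounded) EM spectrum
# is visible light? The answer depends entirely on the chosen cutoffs,
# which is exactly the caveat in the conversation.
visible_lo, visible_hi = 4.0e14, 7.9e14   # visible light in Hz (approximate)
span_lo, span_hi = 1.0, 1.0e28            # assumed cutoffs: 1 Hz up to hard gamma rays

fraction = (visible_hi - visible_lo) / (span_hi - span_lo)
print(f"visible fraction ≈ {fraction:.1e}")  # on the order of 1e-14 for these cutoffs
```

Push the upper cutoff higher and the fraction shrinks without limit — hence "less than a ten-trillionth, depending on how you calculate it."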
link |
01:29:49.220
Yeah, and so that's exactly that.
link |
01:29:51.260
And then science allows you to actually look
link |
01:29:53.620
into the rest of it.
link |
01:29:54.980
Exactly, so understanding how big the world is out there.
link |
01:29:57.140
And the same with the world of really small
link |
01:29:59.060
and the world of really large.
link |
01:30:00.460
Exactly.
link |
01:30:01.300
That's beyond our ability to sense.
link |
01:30:03.100
Exactly, and so the reason I think this kind of thing
link |
01:30:04.980
matters is because we now have an opportunity
link |
01:30:07.740
for the first time in human history to say,
link |
01:30:10.900
okay, well, I'm just gonna include other things
link |
01:30:13.060
in my umwelt.
link |
01:30:13.900
So I'm gonna include infrared radiation
link |
01:30:15.700
and have a direct perceptual experience of that.
link |
01:30:19.020
And so I'm very, you know, I mean,
link |
01:30:21.340
so, you know, I've given up my lab
link |
01:30:22.660
and I run this company 90% of my time now.
link |
01:30:25.340
That's what I'm doing.
link |
01:30:26.180
I still teach at Stanford and I'm, you know,
link |
01:30:27.740
teaching courses and stuff like that.
link |
01:30:29.660
But this is like, this is your passion.
link |
01:30:32.980
The fire is on this.
link |
01:30:35.020
Yeah, I feel like this is the most important thing
link |
01:30:37.420
that's happening right now.
link |
01:30:38.820
I mean, obviously I think that,
link |
01:30:40.100
because that's what I'm devoting my time in my life to.
link |
01:30:42.300
But I mean, it's a brilliant set of ideas.
link |
01:30:45.100
It certainly is like, it's a step in a very vibrant future,
link |
01:30:50.100
I would say.
link |
01:30:50.940
Like the possibilities there are endless.
link |
01:30:54.180
Exactly.
link |
01:30:55.020
So if you ask what I think about Neuralink,
link |
01:30:57.220
I think it's amazing what those guys are doing
link |
01:30:59.260
and working on,
link |
01:31:00.100
but I think it's not practical for almost everybody.
link |
01:31:02.820
For example, for people who are deaf, they buy this
link |
01:31:05.700
and, you know, every day we're getting tons of emails
link |
01:31:08.420
and tweets and whatever from people saying, wow,
link |
01:31:09.980
I picked up on this and then I had no idea that was a,
link |
01:31:12.500
I didn't even know that was happening out there.
link |
01:31:14.700
And they're coming to hear, by the way,
link |
01:31:16.820
this is, you know,
link |
01:31:18.700
less than a 10th of the price
link |
01:31:20.860
of a hearing aid and like 250 times less
link |
01:31:23.340
than a cochlear implant.
link |
01:31:25.220
That's amazing.
link |
01:31:27.100
People love hearing about what, you know,
link |
01:31:30.660
brilliant folks like yourself could recommend
link |
01:31:33.900
in terms of books.
link |
01:31:35.020
Of course, you're an author of many books.
link |
01:31:36.980
So I'll, in the introduction,
link |
01:31:38.780
mention all the books you've written.
link |
01:31:40.300
People should definitely read Livewired.
link |
01:31:42.500
I've gotten a chance to read some of it and it's amazing.
link |
01:31:44.820
But is there three books, technical, fiction,
link |
01:31:48.300
philosophical that had an impact on you
link |
01:31:52.100
when you were younger or today and books,
link |
01:31:56.380
perhaps some of which you would want to recommend
link |
01:31:59.580
that others read?
link |
01:32:01.380
You know, as an undergraduate,
link |
01:32:02.460
I majored in British and American literature.
link |
01:32:04.140
That was my major because I love literature.
link |
01:32:06.780
I grew up with literature.
link |
01:32:08.660
My father had these extensive bookshelves.
link |
01:32:10.500
And so I grew up in the mountains in New Mexico.
link |
01:32:13.860
And so that was mostly how I spent my time, reading books.
link |
01:32:16.100
But, you know, I love, you know, Faulkner, Hemingway.
link |
01:32:23.660
I love many South American authors,
link |
01:32:26.100
Gabriel Garcia Marquez and Italo Calvino.
link |
01:32:28.220
I would actually recommend Invisible Cities.
link |
01:32:29.860
I just, I loved that book by Italo Calvino.
link |
01:32:33.140
Sorry, it's a book of fiction.
link |
01:32:37.020
Anthony Dorr wrote a book called
link |
01:32:39.100
All the Light We Cannot See,
link |
01:32:41.020
which actually was inspired by Incognito,
link |
01:32:44.140
by exactly what we were talking about earlier
link |
01:32:45.940
about how you can only see a little bit of the,
link |
01:32:48.740
what we call visible light in the electromagnetic radiation.
link |
01:32:51.460
I wrote about this in Incognito,
link |
01:32:52.780
and then he reviewed Incognito for the Washington Post.
link |
01:32:54.780
Oh no, that's awesome.
link |
01:32:56.020
And then he wrote this book called,
link |
01:32:57.580
the book has nothing to do with that,
link |
01:32:58.700
but that's where the title comes from.
link |
01:33:00.860
All the Light We Cannot See
link |
01:33:01.700
is about the rest of the spectrum.
link |
01:33:02.940
But the, that's an absolutely gorgeous book.
link |
01:33:08.340
That's a book of fiction.
link |
01:33:09.420
Yeah, it's a book of fiction.
link |
01:33:10.500
What's it about?
link |
01:33:12.620
It takes place during World War II
link |
01:33:14.340
about these two young people,
link |
01:33:15.420
one of whom is blind and yeah.
link |
01:33:18.260
Anything else?
link |
01:33:19.820
So what, any, so you mentioned Hemingway?
link |
01:33:21.940
I mean.
link |
01:33:22.780
Old Man and the Sea, what's your favorite?
link |
01:33:27.860
The Snows of Kilimanjaro.
link |
01:33:29.380
Oh wow, okay.
link |
01:33:30.220
It's a collection of short stories that I love.
link |
01:33:31.660
As far as nonfiction goes,
link |
01:33:33.300
I grew up with Cosmos,
link |
01:33:35.780
both watching the PBS series and then reading the book,
link |
01:33:38.180
and that influenced me a huge amount in terms of what I do.
link |
01:33:41.260
I, from the time I was a kid,
link |
01:33:42.980
I felt like I want to be Carl Sagan.
link |
01:33:44.940
Like, I just, that's what I loved.
link |
01:33:46.140
And in the end, I just, you know,
link |
01:33:47.940
I studied space physics for a while as an undergrad,
link |
01:33:50.980
but then I, in my last semester,
link |
01:33:53.560
discovered neuroscience,
link |
01:33:55.460
and I just thought, wow, I'm hooked on that.
link |
01:33:57.540
So the Carl Sagan of the brain.
link |
01:34:01.420
That was my aspiration.
link |
01:34:02.260
Is the aspiration.
link |
01:34:03.700
I mean, you're doing an incredible job of it.
link |
01:34:07.900
So you open the book Livewired with a quote by Heidegger.
link |
01:34:11.900
Every man is born as many men and dies as a single one.
link |
01:34:17.700
Well, what did you mean by it?
link |
01:34:20.020
I'll tell you what I meant by it.
link |
01:34:21.620
So he had his own reason why he was writing that,
link |
01:34:23.740
but I meant this in terms of brain plasticity,
link |
01:34:25.740
in terms of livewiring,
link |
01:34:26.860
which is this issue that I mentioned before
link |
01:34:28.500
about this, you know, this cone,
link |
01:34:29.900
the space time cone that we are in,
link |
01:34:31.780
which is that when you dropped into the world,
link |
01:34:35.740
you, Lex, had all this different potential.
link |
01:34:37.900
You could have been a great surfer
link |
01:34:40.400
or a great chess player or a,
link |
01:34:42.260
you could have been thousands of different men
link |
01:34:45.460
when you grew up,
link |
01:34:46.460
but what you did is things that were not your choice
link |
01:34:49.300
and your choice along the way.
link |
01:34:50.620
You know, you ended up navigating a particular path
link |
01:34:52.760
and now you're exactly who you are.
link |
01:34:54.140
You used to have lots of potential,
link |
01:34:55.300
but the day you die, you will be exactly Lex.
link |
01:34:59.100
You will be that one person, yeah.
link |
01:35:01.580
So on that, in that context,
link |
01:35:03.580
I mean, first of all, it's just a beautiful,
link |
01:35:06.580
it's a humbling picture, but it's a beautiful one
link |
01:35:09.940
because it's all the possible trajectories
link |
01:35:12.700
and you pick one and you walk down that road
link |
01:35:14.540
and it's the Robert Frost poem.
link |
01:35:16.260
But on that topic, let me ask the biggest
link |
01:35:18.860
and the most ridiculous question.
link |
01:35:21.380
So in this livewired brain,
link |
01:35:23.300
when we choose all these different trajectories
link |
01:35:25.100
and end up with one, what's the meaning of it all?
link |
01:35:27.940
What's, is there a why here?
link |
01:35:32.180
What's the meaning of life?
link |
01:35:34.100
Yeah.
link |
01:35:34.940
David Eagleman.
link |
01:35:36.380
That's it.
link |
01:35:37.220
I mean, this is the question that everyone has attacked
link |
01:35:42.940
from their own life or point of view,
link |
01:35:45.300
by which I mean, culturally,
link |
01:35:47.300
if you grew up in a religious society,
link |
01:35:49.180
you have one way of attacking that question.
link |
01:35:51.100
So if you grew up in a secular or scientific society,
link |
01:35:53.300
you have a different way of attacking that question.
link |
01:35:55.460
Obviously, I don't know, I abstain on that question.
link |
01:35:59.660
Yeah.
link |
01:36:00.860
I mean, I think one of the fundamental things,
link |
01:36:03.340
I guess, in that, in all those possible trajectories
link |
01:36:06.260
is you're always asking.
link |
01:36:09.260
I mean, that's the act of asking
link |
01:36:11.540
what the heck is this thing for,
link |
01:36:14.220
is equivalent to, or at least runs in parallel
link |
01:36:18.460
to all the choices that you're making.
link |
01:36:20.860
Cause it's kind of, that's the underlying question.
link |
01:36:23.700
Well, that's right.
link |
01:36:24.540
And by the way, you know,
link |
01:36:25.500
this is the interesting thing about human psychology.
link |
01:36:27.820
You know, we've got all these layers of things
link |
01:36:29.460
at which we can ask questions.
link |
01:36:30.900
And so if you keep asking yourself the question about,
link |
01:36:33.740
what is the optimal way for me to be spending my time?
link |
01:36:36.500
What should I be doing?
link |
01:36:37.340
What charity should I get involved with and so on?
link |
01:36:39.060
If you're asking those big questions
link |
01:36:42.500
that steers you appropriately,
link |
01:36:44.740
if you're the type of person who never asks,
link |
01:36:46.540
hey, is there something better I can be doing with my time,
link |
01:36:48.900
then presumably you won't optimize
link |
01:36:51.060
whatever it is that is important to you.
link |
01:36:53.740
So you've, I think just in your eyes, in your work,
link |
01:36:58.060
there's a passion that just is obvious and it's inspiring.
link |
01:37:03.060
It's contagious.
link |
01:37:04.380
What, if you were to give advice to us,
link |
01:37:09.180
a young person today,
link |
01:37:10.700
in the crazy chaos that we live today about life,
link |
01:37:14.660
about how to discover their passion,
link |
01:37:20.820
is there some words that you could give?
link |
01:37:24.620
First of all, I would say the main thing
link |
01:37:26.220
for a young person is stay adaptable.
link |
01:37:29.500
And this is back to this issue of why COVID
link |
01:37:31.980
is useful for us because it forces us off our tracks.
link |
01:37:35.420
The fact is the jobs that will exist 20 years from now,
link |
01:37:39.020
we don't even have names for them.
link |
01:37:40.100
We can't even imagine the jobs that are gonna exist.
link |
01:37:42.540
And so when young people that I know go into college
link |
01:37:44.980
and they say, hey, what should I major in and so on,
link |
01:37:47.460
college is and should be less and less vocational,
link |
01:37:50.660
as in, oh, I'm gonna learn how to do this
link |
01:37:52.300
and then I'm gonna do that the rest of my career.
link |
01:37:54.100
The world just isn't that way anymore
link |
01:37:55.620
with the exponential speed of things.
link |
01:37:57.820
So the important thing is learning how to learn,
link |
01:38:00.460
learning how to be livewired and adaptable.
link |
01:38:03.540
That's really key.
link |
01:38:04.580
And what I advise young people when I talk to them is,
link |
01:38:09.540
what you digest, that's what gives you the raw storehouse
link |
01:38:13.660
of things that you can remix and be creative with.
link |
01:38:17.380
And so eat broadly and widely.
link |
01:38:21.220
And obviously this is the wonderful thing
link |
01:38:23.020
about the internet world we live in now
link |
01:38:24.620
is you kind of can't help it.
link |
01:38:25.580
You're constantly, whoa.
link |
01:38:27.140
You go down some rabbit hole of Wikipedia
link |
01:38:28.900
and you think, oh, I didn't even realize that was a thing.
link |
01:38:31.140
I didn't know that existed.
link |
01:38:32.540
And so.
link |
01:38:33.740
Embrace that.
link |
01:38:34.580
Embrace that, yeah, exactly.
link |
01:38:36.220
And what I tell people is just always do a gut check
link |
01:38:39.900
about, okay, I'm reading this paper
link |
01:38:41.500
and yeah, I think that, but this paper, wow,
link |
01:38:44.260
that really, I really cared about that in some way.
link |
01:38:47.700
I tell them just to keep a real sniff out for that.
link |
01:38:50.500
And when you find those things, keep going down those paths.
link |
01:38:54.020
Yeah, don't be afraid.
link |
01:38:55.020
I mean, that's one of the challenges and the downsides
link |
01:38:58.260
of having so many beautiful options
link |
01:39:00.060
is that sometimes people are a little bit afraid
link |
01:39:02.860
to really commit, but that's very true.
link |
01:39:05.620
If there's something that just sparks your interest
link |
01:39:09.140
and passion, just run with it.
link |
01:39:10.860
I mean, that's, it goes back to the Heidegger quote.
link |
01:39:14.500
I mean, we only get this one life
link |
01:39:16.220
and that trajectory, it doesn't last forever.
link |
01:39:20.140
So just if something sparks your imagination,
link |
01:39:23.540
your passion, run with it.
link |
01:39:24.940
Yeah, exactly.
link |
01:39:26.380
I don't think there's a more beautiful way to end it.
link |
01:39:29.940
David, it's a huge honor to finally meet you.
link |
01:39:32.620
Your work is inspiring so many people.
link |
01:39:34.860
I've talked to so many people who are passionate
link |
01:39:36.340
about neuroscience, about the brain, even outside
link |
01:39:39.220
that read your book.
link |
01:39:40.900
So I hope you keep doing so.
link |
01:39:43.700
I think you're already there with Carl Sagan.
link |
01:39:46.100
I hope you continue growing.
link |
01:39:48.340
Yeah, it was an honor talking with you today.
link |
01:39:50.060
Thanks so much.
link |
01:39:50.900
Great, you too, Lex, wonderful.
link |
01:39:53.380
Thanks for listening to this conversation
link |
01:39:54.900
with David Eagleman, and thank you to our sponsors,
link |
01:39:58.100
Athletic Greens, BetterHelp, and Cash App.
link |
01:40:01.500
Click the sponsor links in the description
link |
01:40:03.860
to get a discount and to support this podcast.
link |
01:40:07.340
If you enjoy this thing, subscribe on YouTube,
link |
01:40:09.620
review it with Five Stars on Apple Podcast,
link |
01:40:11.860
follow on Spotify, support on Patreon,
link |
01:40:14.660
or connect with me on Twitter at Lex Fridman.
link |
01:40:18.380
And now let me leave you with some words
link |
01:40:20.100
from David Eagleman in his book,
link |
01:40:21.620
Sum: Forty Tales from the Afterlives.
link |
01:40:25.020
Imagine for a moment that we are nothing but
link |
01:40:28.020
the product of billions of years of molecules
link |
01:40:30.380
coming together and ratcheting up through natural selection.
link |
01:40:35.020
That we are composed only of highways of fluids
link |
01:40:37.380
and chemicals sliding along roadways
link |
01:40:39.940
within billions of dancing cells.
link |
01:40:42.540
That trillions of synaptic conversations hum in parallel,
link |
01:40:46.140
that this vast, egg-like fabric of micro-thin circuitry
link |
01:40:50.220
runs algorithms undreamt of in modern science,
link |
01:40:54.300
and that these neural programs give rise to
link |
01:40:56.660
our decision making, loves, desires, fears, and aspirations.
link |
01:41:02.900
To me, understanding this would be a numinous experience,
link |
01:41:07.540
better than anything ever proposed in any holy text.
link |
01:41:11.940
Thank you for listening and hope to see you next time.