
David Eagleman: Neuroplasticity and the Livewired Brain | Lex Fridman Podcast #119



link |
00:00:00.000
The following is a conversation with David Eagleman, a neuroscientist and one of the
link |
00:00:04.600
great science communicators of our time, exploring the beauty and mystery of the human brain.
link |
00:00:10.920
He is the author of a lot of amazing books about the human mind, and his new one is called
link |
00:00:15.920
LiveWired.
link |
00:00:18.040
LiveWired is a work of ten years on a topic that is fascinating to me, which is neuroplasticity
link |
00:00:24.600
or the malleability of the human brain.
link |
00:00:27.840
Quick summary of the sponsors, Athletic Greens, BetterHelp, and Cash App.
link |
00:00:32.880
Click the sponsor links in the description to get a discount and to support this podcast.
link |
00:00:37.880
As a side note, let me say that the adaptability of the human mind at the biological, chemical,
link |
00:00:44.440
cognitive, psychological, and even sociological levels is the very thing that captivated me
link |
00:00:50.800
many years ago when I first began to wonder how I would engineer something like it in
link |
00:00:56.280
the machine.
link |
00:00:57.280
The open question today in the 21st century is what are the limits of this adaptability?
link |
00:01:03.800
As new, smarter and smarter devices and AI systems come to life, or as better and better
link |
00:01:09.560
brain computer interfaces are engineered, will our brain be able to adapt, to catch
link |
00:01:14.120
up, to excel?
link |
00:01:16.520
I personally believe yes, that we are far from reaching the limitation of the human mind
link |
00:01:21.480
and the human brain, just as we are far from reaching the limitations of our computational
link |
00:01:27.000
systems.
link |
00:01:28.000
If you enjoy this thing, subscribe on YouTube, review it with 5 stars on Apple Podcast,
link |
00:01:33.520
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.
link |
00:01:40.160
As usual, I'll do a few minutes of ads now and no ads in the middle.
link |
00:01:43.360
I try to make these interesting, but I give you timestamps so you can skip.
link |
00:01:48.000
But please do check out the sponsors by clicking the links in the description.
link |
00:01:51.800
It's the best way to support this podcast.
link |
00:01:55.160
This show is brought to you by Athletic Greens, the all in one daily drink to support better
link |
00:02:00.480
health and peak performance.
link |
00:02:02.880
Even with a balanced diet, it's difficult to cover all your nutritional bases.
link |
00:02:07.200
That's where Athletic Greens will help.
link |
00:02:09.240
Their daily drink is like nutritional insurance for your body that's delivered straight to your
link |
00:02:13.960
door.
link |
00:02:15.460
As you may know, I fast often, sometimes intermittent fasting for 16 hours, sometimes
link |
00:02:20.800
24 hours, dinner to dinner, sometimes more.
link |
00:02:24.480
I break the fast with Athletic Greens.
link |
00:02:26.640
It's delicious, refreshing, just makes me feel good.
link |
00:02:30.960
I think it's like 50 calories, less than a gram of sugar, but has a ton of nutrients
link |
00:02:35.440
to make sure my body has what it needs, despite what I'm eating.
link |
00:02:40.280
Go to athleticgreens.com/lex to claim a special offer of a free vitamin D3K2 for a year.
link |
00:02:49.240
If you listen to the Joe Rogan experience, you might have listened to him rant about
link |
00:02:53.280
how awesome vitamin D is for your immune system.
link |
00:02:56.240
So there you have it.
link |
00:02:57.680
So click the athleticgreens.com/lex link in the description to get the free stuff and to
link |
00:03:03.960
support this podcast.
link |
00:03:06.520
This show is sponsored by BetterHelp, spelled H-E-L-P, help.
link |
00:03:11.480
Check it out at betterhelp.com/lex.
link |
00:03:14.280
They figure out what you need and match you with a licensed professional therapist in
link |
00:03:17.680
under 48 hours.
link |
00:03:18.680
It's not a crisis line, it's not self help, it's professional counseling done securely
link |
00:03:24.080
online.
link |
00:03:25.080
I'm a bit from the David Goggins line of creatures and so have some demons to contend with, usually
link |
00:03:31.040
on long runs or all-nighters full of self-doubt.
link |
00:03:34.920
I think suffering is essential for creation, but you can suffer beautifully in a way that
link |
00:03:40.000
doesn't destroy you.
link |
00:03:41.720
For most people, I think a good therapist can help on this.
link |
00:03:45.400
So it's at least worth a try.
link |
00:03:47.280
Check out their reviews, they're good, it's easy, private, affordable, available worldwide.
link |
00:03:52.720
You can communicate by text any time and schedule a weekly audio and video session.
link |
00:03:58.440
Check it out at betterhelp.com/lex.
link |
00:04:02.520
This show is presented by Cash App, the number one finance app in the App Store.
link |
00:04:06.440
When you get it, use code LEXPODCAST.
link |
00:04:09.480
Cash App lets you send money to friends, buy Bitcoin, and invest in the stock market with
link |
00:04:12.960
as little as $1.
link |
00:04:14.480
Since Cash App allows you to buy Bitcoin, let me mention that cryptocurrency in the context
link |
00:04:19.120
of the history of money is fascinating.
link |
00:04:21.800
I recommend The Ascent of Money as a great book on this history.
link |
00:04:25.480
Debits and credits on ledgers started around 30,000 years ago, and the first decentralized
link |
00:04:31.080
cryptocurrency released just over 10 years ago.
link |
00:04:34.280
So given that history, cryptocurrency is still very much in its early days of development,
link |
00:04:39.400
but it's still aiming to and just might redefine the nature of money.
link |
00:04:44.960
So again, if you get Cash App from the App Store or Google Play and use code LEXPODCAST,
link |
00:04:49.880
you get $10 and Cash App will also donate $10 to FIRST, an organization that is helping
link |
00:04:55.360
to advance robotics and STEM education for young people around the world.
link |
00:05:00.520
And now, here's my conversation with David Eagleman.
link |
00:05:05.900
You have a new book coming out on the changing brain.
link |
00:05:10.440
Can you give a high level overview of the book?
link |
00:05:13.280
It's called LiveWired, by the way.
link |
00:05:14.800
Yeah.
link |
00:05:15.800
The thing is, we typically think about the brain in terms of the metaphors we already
link |
00:05:19.080
have like hardware and software, that's how we build all our stuff, but what's happening
link |
00:05:24.000
in the brain is fundamentally so different.
link |
00:05:26.240
So I coined this new term LiveWire, which is a system that's constantly reconfiguring
link |
00:05:32.000
itself physically as it learns and adapts to the world around it.
link |
00:05:37.160
It's physically changing.
link |
00:05:38.720
So it's LiveWire meaning like hardware, but changing.
link |
00:05:43.280
Yeah, exactly.
link |
00:05:44.680
Well, the hardware and the software layers are blended.
link |
00:05:47.600
And so typically, engineers are praised for their efficiency and making something really
link |
00:05:54.400
clean and clear.
link |
00:05:55.400
Okay, here's the hardware layer, then I'm going to run software on top of it, and there's
link |
00:05:57.800
all sorts of universality that you get out of a piece of hardware like that that's useful.
link |
00:06:02.880
But what the brain is doing is completely different, and I am so excited about where
link |
00:06:07.560
this is all going because I feel like this is where our engineering will go.
link |
00:06:14.440
So currently, we build all our devices a particular way, but I can't tear half the circuitry out
link |
00:06:20.400
of your cell phone and expect it to still function.
link |
00:06:23.960
But you can do that with the brain.
link |
00:06:27.560
So just as an example, kids who are under about seven years old can get one half of their
link |
00:06:32.680
brain removed.
link |
00:06:33.680
It's called a hemispherectomy.
link |
00:06:36.040
And they're fine.
link |
00:06:37.040
They have a slight limp on the other side of their body, but they can function just fine
link |
00:06:40.880
that way.
link |
00:06:41.880
And this is generally true, sometimes children are born without a hemisphere.
link |
00:06:46.440
And their visual system rewires so that everything is on the single remaining hemisphere.
link |
00:06:53.240
What thousands of cases like this teach us is that it's a very malleable system that
link |
00:06:59.720
is simply trying to accomplish the tasks in front of it by rewiring itself with the available
link |
00:07:05.520
real estate.
link |
00:07:06.520
How much of that is a quirk or a feature of evolution?
link |
00:07:11.320
How hard is it to engineer?
link |
00:07:12.520
Because evolution took a lot of work.
link |
00:07:15.720
Trillions of organisms had to die to create this thing we have in our skull.
link |
00:07:22.240
As you said, you kind of look forward to the idea that we might be engineering our systems
link |
00:07:27.440
like this in the future, like creating livewired systems.
link |
00:07:30.840
How hard do you think is it to create systems like that?
link |
00:07:33.120
Great question.
link |
00:07:34.120
It has proven itself to be a difficult challenge.
link |
00:07:37.200
What I mean by that is even though it's taken evolution a really long time to get where
link |
00:07:42.120
it is now, all we have to do now is peek at the blueprints.
link |
00:07:47.760
It's just three pounds, this organ, and we just have to figure out how to do it.
link |
00:07:50.920
That's the part that I mean is a difficult challenge because there are tens of thousands
link |
00:07:56.240
of neuroscientists.
link |
00:07:57.240
We're all poking and prodding and trying to figure this out, but it's an extremely complicated
link |
00:08:00.280
system.
link |
00:08:01.280
But it's only going to be complicated until we figure out the general principles.
link |
00:08:05.920
Exactly like if you had a magic camera and you could look inside the nucleus of a cell
link |
00:08:10.840
and you'd see hundreds of thousands of things moving around or whatever, and then you take
link |
00:08:15.040
a step back and watch and say, oh, you're just trying to maintain the order of the base pairs
link |
00:08:18.680
and all the rest is details.
link |
00:08:20.600
Then it simplifies it and we come to understand something.
link |
00:08:24.000
That was my goal in Livewired, which I've written over 10 years, by the way, is to try to distill
link |
00:08:27.880
things down to the principles of what plastic systems are trying to accomplish.
link |
00:08:34.480
But to even just linger, you said it's possible to be born with just one hemisphere and you
link |
00:08:39.040
still are able to function.
link |
00:08:42.120
First of all, just a pause on that.
link |
00:08:44.320
That's amazing.
link |
00:08:47.920
I don't know if people quite... I mean, you hear things here and there.
link |
00:08:51.840
This is why I'm really excited about your book, because I don't know if there's definitive
link |
00:08:57.800
popular sources to think about this stuff.
link |
00:09:01.880
There's a lot of... I think from my perspective, what I heard is there's been debates over
link |
00:09:07.320
decades about how much neuroplasticity there is in the brain and so on.
link |
00:09:13.520
People have learned a lot of things and now it's converging towards an understanding that
link |
00:09:17.560
it's much more plastic than people realized.
link |
00:09:21.680
Just linger on that topic, how malleable is the hardware of the human brain?
link |
00:09:28.600
Maybe, as you said, with children, and at each stage of life.
link |
00:09:32.960
Here's the whole thing.
link |
00:09:33.960
I think part of the confusion about plasticity has been that there are studies at all sorts
link |
00:09:38.760
of different ages and then people might read that from a distance and they think, oh, well,
link |
00:09:44.160
Fred didn't recover when half his brain was taken out and so clearly you're not plastic,
link |
00:09:49.040
but then you do it with a child and they are plastic.
link |
00:09:54.720
Part of my goal here was to pull together the tens of thousands of papers on this, both
link |
00:09:59.200
from clinical work and from all the way down to the molecular and understand what are the
link |
00:10:04.360
principles here.
link |
00:10:05.360
The principles are that plasticity diminishes.
link |
00:10:08.400
That's no surprise.
link |
00:10:09.400
By the way, maybe I should just define plasticity.
link |
00:10:11.640
It's the ability of a system to mold into a new shape and then hold that shape.
link |
00:10:16.480
That's why we make things that we call plastic because they are moldable and they can hold
link |
00:10:22.840
that new shape like a plastic toy or something.
link |
00:10:26.000
Maybe we'll use a lot of terms that are synonymous.
link |
00:10:31.080
Something is plastic, something is malleable, changing, live wire, the name of the book is
link |
00:10:37.840
like...
link |
00:10:38.840
Yeah, exactly right, but I'll tell you why I chose livewired instead of plasticity.
link |
00:10:43.040
I used the term plasticity in the book, but sparingly because that was a term coined by
link |
00:10:50.040
William James over 100 years ago and he was, of course, very impressed with plastic manufacturing
link |
00:10:55.720
that you could mold something in shape and then it holds that, but that's not what's
link |
00:10:59.560
actually happening in the brain.
link |
00:11:01.560
It's constantly rewiring your entire life.
link |
00:11:03.720
You never hit an endpoint.
link |
00:11:06.480
The whole point is for it to keep changing.
link |
00:11:08.840
Even in the few minutes of conversation that we've been having, your brain is changing,
link |
00:11:12.600
my brain is changing.
link |
00:11:15.680
Next time I see your face, I will remember, oh yeah, that time Lex and I sat together
link |
00:11:19.800
and we did these things.
link |
00:11:21.400
I wonder if your brain will have a Lex thing going on for the next few months.
link |
00:11:25.480
It'll stay there until you get rid of it because it's useful for now.
link |
00:11:29.440
No, I'll probably never get rid of it.
link |
00:11:30.920
Let's say for some circumstance, you and I don't see each other for the next 35 years.
link |
00:11:34.400
When I run into you, I'll be like, oh yeah.
link |
00:11:36.480
That looks familiar.
link |
00:11:37.480
Yeah, we sat down for a podcast back when there were podcasts.
link |
00:11:43.160
Back when we lived outside virtual reality.
link |
00:11:46.600
Exactly.
link |
00:11:47.600
So you chose livewired instead of plastic.
link |
00:11:50.320
Exactly, because plastic implies, I mean, it's the term that's used in the field and
link |
00:11:54.440
so that's why we need to use it still for a while, but yeah, it implies something gets
link |
00:11:59.040
mold into shape and then holds that shape forever, but in fact, the whole system is
link |
00:12:02.400
completely changing.
link |
00:12:03.400
And then back to how malleable is the human brain at each stage of life.
link |
00:12:08.920
So just at a high level, how malleable is it?
link |
00:12:14.000
So yes, and plasticity diminishes, but one of the things that I felt like I was able
link |
00:12:20.280
to put together for myself after reading thousands of papers on this issue is that different
link |
00:12:25.560
parts of the brain have different plasticity windows.
link |
00:12:30.920
So for example, with the visual cortex, that cements itself into place pretty quickly over
link |
00:12:36.040
the course of a few years.
link |
00:12:37.960
And I argue that's because of the stability of the data.
link |
00:12:41.120
In other words, what you're getting in from the world, you've got a certain number of
link |
00:12:44.800
angles, colors, shapes, you know, it's essentially the world is visually stable.
link |
00:12:50.060
So that hardens around that data.
link |
00:12:52.560
As opposed to let's say the somatosensory cortex, which is the part that's taking information
link |
00:12:56.480
from your body or the motor cortex right next to it, which is what drives your body.
link |
00:13:00.400
The fact is bodies are always changing.
link |
00:13:01.920
You get taller over time, you get fatter, thinner over time, you might break a leg and
link |
00:13:07.000
have to limp for a while, stuff like that.
link |
00:13:08.680
So because the data there is always changing, by the way, you might get on a bicycle, you
link |
00:13:12.920
might get a surfboard, things like that.
link |
00:13:15.880
Because that data is always changing, that stays more malleable.
link |
00:13:19.400
And when you look through the brain, you find that, you know, how stable
link |
00:13:25.000
the data is determines how fast something hardens into place.
link |
00:13:28.120
But the point is, different parts of the brain harden into place at different times.
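To make that idea concrete, here is a minimal toy sketch in Python. It's my illustration, not a model from the book: a toy "region" tracks its input with a learning rate that anneals only while prediction errors stay small, so stable data (visual-like) hardens the region while drifting data (body-like) keeps it malleable. All names and constants here are invented for the demo.

```python
# Toy sketch: stable data hardens a region; changing data keeps it plastic.
# Illustrative only; the constants are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

def simulate_region(input_drift, steps=5000, anneal=0.001):
    """Return how much plasticity (learning rate) remains after `steps`,
    given how much the input statistics drift per step."""
    plasticity = 1.0   # start fully malleable
    world = 0.0        # the true statistic of the incoming data
    estimate = 0.0     # the region's internal model of that statistic
    for _ in range(steps):
        world += rng.normal(0.0, input_drift)   # world statistics drift
        sample = world + rng.normal(0.0, 0.1)   # noisy observation
        error = sample - estimate
        estimate += plasticity * error          # adapt the internal model
        # Plasticity anneals fastest when errors are small, i.e. when the
        # data is stable; persistent surprise keeps the region malleable.
        plasticity *= 1.0 - anneal / (1.0 + abs(error))
    return plasticity

print("stable, visual-like input: ", simulate_region(input_drift=0.0))   # hardens
print("drifting, body-like input: ", simulate_region(input_drift=0.05))  # stays plastic
```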
link |
00:13:31.680
Do you think it's possible that depending on how much data you get on different sensors,
link |
00:13:38.960
that it stays more malleable longer?
link |
00:13:41.560
So like, you know, if you look at different cultures, different experiences, like, if you keep
link |
00:13:46.600
your eyes closed, or maybe you're blind, I don't know, but let's say you keep your eyes
link |
00:13:50.260
closed for your entire life, then the visual cortex might be much less malleable.
link |
00:13:57.720
So the reason I bring that up is like, you know, maybe we'll talk about brain, computer
link |
00:14:02.760
interfaces a little bit down the line, but, you know, like, is this, is the malleability
link |
00:14:09.800
a genetic thing, or is it more about the data, like you said, that comes in?
link |
00:14:14.800
So the malleability itself is a genetic thing.
link |
00:14:18.040
The big trick that Mother Nature discovered with humans is make a system that's really
link |
00:14:22.600
flexible, as opposed to most other creatures to different degrees.
link |
00:14:28.120
So if you take an alligator, it's born, its brain does the same thing every generation.
link |
00:14:34.240
If you compare an alligator 100,000 years ago to an alligator now, they're essentially
link |
00:14:37.400
the same.
link |
00:14:39.920
We on the other hand, as humans drop into a world with a half baked brain, and what
link |
00:14:44.560
we require is to absorb the culture around us, and the language and the beliefs and the
link |
00:14:49.760
customs and so on, that's what Mother Nature has done with us, and it's been a tremendously
link |
00:14:56.520
successful trick we've taken over the whole planet as a result of this.
link |
00:15:00.200
So that's an interesting point.
link |
00:15:01.200
I mean, just to linger on it, this is a nice feature, like if you were to design
link |
00:15:06.400
a thing to survive in this world, do you put it at age zero already equipped to deal with
link |
00:15:13.480
the world in a like hard coded way, or do you put it, do you make it malleable and just
link |
00:15:18.960
throw it in, take the risk that you're maybe going to die, but you're going to learn a
link |
00:15:24.120
lot in the process.
link |
00:15:25.120
And if you don't die, you'll learn a hell of a lot to be able to survive in the environment.
link |
00:15:29.360
So this is the experiment that Mother Nature ran, and it turns out that for better or worse,
link |
00:15:34.400
we've won.
link |
00:15:35.400
I mean, yeah, we put other animals into zoos, and we, yeah, that's right.
link |
00:15:38.560
AI might do better.
link |
00:15:39.560
Okay, fair enough.
link |
00:15:40.720
That's true.
link |
00:15:41.800
And maybe what the trick Mother Nature did is just the stepping stone to AI.
link |
00:15:46.760
So that's a beautiful feature of the human brain that is malleable.
link |
00:15:51.200
But let's, on the topic of Mother Nature, what do we start with?
link |
00:15:56.520
Like how blank is the slate?
link |
00:15:58.920
So it's not actually a blank slate.
link |
00:16:01.160
What it is, is terrific engineering that's set up in there, but much of that engineering
link |
00:16:06.960
has to do with, okay, just make sure that things get to the right place.
link |
00:16:10.240
For example, the fibers from the eye getting to the visual cortex, or all this
link |
00:16:14.240
very complicated machinery in the ear getting to the auditory cortex and so on.
link |
00:16:17.720
So things, first of all, there's that.
link |
00:16:19.960
And then what we also come equipped with is the ability to absorb language and culture
link |
00:16:25.000
and beliefs and so on.
link |
00:16:27.120
So you're already set up for that.
link |
00:16:28.640
So no matter what you're exposed to, you will absorb some sort of language.
link |
00:16:33.160
That's the trick is how do you engineer something just enough that it's then a sponge that's
link |
00:16:37.440
ready to take in and fill in the blanks?
link |
00:16:40.160
How much of the malleability is hardware?
link |
00:16:42.480
How much of software?
link |
00:16:43.480
How much of that distinction is useful at all in the brain?
link |
00:16:45.160
So like, what are we talking about?
link |
00:16:46.960
So there's like, there's neurons, there's synapses and all kinds of different synapses
link |
00:16:53.880
and there's electrical signals and there's chemical communication
link |
00:16:58.920
at the synapses.
link |
00:17:03.520
What I would say the software would be the timing and the nature of the electrical signals,
link |
00:17:10.920
I guess, and the hardware would be the actual synapses.
link |
00:17:14.240
So here's the thing.
link |
00:17:15.240
This is why really, if we can, I want to get away from the hardware and software metaphor
link |
00:17:19.440
because what happens is as activity passes through the system, it changes things.
link |
00:17:25.160
Now the thing that computer engineers are really used to thinking about is synapses where
link |
00:17:30.360
two neurons connect.
link |
00:17:31.360
Of course, each neuron connects with 10,000 of its neighbors.
link |
00:17:33.760
But at a point where they connect, what we're all used to thinking about is the changing
link |
00:17:38.280
of the strength of that connection, the synaptic weight.
link |
00:17:43.520
But in fact, everything is changing.
link |
00:17:45.640
The receptor distribution inside that neuron so that you're more or less sensitive to the
link |
00:17:50.800
neurotransmitter, then the structure of the neuron itself and what's happening there
link |
00:17:55.460
all the way down to biochemical cascades inside the cell, all the way down to the nucleus.
link |
00:18:00.480
And for example, the epigenome, which is the, you know, these little proteins that are
link |
00:18:05.200
attached to the DNA that cause conformational changes, that cause more genes to be expressed
link |
00:18:10.880
or repressed, all of these things are plastic.
link |
00:18:15.080
The reason that most people only talk about the synaptic weights is because that's really
link |
00:18:20.360
all we can measure well.
link |
00:18:22.160
And all this other stuff is really, really hard to see with our current technology.
link |
00:18:25.000
So essentially that just gets ignored.
link |
00:18:27.440
But in fact, the system is plastic at all these different levels and my way of thinking
link |
00:18:34.160
about this is an analogy to pace layers.
link |
00:18:39.000
So pace layers is a concept that Stewart Brand suggested for how to think about cities.
link |
00:18:44.160
So you have fashion, which changes rapidly in cities.
link |
00:18:47.560
You have governance, which changes more slowly.
link |
00:18:52.960
You have the structure, the buildings of a city, which changes more slowly all the way
link |
00:18:56.580
down to nature.
link |
00:18:58.800
You've got all these different layers of things that are changing at different paces at different
link |
00:19:02.320
speeds.
link |
00:19:03.320
I've taken that idea and mapped it onto the brain, which is to say you have some biochemical
link |
00:19:08.880
cascades that are just changing really rapidly when something happens, all the way down to
link |
00:19:12.040
things that are more and more cemented in there.
link |
00:19:15.240
And this is actually, this actually allows us to understand a lot about particular kinds
link |
00:19:20.880
of things that happen.
link |
00:19:21.880
For example, one of the oldest, probably the oldest rule in neurology is called Ribot's
link |
00:19:26.040
law, which is that older memories are more stable than newer memories.
link |
00:19:30.200
So when you get old and demented, you'll be able to remember things from your young life.
link |
00:19:36.800
Maybe you'll remember this podcast, but you won't remember what you did a month ago or
link |
00:19:40.320
a year ago.
link |
00:19:42.040
And this is a very weird structure, right?
link |
00:19:43.960
No other system works this way where older memories are more stable than newer memories.
link |
00:19:49.040
But it's because through time, things get more and more cemented into deeper layers
link |
00:19:55.080
of the system.
link |
00:19:58.320
And so this is, I think, the way we have to think about the brain, not as, okay, you've
link |
00:20:02.560
got neurons, you've got synaptic weights, and that's it.
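As a toy illustration of the pace-layers picture and Ribot's law, here is a sketch (mine, not anything from the book): each memory lives in a fast trace that fades and a slow trace that gradually cements, and a simulated insult that wipes the fast layer spares the oldest, most consolidated memories. The layer counts and rates are made-up parameters.

```python
# Toy pace layers: a fast trace that fades and a slow trace that cements.
# Illustrative only; the constants are arbitrary.
import numpy as np

n_memories = 5
fast = np.zeros(n_memories)   # fast layer: quick to write, quick to fade
slow = np.zeros(n_memories)   # slow layer: cements a little on every rehearsal

for t in range(500):
    i = t // 100              # memory i is laid down during its 100-step epoch
    fast[i] = 1.0             # the fresh trace lives in the fast layer
    slow += 0.002 * fast      # each step cements a bit into the deeper layer
    fast *= 0.995             # unrehearsed fast traces fade

fast[:] = 0.0                 # "injury": scramble the fast layer only
for i, s in enumerate(np.clip(slow, 0.0, 1.0)):
    print(f"memory {i} (learned around t={i*100}): retained strength {s:.2f}")
# Older memories (small i) had more time to consolidate into the slow layer,
# so they survive the insult best; the newest memories are hit hardest.
```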
link |
00:20:05.360
So yeah.
link |
00:20:06.360
So the idea of liveware and livewired is that it's a gradual spectrum between software
link |
00:20:17.400
and hardware.
link |
00:20:18.400
And so the metaphor completely breaks down because when you talk about software
link |
00:20:23.480
and hardware, it's really hard lines.
link |
00:20:26.400
I mean, of course, software is unlike hardware, so there's the two groups.
link |
00:20:34.320
But in the software world, there's levels of abstraction,
link |
00:20:38.800
right?
link |
00:20:39.800
There's the operating system, there's machine code, and then it gets higher and higher levels.
link |
00:20:44.600
But somehow that's actually fundamentally different than the layers of abstractions
link |
00:20:48.800
in the hardware.
link |
00:20:49.800
But in the brain, it's all the same, like the city metaphor.
link |
00:20:55.360
I mean, yeah, it's kind of mind blowing because it's hard to know what to think about that.
link |
00:21:01.520
Like if I were to ask the question, this is important question for machine learning is,
link |
00:21:07.600
how does the brain learn?
link |
00:21:09.880
So essentially, you're saying that, I mean, it just learns on all of these different levels
link |
00:21:17.040
at all different paces.
link |
00:21:19.480
Exactly right.
link |
00:21:20.480
And as a result, what happens is as you practice something, get good at something, you're physically
link |
00:21:25.320
changing the circuitry, you're adapting your brain around the thing that is relevant to
link |
00:21:31.160
you.
link |
00:21:32.160
So let's say you take up, do you know how to surf?
link |
00:21:34.600
Nope.
link |
00:21:35.600
Okay, great.
link |
00:21:36.600
So let's say you take up surfing.
link |
00:21:37.600
Yeah.
link |
00:21:38.600
Now, at this age, what happens is, you'll be terrible at first, you don't know how to
link |
00:21:41.480
operate your body, you don't know how to read the waves, things like that.
link |
00:21:44.080
And through time, you get better and better.
link |
00:21:45.720
What you're doing is you're burning that into the actual circuitry of your brain.
link |
00:21:48.600
You're, of course, conscious when you're first doing it, you're thinking about, okay,
link |
00:21:51.640
what am I doing?
link |
00:21:52.640
Where's my body weight?
link |
00:21:53.760
But eventually, when you become a pro at it, you are not conscious of it at all.
link |
00:21:57.240
In fact, you can't even unpack what it is that you did.
link |
00:22:00.320
Think about riding a bicycle.
link |
00:22:02.040
You can't describe how you're doing it.
link |
00:22:04.360
You're just doing it, changing your balance when you do this to come to a stop, and so on.
link |
00:22:08.200
So this is what we're constantly doing, is actually shaping our own circuitry based
link |
00:22:14.560
on what is relevant for us.
link |
00:22:16.000
Survival, of course, being the top thing that's relevant.
link |
00:22:18.880
But interestingly, especially with humans, we have these particular goals in our lives,
link |
00:22:23.720
under science, neuroscience, whatever.
link |
00:22:25.720
And so we actually shape our circuitry around that.
link |
00:22:28.320
I mean, you mentioned this gets slower and slower with age, but, I think I've
link |
00:22:34.240
read about, and spoken offline, even on this podcast, with a developmental neurobiologist, I guess
link |
00:22:41.480
that would be the right terminology, about looking at the very early stages, from embryonic stem
link |
00:22:47.160
cells to the creation of the brain.
link |
00:22:50.560
And that's mind blowing, how much stuff happens there.
link |
00:22:55.160
So it's very malleable at that stage.
link |
00:23:00.520
But after that, at which point does it stop being malleable?
link |
00:23:04.720
So that's the interesting thing, is that it remains malleable your whole life.
link |
00:23:08.560
So even when you're an old person, you'll be able to remember new faces and names.
link |
00:23:13.200
You'll be able to learn new sorts of tasks.
link |
00:23:16.040
And thank goodness, because the world is changing rapidly in terms of technology and so on.
link |
00:23:19.480
I just sent my mother an Alexa and she figured out how to go into the settings and do the thing
link |
00:23:23.800
and I was really impressed by it, that she was able to do it.
link |
00:23:27.160
So there are parts of the brain that remain malleable their whole life.
link |
00:23:30.640
The interesting part is that really your goal is to make an internal model of the world.
link |
00:23:36.200
Your goal is to say, okay, the brain is trapped in silence and darkness and it's trying to
link |
00:23:42.800
understand how the world works out there, right?
link |
00:23:45.720
I love that image.
link |
00:23:47.200
Yeah, I guess it is.
link |
00:23:48.720
You forget.
link |
00:23:49.720
You forget.
link |
00:23:50.720
It's like this lonely thing is sitting in its own container and trying to, through
link |
00:23:56.400
a few sensors, figure out what the hell's going on.
link |
00:23:58.920
You know what I sometimes think about is that movie, The Martian with Matt Damon, the movie
link |
00:24:06.520
poster shows Matt Damon all alone on the red planet and I think, God, that's actually what
link |
00:24:11.400
it's like to be inside your head and my head and anybody's head is that you're essentially
link |
00:24:18.560
on your own planet in there and I'm essentially on my own planet and everyone's got their
link |
00:24:22.380
own world where you've absorbed all of your experiences up to this moment in your life
link |
00:24:28.000
that made you exactly who you are and same for me and everyone.
link |
00:24:33.560
And we've got this very thin bandwidth of communication and I'll say something like,
link |
00:24:38.840
oh yeah, that tastes just like peaches and you'll say, oh, I know what you mean.
link |
00:24:43.160
But the experience, of course, might be vastly different for us.
link |
00:24:47.920
But anyway, yes.
link |
00:24:48.920
So the brain is trapped in silence and darkness, each one of us.
link |
00:24:52.040
And what it's trying to do, this is the important part, is trying to make an internal model
link |
00:24:55.720
of what's going on out there as in how do I function in the world?
link |
00:24:59.000
How do I interact with other people?
link |
00:25:00.960
Do I say something nice or polite or do I say something aggressive and mean?
link |
00:25:04.160
Do I, you know, all these things that it's putting together about the world.
link |
00:25:08.880
And I think what happens when people get older and older, it may not be that plasticity is
link |
00:25:14.880
diminishing.
link |
00:25:15.880
It may be that their internal model essentially has set itself up in a way where it says,
link |
00:25:20.320
OK, I've pretty much got a really good understanding of the world now and I don't really need to
link |
00:25:23.840
change, right?
link |
00:25:25.560
So when old, when much older people find themselves in a situation where they need to change,
link |
00:25:30.920
they actually are able to do it.
link |
00:25:33.480
It's just that I think this notion that we all have that plasticity diminishes as we
link |
00:25:37.240
grow older is in part because the motivation isn't there.
link |
00:25:42.160
But if you were 80 and you get fired from your job and suddenly had to figure out how
link |
00:25:45.920
to program a WordPress site or something, you'd figure it out.
link |
00:25:49.200
Got it.
link |
00:25:50.200
So the capability, the possibility of change is there.
link |
00:25:53.400
But let me ask about the hardest challenge, the interesting challenge to this plasticity,
link |
00:26:01.080
to this liveware system.
link |
00:26:03.600
If we could talk about brain computer interfaces and Neuralink, what are your thoughts about
link |
00:26:09.520
the efforts of Elon Musk, Neuralink, BCI in general, in this regard, which is adding
link |
00:26:16.360
a machine, a computer, the capability of a computer to communicate with the brain and
link |
00:26:22.920
the brain to communicate with the computer at the very basic applications and then like
link |
00:26:27.360
the futuristic kind of thoughts.
link |
00:26:29.080
Yeah.
link |
00:26:30.080
First of all, it's terrific that people are jumping into doing that because it's clearly
link |
00:26:32.960
the future.
link |
00:26:34.520
The interesting part is our brains have pretty good methods of interacting with technology.
link |
00:26:39.000
So maybe it's your fat thumbs on a cell phone or something, or maybe it's watching
link |
00:26:43.680
a YouTube video and getting it into your eye that way.
link |
00:26:46.000
But we have pretty rapid ways of communicating with technology and getting data.
link |
00:26:50.160
So if you actually crack open the skull and go into the inner sanctum of the brain, you
link |
00:26:56.760
might be able to get a little bit faster.
link |
00:26:58.360
But I'll tell you, I'm not so sanguine on the future of that as a business and I'll tell
link |
00:27:06.960
you why. It's because there are various ways of getting data in and out, and open head
link |
00:27:12.920
surgery is a big deal.
link |
00:27:14.680
Neurosurgeons don't want to do it because there's always risk of death and infection
link |
00:27:18.440
on the table.
link |
00:27:19.960
And also it's not clear how many people would say I'm going to volunteer to get something
link |
00:27:24.480
in my head so that I can text faster, you know, 20% faster.
link |
00:27:29.880
So I think, you know, Mother Nature surrounds the brain with this armored, you know, bunker
link |
00:27:36.480
of the skull because it's a very delicate material.
link |
00:27:39.640
And there's an expression in neurosurgery that, you know, the person is
link |
00:27:46.880
never the same after you open up their skull.
link |
00:27:49.000
Now, whether or not that's true or whatever, who cares, but it's a big deal to do an open
link |
00:27:53.480
head surgery.
link |
00:27:54.480
So what I'm interested in is how can we get information in and out of the brain without
link |
00:27:58.520
having to crack the skull open?
link |
00:28:00.760
Got it.
link |
00:28:01.760
Without messing with the biological part, like directly connecting to or disturbing the intricate
link |
00:28:08.120
biological thing that we've got going on, which seems to be working.
link |
00:28:11.640
Yeah, exactly.
link |
00:28:12.640
And by the way, where Neuralink is going, which is wonderful, is going to be in patient
link |
00:28:17.680
cases, it really matters for all kinds of surgeries that a person needs, whether for
link |
00:28:22.000
Parkinson's or epilepsy or whatever.
link |
00:28:24.080
It's a terrific new technology for essentially sewing electrodes in there and getting more
link |
00:28:28.600
higher density of electrodes.
link |
00:28:29.960
So that's great.
link |
00:28:30.960
I just don't think as far as the future of BCI goes, I don't suspect that people will
link |
00:28:37.600
go in and say, yeah, drill a hole in my head and do that.
link |
00:28:40.600
Well, it's interesting because I think there's a similar intuition, like in the world
link |
00:28:45.160
of autonomous vehicles, that folks know how hard it is and it seems damn impossible.
link |
00:28:51.560
There's a similar intuition about, I'm sticking on the Elon Musk thing just as a good, easy
link |
00:28:55.960
example.
link |
00:28:56.960
There's a similar intuition about colonizing Mars, if you really think about it, it seems
link |
00:29:01.960
extremely difficult and almost, I mean, just technically difficult to a degree where you
link |
00:29:10.760
want to ask, is it really worth doing, worth trying?
link |
00:29:14.960
And then the same applies with BCI, but the thing about the future is it's hard to
link |
00:29:22.560
predict.
link |
00:29:23.560
So the exciting thing to me is, once it's successful and able to
link |
00:29:29.960
help patients, it may be able to discover something very surprising about our ability
link |
00:29:37.680
to directly communicate with the brain.
link |
00:29:39.680
So exactly what you're interested in is figuring out how to play with this malleable brain,
link |
00:29:46.800
but like help assist it somehow.
link |
00:29:49.400
I mean, it's such a compelling notion to me that we're now working on all these exciting
link |
00:29:54.320
machine learning systems that are able to learn from data.
link |
00:30:00.120
And then if we can have this other brain that's a learning system that's livewired on the
link |
00:30:08.800
human side, and for them to be able to communicate, it's like the self play mechanism that was able to
link |
00:30:14.560
beat the world champion at Go, so they can play with each other, the computer and the brain,
link |
00:30:21.560
like when you sleep.
link |
00:30:22.560
I mean, there's a lot of futuristic kind of things that it's just exciting possibilities,
link |
00:30:27.720
but I hear you, we understand so little about the actual intricacies of the communication
link |
00:30:34.000
of the brain that it's hard to find the common language.
link |
00:30:38.680
Well, interestingly, the technologies that have been built don't actually require the
link |
00:30:47.080
perfect common language.
link |
00:30:48.400
So for example, hundreds of thousands of people are walking around with artificial ears and
link |
00:30:52.440
artificial eyes, meaning cochlear implants or retinal implants.
link |
00:30:57.040
So this is, you know, you take essentially a digital microphone, you slip an electrode
link |
00:31:01.200
strip into the inner ear, and people can learn how to hear that way, or you take an
link |
00:31:05.640
electrode grid and you plug it into the retina at the back of the eye, and people can learn
link |
00:31:10.040
how to see that way.
link |
00:31:11.200
The interesting part is those devices don't speak exactly the natural biological language,
link |
00:31:17.080
they speak the dialect of Silicon Valley.
link |
00:31:20.440
And it turns out that as recently as about 25 years ago, a lot of people thought this
link |
00:31:25.160
was never going to work.
link |
00:31:26.560
They thought it wasn't going to work for that reason, but the brain figures it out.
link |
00:31:30.400
It's really good at saying, okay, look, there's some correlation between what I can touch
link |
00:31:34.080
and feel and hear and so on, and the data that's coming in or between, you know, I
link |
00:31:38.700
clap my hands and I have signals coming in there, and it figures out how to speak any
link |
00:31:43.000
language.
link |
00:31:44.000
Oh, that's fascinating.
link |
00:31:45.080
So like no matter if it's Neuralink, so directly communicating with the brain or it's a smartphone
link |
00:31:52.880
or Google Glass, the brain figures out the efficient way of communication.
link |
00:31:58.760
Well, exactly, exactly.
link |
00:32:00.520
And what I propose is the potato head theory of evolution, which is that our eyes and nose
link |
00:32:07.120
and mouth and ears and fingertips, all this stuff is just plug and play.
link |
00:32:10.960
And the brain can figure out what to do with the data that comes in.
link |
00:32:14.360
And part of the reason that I think this is right, and I care so deeply about this, is
link |
00:32:19.120
when you look across the animal kingdom, you find all kinds of weird peripheral devices
link |
00:32:22.100
plugged in, and the brain figures out what to do with the data.
link |
00:32:25.800
And I don't believe that Mother Nature has to reinvent the principles of brain operation
link |
00:32:31.480
each time, to say, oh, now I'm going to have heat pits to detect infrared, now I'm going
link |
00:32:34.720
to have, you know, electroreceptors on the body, now I'm going to have
link |
00:32:39.640
something to pick up the magnetic field of the earth with cryptochromes in the eye.
link |
00:32:44.080
And so instead, the brain says, oh, I got it, there's data coming in.
link |
00:32:47.280
Is that useful?
link |
00:32:48.280
Can I do something with it?
link |
00:32:49.280
Oh, great.
link |
00:32:50.280
I'm going to mold myself around the data that's coming in.
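That "mold myself around whatever data comes in" claim can be sketched in a few lines. This is purely illustrative, not a model of any real implant: the device below emits data in an arbitrary random encoding, standing in for whatever "dialect" a cochlear or retinal implant happens to speak, and a generic error-driven learner recovers the underlying signal anyway by correlating the stream with a quantity it knows through another channel. All names and values are assumptions for the demo.

```python
# Toy sensory substitution: decode an arbitrary device "dialect" by
# correlating it with cross-modal ground truth. Illustrative only.
import numpy as np

rng = np.random.default_rng(3)
dim = 16
dialect = rng.normal(size=dim)   # the device's arbitrary encoding, unknown to the learner

weights = np.zeros(dim)
for step in range(3000):
    world = rng.uniform(-1, 1)                            # e.g. loudness, also felt by touch
    stream = dialect * world + rng.normal(0, 0.05, dim)   # the implant's data stream
    guess = weights @ stream
    # Error-driven update: nudge the decoder toward whatever mapping makes
    # the stream predict the independently known quantity.
    weights += 0.02 * (world - guess) * stream

print("decoded:", weights @ (dialect * 0.7))  # about 0.7, whatever the dialect was
```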
link |
00:32:52.280
It's kind of fascinating to think that we think of smartphones and all this new technology
link |
00:32:57.640
as novel, as totally novel, as outside of what evolution ever intended, or like what
link |
00:33:04.000
nature ever intended, it's fascinating to think that like, the entirety of the process
link |
00:33:08.960
of evolution is perfectly fine and ready for the smartphone, and the internet, like it's
link |
00:33:14.840
ready, it's ready to be valuable to that, and whatever comes next, to cyborgs, to virtual
link |
00:33:20.520
reality, we kind of think like, this is, you know, there's all these like books written
link |
00:33:24.720
about natural, what's natural, and we're like destroying our natural selves by like embracing
link |
00:33:31.200
all this technology. It's kind of, you know, we're probably not giving the brain enough
link |
00:33:35.880
credit, like, this thing is just fine with new tech.
link |
00:33:40.200
Oh, exactly.
link |
00:33:41.200
It wraps itself around it, and by the way, wait till you have kids, you'll see the ease
link |
00:33:44.480
with which they pick up on stuff.
link |
00:33:46.440
Yeah.
link |
00:33:47.440
Yeah, as Kevin Kelly said, technology is what gets invented after you're born.
link |
00:33:54.560
But the stuff that already exists when you're born, that's not even tech, that's just background
link |
00:33:57.680
furniture, like the fact that the iPad exists for my son and daughter, like that's just
link |
00:34:01.400
background furniture.
link |
00:34:02.400
So, yeah, it's, because we have this incredibly malleable system, it just absorbs whatever
link |
00:34:09.560
is going on in the world and learns what to do with it.
link |
00:34:11.800
So do you think, just to linger on it for a little bit more, do you think it's
link |
00:34:17.280
possible to co-adjust? Like, you know, for the machine to adjust to
link |
00:34:27.280
the brain, for the brain to adjust to the machine, I guess that's what's already happening.
link |
00:34:31.040
Sure.
link |
00:34:32.040
That is what's happening.
link |
00:34:33.040
So, for example, when you put electrodes in the motor cortex to control a robotic arm
link |
00:34:37.080
for somebody who's paralyzed, the engineers do a lot of work to figure out, okay, what
link |
00:34:41.400
can we do with the algorithm here so that we can detect what's going on from these cells
link |
00:34:45.600
and figure out how to best program the robotic arm to move, given the data that we're measuring
link |
00:34:50.960
from these cells.
link |
00:34:52.120
But also, the brain is learning too.
link |
00:34:54.640
So, you know, the paralyzed woman says, wait, I'm trying to grab this thing.
link |
00:34:59.000
And by the way, it's all about relevance.
link |
00:35:01.000
So if there's a piece of food there and she's hungry, she'll figure out how to get this food
link |
00:35:07.400
into her mouth with the robotic arm, because that is what matters.
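Here's a minimal sketch of that two-sided loop. It's an illustration of the concept, not how any real BCI is implemented: the machine runs a simple least-mean-squares decoder on simulated firing rates, while the "brain" simultaneously re-tunes its encoding, and the reach error falls as both sides adapt. Cell counts and learning rates are made up.

```python
# Toy closed-loop BCI co-adaptation: decoder and "brain" both learn.
# Illustrative only; all parameters are invented.
import numpy as np

rng = np.random.default_rng(2)
n_cells = 20

encoder = rng.normal(size=n_cells)          # how the brain maps intent -> firing rates
decoder = 0.01 * rng.normal(size=n_cells)   # how the machine maps firing rates -> arm

for trial in range(2001):
    intent = rng.uniform(-1, 1)                             # desired 1-D arm position
    rates = encoder * intent + rng.normal(0, 0.1, n_cells)  # noisy recorded activity
    arm = decoder @ rates                                   # robotic arm command
    error = intent - arm
    decoder += 0.01 * error * rates              # machine side: LMS decoder update
    encoder += 0.05 * error * decoder * intent   # brain side: plasticity re-tunes encoding
    if trial % 500 == 0:
        print(f"trial {trial:4d}: reach error {abs(error):.3f}")
```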
link |
00:35:11.840
Well, okay, first of all, that paints a really promising and beautiful,
link |
00:35:17.680
for some reason, really optimistic picture that, you know, our brain is able to adjust
link |
00:35:24.160
to so much that, you know, so many things happen this year, 2020, that you think, like,
link |
00:35:30.720
how are we ever going to deal with it.
link |
00:35:33.480
It's somehow encouraging and inspiring that, like, we're going to be okay.
link |
00:35:40.720
Well, that's right.
link |
00:35:41.720
I actually think so. 2020 has been an awful year for almost everybody in many ways.
link |
00:35:47.240
But the one silver lining has to do with brain plasticity, which is to say, we've all been
link |
00:35:52.520
on our, you know, our gerbil wheels, we've all been in our routines, and, you know,
link |
00:35:58.040
as I mentioned, our internal models are all about how do you maximally succeed?
link |
00:36:02.360
How do you optimize your operation in this circumstance where you are, right?
link |
00:36:07.440
And then all of a sudden, bang, 2020 comes, we're completely off our wheels, we're having
link |
00:36:12.160
to create new things all the time and figure out how to do it.
link |
00:36:16.240
And that is terrific for brain plasticity because, and we know this because there are
link |
00:36:21.840
very large studies on older people who stay cognitively active their whole lives.
link |
00:36:28.200
Some fraction of them have Alzheimer's disease physically, but nobody knows that
link |
00:36:33.360
when they're alive, even though their brain is getting chewed up with the ravages of
link |
00:36:37.240
Alzheimer's, cognitively they're doing just fine.
link |
00:36:40.120
Why?
link |
00:36:41.120
It's because they're challenged all the time.
link |
00:36:44.200
They've got all these new things going on, all this novelty, all these responsibilities,
link |
00:36:47.400
chores, social life, all these things happening.
link |
00:36:49.800
And as a result, they're constantly building new roadways, even as parts degrade.
link |
00:36:54.880
And that's the one piece of good news, that we are in a situation where suddenly
link |
00:36:59.360
we can't just operate like automata anymore, we have to think of completely new ways to
link |
00:37:04.440
do things.
link |
00:37:05.440
And that's wonderful.
link |
00:37:07.840
I don't know why this question popped into my head.
link |
00:37:11.240
It's quite absurd, but are we going to be okay?
link |
00:37:17.480
You said this, it's a promising silver lining. Just from your own view, because you've written
link |
00:37:20.760
about this and thought about this outside of maybe even the plasticity of the brain,
link |
00:37:25.400
but just this whole pandemic kind of changed things, knocked us out of this hamster
link |
00:37:33.400
wheel of habit.
link |
00:37:35.560
A lot of people had to reinvent themselves, unfortunately, and I have a lot of friends
link |
00:37:42.160
who either already have or are going to lose their business, you know, it's basically
link |
00:37:49.800
taking the dreams that people have had and saying, this particular
link |
00:37:56.680
dream you've had will no longer be possible.
link |
00:37:59.440
So you have to find something new.
link |
00:38:02.440
What are your thoughts? Are we going to be okay?
link |
00:38:06.000
Yeah, we'll be okay in the sense that, I mean, it's going to be a rough time for many or
link |
00:38:11.040
most people, but in the sense that it is sometimes useful to find that what you thought was your
link |
00:38:19.760
dream was not the thing that you're going to do.
link |
00:38:24.760
This is obviously the plot in lots of Hollywood movies that someone says, I'm going to do
link |
00:38:27.920
this, and then that gets foiled and they end up doing something better.
link |
00:38:31.200
But this is true in life, I mean, in general, even though we plan our lives as best we can,
link |
00:38:39.640
it's predicated on our notion of, okay, given everything that's around me, this is what's
link |
00:38:44.160
possible for me next.
link |
00:38:47.200
But it takes 2020 to knock you off that, where you think, oh, well, actually, maybe
link |
00:38:51.040
there's something I can be doing that's bigger, that's better.
link |
00:38:53.960
Yeah, you know, for me, one exciting thing, and I just talked to Grant Sanderson, I don't
link |
00:38:59.760
know if you know who he is, he's 3Blue1Brown, it's a YouTube channel.
link |
00:39:03.880
He does, he's a, if you see it, you would recognize it.
link |
00:39:08.360
He's like a really famous math guy, and he's a math educator, and he does these incredible
link |
00:39:13.520
beautiful videos.
link |
00:39:15.160
And now I see sort of at MIT, folks are struggling to try to figure out, you know, if we do teach
link |
00:39:20.800
remotely, how do we do it effectively?
link |
00:39:23.680
You have these world class researchers and professors trying to figure out how to put
link |
00:39:31.160
content online that teaches people.
link |
00:39:34.000
And to me, a possible future of that is, you know, Nobel Prize winning faculty become
link |
00:39:42.440
YouTubers.
link |
00:39:43.440
Like, like that, that to me is so exciting, like what Grant said, which is like the possibility
link |
00:39:50.800
of creating canonical videos on the thing you're a world expert in, you know, there's
link |
00:39:55.360
so many topics the world doesn't have covered, you know, there's faculty, I mentioned Tedrake,
link |
00:40:02.200
there's all these people in robotics that are experts in a particular beautiful field,
link |
00:40:07.960
on which there's only just papers.
link |
00:40:11.080
There's no popular book, there's no clean canonical video showing the beauty
link |
00:40:17.080
of a subject.
link |
00:40:18.360
And one possibility is they try to create that and share it with the world.
link |
00:40:25.560
This is the beautiful thing.
link |
00:40:26.560
This, of course, has been happening for a while already.
link |
00:40:28.760
I mean, for example, when I go and I give book talks, often what will happen is some
link |
00:40:33.000
13 year old will come up to me afterwards and say something, and I'll say, my God, that
link |
00:40:36.640
was so smart.
link |
00:40:37.640
Like, how did you know that?
link |
00:40:38.640
And they'll say, oh, I saw it on a TED talk.
link |
00:40:40.680
Well, what an amazing opportunity.
link |
00:40:43.000
We got the best person in the world on subject X giving a 15 minute talk as beautifully as
link |
00:40:50.560
he or she can.
link |
00:40:52.080
And the 13 year old just grows up with that.
link |
00:40:53.560
That's just the mother's milk, right?
link |
00:40:55.280
As opposed to when we grew up, you know, I had whatever homeroom teacher I had and, you
link |
00:41:01.680
know, whatever classmates I had and hopefully that person knew what he or she was teaching
link |
00:41:06.320
and often didn't and, you know, just made things up.
link |
00:41:08.880
So the opportunity now has become extraordinary to get the best of the world.
link |
00:41:14.480
And the reason this matters, of course, is because obviously, back to plasticity, the
link |
00:41:18.880
way that we, the way our brain gets molded is by absorbing everything from the world,
link |
00:41:24.480
all of the knowledge and the data and so on that it can get.
link |
00:41:28.680
And then springboarding off of that.
link |
00:41:33.160
And we're in a very lucky time now because we grew up with a lot of just in case learning.
link |
00:41:40.440
So you know, just in case you ever need to know these dates in Mongolian history, here
link |
00:41:43.760
they are.
link |
00:41:44.760
But what kids are grown up with now, like my kids, is tons of just in time learning.
link |
00:41:48.840
So as soon as they're curious about something, they ask Alexa, they ask Google Home, they
link |
00:41:51.880
get the answer right there in the context of the curiosity.
link |
00:41:54.640
The reason this matters is because for plasticity to happen, you need to care, you need to
link |
00:42:00.680
be curious about something.
link |
00:42:02.760
And this is something, by the way, that the ancient Romans had noted.
link |
00:42:06.400
They had outlined seven different levels of learning and the highest level is when you're
link |
00:42:09.280
curious about a topic.
link |
00:42:10.720
But anyway, so kids now are getting tons of just in time learning.
link |
00:42:15.080
And as a result, they're going to be so much smarter than we are.
link |
00:42:18.080
They're just, and we can already see that.
link |
00:42:19.800
I mean, my boy is eight years old, my girl is five.
link |
00:42:22.360
But I mean, the things that he knows are amazing because it's not just him having to do the
link |
00:42:27.840
rote memorization stuff that we did.
link |
00:42:29.960
Yeah, it's just fascinating to think what young brains look like now, because of
link |
00:42:33.840
all those TED Talks just loaded in there.
link |
00:42:37.080
And there's also, I mean, a lot of people, right, kind of, there's a sense that our attention
link |
00:42:42.560
span is growing shorter.
link |
00:42:45.000
But it's complicated because, for example, most people, the majority of people, like
link |
00:42:51.200
80 plus percent of people listen to the entirety of these things, two, three hours for podcasts,
link |
00:42:57.080
and long form podcasts are becoming more and more popular.
link |
00:43:01.040
So like, it's all a really giant, complicated mess.
link |
00:43:04.280
And the point is that the brain is able to adjust to it and somehow like form a world
link |
00:43:10.680
view within this new medium of like information that we have, you have like these short tweets
link |
00:43:19.720
and you have these three, four hour podcasts and you have Netflix movies.
link |
00:43:24.920
I mean, it's just adjusting to the entirety and just absorbing it and taking it all in
link |
00:43:30.040
and then pops up COVID that forces us all to be home and it all just adjusts and figures
link |
00:43:38.360
it out.
link |
00:43:39.360
Yeah, yeah, exactly.
link |
00:43:40.360
It's fascinating.
link |
00:43:41.360
You know, we've been talking about the brain as if it's something separate from the human
link |
00:43:48.360
that carries it a little bit, like whenever you talk about the brain, it's easy to forget
link |
00:43:53.280
that that's us.
link |
00:43:57.200
How much is the whole thing predetermined?
link |
00:44:05.320
How much is it already encoded in there and how much is it, the actions, the decisions,
link |
00:44:20.320
the judgments.
link |
00:44:21.320
You mean like who you are?
link |
00:44:23.360
Who you are.
link |
00:44:24.360
Oh, yeah, yeah.
link |
00:44:25.360
Okay, great question.
link |
00:44:26.360
Right.
link |
00:44:27.360
So there used to be a big debate about nature versus nurture and we now know that it's always
link |
00:44:30.920
both.
link |
00:44:31.920
You can't even separate them because you come to the table with a certain amount of nature,
link |
00:44:35.520
for example, your whole genome and so on.
link |
00:44:37.760
The experiences you have in the womb, like whether your mother is smoking or drinking,
link |
00:44:41.920
things like that, whether she's stressed, so on, those all influence how you're going
link |
00:44:45.520
to pop out of the womb.
link |
00:44:47.400
From there, everything is an interaction between all of your experiences and the nature.
link |
00:44:55.600
What I mean is, I think of it like a space time cone where you have, you drop into the
link |
00:45:01.560
world and depending on the experience that you have, you might go off in this direction,
link |
00:45:04.240
that direction, that direction because there's interaction all the way.
link |
00:45:08.720
Your experiences determine what happens with the expression of your genes.
link |
00:45:12.400
So some genes get repressed, some get expressed and so on and you actually become a different
link |
00:45:17.280
person based on your experiences.
link |
00:45:19.000
There's a whole field called epigenomics, or epigenetics, I should say, which is about
link |
00:45:24.520
the epigenome and that is the layer that sits on top of the DNA and causes the genes to
link |
00:45:31.520
express differently.
link |
00:45:32.520
That is directly related to the experiences that you have.
link |
00:45:35.200
So just as an example, they take rat pups and one group is placed away from their parents
link |
00:45:41.440
and the other group is groomed and licked and taken good care of.
link |
00:45:44.760
That changes their gene expression for the rest of their life.
link |
00:45:46.920
They go off in different directions in this space time cone.
link |
00:45:51.120
So yeah, this is of course why it matters that we take care of children and pour money
link |
00:45:58.240
into things like education and good childcare and so on for children broadly because these
link |
00:46:05.000
formative years matter so much.
link |
00:46:08.440
So is there a free will?
link |
00:46:11.720
I apologize for the absurd, high level philosophical question.
link |
00:46:17.200
No, no, these are my favorite kind of questions.
link |
00:46:19.200
Here's the thing.
link |
00:46:20.200
Here's the thing.
link |
00:46:21.200
We don't know.
link |
00:46:22.200
If you ask most neuroscientists, they'll say that we can't really think of how you would
link |
00:46:27.720
get free will in there because as far as we can tell, it's a machine.
link |
00:46:30.320
It's a very complicated machine, enormously sophisticated, 86 billion neurons, about the
link |
00:46:36.280
same number of glial cells.
link |
00:46:38.320
Each of these things is as complicated as the city of San Francisco.
link |
00:46:41.360
Each neuron in your head has the entire human genome in it.
link |
00:46:43.600
It's expressing millions of gene products.
link |
00:46:47.720
These are incredibly complicated biochemical cascades.
link |
00:46:50.080
Each one is connected to 10,000 of its neighbors, which means you have half a quadrillion connections
link |
00:46:54.880
in the brain.
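(A rough back-of-the-envelope check of that figure, using the numbers just quoted and counting each neuron-to-neuron link once, since every connection is shared by two neurons:

\[
86 \times 10^{9}\ \text{neurons} \times 10^{4}\ \text{connections each} \times \tfrac{1}{2} \approx 4.3 \times 10^{14},
\]

which is indeed roughly half a quadrillion.)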
link |
00:46:55.880
So it's incredibly complicated, but it fundamentally appears to just be a machine.
link |
00:47:02.680
And therefore, if there's nothing in it that's not being driven by something else, then it
link |
00:47:07.960
seems hard to understand where free will would come from.
link |
00:47:12.680
So that's the camp that pretty much all of us fall into, but I will say our science is
link |
00:47:16.760
still quite young.
link |
00:47:18.360
And I'm a fan of the history of science, and the thing that always strikes me as interesting
link |
00:47:22.680
is when you look back at any moment in science, everybody believes something is true, and
link |
00:47:28.800
they simply didn't know about what Einstein revealed or whatever.
link |
00:47:33.280
And so who knows?
link |
00:47:35.000
And at any moment in history, they all feel like we've converged to the
link |
00:47:39.920
final answer.
link |
00:47:40.920
Exactly.
link |
00:47:41.920
Exactly.
link |
00:47:42.920
Like all the pieces of the puzzle are there.
link |
00:47:44.000
And I think that's a funny illusion that's worth getting rid of, and in fact, this is
link |
00:47:48.280
what drives good science is recognizing that we don't have most of the puzzle pieces.
link |
00:47:52.760
So as far as the free will question goes, I don't know.
link |
00:47:55.680
At the moment, it seems, wow, it would be really impossible to figure out how something
link |
00:47:59.040
else could fit in there, but 100 years from now, our textbooks might be very different
link |
00:48:04.600
than they are now.
link |
00:48:05.600
I mean, could I ask you to speculate, where do you think free will could be squeezed into
link |
00:48:10.720
there?
link |
00:48:11.720
Like, what would that even be? Is it possible that our brain just creates kinds of illusions
link |
00:48:17.600
that are useful for us?
link |
00:48:20.040
Or like what, where could it possibly be squeezed in?
link |
00:48:24.320
Well, let me give a speculative answer to your very nice question, but you know, you and
link |
00:48:31.080
the listeners of this podcast, don't quote me on this.
link |
00:48:33.160
Yeah, exactly, I'm not saying this is what I believe to be true, but let me just give
link |
00:48:36.160
an example.
link |
00:48:37.160
I give this at the end of my book Incognito.
link |
00:48:39.040
So the whole book of Incognito is about, you know, all of what's happening in the
link |
00:48:42.480
brain and essentially I'm saying, look, here's all the reasons to think that free will probably
link |
00:48:46.120
does not exist.
link |
00:48:47.120
But at the very end, I say, look, imagine that you're
link |
00:48:54.640
a Kalahari Bushman, and you find a radio in the sand, and you've never seen anything
link |
00:48:59.960
like this, and you look at this radio and you realize that when you turn this knob,
link |
00:49:05.920
you hear voices coming from it.
link |
00:49:08.840
So being a, you know, a radio materialist, you try to figure out like, how does this
link |
00:49:13.560
thing operate?
link |
00:49:14.560
So you take off the back cover, and you realize there's all these wires.
link |
00:49:17.040
And when you take out some wires, the voices get garbled or stop or whatever.
link |
00:49:22.440
And so what you end up developing is a whole theory about how this connected pattern
link |
00:49:26.040
of wires gives rise to voices, but it would never strike you that in distant cities, there's
link |
00:49:31.920
a radio tower and there's invisible stuff beaming, and that's actually the origin of
link |
00:49:36.320
the voices.
link |
00:49:37.320
And the wiring is just necessary for it, not the origin of the voices.
link |
00:49:38.860
So I mention this just as a speculation, to say, look, how would we know? What we know
link |
00:49:44.520
about the brain for absolutely certain is that when you damage pieces and parts of
link |
00:49:47.920
it, things get jumbled up.
link |
00:49:50.680
But how would you know if there's something else going on that we can't see like electromagnetic
link |
00:49:54.320
radiation, that is what's actually generating this?
link |
00:49:58.280
Yeah.
link |
00:49:59.280
You paint a beautiful example of how, because we don't know most of how our universe
link |
00:50:09.400
works, totally off base we might be with our science. I mean, yeah, that's
link |
00:50:17.800
inspiring.
link |
00:50:18.800
That's beautiful.
link |
00:50:19.800
It's kind of terrifying.
link |
00:50:20.800
It's humbling.
link |
00:50:21.800
It's all of the above.
link |
00:50:23.800
And the important part is just to recognize that, of course, we're in the position of
link |
00:50:28.720
having massive unknowns.
link |
00:50:32.000
And we have, of course, the known unknowns, and that's all the things we're pursuing in
link |
00:50:37.720
our labs and trying to figure out, but there's this whole space of unknown unknowns
link |
00:50:41.360
that we haven't even realized we haven't asked yet.
link |
00:50:43.760
Let me kind of ask a weird, maybe difficult question that has to do with, I've been
link |
00:50:51.560
recently reading a lot about World War II.
link |
00:50:54.280
I'm currently reading a book I recommend for people, which, as a Jew, has been difficult
link |
00:51:00.400
to read: The Rise and Fall of the Third Reich.
link |
00:51:04.960
So let me just ask about like the nature of genius, the nature of evil.
link |
00:51:10.040
If we look at somebody like Einstein, we look at Hitler, Stalin, modern day, Jeffrey Epstein,
link |
00:51:19.680
those folks who through their lives have, with Einstein, done works of genius, and with
link |
00:51:26.240
the others I mentioned, done evil in this world.
link |
00:51:30.880
What do we think about that in a live wired brain?
link |
00:51:34.600
Like, how do we think about these extreme people?
link |
00:51:39.440
Here's what I'd say.
link |
00:51:41.840
This is a very big and difficult question, but what I would say briefly on it is, first
link |
00:51:47.880
of all, I saw a cover of Time Magazine some years ago, and it was a big, you know, sagittal
link |
00:51:54.560
slice of the brain, and it said something like, what makes us good and evil?
link |
00:51:59.040
And there was a little spot pointing to a picture of Gandhi, and there
link |
00:52:01.400
was a little spot that was pointing to Hitler.
link |
00:52:03.600
And these Time Magazine covers always make me mad because it's so goofy to think that
link |
00:52:08.080
we're going to find some spot in the brain or something.
link |
00:52:10.760
Instead, the interesting part is, because we're live wired, we are all about the world
link |
00:52:18.600
and the culture around us.
link |
00:52:20.760
So somebody like Adolf Hitler got all this positive feedback about what was going on,
link |
00:52:27.720
and the crazier and crazier the ideas he had, he's like, let's set up death camps and murder
link |
00:52:32.720
a bunch of people and so on.
link |
00:52:34.440
Somehow he was getting positive feedback from that, and all these other people, they
link |
00:52:38.720
all, you know, spun each other up. And you look at anything like, I mean, look at the
link |
00:52:43.000
Cultural Revolution in China or, you know, the Russian Revolution or
link |
00:52:51.200
things like this, where you look at these, you think, my God, how do people all behave
link |
00:52:54.360
like this?
link |
00:52:55.360
But it's easy to see groups of people spinning themselves up in particular ways where they
link |
00:52:59.880
all say, well, would I have thought this was right in a different circumstance? I don't
link |
00:53:05.000
know, but Fred thinks it's right, and Steve thinks it's right, everyone around me thinks
link |
00:53:07.440
it's right.
link |
00:53:08.440
So part of the maybe downside of having a live wired brain is that you can get crowds
link |
00:53:14.320
of people doing things as a group.
link |
00:53:17.560
So it's interesting, you know, we would pinpoint Hitler and say that's the evil
link |
00:53:21.240
guy, but in a sense, I think it was Tolstoy who said the king becomes slave to the people.
link |
00:53:30.880
In other words, you know, Hitler was just a representation of whatever was going on
link |
00:53:35.800
with that huge crowd that he was surrounded with.
link |
00:53:39.520
So I only bring that up to say that it's, you know, it's very difficult to say what
link |
00:53:47.000
it is about this person's brain and that person's brain.
link |
00:53:49.120
He obviously got feedback for what he was doing.
link |
00:53:51.680
The other thing, by the way, about what we often think of as being evil in society is
link |
00:53:59.320
my lab recently published some work on in groups and out groups, which is a very important
link |
00:54:07.160
part of this puzzle.
link |
00:54:08.360
So it turns out that we are very, you know, engineered to care about in groups
link |
00:54:14.960
versus out groups, and this seems to be like a really fundamental thing.
link |
00:54:18.780
So we did this experiment in my lab where we brought people in, we stick them in the
link |
00:54:23.120
scanner, and, I don't know if you know this, but we show them on the
link |
00:54:27.000
screen six hands, and the computer goes around, randomly picks a hand, and then you see that
link |
00:54:33.840
hand get stabbed with a syringe needle.
link |
00:54:35.800
So you actually see a syringe needle enter the hand and come out, and what
link |
00:54:40.320
that does is trigger parts of the pain matrix, these areas in your brain that are involved
link |
00:54:46.080
in feeling physical pain.
link |
00:54:47.320
Now the interesting thing is it's not your hand that was stabbed.
link |
00:54:50.040
So what you're seeing is empathy.
link |
00:54:51.720
This is you seeing someone else's hand get stabbed, and you feel like, oh God, this is awful,
link |
00:54:56.120
okay, okay.
link |
00:54:57.560
We contrast that, by the way, with somebody's hand getting poked with a Q tip, which, you
link |
00:55:01.320
know, looks visually the same, but you don't have that same level of response.
link |
00:55:06.320
Now what we do is we label each hand with a one word label, Christian, Jewish,
link |
00:55:12.240
Muslim, atheist, Scientologist, Hindu.
link |
00:55:14.480
And now the computer goes around, picks a hand, stabs the hand, and the question is, how
link |
00:55:19.360
much does your brain care about all the people in your out group versus the one label that
link |
00:55:24.240
happens to match you?
link |
00:55:26.800
And it turns out for everybody across all religions, they care much more about their
link |
00:55:30.680
in group than their out group.
link |
00:55:31.680
And when I say they care, what I mean is you get a bigger response from their brain.
link |
00:55:35.920
Everything's the same.
link |
00:55:36.920
It's the same hands.
link |
00:55:38.920
It's just a one word label.
link |
00:55:40.680
You care much more about your in group than your out group.
link |
00:55:43.120
And I wish this weren't true, but this is how humans are.
link |
00:55:45.680
I wonder how fundamental that is, or if it's an emergent thing about culture.
link |
00:55:53.360
Like if we lived alone, like, is it genetically built into the brain,
link |
00:55:58.680
this longing for tribe?
link |
00:56:00.480
So I'll tell you, we addressed that.
link |
00:56:02.360
So here's what we did.
link |
00:56:03.360
Actually, there are two other things we did as part of this study that I
link |
00:56:08.000
think matter for this point.
link |
00:56:09.840
One is, okay, so we show that you have a much bigger response.
link |
00:56:12.840
And by the way, this is not a cognitive thing.
link |
00:56:14.440
It's a very low level basic response to seeing pain in somebody.
link |
00:56:18.720
Okay.
link |
00:56:19.720
Great study, by the way.
link |
00:56:20.720
Thanks.
link |
00:56:21.720
Thanks.
link |
00:56:22.720
So one of the things is, we next have it where we say, okay, the year is 2025.
link |
00:56:27.200
And these three religions are now in a war against these three religions.
link |
00:56:30.840
And it's all randomized, right?
link |
00:56:31.840
But what you see is your own group, and you have two allies now against these others.
link |
00:56:36.240
And now it happens over the course of many trials, you see everybody gets stabbed at
link |
00:56:40.160
different times.
link |
00:56:41.160
And the question is, do you care more about your allies?
link |
00:56:43.280
And the answer is yes.
link |
00:56:44.280
Suddenly, people who a moment ago, you didn't really care when they got stabbed.
link |
00:56:47.360
Now, simply with this one word saying that they're now your allies, you care more
link |
00:56:51.400
about them.
link |
00:56:52.760
But then what I wanted to do was look at how ingrained is this or how arbitrary is it?
link |
00:56:57.720
So we brought new participants in and we said, here's a coin, toss the coin.
link |
00:57:02.720
If it's heads, you're an Augustinian.
link |
00:57:04.240
If it's tails, you're a Justinian, totally made up.
link |
00:57:07.320
Okay.
link |
00:57:08.320
So they toss it, they get whatever.
link |
00:57:10.160
We give them a band that says, you know, Augustinian on it, whatever tribe they're in now.
link |
00:57:15.560
And they get in the scanner and they see a thing on the screen says the Augustinians
link |
00:57:19.200
and Justinians are two warring tribes.
link |
00:57:21.160
You see a bunch of hands, some are labeled Augustinians, some are Justinian.
link |
00:57:24.720
And now you care more about whichever team you're on than the other team, even though
link |
00:57:28.960
it's totally arbitrary and you know it's arbitrary because you're the one who tossed the coin.
link |
00:57:33.040
So it's a state that's very easy to find ourselves in.
link |
00:57:37.080
In other words, just before walking in the door, they'd never even heard of Augustinian
link |
00:57:41.000
versus Justinian.
link |
00:57:42.000
And now their brain is representing it simply because they're told they're on this team.
link |
00:57:46.440
You know, now I did my own personal study of this.
link |
00:57:51.120
So once you're an Augustinian, that tends to be sticky, because I've been a Packers
link |
00:57:56.720
fan, through good and bad, my whole life.
link |
00:57:59.200
Now when I'm in Boston with, like, the Patriots, it's been tough for my live wired brain
link |
00:58:05.360
to switch to the Patriots. So once you become part of a tribe, it's interesting how sticky the tribe
link |
00:58:11.040
is.
link |
00:58:12.040
Yeah.
link |
00:58:13.040
But that's true.
link |
00:58:14.040
That's it.
link |
00:58:15.040
You know, we never tried saying, okay, now you're a Justinian when you were
link |
00:58:18.440
an Augustinian.
link |
00:58:19.440
So we don't know how sticky it is, but there are studies of this with monkey troops on some island.
link |
00:58:30.520
And what happens is they look at the way monkeys behave when they're part of this tribe and
link |
00:58:34.480
how they treat members of the other tribe of monkeys.
link |
00:58:37.960
And then what they do, I've forgotten how they do that exactly, but they end up switching
link |
00:58:41.520
a monkey so he ends up in the other troop.
link |
00:58:43.200
And very quickly they end up becoming a part of the other troop and hating and behaving
link |
00:58:47.800
badly towards the original troop.
link |
00:58:50.480
These are fascinating studies, by the way. This is beautiful.
link |
00:58:55.320
In your book, you have a good light bulb joke.
link |
00:59:01.320
How many psychiatrists does it take to change a light bulb?
link |
00:59:04.720
Only one, but the light bulb has to want to change, I'm sorry, I'm a sucker for a good
link |
00:59:11.040
light bulb joke.
link |
00:59:12.040
Okay.
link |
00:59:13.040
So, you know, I've been interested in psychiatry my whole life, just maybe tangentially.
link |
00:59:19.160
Early on, I kind of dreamed of being a psychiatrist, until I understood what it entails.
link |
00:59:25.960
But, you know, is there hope for psychiatry, for somebody else to help
link |
00:59:33.720
this live wired brain to adjust?
link |
00:59:36.800
Oh yeah.
link |
00:59:37.800
I mean, in the sense that, and this has to do with this issue about us being trapped
link |
00:59:42.120
on our own planet, forget psychiatrists, just think of like when you're talking with a friend
link |
00:59:46.960
and you say, oh, I'm so upset about this.
link |
00:59:49.160
And your friend says, hey, just look at it this way, you know, all we have access to
link |
00:59:54.640
under normal circumstances is just the way we're seeing something.
link |
00:59:57.520
And so it's super helpful to have friends and communities and psychiatrists and so on
link |
01:00:03.400
to help things change that way.
link |
01:00:05.840
So that's one way psychiatrists sort of help us.
link |
01:00:07.320
But more importantly, the role that psychiatrists have played is that there's this sort of naive
link |
01:00:13.400
assumption that we all come to the table with, which is that everyone is fundamentally just
link |
01:00:16.680
like us.
link |
01:00:18.680
And when you're a kid, you believe this entirely.
link |
01:00:21.240
But as you get older and you start realizing, okay, there's something called schizophrenia
link |
01:00:25.120
and that's a real thing.
link |
01:00:26.560
And to be inside that person's head is totally different from what it is to be inside my head.
link |
01:00:32.520
Or there's psychopathy.
link |
01:00:32.520
And to be inside this psychopath's head, he doesn't care about other people.
link |
01:00:36.280
He doesn't care about hurting other people.
link |
01:00:37.520
He's just doing what he needs to do to get what he needs.
link |
01:00:41.120
That's a different head.
link |
01:00:42.400
There's a million different things going on.
link |
01:00:45.600
And it is different to be inside those heads, and this is where the field of psychiatry
link |
01:00:50.280
comes in.
link |
01:00:51.280
Now, I think it's an interesting question about the degree to which neuroscience is
link |
01:00:56.080
leaking into and taking over psychiatry and what the landscape will look like 50 years
link |
01:01:00.480
from now.
link |
01:01:01.480
Maybe psychiatry as a profession changes a lot or even goes away entirely, and neuroscience
link |
01:01:08.480
will essentially be able to take over some of these functions, but it has been extremely
link |
01:01:12.320
useful to understand the differences between how people behave and why and what you can
link |
01:01:19.160
tell about what's going on inside their brain just based on observation of their behavior.
link |
01:01:25.880
This might be from years ago, but I'm not sure.
link |
01:01:29.000
There's an Atlantic article you've written about moving away from a distinction between
link |
01:01:35.320
neurological disorders, quote unquote, brain problems and psychiatric disorders or quote
link |
01:01:42.440
unquote, mind problems.
link |
01:01:45.600
So on that topic, how do you think about this gray area?
link |
01:01:47.720
Yeah.
link |
01:01:48.720
This is exactly the evolution that things are going is there was psychiatry and then there
link |
01:01:52.680
were guys and gals in labs poking cells and so on, those were the neuroscientists.
link |
01:01:57.640
But yeah, I think these are moving together for exactly the reason you just cited.
link |
01:02:01.280
The Atlantic article that I wrote was called The Brain
link |
01:02:05.760
on Trial, and where this matters a lot is the legal system, because the way we run our
link |
01:02:12.360
legal system now, and this is true everywhere in the world, is, you know, someone shows up
link |
01:02:16.280
in front of the judge's bench or let's say there's five people in front of the judge's
link |
01:02:19.600
bench and they've all committed the same crime.
link |
01:02:21.920
What we do, because we feel like, hey, this is fair, is say you're all going to get the
link |
01:02:26.240
same sentence.
link |
01:02:26.240
You're going to get three years in prison or whatever it is.
link |
01:02:28.320
But in fact, brains can be so different.
link |
01:02:30.320
This guy's got schizophrenia.
link |
01:02:31.320
This guy's a psychopath.
link |
01:02:32.320
This guy's tweaked out on drugs, and so on and so on, so that it actually doesn't make sense
link |
01:02:37.160
to keep doing that.
link |
01:02:38.160
And what we do in this country more than anywhere in the world is we imagine that incarceration
link |
01:02:44.480
is a one size fits all solution, and you may know America has the highest incarceration
link |
01:02:49.320
rate in the whole world in terms of the percentage of our population we put behind bars.
link |
01:02:53.440
So there's a much more refined thing we can do as neuroscience comes in and changes and
link |
01:02:59.960
has the opportunity to change the legal system, which is to say, this doesn't let anybody
link |
01:03:03.920
off the hook.
link |
01:03:04.920
It doesn't say, oh, it's not your fault and so on.
link |
01:03:06.680
But what it does is it changes the equation.
link |
01:03:09.920
So it's not about, hey, how blameworthy are you, but instead it's about, hey, what do we
link |
01:03:15.160
do from here?
link |
01:03:16.160
What's the best thing to do from here?
link |
01:03:17.160
So if you take somebody with schizophrenia and you have them break rocks in the hot summer
link |
01:03:20.620
sun in a chain gang, that doesn't help the schizophrenia, that doesn't fix the problem.
link |
01:03:27.640
If you take somebody with a drug addiction who's in jail for being caught with two ounces
link |
01:03:32.200
of some illegal substance and you put them in prison, it doesn't actually fix the addiction,
link |
01:03:37.560
it doesn't help anything.
link |
01:03:40.720
Happily what neuroscience and psychiatry bring to the table is lots of really useful
link |
01:03:44.680
things you can do with schizophrenia, with drug addiction, things like this.
link |
01:03:48.880
And that's why, so I don't know if you know this, but I run a national nonprofit
link |
01:03:52.080
called the Center for Science and Law.
link |
01:03:54.000
And it's all about this intersection of neuroscience and legal system.
link |
01:03:57.480
And we're trying to implement changes in every county and every state.
link |
01:04:02.600
I'll just, without going down that rabbit hole, I'll just say one of the very simplest
link |
01:04:06.640
things to do is to set up specialized court systems where you have a mental health court
link |
01:04:12.640
that has judges and juries with expertise in mental illness.
link |
01:04:15.840
Because if you go, by the way, to a regular court and the person says, or the defense
link |
01:04:20.800
lawyer says, this person has schizophrenia, most of the jury will say, man, I call bullshit
link |
01:04:25.440
on that.
link |
01:04:26.440
Why?
link |
01:04:27.440
Because they don't know about schizophrenia.
link |
01:04:28.440
They don't know what it's about.
link |
01:04:30.440
And it turns out people who know about schizophrenia feel very differently as a juror than someone
link |
01:04:36.240
who happens not to know anything about schizophrenia and thinks it's an excuse.
link |
01:04:39.640
So you have judges and jurors with expertise in mental illness and they know the rehabilitative
link |
01:04:44.120
strategies that are available.
link |
01:04:45.920
That's one thing.
link |
01:04:46.920
Having a drug court where you have judges and jurors with expertise in rehabilitative
link |
01:04:49.880
strategies and what can be done and so on, a specialized prostitution court and so on.
link |
01:04:54.240
All these different things.
link |
01:04:55.800
By the way, it's very easy for counties to implement this sort of thing.
link |
01:04:59.400
And this is, I think, where this matters to get neuroscience into public policy.
link |
01:05:05.160
What's the process of injecting expertise into this?
link |
01:05:08.400
Yeah, I'll tell you exactly what it is.
link |
01:05:10.560
A county needs to run out of money first.
link |
01:05:12.440
I've seen this happen over and over.
link |
01:05:14.560
So what happens is a county has a completely full jail and they say, you know what?
link |
01:05:18.680
We need to build another jail.
link |
01:05:19.880
And then they realize, God, we don't have any money.
link |
01:05:21.280
We can't afford this.
link |
01:05:22.360
We've got too many people in jail.
link |
01:05:23.600
And that's when they turn to, God, we need something smarter.
link |
01:05:26.800
And that's when they set up specialized court systems.
link |
01:05:31.040
We all function best when our backs are against the wall.
link |
01:05:34.360
And that's what COVID is good for.
link |
01:05:36.280
It's because we've all had our routines and we are optimized for the things we do and
link |
01:05:40.880
suddenly our backs are against the wall, all of us.
link |
01:05:43.000
Yeah, it's really, I mean, one of the exciting things about COVID, I mean, I'm a big believer
link |
01:05:49.080
in the possibility of what government can do for the people.
link |
01:05:56.600
And when it becomes too big of a bureaucracy, it starts functioning poorly, it starts wasting
link |
01:06:01.640
money.
link |
01:06:02.720
It's nice that, I mean, COVID reveals that nicely.
link |
01:06:07.360
And there are lessons to be learned about who gets elected and who goes into government.
link |
01:06:14.520
Hopefully this inspires talented young people to go into government to
link |
01:06:20.800
revolutionize different aspects of it.
link |
01:06:23.080
Yeah.
link |
01:06:24.080
So that's the positive silver lining of COVID.
link |
01:06:27.680
I mean, I thought it'd be fun to ask you, I don't know if you're paying attention to
link |
01:06:32.240
the machine learning world and GPT-3. So GPT-3 is this language model, a neural
link |
01:06:39.560
network that has 175 billion parameters.
link |
01:06:45.040
So it's very large, and it's trained in an unsupervised way on the internet; it just reads a lot of
link |
01:06:54.280
unstructured text, and it's able to generate some pretty impressive things.
link |
01:06:59.480
The human brain compared to that has about a thousand times more synapses.
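(A rough check of that comparison, assuming the commonly cited estimate of roughly 10^14 to 10^15 synapses in the human brain:

\[
175 \times 10^{9}\ \text{parameters} \times 1000 \approx 1.75 \times 10^{14},
\]

which does land on the order of the brain's synapse count.)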
link |
01:07:06.200
People get so upset when machine learning people compare the brain to neural networks, and we know synapses
link |
01:07:13.800
are different.
link |
01:07:14.800
Very different, very different. But what do you think about GPT-3?
link |
01:07:20.720
Here's what I think.
link |
01:07:21.720
Here's what I think, a few things.
link |
01:07:22.720
What GPT-3 is doing is extremely impressive, but it's very different from what the brain
link |
01:07:27.360
does.
link |
01:07:28.360
It's a good impersonator, but just as one example, everybody takes a passage that GPT-3 has
link |
01:07:36.880
written and they say, wow, look at this and it's pretty good, right?
link |
01:07:40.480
But it's already gone through a filtering process of humans looking at it and saying,
link |
01:07:43.760
okay, well, that's crap.
link |
01:07:44.760
That's crap.
link |
01:07:45.760
Okay.
link |
01:07:46.760
Oh, here's a sentence that's pretty cool.
link |
01:07:47.760
Now here's the thing.
link |
01:07:49.160
Human creativity is about absorbing everything around it and remixing that and coming up
link |
01:07:53.040
with stuff.
link |
01:07:54.040
We're sort of like GPT-3, you know, we're remixing what we've gotten in before, but
link |
01:07:59.880
we also have very good models of what it is to be another human.
link |
01:08:05.080
And so, you know, I don't know if you speak French or something, but I'm not going to
link |
01:08:08.880
start speaking in French because then you'll say, wait, what are you doing?
link |
01:08:11.560
I don't understand you.
link |
01:08:12.560
Instead, everything coming out of my mouth is meant for your ears.
link |
01:08:16.240
I know what you'll understand.
link |
01:08:18.120
I know the vocabulary that you know and don't know.
link |
01:08:20.120
I know what parts you care about.
link |
01:08:24.000
That's a huge part of it.
link |
01:08:25.240
And so, of all the possible sentences I could say, I'm navigating this thin bandwidth so
link |
01:08:31.800
that it's something useful for our conversation.
link |
01:08:34.280
Yeah, in real time, but also throughout your life, I mean, we're coevolving together.
link |
01:08:39.800
We're learning how to communicate together.
link |
01:08:42.920
Exactly.
link |
01:08:43.920
But this is what GPT-3 does not do.
link |
01:08:46.320
All it's doing is saying, okay, I'm going to take all these sentences and remix stuff and
link |
01:08:49.560
pop some stuff out.
link |
01:08:51.240
But it doesn't know how to make it so that you, Lex, will feel like, oh, yeah, that's
link |
01:08:54.720
exactly what I needed to hear.
link |
01:08:57.080
That's the next sentence that I needed to know about for something.
link |
01:09:00.280
Well, of course, that could be said about all the impressive results we see.
link |
01:09:04.280
The question is, if you raise the number of parameters, whether it's going to be after
link |
01:09:09.520
some...
link |
01:09:10.520
It will not be.
link |
01:09:11.520
No, raising more parameters won't...
link |
01:09:14.320
Here's the thing.
link |
01:09:15.320
It's not that I think neural networks can't be like the human brain, because I suspect
link |
01:09:18.640
they will be at some point, 50 years.
link |
01:09:20.480
Who knows.
link |
01:09:21.480
But we are missing things in artificial neural networks; we've just got this basic structure.
link |
01:09:28.400
We've got units and you've got synapses that are connected.
link |
01:09:32.720
And that's great.
link |
01:09:33.720
And it's done incredibly mind blowing, impressive things, but it's not running the same algorithms
link |
01:09:38.880
as the human brain.
link |
01:09:40.380
So when I look at my children as little kids, you know, as infants, they can do things that
link |
01:09:45.520
no GPT-3 can do.
link |
01:09:47.760
They can navigate a complex room.
link |
01:09:50.640
They can navigate social conversation with an adult.
link |
01:09:55.000
They can lie.
link |
01:09:56.000
They can do a million things.
link |
01:09:58.320
They are active thinkers in our world and doing things.
link |
01:10:03.320
And this, of course, I mean, look, we totally agree on how incredibly awesome artificial
link |
01:10:08.160
neural networks are right now.
link |
01:10:09.160
But we also know the things that they can't do well, like be generally intelligent, do
link |
01:10:14.960
all these different things.
link |
01:10:15.960
Like reason about the world efficiently, learn efficiently, adapt efficiently.
link |
01:10:20.760
But still, there's the rate of improvement.
link |
01:10:24.720
To me, it's possible that we will be surprised.
link |
01:10:28.160
I agree.
link |
01:10:29.160
Awesome.
link |
01:10:30.160
We will be surprised.
link |
01:10:31.160
But what I would assert, and I'm glad I'm getting to say this on your podcast so we
link |
01:10:36.280
can look back at this in two years and 10 years and so on, is that we've got to be much
link |
01:10:40.480
more sophisticated than units and synapses between them.
link |
01:10:44.200
Let me give you an example, and this is something I talk about in LiveWired, which is, despite the
link |
01:10:48.240
amazing impressiveness, the mind blowing impressiveness, computers don't have some basic things, artificial
link |
01:10:55.200
neural networks don't have some basic things that we have, like caring about relevance, for example.
link |
01:10:59.560
So as humans, we are confronted with tons of data all the time, and we only encode particular
link |
01:11:05.080
things that are relevant to us.
link |
01:11:07.920
We have this very deep sense of relevance that I mentioned earlier is based on survival
link |
01:11:11.840
at the most basic level, but then all the things about my life and your life, what's
link |
01:11:16.600
relevant to you, that we encode.
link |
01:11:19.800
This is very useful.
link |
01:11:20.800
Computers at the moment don't have that, they don't have a yen to survive and things
link |
01:11:24.320
like that.
link |
01:11:25.320
So we filter out a bunch of the junk we don't need; we're really good at efficiently zooming
link |
01:11:30.280
in on things we need.
link |
01:11:32.120
Again, it could be argued, let me put on my Freud hat, that maybe that's just our conscious
link |
01:11:38.840
mind, you know, and there's no reason that neural networks aren't doing
link |
01:11:44.520
the same kind of filtration.
link |
01:11:45.520
I mean, in a sense that's what GPT-3 is doing. So there's a priming step; it's doing an essential
link |
01:11:52.280
kind of filtration when you ask it to generate tweets from, I don't know, from an
link |
01:11:58.960
Elon Musk or something like that. It's doing a filtration; it's throwing away all the
link |
01:12:04.360
parameters it doesn't need for this task, and it's figuring out how to do that successfully.
link |
01:12:09.880
And then ultimately it's not doing a very good job right now, but it's doing a lot better
link |
01:12:14.000
job than we expected.
link |
01:12:15.000
But it won't ever do a really good job.
link |
01:12:17.560
And I'll tell you why.
link |
01:12:18.560
I mean, so let's say we say, hey, produce an Elon Musk tweet, and we see like, oh, wow,
link |
01:12:23.200
it produced these three.
link |
01:12:24.200
That's great.
link |
01:12:25.200
But again, we're not seeing the 3,000 it produced that didn't really make any sense.
link |
01:12:29.120
It's because it has no idea what it is like to be a human.
link |
01:12:33.080
And all the things that you might want to say, and all the reasons you wouldn't, like
link |
01:12:35.680
when you go to write a tweet, you might write something, you think, yeah, it's not going
link |
01:12:39.040
to come off quite right in this modern political climate or whatever, you know, you change things.
link |
01:12:43.480
So it somehow boils down to fear of mortality and all of these human things at the end of
link |
01:12:49.280
the day, all contained within that tweeting experience.
link |
01:12:52.960
Well, interestingly, the fear of mortality is at the bottom of this, but you've got all
link |
01:12:57.560
these more things like, you know, oh, just in case the chairman of my department
link |
01:13:02.800
reads this,
link |
01:13:03.800
I want to come off well.
link |
01:13:04.800
Just in case my mom looks at this tweet, I want to make sure she, you know, and so on.
link |
01:13:08.360
So those are all the things that humans are able to sort of throw into the calculation.
link |
01:13:13.280
I mean, what it requires, though, is having a model of your chairman,
link |
01:13:18.920
having a model of your mother, having a model of, you know, the person you want to go on
link |
01:13:23.920
a date with who might look at your tweet and so on.
link |
01:13:26.280
With all these things, you're running a model of what it is like to be them.
link |
01:13:31.160
So in terms of the structure of the brain, again, this may be going into speculation
link |
01:13:36.720
land.
link |
01:13:37.720
I hope you go along with me. Okay, so the brain seems to be intelligent, and our systems
link |
01:13:46.400
aren't very intelligent currently.
link |
01:13:48.520
So where do you think intelligence arises in the brain?
link |
01:13:53.000
Like what, what is it about the brain?
link |
01:13:55.840
So if you mean location wise, it's no single spot.
link |
01:13:59.960
It would be equivalent to asking, I'm looking at New York City, where is the economy?
link |
01:14:06.560
The answer is you can't point to anywhere.
link |
01:14:08.200
The economy is all about the interaction of all of the pieces and parts of the city.
link |
01:14:12.320
And that's what, you know, intelligence, whatever we mean by that, is in the brain: it's emerging
link |
01:14:16.040
from everything going on at once.
link |
01:14:18.280
In terms of structure, so look, humans are much smarter than fish, maybe not dolphins,
link |
01:14:24.960
but dolphins are mammals, right?
link |
01:14:26.680
I assert that what we mean by smarter has to do with live wiring.
link |
01:14:30.720
So what we mean when we say, oh, we're smarter is, oh, we can figure out a new thing and
link |
01:14:33.720
figure out a new pathway to get where we need to go.
link |
01:14:37.080
And that's because fish are essentially coming to the table with, you know, okay, here's
link |
01:14:40.520
the hardware, go, swim, mate, eat.
link |
01:14:43.920
But we have the capacity to say, okay, look, I'm going to absorb, oh, oh, but, you know,
link |
01:14:47.840
I saw someone else do this thing, and I read once that you could do this other thing
link |
01:14:51.880
and so on.
link |
01:14:52.880
And do you think, I know these are mysteries, but, like, architecturally
link |
01:14:59.080
speaking, what's the feature of the brain, of the live wired aspect of it, that is really
link |
01:15:06.800
useful for intelligence?
link |
01:15:08.200
So like, is it the ability of neurons to reconnect?
link |
01:15:14.360
Like, are there any lessons about the human brain you think might be inspiring
link |
01:15:20.080
for us to take into the artificial, into the machine learning world?
link |
01:15:26.560
Yeah, I'm actually just trying to write something up on this now called, you know, if you want
link |
01:15:30.400
to build a robot, start with the stomach.
link |
01:15:32.640
And what I mean by that is a robot has to care, it has to have hunger,
link |
01:15:37.080
it has to care about surviving, that kind of thing.
link |
01:15:40.640
Here's an example.
link |
01:15:41.640
So the penultimate chapter of my book, I titled The Wolf and the Mars Rover.
link |
01:15:46.760
And I just look at this simple comparison of, you look at a wolf, it gets its leg caught
link |
01:15:52.120
in a trap.
link |
01:15:53.120
What does it do?
link |
01:15:54.120
It gnaws its leg off, and then it figures out how to walk on three legs.
link |
01:15:58.360
No problem.
link |
01:15:59.360
Now the Mars rover, Curiosity, got its front wheel stuck in some Martian soil, and it died.
link |
01:16:05.400
This project that cost billions of dollars died because of a wheel. Wouldn't it be
link |
01:16:10.720
terrific if we could build a robot that chewed off its front wheel and figured out how to
link |
01:16:15.440
operate with a slightly different body plan.
link |
01:16:18.040
That's the kind of thing that we want to be able to build, and to get there, here's
link |
01:16:22.640
what we need. The whole reason the wolf is able to do that is because its motor and somatic
link |
01:16:27.240
sensory systems are live wired.
link |
01:16:28.720
So it says, oh, you know what, turns out we got a body plan that's different than what
link |
01:16:31.840
I thought a few minutes ago, but I have a yen to survive and I care about relevance,
link |
01:16:38.920
which in this case is getting to food, getting back to my pack and so on.
link |
01:16:42.680
So I'm just going to figure out how to operate with this, oh, that didn't work, oh, okay,
link |
01:16:46.560
I'm kind of getting it to work.
link |
01:16:48.640
But the Mars rover doesn't do that.
link |
01:16:49.960
It just says, oh, geez, I was pre programmed to have four wheels, now I have three, I'm
link |
01:16:53.480
screwed.
link |
01:16:54.480
Yeah, you know, I don't know if you're familiar with a philosopher named Ernest Becker.
link |
01:16:58.280
He wrote a book called The Denial of Death, and there are a few psychologists, Sheldon Solomon,
link |
01:17:03.480
I think I just spoke with him on this podcast, who developed terror management theory, which
link |
01:17:10.080
is, like Ernest Becker is a philosopher that basically said that fear of mortality is at
link |
01:17:17.360
the core of it.
link |
01:17:19.120
And so I don't know, it sounds compelling as an idea that all of the civilization we've
link |
01:17:25.840
constructed is based on this, but it's...
link |
01:17:28.920
I'm familiar with his work.
link |
01:17:30.640
Here's what I think.
link |
01:17:31.640
I think that, yes, fundamentally this desire to survive is at the core of it, I would agree
link |
01:17:36.760
with that.
link |
01:17:38.240
But how that expresses itself in your life ends up being very different.
link |
01:17:41.720
The reason you do what you do is, I mean, you could list the 100 reasons why you chose
link |
01:17:47.760
to write your tweet this way and that way, and it really has nothing to do with the survival
link |
01:17:50.960
part.
link |
01:17:51.960
It has to do with trying to impress fellow humans and surprise them and say something.
link |
01:17:54.960
Yeah, so many things built on top of each other, but it's fascinating to think that in
link |
01:17:59.480
artificial intelligence systems, we want to be able to somehow engineer this drive for
link |
01:18:05.480
survival for immortality, I mean, because as humans, we're not just about survival,
link |
01:18:11.600
we're aware of the fact that we're going to die, which is a very kind of, we're aware
link |
01:18:16.600
of it, like we're aware of space time.
link |
01:18:17.600
Most people aren't, by the way.
link |
01:18:18.600
Aren't?
link |
01:18:19.600
Aren't.
link |
01:18:20.600
Confucius said, he said, each person has two lives.
link |
01:18:25.960
The second one begins when you realize that you have just one, but it takes a long time
link |
01:18:31.200
for most people to get there.
link |
01:18:32.360
I mean, you could argue this kind of Freudian thing, which Ernest Becker argues, is they actually
link |
01:18:40.080
figured it out early on, and the terror they felt was like the reason it's been suppressed
link |
01:18:47.320
and the reason most people, when I ask them about whether they're afraid of death, they
link |
01:18:50.760
basically say, no.
link |
01:18:53.120
They basically say, I'm afraid I won't submit the paper before I die.
link |
01:19:01.120
They see death as a kind of inconvenient deadline for a particular set of, like a book you're
link |
01:19:06.880
writing, as opposed to like, what the hell?
link |
01:19:10.720
This thing ends at any moment.
link |
01:19:15.760
Most people, as I've encountered, do not meditate on the idea that right now you could die.
link |
01:19:22.240
Right now, in the next five minutes, it could be all over and meditate on that idea.
link |
01:19:30.800
I think that somehow brings you closer to the core of the motivations and the core of
link |
01:19:37.480
the human condition.
link |
01:19:39.480
I think it might be the core, but like I said, it's not what's at the surface right now.
link |
01:19:43.560
There's so many things on top of it, but it is interesting.
link |
01:19:47.240
As the ancient poet said, death whispers at my ear, "Live, for I come."
link |
01:19:55.040
It is certainly motivating when we think about that, okay, I've got some deadline, I don't
link |
01:19:59.400
know exactly what it is, but I better make stuff happen.
link |
01:20:02.280
It is motivating, but I know for just speaking for me personally, that's not what motivates
link |
01:20:08.040
me day to day.
link |
01:20:09.040
It's instead, oh, I want to get this program up and running before this, or I want to make
link |
01:20:15.680
sure my coauthor isn't mad at me because I haven't gotten this in, or I don't want to
link |
01:20:18.520
miss this grant deadline, or whatever the thing is.
link |
01:20:22.080
It's too distant in a sense.
link |
01:20:24.080
Nevertheless, it is good to reconnect, but for the AI systems, none of that is there.
link |
01:20:31.440
A neural network does not fear its mortality, and that seems to be somehow fundamentally
link |
01:20:37.840
missing the point.
link |
01:20:39.640
I think that's missing the point, but I wonder, it's an interesting speculation about whether
link |
01:20:42.720
you can build an AI system that is much closer to being a human without the mortality and
link |
01:20:47.280
survival piece, but just the thing of relevance.
link |
01:20:51.640
I care about this versus that.
link |
01:20:52.920
Right now, if you have a robot roll into the room, it's going to be frozen because it doesn't
link |
01:20:56.040
have any reason to go there versus there.
link |
01:20:57.720
It doesn't have any particular set of things about, this is how I should navigate my next
link |
01:21:05.400
move because I want something.
link |
01:21:07.400
Yeah.
link |
01:21:08.400
The thing about humans is they seem to generate goals.
link |
01:21:13.840
They're, like you said, live wired. I mean, it's very flexible in terms of the goals
link |
01:21:19.920
and creative in terms of the goals we generate when we enter a room.
link |
01:21:23.800
You show up to a party without a goal usually, and then you figure it out along the way.
link |
01:21:28.280
Yes, but this goes back to the question about free will, which is when I walk into the party,
link |
01:21:33.480
if you rewound it 10,000 times, would I go and talk to that couple over there versus
link |
01:21:38.400
that person?
link |
01:21:39.400
Like, I might do this exact same thing every time because I've got some goal stack and
link |
01:21:43.920
I think, okay, well, at this party, I really want to meet these kind of people or I feel
link |
01:21:48.000
awkward or whatever my goals are.
link |
01:21:50.880
By the way, there was something that I meant to mention earlier, if you don't mind going
link |
01:21:55.560
back, which is this, when we were talking about BCI, so I don't know if you know this,
link |
01:22:00.400
but what I'm spending 90% of my time doing now is running a company.
link |
01:22:03.480
Do you know about this?
link |
01:22:04.480
Yes.
link |
01:22:05.480
I wasn't sure what the company is involved in.
link |
01:22:08.000
Right.
link |
01:22:09.000
Can you talk about it?
link |
01:22:10.000
Yeah.
link |
01:22:11.000
Yeah.
link |
01:22:12.000
With BCI, you can put stuff into the brain invasively, but my interest has been
link |
01:22:20.720
how you can get data streams into the brain noninvasively.
link |
01:22:24.240
So I run a company called Neosensory and what we build is this little wristband.
link |
01:22:29.720
We've built this in many different form factors.
link |
01:22:31.400
Oh, that's it?
link |
01:22:32.400
Yeah, this is it.
link |
01:22:33.400
And it's got these vibratory motors in it.
link |
01:22:35.560
So this thing, as I'm speaking, for example, is capturing my voice and running algorithms
link |
01:22:40.960
and then turning that into patterns of vibration here.
link |
01:22:44.600
So people who are deaf, for example, learn to hear through their skin.
link |
01:22:50.920
So the information is getting up to their brain this way and they learn how to hear.
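(To make the idea concrete, here is a minimal sketch in Python of that kind of sound-to-touch mapping. This is only an illustration of the general principle of sensory substitution, not Neosensory's actual algorithm; the band-splitting scheme, the four-motor count, and the function name are assumptions.)

import numpy as np

def sound_to_vibration(frame, n_motors=4):
    # Hypothetical sensory-substitution sketch: split the audio spectrum
    # into n_motors frequency bands and drive each vibratory motor with
    # the energy in its band.
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    bands = np.array_split(spectrum, n_motors)  # low to high frequency
    energies = np.array([band.mean() for band in bands])
    peak = energies.max()
    # Normalize to 0..1 motor drive levels (guard against silence).
    return energies / peak if peak > 0 else energies

# Example: a 50 ms frame of a 440 Hz tone mostly drives the lowest-band motor.
sample_rate = 16000
t = np.arange(0, 0.05, 1 / sample_rate)
print(sound_to_vibration(np.sin(2 * np.pi * 440 * t)))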
link |
01:22:55.920
So it turns out on day one, people are pretty good, like better than you'd expect at being
link |
01:22:59.640
able to say, oh, that's weird.
link |
01:23:01.520
Was that a dog barking?
link |
01:23:02.520
Was that a baby crying?
link |
01:23:03.520
Was that a door knock?
link |
01:23:04.520
A doorbell?
link |
01:23:05.520
Like people are pretty good at it, but with time they get better and better and what
link |
01:23:09.560
it becomes is a new qualia.
link |
01:23:12.440
In other words, a new subjective internal experience.
link |
01:23:15.520
So on day one, they say, whoa, what was it?
link |
01:23:18.840
Oh, that was the dog barking.
link |
01:23:20.800
But by three months later, they say, oh, there's a dog barking somewhere.
link |
01:23:24.600
Oh, there's the dog.
link |
01:23:25.600
That's fascinating.
link |
01:23:26.600
And by the way, that's exactly how you learn how to use your ears.
link |
01:23:29.480
So of course, you don't remember this, but when you were an infant, all you have is
link |
01:23:33.160
your eardrum vibrating, causing spikes to go down your auditory nerves and impinge on your
link |
01:23:38.960
auditory cortex.
link |
01:23:39.960
Your brain doesn't know what those mean automatically, but what happens is you learn how to hear
link |
01:23:44.800
by looking for correlations, you know, you clap your hands as a baby, you know, you look
link |
01:23:49.160
at your mother's mouth moving and that correlates with what's going on there.
link |
01:23:53.480
And eventually your brain says, all right, I'm just going to summarize this as an internal
link |
01:23:57.280
experience, as a conscious experience.
link |
01:23:59.600
And that's exactly what happens here.
link |
01:24:01.360
The weird part is that you can feed data into the brain, not through the ears, but through
link |
01:24:05.520
any channel that gets there, as long as the information gets there, your brain figures
link |
01:24:09.240
out what to do with it.
link |
01:24:10.240
That's fascinating.
link |
01:24:11.240
And the set of sensors could expand arbitrarily,
link |
01:24:20.440
which is fascinating.
link |
01:24:21.440
Well, exactly.
link |
01:24:22.440
And by the way, here's the reason I use the skin: you know, there's all kinds of cool stuff
link |
01:24:25.920
going on in the AR world with glasses.
link |
01:24:28.320
But the fact is your eyes are overtaxed and your ears are overtaxed and you need to be
link |
01:24:31.400
able to see and hear other stuff, but you're covered with skin, which is this incredible
link |
01:24:36.240
computational material through which you can feed information, and we don't use our skin
link |
01:24:40.920
for much of anything nowadays.
link |
01:24:43.600
My joke in the lab is that we don't call it the waist for nothing, because originally
link |
01:24:46.680
we built this as a vest, and you're passing in all this information that way.
link |
01:24:51.760
And what I'm doing here with the deaf community is what's called sensory substitution, where
link |
01:24:59.520
I'm capturing sound and, you know, I'm just replacing the ears with the skin and
link |
01:25:04.040
that works.
link |
01:25:05.040
One of the things I talk about in LiveWired is sensory expansion.
link |
01:25:10.000
So what if you took something like your visual system, which picks up on a very thin slice
link |
01:25:13.760
of the electromagnetic spectrum and you could see infrared or ultraviolet.
link |
01:25:18.880
So we've hooked up infrared and ultraviolet detectors and, you know, I can feel what's
link |
01:25:22.760
going on.
link |
01:25:23.760
So just as an example, the first night, one of my engineers built
link |
01:25:26.480
the infrared detector, I was walking in the dark between two houses and suddenly
link |
01:25:30.080
I felt all this infrared radiation.
link |
01:25:32.040
I was like, where does that come from?
link |
01:25:33.040
And I just followed my wrist and I found an infrared camera, a night vision camera that
link |
01:25:37.520
was there. But, like, you know, I immediately knew, oh, there's that thing there.
link |
01:25:40.720
Of course, I would have never seen it, but now it's just part of my reality.
link |
01:25:45.880
That's fascinating.
link |
01:25:46.880
Yeah.
link |
01:25:47.880
And then of course, what I'm really interested in is sensory addition.
link |
01:25:50.400
What if you could pick up on stuff that isn't even part of what we normally pick up
link |
01:25:55.280
on like, you know, like the magnetic field of the earth or Twitter or stock market or
link |
01:26:00.120
things like that.
link |
01:26:01.120
Or, I don't know, some weird stuff, like the moods of other people or something like
link |
01:26:04.480
that.
link |
01:26:05.480
Sure.
link |
01:26:06.480
Now what you need is a way to measure that.
link |
01:26:07.480
So as long as there's a machine that can measure it, it's easy, it's trivial to feed
link |
01:26:09.760
this in here, and it comes to be part of your reality.
link |
01:26:14.840
It's like you have another sensor.
link |
01:26:16.760
And that kind of thing is without, like Neuralink, without, I forgot
link |
01:26:21.800
how you put it, but it was eloquent, you know, without cutting into the brain, basically.
link |
01:26:26.480
Yeah, exactly.
link |
01:26:27.480
Exactly.
link |
01:26:28.480
So this costs, at the moment, $399.
link |
01:26:30.480
That's not going to kill you.
link |
01:26:32.600
It's not going to kill you.
link |
01:26:33.880
You just put it on and when you're done, you take it off.
link |
01:26:36.520
Yeah.
link |
01:26:37.520
And so the name of the company, by the way, is Neosensory, for new senses, because
link |
01:26:41.880
the whole idea is beautiful.
link |
01:26:44.040
You can, as I said, you know, you come to the table with certain plug and play devices
link |
01:26:47.720
and then that's it.
link |
01:26:48.720
Like I can pick up on this little bit of the electromagnetic radiation and pick up
link |
01:26:51.520
on this little frequency band for hearing and so on, but I'm stuck there, and
link |
01:26:56.720
there's no reason we have to be stuck there.
link |
01:26:58.240
We can expand our umwelt by adding new senses.
link |
01:27:01.800
Yeah.
link |
01:27:02.800
What's umwelt?
link |
01:27:03.800
Oh, I'm sorry.
link |
01:27:04.800
The umwelt is the slice of reality that you pick up on.
link |
01:27:07.000
So each animal has its own.
link |
01:27:08.520
Hell of a word.
link |
01:27:09.520
Umwelt.
link |
01:27:10.520
Yeah, exactly.
link |
01:27:11.520
Nice.
link |
01:27:12.520
I'm sorry, I forgot to define it before.
link |
01:27:13.520
It's such an important concept, which is to say, for example, if you are a
link |
01:27:19.080
tick, you pick up on butyric acid, an odor, and you pick up on temperature.
link |
01:27:24.040
That's it.
link |
01:27:25.040
That's how you construct your reality, with those two sensors.
link |
01:27:27.280
If you are a blind echolocating bat, you're picking up on air compression waves coming
link |
01:27:31.560
back, you know, echolocation.
link |
01:27:33.000
If you are the black ghost knifefish, you're picking up on changes in the electrical field
link |
01:27:38.280
around you with electroreception.
link |
01:27:40.840
That's how they swim around and tell there's a rock there and so on.
link |
01:27:43.680
But that's, that's all they pick up on.
link |
01:27:45.560
That's their umwelt.
link |
01:27:47.040
Those are the signals they get from the world from which to construct their reality,
link |
01:27:51.600
and they can be totally different umwelts.
link |
01:27:54.480
And so our human umwelt is, you know, we've got little bits that we can pick up on.
link |
01:27:59.920
One of the things I like to do with my students is talk about, um, imagine that you are a
link |
01:28:04.920
bloodhound dog, right?
link |
01:28:06.200
You are a bloodhound dog with a huge snout with 200 million scent receptors in it and
link |
01:28:10.120
your whole world is about smelling.
link |
01:28:11.720
You know, you've got slits in your nostrils, you take big nosefuls of air, and so on.
link |
01:28:15.760
Do you have a dog?
link |
01:28:16.760
Nope, used to.
link |
01:28:17.760
Used to.
link |
01:28:18.760
Okay, right.
link |
01:28:19.760
So, you know, you walk your dog around and your dog is smelling everything.
link |
01:28:21.440
The whole world is full of signals that you do not pick up on and so imagine if you were
link |
01:28:25.080
that dog and you looked at your human master and thought, my God, what is it like to have
link |
01:28:28.400
the pitiful little nose of a human?
link |
01:28:30.880
How could you not know that there's a cat 100 yards away or that your friend was here
link |
01:28:33.840
six hours ago?
link |
01:28:35.000
And so the idea is, because we're stuck in our umwelt, because we have this little pitiful
link |
01:28:39.160
nose, we think, okay, well, yeah, we're seeing reality, but you can have very
link |
01:28:43.560
different sorts of realities depending on the peripheral plug and play devices you're
link |
01:28:47.360
equipped with.
link |
01:28:48.360
It's fascinating to think that, like, if we're being honest, our umwelt is probably, you
link |
01:28:54.160
know, some infinitely tiny percent of the possibilities of how you can sense quote, unquote, reality,
link |
01:29:03.600
even if you could.
link |
01:29:04.600
I mean, there's a guy named Donald Hoffman, yeah, who basically says we're really far
link |
01:29:12.440
away from reality in terms of our ability to sense anything.
link |
01:29:16.000
Like we're very, we're almost like we're floating out there.
link |
01:29:21.000
Almost like completely detached from the actual physical reality.
link |
01:29:24.280
It's fascinating that we can have extra senses that could help us get a little
link |
01:29:29.200
bit closer.
link |
01:29:30.200
Exactly.
link |
01:29:31.200
And by the way, this has been the fruit of science, realizing, like, yeah, for example,
link |
01:29:35.880
you know, you open your eyes and there's the world around you, right?
link |
01:29:38.320
But of course, depending on how you calculate it, it's less than a ten-trillionth of the
link |
01:29:42.360
electromagnetic spectrum that we call visible light.
link |
01:29:44.960
The reason I say it depends is because, you know, the spectrum is actually infinite in all directions,
link |
01:29:48.680
presumably.
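A back-of-envelope sketch of that calculation (rough numbers assumed here, not figures quoted in the conversation): visible light spans roughly $4.3\times10^{14}$ to $7.5\times10^{14}$ Hz, so

\[
\Delta\nu_{\text{visible}} \approx 3.2\times10^{14}\ \text{Hz},
\qquad
\text{fraction} \approx \frac{\Delta\nu_{\text{visible}}}{\nu_{\max}}.
\]

With a cutoff at hard gamma rays, $\nu_{\max}\approx 10^{25}$ Hz, the fraction is about $3\times10^{-11}$; a ten-trillionth ($10^{-13}$) corresponds to $\nu_{\max}\approx 3\times10^{27}$ Hz. The figure shrinks without bound as the cutoff grows, which is the sense in which it depends on how you calculate it.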
link |
01:29:49.680
Yeah.
link |
01:29:50.680
And so that's exactly that.
link |
01:29:51.680
And then science allows you to actually look into the rest of it.
link |
01:29:55.240
Exactly.
link |
01:29:56.240
Sort of understanding how big the world is out there.
link |
01:29:57.240
And the same with the world of really small and the world of really large.
link |
01:30:00.240
Exactly.
link |
01:30:01.240
That goes beyond our ability to sense.
link |
01:30:03.000
Exactly.
link |
01:30:04.000
And so the reason I think this kind of thing matters is because we now have an opportunity
link |
01:30:07.760
for the first time in human history to say, okay, well, I'm just going to include other
link |
01:30:12.680
things in my umwelt.
link |
01:30:13.680
So I'm going to include infrared radiation and have a direct perceptual experience
link |
01:30:18.040
of that.
link |
01:30:19.200
And so, you know, I mean, I've given up my lab and I run this
link |
01:30:23.400
company 90% of my time now.
link |
01:30:25.600
That's what I'm doing.
link |
01:30:26.600
I still teach at Stanford and I'm, you know, teaching courses and stuff like that.
link |
01:30:29.880
But this is, this is your passion.
link |
01:30:33.080
The fire is on this.
link |
01:30:34.960
Yeah.
link |
01:30:35.960
I feel like this is the most important thing that's
link |
01:30:37.600
happening right now.
link |
01:30:38.880
I mean, obviously I think that because that's what I'm devoting my time and my life to, but,
link |
01:30:42.480
um.
link |
01:30:43.480
I mean, it's a brilliant set of ideas.
link |
01:30:45.280
It certainly is, um, a step in, uh, a very vibrant future, I would say.
link |
01:30:53.040
Like the possibilities there are endless.
link |
01:30:56.120
Exactly.
link |
01:30:57.120
So if you ask what I think about Neuralink, I think it's amazing what those guys are doing
link |
01:31:01.160
and working on, but I think it's not practical for almost everybody.
link |
01:31:04.920
For example, for people who are deaf, they buy this and, you know, every day we're getting
link |
01:31:09.200
tons of emails and tweets or whatever from people saying, wow, I picked up on this and
link |
01:31:12.840
then I had no idea that was, I didn't even know that was happening out there, and they're
link |
01:31:16.920
coming to hear, by the way, this is, you know, less than a tenth of the price of a hearing
link |
01:31:21.320
aid and like 250 times less than a cochlear implant.
link |
01:31:25.320
That's amazing.
link |
01:31:27.360
People love hearing about what, you know, brilliant folks like yourself could
link |
01:31:33.240
recommend in terms of books. Of course, you're an author of many books.
link |
01:31:37.080
So I'll, in the introduction, mention all the books you've written.
link |
01:31:40.480
People should definitely read LiveWired.
link |
01:31:42.480
I've gotten a chance to read some of it and it's amazing.
link |
01:31:44.960
But are there three books, technical, fiction, philosophical, that had an impact on you when
link |
01:31:52.200
you were younger or today, and, uh, books, perhaps some of which you would want to
link |
01:31:58.880
recommend that others read? You know, as an undergraduate, I majored in British
link |
01:32:03.200
and American literature.
link |
01:32:04.200
That was my major because I love literature.
link |
01:32:06.760
I grew up with, um, literature.
link |
01:32:08.680
My father had these extensive bookshelves and so I grew up in the mountains in New Mexico
link |
01:32:13.680
and so that was mostly how I spent my time, reading books.
link |
01:32:16.240
But, um, you know, I love, uh, you know, Faulkner, Hemingway.
link |
01:32:23.640
I love many South American authors, Gabriel Garcia Marquez and Italo Calvino.
link |
01:32:28.200
I would actually recommend Invisible Cities.
link |
01:32:29.840
I just, I loved that book.
link |
01:32:31.560
Uh, Italo Calvino, sorry.
link |
01:32:33.760
It's a book of fiction. Um, Anthony Doerr wrote a book called All the Light We Cannot
link |
01:32:40.040
See, which actually, uh, was inspired by Incognito, by exactly what we were talking about earlier,
link |
01:32:45.880
about how you can only see a little bit, what we call visible light, of the electromagnetic
link |
01:32:50.760
radiation.
link |
01:32:51.760
I wrote about this in Incognito and then he reviewed Incognito for the Washington Post.
link |
01:32:54.800
Oh no, that's awesome.
link |
01:32:55.800
And then he wrote this book called, the book has nothing to do with that, but that's where
link |
01:32:59.120
the title comes from.
link |
01:33:00.120
Yeah.
link |
01:33:01.120
The "light we cannot see" is about the rest of the spectrum, but, um, that's
link |
01:33:06.080
an absolutely gorgeous book.
link |
01:33:07.760
That's a book of fiction.
link |
01:33:09.280
Yeah, it's a book of fiction.
link |
01:33:10.280
What's, what's it about?
link |
01:33:12.280
It takes place during World War II, uh, about these two young people, one of whom is blind.
link |
01:33:16.600
Got it.
link |
01:33:17.600
And, um, yeah.
link |
01:33:18.600
Anything else?
link |
01:33:19.600
So, you mentioned Hemingway.
link |
01:33:21.880
I mean, The Old Man and the Sea, uh, what's your favorite?
link |
01:33:27.720
The Snows of Kilimanjaro, a collection of short stories I love. Um, as far
link |
01:33:32.480
as nonfiction goes, I grew up, uh, with Cosmos, both watching the PBS series and reading the
link |
01:33:37.800
book and that influenced me a huge amount in terms of what I do.
link |
01:33:41.040
From the time I was a kid, I felt like I wanted to be Carl Sagan, like, that's
link |
01:33:45.560
what I loved.
link |
01:33:46.560
And in the end, I just, you know, I studied space physics for a while as an undergrad,
link |
01:33:51.040
but then, in my last semester, I discovered neuroscience, and I just thought,
link |
01:33:56.000
wow, I'm hooked on that.
link |
01:33:57.640
So the Carl Sagan of the brain is the aspiration.
link |
01:34:03.200
Yeah.
link |
01:34:04.200
I mean, you're doing an incredible job of it.
link |
01:34:08.000
So you opened the book LiveWired with a quote by Heidegger: every man is born as many men
link |
01:34:13.840
and dies as a single one.
link |
01:34:16.080
Uh, well, what do you mean?
link |
01:34:19.000
Or what?
link |
01:34:20.000
Well, I'll tell you what I meant by it.
link |
01:34:21.440
I'll tell you.
link |
01:34:22.440
So he had his own reasons for writing that, but I meant this in terms of brain plasticity,
link |
01:34:25.760
in terms of livewiring, which is this issue that I mentioned before about this, you know, this
link |
01:34:29.560
cone, the space-time cone that we are in, which is that when you dropped into the world,
link |
01:34:35.960
you, Lex, had all this different potential.
link |
01:34:38.040
You could have been a great surfer or a great chess player, you could have been thousands
link |
01:34:44.160
of different men when you grew up.
link |
01:34:46.360
But through things that were not your choice, and your choices along the way,
link |
01:34:50.760
you know, you ended up navigating a particular path and now you're exactly who you are.
link |
01:34:54.240
You used to have lots of potential, but the day you die, you will be exactly Lex.
link |
01:34:59.200
You will be that one person.
link |
01:35:01.120
Yeah.
link |
01:35:02.120
So on that, in that context, I mean, first of all, it's just a beautiful, it's a humbling
link |
01:35:07.360
picture, but it's a beautiful one because it's, uh, all the possible trajectories and
link |
01:35:12.800
you pick one, you walk down that road and it's the Robert Frost poem.
link |
01:35:16.440
But on that topic, let me ask the biggest and the most ridiculous question.
link |
01:35:21.520
So in this livewired brain, when we choose all these different trajectories and end up
link |
01:35:25.400
with one, what's the meaning of it all?
link |
01:35:28.160
What's, uh, is there, is there a why here?
link |
01:35:32.320
What's the meaning of life?
link |
01:35:34.200
Yeah.
link |
01:35:35.200
David Eagleman.
link |
01:35:36.200
That's it.
link |
01:35:37.200
I mean, this is the question that everyone has attacked from their own livewired point
link |
01:35:45.040
of view, by which I mean culturally, if you grow up in a religious society, you have one
link |
01:35:49.680
way of attacking that question, and if you grow up in a secular or scientific society,
link |
01:35:53.120
you have a different way of attacking that question.
link |
01:35:55.320
Obviously, I don't know.
link |
01:35:57.360
I abstain on that question.
link |
01:35:59.680
Yeah.
link |
01:36:00.680
Uh, I mean, I think one of the fundamental things, I guess, in that, in all those possible
link |
01:36:05.480
trajectories is, uh, you're always asking. I mean, the act of asking what the
link |
01:36:12.000
heck is this thing for is equivalent to, or at least runs in parallel to all the choices
link |
01:36:19.440
that you're making because it's kind of, that's the underlying question.
link |
01:36:23.320
Well, that's right.
link |
01:36:24.320
And by the way, you know, this is the interesting thing about human psychology, you know, we've
link |
01:36:27.960
got all these layers of things at which we can ask questions.
link |
01:36:30.920
And so if you keep asking yourself the question, what is the optimal way for me to be
link |
01:36:35.840
spending my time?
link |
01:36:36.840
What should I be doing?
link |
01:36:37.840
What charity should I get involved with?
link |
01:36:38.840
And so if you're asking those big questions, that steers you appropriately.
link |
01:36:44.720
If you're the type of person who never asks, Hey, is there something better I could be
link |
01:36:47.520
doing with my time, then presumably you won't optimize whatever it is that is important
link |
01:36:52.720
to you.
link |
01:36:53.720
So, uh, I think just in your eyes, in your work, there's a passion that
link |
01:37:01.000
just is obvious and it's inspiring, it's contagious.
link |
01:37:05.200
What, um, if you were to give advice to a young person today in the crazy chaos that
link |
01:37:12.640
we live in today, uh, about life, about how to discover their passion,
link |
01:37:21.680
are there some words that you could give?
link |
01:37:25.360
First of all, I would say the main thing for a young person is stay adaptable.
link |
01:37:30.420
And this is back to this issue of why COVID is useful for us because it forces us off
link |
01:37:34.400
our tracks.
link |
01:37:36.160
The fact is, the jobs that will exist 20 years from now, we don't even have names for them.
link |
01:37:40.520
We can't even imagine the jobs that are going to exist.
link |
01:37:43.120
And so when young people that I know go into college and they say, Hey, what should I major
link |
01:37:46.640
in and so on?
link |
01:37:48.320
College is and should be less and less vocational as in, Oh, I'm going to learn how to do this
link |
01:37:52.560
and then I'm going to do that the rest of my career.
link |
01:37:54.600
The world just isn't that way anymore, with the exponential speed of things.
link |
01:37:58.400
So the important thing is learning how to learn, learning how to be livewired and adaptable.
link |
01:38:04.240
That's really key.
link |
01:38:05.240
And what I advise young people when I talk to them is, you know, what you
link |
01:38:10.360
digest, that's what gives you the raw storehouse of things that you can remix and
link |
01:38:16.320
be creative with.
link |
01:38:17.920
And so eat broadly and widely.
link |
01:38:21.960
And obviously this is the wonderful thing about the internet world we live in now is
link |
01:38:25.040
you kind of can't help it.
link |
01:38:26.040
You're constantly, whoa, you know, you go down some rabbit hole of Wikipedia and
link |
01:38:29.320
you think, Oh, I didn't even realize that was a thing.
link |
01:38:31.160
I didn't know that existed.
link |
01:38:32.760
And so embrace that, embrace that.
link |
01:38:35.480
Yeah, exactly.
link |
01:38:36.480
And what I tell people is just always do a gut check about, okay, I'm reading this paper
link |
01:38:41.320
and, yeah, I think that's fine, but this paper, wow, I really cared about that.
link |
01:38:47.320
I tell them just to keep a real sniff out for that.
link |
01:38:50.600
And when you find those things, keep going down those paths.
link |
01:38:53.360
Yeah, don't be afraid.
link |
01:38:55.040
I mean, that's one of the challenges and the downsides of having so many beautiful
link |
01:38:59.360
options is that sometimes people are a little bit afraid to really commit, but that's
link |
01:39:04.920
very true.
link |
01:39:05.920
I mean, if there's something that just sparks your interest and passion, just run with it.
link |
01:39:10.720
I mean, it goes back to the Heidegger quote, I mean, we only get this one life and
link |
01:39:16.560
that trajectory, it doesn't last forever.
link |
01:39:20.320
So if something sparks your imagination, your passion, run with it.
link |
01:39:25.160
Yeah, exactly.
link |
01:39:26.160
I don't think there's a more beautiful way to end it.
link |
01:39:29.880
David, it's a huge honor to finally meet you.
link |
01:39:32.720
Your work is inspiring so many people, I've talked to so many people who are passionate
link |
01:39:36.280
about neuroscience, about the brain, even outside that read your book.
link |
01:39:40.960
So I hope, I hope you keep doing so.
link |
01:39:43.720
I, I think you're already there with Carl Sagan.
link |
01:39:46.080
I hope you continue growing.
link |
01:39:47.600
Um, yeah, it was an honor talking with you today.
link |
01:39:50.200
Thanks so much.
link |
01:39:51.200
Great, you too, Lex.
link |
01:39:52.200
Wonderful.
link |
01:39:53.200
Thanks for listening to this conversation with David Eagleman and thank you to our sponsors,
link |
01:39:58.080
Athletic Greens, BetterHelp and Cash App.
link |
01:40:01.680
Click the sponsor links in the description to get a discount and to support this podcast.
link |
01:40:06.720
If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcasts,
link |
01:40:11.960
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.
link |
01:40:18.520
And now let me leave you with some words from David Eagleman in his book Sum: Forty Tales
link |
01:40:23.400
from the Afterlives.
link |
01:40:25.560
Imagine for a moment that we are nothing but the product of billions of years of molecules
link |
01:40:30.400
coming together and ratcheting up through natural selection.
link |
01:40:35.240
That we are composed only of highways of fluids and chemicals sliding along roadways within
link |
01:40:40.240
billions of dancing cells, that trillions of synaptic conversations hum in parallel, that
link |
01:40:46.400
this vast, egg-like fabric of micron-thin circuitry runs algorithms undreamt of in modern science,
link |
01:40:53.640
and that these neural programs give rise to our decision making, loves, desires, fears
link |
01:41:01.200
and aspirations.
link |
01:41:03.080
To me, understanding this would be a numinous experience, better than anything ever proposed
link |
01:41:09.640
in any holy text.
link |
01:41:12.240
Thank you for listening and hope to see you next time.