
Rosalind Picard: Affective Computing, Emotion, Privacy, and Health | Lex Fridman Podcast #24



link |
00:00:00.000
The following is a conversation with Rosalind Picard.
link |
00:00:02.880
She's a professor at MIT,
link |
00:00:04.540
director of the Affective Computing Research Group
link |
00:00:06.880
at the MIT Media Lab,
link |
00:00:08.360
and cofounder of two companies, Affectiva and Empatica.
link |
00:00:12.440
Over two decades ago,
link |
00:00:13.560
she launched the field of affective computing
link |
00:00:15.420
with her book of the same name.
link |
00:00:17.560
This book described the importance of emotion
link |
00:00:20.040
in artificial and natural intelligence,
link |
00:00:23.040
and the vital role emotional communication
link |
00:00:25.320
has in the relationship between people in general
link |
00:00:28.520
and in human robot interaction.
link |
00:00:30.880
I really enjoyed talking with Roz over so many topics,
link |
00:00:34.000
including emotion, ethics, privacy, wearable computing,
link |
00:00:37.440
and her recent research in epilepsy,
link |
00:00:39.680
and even love and meaning.
link |
00:00:42.600
This conversation is part
link |
00:00:43.960
of the Artificial Intelligence Podcast.
link |
00:00:46.000
If you enjoy it, subscribe on YouTube, iTunes,
link |
00:00:48.720
or simply connect with me on Twitter at Lex Fridman,
link |
00:00:51.920
spelled F R I D.
link |
00:00:53.960
And now, here's my conversation with Rosalind Picard.
link |
00:00:59.480
More than 20 years ago,
link |
00:01:00.720
you coined the term affective computing
link |
00:01:03.320
and led a lot of research in this area since then.
link |
00:01:06.680
As I understand, the goal is to make the machine detect
link |
00:01:09.220
and interpret the emotional state of a human being
link |
00:01:12.380
and adapt the behavior of the machine
link |
00:01:14.200
based on the emotional state.
link |
00:01:16.120
So how has your understanding of the problem space
link |
00:01:19.920
defined by affective computing changed in the past 24 years?
link |
00:01:25.360
So it's the scope, the applications, the challenges,
link |
00:01:28.880
what's involved, how has that evolved over the years?
link |
00:01:32.120
Yeah, actually, originally,
link |
00:01:33.400
when I defined the term affective computing,
link |
00:01:36.880
it was a bit broader than just recognizing
link |
00:01:40.120
and responding intelligently to human emotion,
link |
00:01:42.260
although those are probably the two pieces
link |
00:01:44.520
that we've worked on the hardest.
link |
00:01:47.120
The original concept also encompassed machines
link |
00:01:50.680
that would have mechanisms
link |
00:01:52.480
that functioned like human emotion does inside them.
link |
00:01:55.680
It would be any computing that relates to, arises from,
link |
00:01:59.000
or deliberately influences human emotion.
link |
00:02:02.560
So the human computer interaction part
link |
00:02:05.160
is the part that people tend to see,
link |
00:02:07.880
like if I'm really ticked off at my computer
link |
00:02:11.000
and I'm scowling at it and I'm cursing at it
link |
00:02:13.480
and it just keeps acting smiling and happy
link |
00:02:15.720
like that little paperclip used to do,
link |
00:02:17.880
dancing, winking, that kind of thing
link |
00:02:22.200
just makes you even more frustrated, right?
link |
00:02:24.640
And I thought that stupid thing needs to see my affect.
link |
00:02:29.120
And if it's gonna be intelligent,
link |
00:02:30.640
which Microsoft researchers had worked really hard on,
link |
00:02:33.000
it actually had some of the most sophisticated AI
link |
00:02:34.920
in it at the time,
link |
00:02:36.240
that thing's gonna actually be smart.
link |
00:02:38.000
It needs to respond to me and you,
link |
00:02:41.600
and we can send it very different signals.
link |
00:02:45.360
So by the way, just a quick interruption,
link |
00:02:47.160
the Clippy, maybe it's in Word 95, 98,
link |
00:02:52.600
I don't remember when it was born,
link |
00:02:54.360
but do you find, when you make that reference,
link |
00:02:58.320
that people recognize what you're talking about
link |
00:03:00.320
still to this point?
link |
00:03:01.680
I don't expect the newest students to these days,
link |
00:03:05.200
but I've mentioned it to a lot of audiences,
link |
00:03:07.200
like how many of you know this Clippy thing?
link |
00:03:09.240
And still the majority of people seem to know it.
link |
00:03:11.720
So Clippy kind of looks at maybe natural language processing
link |
00:03:15.340
where you were typing and tries to help you complete,
link |
00:03:18.200
I think.
link |
00:03:19.280
I don't even remember what Clippy was, except annoying.
link |
00:03:22.520
Yeah, some people actually liked it.
link |
00:03:25.840
I would hear those stories.
link |
00:03:27.520
You miss it?
link |
00:03:28.480
Well, I miss the annoyance.
link |
00:03:31.300
They felt like there's an element.
link |
00:03:34.080
Someone was there.
link |
00:03:34.920
Somebody was there and we were in it together
link |
00:03:36.960
and they were annoying.
link |
00:03:37.800
It's like a puppy that just doesn't get it.
link |
00:03:40.880
They keep ripping up the couch, kind of thing.
link |
00:03:42.200
And in fact, they could have done it smarter like a puppy.
link |
00:03:44.960
If they had done, like if when you yelled at it
link |
00:03:48.000
or cursed at it,
link |
00:03:49.040
if it had put its little ears back and its tail down
link |
00:03:51.800
and slunk off,
link |
00:03:52.960
probably people would have wanted it back, right?
link |
00:03:55.900
But instead, when you yelled at it, what did it do?
link |
00:03:58.600
It smiled, it winked, it danced, right?
link |
00:04:01.260
If somebody comes to my office and I yell at them,
link |
00:04:03.200
they start smiling, winking and dancing.
link |
00:04:04.760
I'm like, I never want to see you again.
link |
00:04:06.760
So Bill Gates got a standing ovation
link |
00:04:08.520
when he said it was going away
link |
00:04:10.160
because people were so ticked.
link |
00:04:12.360
It was so emotionally unintelligent, right?
link |
00:04:15.040
It was intelligent about whether you were writing a letter,
link |
00:04:18.160
what kind of help you needed for that context.
link |
00:04:20.880
It was completely unintelligent about,
link |
00:04:23.440
hey, if you're annoying your customer,
link |
00:04:25.760
don't smile in their face when you do it.
link |
00:04:28.400
So that kind of mismatch was something
link |
00:04:32.360
the developers just didn't think about.
link |
00:04:35.080
And intelligence at the time was really all about math
link |
00:04:39.520
and language and chess and games,
link |
00:04:44.960
problems that could be pretty well defined.
link |
00:04:47.920
Social emotional interaction is much more complex
link |
00:04:50.880
than chess or Go or any of the games
link |
00:04:53.640
that people are trying to solve.
link |
00:04:56.060
And in order to understand that required skills
link |
00:04:58.720
that most people in computer science
link |
00:05:00.320
actually were lacking personally.
link |
00:05:02.600
Well, let's talk about computer science.
link |
00:05:03.800
Have things gotten better since the work,
link |
00:05:06.400
since the message,
link |
00:05:07.920
since you've really launched the field
link |
00:05:09.520
with a lot of research work in this space?
link |
00:05:11.320
I still find as a person like yourself,
link |
00:05:14.080
who's deeply passionate about human beings
link |
00:05:16.680
and yet am in computer science,
link |
00:05:18.860
there still seems to be a lack of,
link |
00:05:22.440
sorry to say, empathy among us computer scientists.
link |
00:05:26.800
Yeah, well.
link |
00:05:27.800
Or hasn't gotten better.
link |
00:05:28.880
Let's just say there's a lot more variety
link |
00:05:30.720
among computer scientists these days.
link |
00:05:32.400
Computer scientists are a much more diverse group today
link |
00:05:35.000
than they were 25 years ago.
link |
00:05:37.600
And that's good.
link |
00:05:39.000
We need all kinds of people to become computer scientists
link |
00:05:41.760
so that computer science reflects more what society needs.
link |
00:05:45.580
And there's brilliance among every personality type.
link |
00:05:49.080
So it need not be limited to people
link |
00:05:52.000
who prefer computers to other people.
link |
00:05:54.080
How hard do you think it is?
link |
00:05:55.800
Your view of how difficult it is to recognize emotion
link |
00:05:58.580
or to create a deeply emotionally intelligent interaction.
link |
00:06:03.920
Has it gotten easier or harder
link |
00:06:06.000
as you've explored it further?
link |
00:06:07.440
And how far away are we from cracking this?
link |
00:06:12.400
If you think of the Turing test as solving for intelligence,
link |
00:06:16.040
looking at the Turing test for emotional intelligence.
link |
00:06:20.720
I think it is as difficult as I thought it was gonna be.
link |
00:06:25.560
I think my prediction of its difficulty is spot on.
link |
00:06:29.240
I think the time estimates are always hard
link |
00:06:33.120
because they're always a function of society's love
link |
00:06:37.280
and hate of a particular topic.
link |
00:06:39.440
If society gets excited and you get thousands of researchers
link |
00:06:45.200
working on it for a certain application,
link |
00:06:49.000
that application gets solved really quickly.
link |
00:06:52.000
The general intelligence,
link |
00:06:54.320
the computer's complete lack of ability
link |
00:06:58.120
to have awareness of what it's doing,
link |
00:07:03.480
the fact that it's not conscious,
link |
00:07:05.480
the fact that there's no signs of it becoming conscious,
link |
00:07:08.580
the fact that it doesn't read between the lines,
link |
00:07:11.800
those kinds of things that we have to teach it explicitly,
link |
00:07:15.000
what other people pick up implicitly.
link |
00:07:17.440
We don't see that changing yet.
link |
00:07:20.360
There aren't breakthroughs yet that lead us to believe
link |
00:07:23.540
that that's gonna go any faster,
link |
00:07:25.280
which means that it's still gonna be kind of stuck
link |
00:07:28.640
with a lot of limitations
link |
00:07:31.240
where it's probably only gonna do the right thing
link |
00:07:34.000
in very limited, narrow, prespecified contexts
link |
00:07:37.120
where we can prescribe pretty much
link |
00:07:40.880
what's gonna happen there.
link |
00:07:42.800
So I don't see the,
link |
00:07:46.920
it's hard to predict a date
link |
00:07:47.960
because when people don't work on it, it's infinite.
link |
00:07:51.720
When everybody works on it, you get a nice piece of it
link |
00:07:56.000
well solved in a short amount of time.
link |
00:07:58.560
I actually think there's a more important issue right now
link |
00:08:01.520
than the difficulty of it.
link |
00:08:04.480
And that's causing some of us
link |
00:08:05.760
to put the brakes on a little bit.
link |
00:08:07.360
Usually we're all just like step on the gas,
link |
00:08:09.320
let's go faster.
link |
00:08:11.120
This is causing us to pull back and put the brakes on.
link |
00:08:14.160
And that's the way that some of this technology
link |
00:08:18.640
is being used in places like China right now.
link |
00:08:21.160
And that worries me so deeply
link |
00:08:24.480
that it's causing me to pull back myself
link |
00:08:27.760
on a lot of the things that we could be doing.
link |
00:08:30.040
And try to get the community to think a little bit more
link |
00:08:33.640
about, okay, if we're gonna go forward with that,
link |
00:08:36.000
how can we do it in a way that puts in place safeguards
link |
00:08:39.240
that protects people?
link |
00:08:41.080
So the technology we're referring to is
link |
00:08:43.480
just when a computer senses the human being,
link |
00:08:46.360
like the human face, right?
link |
00:08:48.560
So there's a lot of exciting things there,
link |
00:08:51.800
like forming a deep connection with the human being.
link |
00:08:53.880
So what are your worries, how that could go wrong?
link |
00:08:57.920
Is it in terms of privacy?
link |
00:08:59.400
Is it in terms of other kinds of more subtle things?
link |
00:09:02.880
But let's dig into privacy.
link |
00:09:04.200
So here in the US, if I'm watching a video
link |
00:09:07.680
of say a political leader,
link |
00:09:09.760
and in the US we're quite free as we all know
link |
00:09:13.520
to even criticize the president of the United States, right?
link |
00:09:17.800
Here that's not a shocking thing.
link |
00:09:19.320
It happens about every five seconds, right?
link |
00:09:22.600
But in China, what happens if you criticize
link |
00:09:27.600
the leader of the government, right?
link |
00:09:30.800
And so people are very careful not to do that.
link |
00:09:34.080
However, what happens if you're simply watching a video
link |
00:09:37.600
and you make a facial expression
link |
00:09:40.760
that shows a little bit of skepticism, right?
link |
00:09:45.000
Well, and here we're completely free to do that.
link |
00:09:47.920
In fact, we're free to fly off the handle
link |
00:09:50.440
and say anything we want, usually.
link |
00:09:54.440
I mean, there are some restrictions
link |
00:09:56.280
when the athlete does this
link |
00:09:58.800
as part of the national broadcast.
link |
00:10:00.800
Maybe the teams get a little unhappy
link |
00:10:03.800
about picking that forum to do it, right?
link |
00:10:05.840
But that's more a question of judgment.
link |
00:10:08.680
We have these freedoms,
link |
00:10:11.520
and in places that don't have those freedoms,
link |
00:10:14.120
what if our technology can read
link |
00:10:17.040
your underlying affective state?
link |
00:10:19.560
What if our technology can read it even noncontact?
link |
00:10:22.400
What if our technology can read it
link |
00:10:24.400
without your prior consent?
link |
00:10:28.800
And here in the US,
link |
00:10:30.360
in my first company we started, Affectiva,
link |
00:10:32.920
we have worked super hard to turn away money
link |
00:10:35.560
and opportunities that try to read people's affect
link |
00:10:38.400
without their prior informed consent.
link |
00:10:41.320
And even the software that is licensable,
link |
00:10:45.120
you have to sign things saying
link |
00:10:46.680
you will only use it in certain ways,
link |
00:10:48.360
which essentially is get people's buy in, right?
link |
00:10:52.080
Don't do this without people agreeing to it.
link |
00:10:56.760
There are other countries where they're not interested
link |
00:10:58.560
in people's buy in.
link |
00:10:59.520
They're just gonna use it.
link |
00:11:01.400
They're gonna inflict it on you.
link |
00:11:03.000
And if you don't like it,
link |
00:11:04.400
you better not scowl in the direction of any sensors.
link |
00:11:08.440
So one, let me just comment on a small tangent.
link |
00:11:11.400
Do you know with the idea of adversarial examples
link |
00:11:15.920
and deep fakes and so on,
link |
00:11:18.760
what you bring up is actually,
link |
00:11:20.760
in that one sense, deep fakes provide
link |
00:11:23.680
a comforting protection that you can no longer really trust
link |
00:11:30.640
that the video of your face was legitimate.
link |
00:11:34.560
And therefore you always have an escape clause
link |
00:11:37.040
if a government is trying,
link |
00:11:38.440
if a stable, balanced, ethical government
link |
00:11:44.800
is trying to accuse you of something,
link |
00:11:46.200
at least you have protection.
link |
00:11:47.080
You can say it was fake news, as is a popular term now.
link |
00:11:50.600
Yeah, that's the general thinking of it.
link |
00:11:52.360
We know how to go into the video
link |
00:11:54.360
and see, for example, your heart rate and respiration
link |
00:11:58.360
and whether or not they've been tampered with.
link |
00:12:02.200
And we also can put like fake heart rate and respiration
link |
00:12:05.520
in your video now too.
link |
00:12:06.680
We decided we needed to do that.
link |
00:12:10.440
After we developed a way to extract it,
link |
00:12:12.640
we decided we also needed a way to jam it.
link |
00:12:15.920
And so the fact that we took time to do that other step too,
link |
00:12:20.880
that was time that I wasn't spending
link |
00:12:22.520
making the machine more affectively intelligent.
link |
00:12:25.240
And there's a choice in how we spend our time,
link |
00:12:28.480
which is now being swayed a little bit less by this goal
link |
00:12:32.400
and a little bit more like by concern
link |
00:12:34.320
about what's happening in society
link |
00:12:36.560
and what kind of future do we wanna build.
link |
00:12:38.840
And as we step back and say,
link |
00:12:41.640
okay, we don't just build AI to build AI
link |
00:12:44.560
to make Elon Musk more money
link |
00:12:46.480
or to make Amazon Jeff Bezos more money.
link |
00:12:48.760
Good gosh, you know, that's the wrong ethic.
link |
00:12:52.840
Why are we building it?
link |
00:12:54.080
What is the point of building AI?
link |
00:12:57.160
It used to be, it was driven by researchers in academia
link |
00:13:01.520
to get papers published and to make a career for themselves
link |
00:13:04.120
and to do something cool, right?
link |
00:13:05.760
Like, cause maybe it could be done.
link |
00:13:08.480
Now we realize that this is enabling rich people
link |
00:13:12.440
to get vastly richer, and for the poor,
link |
00:13:17.200
the divide is even larger.
link |
00:13:19.760
And is that the kind of future that we want?
link |
00:13:22.840
Maybe we wanna think about, maybe we wanna rethink AI.
link |
00:13:25.880
Maybe we wanna rethink the problems in society
link |
00:13:29.080
that are causing the greatest inequity
link |
00:13:32.720
and rethink how to build AI
link |
00:13:35.000
that's not about a general intelligence,
link |
00:13:36.720
but that's about extending the intelligence
link |
00:13:39.280
and capability of the have nots
link |
00:13:41.200
so that we close these gaps in society.
link |
00:13:43.760
Do you hope that kind of stepping on the brake
link |
00:13:46.600
happens organically?
link |
00:13:47.920
Because I think still majority of the force behind AI
link |
00:13:51.160
is the desire to publish papers,
link |
00:13:52.720
is to make money without thinking about the why.
link |
00:13:55.480
Do you hope it happens organically?
link |
00:13:57.200
Is there room for regulation?
link |
00:14:01.040
Yeah, yeah, yeah, great questions.
link |
00:14:02.920
I prefer the, you know,
link |
00:14:05.920
they talk about the carrot versus the stick.
link |
00:14:07.320
I definitely prefer the carrot to the stick.
link |
00:14:09.120
And, you know, in our free world,
link |
00:14:12.360
we, there's only so much stick, right?
link |
00:14:14.880
You're gonna find a way around it.
link |
00:14:17.240
I generally think less regulation is better.
link |
00:14:21.160
That said, even though my position is classically carrot,
link |
00:14:24.400
no stick, no regulation,
link |
00:14:26.240
I think we do need some regulations in this space.
link |
00:14:29.040
I do think we need regulations
link |
00:14:30.680
around protecting people with their data,
link |
00:14:33.560
that you own your data, not Amazon, not Google.
link |
00:14:38.160
I would like to see people own their own data.
link |
00:14:40.760
I would also like to see the regulations
link |
00:14:42.440
that we have right now around lie detection
link |
00:14:44.480
being extended to emotion recognition in general,
link |
00:14:48.120
that right now you can't use a lie detector on an employee
link |
00:14:50.960
or on a candidate
link |
00:14:52.680
when you're interviewing them for a job.
link |
00:14:54.640
I think similarly, we need to put in place protection
link |
00:14:57.720
around reading people's emotions without their consent
link |
00:15:00.520
and in certain cases,
link |
00:15:02.120
like characterizing them for a job and other opportunities.
link |
00:15:06.080
I also think that when we're reading emotion
link |
00:15:09.120
that's predictive around mental health,
link |
00:15:11.640
that that should, even though it's not medical data,
link |
00:15:14.120
that that should get the kinds of protections
link |
00:15:16.040
that our medical data gets.
link |
00:15:18.440
What most people don't know yet
link |
00:15:19.960
is right now with your smartphone use,
link |
00:15:22.560
and if you're wearing a sensor
link |
00:15:25.160
and you wanna learn about your stress and your sleep
link |
00:15:27.680
and your physical activity
link |
00:15:28.960
and how much you're using your phone
link |
00:15:30.760
and your social interaction,
link |
00:15:32.560
all of that nonmedical data,
link |
00:15:34.880
when we put it together with machine learning,
link |
00:15:37.880
now called AI, even though the founders of AI
link |
00:15:40.080
wouldn't have called it that,
link |
00:15:42.840
that capability can not only tell that you're calm right now
link |
00:15:48.360
or that you're getting a little stressed,
link |
00:15:50.760
but it can also predict how you're likely to be tomorrow.
link |
00:15:53.840
If you're likely to be sick or healthy,
link |
00:15:55.760
happy or sad, stressed or calm.
link |
00:15:58.640
Especially when you're tracking data over time.
link |
00:16:00.560
Especially when we're tracking a week of your data or more.
link |
00:16:03.680
Do you have an optimism towards,
link |
00:16:05.600
you know, a lot of people on our phones
link |
00:16:07.720
are worried about this camera that's looking at us.
link |
00:16:10.280
For the most part, on balance,
link |
00:16:12.480
are you optimistic about the benefits
link |
00:16:16.000
that can be brought from that camera
link |
00:16:17.400
that's looking at billions of us?
link |
00:16:19.560
Or should we be more worried?
link |
00:16:24.520
I think we should be a little bit more worried
link |
00:16:28.840
about who's looking at us and listening to us.
link |
00:16:32.480
The device sitting on your countertop in your kitchen,
link |
00:16:36.680
whether it's, you know, Alexa or Google Home or Apple's Siri,
link |
00:16:42.160
these devices want to listen
link |
00:16:47.520
while, they say, ostensibly to help us.
link |
00:16:49.680
And I think there are great people in these companies
link |
00:16:52.080
who do want to help people.
link |
00:16:54.360
Let me not brand them all bad.
link |
00:16:56.160
I'm a user of products from all of these companies
link |
00:16:59.320
I'm naming all the A companies, Alphabet, Apple, Amazon.
link |
00:17:04.360
They are awfully big companies, right?
link |
00:17:09.120
They have incredible power.
link |
00:17:11.520
And you know, what if China were to buy them, right?
link |
00:17:17.200
And suddenly all of that data
link |
00:17:19.880
were not part of free America,
link |
00:17:22.440
but all of that data were part of somebody
link |
00:17:24.400
who just wants to take over the world
link |
00:17:26.640
and you submit to them.
link |
00:17:27.920
And guess what happens if you so much as smirk the wrong way
link |
00:17:32.120
when they say something that you don't like?
link |
00:17:34.560
Well, they have reeducation camps, right?
link |
00:17:37.440
That's a nice word for them.
link |
00:17:39.000
By the way, they have a surplus of organs
link |
00:17:41.440
for people who have surgery these days.
link |
00:17:43.320
They don't have an organ donation problem
link |
00:17:45.040
because they take your blood and they know you're a match.
link |
00:17:48.040
And the doctors are on record of taking organs
link |
00:17:51.800
from people who are perfectly healthy and not prisoners.
link |
00:17:55.360
They're just simply not the favored ones of the government.
link |
00:17:59.600
And you know, that's a pretty freaky evil society.
link |
00:18:04.480
And we can use the word evil there.
link |
00:18:06.480
I was born in the Soviet Union.
link |
00:18:07.840
I can certainly connect to the worry that you're expressing.
link |
00:18:13.080
At the same time, probably both you and I
link |
00:18:15.440
and you very much so,
link |
00:18:19.120
you know, there's an exciting possibility
link |
00:18:23.160
that you can have a deep connection with a machine.
link |
00:18:27.720
Yeah, yeah.
link |
00:18:28.640
Right, so.
link |
00:18:30.920
Those of us, I've admitted students who say that they,
link |
00:18:35.440
you know, when you list like,
link |
00:18:36.760
who do you most wish you could have lunch with
link |
00:18:39.400
or dinner with, right?
link |
00:18:41.400
And they'll write like, I don't like people.
link |
00:18:43.360
I just like computers.
link |
00:18:44.800
And one of them said to me once
link |
00:18:46.360
when I had this party at my house,
link |
00:18:49.520
I want you to know,
link |
00:18:51.160
this is my only social event of the year,
link |
00:18:53.160
my one social event of the year.
link |
00:18:55.560
Like, okay, now this is a brilliant
link |
00:18:57.680
machine learning person, right?
link |
00:18:59.280
And we need that kind of brilliance in machine learning.
link |
00:19:01.920
And I love that computer science welcomes people
link |
00:19:04.760
who love people and people who are very awkward
link |
00:19:07.200
around people.
link |
00:19:08.040
I love that this is a field that anybody could join.
link |
00:19:12.720
We need all kinds of people
link |
00:19:14.960
and you don't need to be a social person.
link |
00:19:16.720
I'm not trying to force people who don't like people
link |
00:19:19.000
to suddenly become social.
link |
00:19:21.720
At the same time,
link |
00:19:23.880
if most of the people building the AIs of the future
link |
00:19:26.480
are the kind of people who don't like people,
link |
00:19:29.400
we've got a little bit of a problem.
link |
00:19:31.040
Well, hold on a second.
link |
00:19:31.920
So let me push back on that.
link |
00:19:33.400
So don't you think a large percentage of the world
link |
00:19:38.640
can, you know, there's loneliness.
link |
00:19:40.880
There is a huge problem with loneliness that's growing.
link |
00:19:44.400
And so there's a longing for connection.
link |
00:19:47.560
Do you...
link |
00:19:49.080
If you're lonely, you're part of a big and growing group.
link |
00:19:51.400
Yes.
link |
00:19:52.240
So we're in it together, I guess.
link |
00:19:54.320
If you're lonely, join the group.
link |
00:19:56.120
You're not alone.
link |
00:19:56.960
You're not alone.
link |
00:19:57.960
That's a good line.
link |
00:20:00.160
But do you think there's...
link |
00:20:03.160
You talked about some worry,
link |
00:20:04.600
but do you think there's an exciting possibility
link |
00:20:07.600
that something like Alexa and these kinds of tools
link |
00:20:11.560
can alleviate that loneliness
link |
00:20:14.240
in a way that other humans can't?
link |
00:20:16.640
Yeah, yeah, definitely.
link |
00:20:18.920
I mean, a great book can kind of alleviate loneliness
link |
00:20:22.120
because you just get sucked into this amazing story
link |
00:20:25.000
and you can't wait to go spend time with that character.
link |
00:20:27.760
And they're not a human character.
link |
00:20:30.360
There is a human behind it.
link |
00:20:33.200
But yeah, it can be an incredibly delightful way
link |
00:20:35.400
to pass the hours and it can meet needs.
link |
00:20:39.480
Even, you know, I don't read those trashy romance books,
link |
00:20:43.440
but somebody does, right?
link |
00:20:44.760
And what are they getting from this?
link |
00:20:46.200
Well, probably some of that feeling of being there, right?
link |
00:20:50.720
Being there in that social moment,
link |
00:20:52.920
that romantic moment or connecting with somebody.
link |
00:20:56.240
I've had a similar experience
link |
00:20:57.560
reading some science fiction books, right?
link |
00:20:59.400
And connecting with the character.
link |
00:21:00.560
Orson Scott Card, you know, just amazing writing
link |
00:21:04.160
and Ender's Game and Speaker for the Dead, terrible title.
link |
00:21:07.560
But those kind of books that pull you into a character
link |
00:21:11.000
and you feel like you're, you feel very social.
link |
00:21:13.880
It's very connected, even though it's not responding to you.
link |
00:21:17.280
And a computer, of course, can respond to you.
link |
00:21:19.720
So it can deepen it, right?
link |
00:21:21.440
You can have a very deep connection,
link |
00:21:25.480
much more than the movie Her, you know, plays up, right?
link |
00:21:29.400
Well, much more.
link |
00:21:30.640
I mean, movie Her is already a pretty deep connection, right?
link |
00:21:34.760
Well, but it's just a movie, right?
link |
00:21:36.760
It's scripted.
link |
00:21:37.600
It's just, you know, but I mean,
link |
00:21:39.560
like there can be a real interaction
link |
00:21:42.680
where the character can learn and you can learn.
link |
00:21:46.600
You could imagine it not just being you and one character.
link |
00:21:49.560
You could imagine a group of characters.
link |
00:21:51.600
You can imagine a group of people and characters,
link |
00:21:53.600
human and AI connecting,
link |
00:21:56.440
where maybe a few people can't sort of be friends
link |
00:22:00.800
with everybody, but the few people
link |
00:22:02.880
and their AIs can befriend more people.
link |
00:22:07.000
There can be an extended human intelligence in there
link |
00:22:10.320
where each human can connect with more people that way.
link |
00:22:14.880
But it's still very limited, but there are just,
link |
00:22:19.480
what I mean is there are many more possibilities
link |
00:22:21.560
than what's in that movie.
link |
00:22:22.760
So there's a tension here.
link |
00:22:24.680
So one, you expressed a really serious concern
link |
00:22:27.360
about privacy, about how governments
link |
00:22:29.120
can misuse the information,
link |
00:22:31.120
and there's the possibility of this connection.
link |
00:22:34.080
So let's look at Alexa.
link |
00:22:36.200
So personal assistance.
link |
00:22:37.760
For the most part, as far as I'm aware,
link |
00:22:40.840
they ignore your emotion.
link |
00:22:42.840
They ignore even the context or the existence of you,
link |
00:22:47.400
the intricate, beautiful, complex aspects of who you are,
link |
00:22:52.200
except maybe aspects of your voice
link |
00:22:54.160
that help it recognize for speech recognition.
link |
00:22:58.360
Do you think they should move towards
link |
00:23:00.600
trying to understand your emotion?
link |
00:23:03.160
All of these companies are very interested
link |
00:23:04.960
in understanding human emotion.
link |
00:23:07.440
They want, more people are telling Siri every day
link |
00:23:11.400
they want to kill themselves.
link |
00:23:13.720
Apple wants to know the difference between
link |
00:23:15.640
if a person is really suicidal versus if a person
link |
00:23:18.480
is just kind of fooling around with Siri, right?
link |
00:23:21.400
The words may be the same, the tone of voice
link |
00:23:25.560
and what surrounds those words is pivotal to understand
link |
00:23:31.360
if they should respond in a very serious way,
link |
00:23:34.200
bring help to that person,
link |
00:23:35.920
or if they should kind of jokingly tease back,
link |
00:23:40.640
ah, you just want to sell me for something else, right?
link |
00:23:44.960
Like, how do you respond when somebody says that?
link |
00:23:47.920
Well, you do want to err on the side of being careful
link |
00:23:51.440
and taking it seriously.
link |
00:23:53.640
People want to know if the person is happy or stressed
link |
00:23:59.120
in part, well, so let me give you an altruistic reason
link |
00:24:03.160
and a business profit motivated reason.
link |
00:24:08.320
And there are people in companies that operate
link |
00:24:11.000
on both principles.
link |
00:24:12.720
The altruistic people really care about their customers
link |
00:24:16.920
and really care about helping you feel a little better
link |
00:24:19.320
at the end of the day.
link |
00:24:20.240
And it would just make those people happy
link |
00:24:22.680
if they knew that they made your life better.
link |
00:24:24.320
If you came home stressed and after talking
link |
00:24:27.000
with their product, you felt better.
link |
00:24:29.920
There are other people who maybe have studied
link |
00:24:32.960
the way affect affects decision making
link |
00:24:35.120
and prices people pay.
link |
00:24:36.440
And they know, I don't know if I should tell you,
link |
00:24:38.760
like the work of Jen Lerner on heartstrings and purse strings,
link |
00:24:43.960
you know, if we manipulate you into a slightly sadder mood,
link |
00:24:47.960
you'll pay more, right?
link |
00:24:50.800
You'll pay more to change your situation.
link |
00:24:53.800
You'll pay more for something you don't even need
link |
00:24:55.800
to make yourself feel better.
link |
00:24:58.040
So, you know, if they sound a little sad,
link |
00:25:00.120
maybe I don't want to cheer them up.
link |
00:25:01.240
Maybe first I want to help them get something,
link |
00:25:04.800
a little shopping therapy, right?
link |
00:25:07.400
That helps them.
link |
00:25:08.480
Which is really difficult for a company
link |
00:25:09.880
that's primarily funded on advertisement.
link |
00:25:12.120
So they're encouraged to offer you products,
link |
00:25:16.160
or Amazon that's primarily funded
link |
00:25:17.840
on you buying things from their store.
link |
00:25:20.040
So I think we should be, you know,
link |
00:25:22.120
maybe we need regulation in the future
link |
00:25:24.120
to put a little bit of a wall between these agents
link |
00:25:27.240
that have access to our emotion
link |
00:25:29.120
and agents that want to sell us stuff.
link |
00:25:32.280
Maybe there needs to be a little bit more
link |
00:25:35.560
of a firewall in between those.
link |
00:25:38.400
So maybe digging in a little bit
link |
00:25:40.480
on the interaction with Alexa,
link |
00:25:42.200
you mentioned, of course, a really serious concern
link |
00:25:44.880
about like recognizing emotion,
link |
00:25:46.680
if somebody is speaking of suicide or depression and so on,
link |
00:25:49.680
but what about the actual interaction itself?
link |
00:25:55.000
Do you think, so if I, you know,
link |
00:25:57.840
you mentioned Clippy and being annoying,
link |
00:26:01.480
what is the objective function we're trying to optimize?
link |
00:26:04.200
Is it to minimize annoyingness or to maximize happiness?
link |
00:26:09.480
Or if we look at human to human relations,
link |
00:26:12.440
I think that push and pull, the tension, the dance,
link |
00:26:15.480
you know, the annoying, the flaws, that's what makes it fun.
link |
00:26:19.840
So is there a room for, like what is the objective function?
link |
00:26:24.160
There are times when you want to have a little push and pull,
link |
00:26:26.720
I think of kids sparring, right?
link |
00:26:29.120
You know, I see my sons and they,
link |
00:26:31.200
one of them wants to provoke the other to be upset
link |
00:26:33.720
and that's fun.
link |
00:26:34.720
And it's actually healthy to learn where your limits are,
link |
00:26:38.520
to learn how to self regulate.
link |
00:26:40.080
You can imagine a game where it's trying to make you mad
link |
00:26:43.000
and you're trying to show self control.
link |
00:26:45.080
And so if we're doing a AI human interaction
link |
00:26:48.640
that's helping build resilience and self control,
link |
00:26:51.240
whether it's to learn how to not be a bully
link |
00:26:54.040
or how to turn the other cheek
link |
00:26:55.560
or how to deal with an abusive person in your life,
link |
00:26:58.920
then you might need an AI that pushes your buttons, right?
link |
00:27:04.480
But in general, do you want an AI that pushes your buttons?
link |
00:27:10.440
Probably depends on your personality.
link |
00:27:12.080
I don't, I want one that's respectful,
link |
00:27:15.320
that is there to serve me
link |
00:27:18.160
and that is there to extend my ability to do things.
link |
00:27:23.200
I'm not looking for a rival,
link |
00:27:25.160
I'm looking for a helper.
link |
00:27:27.240
And that's the kind of AI I'd put my money on.
link |
00:27:30.200
Your sense is for the majority of people in the world,
link |
00:27:33.680
in order to have a rich experience,
link |
00:27:35.120
that's what they're looking for as well.
link |
00:27:37.120
So they're not looking,
link |
00:27:37.960
if you look at the movie Her, spoiler alert,
link |
00:27:40.800
I believe the program, the woman in the movie Her,
link |
00:27:46.280
leaves the person for somebody else,
link |
00:27:51.320
says they don't wanna be dating anymore, right?
link |
00:27:54.360
Like, do you, your sense is if Alexa said,
link |
00:27:58.280
you know what, I'm actually had enough of you for a while,
link |
00:28:02.840
so I'm gonna shut myself off.
link |
00:28:04.880
You don't see that as...
link |
00:28:07.120
I'd say you're trash, cause I paid for you, right?
link |
00:28:10.120
You, we've got to remember,
link |
00:28:14.000
and this is where this blending human AI
link |
00:28:18.160
as if we're equals is really deceptive
link |
00:28:22.520
because AI is something at the end of the day
link |
00:28:26.400
that my students and I are making in the lab.
link |
00:28:28.840
And we're choosing what it's allowed to say,
link |
00:28:33.120
when it's allowed to speak, what it's allowed to listen to,
link |
00:28:36.600
what it's allowed to act on given the inputs
link |
00:28:40.760
that we choose to expose it to,
link |
00:28:43.400
what outputs it's allowed to have.
link |
00:28:45.880
It's all something made by a human.
link |
00:28:49.320
And if we wanna make something
link |
00:28:50.560
that makes our lives miserable, fine.
link |
00:28:52.920
I wouldn't invest in it as a business,
link |
00:28:56.640
unless it's just there for self regulation training.
link |
00:28:59.520
But I think we need to think about
link |
00:29:01.960
what kind of future we want.
link |
00:29:02.800
And actually your question, I really like the,
link |
00:29:05.560
what is the objective function?
link |
00:29:06.760
Is it to calm people down?
link |
00:29:09.320
Sometimes.
link |
00:29:10.560
Is it to always make people happy and calm them down?
link |
00:29:14.400
Well, there was a book about that, right?
link |
00:29:16.080
Brave New World: make everybody happy,
link |
00:29:18.840
take your Soma if you're unhappy, take your happy pill.
link |
00:29:22.520
And if you refuse to take your happy pill,
link |
00:29:24.360
well, we'll threaten you by sending you to Iceland
link |
00:29:28.600
to live there.
link |
00:29:29.600
I lived in Iceland three years.
link |
00:29:30.840
It's a great place.
link |
00:29:31.800
Don't take your Soma, then go to Iceland.
link |
00:29:35.240
A little TV commercial there.
link |
00:29:37.520
Now I was a child there for a few years.
link |
00:29:39.240
It's a wonderful place.
link |
00:29:40.600
So that part of the book never scared me.
link |
00:29:43.240
But really like, do we want AI to manipulate us
link |
00:29:46.720
into submission, into making us happy?
link |
00:29:49.080
Well, if you are a, you know,
link |
00:29:52.640
like a power obsessed sick dictator individual
link |
00:29:56.080
who only wants to control other people
link |
00:29:57.640
to get your jollies in life, then yeah,
link |
00:29:59.640
you wanna use AI to extend your power and your scale
link |
00:30:03.720
to force people into submission.
link |
00:30:07.080
If you believe that the human race is better off
link |
00:30:10.080
being given freedom and the opportunity
link |
00:30:12.120
to do things that might surprise you,
link |
00:30:15.360
then you wanna use AI to extend people's ability to build,
link |
00:30:20.200
you wanna build AI that extends human intelligence,
link |
00:30:22.960
that empowers the weak and helps balance the power
link |
00:30:27.320
between the weak and the strong,
link |
00:30:28.840
not that gives more power to the strong.
link |
00:30:32.440
So in this process of empowering people and sensing people,
link |
00:30:39.280
what is your sense on emotion
link |
00:30:41.280
in terms of recognizing emotion?
link |
00:30:42.680
The difference between emotion that is shown
link |
00:30:44.680
and emotion that is felt.
link |
00:30:46.640
So yeah, emotion that is expressed on the surface
link |
00:30:52.640
through your face, your body, and various other things,
link |
00:30:56.560
and what's actually going on deep inside
link |
00:30:58.840
on the biological level, on the neuroscience level,
link |
00:31:01.760
or some kind of cognitive level.
link |
00:31:03.680
Yeah, yeah.
link |
00:31:05.720
Whoa, no easy questions here.
link |
00:31:07.840
Well, yeah, I'm sure there's no definitive answer,
link |
00:31:11.280
but what's your sense?
link |
00:31:12.360
How far can we get by just looking at the face?
link |
00:31:16.160
We're very limited when we just look at the face,
link |
00:31:18.480
but we can get further than most people think we can get.
link |
00:31:21.920
People think, hey, I have a great poker face,
link |
00:31:25.920
therefore all you're ever gonna get from me is neutral.
link |
00:31:28.280
Well, that's naive.
link |
00:31:30.160
We can read with the ordinary camera
link |
00:31:32.680
on your laptop or on your phone.
link |
00:31:34.920
We can read from a neutral face if your heart is racing.
link |
00:31:39.320
We can read from a neutral face
link |
00:31:41.280
if your breathing is becoming irregular
link |
00:31:44.760
and showing signs of stress.
link |
00:31:46.920
We can read under some conditions
link |
00:31:50.720
that maybe I won't give you details on,
link |
00:31:53.200
how your heart rate variability power is changing.
link |
00:31:57.080
That could be a sign of stress,
link |
00:31:58.640
even when your heart rate is not necessarily accelerating.
link |
00:32:02.920
So...
link |
00:32:03.760
Sorry, from physio sensors or from the face?
link |
00:32:06.080
From the color changes that you cannot even see,
link |
00:32:09.160
but the camera can see.
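
As an illustration of the kind of signal processing being referred to here, below is a minimal remote-photoplethysmography sketch: average the green channel over a detected face region, band-pass to plausible heart rates, and take the spectral peak. It assumes OpenCV, NumPy, and SciPy are available, uses a hypothetical file name, and is not the actual method used in her lab or at Affectiva.

```python
# Minimal rPPG sketch: estimate pulse rate from tiny color changes in face video.
# Illustrative only, not the method described in the conversation.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt

def mean_green_per_frame(video_path):
    """Average the green channel over a detected face region, frame by frame."""
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
    samples = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        if len(faces) == 0:
            continue
        x, y, w, h = faces[0]
        roi = frame[y:y + h, x:x + w]
        samples.append(roi[:, :, 1].mean())  # green channel carries most pulse signal
    cap.release()
    return np.array(samples), fps

def estimate_bpm(signal, fps):
    """Band-pass to plausible heart rates (0.7-4 Hz) and take the FFT peak."""
    signal = signal - signal.mean()
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(filtered))
    return freqs[np.argmax(spectrum)] * 60.0

# Usage (hypothetical file name):
# green, fps = mean_green_per_frame("face_clip.mp4")
# print(f"Estimated pulse: {estimate_bpm(green, fps):.0f} BPM")
```
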
link |
00:32:11.680
That's amazing.
link |
00:32:12.520
So you can get a lot of signal, but...
link |
00:32:15.320
So we get things people can't see using a regular camera.
link |
00:32:18.680
And from that, we can tell things about your stress.
link |
00:32:21.920
So if you were just sitting there with a blank face
link |
00:32:25.600
thinking nobody can read my emotion, well, you're wrong.
link |
00:32:30.120
Right, so that's really interesting,
link |
00:32:31.840
but that's from sort of visual information from the face.
link |
00:32:34.520
That's almost like cheating your way
link |
00:32:37.120
to the physiological state of the body,
link |
00:32:39.120
by being very clever with what you can do with vision.
link |
00:32:42.600
With signal processing.
link |
00:32:43.440
With signal processing.
link |
00:32:44.360
So that's really impressive.
link |
00:32:45.320
But if you just look at the stuff we humans can see,
link |
00:32:49.320
the poker, the smile, the smirks,
link |
00:32:52.240
the subtle, all the facial actions.
link |
00:32:54.320
So then you can hide that on your face
link |
00:32:55.960
for a limited amount of time.
link |
00:32:57.240
Now, if you're just going in for a brief interview
link |
00:33:00.600
and you're hiding it, that's pretty easy for most people.
link |
00:33:03.880
If you are, however, surveilled constantly everywhere you go,
link |
00:33:08.800
then it's gonna say, gee, you know, Lex used to smile a lot
link |
00:33:13.280
and now I'm not seeing so many smiles.
link |
00:33:15.920
And Roz used to laugh a lot
link |
00:33:20.240
and smile a lot very spontaneously.
link |
00:33:22.280
And now I'm only seeing
link |
00:33:23.480
these not so spontaneous looking smiles.
link |
00:33:26.440
And only when she's asked these questions.
link |
00:33:28.920
You know, something's changed here.
link |
00:33:31.720
Probably not getting enough sleep.
link |
00:33:33.720
We could look at that too.
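
As a toy illustration of that kind of longitudinal monitoring, one could compare a recent window of daily smile counts against a person's own baseline and flag a large drop. The detector that would produce those counts, the numbers, and the threshold below are all hypothetical, not anything described in the conversation.

```python
# Toy sketch: flag when someone's recent smile rate drops well below their baseline.
# Daily smile counts are assumed to come from some upstream expression detector;
# the numbers below are made up for illustration.
import numpy as np

def smile_rate_shift(daily_smiles, baseline_days=30, recent_days=7, z_thresh=2.0):
    """Return (z_score, changed) comparing the last `recent_days` to the prior baseline."""
    daily_smiles = np.asarray(daily_smiles, dtype=float)
    baseline = daily_smiles[-(baseline_days + recent_days):-recent_days]
    recent = daily_smiles[-recent_days:]
    mu, sigma = baseline.mean(), baseline.std(ddof=1) + 1e-9
    z = (recent.mean() - mu) / sigma
    return z, z < -z_thresh  # large negative z: far fewer smiles than usual

# Hypothetical example: a month of ~20 smiles/day, then a week closer to 5/day.
history = list(np.random.default_rng(0).poisson(20, 30)) + list(
    np.random.default_rng(1).poisson(5, 7))
z, changed = smile_rate_shift(history)
print(f"z = {z:.1f}, flagged change: {changed}")
```
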
link |
00:33:35.200
So now I have to be a little careful too.
link |
00:33:37.000
When I say we, you think we can't read your emotion
link |
00:33:40.920
and we can, it's not that binary.
link |
00:33:42.760
What we're reading is more some physiological changes
link |
00:33:45.960
that relate to your activation.
link |
00:33:48.760
Now, that doesn't mean that we know everything
link |
00:33:51.800
about how you feel.
link |
00:33:52.640
In fact, we still know very little about how you feel.
link |
00:33:54.880
Your thoughts are still private.
link |
00:33:56.880
Your nuanced feelings are still completely private.
link |
00:34:01.120
We can't read any of that.
link |
00:34:02.920
So there's some relief that we can't read that.
link |
00:34:07.000
Even brain imaging can't read that.
link |
00:34:09.800
Wearables can't read that.
link |
00:34:12.280
However, as we read your body state changes
link |
00:34:16.000
and we know what's going on in your environment
link |
00:34:18.520
and we look at patterns of those over time,
link |
00:34:21.480
we can start to make some inferences
link |
00:34:24.960
about what you might be feeling.
link |
00:34:26.960
And that is where it's not just the momentary feeling
link |
00:34:31.400
but it's more your stance toward things.
link |
00:34:34.120
And that could actually be a little bit more scary
link |
00:34:37.040
with certain kinds of governmental control freak people
link |
00:34:42.840
who want to know more about are you on their team
link |
00:34:46.800
or are you not?
link |
00:34:48.320
And getting that information through over time.
link |
00:34:50.320
So you're saying there's a lot of signal
link |
00:34:51.640
by looking at the change over time.
link |
00:34:53.680
Yeah.
link |
00:34:54.520
So you've done a lot of exciting work
link |
00:34:56.600
both in computer vision
link |
00:34:57.800
and physiological sense like wearables.
link |
00:35:00.560
What do you think is the best modality for,
link |
00:35:03.480
what's the best window into the emotional soul?
link |
00:35:08.360
Is it the face?
link |
00:35:09.200
Is it the voice?
link |
00:35:10.160
Depends what you want to know.
link |
00:35:11.920
It depends what you want to know.
link |
00:35:13.120
It depends what you want to know.
link |
00:35:13.960
Everything is informative.
link |
00:35:15.600
Everything we do is informative.
link |
00:35:17.440
So for health and wellbeing and things like that,
link |
00:35:20.160
do you find the wearable,
link |
00:35:22.680
measuring physiological signals
link |
00:35:24.840
is the best for health based stuff?
link |
00:35:29.320
So here I'm going to answer empirically
link |
00:35:31.880
with data and studies we've been doing.
link |
00:35:34.680
We've been doing studies.
link |
00:35:36.040
Now these are currently running
link |
00:35:38.280
with lots of different kinds of people
link |
00:35:39.600
but where we've published data
link |
00:35:41.880
and I can speak publicly to it,
link |
00:35:44.080
the data are limited right now
link |
00:35:45.520
to New England college students.
link |
00:35:47.680
So that's a small group.
link |
00:35:50.320
Among New England college students,
link |
00:35:52.440
when they are wearing a wearable
link |
00:35:55.880
like the Empatica Embrace here
link |
00:35:57.640
that's measuring skin conductance, movement, temperature.
link |
00:36:01.760
And when they are using a smartphone
link |
00:36:05.800
that is collecting their time of day
link |
00:36:09.400
of when they're texting, who they're texting,
link |
00:36:12.120
their movement around it, their GPS,
link |
00:36:14.200
the weather information based upon their location.
link |
00:36:18.040
And when it's using machine learning
link |
00:36:19.360
and putting all of that together
link |
00:36:20.920
and looking not just at right now
link |
00:36:22.920
but looking at your rhythm of behaviors
link |
00:36:26.960
over about a week.
link |
00:36:28.560
When we look at that,
link |
00:36:29.920
we are very accurate at forecasting tomorrow's stress,
link |
00:36:33.560
mood, happy versus sad, and health.
link |
00:36:38.560
And when we look at which pieces of that are most useful,
link |
00:36:43.680
first of all, if you have all the pieces,
link |
00:36:45.560
you get the best results.
link |
00:36:48.320
If you have only the wearable,
link |
00:36:50.600
you get the next best results.
link |
00:36:52.680
And that's still better than 80% accurate
link |
00:36:56.520
at forecasting tomorrow's levels.
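
Below is a minimal sketch of that forecasting setup: stack a week of daily wearable and phone features and train a classifier to predict the next day's mood label. The feature names, model, and data are synthetic stand-ins chosen for illustration, not the study's actual variables, and the accuracy she cites is from their published work, not from this toy.

```python
# Sketch of the forecasting setup described above: a week of daily wearable/phone
# features -> tomorrow's mood label. Assumes NumPy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
N_DAYS, WINDOW = 400, 7
FEATURES = ["skin_conductance", "skin_temp", "movement",
            "screen_time", "texts_sent", "mobility_km"]  # illustrative names only

# Fake daily feature matrix; tomorrow's (synthetic) mood depends on today's
# features plus noise, so the last day of each window carries predictive signal.
daily = rng.normal(size=(N_DAYS, len(FEATURES)))
drive = daily[:, 0] + 0.5 * daily[:, 3]
label = np.r_[0, (drive[:-1] + rng.normal(0, 0.5, N_DAYS - 1) > 0).astype(int)]

# Stack each 7-day window of features to predict the label on the following day.
X = np.array([daily[i:i + WINDOW].ravel() for i in range(N_DAYS - WINDOW)])
y = label[WINDOW:]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, shuffle=False)  # keep time order: train on past, test on future
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("toy next-day accuracy:", accuracy_score(y_test, model.predict(X_test)))
```
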
link |
00:37:00.080
Isn't that exciting because the wearable stuff
link |
00:37:02.800
with physiological information,
link |
00:37:05.320
it feels like it violates privacy less
link |
00:37:08.000
than the noncontact face based methods.
link |
00:37:12.720
Yeah, it's interesting.
link |
00:37:14.040
I think what people sometimes don't,
link |
00:37:16.560
it's funny in the early days people would say,
link |
00:37:18.880
oh, wearing something or giving blood is invasive, right?
link |
00:37:22.560
Whereas a camera is less invasive
link |
00:37:24.400
because it's not touching you.
link |
00:37:26.920
I think on the contrary,
link |
00:37:28.320
the things that are not touching you are maybe the scariest
link |
00:37:31.200
because you don't know when they're on or off.
link |
00:37:33.880
And you don't know who's behind it, right?
link |
00:37:39.920
A wearable, depending upon what's happening
link |
00:37:43.480
to the data on it, if it's just stored locally
link |
00:37:46.760
or if it's streaming and what it is being attached to,
link |
00:37:52.640
in a sense, you have the most control over it
link |
00:37:54.960
because it's also very easy to just take it off, right?
link |
00:37:59.400
Now it's not sensing me.
link |
00:38:01.440
So if I'm uncomfortable with what it's sensing,
link |
00:38:05.320
now I'm free, right?
link |
00:38:07.240
If I'm comfortable with what it's sensing,
link |
00:38:09.960
then, and I happen to know everything about this one
link |
00:38:12.800
and what it's doing with it,
link |
00:38:13.720
so I'm quite comfortable with it,
link |
00:38:15.600
then I have control, I'm comfortable.
link |
00:38:20.240
Control is one of the biggest factors for an individual
link |
00:38:24.720
in reducing their stress.
link |
00:38:26.600
If I have control over it,
link |
00:38:28.160
if I know all there is to know about it,
link |
00:38:30.200
then my stress is a lot lower
link |
00:38:32.480
and I'm making an informed choice
link |
00:38:34.960
about whether to wear it or not,
link |
00:38:36.640
or when to wear it or not.
link |
00:38:38.040
I wanna wear it sometimes, maybe not others.
link |
00:38:40.320
Right, so that control, yeah, I'm with you.
link |
00:38:42.760
That control, even if, yeah, the ability to turn it off,
link |
00:38:47.520
that is a really important thing.
link |
00:38:49.000
It's huge.
link |
00:38:49.840
And we need to, maybe, if there's regulations,
link |
00:38:53.440
maybe that's number one to protect
link |
00:38:55.080
is people's ability to opt out as easily as they opt in.
link |
00:38:59.960
Right, so you've studied a bit of neuroscience as well.
link |
00:39:04.480
How has looking at our own minds,
link |
00:39:08.240
sort of the biological stuff or the neurobiological,
link |
00:39:12.840
the neuroscience to get the signals in our brain,
link |
00:39:17.400
helped you understand the problem
link |
00:39:18.760
and the approach of affective computing, so?
link |
00:39:21.880
Originally, I was a computer architect
link |
00:39:23.720
and I was building hardware and computer designs
link |
00:39:26.320
and I wanted to build ones that worked like the brain.
link |
00:39:28.240
So I've been studying the brain
link |
00:39:29.880
as long as I've been studying how to build computers.
link |
00:39:33.920
Have you figured out anything yet?
link |
00:39:36.160
Very little.
link |
00:39:37.000
It's so amazing.
link |
00:39:39.400
You know, they used to think like,
link |
00:39:40.720
oh, if you remove this chunk of the brain
link |
00:39:42.560
and you find this function goes away,
link |
00:39:44.040
well, that's the part of the brain that did it.
link |
00:39:45.760
And then later they realized
link |
00:39:46.880
if you remove this other chunk of the brain,
link |
00:39:48.320
that function comes back and,
link |
00:39:50.120
oh no, we really don't understand it.
link |
00:39:52.800
Brains are so interesting and changing all the time
link |
00:39:56.200
and able to change in ways
link |
00:39:58.080
that will probably continue to surprise us.
link |
00:40:02.120
When we were measuring stress,
link |
00:40:04.520
you may know the story where we found
link |
00:40:07.160
an unusually big skin conductance pattern on one wrist
link |
00:40:10.680
in one of our kids with autism.
link |
00:40:14.120
And in trying to figure out how on earth
link |
00:40:15.480
you could be stressed on one wrist and not the other,
link |
00:40:17.760
like how can you get sweaty on one wrist, right?
link |
00:40:20.160
When you get stressed
link |
00:40:21.520
with that sympathetic fight or flight response,
link |
00:40:23.280
like you kind of should like sweat more
link |
00:40:25.120
in some places than others,
link |
00:40:26.240
but not more on one wrist than the other.
link |
00:40:27.920
That didn't make any sense.
link |
00:40:30.840
We learned that what had actually happened
link |
00:40:33.120
was a part of his brain had unusual electrical activity
link |
00:40:37.080
and that caused an unusually large sweat response
link |
00:40:41.240
on one wrist and not the other.
link |
00:40:44.040
And since then we've learned
link |
00:40:45.480
that seizures cause this unusual electrical activity.
link |
00:40:49.360
And depending where the seizure is,
link |
00:40:51.240
if it's in one place and it's staying there,
link |
00:40:53.360
you can have a big electrical response
link |
00:40:55.520
we can pick up with a wearable at one part of the body.
link |
00:40:58.480
You can also have a seizure
link |
00:40:59.400
that spreads over the whole brain,
link |
00:41:00.480
generalized grand mal seizure.
link |
00:41:02.400
And that response spreads
link |
00:41:04.040
and we can pick it up pretty much anywhere.
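
To make the wearable side concrete, here is a toy sketch of flagging an unusually large, fast rise in a wrist skin conductance (EDA) trace. It is only an illustration under assumed units and made-up thresholds, not the Embrace's actual, FDA-cleared detection algorithm.

```python
# Toy electrodermal-activity (EDA) event detector: flag unusually large, fast rises
# in wrist skin conductance. Thresholds and the example signal are invented.
import numpy as np

def large_scr_events(eda_microsiemens, fs_hz=4.0, window_s=10.0, rise_uS=1.0):
    """Return sample indices where conductance rises by more than `rise_uS`
    within any `window_s`-second window (a crude surrogate for a big response)."""
    eda = np.asarray(eda_microsiemens, dtype=float)
    w = int(window_s * fs_hz)
    events = []
    for start in range(0, len(eda) - w):
        rise = eda[start:start + w].max() - eda[start]
        if rise > rise_uS:
            events.append(start)
    return events

# Hypothetical signal: slow baseline drift plus one abrupt ~2 uS rise around t=150 s.
fs = 4.0
t = np.arange(0, 300, 1 / fs)
eda = 0.5 + 0.001 * t + 2.0 / (1 + np.exp(-(t - 150)))  # sigmoid step of ~2 uS
print("first flagged sample:", large_scr_events(eda, fs)[:1])
```
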
link |
00:41:07.160
As we learned this and then later built Embrace
link |
00:41:10.200
that's now FDA cleared for seizure detection,
link |
00:41:13.120
we have also built relationships
link |
00:41:15.760
with some of the most amazing doctors in the world
link |
00:41:18.480
who not only help people
link |
00:41:20.400
with unusual brain activity or epilepsy,
link |
00:41:23.040
but some of them are also surgeons
link |
00:41:24.440
and they're going in and they're implanting electrodes,
link |
00:41:27.160
not just to momentarily read the strange patterns
link |
00:41:31.200
of brain activity that we'd like to see return to normal,
link |
00:41:35.080
but also to read out continuously what's happening
link |
00:41:37.320
in some of these deep regions of the brain
link |
00:41:39.120
during most of life when these patients are not seizing.
link |
00:41:41.640
Most of the time they're not seizing,
link |
00:41:42.960
most of the time they're fine.
link |
00:41:44.960
And so we are now working on mapping
link |
00:41:47.920
those deep brain regions
link |
00:41:49.960
that you can't even usually get with EEG scalp electrodes
link |
00:41:53.680
because the changes deep inside don't reach the surface.
link |
00:41:58.640
But interesting when some of those regions
link |
00:42:00.480
are activated, we see a big skin conductance response.
link |
00:42:04.280
Who would have thunk it, right?
link |
00:42:05.800
Like nothing here, but something here.
link |
00:42:07.840
In fact, right after seizures
link |
00:42:10.400
that we think are the most dangerous ones
link |
00:42:12.600
that precede what's called SUDEP,
link |
00:42:14.120
Sudden Unexpected Death in Epilepsy,
link |
00:42:16.960
there's a period where the brainwaves go flat
link |
00:42:19.520
and it looks like the person's brain has stopped,
link |
00:42:21.640
but it hasn't.
link |
00:42:23.120
The activity has gone deep into a region
link |
00:42:26.400
that can make the cortical activity look flat,
link |
00:42:29.120
like a quick shutdown signal here.
link |
00:42:32.600
It can unfortunately cause breathing to stop
link |
00:42:35.560
if it progresses long enough.
link |
00:42:38.360
Before that happens, we see a big skin conductance response
link |
00:42:42.080
in the data that we have.
link |
00:42:43.760
The longer this flattening, the bigger our response here.
link |
00:42:46.880
So we have been trying to learn, you know, initially,
link |
00:42:49.480
like why are we getting a big response here
link |
00:42:51.720
when there's nothing here?
link |
00:42:52.560
Well, it turns out there's something much deeper.
link |
00:42:55.600
So we can now go inside the brains
link |
00:42:57.760
of some of these individuals, fabulous people
link |
00:43:01.280
who usually aren't seizing,
link |
00:43:03.480
and get this data and start to map it.
link |
00:43:05.600
So that's the active research that we're doing right now
link |
00:43:07.600
with top medical partners.
link |
00:43:09.360
So this wearable sensor that's looking at skin conductance
link |
00:43:12.960
can capture sort of the ripples of the complexity
link |
00:43:17.160
of what's going on in our brain.
link |
00:43:18.880
So this little device, you have a hope
link |
00:43:22.240
that you can start to get the signal
link |
00:43:24.920
from the interesting things happening in the brain.
link |
00:43:27.880
Yeah, we've already published the strong correlations
link |
00:43:30.600
between the size of this response
link |
00:43:32.200
and the flattening that happens afterwards.
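As a rough illustration of the kind of correlation analysis being described, one could imagine something like the sketch below. The numbers are entirely made up, and this does not reproduce the published study's data or methods.

```python
# Made-up numbers, purely to illustrate correlating the size of the wrist
# skin conductance response (SCR) with the duration of post-seizure EEG
# flattening. This is NOT the published data or analysis pipeline.
import numpy as np
from scipy import stats

scr_amplitude_us = np.array([0.8, 1.5, 2.2, 3.0, 4.1, 5.3])   # hypothetical SCR sizes (microsiemens)
flattening_duration_s = np.array([5, 12, 20, 28, 41, 55])     # hypothetical EEG suppression durations (s)

r, p = stats.pearsonr(scr_amplitude_us, flattening_duration_s)
print(f"Pearson r = {r:.2f}, p-value = {p:.4f}")
```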
link |
00:43:35.360
And unfortunately, also in a real SUDEP case
link |
00:43:38.320
where the patient died because, well, we don't know why.
link |
00:43:42.360
We don't know whether, if somebody had been there,
link |
00:43:43.600
it would have definitely prevented it.
link |
00:43:45.280
But we know that most SUDEPs happen
link |
00:43:47.040
when the person's alone.
link |
00:43:48.600
And in this case, a SUDEP is an acronym, S U D E P.
link |
00:43:53.600
And it's the number two cause
link |
00:43:56.680
of years of life lost actually
link |
00:43:58.960
among all neurological disorders.
link |
00:44:01.040
Stroke is number one, SUDEP is number two,
link |
00:44:03.480
but most people haven't heard of it.
link |
00:44:05.640
Actually, I'll plug my TED talk,
link |
00:44:07.240
it's on the front page of TED right now
link |
00:44:09.280
that talks about this.
link |
00:44:11.160
And we hope to change that.
link |
00:44:13.560
I hope everybody who's heard of SIDS and stroke
link |
00:44:17.160
will now hear of SUDEP
link |
00:44:18.760
because we think in most cases it's preventable
link |
00:44:21.280
if people take their meds and aren't alone
link |
00:44:24.720
when they have a seizure.
link |
00:44:26.200
Not guaranteed to be preventable.
link |
00:44:27.800
There are some exceptions,
link |
00:44:29.680
but we think most cases probably are.
link |
00:44:31.560
So you have this Embrace now, in the version two wristband,
link |
00:44:35.000
right, for epilepsy management.
link |
00:44:39.280
That's the one that's FDA approved?
link |
00:44:41.440
Yes.
link |
00:44:42.640
Which is kind of a... cleared?
link |
00:44:43.480
FDA cleared, they say.
link |
00:44:45.200
Sorry.
link |
00:44:46.040
No, it's okay.
link |
00:44:46.880
It essentially means it's approved for marketing.
link |
00:44:49.400
Got it.
link |
00:44:50.240
Just a side note, how difficult is that to do?
link |
00:44:52.960
It's essentially getting FDA approval
link |
00:44:54.880
for computer science technology.
link |
00:44:57.000
It's so agonizing.
link |
00:44:58.000
It's much harder than publishing multiple papers
link |
00:45:01.920
in top medical journals.
link |
00:45:04.120
Yeah, we've published peer reviewed, best results,
link |
00:45:05.920
in the top medical journal Neurology,
link |
00:45:08.840
and that's not good enough for the FDA.
link |
00:45:10.800
Is that system,
link |
00:45:12.520
so if we look at the peer review of medical journals,
link |
00:45:14.880
there's flaws, there's strengths,
link |
00:45:16.800
is the FDA approval process,
link |
00:45:19.560
how does it compare to the peer review process?
link |
00:45:21.560
Does it have the strength?
link |
00:45:23.160
I'll take peer review over FDA any day.
link |
00:45:25.720
But is that a good thing?
link |
00:45:26.600
Is that a good thing for FDA?
link |
00:45:28.040
You're saying, does it stop some amazing technology
link |
00:45:31.160
from getting through?
link |
00:45:32.320
Yeah, it does.
link |
00:45:33.400
The FDA performs a very important, good role
link |
00:45:36.240
in keeping people safe.
link |
00:45:37.600
They keep things,
link |
00:45:39.480
they put you through tons of safety testing
link |
00:45:41.800
and that's wonderful and that's great.
link |
00:45:44.240
I'm all in favor of the safety testing.
link |
00:45:46.240
But sometimes they put you through additional testing
link |
00:45:51.000
that they don't have to explain why they put you through it
link |
00:45:54.080
and you don't understand why you're going through it
link |
00:45:56.680
and it doesn't make sense.
link |
00:45:58.440
And that's very frustrating.
link |
00:46:00.680
And maybe they have really good reasons
link |
00:46:04.400
and they just...
link |
00:46:05.480
it would do people a service to articulate those reasons.
link |
00:46:09.720
Be more transparent.
link |
00:46:10.640
Be more transparent.
link |
00:46:12.120
So as part of Empatica, you have sensors.
link |
00:46:15.760
So what kind of problems can we crack?
link |
00:46:17.800
What kind of things from seizures to autism
link |
00:46:24.480
to, I think I've heard you mention, depression.
link |
00:46:28.000
What kind of things can we alleviate?
link |
00:46:29.760
Can we detect?
link |
00:46:30.720
What's your hope of
link |
00:46:32.240
how we can make the world a better place
link |
00:46:33.960
with this wearable tech?
link |
00:46:35.760
I would really like to see my fellow brilliant researchers
link |
00:46:40.760
step back and say, what are the really hard problems
link |
00:46:46.200
that we don't know how to solve
link |
00:46:47.760
that come from people maybe we don't even see
link |
00:46:50.440
in our normal life because they're living
link |
00:46:52.800
in the poor places.
link |
00:46:54.240
They're stuck on the bus.
link |
00:46:56.160
They can't even afford the Uber or the Lyft
link |
00:46:58.440
or the data plan or all these other wonderful things
link |
00:47:02.200
we have that we keep improving on.
link |
00:47:04.360
Meanwhile, there's all these folks left behind in the world
link |
00:47:07.080
and they're struggling with horrible diseases
link |
00:47:09.520
with depression, with epilepsy, with diabetes,
link |
00:47:12.960
with just awful stuff that maybe a little more time
link |
00:47:19.200
and attention hanging out with them
link |
00:47:20.440
and learning what are their challenges in life?
link |
00:47:22.800
What are their needs?
link |
00:47:24.000
How do we help them have job skills?
link |
00:47:25.640
How do we help them have a hope and a future
link |
00:47:28.520
and a chance to have the great life
link |
00:47:31.240
that so many of us building technology have?
link |
00:47:34.960
And then how would that reshape the kinds of AI
link |
00:47:37.920
that we build? How would that reshape the new apps
link |
00:47:41.400
that we build? Or maybe we need to focus
link |
00:47:44.040
on how to make things more low cost and green
link |
00:47:46.880
instead of thousand dollar phones?
link |
00:47:49.320
I mean, come on, why can't we be thinking more
link |
00:47:52.840
about things that do more with less for these folks?
link |
00:47:56.840
Quality of life is not related to the cost of your phone.
link |
00:48:00.200
It's been shown that beyond about
link |
00:48:03.960
$75,000 of income, happiness is the same, okay?
link |
00:48:08.440
However, I can tell you, you get a lot of happiness
link |
00:48:10.720
from helping other people.
link |
00:48:12.320
You get a lot more than $75,000 buys.
link |
00:48:15.200
So how do we connect up the people who have real needs
link |
00:48:19.320
with the people who have the ability to build the future
link |
00:48:21.920
and build the kind of future that truly improves the lives
link |
00:48:25.840
of all the people that are currently being left behind?
link |
00:48:28.720
So let me return just briefly to a point,
link |
00:48:32.720
maybe in the movie, Her.
link |
00:48:35.280
So do you think if we look farther into the future,
link |
00:48:37.720
you said so much of the benefit from making our technology
link |
00:48:41.240
more empathetic to us human beings would make them
link |
00:48:46.240
better tools, empower us, make our lives better.
link |
00:48:50.320
Well, if we look farther into the future,
link |
00:48:51.960
do you think we'll ever create an AI system
link |
00:48:54.560
that we can fall in love with?
link |
00:48:56.920
That we can fall in love with and loves us back
link |
00:49:00.280
on a level that is similar to human to human interaction,
link |
00:49:04.920
like in the movie Her or beyond?
link |
00:49:07.680
I think we can simulate it in ways that could,
link |
00:49:13.920
you know, sustain engagement for a while.
link |
00:49:17.280
Would it be as good as another person?
link |
00:49:20.160
I don't think so, if you're used to like good people.
link |
00:49:24.560
Now, if you've just grown up with nothing but abuse
link |
00:49:27.120
and you can't stand human beings,
link |
00:49:29.080
can we do something that helps you there
link |
00:49:32.160
that gives you something through a machine?
link |
00:49:34.000
Yeah, but that's pretty low bar, right?
link |
00:49:36.160
If you've only encountered pretty awful people.
link |
00:49:39.120
If you've encountered wonderful, amazing people,
link |
00:49:41.680
we're nowhere near building anything like that.
link |
00:49:44.800
And I would not bet on building it.
link |
00:49:49.480
I would bet instead on building the kinds of AI
link |
00:49:53.160
that helps kind of raise all boats,
link |
00:49:56.880
that helps all people be better people,
link |
00:49:59.480
helps all people figure out if they're getting sick tomorrow
link |
00:50:02.280
and helps give them what they need to stay well tomorrow.
link |
00:50:05.400
That's the kind of AI I wanna build
link |
00:50:07.000
that improves human lives,
link |
00:50:09.040
not the kind of AI that just walks on The Tonight Show
link |
00:50:11.640
and people go, wow, look how smart that is.
link |
00:50:14.600
Really?
link |
00:50:15.800
And then it goes back in a box, you know?
link |
00:50:18.640
So on that point,
link |
00:50:19.960
if we continue looking a little bit into the future,
link |
00:50:23.440
do you think an AI that's empathetic
link |
00:50:25.200
and does improve our lives
link |
00:50:28.960
needs to have a physical presence, a body?
link |
00:50:31.840
And even let me cautiously say the C word consciousness
link |
00:50:38.720
and even fear of mortality.
link |
00:50:40.760
So some of those human characteristics,
link |
00:50:42.760
do you think it needs to have those aspects
link |
00:50:45.920
or can it remain simply a machine learning tool
link |
00:50:50.880
that learns from data of behavior
link |
00:50:53.400
that learns to make us,
link |
00:50:56.640
based on previous patterns, feel better?
link |
00:51:00.080
Or does it need those elements of consciousness?
link |
00:51:02.560
It depends on your goals.
link |
00:51:03.760
If you're making a movie, it needs a body.
link |
00:51:06.720
It needs a gorgeous body.
link |
00:51:08.000
It needs to act like it has consciousness.
link |
00:51:10.040
It needs to act like it has emotion, right?
link |
00:51:11.680
Because that's what sells.
link |
00:51:13.360
That's what's gonna get me to show up and enjoy the movie.
link |
00:51:16.280
Okay.
link |
00:51:17.800
In real life, does it need all that?
link |
00:51:19.800
Well, if you've read Orson Scott Card,
link |
00:51:21.840
Ender's Game, Speaker for the Dead,
link |
00:51:23.520
it could just be like a little voice in your earring, right?
link |
00:51:26.720
And you could have an intimate relationship
link |
00:51:28.360
and it could get to know you.
link |
00:51:29.520
And it doesn't need to be a robot.
link |
00:51:34.160
But that doesn't make this compelling of a movie, right?
link |
00:51:37.160
I mean, we already think it's kind of weird
link |
00:51:38.440
when a guy looks like he's talking to himself on the train,
link |
00:51:41.600
even though it's earbuds.
link |
00:51:43.760
So we have these... Embodied is more powerful.
link |
00:51:49.680
Embodied, when you compare interactions
link |
00:51:51.760
with an embodied robot versus a video of a robot
link |
00:51:55.280
versus no robot, the robot is more engaging.
link |
00:52:00.160
The robot gets our attention more.
link |
00:52:01.720
The robot, when you walk in your house,
link |
00:52:03.080
is more likely to get you to remember to do the things
link |
00:52:05.440
that you asked it to do,
link |
00:52:06.480
because it's kind of got a physical presence.
link |
00:52:09.000
You can avoid it if you don't like it.
link |
00:52:10.840
It could see you're avoiding it.
link |
00:52:12.440
There's a lot of power to being embodied.
link |
00:52:14.760
There will be embodied AIs.
link |
00:52:17.160
They have great power and opportunity and potential.
link |
00:52:22.040
There will also be AIs that aren't embodied,
link |
00:52:24.600
that just are little software assistants
link |
00:52:28.520
that help us with different things
link |
00:52:30.240
that may get to know things about us.
link |
00:52:33.480
Will they be conscious?
link |
00:52:34.880
There will be attempts to program them
link |
00:52:36.640
to make them appear to be conscious.
link |
00:52:39.280
We can already write programs that make it look like,
link |
00:52:41.760
oh, what do you mean?
link |
00:52:42.600
Of course I'm aware that you're there, right?
link |
00:52:43.800
I mean, it's trivial to say stuff like that.
link |
00:52:45.720
It's easy to fool people,
link |
00:52:48.720
but does it actually have conscious experience like we do?
link |
00:52:53.400
Nobody has a clue how to do that yet.
link |
00:52:55.920
That seems to be something that is beyond
link |
00:52:58.720
what any of us knows how to build now.
link |
00:53:01.480
Will it have to have that?
link |
00:53:03.720
I think you can get pretty far
link |
00:53:05.520
with a lot of stuff without it.
link |
00:53:07.280
But will we accord it rights?
link |
00:53:10.960
Well, that's more a political game
link |
00:53:13.320
than it is a question of real consciousness.
link |
00:53:16.480
Yeah, can you go to jail for turning off Alexa
link |
00:53:18.720
is the question for an election maybe a few decades from now.
link |
00:53:24.960
Well, Sophia Robot's already been given rights
link |
00:53:27.640
as a citizen in Saudi Arabia, right?
link |
00:53:30.040
Even before women have full rights.
link |
00:53:33.680
Then the robot was still put back in the box
link |
00:53:36.960
to be shipped to the next place
link |
00:53:39.600
where it would get a paid appearance, right?
link |
00:53:42.760
Yeah, it's dark and almost comedic, if not absurd.
link |
00:53:50.160
So I've heard you speak about your journey in finding faith.
link |
00:53:54.640
Sure.
link |
00:53:55.960
And how you discovered some wisdoms about life
link |
00:54:00.040
and beyond from reading the Bible.
link |
00:54:03.240
And I've also heard you say that,
link |
00:54:05.440
you said scientists too often assume
link |
00:54:07.040
that nothing exists beyond what can be currently measured.
link |
00:54:11.360
Yeah, materialism.
link |
00:54:12.640
Materialism.
link |
00:54:13.480
And scientism, yeah.
link |
00:54:14.880
So in some sense, this assumption enables
link |
00:54:17.360
the near term scientific method,
link |
00:54:20.280
assuming that we can uncover the mysteries of this world
link |
00:54:25.360
by the mechanisms of measurement that we currently have.
link |
00:54:28.280
But we easily forget that we've made this assumption.
link |
00:54:33.960
So what do you think we miss out on
link |
00:54:35.640
by making that assumption?
link |
00:54:38.760
It's fine to limit the scientific method
link |
00:54:42.640
to things we can measure and reason about and reproduce.
link |
00:54:47.720
That's fine.
link |
00:54:49.640
I think we have to recognize
link |
00:54:51.160
that sometimes we scientists also believe
link |
00:54:53.160
in things that happen historically.
link |
00:54:55.560
Like I believe the Holocaust happened.
link |
00:54:57.920
I can't prove events from past history scientifically.
link |
00:55:03.880
You prove them with historical evidence, right?
link |
00:55:06.480
With the impact they had on people,
link |
00:55:08.200
with eyewitness testimony and things like that.
link |
00:55:11.640
So a good thinker recognizes that science
link |
00:55:15.680
is one of many ways to get knowledge.
link |
00:55:19.440
It's not the only way.
link |
00:55:21.640
And there's been some really bad philosophy
link |
00:55:24.280
and bad thinking recently, you can call it scientism,
link |
00:55:27.840
where people say science is the only way to get to truth.
link |
00:55:31.200
And it's not, it just isn't.
link |
00:55:33.360
There are other ways that work also.
link |
00:55:35.960
Like knowledge of love with someone.
link |
00:55:38.520
You don't prove your love through science, right?
link |
00:55:43.600
So history, philosophy, love,
link |
00:55:48.160
a lot of other things in life show us
link |
00:55:50.840
that there's more ways to gain knowledge and truth
link |
00:55:55.840
if you're willing to believe there is such a thing,
link |
00:55:57.800
and I believe there is, than science.
link |
00:56:01.040
I do, I am a scientist, however.
link |
00:56:03.080
And in my science, I do limit my science
link |
00:56:05.880
to the things that the scientific method can do.
link |
00:56:09.600
But I recognize that it's myopic
link |
00:56:11.800
to say that that's all there is.
link |
00:56:13.720
Right, there's, just like you listed,
link |
00:56:15.680
there's all the why questions.
link |
00:56:17.120
And really we know, if we're being honest with ourselves,
link |
00:56:20.520
the percent of what we really know is basically zero
link |
00:56:25.520
relative to the full mystery of the...
link |
00:56:28.120
Measure theory, a set of measure zero,
link |
00:56:30.360
if I have a finite amount of knowledge, which I do.
link |
00:56:34.520
So you said that you believe in truth.
link |
00:56:37.960
So let me ask that old question.
link |
00:56:40.600
What do you think this thing is all about?
link |
00:56:42.800
What's the life on earth?
link |
00:56:44.800
Life, the universe, and everything?
link |
00:56:46.200
And everything, what's the meaning?
link |
00:56:47.040
I can't quote Douglas Adams 42.
link |
00:56:49.680
It's my favorite number.
link |
00:56:51.240
By the way, that's my street address.
link |
00:56:52.640
My husband and I guessed the exact same number
link |
00:56:54.880
for our house, we got to pick it.
link |
00:56:57.760
And there's a reason we picked 42, yeah.
link |
00:57:00.120
So is it just 42 or is there,
link |
00:57:02.320
do you have other words that you can put around it?
link |
00:57:05.360
Well, I think there's a grand adventure
link |
00:57:07.760
and I think this life is a part of it.
link |
00:57:09.840
I think there's a lot more to it than meets the eye
link |
00:57:12.200
and the heart and the mind and the soul here.
link |
00:57:14.440
I think we see but through a glass dimly in this life.
link |
00:57:18.000
We see only a part of all there is to know.
link |
00:57:22.720
If people haven't read the Bible, they should,
link |
00:57:25.800
if they consider themselves educated.
link |
00:57:27.920
You could read Proverbs
link |
00:57:30.400
and find tremendous wisdom in there
link |
00:57:33.320
that cannot be scientifically proven.
link |
00:57:35.680
But when you read it, there's something in you,
link |
00:57:38.200
like a musician knows when the instrument's played right
link |
00:57:41.160
and it's beautiful.
link |
00:57:42.680
There's something in you that comes alive
link |
00:57:45.200
and knows that there's a truth there
link |
00:57:47.240
that it's like your strings are being plucked by the master
link |
00:57:50.160
instead of by me, right, playing when I pluck it.
link |
00:57:54.240
But probably when you play, it sounds spectacular, right?
link |
00:57:57.400
And when you encounter those truths,
link |
00:58:01.520
there's something in you that sings
link |
00:58:03.240
and knows that there is more
link |
00:58:06.320
than what I can prove mathematically
link |
00:58:09.200
or program a computer to do.
link |
00:58:11.280
Don't get me wrong, the math is gorgeous.
link |
00:58:13.360
The computer programming can be brilliant.
link |
00:58:16.040
It's inspiring, right?
link |
00:58:17.200
We wanna do more.
link |
00:58:19.200
None of this squashes my desire to do science
link |
00:58:21.840
or to get knowledge through science.
link |
00:58:23.640
I'm not dissing the science at all.
link |
00:58:26.040
I grow even more in awe of what the science can do
link |
00:58:29.560
because I'm more in awe of all there is we don't know.
link |
00:58:33.000
And really at the heart of science,
link |
00:58:36.040
you have to have a belief that there's truth,
link |
00:58:38.200
that there's something greater to be discovered.
link |
00:58:41.080
And some scientists may not wanna use the faith word,
link |
00:58:44.480
but it's faith that drives us to do science.
link |
00:58:47.800
It's faith that there is truth,
link |
00:58:49.920
that there's something to know that we don't know,
link |
00:58:52.520
that it's worth knowing, that it's worth working hard,
link |
00:58:56.000
and that there is meaning,
link |
00:58:58.320
that there is such a thing as meaning,
link |
00:58:59.880
which by the way, science can't prove either.
link |
00:59:02.720
We have to kind of start with some assumptions
link |
00:59:04.760
that there's things like truth and meaning.
link |
00:59:06.880
And these are really questions philosophers own, right?
link |
00:59:10.680
This is their space,
link |
00:59:11.760
of philosophers and theologians at some level.
link |
00:59:14.560
So these are things science,
link |
00:59:19.160
when people claim that science will tell you all truth,
link |
00:59:23.000
there's a name for that.
link |
00:59:23.920
It's its own kind of faith.
link |
00:59:25.600
It's scientism and it's very myopic.
link |
00:59:29.000
Yeah, there's a much bigger world out there to be explored
link |
00:59:32.360
in ways that science may not,
link |
00:59:34.800
at least for now, allow us to explore.
link |
00:59:37.120
Yeah, and there's meaning and purpose and hope
link |
00:59:40.080
and joy and love and all these awesome things
link |
00:59:43.240
that make it all worthwhile too.
link |
00:59:45.480
I don't think there's a better way to end it, Roz.
link |
00:59:47.680
Thank you so much for talking today.
link |
00:59:49.040
Thanks Lex, what a pleasure.
link |
00:59:50.360
Great questions.