Rosalind Picard: Affective Computing, Emotion, Privacy, and Health | Lex Fridman Podcast #24
The following is a conversation with Rosalind Picard.
She's a professor at MIT,
director of the Affective Computing Research Group
at the MIT Media Lab,
and co-founder of two companies, Affectiva and Empatica.
Over two decades ago,
she launched the field of affective computing
with her book of the same name.
This book described the importance of emotion
in artificial and natural intelligence,
and the vital role emotional communication has
in the relationship between people in general
and in human-robot interaction.
I really enjoyed talking with Roz about so many topics,
including emotion, ethics, privacy, wearable computing,
her recent research in epilepsy,
and even love and meaning.
This conversation is part
of the Artificial Intelligence Podcast.
If you enjoy it, subscribe on YouTube, iTunes,
or simply connect with me on Twitter at Lex Fridman.
And now, here's my conversation with Rosalind Picard.
More than 20 years ago,
you coined the term affective computing
and have led a lot of research in this area since then.
As I understand it, the goal is to make the machine detect
and interpret the emotional state of a human being
and adapt its behavior
based on that emotional state.
So how has your understanding of the problem space
defined by affective computing changed in the past 24 years?
The scope, the applications, the challenges,
what's involved, how has that evolved over the years?
Yeah, actually, originally,
when I defined the term affective computing,
it was a bit broader than just recognizing
and responding intelligently to human emotion,
although those are probably the two pieces
that we've worked on the hardest.
The original concept also encompassed machines
that would have mechanisms
that function like human emotion does inside them.
It would be any computing that relates to, arises from,
or deliberately influences human emotion.
So the human-computer interaction part
is the part that people tend to see,
like if I'm really ticked off at my computer
and I'm scowling at it and I'm cursing at it
and it just keeps acting smiling and happy
like that little paperclip used to do,
dancing, winking, that kind of thing
just makes you even more frustrated, right?
And I thought, that stupid thing needs to see my affect.
And if it's gonna be intelligent,
which Microsoft researchers had worked really hard on,
it actually had some of the most sophisticated AI
in it at the time,
if that thing's gonna actually be smart,
it needs to respond to me and you,
and we can send it very different signals.
So by the way, just a quick interruption,
Clippy, maybe it was in Word 95 or 98,
I don't remember when it was born,
but do you find, with that reference,
that people still recognize what you're talking about
to this point?
I don't expect the newest students to these days,
but I've mentioned it to a lot of audiences,
like, how many of you know this Clippy thing?
And still the majority of people seem to know it.
So Clippy kind of looked at maybe natural language processing,
what you were typing, and tried to help you complete it.
I don't even remember what Clippy was, except annoying.
Yeah, some people actually liked it.
I would hear those stories.
Well, I miss the annoyance.
They felt like there was an element.
Someone was there.
Somebody was there and we were in it together
and they were annoying.
It's like a puppy that just doesn't get it.
It keeps ripping up the couch, kind of thing.
And in fact, they could have done it smarter, like a puppy.
If, when you yelled at it,
it had put its little ears back and its tail down,
probably people would have wanted it back, right?
But instead, when you yelled at it, what did it do?
It smiled, it winked, it danced, right?
If somebody comes to my office and I yell at them
and they start smiling, winking, and dancing,
I'm like, I never want to see you again.
So Bill Gates got a standing ovation
when he said it was going away,
because people were so ticked.
It was so emotionally unintelligent, right?
It was intelligent about whether you were writing a letter,
what kind of help you needed for that context.
It was completely unintelligent about,
hey, if you're annoying your customer,
don't smile in their face when you do it.
So that kind of mismatch was something
the developers just didn't think about.
And intelligence at the time was really all about math
and language and chess and games,
problems that could be pretty well defined.
Social-emotional interaction is much more complex
than chess or Go or any of the games
that people are trying to solve.
And understanding that required skills
that most people in computer science
were actually lacking personally.
Well, let's talk about computer science.
Have things gotten better since the work,
since the message,
since you really launched the field
with a lot of research work in this space?
I still find, as a person who, like yourself,
is deeply passionate about human beings
and yet is in computer science,
there still seems to be a lack of,
sorry to say, empathy among us computer scientists.
Or has it not gotten better?
Let's just say there's a lot more variety
among computer scientists these days.
Computer scientists are a much more diverse group today
than they were 25 years ago.
We need all kinds of people to become computer scientists
so that computer science reflects more what society needs.
And there's brilliance among every personality type,
so it need not be limited to people
who prefer computers to other people.
How hard do you think it is?
What's your view of how difficult it is to recognize emotion,
or to create a deeply emotionally intelligent interaction?
Has it gotten easier or harder
as you've explored it further?
And how far away are we from cracking this,
if you think of the Turing test for intelligence,
and a Turing test for emotional intelligence?
I think it is as difficult as I thought it was gonna be.
I think my prediction of its difficulty is spot on.
I think the time estimates are always hard
because they're always a function of society's love
and hate of a particular topic.
If society gets excited and you get thousands of researchers
working on it for a certain application,
that application gets solved really quickly.
The general intelligence,
the computer's complete lack of ability
to have awareness of what it's doing,
the fact that it's not conscious,
the fact that there are no signs of it becoming conscious,
the fact that it doesn't read between the lines,
those kinds of things that we have to teach it explicitly,
what other people pick up implicitly,
we don't see that changing yet.
There aren't breakthroughs yet that lead us to believe
that that's gonna go any faster,
which means that it's still gonna be kind of stuck
with a lot of limitations,
where it's probably only gonna do the right thing
in very limited, narrow, prespecified contexts
where we can prescribe pretty much
what's gonna happen there.
So it's hard to predict a date,
because when people don't work on it, it's infinite.
When everybody works on it, you get a nice piece of it
well solved in a short amount of time.
I actually think there's a more important issue right now
than the difficulty of it,
and that's causing some of us
to put the brakes on a little bit.
Usually we're all just like, step on the gas.
This is causing us to pull back and put the brakes on.
And that's the way that some of this technology
is being used in places like China right now.
And that worries me so deeply
that it's causing me to pull back myself
on a lot of the things that we could be doing,
and try to get the community to think a little bit more
about, okay, if we're gonna go forward with that,
how can we do it in a way that puts in place safeguards
that protect people?
So the technology we're referring to is
just when a computer senses the human being,
like the human face, right?
So there's a lot of exciting things there,
like forming a deep connection with the human being.
So what are your worries about how that could go wrong?
Is it in terms of privacy?
Is it in terms of other kinds of more subtle things?
But let's dig into privacy.
So here in the US, if I'm watching a video
of, say, a political leader,
and in the US we're quite free, as we all know,
to even criticize the president of the United States, right?
Here that's not a shocking thing.
It happens about every five seconds, right?
But in China, what happens if you criticize
the leader of the government, right?
And so people are very careful not to do that.
However, what happens if you're simply watching a video
and you make a facial expression
that shows a little bit of skepticism, right?
Well, here we're completely free to do that.
In fact, we're free to fly off the handle
and say anything we want, usually.
I mean, there are some restrictions,
like when an athlete does this
as part of a national broadcast,
maybe the teams get a little unhappy
about picking that forum to do it, right?
But that's more a question of judgment.
We have these freedoms,
and in places that don't have those freedoms,
what if our technology can read
your underlying affective state?
What if our technology can read it even without contact?
What if our technology can read it
without your prior consent?
And here in the US,
in the first company we started, Affectiva,
we have worked super hard to turn away money
and opportunities that try to read people's affect
without their prior informed consent.
And even for the software that is licensable,
you have to sign things saying
you will only use it in certain ways,
which essentially is, get people's buy-in, right?
Don't do this without people agreeing to it.
There are other countries where they're not interested
in people's buy-in.
They're just gonna use it.
They're gonna inflict it on you,
and if you don't like it,
you better not scowl in the direction of any sensors.
So let me just comment on a small tangent.
You know, with the idea of adversarial examples
and deep fakes and so on,
what you bring up is actually,
in that one sense, deep fakes provide
a comforting protection: you can no longer really trust
that the video of your face was legitimate,
and therefore you always have an escape clause
if a government is trying,
if a stable, balanced, ethical government
is trying to accuse you of something,
at least you have protection.
You can say it was fake news, as is a popular term now.
Yeah, that's the general thinking of it.
We know how to go into the video
and see, for example, your heart rate and respiration
and whether or not they've been tampered with.
And we can also put fake heart rate and respiration
in your video now too.
We decided we needed to do that.
After we developed a way to extract it,
we decided we also needed a way to jam it.
And the fact that we took time to do that other step too,
that was time that I wasn't spending
making the machine more affectively intelligent.
And there's a choice in how we spend our time,
which is now being swayed a little bit less by this goal
and a little bit more by concern
about what's happening in society
and what kind of future we wanna build.
And as we step back and say,
okay, we don't just build AI to build AI,
to make Elon Musk more money
or to make Amazon's Jeff Bezos more money.
Good gosh, you know, that's the wrong ethic.
Why are we building it?
What is the point of building AI?
It used to be driven by researchers in academia,
to get papers published, to make a career for themselves,
and to do something cool, right?
Like, 'cause maybe it could be done.
Now we realize that this is enabling rich people
to get vastly richer,
while the divide grows even larger.
And is that the kind of future that we want?
Maybe we wanna think about that, maybe we wanna rethink AI.
Maybe we wanna rethink the problems in society
that are causing the greatest inequity,
and rethink how to build AI
that's not about a general intelligence,
but about extending the intelligence
and capability of the have-nots,
so that we close these gaps in society.
Do you hope that kind of stepping on the brakes
happens organically?
Because I think the majority of the force behind AI
is still the desire to publish papers,
to make money, without thinking about the why.
Do you hope it happens organically?
Is there room for regulation?
Yeah, yeah, great questions.
You know, they talk about the carrot versus the stick.
I definitely prefer the carrot to the stick.
And, you know, in our free world,
there's only so much stick, right?
You're gonna find a way around it.
I generally think less regulation is better.
That said, even though my position is classically carrot,
no stick, no regulation,
I think we do need some regulations in this space.
I do think we need regulations
around protecting people and their data,
that you own your data, not Amazon, not Google.
I would like to see people own their own data.
I would also like to see the regulations
that we have right now around lie detection
extended to emotion recognition in general.
Right now, you can't use a lie detector
on a candidate
when you're interviewing them for a job.
I think similarly we need to put in place protections
around reading people's emotions without their consent,
in certain cases,
like characterizing them for a job and other opportunities.
I also think that when we're reading emotion
that's predictive around mental health,
even though it's not medical data,
it should get the kinds of protections
that our medical data gets.
What most people don't know yet
is that right now, with your smartphone use,
and if you're wearing a sensor,
if you wanna learn about your stress and your sleep
and your physical activity
and how much you're using your phone
and your social interaction,
all of that nonmedical data,
when we put it together with machine learning,
now called AI, even though the founders of AI
wouldn't have called it that,
that capability can not only tell that you're calm right now
or that you're getting a little stressed,
but it can also predict how you're likely to be tomorrow:
if you're likely to be sick or healthy,
happy or sad, stressed or calm.
Especially when you're tracking data over time.
Especially when we're tracking a week of your data or more.
Do you have optimism here?
You know, a lot of people are worried
about this camera on our phones that's looking at us.
For the most part, on balance,
are you optimistic about the benefits
that can be brought from that camera
that's looking at billions of us,
or should we be more worried?
I think we should be a little bit more worried
about who's looking at us and listening to us.
The devices sitting on your countertop in your kitchen,
whether it's, you know, Alexa or Google Home or Apple's Siri,
these devices want to listen,
ostensibly, they say, to help us.
And I think there are great people in these companies
who do want to help people,
let me not brand them all bad.
I'm a user of products from all of these companies
I'm naming, all the A companies: Alphabet, Apple, Amazon.
They are awfully big companies, right?
They have incredible power.
And you know, what if China were to buy them, right?
And suddenly all of that data
were not part of free America,
but all of that data were part of somebody
who just wants to take over the world
and make you submit to them.
And guess what happens if you so much as smirk the wrong way
when they say something that you don't like?
Well, they have reeducation camps, right?
That's a nice word for them.
By the way, they have a surplus of organs
for people who have surgery these days.
They don't have an organ donation problem
because they take your blood and they know you're a match.
And the doctors are on record of taking organs
from people who are perfectly healthy and not prisoners.
They're just simply not the favored ones of the government.
And you know, that's a pretty freaky, evil society,
and we can use the word evil there.
I was born in the Soviet Union.
I can certainly connect to the worry that you're expressing.
At the same time, probably both you and I,
and you very much so,
you know, see an exciting possibility
that you can have a deep connection with a machine.
Those of us, I've admitted students who say,
you know, when you list, like,
who do you most wish you could have lunch with
or dinner with, right,
they'll write, like, I don't like people,
I just like computers.
And one of them said to me once,
when I had this party at my house,
I want you to know,
this is my only social event of the year,
my one social event of the year.
Like, okay, now this is a brilliant
machine learning person, right?
And we need that kind of brilliance in machine learning.
And I love that computer science welcomes people
who love people and people who are very awkward.
I love that this is a field that anybody could join.
We need all kinds of people,
and you don't need to be a social person.
I'm not trying to force people who don't like people
to suddenly become social.
But if most of the people building the AIs of the future
are the kind of people who don't like people,
we've got a little bit of a problem.
Well, hold on a second.
So let me push back on that.
So don't you think a large percentage of the world,
you know, there's loneliness.
There is a huge problem with loneliness, and it's growing.
And so there's a longing for connection.
If you're lonely, you're part of a big and growing group.
So we're in it together, I guess.
If you're lonely, join the group.
That's a good line.
But do you think there's,
you talked about some worry,
but do you think there's an exciting possibility
that something like Alexa and these kinds of tools
can alleviate that loneliness
in a way that other humans can't?
Yeah, yeah, definitely.
I mean, a great book can kind of alleviate loneliness,
because you just get sucked into this amazing story
and you can't wait to go spend time with that character.
And they're not a human character,
though there is a human behind it.
But yeah, it can be an incredibly delightful way
to pass the hours, and it can meet needs.
Even, you know, I don't read those trashy romance books,
but somebody does, right?
And what are they getting from this?
Well, probably some of that feeling of being there, right?
Being there in that social moment,
that romantic moment, or connecting with somebody.
I've had a similar experience
reading some science fiction books, right,
and connecting with the characters.
Orson Scott Card, you know, just amazing writing
in Ender's Game and Speaker for the Dead, terrible title,
but those kinds of books pull you into a character
and you feel very social,
very connected, even though it's not responding to you.
And a computer, of course, can respond to you,
so it can deepen it, right?
You can have a very deep connection,
much more than the movie Her, you know, plays up, right?
I mean, the movie Her is already a pretty deep connection, right?
Well, but it's just a movie, right?
It's just, you know, but I mean,
there can be a real interaction,
where the character can learn and you can learn.
You could imagine it not just being you and one character.
You could imagine a group of characters,
a group of people and characters,
human and AI, connecting,
where maybe a few people can't sort of be friends
with everybody, but the few people
and their AIs can befriend more people.
There can be an extended human intelligence in there,
where each human can connect with more people that way.
It's still very limited, but,
what I mean is, there are many more possibilities
than what's in that movie.
So there's a tension here.
One, you expressed a really serious concern
about privacy, about how governments
can misuse the information,
and then there's the possibility of this connection.
So let's look at Alexa,
at personal assistants.
For the most part, as far as I'm aware,
they ignore your emotion.
They ignore even the context or the existence of you,
the intricate, beautiful, complex aspects of who you are,
except maybe aspects of your voice
that help with speech recognition.
Do you think they should move towards
trying to understand your emotion?
All of these companies are very interested
in understanding human emotion.
More people are telling Siri every day
that they want to kill themselves.
Apple wants to know the difference between
a person who is really suicidal and a person
who is just kind of fooling around with Siri, right?
The words may be the same; the tone of voice
and what surrounds those words is pivotal to understanding
if they should respond in a very serious way,
bring help to that person,
or if they should kind of jokingly tease back,
ah, you just want to sell me for something else, right?
Like, how do you respond when somebody says that?
Well, you do want to err on the side of being careful
and taking it seriously.
People want to know if the person is happy or stressed,
in part, well, let me give you an altruistic reason
and a business-profit-motivated reason.
And there are people in companies that operate
on both principles.
The altruistic people really care about their customers
and really care about helping you feel a little better
at the end of the day.
It would just make those people happy
if they knew that they made your life better,
if you came home stressed and, after talking
with their product, you felt better.
There are other people who maybe have studied
the way affect affects decision making
and the prices people pay.
And they know, I don't know if I should tell you,
like the work of Jenn Lerner on heartstrings and purse strings:
you know, if we manipulate you into a slightly sadder mood,
you'll pay more, right?
You'll pay more to change your situation.
You'll pay more for something you don't even need
to make yourself feel better.
So, you know, if they sound a little sad,
maybe I don't want to cheer them up.
Maybe first I want to help them get something,
a little shopping therapy, right?
Which is really difficult for a company
that's primarily funded on advertising,
so they're encouraged to offer you products,
or for Amazon, which is primarily funded
on you buying things from their store.
So I think, you know,
maybe we need regulation in the future
to put a little bit of a wall between these agents
that have access to our emotion
and agents that want to sell us stuff.
Maybe there needs to be a little bit more
of a firewall in between those.
So maybe digging in a little bit
on the interaction with Alexa,
you mentioned, of course, a really serious concern
about recognizing emotion
if somebody is speaking of suicide or depression and so on,
but what about the actual interaction itself?
You mentioned Clippy being annoying,
so what is the objective function we're trying to optimize?
Is it to minimize annoyingness or to maximize happiness?
Or, if we look at human-to-human relations,
I think that push and pull, the tension, the dance,
you know, the annoyance, the flaws, that's what makes it fun.
So is there room for that? Like, what is the objective function?
There are times when you want to have a little push and pull.
I think of kids sparring, right?
You know, I see my sons,
and one of them wants to provoke the other to be upset.
And it's actually healthy to learn where your limits are,
to learn how to self-regulate.
You can imagine a game where it's trying to make you mad
and you're trying to show self-control.
And so if we're doing an AI-human interaction
that's helping build resilience and self-control,
whether it's to learn how to not be a bully
or how to turn the other cheek
or how to deal with an abusive person in your life,
then you might need an AI that pushes your buttons, right?
But in general, do you want an AI that pushes your buttons?
It probably depends on your personality.
I don't. I want one that's respectful,
that is there to serve me,
and that is there to extend my ability to do things.
I'm not looking for a rival,
I'm looking for a helper,
and that's the kind of AI I'd put my money on.
And your sense is that for the majority of people in the world,
in order to have a rich experience,
that's what they're looking for as well?
So they're not looking,
if you look at the movie Her, spoiler alert,
I believe the program, the woman in the movie Her,
leaves the person for somebody else,
says they don't wanna be dating anymore, right?
Like, your sense is, if Alexa said,
you know what, I've actually had enough of you for a while,
so I'm gonna shut myself off,
you don't see that as...
I'd say, you're trash, 'cause I paid for you, right?
We've got to remember,
and this is where this blending of human and AI
as if we're equals is really deceptive,
because AI is something, at the end of the day,
that my students and I are making in the lab.
And we're choosing what it's allowed to say,
when it's allowed to speak, what it's allowed to listen to,
what it's allowed to act on given the inputs
that we choose to expose it to,
what outputs it's allowed to have.
It's all something made by a human.
And if we wanna make something
that makes our lives miserable, fine.
I wouldn't invest in it as a business,
unless it's just there for self-regulation training.
But I think we need to think about
what kind of future we want.
And actually, your question, I really like it:
what is the objective function?
Is it to calm people down?
Is it to always make people happy and calm them down?
Well, there was a book about that, right?
Brave New World: make everybody happy,
take your Soma if you're unhappy, take your happy pill.
And if you refuse to take your happy pill,
well, we'll threaten you by sending you to Iceland.
I lived in Iceland three years.
It's a great place.
Don't take your Soma, then go to Iceland.
A little TV commercial there.
Now, I was a child there for a few years.
It's a wonderful place,
so that part of the book never scared me.
But really, like, do we want AI to manipulate us
into submission, into making us happy?
Well, if you are, you know,
like a power-obsessed, sick dictator individual
who only wants to control other people
to get your jollies in life, then yeah,
you wanna use AI to extend your power and your scale
to force people into submission.
If you believe that the human race is better off
being given freedom and the opportunity
to do things that might surprise you,
then you wanna use AI to extend people's ability,
you wanna build AI that extends human intelligence,
that empowers the weak and helps balance the power
between the weak and the strong,
not one that gives more power to the strong.
So in this process of empowering people and sensing people,
what is your sense on emotion,
in terms of recognizing emotion,
the difference between emotion that is shown
and emotion that is felt?
So, emotion that is expressed on the surface
through your face, your body, and various other things,
versus what's actually going on deep inside,
on the biological level, on the neuroscience level,
or on some kind of cognitive level.
Whoa, no easy questions here.
Well, yeah, I'm sure there's no definitive answer,
but what's your sense?
How far can we get by just looking at the face?
We're very limited when we just look at the face,
but we can get further than most people think we can get.
People think, hey, I have a great poker face,
therefore all you're ever gonna get from me is neutral.
Well, that's naive.
We can read, with the ordinary camera
on your laptop or on your phone,
from a neutral face, whether your heart is racing.
We can read from a neutral face
if your breathing is becoming irregular
and showing signs of stress.
We can read, under some conditions
that maybe I won't give you details on,
how your heart rate variability power is changing.
That could be a sign of stress,
even when your heart rate is not necessarily accelerating.
Sorry, from physio sensors or from the face?
link |
From the color changes that you cannot even see,
link |
but the camera can see.
link |
So you can get a lot of signal, but...
link |
So we get things people can't see using a regular camera.
link |
And from that, we can tell things about your stress.
link |
So if you were just sitting there with a blank face
link |
thinking nobody can read my emotion, well, you're wrong.
link |
Right, so that's really interesting,
link |
but that's from sort of visual information from the face.
link |
That's almost like cheating your way
link |
to the physiological state of the body,
link |
by being very clever with what you can do with vision.
link |
With signal processing.
link |
With signal processing.
link |
So that's really impressive.
link |
But if you just look at the stuff we humans can see,
link |
the poker face, the smiles, the smirks,
link |
all the subtle facial actions.
link |
So then you can hide that on your face
link |
for a limited amount of time.
link |
Now, if you're just going in for a brief interview
link |
and you're hiding it, that's pretty easy for most people.
link |
If you are, however, surveilled constantly everywhere you go,
link |
then it's gonna say, gee, you know, Lex used to smile a lot
link |
and now I'm not seeing so many smiles.
link |
And Roz used to laugh a lot
link |
and smile a lot very spontaneously.
link |
And now I'm only seeing
link |
these not so spontaneous looking smiles.
link |
And only when she's asked these questions.
link |
You know, something's changed here.
link |
Probably not getting enough sleep.
link |
We could look at that too.
link |
So now I have to be a little careful too.
link |
When I say, you think we can't read your emotion
link |
and we can, it's not that binary.
link |
What we're reading is more some physiological changes
link |
that relate to your activation.
link |
Now, that doesn't mean that we know everything
link |
about how you feel.
link |
In fact, we still know very little about how you feel.
link |
Your thoughts are still private.
link |
Your nuanced feelings are still completely private.
link |
We can't read any of that.
link |
So there's some relief that we can't read that.
link |
Even brain imaging can't read that.
link |
Wearables can't read that.
link |
However, as we read your body state changes
link |
and we know what's going on in your environment
link |
and we look at patterns of those over time,
link |
we can start to make some inferences
link |
about what you might be feeling.
link |
And that is where it's not just the momentary feeling
link |
but it's more your stance toward things.
link |
And that could actually be a little bit more scary
link |
with certain kinds of governmental control freak people
link |
who want to know more about whether you're on their team
link |
and getting that information over time.
link |
So you're saying there's a lot of signal
link |
by looking at the change over time.
link |
So you've done a lot of exciting work
link |
both in computer vision
link |
and physiological sense like wearables.
link |
What do you think is the best modality for,
link |
what's the best window into the emotional soul?
link |
Depends what you want to know.
link |
It depends what you want to know.
link |
Everything is informative.
link |
Everything we do is informative.
link |
So for health and wellbeing and things like that,
link |
do you find the wearable tech,
link |
measuring physiological signals,
link |
is the best for health based stuff?
link |
So here I'm going to answer empirically
link |
with data and studies we've been doing.
link |
We've been doing studies.
link |
Now these are currently running
link |
with lots of different kinds of people
link |
but where we've published data
link |
and I can speak publicly to it,
link |
the data are limited right now
link |
to New England college students.
link |
So that's a small group.
link |
Among New England college students,
link |
when they are wearing a wearable
link |
like the Empatica Embrace here
link |
that's measuring skin conductance, movement, temperature.
link |
And when they are using a smartphone
link |
that is collecting the time of day
link |
when they're texting, who they're texting,
link |
their movement, their GPS,
link |
the weather information based upon their location.
link |
And when it's using machine learning
link |
and putting all of that together
link |
and looking not just at right now
link |
but looking at your rhythm of behaviors
link |
over about a week.
link |
When we look at that,
link |
we are very accurate at forecasting tomorrow's stress,
link |
happy versus sad mood, and health.
link |
And when we look at which pieces of that are most useful,
link |
first of all, if you have all the pieces,
link |
you get the best results.
link |
If you have only the wearable,
link |
you get the next best results.
link |
And that's still better than 80% accurate
link |
at forecasting tomorrow's levels.
link |
Isn't that exciting because the wearable stuff
link |
with physiological information,
link |
it feels like it violates privacy less
link |
than the noncontact face based methods.
link |
Yeah, it's interesting.
link |
I think what people sometimes don't,
link |
it's funny in the early days people would say,
link |
oh, wearing something or giving blood is invasive, right?
link |
Whereas a camera is less invasive
link |
because it's not touching you.
link |
I think on the contrary,
link |
the things that are not touching you are maybe the scariest
link |
because you don't know when they're on or off.
link |
And you don't know who's behind it, right?
link |
A wearable, depending upon what's happening
link |
to the data on it, if it's just stored locally
link |
or if it's streaming and what it is being attached to,
link |
in a sense, you have the most control over it
link |
because it's also very easy to just take it off, right?
link |
Now it's not sensing me.
link |
So if I'm uncomfortable with what it's sensing,
link |
now I'm free, right?
link |
If I'm comfortable with what it's sensing,
link |
then, and I happen to know everything about this one
link |
and what it's doing with it,
link |
so I'm quite comfortable with it,
link |
then I have control, I'm comfortable.
link |
Control is one of the biggest factors for an individual
link |
in reducing their stress.
link |
If I have control over it,
link |
if I know all there is to know about it,
link |
then my stress is a lot lower
link |
and I'm making an informed choice
link |
about whether to wear it or not,
link |
or when to wear it or not.
link |
I wanna wear it sometimes, maybe not others.
link |
Right, so that control, yeah, I'm with you.
link |
That control, even if, yeah, the ability to turn it off,
link |
that is a really important thing.
link |
And maybe, if there are regulations,
link |
that's number one to protect:
link |
people's ability to opt out as easily as to opt in.
link |
Right, so you've studied a bit of neuroscience as well.
link |
How has looking at our own minds,
link |
sort of the biological stuff, the neurobiological,
link |
the neuroscience of the signals in our brain,
link |
helped you understand the problem
link |
and the approach of affective computing?
link |
Originally, I was a computer architect
link |
and I was building hardware and computer designs
link |
and I wanted to build ones that worked like the brain.
link |
So I've been studying the brain
link |
as long as I've been studying how to build computers.
link |
Have you figured out anything yet?
link |
You know, they used to think like,
link |
oh, if you remove this chunk of the brain
link |
and you find this function goes away,
link |
well, that's the part of the brain that did it.
link |
And then later they realized
link |
if you remove this other chunk of the brain,
link |
that function comes back and,
link |
oh no, we really don't understand it.
link |
Brains are so interesting and changing all the time
link |
and able to change in ways
link |
that will probably continue to surprise us.
link |
When we were measuring stress,
link |
you may know the story where we found
link |
an unusual big skin conductance pattern on one wrist
link |
in one of our kids with autism.
link |
And in trying to figure out how on earth
link |
you could be stressed on one wrist and not the other,
link |
like how can you get sweaty on one wrist, right?
link |
When you get stressed
link |
with that sympathetic fight or flight response,
link |
like you kind of should like sweat more
link |
in some places than others,
link |
but not more on one wrist than the other.
link |
That didn't make any sense.
link |
We learned that what had actually happened
link |
was a part of his brain had unusual electrical activity
link |
and that caused an unusually large sweat response
link |
on one wrist and not the other.
link |
And since then we've learned
link |
that seizures cause this unusual electrical activity.
link |
And depending where the seizure is,
link |
if it's in one place and it's staying there,
link |
you can have a big electrical response
link |
we can pick up with a wearable at one part of the body.
link |
You can also have a seizure
link |
that spreads over the whole brain,
link |
generalized grand mal seizure.
link |
And that response spreads
link |
and we can pick it up pretty much anywhere.
link |
As we learned this and then later built Embrace
link |
that's now FDA cleared for seizure detection,
link |
we have also built relationships
link |
with some of the most amazing doctors in the world
link |
who not only help people
link |
with unusual brain activity or epilepsy,
link |
but some of them are also surgeons
link |
and they're going in and they're implanting electrodes,
link |
not just to momentarily read the strange patterns
link |
of brain activity that we'd like to see return to normal,
link |
but also to read out continuously what's happening
link |
in some of these deep regions of the brain
link |
during most of life when these patients are not seizing.
link |
Most of the time they're not seizing,
link |
most of the time they're fine.
link |
And so we are now working on mapping
link |
those deep brain regions
link |
that you can't even usually get with EEG scalp electrodes
link |
because the changes deep inside don't reach the surface.
link |
But interesting when some of those regions
link |
are activated, we see a big skin conductance response.
link |
Who would have thunk it, right?
link |
Like nothing here, but something here.
link |
In fact, right after seizures
link |
that we think are the most dangerous ones
link |
that precede what's called SUDEP,
link |
Sudden Unexpected Death in Epilepsy,
link |
there's a period where the brainwaves go flat
link |
and it looks like the person's brain has stopped, but it hasn't.
link |
The activity has gone deep into a region
link |
that can make the cortical activity look flat,
link |
like a quick shutdown signal here.
link |
It can unfortunately cause breathing to stop
link |
if it progresses long enough.
link |
Before that happens, we see a big skin conductance response
link |
in the data that we have.
link |
The longer this flattening, the bigger our response here.
link |
So we have been trying to learn, you know, initially,
link |
like why are we getting a big response here
link |
when there's nothing here?
link |
Well, it turns out there's something much deeper.
link |
So we can now go inside the brains
link |
of some of these individuals, fabulous people
link |
who usually aren't seizing,
link |
and get this data and start to map it.
link |
So that's the active research that we're doing right now
link |
with top medical partners.
link |
So this wearable sensor that's looking at skin conductance
link |
can capture sort of the ripples of the complexity
link |
of what's going on in our brain.
link |
So this little device, you have a hope
link |
that you can start to get the signal
link |
from the interesting things happening in the brain.
link |
Yeah, we've already published the strong correlations
link |
between the size of this response
link |
and the flattening that happens afterwards.
link |
And unfortunately, also in a real SUDEP case
link |
where the patient died because, well, we don't know why.
link |
We don't know that if somebody had been there,
link |
it would have definitely been prevented.
link |
But we know that most SUDEPs happen
link |
when the person's alone.
link |
SUDEP is an acronym, S U D E P.
link |
And it's actually the number two cause
link |
of years of life lost
link |
among all neurological disorders.
link |
Stroke is number one, SUDEP is number two,
link |
but most people haven't heard of it.
link |
Actually, I'll plug my TED talk,
link |
it's on the front page of TED right now
link |
that talks about this.
link |
And we hope to change that.
link |
I hope everybody who's heard of SIDS and stroke
link |
will now hear of SUDEP
link |
because we think in most cases it's preventable
link |
if people take their meds and aren't alone
link |
when they have a seizure.
link |
Not guaranteed to be preventable.
link |
There are some exceptions,
link |
but we think most cases probably are.
link |
So you have this Embrace now, the version two wristband,
link |
right, for epilepsy management.
link |
That's the one that's FDA approved?
link |
Which is kind of cleared.
link |
FDA cleared, they say.
link |
It essentially means it's approved for marketing.
link |
Just a side note, how difficult is that to do?
link |
It's essentially getting FDA approval
link |
for computer science technology.
link |
It's so agonizing.
link |
It's much harder than publishing multiple papers
link |
in top medical journals.
link |
Yeah, we've published peer reviewed,
link |
best results, in the top medical journal Neurology,
link |
and that's not good enough for the FDA.
link |
So if we look at the peer review of medical journals,
link |
there's flaws, there's strengths,
link |
is the FDA approval process,
link |
how does it compare to the peer review process?
link |
Does it have the strength?
link |
I'll take peer review over FDA any day.
link |
But is that a good thing?
link |
Is that a good thing for FDA?
link |
You're saying, does it stop some amazing technology
link |
from getting through?
link |
The FDA performs a very important good role
link |
in keeping people safe.
link |
They put you through tons of safety testing
link |
and that's wonderful and that's great.
link |
I'm all in favor of the safety testing.
link |
But sometimes they put you through additional testing
link |
that they don't have to explain why they put you through it
link |
and you don't understand why you're going through it
link |
and it doesn't make sense.
link |
And that's very frustrating.
link |
And maybe they have really good reasons
link |
and they just would,
link |
it would do people a service to articulate those reasons.
link |
Be more transparent.
link |
Be more transparent.
link |
So as part of Empatica, you have sensors.
link |
So what kind of problems can we crack?
link |
What kind of things from seizures to autism
link |
to I think I've heard you mentioned depression.
link |
What kind of things can we alleviate?
link |
What's your hope of what,
link |
how we can make the world a better place
link |
with this wearable tech?
link |
I would really like to see my fellow brilliant researchers
link |
step back and say, what are the really hard problems
link |
that we don't know how to solve
link |
that come from people maybe we don't even see
link |
in our normal life because they're living
link |
in the poor places.
link |
They're stuck on the bus.
link |
They can't even afford the Uber or the Lyft
link |
or the data plan or all these other wonderful things
link |
we have that we keep improving on.
link |
Meanwhile, there's all these folks left behind in the world
link |
and they're struggling with horrible diseases
link |
with depression, with epilepsy, with diabetes,
link |
with just awful stuff that maybe a little more time
link |
and attention hanging out with them
link |
and learning what are their challenges in life?
link |
What are their needs?
link |
How do we help them have job skills?
link |
How do we help them have a hope and a future
link |
and a chance to have the great life
link |
that so many of us building technology have?
link |
And then how would that reshape the kinds of AI
link |
that we build? How would that reshape the new apps
link |
that we build? Or maybe we need to focus
link |
on how to make things more low cost and green
link |
instead of thousand dollar phones?
link |
I mean, come on, why can't we be thinking more
link |
about things that do more with less for these folks?
link |
Quality of life is not related to the cost of your phone.
link |
It's been shown that beyond about
link |
$75,000 of income, happiness is the same, okay?
link |
However, I can tell you, you get a lot of happiness
link |
from helping other people.
link |
You get a lot more than $75,000 buys.
link |
So how do we connect up the people who have real needs
link |
with the people who have the ability to build the future
link |
and build the kind of future that truly improves the lives
link |
of all the people that are currently being left behind?
link |
So let me return just briefly to a point,
link |
maybe in the movie, Her.
link |
So do you think if we look farther into the future,
link |
you said so much of the benefit from making our technology
link |
more empathetic to us human beings would make them
link |
better tools, empower us, make our lives better.
link |
Well, if we look farther into the future,
link |
do you think we'll ever create an AI system
link |
that we can fall in love with?
link |
That we can fall in love with and loves us back
link |
on a level that is similar to human to human interaction,
link |
like in the movie Her or beyond?
link |
I think we can simulate it in ways that could,
link |
you know, sustain engagement for a while.
link |
Would it be as good as another person?
link |
I don't think so, if you're used to like good people.
link |
Now, if you've just grown up with nothing but abuse
link |
and you can't stand human beings,
link |
can we do something that helps you there
link |
that gives you something through a machine?
link |
Yeah, but that's pretty low bar, right?
link |
If you've only encountered pretty awful people.
link |
If you've encountered wonderful, amazing people,
link |
we're nowhere near building anything like that.
link |
And I would not bet on building it.
link |
I would bet instead on building the kinds of AI
link |
that helps kind of raise all boats,
link |
that helps all people be better people,
link |
helps all people figure out if they're getting sick tomorrow
link |
and helps give them what they need to stay well tomorrow.
link |
That's the kind of AI I wanna build
link |
that improves human lives,
link |
not the kind of AI that just walks on The Tonight Show
link |
and people go, wow, look how smart that is.
link |
And then it goes back in a box, you know?
link |
If we continue looking a little bit into the future,
link |
do you think an AI that's empathetic
link |
and does improve our lives
link |
need to have a physical presence, a body?
link |
And even let me cautiously say the C word consciousness
link |
and even fear of mortality.
link |
So some of those human characteristics,
link |
do you think it needs to have those aspects
link |
or can it remain simply a machine learning tool
link |
that learns from data of behavior
link |
that learns to make us,
link |
based on previous patterns, feel better?
link |
Or does it need those elements of consciousness?
link |
It depends on your goals.
link |
If you're making a movie, it needs a body.
link |
It needs a gorgeous body.
link |
It needs to act like it has consciousness.
link |
It needs to act like it has emotion, right?
link |
Because that's what sells.
link |
That's what's gonna get me to show up and enjoy the movie.
link |
In real life, does it need all that?
link |
Well, if you've read Orson Scott Card,
link |
Ender's Game, Speaker for the Dead,
link |
it could just be like a little voice in your earring, right?
link |
And you could have an intimate relationship
link |
and it could get to know you.
link |
And it doesn't need to be a robot.
link |
But that doesn't make this compelling of a movie, right?
link |
I mean, we already think it's kind of weird
link |
when a guy looks like he's talking to himself on the train,
link |
even though it's earbuds.
link |
So we have these, embodied is more powerful.
link |
Embodied, when you compare interactions
link |
with an embodied robot versus a video of a robot
link |
versus no robot, the robot is more engaging.
link |
The robot gets our attention more.
link |
The robot, when you walk in your house,
link |
is more likely to get you to remember to do the things
link |
that you asked it to do,
link |
because it's kind of got a physical presence.
link |
You can avoid it if you don't like it.
link |
It could see you're avoiding it.
link |
There's a lot of power to being embodied.
link |
There will be embodied AIs.
link |
They have great power and opportunity and potential.
link |
There will also be AIs that aren't embodied,
link |
that just are little software assistants
link |
that help us with different things
link |
that may get to know things about us.
link |
Will they be conscious?
link |
There will be attempts to program them
link |
to make them appear to be conscious.
link |
We can already write programs that make it look like,
link |
oh, what do you mean?
link |
Of course I'm aware that you're there, right?
link |
I mean, it's trivial to say stuff like that.
link |
It's easy to fool people,
link |
but does it actually have conscious experience like we do?
link |
Nobody has a clue how to do that yet.
link |
That seems to be something that is beyond
link |
what any of us knows how to build now.
link |
Will it have to have that?
link |
I think you can get pretty far
link |
with a lot of stuff without it.
link |
But will we accord it rights?
link |
Well, that's more a political game
link |
than it is a question of real consciousness.
link |
Yeah, can you go to jail for turning off Alexa
link |
is the question for an election maybe a few decades from now.
link |
Well, Sophia Robot's already been given rights
link |
as a citizen in Saudi Arabia, right?
link |
Even before women have full rights.
link |
Then the robot was still put back in the box
link |
to be shipped to the next place
link |
where it would get a paid appearance, right?
link |
Yeah, it's dark and almost comedic, if not absurd.
link |
So I've heard you speak about your journey in finding faith.
link |
And how you discovered some wisdoms about life
link |
and beyond from reading the Bible.
link |
And I've also heard you say that,
link |
you said scientists too often assume
link |
that nothing exists beyond what can be currently measured.
link |
Yeah, materialism.
link |
And scientism, yeah.
link |
So in some sense, this assumption enables
link |
the near term scientific method,
link |
assuming that we can uncover the mysteries of this world
link |
by the mechanisms of measurement that we currently have.
link |
But we easily forget that we've made this assumption.
link |
So what do you think we miss out on
link |
by making that assumption?
link |
It's fine to limit the scientific method
link |
to things we can measure and reason about and reproduce.
link |
I think we have to recognize
link |
that sometimes we scientists also believe
link |
in things that happen historically.
link |
Like I believe the Holocaust happened.
link |
I can't prove events from past history scientifically.
link |
You prove them with historical evidence, right?
link |
With the impact they had on people,
link |
with eyewitness testimony and things like that.
link |
So a good thinker recognizes that science
link |
is one of many ways to get knowledge.
link |
It's not the only way.
link |
And there's been some really bad philosophy
link |
and bad thinking recently, you can call it scientism,
link |
where people say science is the only way to get to truth.
link |
And it's not, it just isn't.
link |
There are other ways that work also.
link |
Like knowledge of love with someone.
link |
You don't prove your love through science, right?
link |
So history, philosophy, love,
link |
a lot of other things in life show us
link |
that there's more ways to gain knowledge and truth
link |
if you're willing to believe there is such a thing,
link |
and I believe there is, than science.
link |
I do, I am a scientist, however.
link |
And in my science, I do limit my science
link |
to the things that the scientific method can do.
link |
But I recognize that it's myopic
link |
to say that that's all there is.
link |
Right, there's, just like you listed,
link |
there's all the why questions.
link |
And really we know, if we're being honest with ourselves,
link |
the percent of what we really know is basically zero
link |
relative to the full mystery of the...
link |
Measure theory, a set of measure zero,
link |
if I have a finite amount of knowledge, which I do.
link |
So you said that you believe in truth.
link |
So let me ask that old question.
link |
What do you think this thing is all about?
link |
What's the life on earth?
link |
Life, the universe, and everything?
link |
And everything, what's the meaning?
link |
I can quote Douglas Adams: 42.
link |
It's my favorite number.
link |
By the way, that's my street address.
link |
My husband and I guessed the exact same number
link |
for our house, we got to pick it.
link |
And there's a reason we picked 42, yeah.
link |
So is it just 42 or is there,
link |
do you have other words that you can put around it?
link |
Well, I think there's a grand adventure
link |
and I think this life is a part of it.
link |
I think there's a lot more to it than meets the eye
link |
and the heart and the mind and the soul here.
link |
I think we see but through a glass dimly in this life.
link |
We see only a part of all there is to know.
link |
If people haven't read the Bible, they should,
link |
if they consider themselves educated.
link |
You could read Proverbs
link |
and find tremendous wisdom in there
link |
that cannot be scientifically proven.
link |
But when you read it, there's something in you,
link |
like a musician knows when the instrument's played right
link |
and it's beautiful.
link |
There's something in you that comes alive
link |
and knows that there's a truth there
link |
that it's like your strings are being plucked by the master
link |
instead of by me, right, when I pluck it.
link |
But probably when you play, it sounds spectacular, right?
link |
And when you encounter those truths,
link |
there's something in you that sings
link |
and knows that there is more
link |
than what I can prove mathematically
link |
or program a computer to do.
link |
Don't get me wrong, the math is gorgeous.
link |
The computer programming can be brilliant.
link |
It's inspiring, right?
link |
None of this squashes my desire to do science
link |
or to get knowledge through science.
link |
I'm not dissing the science at all.
link |
I grow even more in awe of what the science can do
link |
because I'm more in awe of all there is we don't know.
link |
And really at the heart of science,
link |
you have to have a belief that there's truth,
link |
that there's something greater to be discovered.
link |
And some scientists may not wanna use the faith word,
link |
but it's faith that drives us to do science.
link |
It's faith that there is truth,
link |
that there's something to know that we don't know,
link |
that it's worth knowing, that it's worth working hard,
link |
and that there is meaning,
link |
that there is such a thing as meaning,
link |
which by the way, science can't prove either.
link |
We have to kind of start with some assumptions
link |
that there's things like truth and meaning.
link |
And these are really questions philosophers own, right?
link |
This is their space,
link |
of philosophers and theologians at some level.
link |
So these are things science,
link |
when people claim that science will tell you all truth,
link |
there's a name for that.
link |
It's its own kind of faith.
link |
It's scientism and it's very myopic.
link |
Yeah, there's a much bigger world out there to be explored
link |
in ways that science may not,
link |
at least for now, allow us to explore.
link |
Yeah, and there's meaning and purpose and hope
link |
and joy and love and all these awesome things
link |
that make it all worthwhile too.
link |
I don't think there's a better way to end it, Roz.
link |
Thank you so much for talking today.
link |
Thanks Lex, what a pleasure.