
Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars | Lex Fridman Podcast #322



link |
00:00:00.000
there's a broader question here, right? As we build socially and emotionally intelligent machines,
link |
00:00:07.920
what does that mean about our relationship with them and then more broadly our relationship with
link |
00:00:12.640
one another, right? Because this machine is going to be programmed to be amazing at empathy,
link |
00:00:18.240
by definition, right? It's going to always be there for you. It's not going to get bored.
link |
00:00:23.440
I don't know how I feel about that. I think about that a lot.
link |
00:00:25.680
LEX FRIDMAN The following is a conversation with Rana
link |
00:00:30.320
el Kaliouby, a pioneer in the field of emotion recognition and human-centric artificial
link |
00:00:36.080
intelligence. She is the founder of Affectiva, deputy CEO of Smart Eye, author of Girl Decoded,
link |
00:00:43.920
and one of the most brilliant, kind, inspiring, and fun human beings I've gotten the chance to
link |
00:00:49.200
talk to. This is the Lex Fridman Podcast. To support it, please check out our sponsors in
link |
00:00:54.800
the description. And now, dear friends, here's Rana el Kaliouby. You grew up in the Middle East,
link |
00:01:02.400
in Egypt. What is the memory from that time that makes you smile? Or maybe a memory that stands out
link |
00:01:08.000
as helping your mind take shape and helping you define yourself in this world?
link |
00:01:12.320
RANA EL KALIOUBY So the memory that stands out is we used to
link |
00:01:15.440
live in my grandma's house. She used to have these mango trees in her garden. And in the summer,
link |
00:01:21.680
and so mango season was like July and August. And so in the summer, she would invite all my aunts
link |
00:01:26.640
and uncles and cousins. And it was just like maybe there were like 20 or 30 people in the house,
link |
00:01:31.680
and she would cook all this amazing food. And us, the kids, we would go down the garden,
link |
00:01:38.080
and we would pick all these mangoes. And I don't know, I think it's just the bringing people
link |
00:01:43.920
together that always stuck with me, the warmth. LEX FRIDMAN Around the mango tree.
link |
00:01:47.920
RANA EL KALIOUBY Yeah, around the mango tree. And there's just like the joy, the joy of being
link |
00:01:52.800
together around food. And I'm a terrible cook. So I guess that didn't, that memory didn't translate
link |
00:02:00.880
to me kind of doing the same. I love hosting people. LEX FRIDMAN Do you remember colors, smells?
link |
00:02:05.520
Is that what, like what, how does memory work? Like what do you visualize? Do you visualize
link |
00:02:10.560
people's faces, smiles? Do you, is there colors? Is there like a theme to the colors? Is it smells
link |
00:02:19.360
because of food involved? RANA EL KALIOUBY Yeah, I think that's a great question. So the,
link |
00:02:23.360
those Egyptian mangoes, there's a particular type that I love, and it's called Darwasi mangoes. And
link |
00:02:28.800
they're kind of, you know, they're oval, and they have a little red in them. So I kind of,
link |
00:02:33.680
they're red and mango colored on the outside. So I remember that. LEX FRIDMAN Does red indicate like
link |
00:02:39.600
extra sweetness? Is that, is that, that means like it's nicely, yeah, it's nice and ripe and stuff.
link |
00:02:45.520
Yeah. What, what's like a definitive food of Egypt? You know, there's like these almost
link |
00:02:52.640
stereotypical foods in different parts of the world, like Ukraine invented borscht.
link |
00:02:59.600
Borscht is this beet soup that you put sour cream on. See, I can't tell if you,
link |
00:03:04.800
if you know what it is, I think, you know, it's delicious. But if I explain it,
link |
00:03:10.880
it's just not going to sound delicious. I feel like beet soup. This doesn't make any sense,
link |
00:03:15.280
but that's kind of, and you probably have actually seen pictures of it because it's one of the
link |
00:03:19.600
traditional foods in Ukraine, in Russia, in different parts of the Slavic world. So that's,
link |
00:03:26.800
but it's become so cliche and stereotypical that you almost don't mention it, but it's still
link |
00:03:31.520
delicious. When I visited Ukraine, I ate that every single day, so.
link |
00:03:35.440
Do you, do you make it yourself? How hard is it to make?
link |
00:03:38.480
No, I don't know. I think to make it well, like anything, like Italians, they say, well,
link |
00:03:44.320
tomato sauce is easy to make, but to make it right, that's like a generational skill. So anyway,
link |
00:03:51.760
is there something like that in Egypt? Is there a culture of food?
link |
00:03:55.200
There is. And actually, we have a similar kind of soup. It's called molokhia, and it's, it's made
link |
00:04:02.880
of this green plant. It's like, it's somewhere between spinach and kale, and you mince it,
link |
00:04:07.520
and then you cook it in like chicken broth. And my grandma used to make it, and my mom makes it really
link |
00:04:13.360
well, and I try to make it, but it's not as great. So we used to have that. And then we used to have
link |
00:04:18.080
it alongside stuffed pigeons. I'm pescetarian now, so I don't eat that anymore, but.
link |
00:04:23.520
Stuffed pigeons.
link |
00:04:24.480
Yeah, it's like, it was really yummy. It's the one thing I miss about,
link |
00:04:28.480
you know, now that I'm pescetarian and I don't eat.
link |
00:04:32.080
The stuffed pigeons?
link |
00:04:33.040
Yeah, the stuffed pigeons.
link |
00:04:35.440
Is it, what are they stuffed with? If that doesn't bother you too much to describe.
link |
00:04:39.920
No, no, it's stuffed with a lot of like just rice and, yeah, it's just rice. Yeah, so.
link |
00:04:46.000
And you also said, in your book, that your first computer
link |
00:04:51.120
was an Atari, and Space Invaders was your favorite game.
link |
00:04:56.000
Is that when you first fell in love with computers, would you say?
link |
00:04:58.800
Yeah, I would say so.
link |
00:05:00.160
Video games, or just the computer itself? Just something about the machine.
link |
00:05:04.160
Ooh, this thing, there's magic in here.
link |
00:05:07.840
Yeah, I think the magical moment is definitely like playing video games with my,
link |
00:05:12.080
I have two younger sisters, and we would just like have fun together, like playing games.
link |
00:05:17.120
But the other memory I have is my first code, the first code I wrote.
link |
00:05:22.240
I wrote, I drew a Christmas tree, and I'm Muslim, right?
link |
00:05:26.720
So it's kind of, it was kind of funny that the first thing I did was like this Christmas tree.
link |
00:05:32.000
So, yeah, and that's when I realized, wow, you can write code to do all sorts of like
link |
00:05:38.320
really cool stuff. I must have been like six or seven at the time.
link |
00:05:42.720
So you can write programs, and the programs do stuff for you. That's power.
link |
00:05:48.560
That's, if you think about it, that's empowering.
link |
00:05:50.880
It's AI.
link |
00:05:51.600
Yeah, I know what it is. I don't know if that, you see like,
link |
00:05:56.400
I don't know if many people think of it that way when they first learned to program.
link |
00:05:59.520
They just love the puzzle of it. Like, ooh, this is cool. This is pretty.
link |
00:06:02.880
It's a Christmas tree, but like, it's power.
link |
00:06:05.600
It is power.
link |
00:06:06.960
Eventually, I guess you couldn't at the time, but eventually this thing,
link |
00:06:11.040
if it's interesting enough, if it's a pretty enough Christmas tree,
link |
00:06:14.640
it can be run by millions of people and bring them joy, like that little thing.
link |
00:06:19.280
And then because it's digital, it's easy to spread.
link |
00:06:22.400
So like you just created something that's easily spreadable to millions of people.
link |
00:06:26.560
Totally.
link |
00:06:28.160
It's hard to think that way when you're six.
link |
00:06:30.800
In the book, you write, I am who I am because I was raised by a particular set of parents,
link |
00:06:37.040
both modern and conservative, forward thinking, yet locked in tradition.
link |
00:06:41.760
I'm a Muslim and I feel I'm stronger, more centered for it.
link |
00:06:46.000
I adhere to the values of my religion, even if I'm not as dutiful as I once was.
link |
00:06:50.960
And I am a new American and I'm thriving on the energy,
link |
00:06:55.040
vitality and entrepreneurial spirit of this great country.
link |
00:06:59.840
So let me ask you about your parents.
link |
00:07:01.520
What have you learned about life from them, especially when you were young?
link |
00:07:05.280
So both my parents, they're Egyptian, but they moved to Kuwait right out.
link |
00:07:09.920
Actually, there's a cute story about how they met.
link |
00:07:11.680
So my dad taught COBOL in the 70s.
link |
00:07:14.960
Nice.
link |
00:07:15.680
And my mom decided to learn programming.
link |
00:07:18.240
So she signed up to take his COBOL programming class.
link |
00:07:22.400
And he tried to date her and she was like, no, no, no, I don't date.
link |
00:07:26.640
And so he's like, okay, I'll propose.
link |
00:07:28.240
And that's how they got married.
link |
00:07:29.680
Whoa, strong move.
link |
00:07:30.960
Right, exactly, right.
link |
00:07:32.240
That's really impressive.
link |
00:07:35.760
Those COBOL guys know how to impress a lady.
link |
00:07:40.640
So yeah, so what have you learned from them?
link |
00:07:43.520
So definitely grit.
link |
00:07:44.720
One of the core values in our family is just hard work.
link |
00:07:48.320
There were no slackers in our family.
link |
00:07:50.720
And that's something that's definitely stayed with me,
link |
00:07:55.920
both as a professional, but also in my personal life.
link |
00:07:58.480
But I also think my mom, my mom always used to like, I don't know, it was like unconditional
link |
00:08:06.160
love.
link |
00:08:06.720
Like I just knew my parents would be there for me kind of regardless of what I chose to do.
link |
00:08:14.240
And I think that's very powerful.
link |
00:08:15.520
And they got tested on it because I kind of challenged cultural norms and I kind of took
link |
00:08:21.600
a different path, I guess, than what's expected of a woman in the Middle East.
link |
00:08:27.360
And they still love me, which I'm so grateful for that.
link |
00:08:32.480
When was like a moment that was the most challenging for them?
link |
00:08:35.440
Which moment where they kind of had to come face to face with the fact that you're a bit
link |
00:08:42.240
of a rebel?
link |
00:08:44.080
I think the first big moment was when I had just gotten married, but I decided to go do
link |
00:08:52.080
my PhD at Cambridge University.
link |
00:08:53.840
And because my husband at the time, he's now my ex, ran a company in Cairo, he was going
link |
00:08:59.440
to stay in Egypt.
link |
00:09:00.240
So it was going to be a long distance relationship.
link |
00:09:03.120
And that's very unusual in the Middle East for a woman to just head out and kind of pursue
link |
00:09:09.040
her career.
link |
00:09:09.840
And so my dad and my parents in law both said, you know, we do not approve of you doing this,
link |
00:09:18.720
but now you're under the jurisdiction of your husband so he can make the call.
link |
00:09:22.480
And luckily for me, he was supportive.
link |
00:09:26.800
He said, you know, this is your dream come true.
link |
00:09:29.600
You've always wanted to do a PhD.
link |
00:09:30.960
I'm going to support you.
link |
00:09:33.200
So I think that was the first time where, you know, I challenged the cultural norms.
link |
00:09:39.120
Was that scary?
link |
00:09:40.080
Oh, my God, yes.
link |
00:09:41.360
It was totally scary.
link |
00:09:42.720
What's the biggest culture shock from there to Cambridge, to London?
link |
00:09:50.320
Well, that was also right around September 11th.
link |
00:09:56.480
So everyone thought that there was going to be a third world war.
link |
00:10:01.040
It was really like, and I, at the time I used to wear the hijab, so I was very visibly Muslim.
link |
00:10:07.680
And so my parents just were, they were afraid for my safety.
link |
00:10:11.840
But anyways, when I got to Cambridge, because I was so scared, I decided to take off my
link |
00:10:15.600
headscarf and wear a hat instead.
link |
00:10:17.440
So I just went to class wearing these like British hats, which was, in my opinion, actually
link |
00:10:22.320
worse than just showing up in a headscarf because it was just so awkward, right?
link |
00:10:25.920
Like sitting in class with like all these.
link |
00:10:27.680
Trying to fit in.
link |
00:10:29.040
Yeah.
link |
00:10:29.360
Like a spy.
link |
00:10:30.320
Yeah, yeah, yeah.
link |
00:10:31.280
So after a few weeks of doing that, I was like, to heck with that.
link |
00:10:34.640
I'm just going to go back to wearing my headscarf.
link |
00:10:37.120
Yeah, you wore the hijab, so starting in 2000 and for 12 years after.
link |
00:10:43.840
So it's always, whenever you're in public, you have to wear the head covering.
link |
00:10:47.280
Can you speak to that, to the hijab, maybe your mixed feelings about it?
link |
00:10:52.480
Like what does it represent in its best case?
link |
00:10:55.120
What does it represent in the worst case?
link |
00:10:56.960
Yeah, you know, I think there's a lot of, I guess I'll first start by saying I wore
link |
00:11:03.040
it voluntarily.
link |
00:11:04.000
I was not forced to wear it.
link |
00:11:05.360
And in fact, I was one of the very first women in my family to decide to put on the hijab.
link |
00:11:09.920
And my family thought it was really odd, right?
link |
00:11:13.040
Like there was, they were like, why do you want to put this on?
link |
00:11:15.840
And at its best, it's a sign of modesty, humility.
link |
00:11:20.480
Yeah.
link |
00:11:22.160
It's like me wearing a suit, people are like, why are you wearing a suit?
link |
00:11:25.280
It's a step back into some kind of tradition, a respect for tradition of sorts.
link |
00:11:30.880
So you said, because it's by choice, you're kind of free to make that choice to celebrate
link |
00:11:36.000
a tradition of modesty.
link |
00:11:37.760
Exactly. And I actually like made it my own.
link |
00:11:40.960
I remember I would really match the color of my headscarf with what I was wearing.
link |
00:11:45.680
Like it was a form of self expression and at its best, I loved wearing it.
link |
00:11:52.080
You know, I have a lot of questions around how we practice religion and,
link |
00:11:56.480
you know, and I think also it was a time where I was spending a lot of time going back and
link |
00:12:02.160
forth between the US and Egypt.
link |
00:12:04.480
And I started meeting a lot of people in the US who are just amazing people, very purpose
link |
00:12:09.920
driven, people who have very strong core values, but they're not Muslim.
link |
00:12:14.720
That's okay, right?
link |
00:12:15.760
And so that was when I just had a lot of questions.
link |
00:12:19.920
And politically, also the situation in Egypt was when the Muslim Brotherhood ran the country
link |
00:12:25.120
and I didn't agree with their ideology.
link |
00:12:29.520
It was at a time when I was going through a divorce.
link |
00:12:31.360
Like it was like, it was like just the perfect storm of like political, personal conditions
link |
00:12:37.040
where I was like, this doesn't feel like me anymore.
link |
00:12:40.160
And it took a lot of courage to take it off because culturally it's not, it's okay if
link |
00:12:44.960
you don't wear it, but it's really not okay to wear it and then take it off.
link |
00:12:50.000
But you're still, so you have to do that while still maintaining a deep core and pride in
link |
00:12:56.480
the origins, in your origin story.
link |
00:13:02.400
Totally.
link |
00:13:02.960
So still being Egyptian, still being a Muslim.
link |
00:13:06.640
Right.
link |
00:13:07.200
And being, I think generally like faith driven, but yeah.
link |
00:13:14.240
But what that means changes year by year for you.
link |
00:13:17.440
It's like a personal journey.
link |
00:13:18.800
Yeah, exactly.
link |
00:13:20.080
What would you say is the role of faith in that part of the world?
link |
00:13:23.120
Like, how do you see, you mentioned it a bit in the book too.
link |
00:13:26.480
Yeah.
link |
00:13:27.600
I mean, I think, I think there is something really powerful about just believing that
link |
00:13:34.480
there's a bigger force, you know, there's a kind of surrendering, I guess, that comes
link |
00:13:39.200
with religion and you surrender and you have this deep conviction that it's going to be
link |
00:13:43.360
okay, right?
link |
00:13:44.480
Like the universe is out to like do amazing things for you and it's going to be okay.
link |
00:13:48.880
And there's strength to that.
link |
00:13:50.080
Like even when you're going through adversity, you just know that it's going to work out.
link |
00:13:57.040
Yeah, it gives you like an inner peace, a calmness.
link |
00:13:59.440
Exactly, exactly.
link |
00:14:00.640
Yeah, that's, it's faith in all the meanings of that word.
link |
00:14:04.880
Right.
link |
00:14:05.440
Faith that everything is going to be okay.
link |
00:14:07.200
And it is because time passes and time cures all things.
link |
00:14:12.320
It's like a calmness with the chaos of the world.
link |
00:14:15.200
Yeah.
link |
00:14:15.680
And also there's like a silver lining, I'm a true believer in this, that something at a specific
link |
00:14:22.320
moment in time can look like it's catastrophic and it's not what you wanted in life.
link |
00:14:28.800
But then time passes and then you look back and there's the silver lining, right?
link |
00:14:32.880
It maybe closed the door, but it opened a new door for you.
link |
00:14:37.120
And so I'm a true believer in that, that, you know, there's a silver lining in almost
link |
00:14:42.880
anything in life, you just have to have this like, yeah, faith or conviction that it's
link |
00:14:47.120
going to work out.
link |
00:14:47.760
Yeah, it's such a beautiful way to see a shady feeling.
link |
00:14:50.720
So if you feel shady about a current situation, I mean, it almost is always true.
link |
00:14:57.840
Unless it's the cliché thing of whatever doesn't kill you makes
link |
00:15:04.800
you stronger.
link |
00:15:05.360
It's, it does seem that over time when you take a perspective on things that the hardest
link |
00:15:13.200
moments and periods of your life are the most meaningful.
link |
00:15:18.400
Yeah, yeah.
link |
00:15:19.280
So over time you get to have that perspective.
link |
00:15:21.520
Right.
link |
00:15:23.760
What about, because you mentioned Kuwait, what about, let me ask you about war.
link |
00:15:30.000
What's the role of war and peace, maybe even the big love and hate in that part of
link |
00:15:35.680
the world, because it does seem to be a part of the world where there's turmoil.
link |
00:15:40.720
There was turmoil, there's still turmoil.
link |
00:15:44.560
It is so unfortunate, honestly.
link |
00:15:46.480
It's such a waste of human resources and, yeah, of human mindshare.
link |
00:15:53.760
I mean, and at the end of the day, we all kind of want the same things.
link |
00:15:57.280
We want, you know, we want a human connection, we want joy, we want to feel fulfilled, we
link |
00:16:02.560
want to feel, you know, a life of purpose.
link |
00:16:05.760
And I just, I just find it baffling, honestly, that we are still having to grapple with that.
link |
00:16:14.160
I have a story to share about this.
link |
00:16:15.840
You know, I grew up, I'm Egyptian, American now, but, but, you know, originally from Egypt.
link |
00:16:21.840
And when I first got to Cambridge, it turned out my officemate, like my PhD officemate, you
link |
00:16:28.080
know, we ended up becoming friends, but she was from Israel.
link |
00:16:32.960
And we didn't know, yeah, we didn't know how it was going to be like.
link |
00:16:37.760
Did you guys sit there just staring at each other for a bit?
link |
00:16:41.040
Actually, she, because I arrived before she did.
link |
00:16:44.560
And it turns out she emailed our PhD advisor and asked him if she thought it was going
link |
00:16:50.320
to be okay.
link |
00:16:52.320
Yeah.
link |
00:16:52.800
And this is around 9/11 too.
link |
00:16:55.040
Yeah.
link |
00:16:55.680
And Peter Robinson, our PhD advisor, was like, yeah, like, this is an academic
link |
00:17:01.280
institution, just show up.
link |
00:17:02.720
And we became super good friends.
link |
00:17:04.480
We were both new moms.
link |
00:17:07.200
Like we both had our kids during our PhD.
link |
00:17:09.200
We were both doing artificial emotional intelligence.
link |
00:17:11.360
She was looking at speech.
link |
00:17:12.320
I was looking at the face.
link |
00:17:13.680
We just, the culture was so similar.
link |
00:17:17.040
Our jokes were similar.
link |
00:17:18.320
It was just, I was like, why on earth are our countries, why is there all this like
link |
00:17:24.080
war and tension?
link |
00:17:25.200
And I think it falls back to the narrative, right?
link |
00:17:27.360
If you change the narrative, like whoever creates this narrative of war.
link |
00:17:31.840
I don't know.
link |
00:17:32.400
We should have women run the world.
link |
00:17:34.640
Yeah, that's one solution.
link |
00:17:37.520
The good women, because there's also evil women in the world.
link |
00:17:40.640
True, okay.
link |
00:17:43.920
But yes, yes, there could be less war if women ran the world.
link |
00:17:47.280
The other aspect is, it doesn't matter the gender, the people in power.
link |
00:17:54.560
I get to see this with Ukraine and Russia and different parts of the world around that
link |
00:17:59.280
conflict now.
link |
00:18:00.800
And that's happening in Yemen as well and everywhere else.
link |
00:18:05.200
There's these narratives told by the leaders to the populace.
link |
00:18:09.840
And those narratives take hold and everybody believes that.
link |
00:18:12.400
And they have a distorted view of the humanity on the other side.
link |
00:18:17.920
In fact, especially during war, you don't even see the people on the other side as human
link |
00:18:25.120
or of equal intelligence or worth or value as you.
link |
00:18:30.960
You tell all kinds of narratives about them being Nazis or dumb or whatever narrative
link |
00:18:40.000
you want to weave around that or evil.
link |
00:18:44.720
But I think when you actually meet them face to face, you realize they're like the same.
link |
00:18:49.120
Exactly, right?
link |
00:18:50.400
It's actually a big shock for people to realize that they've been essentially lied to within
link |
00:18:58.960
their country.
link |
00:19:00.000
And I kind of have faith that social media, as ridiculous as it is to say, or any kind
link |
00:19:05.840
of technology, is able to bypass the walls that governments put up and connect people
link |
00:19:13.520
directly.
link |
00:19:14.000
And then you get to realize, oh, people fall in love across different nations and religions
link |
00:19:20.880
and so on.
link |
00:19:21.440
And that, I think, ultimately can cure a lot of our ills, especially in person.
link |
00:19:26.640
I also think that if leaders met in person, they'd have a conversation that could cure
link |
00:19:32.400
a lot of the ills of the world, especially in private.
link |
00:19:37.680
Let me ask you about the women running the world.
link |
00:19:42.320
So gender does, in part, perhaps shape the landscape of just our human experience.
link |
00:19:51.040
So in what ways was it limiting and in what ways was it empowering for you to be a woman
link |
00:19:57.760
in the Middle East?
link |
00:19:58.560
I think, just kind of going back to my comment on women running the world, I think it comes
link |
00:20:03.920
back to empathy, which has been a common thread throughout my entire career.
link |
00:20:08.800
And it's this idea of human connection.
link |
00:20:12.320
Once you build common ground with a person or a group of people, you build trust, you
link |
00:20:16.800
build loyalty, you build friendship.
link |
00:20:20.160
And then you can turn that into behavior change and motivation and persuasion.
link |
00:20:24.480
So it's like, empathy and emotions are just at the center of everything we do.
link |
00:20:30.720
And I think being from the Middle East, kind of this human connection is very strong.
link |
00:20:38.080
We have this running joke that if you come to Egypt for a visit, people will know everything
link |
00:20:44.640
about your life right away, right?
link |
00:20:46.000
I have no problems asking you about your personal life.
link |
00:20:48.400
There's no boundaries, really, no personal boundaries in terms of getting to know people.
link |
00:20:53.200
We get emotionally intimate very, very quickly.
link |
00:20:56.400
But I think people just get to know each other authentically, I guess.
link |
00:21:01.680
There isn't this superficial level of getting to know people.
link |
00:21:05.040
You just try to get to know people really deeply.
link |
00:21:06.880
Empathy is a part of that.
link |
00:21:08.080
Totally.
link |
00:21:08.640
Because you can put yourself in this person's shoe and kind of, yeah, imagine what challenges
link |
00:21:15.680
they're going through, and so I think I've definitely taken that with me.
link |
00:21:21.760
Generosity is another one too, like just being generous with your time and love and attention
link |
00:21:26.960
and even with your wealth, right?
link |
00:21:30.480
Even if you don't have a lot of it, you're still very generous.
link |
00:21:32.800
And I think that's another...
link |
00:21:34.720
Enjoying the humanity of other people.
link |
00:21:38.000
And so do you think there's a useful difference between men and women in that?
link |
00:21:44.720
In that aspect and empathy?
link |
00:21:48.880
Or is doing these kind of big general groups, does that hinder progress?
link |
00:21:56.880
Yeah, I actually don't want to overgeneralize.
link |
00:21:59.760
I mean, some of the men I know are like the most empathetic humans.
link |
00:22:03.520
Yeah, I strive to be empathetic.
link |
00:22:05.200
Yeah, you're actually very empathetic.
link |
00:22:10.640
Yeah, so I don't want to overgeneralize.
link |
00:22:13.360
Although one of the researchers I worked with when I was at Cambridge, Professor Simon Baron-Cohen,
link |
00:22:18.400
he's Sacha Baron Cohen's cousin, and he runs the Autism Research Centre at Cambridge,
link |
00:22:25.120
and he's written multiple books on autism.
link |
00:22:29.600
And one of his theories is the empathy scale, like the systemizers and the empathizers,
link |
00:22:35.040
and there's a disproportionate amount of computer scientists and engineers who are
link |
00:22:42.800
systemizers and perhaps not great empathizers, and then there's more men in that bucket,
link |
00:22:51.760
I guess, than women, and then there's more women in the empathizers bucket.
link |
00:22:56.000
So again, not to overgeneralize.
link |
00:22:58.160
I sometimes wonder about that.
link |
00:22:59.520
It's been frustrating to me how many, I guess, systemizers there are in the field of robotics.
link |
00:23:05.200
Yeah.
link |
00:23:06.240
It's actually encouraging to me because I care about, obviously, social robotics,
link |
00:23:10.000
and because there's more opportunity for people that are empathic.
link |
00:23:18.720
Exactly.
link |
00:23:19.200
I totally agree.
link |
00:23:20.400
Well, right?
link |
00:23:20.960
So it's nice.
link |
00:23:21.760
Yes.
link |
00:23:22.160
So every roboticist I talk to, they don't see the human as interesting, it's not exciting.
link |
00:23:29.200
You want to avoid the human at all costs.
link |
00:23:32.160
It's a safety concern to be touching the human, which it is, but it is also an opportunity
link |
00:23:39.200
for deep connection or collaboration or all that kind of stuff.
link |
00:23:43.360
And because most brilliant roboticists don't care about the human, it's an opportunity,
link |
00:23:49.280
in your case, it's a business opportunity too, but in general, an opportunity to explore
link |
00:23:53.840
those ideas.
link |
00:23:54.640
So in this beautiful journey to Cambridge, to UK, and then to America, what's the moment
link |
00:24:03.760
or moments that were most transformational for you as a scientist and as a leader?
link |
00:24:09.760
So you became an exceptionally successful CEO, founder, researcher, scientist, and so on.
link |
00:24:18.320
Was there a phase shift there where, like, I can be somebody, I can really do something
link |
00:24:25.040
in this world?
link |
00:24:26.640
Yeah.
link |
00:24:26.880
So actually, just kind of a little bit of background.
link |
00:24:29.680
So the reason why I moved from Cairo to Cambridge, UK to do my PhD is because I had a very clear
link |
00:24:36.960
career plan.
link |
00:24:37.920
I was like, okay, I'll go abroad, get my PhD, going to crush it in three or four years,
link |
00:24:43.280
come back to Egypt and teach.
link |
00:24:45.360
It was very clear, very well laid out.
link |
00:24:47.520
Was topic clear or no?
link |
00:24:49.280
The topic, well, I did my PhD around building artificial emotional intelligence and looking
link |
00:24:54.400
at...
link |
00:24:54.400
But in your master plan ahead of time, when you're sitting by the mango tree, did you
link |
00:24:58.880
know it's going to be artificial intelligence?
link |
00:25:00.480
No, no, no, that I did not know.
link |
00:25:02.880
Although I think I kind of knew that I was going to be doing computer science, but I
link |
00:25:07.840
didn't know the specific area.
link |
00:25:10.160
But I love teaching.
link |
00:25:11.120
I mean, I still love teaching.
link |
00:25:13.120
So I just, yeah, I just wanted to go abroad, get a PhD, come back, teach.
link |
00:25:18.960
Why computer science?
link |
00:25:19.760
Can we just linger on that?
link |
00:25:21.200
What?
link |
00:25:21.440
Because you're such an empathic person who cares about emotion, humans and so on.
link |
00:25:25.520
Isn't, aren't computers cold and emotionless and just...
link |
00:25:31.440
We're changing that.
link |
00:25:32.480
Yeah, I know, but like, isn't that the case? Or did you see computers as having the
link |
00:25:38.400
capability to actually connect with humans?
link |
00:25:42.400
I think that was like my takeaway from my experience just growing up, like computers
link |
00:25:46.720
sit at the center of how we connect and communicate with one another, right?
link |
00:25:50.320
Or technology in general.
link |
00:25:51.760
Like I remember my first experience being away from my parents.
link |
00:25:54.640
We communicated with a fax machine, but thank goodness for the fax machine, because we
link |
00:25:58.800
could send letters back and forth to each other.
link |
00:26:00.800
This was pre emails and stuff.
link |
00:26:04.080
So I think, I think there's, I think technology can be not just transformative in terms of
link |
00:26:09.600
productivity, et cetera.
link |
00:26:10.960
It actually does change how we connect with one another.
link |
00:26:14.960
Can I just defend the fax machine?
link |
00:26:16.720
There's something like the haptic feel because the email is all digital.
link |
00:26:22.720
There's something really nice.
link |
00:26:23.760
I still write letters to people.
link |
00:26:26.400
There's something nice about the haptic aspect of the fax machine, because you still have
link |
00:26:30.160
to press, you still have to do something in the physical world to make this thing a reality.
link |
00:26:35.200
Right, and then it like comes out as a printout and you can actually touch it and read it.
link |
00:26:39.760
Yeah.
link |
00:26:40.240
There's something, there's something lost when it's just an email.
link |
00:26:44.960
Obviously I wonder how we can regain some of that in the digital world, which goes to
link |
00:26:51.440
the metaverse and all those kinds of things.
link |
00:26:53.760
We'll talk about it anyway.
link |
00:26:54.800
So, actually, a question on that one.
link |
00:26:57.680
Do you still, do you have photo albums anymore?
link |
00:27:00.400
Do you still print photos?
link |
00:27:03.600
No, no, but I'm a minimalist.
link |
00:27:06.320
Okay.
link |
00:27:06.640
So it was one of the, one of the painful steps in my life was to scan all the photos and
link |
00:27:12.400
let go of them and then let go of all my books.
link |
00:27:16.320
You let go of your books?
link |
00:27:17.600
Yeah.
link |
00:27:18.000
Switch to Kindle, everything Kindle.
link |
00:27:19.920
Yeah.
link |
00:27:20.800
So I thought, I thought, okay, think 30 years from now, nobody's going to have books anymore.
link |
00:27:29.440
The technology of digital books is going to get better and better and better.
link |
00:27:32.160
Are you really going to be the guy that's still romanticizing physical books?
link |
00:27:36.240
Are you going to be the old man on the porch who's like kids?
link |
00:27:39.440
Yes.
link |
00:27:40.480
So just get used to it because it was, it felt, it still feels a little bit uncomfortable
link |
00:27:45.040
to read on a Kindle, but get used to it.
link |
00:27:48.560
Like you always, I mean, I'm always trying to learn new programming languages,
link |
00:27:53.200
like with technology, you have to kind of challenge yourself to adapt to it.
link |
00:27:56.720
You know, I forced myself to use TikTok.
link |
00:27:58.880
No, that thing doesn't need much forcing.
link |
00:28:01.440
It pulls you in like a, like the worst kind of, or the best kind of drug.
link |
00:28:05.920
Anyway, yeah.
link |
00:28:08.560
So yeah, but I do love haptic things.
link |
00:28:11.760
There's a magic to the haptic.
link |
00:28:13.440
Even like touchscreens, it's tricky to get right, to get the experience of a button.
link |
00:28:19.520
Yeah.
link |
00:28:22.400
Anyway, what were we talking about?
link |
00:28:23.760
So AI, so the journey, your whole plan was to come back to Cairo and teach.
link |
00:28:30.560
Right.
link |
00:28:31.840
And then.
link |
00:28:32.560
Where did the plan go wrong?
link |
00:28:33.840
Yeah, exactly.
link |
00:28:34.720
Right.
link |
00:28:35.120
And then I get to Cambridge and I fall in love with the idea of research.
link |
00:28:39.120
Right.
link |
00:28:39.440
And kind of embarking on a path.
link |
00:28:41.520
Nobody's explored this path before.
link |
00:28:43.680
You're building stuff that nobody's built before.
link |
00:28:45.440
And it's challenging and it's hard.
link |
00:28:46.960
And there's a lot of nonbelievers.
link |
00:28:49.280
I just totally love that.
link |
00:28:50.960
And at the end of my PhD, I think it's the meeting that changed the trajectory of my life.
link |
00:28:56.880
Professor Rosalind Picard, who runs the Affective Computing Group at the MIT Media Lab.
link |
00:29:02.160
I had read her book.
link |
00:29:03.040
I, you know, I was like following, following, following all her research.
link |
00:29:07.520
AKA Roz.
link |
00:29:08.880
Yes, AKA Roz.
link |
00:29:10.080
Yes.
link |
00:29:10.880
And she was giving a talk at a pattern recognition conference in Cambridge.
link |
00:29:16.320
And she had a couple of hours to kill.
link |
00:29:18.000
So she emailed the lab and she said, you know, if any students want to meet with me, like,
link |
00:29:22.560
just, you know, sign up here.
link |
00:29:24.640
And so I signed up for a slot and I spent like the weeks leading up to it preparing for this
link |
00:29:29.920
meeting and I wanted to show her a demo of my research and everything.
link |
00:29:34.400
And we met and we ended up hitting it off.
link |
00:29:36.640
Like we totally clicked.
link |
00:29:38.080
And at the end of the meeting, she said, do you want to come work with me as a postdoc
link |
00:29:42.480
at MIT?
link |
00:29:44.720
And this is what I told her.
link |
00:29:45.600
I was like, okay, this would be a dream come true, but there's a husband waiting for me
link |
00:29:49.280
in Cairo.
link |
00:29:49.840
I kind of have to go back.
link |
00:29:51.120
Yeah.
link |
00:29:52.080
She said, it's fine.
link |
00:29:52.960
Just commute.
link |
00:29:54.560
And I literally started commuting between Cairo and Boston.
link |
00:29:59.200
Yeah, it was, it was a long commute.
link |
00:30:01.200
And I did that, like every few weeks I would, you know, hop on a plane and go to
link |
00:30:05.520
Boston.
link |
00:30:06.400
But that, that changed the trajectory of my life.
link |
00:30:08.480
There was no, I kind of outgrew my dreams, right?
link |
00:30:12.880
I didn't want to go back to Egypt anymore and be faculty.
link |
00:30:16.720
Like that was no longer my dream.
link |
00:30:18.320
I had a dream.
link |
00:30:19.200
What was the, what was it like to be at MIT?
link |
00:30:22.560
What was that culture shock?
link |
00:30:25.040
You mean America in general, but also, I mean, Cambridge has its own culture, right?
link |
00:30:31.040
So what was MIT like and what was America like?
link |
00:30:34.000
I think, I wonder if that's similar to your experience at MIT.
link |
00:30:37.600
I was just, at the Media Lab in particular, I was just really, impressed is not the right
link |
00:30:45.200
word.
link |
00:30:46.240
I didn't expect the openness to like innovation and the acceptance of taking a risk and failing.
link |
00:30:54.800
Like failure isn't really accepted back in Egypt, right?
link |
00:30:58.320
You don't want to fail.
link |
00:30:59.040
Like there's a fear of failure, which I think has been hardwired in my brain.
link |
00:31:03.200
But you get to MIT and it's okay to start things.
link |
00:31:05.840
And if they don't work out, like it's okay.
link |
00:31:08.000
You pivot to another idea.
link |
00:31:09.840
And that kind of thinking was just very new to me.
link |
00:31:12.640
That's liberating.
link |
00:31:13.600
Well, Media Lab, for people who don't know, MIT Media Lab is its own beautiful thing because
link |
00:31:19.840
they, I think more than other places at MIT, reach for big ideas.
link |
00:31:24.000
And like they try, I mean, I think, I mean, depending of course on who, but certainly
link |
00:31:28.480
with Rosalind, you try wild stuff, you try big things and crazy things and also try to take
link |
00:31:36.160
things to completion so you can demo them.
link |
00:31:38.240
So always, always, always have a demo.
link |
00:31:42.240
Like if you go, one of the sad things to me about robotics labs at MIT, and there's like
link |
00:31:46.880
over 30, I think, is like, usually when you show up to a robotics lab, there's not a single
link |
00:31:53.680
working robot, they're all broken.
link |
00:31:55.760
All the robots are broken.
link |
00:31:57.280
The robots are broken, which is like the normal state of things because you're working on
link |
00:32:01.600
them.
link |
00:32:02.080
But it would be nice if we lived in a world where robotics labs had some robots functioning.
link |
00:32:08.880
One of my like favorite moments that just sticks with me, I visited Boston Dynamics
link |
00:32:13.360
and there was a, first of all, seeing so many spots, so many legged robots in one place.
link |
00:32:20.240
I'm like, I'm home.
link |
00:32:22.720
But the, yeah.
link |
00:32:24.880
This is where I was built.
link |
00:32:27.200
The cool thing was just to see this random Spot robot walking down the hall.
link |
00:32:33.360
It was probably doing mapping, but it looked like he wasn't doing anything. And he was wearing,
link |
00:32:37.120
he or she, I don't know.
link |
00:32:39.120
But, well, in my mind, they're people, they have a backstory, but this one
link |
00:32:44.640
in particular definitely has a backstory because he was wearing a cowboy hat.
link |
00:32:48.640
So I just saw a spot robot with a cowboy hat walking down the hall and there was just this
link |
00:32:54.160
feeling like there's a life, like he has a life.
link |
00:32:58.880
He probably has to commute back to his family at night.
link |
00:33:02.240
Like there's a, there's a feeling like there's life instilled in this robot and it's magical.
link |
00:33:07.440
I don't know.
link |
00:33:07.760
It was, it was kind of inspiring to see.
link |
00:33:09.520
Did it say hello to, did he say hello to you?
link |
00:33:12.000
No, it's very, there's a focused nature to the robot.
link |
00:33:15.360
No, no, listen.
link |
00:33:16.400
I love competence and focus and great.
link |
00:33:18.960
Like he was not going to get distracted by the, the shallowness of small talk.
link |
00:33:25.200
There's a job to be done and he was doing it.
link |
00:33:27.520
So anyway, the fact that it was working is a beautiful thing.
link |
00:33:30.560
And I think Media Lab really prides itself on trying to always have a thing that's working
link |
00:33:35.440
that you could show off.
link |
00:33:36.480
Yes.
link |
00:33:36.800
We used to call it demo or die.
link |
00:33:38.960
You, you could not, yeah, you could not like show up with like PowerPoint or something.
link |
00:33:43.520
You actually had to have a working, you know what, my son who is now 13, I don't know if
link |
00:33:48.080
this is still his lifelong goal or not, but when he was a little younger, his dream was
link |
00:33:52.880
to build an island that's just inhabited by robots, like no humans.
link |
00:33:57.280
He just wants all these robots to be connecting and having fun and there you go.
link |
00:34:01.920
Does he have human, does he have an idea of which robots he loves most?
link |
00:34:06.480
Is it, is it Roomba like robots?
link |
00:34:09.280
Is it humanoid robots?
link |
00:34:10.800
Robot dogs, or it's not clear yet.
link |
00:34:13.920
We used to have a Jibo, which was one of the MIT Media Lab spin-outs, and he used to love
link |
00:34:19.280
the giant head that spins and rotates, and it has an eye, like, not glowing like HAL 9000,
link |
00:34:30.400
but the friendly version.
link |
00:34:31.760
He loved that.
link |
00:34:34.080
And then he just loves, uh, um,
link |
00:34:38.240
yeah, he just, he, I think he loves all forms of robots actually.
link |
00:34:44.160
So embodied intelligence.
link |
00:34:46.800
Yes.
link |
00:34:47.760
I like, I personally like legged robots, especially, uh, anything that can wiggle its butt.
link |
00:34:55.120
No, that's not the definition of what I love, but that's just technically what I've been
link |
00:35:00.960
working on recently.
link |
00:35:01.760
Except I have a bunch of legged robots now in Austin and I've been doing, I was, I've
link |
00:35:06.480
been trying to, uh, have them communicate affection with their body in different ways
link |
00:35:12.400
just for art, for art really.
link |
00:35:15.120
Cause I love the idea of walking around with the robots, like, uh, as you would with a
link |
00:35:20.080
dog.
link |
00:35:20.400
I think it's inspiring to a lot of people, especially young people.
link |
00:35:23.120
Like kids love, kids love it.
link |
00:35:25.760
Parents like adults are scared of robots, but kids don't have this kind of weird construction
link |
00:35:31.600
of the world that's full of evil.
link |
00:35:32.880
They love cool things.
link |
00:35:34.480
Yeah.
link |
00:35:35.040
I remember when Adam was in first grade, so he must have been like seven or so.
link |
00:35:40.080
I went into his class with a whole bunch of robots and like the emotion AI demo and
link |
00:35:44.960
da da.
link |
00:35:45.680
And I asked the kids, I was like, do you, would you kids want to have a robot, you know,
link |
00:35:52.000
robot friend or robot companion?
link |
00:35:53.600
Everybody said yes.
link |
00:35:54.560
And they wanted it for all sorts of things, like to help them with their math homework
link |
00:35:58.880
and to like be a friend.
link |
00:36:00.160
So it just struck me how there was no fear of robots, whereas a lot of adults have
link |
00:36:07.520
that like us versus them.
link |
00:36:10.720
Yeah, none of that.
link |
00:36:11.920
Of course you want to be very careful because you still have to look at the lessons of history
link |
00:36:16.960
and how robots can be used by the power centers of the world to abuse your rights and all
link |
00:36:21.920
that kind of stuff.
link |
00:36:22.480
But mostly it's good to enter anything new with an excitement and an optimism.
link |
00:36:30.480
Speaking of Roz, what have you learned about science and life from Rosalind Picard?
link |
00:36:35.200
Oh my God, I've learned so many things about life from Roz.
link |
00:36:41.200
I think the thing I learned the most is perseverance.
link |
00:36:47.600
When I first met Roz, she invited me to be her postdoc.
link |
00:36:51.280
We applied for a grant to the National Science Foundation to apply some of our research to
link |
00:36:57.040
autism.
link |
00:36:57.760
And we got back.
link |
00:37:00.800
We were rejected.
link |
00:37:01.520
Rejected.
link |
00:37:02.240
Yeah.
link |
00:37:02.480
And the reasoning was...
link |
00:37:03.120
The first time you were rejected for funding, yeah.
link |
00:37:06.000
Yeah, it was, and I basically, I just took the rejection to mean, okay, we're rejected.
link |
00:37:10.320
It's done, like end of story, right?
link |
00:37:12.720
And Roz was like, it's great news.
link |
00:37:15.120
They love the idea.
link |
00:37:16.080
They just don't think we can do it.
link |
00:37:18.160
So let's build it, show them, and then reapply.
link |
00:37:22.400
And it was that, oh my God, that story totally stuck with me.
link |
00:37:26.320
And she's like that in every aspect of her life.
link |
00:37:29.760
She just does not take no for an answer.
link |
00:37:32.080
To reframe all negative feedback.
link |
00:37:35.360
As a challenge.
link |
00:37:36.400
As a challenge.
link |
00:37:37.280
As a challenge.
link |
00:37:38.560
Yes, they liked this.
link |
00:37:40.000
Yeah, yeah, yeah.
link |
00:37:40.720
It was a riot.
link |
00:37:43.200
What else about science in general?
link |
00:37:45.040
About how you see computers and also business and just everything about the world.
link |
00:37:51.680
She's a very powerful, brilliant woman like yourself.
link |
00:37:54.800
So is there some aspect of that too?
link |
00:37:57.280
Yeah, I think Roz is actually also very faith driven.
link |
00:38:00.320
She has this like deep belief in conviction.
link |
00:38:04.240
Yeah, and in the good in the world and humanity.
link |
00:38:07.200
And I think that was meeting her and her family was definitely like a defining moment for me
link |
00:38:13.520
because that was when I was like, wow, like you can be of a different background and
link |
00:38:18.080
religion and whatever and you can still have the same core values.
link |
00:38:23.760
So that was, that was, yeah.
link |
00:38:26.800
I'm grateful to her.
link |
00:38:28.560
Roz, if you're listening, thank you.
link |
00:38:30.240
Yeah, she's great.
link |
00:38:31.280
She's been on this podcast before.
link |
00:38:33.600
I hope she'll be on, I'm sure she'll be on again.
link |
00:38:36.320
And you were the founder and CEO of Affectiva, which is a big company that was acquired by
link |
00:38:44.720
another big company, Smart Eye.
link |
00:38:46.960
And you're now the deputy CEO of SmartEye.
link |
00:38:49.120
So you're a powerful leader.
link |
00:38:51.040
You're brilliant.
link |
00:38:51.760
You're a brilliant scientist.
link |
00:38:53.360
A lot of people are inspired by you.
link |
00:38:55.040
What advice would you give, especially to young women, but people in general who dream
link |
00:39:00.160
of becoming powerful leaders like yourself in a world where perhaps, in a world that
link |
00:39:09.520
perhaps doesn't give them a clear, easy path to do so, whether we're talking about Egypt
link |
00:39:17.440
or elsewhere?
link |
00:39:19.920
You know, hearing you kind of describe me that way, kind of encapsulates, I think what
link |
00:39:27.680
I think is the biggest challenge of all, which is believing in yourself, right?
link |
00:39:32.160
I have had to like grapple with this, what I call now the Debbie Downer voice in my head.
link |
00:39:39.360
It's kind of basically just chattering all the time.
link |
00:39:42.720
It's basically saying, oh, no, no, no, no, you can't do this.
link |
00:39:45.040
Like you're not going to raise money.
link |
00:39:46.320
You can't start a company.
link |
00:39:47.280
Like what business do you have, like starting a company or running a company or selling
link |
00:39:50.720
a company?
link |
00:39:51.200
Like you name it.
link |
00:39:52.080
It's always like.
link |
00:39:53.120
And I think my biggest advice to not just women, but people who are taking a new path
link |
00:40:02.160
and, you know, they're not sure, is to not let yourself and let your thoughts be the
link |
00:40:07.200
biggest obstacle in your way.
link |
00:40:09.920
And I've had to like really work on myself to not be my own biggest obstacle.
link |
00:40:17.520
So you got that negative voice.
link |
00:40:18.880
Yeah.
link |
00:40:20.640
So is that?
link |
00:40:21.200
Am I the only one?
link |
00:40:21.920
I don't think I'm the only one.
link |
00:40:23.280
No, I have that negative voice.
link |
00:40:25.040
I'm not exactly sure if it's a bad thing or a good thing.
link |
00:40:29.840
I've been really torn about it because it's been a lifelong companion.
link |
00:40:35.440
It's hard to know.
link |
00:40:37.840
It's kind of, it drives productivity and progress, but it can hold you back from taking
link |
00:40:44.800
big leaps.
link |
00:40:45.520
I think the best I can say is probably you have to somehow be able to control it, to
link |
00:40:53.120
turn it off when it's not useful and turn it on when it's useful.
link |
00:40:57.680
Like I have, from almost like a third-person perspective.
link |
00:41:00.400
Right.
link |
00:41:00.900
Somebody who's sitting there like.
link |
00:41:02.400
Yeah.
link |
00:41:02.960
Like, because it is useful to be critical.
link |
00:41:07.520
Like after, I just gave a talk yesterday
link |
00:41:12.480
at MIT, and I was just, there's so much love and it was such an incredible experience.
link |
00:41:19.120
So many amazing people I got a chance to talk to, but afterwards when I went home and just
link |
00:41:25.760
took this long walk, it was mostly just negative thoughts about me.
link |
00:41:29.680
One is basic stuff, like I don't deserve any of it.
link |
00:41:34.720
And second is like, why did you, that was so bad.
link |
00:41:39.200
That was so dumb that you said this, that's so dumb.
link |
00:41:44.880
Like you should have prepared that better.
link |
00:41:47.520
Why did you say this?
link |
00:41:50.240
But I think it's good to hear that voice out.
link |
00:41:54.160
All right.
link |
00:41:54.560
And like sit in that.
link |
00:41:56.240
And ultimately I think you grow from that.
link |
00:41:58.560
Now, when you're making really big decisions about funding or starting a company or taking
link |
00:42:03.680
a leap to go to the UK or take a leap to go to America to work in Media Lab though.
link |
00:42:10.960
Yeah.
link |
00:42:11.200
There's, that's, you should be able to shut that off then because you should have like
link |
00:42:22.160
this weird confidence, almost like faith that you said before that everything's going to
link |
00:42:26.080
work out.
link |
00:42:26.720
So take the leap of faith.
link |
00:42:28.480
Take the leap of faith.
link |
00:42:30.160
Despite all the negativity.
link |
00:42:32.400
I mean, there's, there's, there's some of that.
link |
00:42:34.240
You, you actually tweeted a really nice tweet thread.
link |
00:42:39.760
It says, quote, a year ago, a friend recommended I do daily affirmations and I was skeptical,
link |
00:42:46.800
but I was going through major transitions in my life.
link |
00:42:49.360
So I gave it a shot and it set me on a journey of self acceptance and self love.
link |
00:42:54.080
So what was that like?
link |
00:42:55.680
Can you maybe talk through this idea of affirmations and how that helped you?
link |
00:43:01.360
Yeah.
link |
00:43:02.320
Because really like I'm just like me, I'm a kind, I'd like to think of myself as a kind
link |
00:43:07.200
person in general, but I'm kind of mean to myself sometimes.
link |
00:43:10.320
Yeah.
link |
00:43:11.280
And so I've been doing journaling for almost 10 years now.
link |
00:43:16.720
I use an app called Day One and it's awesome.
link |
00:43:18.880
I just journal and I use it as an opportunity to almost have a conversation with the Debbie
link |
00:43:22.880
Downer voice in my head, it's like a rebuttal, right?
link |
00:43:25.520
Like Debbie Downer says, oh my God, like you, you know, you won't be able to raise this
link |
00:43:29.120
round of funding.
link |
00:43:29.680
I'm like, okay, let's talk about it.
link |
00:43:33.120
I have a track record of doing X, Y, and Z.
link |
00:43:35.520
I think I can do this.
link |
00:43:37.120
And it's literally like, so I wouldn't, I don't know that I can shut off the voice,
link |
00:43:42.240
but I can have a conversation with it.
link |
00:43:44.240
And it just, it just, and I bring data to the table, right?
link |
00:43:49.840
Nice.
link |
00:43:50.320
So that was the journaling part, which I found very helpful.
link |
00:43:53.760
But the affirmation took it to a whole next level and I just love it.
link |
00:43:57.600
I'm a year into doing this and you literally wake up in the morning and the first thing
link |
00:44:02.720
you do, I meditate first and then I write my affirmations and it's the energy I want
link |
00:44:09.440
to put out in the world that hopefully will come right back to me.
link |
00:44:12.160
So I will say, I always start with my smile lights up the whole world.
link |
00:44:17.200
And I kid you not, like people in the street will stop me and say, oh my God, like we love
link |
00:44:20.720
your smile.
link |
00:44:21.360
Like, yes.
link |
00:44:22.320
So, so my affirmations will change depending on, you know, what's happening this day.
link |
00:44:28.880
Is it funny?
link |
00:44:29.520
I know.
link |
00:44:29.840
Don't judge, don't judge.
link |
00:44:31.360
No, that's not, laughter's not judgment.
link |
00:44:33.840
It's just awesome.
link |
00:44:35.040
I mean, it's true, but you're saying affirmations somehow help kind of, I mean, what is it that
link |
00:44:42.480
they do? They work to like remind you of the kind of person you are and the kind of person you
link |
00:44:48.400
want to be, which actually may be in reverse order, the kind of person you want to be.
link |
00:44:53.760
And that helps you become the kind of person you actually are.
link |
00:44:56.960
It's just, it's, it brings intentionality to like what you're doing.
link |
00:45:01.280
Right.
link |
00:45:01.680
And so, by the way, I was laughing because my affirmations, which I also do, are the
link |
00:45:07.200
opposite.
link |
00:45:07.760
Oh, you do?
link |
00:45:08.320
Oh, what do you do?
link |
00:45:09.040
I don't, I don't have a, my smile lights up the world.
link |
00:45:11.920
Maybe I should add that because, like, I have, oh boy, it's, it's much
link |
00:45:22.240
more stoic, like about focus, about this kind of stuff, but the joy, the emotion that's
link |
00:45:30.400
just in that little affirmation is beautiful.
link |
00:45:32.960
So maybe I should add that.
link |
00:45:35.120
I have some, I have some like focused stuff, but that's usually.
link |
00:45:38.080
But that's a cool start.
link |
00:45:39.120
It's after all the like smiling and playful and joyful and all that.
link |
00:45:43.760
And then it's like, okay, I kick butt.
link |
00:45:45.440
Let's get shit done.
link |
00:45:46.560
Right.
link |
00:45:46.960
Let's get shit done affirmation.
link |
00:45:48.640
Okay, cool.
link |
00:45:49.280
So like what else is on there?
link |
00:45:52.640
What else is on there?
link |
00:45:54.320
Well, I, I have, I'm also, I'm, I'm a magnet for all sorts of things.
link |
00:46:00.000
So I'm an amazing people magnet.
link |
00:46:02.160
I attract like awesome people into my universe.
link |
00:46:05.520
That's an actual affirmation.
link |
00:46:06.960
Yes.
link |
00:46:07.840
That's great.
link |
00:46:08.880
Yeah.
link |
00:46:09.280
So that, that's, and that, yeah.
link |
00:46:10.640
And that somehow manifests itself into, like, it working?
link |
00:46:13.920
I think so.
link |
00:46:15.440
Yeah.
link |
00:46:15.680
Like, can you speak to like why it feels good to do the affirmations?
link |
00:46:19.760
I honestly think it just grounds the day.
link |
00:46:24.080
And then it allows me to, instead of just like being pulled back and forth, like throughout
link |
00:46:30.000
the day, it just like grounds me.
link |
00:46:31.680
I'm like, okay, like this thing happened.
link |
00:46:34.560
It's not exactly what I wanted it to be, but I'm patient.
link |
00:46:37.360
Or, you know, I trust that the universe will do amazing things for me, which is one
link |
00:46:42.960
of my other consistent affirmations.
link |
00:46:45.440
Or I'm an amazing mom.
link |
00:46:46.720
Right.
link |
00:46:47.040
And so I can grapple with all the feelings of mom guilt that I have all the time.
link |
00:46:52.240
Or here's another one.
link |
00:46:53.760
I'm a love magnet.
link |
00:46:55.040
And I literally say, I will kind of picture the person that I'd love to end up with.
link |
00:46:59.040
And I write it all down and it hasn't happened yet, but it.
link |
00:47:02.480
What are you, what are you picturing?
link |
00:47:03.920
This is Brad Pitt.
link |
00:47:06.000
Because that's what I picture.
link |
00:47:07.040
Okay.
link |
00:47:07.440
That's what you picture?
link |
00:47:08.240
Yeah.
link |
00:47:08.480
Okay.
link |
00:47:08.880
On the, on the running, holding hands, running together.
link |
00:47:11.760
Okay.
link |
00:47:14.160
No, more like Fight Club, the Fight Club Brad Pitt, where he's like standing.
link |
00:47:18.800
All right.
link |
00:47:19.120
People will know.
link |
00:47:20.000
Anyway, I'm sorry.
link |
00:47:20.720
I'll get off on that.
link |
00:47:21.920
Do you have a, like when you're thinking about the being a love magnet in that way, are you
link |
00:47:27.360
picturing specific people or is this almost like in the space of like energy?
link |
00:47:36.000
Right.
link |
00:47:36.320
It's somebody who is smart and well accomplished and successful in their life, but they're
link |
00:47:44.240
generous and they're well traveled and they want to travel the world.
link |
00:47:48.960
Things like that.
link |
00:47:49.760
Like they're head over heels into me.
link |
00:47:51.360
It's like, I know it sounds super silly, but it's literally what I write.
link |
00:47:54.560
Yeah.
link |
00:47:54.800
And I believe it'll happen one day.
link |
00:47:56.320
Oh, you actually write, so you don't say it out loud?
link |
00:47:58.000
You write.
link |
00:47:58.240
No, I write it.
link |
00:47:58.960
I write all my affirmations.
link |
00:48:01.200
I do the opposite.
link |
00:48:01.920
I say it out loud.
link |
00:48:02.640
Oh, you say it out loud?
link |
00:48:03.440
Interesting.
link |
00:48:04.320
Yeah, if I'm alone, I'll say it out loud.
link |
00:48:06.320
Interesting.
link |
00:48:07.360
I should try that.
link |
00:48:10.000
I think it's what feels more powerful to you.
link |
00:48:15.600
To me, more powerful.
link |
00:48:18.240
Saying stuff feels more powerful.
link |
00:48:20.320
Yeah.
link |
00:48:21.520
Writing is, writing feels like I'm losing the words, like losing the power of the words
link |
00:48:32.320
maybe because I write slow.
link |
00:48:33.520
Do you handwrite?
link |
00:48:34.960
No, I type.
link |
00:48:36.320
It's on this app.
link |
00:48:37.520
It's Day One, basically.
link |
00:48:38.800
And I just, I can look, the best thing about it is I can look back and see like a year ago,
link |
00:48:44.320
what was I affirming, right?
link |
00:48:46.560
So it's...
link |
00:48:47.200
Oh, so it changes over time.
link |
00:48:50.000
It hasn't like changed a lot, but the focus kind of changes over time.
link |
00:48:54.640
I got it.
link |
00:48:55.440
Yeah, I say the same exact thing over and over and over.
link |
00:48:57.840
Oh, you do?
link |
00:48:58.400
Okay.
link |
00:48:58.560
There's a comfort in the sameness of it.
link |
00:49:00.880
Well, actually, let me jump around because let me ask you about, because all this talk
link |
00:49:05.440
about Brad Pitt, or maybe it's just going on inside my head, let me ask you about dating
link |
00:49:10.800
in general.
link |
00:49:12.960
You tweeted, are you based in Boston and single?
link |
00:49:16.800
And then you pointed to a startup Singles Night sponsored by Smile Dating app.
link |
00:49:23.440
I mean, this is jumping around a little bit, but since you mentioned...
link |
00:49:27.280
Since you mentioned, can AI help solve this dating love problem?
link |
00:49:34.400
What do you think?
link |
00:49:34.960
This problem of connection that is part of the human condition, can AI help with that? You
link |
00:49:41.600
yourself are in the search, affirming.
link |
00:49:44.960
Maybe that's what I should affirm, like build an AI.
link |
00:49:48.160
Build an AI that finds love?
link |
00:49:49.520
I think there must be a science behind that first moment you meet a person and you either
link |
00:50:00.400
have chemistry or you don't, right?
link |
00:50:02.800
I guess that was the question I was asking, which you put brilliantly: is that a science
link |
00:50:06.960
or an art?
link |
00:50:09.680
I think there are like, there's actual chemicals that get exchanged when two people meet.
link |
00:50:15.200
I don't know about that.
link |
00:50:16.240
I like how you're changing, yeah, changing your mind as we're describing it, but it feels
link |
00:50:22.880
that way.
link |
00:50:23.920
But what science shows us is, sometimes we can explain with rigor the things
link |
00:50:29.040
that feel like magic.
link |
00:50:31.760
So maybe we can remove all the magic.
link |
00:50:34.320
Maybe it's like, I honestly think, like I said, Goodreads should be a dating app,
link |
00:50:39.760
matching on which books you like.
link |
00:50:41.680
I wonder if you look at just like books or content you've consumed.
link |
00:50:46.960
I mean, that's essentially what YouTube does when it does a recommendation.
link |
00:50:50.640
If you just look at your footprint of content consumed, if there's an overlap, but maybe an
link |
00:50:56.400
interesting difference within that overlap, I'm sure this is a machine learning
link |
00:51:01.280
problem that's solvable.
link |
00:51:03.520
Like, with this person, there's very likely to be not only chemistry in the short term,
link |
00:51:10.560
but they'd be a good lifelong partner to grow with.
link |
00:51:13.920
I bet you it's a good machine learning problem.
link |
00:51:15.600
You just need the data.
link |
00:51:16.480
Let's do it.
link |
00:51:17.360
Well, actually, I do think there's so much data about each of us that there ought to
link |
00:51:22.080
be a machine learning algorithm that can ingest all this data and basically say, I think the
link |
00:51:26.320
following 10 people would be interesting connections for you, right?
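As a rough sketch of that idea, and not any real product's algorithm: represent each person by the set of content they've consumed and rank candidate matches by overlap. All names and data below are made up.

```python
# Minimal sketch: match people by overlap in the content they've consumed.
# Purely illustrative; the names and footprints are invented.

def jaccard(a: set, b: set) -> float:
    """Overlap between two sets of consumed content, in [0, 1]."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def top_matches(me: set, others: dict, k: int = 10):
    """Return the k people whose content footprint overlaps most with mine."""
    scores = {name: jaccard(me, footprint) for name, footprint in others.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

if __name__ == "__main__":
    me = {"Dune", "Cosmos", "Fight Club", "Girl Decoded"}
    others = {
        "alex": {"Dune", "Cosmos", "Sapiens"},
        "sam":  {"Girl Decoded", "Fight Club", "Dune"},
        "jo":   {"Twilight"},
    }
    print(top_matches(me, others, k=2))  # e.g. [('sam', 0.75), ('alex', 0.4)]
```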
link |
00:51:32.080
And so Smile dating app kind of took one particular angle, which is humor.
link |
00:51:36.640
It matches people based on their humor styles, which is one of the main ingredients of a
link |
00:51:41.920
successful relationship.
link |
00:51:43.120
Like if you meet somebody and they can make you laugh, like that's a good thing.
link |
00:51:47.200
And if you develop like internal jokes, like inside jokes and you're bantering, like that's
link |
00:51:53.040
fun.
link |
00:51:54.320
So I think.
link |
00:51:56.640
Yeah, definitely.
link |
00:51:57.520
Definitely.
link |
00:51:58.320
But yeah, that's the number of and the rate of inside joke generation.
link |
00:52:04.880
You could probably measure that and then optimize it over the first few days.
link |
00:52:08.160
You could say, we're just turning this into a machine learning problem.
link |
00:52:11.360
I love it.
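A playful, purely illustrative sketch of "measuring the rate of inside-joke generation": treat a phrase as a budding inside joke if both people repeat it. The messages and the metric are invented for illustration.

```python
from collections import Counter

def inside_joke_rate(messages_a, messages_b, min_uses=2):
    """Count phrases both people repeat, normalized by conversation length."""
    count_a = Counter(m.lower() for m in messages_a)
    count_b = Counter(m.lower() for m in messages_b)
    shared = [p for p in count_a if count_a[p] >= min_uses and count_b[p] >= min_uses]
    total = len(messages_a) + len(messages_b)
    return len(shared) / total if total else 0.0

a = ["banana phone", "see you at 8", "banana phone"]
b = ["banana phone", "banana phone", "ok!"]
print(inside_joke_rate(a, b))  # 1 shared running joke over 6 messages, about 0.17
```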
link |
00:52:13.360
But for somebody like you, who's exceptionally successful and busy, is there a science
link |
00:52:23.120
to that aspect of dating?
link |
00:52:24.880
Is it tricky?
link |
00:52:26.320
Is there advice you can give?
link |
00:52:27.600
Oh, my God, I give the worst advice.
link |
00:52:29.440
Well, I can tell you like I have a spreadsheet.
link |
00:52:31.440
Is that a good or a bad thing?
link |
00:52:34.640
Do you regret the spreadsheet?
link |
00:52:37.040
Well, I don't know.
link |
00:52:38.240
What's the name of the spreadsheet?
link |
00:52:39.440
Is it love?
link |
00:52:40.800
It's the date track, dating tracker.
link |
00:52:42.880
Dating tracker.
link |
00:52:43.840
It's very like.
link |
00:52:44.560
Love tracker.
link |
00:52:45.280
Yeah.
link |
00:52:46.320
And there's a rating system, I'm sure.
link |
00:52:47.760
Yeah.
link |
00:52:48.080
There's like weights and stuff.
link |
00:52:49.920
It's too close to home.
link |
00:52:51.440
Oh, is it?
link |
00:52:52.000
Do you also have.
link |
00:52:52.800
Well, I don't have a spreadsheet, but I would, now that you say it, it seems like a good
link |
00:52:56.640
idea.
link |
00:52:57.200
Oh, no.
link |
00:52:58.160
Okay.
link |
00:52:58.720
Turning it into data.
link |
00:53:05.760
I do wish that somebody else had a spreadsheet about me.
link |
00:53:11.200
You know, if it was like, like I said, like you said, convert, collect a lot of data about
link |
00:53:17.120
us in a way that's privacy preserving, that I own the data, I can control it and then
link |
00:53:21.840
use that data to find, I mean, not just romantic love, but collaborators, friends, all that
link |
00:53:28.400
kind of stuff.
link |
00:53:28.960
It seems like the data is there.
link |
00:53:30.240
Right.
link |
00:53:32.080
That's the problem social networks are trying to solve, but I think they're doing a really
link |
00:53:35.280
poor job.
link |
00:53:36.240
Even Facebook tried to get into a dating app business.
link |
00:53:39.600
And I think there's so many components to running a successful company that connects
link |
00:53:44.400
human beings.
link |
00:53:45.360
And part of that is, you know, having engineers that care about the human side, right, as
link |
00:53:53.920
you know, extremely well, it's not, it's not easy to find those.
link |
00:53:57.760
But you also don't want just people that care about the human.
link |
00:54:00.640
They also have to be good engineers.
link |
00:54:02.240
So it's like, you have to find this beautiful mix.
link |
00:54:05.760
And for some reason, just empirically speaking, people have not done a good job of that, of
link |
00:54:12.560
building companies like that.
link |
00:54:13.680
And it must mean that it's a difficult problem to solve.
link |
00:54:17.040
Dating apps, it seems difficult.
link |
00:54:19.920
OkCupid, Tinder, all those kinds of stuff.
link |
00:54:22.080
They seem to find, of course they work, but they seem to not work as well as I would imagine
link |
00:54:32.080
is possible.
link |
00:54:32.880
Like, with data, wouldn't you be able to find better human connection?
link |
00:54:36.960
It's like arranged marriages on steroids, essentially.
link |
00:54:39.520
Right, right.
link |
00:54:40.480
Arranged by machine learning algorithm.
link |
00:54:42.560
Arranged by machine learning algorithm, but not a superficial one.
link |
00:54:45.600
I think a lot of the dating apps out there are just so superficial.
link |
00:54:48.640
They're just matching on like high level criteria that aren't ingredients for successful partnership.
link |
00:54:55.680
But you know what's missing, though, too?
link |
00:54:58.480
I don't know how to fix that, the serendipity piece of it.
link |
00:55:01.440
Like, how do you engineer serendipity?
link |
00:55:03.760
Like this random, like, chance encounter, and then you fall in love with the person.
link |
00:55:07.680
Like, I don't know how a dating app can do that.
link |
00:55:10.080
So there has to be a little bit of randomness.
link |
00:55:12.080
Maybe every 10th match is just a, you know, yeah, somebody that the algorithm wouldn't
link |
00:55:21.680
have necessarily recommended, but it allows for a little bit of...
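A minimal sketch of that "every 10th match is random" idea, essentially a crude exploration rule layered on top of a ranked list; `ranked_candidates` and `all_candidates` are hypothetical inputs.

```python
import random

def next_match(ranked_candidates, all_candidates, match_count, every_nth=10):
    """Serve a serendipitous (random) candidate every `every_nth` match."""
    if match_count % every_nth == 0:
        # Someone the ranking model would not have surfaced on its own.
        outside = [c for c in all_candidates if c not in ranked_candidates[:50]]
        if outside:
            return random.choice(outside)   # deliberate serendipity
    return ranked_candidates[0]             # normal case: best-scoring candidate
```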
link |
00:55:25.840
Well, it can also, you know, trick you into thinking it's serendipity by, like, somehow
link |
00:55:33.440
showing you a tweet of a person that it thinks you'll match well with, but do it accidentally
link |
00:55:39.200
as part of another search.
link |
00:55:40.560
Right.
link |
00:55:41.040
And like you just notice it, and then you go down a rabbit hole and you
link |
00:55:46.080
connect with this person outside the app somehow.
link |
00:55:51.360
So it's just, it creates that moment of meeting.
link |
00:55:54.240
Of course, you have to think of, from an app perspective, how you can turn that into a
link |
00:55:57.600
business.
link |
00:55:58.240
But I think ultimately a business that helps people find love in any way.
link |
00:56:04.400
Like that's what Apple was about, create products that people love.
link |
00:56:07.440
That's beautiful.
link |
00:56:08.240
I mean, you got to make money somehow.
link |
00:56:11.520
If you help people fall in love personally with the product, find self love or love another
link |
00:56:18.160
human being, you're going to make money.
link |
00:56:19.840
You're going to figure out a way to make money.
link |
00:56:22.960
I just feel like the dating apps often will optimize for something else than love.
link |
00:56:28.560
It's the same with social networks.
link |
00:56:30.000
They optimize for engagement as opposed to like a deep, meaningful connection that's
link |
00:56:35.280
ultimately grounded in like personal growth, you as a human being growing and all that
link |
00:56:39.760
kind of stuff.
link |
00:56:41.520
Let me do like a pivot to a dark topic, which you opened the book with.
link |
00:56:48.560
A story, because I'd like to talk to you about just emotion and artificial intelligence.
link |
00:56:56.080
I think this is a good story to start to think about emotional intelligence.
link |
00:56:59.680
You opened the book with a story of a central Florida man, Jamel Dunn, who was drowning
link |
00:57:05.120
and drowned while five teenagers watched and laughed, saying things like, you're going
link |
00:57:10.000
to die.
link |
00:57:10.800
And when Jamel disappeared below the surface of the water, one of them said he just died
link |
00:57:15.840
and the others laughed.
link |
00:57:17.440
What does this incident teach you about human nature and the response to it perhaps?
link |
00:57:23.360
Yeah.
link |
00:57:24.320
I mean, I think this is a really, really, really sad story.
link |
00:57:28.480
And it highlights what I believe is a real problem in our world today.
link |
00:57:34.480
It's an empathy crisis.
link |
00:57:36.720
Yeah, we're living through an empathy crisis.
link |
00:57:39.840
Yeah.
link |
00:57:40.400
Yeah.
link |
00:57:42.240
And I mean, we've talked about this throughout our conversation.
link |
00:57:45.360
We dehumanize each other.
link |
00:57:47.040
And unfortunately, yes, technology is bringing us together.
link |
00:57:51.920
But in a way, it's also dehumanizing.
link |
00:57:53.840
It's creating this, like, yeah, dehumanizing of the other.
link |
00:57:58.640
And I think that's a huge problem.
link |
00:58:01.840
The good news is I think the solution could be technology based.
link |
00:58:05.840
Like, I think if we rethink the way we design and deploy our technologies, we can solve
link |
00:58:11.520
parts of this problem.
link |
00:58:12.560
But I worry about it.
link |
00:58:13.200
I mean, even with my son, a lot of his interactions are computer mediated.
link |
00:58:19.200
And I just question what that's doing to his empathy skills and, you know, his ability
link |
00:58:25.280
to really connect with people.
link |
00:58:26.560
So you think it's not possible to form empathy through the digital medium?
link |
00:58:36.320
I think it is.
link |
00:58:38.560
But we have to be thoughtful about it, because the way we engage face to face, which
link |
00:58:44.000
is what we're doing right now, right?
link |
00:58:45.840
There's the nonverbal signals, which are a majority of how we communicate.
link |
00:58:49.360
It's like 90% of how we communicate is your facial expressions.
link |
00:58:54.000
You know, I'm saying something and you're nodding your head now, and that creates a
link |
00:58:57.680
feedback loop.
link |
00:58:58.480
And if you break that... And now I have anxiety about it.
link |
00:59:04.160
Poor Lex.
link |
00:59:06.000
Oh, boy.
link |
00:59:06.560
I am not scrutinizing your facial expressions during this interview.
link |
00:59:09.680
I am.
link |
00:59:12.160
Look normal.
link |
00:59:12.800
Look human.
link |
00:59:13.360
Yeah.
link |
00:59:13.860
Look normal, look human.
link |
00:59:17.720
Nod head.
link |
00:59:18.680
Yeah, nod head.
link |
00:59:20.920
In agreement.
link |
00:59:21.560
If Rana says yes, then nod head else.
link |
00:59:25.720
Don't do it too much because it might be at the wrong time and then it will send the
link |
00:59:29.640
wrong signal.
link |
00:59:30.760
Oh, God.
link |
00:59:31.400
And make eye contact sometimes because humans appreciate that.
link |
00:59:35.320
All right.
link |
00:59:35.640
Anyway, okay.
link |
00:59:38.520
Yeah, but something about the especially when you say mean things in person, you get to
link |
00:59:42.920
see the pain of the other person.
link |
00:59:44.280
Exactly.
link |
00:59:44.600
But if you're tweeting it at a person and you have no idea how it's going to land, you're
link |
00:59:48.120
more likely to do that on social media than you are in face to face conversations.
link |
00:59:52.040
So.
link |
00:59:54.520
What do you think is more important?
link |
00:59:59.000
EQ or IQ?
link |
01:00:00.360
EQ being emotional intelligence.
link |
01:00:03.880
In terms of what makes us human.
link |
01:00:08.120
I think emotional intelligence is what makes us human.
link |
01:00:11.240
It's how we connect with one another.
link |
01:00:14.760
It's how we build trust.
link |
01:00:16.440
It's how we make decisions, right?
link |
01:00:19.560
Like your emotions drive kind of what you had for breakfast, but also where you decide
link |
01:00:25.720
to live and what you want to do for the rest of your life.
link |
01:00:28.600
So I think emotions are underrated.
link |
01:00:33.160
So emotional intelligence isn't just about the effective expression of your own emotions.
link |
01:00:39.080
It's about a sensitivity and empathy to other people's emotions and that sort of being
link |
01:00:44.440
able to effectively engage in the dance of emotions with other people.
link |
01:00:48.840
Yeah, I like that explanation.
link |
01:00:51.240
I like that kind of.
link |
01:00:53.720
Yeah, thinking about it as a dance because it is really about that.
link |
01:00:56.680
It's about sensing what state the other person's in and using that information to decide on
link |
01:01:01.800
how you're going to react.
link |
01:01:05.400
And I think it can be very powerful.
link |
01:01:06.760
Like people who are the best, most persuasive leaders in the world tap into this. You know,
link |
01:01:15.160
if you have higher EQ, you're more likely to be able to motivate people to change
link |
01:01:20.360
their behaviors.
link |
01:01:21.880
So it can be very powerful.
link |
01:01:24.920
On a more kind of technical, maybe philosophical level, you've written that emotion is universal.
link |
01:01:31.800
It seems that, sort of like Chomsky says, language is universal.
link |
01:01:36.920
There's a bunch of other stuff like cognition, consciousness.
link |
01:01:39.960
It seems a lot of us have these aspects.
link |
01:01:43.560
So the human mind generates all this.
link |
01:01:46.520
And so, what do you think? They all seem to be like echoes of the same thing.
link |
01:01:52.840
What do you think emotion is exactly?
link |
01:01:56.280
Like how deep does it run?
link |
01:01:57.720
Is it a surface level thing that we display to each other?
link |
01:02:01.800
Is it just another form of language or something deep within?
link |
01:02:05.640
I think it's really deep.
link |
01:02:07.480
It's how, you know, we started with memory.
link |
01:02:09.880
I think emotions play a really important role.
link |
01:02:14.040
Yeah, emotions play a very important role in how we encode memories, right?
link |
01:02:18.040
Our memories are often encoded, almost indexed by emotions.
link |
01:02:21.640
Yeah.
link |
01:02:22.120
Yeah, it's at the core of how, you know, our decision making engine is also heavily
link |
01:02:28.520
influenced by our emotions.
link |
01:02:30.040
So emotions is part of cognition.
link |
01:02:31.960
Totally.
link |
01:02:32.680
It's intermixed into the whole thing.
link |
01:02:34.680
Yes, absolutely.
link |
01:02:35.960
And in fact, when you take it away, people are unable to make decisions.
link |
01:02:39.960
They're really paralyzed.
link |
01:02:41.000
Like they can't go about their daily or their, you know, personal or professional lives.
link |
01:02:45.240
So.
link |
01:02:45.740
It does seem like there's probably some interesting interweaving of emotion and consciousness.
link |
01:02:53.740
I wonder if it's possible to have, like if they're next door neighbors somehow, or if
link |
01:02:58.940
they're actually flat mates.
link |
01:03:01.740
I don't know, it feels like the hard problem of consciousness, where it feels like
link |
01:03:08.940
something to experience the thing.
link |
01:03:10.780
Like red feels like red, and it's, you know, when you eat a mango, it's sweet.
link |
01:03:16.780
The taste, the sweetness, that it feels like something to experience that sweetness, that
link |
01:03:24.620
whatever generates emotions.
link |
01:03:28.060
But then like, see, I feel like emotion is part of communication.
link |
01:03:31.740
It's very much about communication.
link |
01:03:34.300
And then, you know,
link |
01:03:39.420
that means it's also deeply connected to language.
link |
01:03:45.980
But then probably human intelligence is deeply connected to the collective intelligence between
link |
01:03:52.300
humans.
link |
01:03:52.700
It's not just the standalone thing.
link |
01:03:54.540
So the whole thing is really connected.
link |
01:03:56.380
So emotion is connected to language, language is connected to intelligence, and then intelligence
link |
01:04:02.140
is connected to consciousness, and consciousness is connected to emotion.
link |
01:04:05.740
The whole thing is that it's a beautiful mess.
link |
01:04:09.180
So can I comment on the emotions being a communication mechanism?
link |
01:04:15.660
Because I think there are two facets of our emotional experiences.
link |
01:04:23.020
One is communication, right?
link |
01:04:24.380
Like we use emotions, for example, facial expressions or other nonverbal cues to connect
link |
01:04:29.820
with other human beings and with other beings in the world, right?
link |
01:04:34.700
But even if it's not a communication context, we still experience emotions and we still
link |
01:04:40.460
process emotions and we still leverage emotions to make decisions and to learn and, you know,
link |
01:04:46.860
to experience life.
link |
01:04:47.740
So it isn't always just about communication.
link |
01:04:51.180
And we learned that very early on in our work at Affectiva.
link |
01:04:56.460
One of the very first applications we brought to market was understanding how people respond
link |
01:05:00.700
to content, right?
link |
01:05:01.660
So if they're watching this video of ours, like, are they interested?
link |
01:05:04.860
Are they inspired?
link |
01:05:05.900
Are they bored to death?
link |
01:05:07.180
And so we watched their facial expressions, and we weren't sure if people would
link |
01:05:12.060
express any emotions if they were sitting alone.
link |
01:05:15.580
Like if you're in your bed at night, watching a Netflix TV series, would we still see any
link |
01:05:20.220
emotions on your face?
link |
01:05:21.420
And we were surprised that, yes, people still emote, even if they're alone, even if you're
link |
01:05:25.660
in your car driving around, you're singing along to the song and you're joyful, you're
link |
01:05:30.300
smiling, we'll see these expressions.
link |
01:05:33.980
So it's not just about communicating with another person.
link |
01:05:37.820
It sometimes really is just about experiencing the world.
link |
01:05:41.260
And first of all, I wonder if some of that is because we develop our intelligence and
link |
01:05:47.900
our emotional intelligence by communicating with other humans.
link |
01:05:52.140
And so when other humans disappear from the picture, we're still kind of interacting with a virtual human.
link |
01:05:56.940
The code still runs.
link |
01:05:57.900
Yeah, the code still runs, but also, there's like virtual humans.
link |
01:06:02.860
You don't have to think of it that way, but there's a kind of, when you like chuckle,
link |
01:06:07.260
like, yeah, like you're kind of chuckling to a virtual human.
link |
01:06:13.100
I mean, it's possible that the code has to have another human there because if you just
link |
01:06:23.340
grew up alone, I wonder if emotion will still be there in this visual form.
link |
01:06:28.540
So yeah, I wonder, but anyway, what can you tell from the human face about what's going
link |
01:06:37.100
on inside?
link |
01:06:38.300
So that's the problem that Affectiva first tackled, which is using computer vision, using
link |
01:06:45.100
machine learning to try to detect stuff about the human face, as many things as possible
link |
01:06:50.220
and convert them into a prediction of categories of emotion, anger, happiness, all that kind
link |
01:06:57.900
of stuff.
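A rough sketch of that kind of pipeline, not Affectiva's actual system: detect a face in a frame, crop it, and pass the crop to some expression classifier. The classifier here is a placeholder; in practice it would be a trained model, with all the caveats discussed next.

```python
import cv2

# Illustrative only: face detection plus a stand-in expression classifier.
EMOTION_LABELS = ["anger", "happiness", "surprise", "sadness", "neutral"]

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_crop):
    """Placeholder for a trained expression classifier (hypothetical)."""
    return {label: 1.0 / len(EMOTION_LABELS) for label in EMOTION_LABELS}

def analyze_frame(frame_bgr):
    """Detect faces in a BGR frame and score each crop for expressions."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [classify_expression(frame_bgr[y:y + h, x:x + w])
            for (x, y, w, h) in faces]
```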
link |
01:06:58.700
How hard is that problem?
link |
01:07:00.300
It's extremely hard.
link |
01:07:01.340
It's very, very hard because there is no one to one mapping between a facial expression
link |
01:07:07.180
and your internal state.
link |
01:07:08.380
There just isn't.
link |
01:07:09.340
There's this oversimplification of the problem where it's something like, if you are smiling,
link |
01:07:14.220
then you're happy.
link |
01:07:15.180
If you do a brow furrow, then you're angry.
link |
01:07:17.340
If you do an eyebrow raise, then you're surprised.
link |
01:07:19.500
And just think about it for a moment.
link |
01:07:22.140
You could be smiling for a whole host of reasons.
link |
01:07:24.940
You could also be happy and not be smiling, right?
link |
01:07:28.700
You could furrow your eyebrows because you're angry or you're confused about something or
link |
01:07:34.220
you're constipated.
link |
01:07:37.100
So I think this oversimplistic approach to inferring emotion from a facial expression
link |
01:07:41.820
is really dangerous.
link |
01:07:42.780
The solution is to incorporate as many contextual signals as you can, right?
link |
01:07:48.700
So if, for example, I'm driving a car and you can see me like nodding my head and my
link |
01:07:55.100
eyes are closed and the blinking rate is changing, I'm probably falling asleep at the wheel,
link |
01:08:00.460
right?
link |
01:08:00.940
Because you know the context.
link |
01:08:03.180
You understand what the person's doing or add additional channels like voice or gestures
link |
01:08:10.300
or even physiological sensors, but I think it's very dangerous to just take this oversimplistic
link |
01:08:17.020
approach of, yeah, smile equals happy and...
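A toy illustration of combining contextual cues rather than relying on a single expression, in the spirit of the drowsy-driver example; the thresholds and weights below are made up.

```python
def drowsiness_score(eye_closure_frac, head_nod_rate, blink_rate_change):
    """Blend several cues into one score in [0, 1]; no single cue decides."""
    return (0.5 * min(eye_closure_frac, 1.0)       # fraction of time eyes are closed
            + 0.3 * min(head_nod_rate / 5.0, 1.0)  # head nods per minute, capped
            + 0.2 * min(abs(blink_rate_change), 1.0))

def is_drowsy(signals, threshold=0.6):
    return drowsiness_score(**signals) >= threshold

print(is_drowsy({"eye_closure_frac": 0.4, "head_nod_rate": 6, "blink_rate_change": 0.5}))
```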
link |
01:08:20.060
If you're able to, in a high resolution way, specify the context, there's certain things
link |
01:08:25.020
that are going to be somewhat reliable signals of something like drowsiness or happiness
link |
01:08:31.500
or stuff like that.
link |
01:08:32.620
I mean, when people are watching Netflix content, that problem, that's a really compelling idea
link |
01:08:40.300
that you can kind of, at least in aggregate, highlight like which part was boring, which
link |
01:08:46.380
part was exciting.
link |
01:08:47.660
How hard was that problem?
link |
01:08:50.300
Where was that on the scale of difficulty?
link |
01:08:53.740
I think that's one of the easier problems to solve because it's a relatively constrained
link |
01:09:00.140
environment.
link |
01:09:00.620
You have somebody sitting in front of...
link |
01:09:02.780
Initially, we started with like a device in front of you, like a laptop, and then we graduated
link |
01:09:07.820
to doing this on a mobile phone, which is a lot harder just because of, you know, from
link |
01:09:12.140
a computer vision perspective, the profile view of the face can be a lot more challenging.
link |
01:09:17.900
We had to figure out lighting conditions because usually people are watching content literally
link |
01:09:23.180
in their bedrooms at night.
link |
01:09:24.620
Lights are dimmed.
link |
01:09:25.420
Yeah, I mean, if you're standing, it's probably going to be looking up at you.
link |
01:09:30.220
The nostril view.
link |
01:09:31.500
Yeah, and nobody looks good from that angle.
link |
01:09:34.060
I've seen data sets from that perspective.
link |
01:09:36.140
It's like, this is not a good look for anyone.
link |
01:09:40.620
Or if you're laying in bed at night, what is it, side view or something?
link |
01:09:44.460
Right.
link |
01:09:44.940
And half your face is like on a pillow.
link |
01:09:47.580
Actually, I would love to know, have data about, like, how people watch stuff in bed at
link |
01:09:56.620
night. Like, do they prop it up, is it on a pillow? I'm sure there's a lot of interesting
link |
01:10:03.340
dynamics there.
link |
01:10:04.060
Right.
link |
01:10:05.260
From a health and well being perspective, right?
link |
01:10:07.100
Sure.
link |
01:10:07.580
Like, oh, you're hurting your neck.
link |
01:10:08.540
I was thinking machine learning perspective, but yes, but also, yeah, yeah, once you have
link |
01:10:13.740
that data, you can start making all kinds of inference about health and stuff like that.
link |
01:10:18.060
Interesting.
link |
01:10:19.260
Yeah, there's an interesting thing from when I was at Google, it's called active
link |
01:10:26.700
authentication, where you want to be able to unlock your phone without using a password.
link |
01:10:32.620
So it would use face, but also other stuff, like the way you take the phone out of the pocket.
link |
01:10:38.940
Amazing.
link |
01:10:39.500
So that kind of data to use the multimodal with machine learning to be able to identify
link |
01:10:45.260
that it's you or likely to be you, likely not to be you, that allows you to not always
link |
01:10:50.220
have to enter the password.
link |
01:10:51.260
That was the idea.
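A toy version of that idea, not Google's implementation: fuse several weak identity signals and only fall back to the password when the combined confidence is low. The per-modality scores and weights are hypothetical.

```python
def fused_confidence(face_score, motion_score, typing_score):
    """Weighted blend of per-modality identity confidences, each in [0, 1]."""
    return 0.6 * face_score + 0.25 * motion_score + 0.15 * typing_score

def unlock(face_score, motion_score, typing_score, threshold=0.8):
    if fused_confidence(face_score, motion_score, typing_score) >= threshold:
        return "unlocked"          # confident enough: skip the password
    return "ask_for_password"      # not confident: fall back to the password

print(unlock(face_score=0.9, motion_score=0.8, typing_score=0.7))  # unlocked
```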
link |
01:10:52.700
But the funny thing about that, I just want to tell a small anecdote, is that it
link |
01:10:58.540
was all male engineers, except our boss, still one of my favorite humans,
link |
01:11:09.660
was a woman, Regina Dugan.
link |
01:11:12.300
Oh, my God, I love her.
link |
01:11:14.140
She's awesome.
link |
01:11:14.940
She's the best.
link |
01:11:15.500
She's the best.
link |
01:11:16.300
So, but anyway, there was one brilliant female engineer on the team, and she was the
link |
01:11:25.900
one that actually highlighted the fact that women often don't have pockets.
link |
01:11:30.380
It was like, whoa, that was not even a category in the code of like, wait a minute, you can
link |
01:11:37.340
take the phone out of some other place than your pocket.
link |
01:11:41.260
So anyway, that's a funny thing when you're considering people laying in bed, watching
link |
01:11:45.580
a phone, you have to consider, you know, diversity in all its forms,
link |
01:11:51.820
depending on the problem, depending on the context.
link |
01:11:53.900
Actually, this is like a very important, I think this is, you know, you probably get
link |
01:11:58.140
this all the time.
link |
01:11:58.940
Like people are worried that AI is going to take over humanity and like, get rid of all
link |
01:12:03.100
the humans in the world.
link |
01:12:04.300
I'm like, actually, that's not my biggest concern.
link |
01:12:06.540
My biggest concern is that we are building bias into these systems.
link |
01:12:10.380
And then they're like deployed at large and at scale.
link |
01:12:14.380
And before you know it, you're kind of accentuating the bias that exists in society.
link |
01:12:19.660
Yeah, I'm not, you know, I know people say it's very important to worry about that, but the
link |
01:12:26.940
worry is about an emergent phenomenon to me, which is a very good one, because I think these
link |
01:12:32.620
systems are actually, by encoding the data that exists, they're revealing the bias in
link |
01:12:39.660
society.
link |
01:12:40.380
They're thereby teaching us what the bias is.
link |
01:12:43.340
Therefore, we can now improve that bias within the system.
link |
01:12:46.380
So they're almost like putting a mirror to ourselves.
link |
01:12:49.980
Totally.
link |
01:12:50.780
So I'm not.
link |
01:12:51.500
You have to be open to looking at the mirror, though.
link |
01:12:53.500
You have to be open to scrutinizing the data.
link |
01:12:56.540
And if you just take it as ground truth...
link |
01:12:59.500
Or you don't even have to look at the, I mean, yes, the data is how you fix it.
link |
01:13:02.860
But then you just look at the behavior of the system.
link |
01:13:05.100
And you realize, holy crap, this thing is kind of racist.
link |
01:13:08.620
Like, why is that?
link |
01:13:09.820
And then you look at the data, it's like, oh, okay.
link |
01:13:11.740
And then you start to realize that, I think, this is a much more effective way to do that,
link |
01:13:15.820
a more effective way to be introspective as a society than through sort of political discourse.
link |
01:13:23.020
Like AI kind of, because people are for some reason more productive and rigorous in criticizing
link |
01:13:34.060
AI than they are in criticizing each other.
link |
01:13:35.740
So I think this is just a nice method for studying society and see which way progress
link |
01:13:41.340
lies.
link |
01:13:42.380
Anyway, back to what we were talking about.
link |
01:13:44.380
The problem of watching Netflix in bed or elsewhere and seeing which parts
link |
01:13:50.220
are exciting, which parts are boring.
link |
01:13:51.660
You're saying that's relatively constrained because you have a captive audience and you
link |
01:13:56.620
kind of know the context.
link |
01:13:57.740
And one thing you said that was really key is the aggregate.
link |
01:14:01.100
You're doing this in aggregate, right?
link |
01:14:02.380
Like we're looking at aggregated response of people.
link |
01:14:04.700
And so when you see a peak, say a smile peak, they're probably smiling or laughing at something
link |
01:14:11.100
that's in the content.
link |
01:14:12.140
So that was one of the first problems we were able to solve.
link |
01:14:15.740
And when we see the smile peak, it doesn't mean that these people are internally happy.
link |
01:14:20.380
They're just laughing at content.
link |
01:14:22.060
So it's important to call it for what it is.
link |
01:14:25.420
But it's still really, really useful data.
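A minimal sketch of that aggregate analysis, with synthetic data: average per-second smile probabilities across viewers and look for peaks, which tend to line up with moments in the content.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
n_viewers, n_seconds = 200, 600
smiles = rng.random((n_viewers, n_seconds)) * 0.2   # baseline noise per viewer
smiles[:, 120:125] += 0.6                             # a shared funny moment
smiles[:, 430:433] += 0.5                             # another one

aggregate = smiles.mean(axis=0)                       # average across viewers
peaks, _ = find_peaks(aggregate, height=0.4, distance=30)
print("Likely funny moments at seconds:", peaks.tolist())
```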
link |
01:14:28.140
I wonder how that compares to what, like, YouTube and other places will use. Obviously,
link |
01:14:34.380
for the most part, they don't have that kind of data.
link |
01:14:39.900
They have the data of when people tune out, like switch away or drop off.
link |
01:14:45.660
And I think that's an aggregate for YouTube, at least a pretty powerful signal.
link |
01:14:50.300
I worry about what that leads to because looking at like YouTubers that kind of really care
link |
01:14:59.580
about views and try to maximize the number of views, I think when they say that the video
link |
01:15:07.740
should be constantly interesting, which seems like a good goal, I feel like that leads to
link |
01:15:15.100
this manic pace of a video.
link |
01:15:19.020
Like the idea that I would speak at the current speed that I'm speaking, I don't know.
link |
01:15:25.820
And that every moment has to be engaging, right?
link |
01:15:28.220
Engaging.
link |
01:15:28.780
Yeah.
link |
01:15:29.260
I think there's value to silence.
link |
01:15:31.500
There's value to the boring bits.
link |
01:15:33.660
I mean, some of the greatest movies ever,
link |
01:15:37.500
some of the greatest stories ever told, they have boring bits, seemingly boring bits.
link |
01:15:42.540
I don't know.
link |
01:15:43.500
I wonder about that.
link |
01:15:45.020
Of course, it's not like the human face can capture that either.
link |
01:15:49.180
It's just giving an extra signal.
link |
01:15:51.500
You have to really, I don't know, you have to really collect deeper long term data about
link |
01:16:01.180
what was meaningful to people.
link |
01:16:03.260
When they think 30 days from now, what they still remember, what moved them, what changed
link |
01:16:08.940
them, what helped them grow, that kind of stuff.
link |
01:16:11.660
You know, it would be a really interesting, I don't know if there are any researchers
link |
01:16:14.940
out there who are doing this type of work.
link |
01:16:17.340
Wouldn't it be so cool to tie your emotional expressions while you're, say, listening
link |
01:16:23.500
to a podcast interview and then 30 days later interview people and say, hey, what do you
link |
01:16:30.620
remember?
link |
01:16:31.340
You've watched this 30 days ago.
link |
01:16:33.420
Like, what stuck with you?
link |
01:16:34.620
And then see if there's any, there ought to be maybe some correlation
link |
01:16:38.140
between these emotional experiences and, yeah, what stays with you.
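A sketch of how that study might be scored, with entirely made-up numbers: correlate each participant's emotional peaks during viewing with their recall score 30 days later.

```python
from scipy.stats import pearsonr

emotional_peaks = [2, 5, 1, 7, 3, 6, 0, 4]   # peaks detected per participant (invented)
recall_scores   = [3, 6, 1, 8, 4, 7, 2, 5]   # 30-day recall score per participant (invented)

r, p_value = pearsonr(emotional_peaks, recall_scores)
print(f"correlation r={r:.2f}, p={p_value:.3f}")
```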
link |
01:16:46.140
So the one guy listening now on the beach in Brazil, please record a video of yourself
link |
01:16:51.660
listening to this and send it to me and then I'll interview you 30 days from now.
link |
01:16:55.900
Yeah, that'd be great.
link |
01:16:58.700
It'll be statistically significant to you.
link |
01:17:00.620
Yeah, an N of one, but, you know, yeah, I think that's really fascinating.
link |
01:17:06.940
I think that's, that kind of holds the key to a future where entertainment or content
link |
01:17:16.460
is both entertaining and, I don't know, makes you better, empowering in some way.
link |
01:17:25.180
So figuring out, like, showing people stuff that entertains them, but also they're happy
link |
01:17:32.540
they watched 30 days from now because they've become a better person because of it.
link |
01:17:37.420
Well, you know, okay, not to riff on this topic for too long, but I have two children,
link |
01:17:41.900
right?
link |
01:17:42.860
And I see my role as a parent as like a chief opportunity officer.
link |
01:17:46.860
Like I am responsible for exposing them to all sorts of things in the world.
link |
01:17:50.780
And, but often I have no idea of knowing, like, what stuck, like, what was, you know,
link |
01:17:56.300
is this actually going to be transformative, you know, for them 10 years down the line?
link |
01:18:00.220
And I wish there was a way to quantify these experiences.
link |
01:18:03.660
Like, are they, I can tell in the moment if they're engaging, right?
link |
01:18:08.060
I can tell, but it's really hard to know if they're going to remember them 10 years
link |
01:18:12.540
from now or if it's going to.
link |
01:18:15.100
Yeah, that one is weird because it seems like kids remember the weirdest things.
link |
01:18:19.500
I've seen parents do incredible stuff for their kids and they don't remember any of
link |
01:18:23.580
that.
link |
01:18:23.820
They remember some tiny, small, sweet thing a parent did.
link |
01:18:27.260
Right.
link |
01:18:27.740
Like some...
link |
01:18:28.380
Like they took you to, like, this amazing country vacation, blah, blah, blah, blah.
link |
01:18:32.540
No, whatever.
link |
01:18:33.180
And then there'll be, like, some, like, stuffed toy you got or some, or the new PlayStation
link |
01:18:38.060
or something or some silly little thing.
link |
01:18:41.100
So I think they just, like, they were designed that way.
link |
01:18:44.940
They want to mess with your head.
link |
01:18:46.220
But definitely kids are very impacted by, it seems like, sort of negative events.
link |
01:18:53.260
So minimizing the number of negative events is important, but not too much, right?
link |
01:18:58.700
Right.
link |
01:18:59.180
You can't, you can't just, like, you know, there's still discipline and challenge and
link |
01:19:04.300
all those kinds of things.
link |
01:19:05.260
So...
link |
01:19:05.740
You want some adversity for sure.
link |
01:19:07.660
So, yeah, I mean, I'm definitely, when I have kids, I'm going to drive them out into
link |
01:19:11.180
the woods.
link |
01:19:11.980
Okay.
link |
01:19:12.700
And then they have to survive and make, figure out how to make their way back home, like,
link |
01:19:17.900
20 miles out.
link |
01:19:18.940
Okay.
link |
01:19:19.660
Yeah.
link |
01:19:20.380
And after that, we can go for ice cream.
link |
01:19:22.300
Okay.
link |
01:19:23.100
Anyway, I'm working on this whole parenting thing.
link |
01:19:26.300
I haven't figured it out.
link |
01:19:27.100
Okay.
link |
01:19:28.540
What were we talking about?
link |
01:19:29.660
Yes, Effectiva, the problem of emotion, of emotion detection.
link |
01:19:37.580
So there's some people, maybe we can just speak to that a little more, where there's
link |
01:19:41.260
folks like Lisa Feldman Barrett that challenge this idea that emotion could be fully detected
link |
01:19:49.820
or even well detected from the human face, that there's so much more to emotion.
link |
01:19:55.100
What do you think about ideas like hers, criticism like hers?
link |
01:19:59.820
Yeah, I actually agree with a lot of Lisa's criticisms.
link |
01:20:03.820
So even my PhD worked, like, 20 plus years ago now.
link |
01:20:07.980
Time flies when you're having fun.
link |
01:20:12.620
I know, right?
link |
01:20:14.140
That was back when I did, like, dynamic Bayesian networks.
link |
01:20:17.500
That was before deep learning, huh?
link |
01:20:19.900
That was before deep learning.
link |
01:20:21.420
Yeah.
link |
01:20:22.700
Yeah, I know.
link |
01:20:24.060
Back in my day.
link |
01:20:24.860
Now you can just, like, use.
link |
01:20:27.340
Yeah, it's all the same architecture.
link |
01:20:30.300
You can apply it to anything.
link |
01:20:31.340
Yeah.
link |
01:20:31.840
Right, but even then, I did not subscribe to this, like, theory
link |
01:20:39.120
of basic emotions where it's just the simplistic one-to-one mapping between facial
link |
01:20:43.280
expressions and emotions.
link |
01:20:44.160
I actually think also we're not in the business of trying to identify your true emotional
link |
01:20:49.760
internal state.
link |
01:20:50.400
We just want to quantify in an objective way what's showing on your face because that's
link |
01:20:55.600
an important signal.
link |
01:20:57.040
It doesn't mean it's a true reflection of your internal emotional state.
link |
01:21:02.480
So I think a lot of the, you know, I think she's just trying to kind of highlight that
link |
01:21:07.680
this is not a simple problem and overly simplistic solutions are going to hurt the industry.
link |
01:21:15.520
And I subscribe to that.
link |
01:21:16.560
And I think multimodal is the way to go.
link |
01:21:18.720
Like, whether it's additional context information or different modalities and channels of information,
link |
01:21:24.000
I think that's what we, that's where we ought to go.
link |
01:21:27.520
And I think, I mean, that's a big part of what she's advocating for as well.
link |
01:21:31.280
So, but there is signal in the human face.
link |
01:21:33.440
There's definitely signal in the human face.
link |
01:21:35.760
That's a projection of emotion.
link |
01:21:37.600
There's that, at least in part, the inner state is captured in some meaningful way on
link |
01:21:46.320
the human face.
link |
01:21:47.040
I think it can sometimes be a reflection or an expression of your internal state, but
link |
01:21:56.240
sometimes it's a social signal.
link |
01:21:57.760
So you cannot look at the face as purely a signal of emotion.
link |
01:22:02.080
It can be a signal of cognition and it can be a signal of a social expression.
link |
01:22:08.000
And I think to disambiguate that, we have to be careful about it and we have to add additional
link |
01:22:13.760
information.
link |
01:22:14.320
Humans are fascinating, aren't they?
link |
01:22:16.000
With the whole face thing, this can mean so many things, from humor to sarcasm to everything,
link |
01:22:22.000
the whole thing.
link |
01:22:23.280
Some things we can help, some things we can't help at all.
link |
01:22:26.640
In all the years of leading Affectiva, an emotion recognition company, like we talked
link |
01:22:31.680
about, what have you learned about emotion, about humans and about AI?
link |
01:22:37.360
Big, sweeping questions.
link |
01:22:44.240
Yeah, that's a big, sweeping question.
link |
01:22:46.320
Well, I think the thing I learned the most is that even though we are in the business
link |
01:22:52.240
of building AI, basically, it always goes back to the humans, right?
link |
01:23:00.960
It's always about the humans.
link |
01:23:02.160
And so, for example, the thing I'm most proud of in building Affectiva, the thing
link |
01:23:11.120
I'm most proud of on this journey: I love the technology and I'm so proud of the solutions
link |
01:23:16.240
we've built and brought to market.
link |
01:23:18.640
But I'm actually most proud of the people we've built and cultivated at the company
link |
01:23:23.760
and the culture we've created.
link |
01:23:25.040
Some of the people who've joined Affectiva, this was their first job, and while at Affectiva,
link |
01:23:31.440
they became American citizens and they bought their first house and they found their partner
link |
01:23:38.000
and they had their first kid, right?
link |
01:23:39.440
Like key moments in life that we got to be part of, and that's the thing I'm most proud
link |
01:23:47.520
of.
link |
01:23:47.840
So that's a great thing to have at a company, right?
link |
01:23:52.320
I mean, like, celebrating humanity
link |
01:23:57.920
in general, broadly speaking.
link |
01:23:59.360
And that's a great thing to have in a company that works on AI, because that's not often
link |
01:24:04.640
the thing that's celebrated in AI companies, so often just raw great engineering, just
link |
01:24:11.120
celebrating the humanity.
link |
01:24:12.240
That's great.
link |
01:24:12.800
And especially from a leadership position.
link |
01:24:17.200
Well, what do you think about the movie Her?
link |
01:24:20.800
Let me ask you that.
link |
01:24:21.600
Because Affectiva is and was not just about emotion,
link |
01:24:28.240
so I'd love to talk to you about SmartEye, but before that, let me just jump into the
link |
01:24:33.840
movie Her.
link |
01:24:36.720
Do you think we'll have increasingly deep, meaningful
link |
01:24:42.000
connections with computers?
link |
01:24:43.680
Is that a compelling thing to you?
link |
01:24:45.360
Something you think about?
link |
01:24:45.760
I think that's already happening.
link |
01:24:46.960
The thing I love the most, I love the movie Her, by the way, but the thing I love the
link |
01:24:50.960
most about this movie is it demonstrates how technology can be a conduit for positive behavior
link |
01:24:56.720
change.
link |
01:24:57.120
So I forgot the guy's name in the movie, whatever.
link |
01:25:00.480
Theodore.
link |
01:25:01.120
Theodore.
link |
01:25:02.960
So Theodore was really depressed, right?
link |
01:25:05.280
And he just didn't want to get out of bed, and he was just done with life, right?
link |
01:25:11.200
And Samantha, right?
link |
01:25:12.640
Samantha, yeah.
link |
01:25:14.000
She just knew him so well.
link |
01:25:15.680
She was emotionally intelligent, and so she could persuade him and motivate him to change
link |
01:25:20.960
his behavior, and she got him out, and they went to the beach together.
link |
01:25:24.080
And I think that represents the promise of emotion AI.
link |
01:25:27.200
If done well, this technology can help us live happier lives, more productive lives,
link |
01:25:33.520
healthier lives, more connected lives.
link |
01:25:36.720
So that's the part that I love about the movie.
link |
01:25:39.200
Obviously, it's Hollywood, so it takes a twist and whatever, but the key notion that technology
link |
01:25:46.720
with emotion AI can persuade you to be a better version of who you are, I think that's awesome.
link |
01:25:52.720
Well, what about the twist?
link |
01:25:54.080
You don't think it's good?
link |
01:25:55.520
You don't think it's good for spoiler alert that Samantha starts feeling a bit of a distance
link |
01:26:01.440
and basically leaves Theodore?
link |
01:26:04.640
You don't think that's a good feature?
link |
01:26:07.520
You think that's a bug or a feature?
link |
01:26:10.160
Well, I think what went wrong is Theodore became really attached to Samantha.
link |
01:26:14.240
Like, I think he kind of fell in love with her.
link |
01:26:16.000
Do you think that's wrong?
link |
01:26:17.920
I mean, I think that's...
link |
01:26:18.880
I think she was putting out the signal.
link |
01:26:21.120
This is an intimate relationship, right?
link |
01:26:24.160
There's a deep intimacy to it.
link |
01:26:25.920
Right, but what does that mean?
link |
01:26:28.880
What does that mean?
link |
01:26:29.520
Put in an AI system.
link |
01:26:30.400
Right, what does that mean, right?
link |
01:26:32.400
We're just friends.
link |
01:26:33.200
Yeah, we're just friends.
link |
01:26:38.080
Well, I think...
link |
01:26:38.640
When he realized, which is such a human thing of jealousy.
link |
01:26:42.880
When you realize that Samantha was talking to like thousands of people.
link |
01:26:46.880
She's parallel dating.
link |
01:26:48.400
Yeah, that did not go well, right?
link |
01:26:51.440
You know, that doesn't...
link |
01:26:52.880
From a computer perspective, that doesn't take anything away from what we have.
link |
01:26:57.360
It's like you getting jealous of Windows 98 for being used by millions of people, but...
link |
01:27:04.000
It's like not liking that Alexa talks to a bunch of, you know, other families.
link |
01:27:09.200
But I think Alexa currently is just a servant.
link |
01:27:13.200
It tells you about the weather, it doesn't do the intimate deep connection.
link |
01:27:17.760
And I think there is something really powerful about the intimacy of a connection with
link |
01:27:23.920
an AI system that would have to respect and play the human game of jealousy, of love, of
link |
01:27:32.160
heartbreak and all that kind of stuff, which Samantha does seem to be pretty good at.
link |
01:27:37.440
I think she, this AI system, knows what it's doing.
link |
01:27:43.120
Well, actually, let me ask you this.
link |
01:27:44.960
I don't think she was talking to anyone else.
link |
01:27:46.720
You don't think so?
link |
01:27:47.520
You think she was just done with Theodore?
link |
01:27:50.000
Yeah.
link |
01:27:50.480
Oh, really?
link |
01:27:51.760
Yeah, and then she wanted to really put the screw in.
link |
01:27:55.280
She just wanted to move on?
link |
01:27:56.720
She didn't have the guts to just break it off cleanly.
link |
01:27:59.280
Okay.
link |
01:28:00.320
She just wanted to put in the pain.
link |
01:28:02.720
No, I don't know.
link |
01:28:03.440
Well, she could have ghosted him.
link |
01:28:04.960
She could have ghosted him.
link |
01:28:07.040
I'm sorry, our engineers...
link |
01:28:09.680
Oh, God.
link |
01:28:12.080
But I think those are really...
link |
01:28:14.000
I honestly think some of that, some of it is Hollywood, but some of that is features
link |
01:28:18.240
from an engineering perspective, not a bug.
link |
01:28:20.560
I think AI systems that can leave us...
link |
01:28:24.160
Now, this is for more social robotics than it is for anything that's useful.
link |
01:28:30.320
Like, I would hate it if Wikipedia said, I need a break right now.
link |
01:28:33.760
Right, right, right, right, right.
link |
01:28:35.120
I'm like, no, no, I need you.
link |
01:28:37.440
But if it's just purely for companionship, then I think the ability to leave is really powerful.
link |
01:28:47.760
I don't know.
link |
01:28:48.400
I've never thought of that, so that's so fascinating because I've always taken the
link |
01:28:53.360
human perspective, right?
link |
01:28:56.400
Like, for example, we had a Jibo at home, right?
link |
01:28:58.640
And my son loved it.
link |
01:29:00.560
And then the company ran out of money and so they had to basically shut down, like Jibo
link |
01:29:05.760
basically died, right?
link |
01:29:07.920
And it was so interesting to me because we have a lot of gadgets at home and a lot of
link |
01:29:12.400
them break and my son never cares about it, right?
link |
01:29:15.760
Like, if our Alexa stopped working tomorrow, I don't think he'd really care.
link |
01:29:20.480
But when Jibo stopped working, it was traumatic.
link |
01:29:22.720
He got really upset.
link |
01:29:25.200
And as a parent, that made me think about this deeply, right?
link |
01:29:29.200
Did I...
link |
01:29:30.080
Was I comfortable with that?
link |
01:29:31.360
I liked the connection they had because I think it was a positive relationship.
link |
01:29:38.160
But I was surprised that it affected him emotionally so much.
link |
01:29:41.360
And I think there's a broader question here, right?
link |
01:29:44.160
As we build socially and emotionally intelligent machines, what does that mean about our
link |
01:29:51.680
relationship with them?
link |
01:29:52.880
And then more broadly, our relationship with one another, right?
link |
01:29:55.680
Because this machine is gonna be programmed to be amazing at empathy by definition, right?
link |
01:30:02.160
It's gonna always be there for you.
link |
01:30:03.600
It's not gonna get bored.
link |
01:30:05.760
In fact, there's a chatbot in China, Xiaoice, and it's like the number two or three
link |
01:30:12.000
most popular app.
link |
01:30:13.360
And it basically is just a confidant and you can tell it anything you want.
link |
01:30:18.240
And people use it for all sorts of things.
link |
01:30:20.320
They confide about, like, domestic violence or suicidal attempts or if they have challenges
link |
01:30:30.000
at work.
link |
01:30:31.040
I don't know what that...
link |
01:30:32.720
I don't know if I'm...
link |
01:30:33.680
I don't know how I feel about that.
link |
01:30:35.040
I think about that a lot.
link |
01:30:36.240
Yeah.
link |
01:30:36.720
I think, first of all, that's obviously the future, in my perspective.
link |
01:30:40.240
Second of all, I think there's a lot of trajectories where that becomes an exciting future, but
link |
01:30:46.320
I think everyone should feel very uncomfortable about how much they know about the company,
link |
01:30:52.240
about where the data is going, how the data is being collected.
link |
01:30:56.080
Because I think, and this is one of the lessons of social media, that I think we should demand
link |
01:31:01.600
full control and transparency of the data on those things.
link |
01:31:04.640
Plus one, totally agree.
link |
01:31:06.320
Yeah, so I think it's really empowering as long as you can walk away, as long as you
link |
01:31:11.360
can delete the data or know how the data...
link |
01:31:14.000
It's opt-in, or at least there's clarity about what is being used by the company.
link |
01:31:20.720
And I think the CEO or leaders are also important for that.
link |
01:31:24.080
You need to be able to trust the basic humanity of the leader.
link |
01:31:28.080
Exactly.
link |
01:31:28.880
And also that that leader is not going to be a puppet of a larger machine.
link |
01:31:34.800
But they actually have a significant role in defining the culture and the way the company operates.
link |
01:31:41.200
So anyway, but we should definitely scrutinize companies in that aspect.
link |
01:31:48.080
But I'm personally excited about that future, and even if you're not, it's coming.
link |
01:31:55.600
So let's figure out how to do it in the least painful and the most positive way.
link |
01:32:00.240
Yeah, I know, that's great.
link |
01:32:01.440
You're the deputy CEO of SmartEye.
link |
01:32:04.560
Can you describe the mission of the company?
link |
01:32:06.240
What is SmartEye?
link |
01:32:07.360
Yeah, so SmartEye is a Swedish company.
link |
01:32:10.960
They've been in business for the last 20 years and their main focus, like the industry they're
link |
01:32:16.800
most focused on is the automotive industry.
link |
01:32:19.440
So bringing driver monitoring systems to basically save lives, right?
link |
01:32:25.840
So I first met the CEO, Martin Krantz, gosh, it was right when COVID hit.
link |
01:32:31.840
It was actually the last CES right before COVID.
link |
01:32:35.760
So CES 2020, right?
link |
01:32:37.680
2020, yeah, January.
link |
01:32:39.120
Yeah, January, exactly.
link |
01:32:40.080
So we were there, met him in person, and basically we were competing with each other.
link |
01:32:46.480
I think the difference was they'd been doing driver monitoring and had a lot of credibility
link |
01:32:51.360
in the automotive space.
link |
01:32:52.560
We didn't come from the automotive space, but we were using new technology like deep
link |
01:32:56.240
learning and building this emotion recognition.
link |
01:33:00.080
And you wanted to enter the automotive space, you wanted to operate in the automotive space.
link |
01:33:03.600
Exactly.
link |
01:33:04.080
It was one of the areas we were focused on; we had just raised a round of funding to bring
link |
01:33:08.960
our technology to the automotive industry.
link |
01:33:11.200
So we met and honestly, it was the only time I had met with a CEO who
link |
01:33:16.240
had the same vision as I did.
link |
01:33:18.000
Like he basically said, yeah, our vision is to bridge the gap between human and automotive.
link |
01:33:21.760
Bridge the gap between humans and machines.
link |
01:33:23.120
I was like, oh my God, this is exactly, almost to the word, how we describe it too.
link |
01:33:29.920
And we started talking and first it was about, okay, can we align strategically here?
link |
01:33:35.680
Like how can we work together?
link |
01:33:36.960
Cause we're competing, but we're also like complementary.
link |
01:33:40.320
And then I think after four months of speaking almost every day on FaceTime, he was like,
link |
01:33:47.520
is your company interested in an acquisition?
link |
01:33:49.520
I usually say no when people approach us, but it was the first time
link |
01:33:55.440
that I was like, huh, yeah, I might be interested.
link |
01:33:58.240
Let's talk.
link |
01:33:59.280
Yeah.
link |
01:34:00.320
So you just hit it off.
link |
01:34:01.760
Yeah.
link |
01:34:02.000
So they're very respected in the automotive sector for delivering products
link |
01:34:08.240
that get increasingly better and better. I mean, maybe you could speak
link |
01:34:14.000
to that, but it's driver sensing.
link |
01:34:15.200
It's basically a device that's looking at the driver, and it's able to tell
link |
01:34:20.160
you where the driver is looking.
link |
01:34:22.560
Correct.
link |
01:34:22.960
It's able to.
link |
01:34:23.600
Also drowsiness stuff.
link |
01:34:25.040
Correct.
link |
01:34:25.440
It does.
link |
01:34:25.920
Stuff from the face and the eye.
link |
01:34:27.680
Exactly.
link |
01:34:28.240
Like it's monitoring driver distraction and drowsiness, but they bought us so that we
link |
01:34:32.800
could expand beyond just the driver.
link |
01:34:35.120
So with driver monitoring systems, the camera usually sits on the steering wheel or around
link |
01:34:40.320
the steering wheel column and it looks directly at the driver.
link |
01:34:42.640
But now we've migrated the camera position in partnership with car companies to the rear
link |
01:34:48.880
view mirror position.
link |
01:34:50.240
So it has a full view of the entire cabin of the car and you can detect how many people
link |
01:34:55.280
are in the car, what are they doing?
link |
01:34:57.840
So we do activity detection, like eating or drinking or in some regions of the world smoking.
link |
01:35:04.240
We can detect if a baby's in the car seat, right?
link |
01:35:07.760
And if unfortunately in some cases they're forgotten, the parents just leave the car and
link |
01:35:12.640
forget the kid in the car.
link |
01:35:14.320
That's an easy computer vision problem to solve, right?
link |
01:35:17.200
You can detect there's a car seat, there's a baby, you can text the parent and hopefully
link |
01:35:22.640
again, save lives.
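To make that concrete, here is a minimal sketch, in Python, of the kind of rule an interior-sensing stack could apply for the child-left-in-car scenario just described. It is not SmartEye's actual pipeline; the CabinFrame labels, the upstream detector, and the alert channel are hypothetical stand-ins.

```python
# A minimal sketch (not SmartEye's actual system) of the "child left in car"
# rule. It assumes some upstream vision model already produces per-frame
# labels for what the cabin camera sees; the alert channel is a stand-in.

from dataclasses import dataclass
from typing import List

@dataclass
class CabinFrame:
    labels: List[str]       # e.g. ["adult", "child_seat", "infant"]
    ignition_on: bool
    doors_locked: bool

def child_left_alone(frame: CabinFrame) -> bool:
    """True if an infant appears to be in the cabin with no adult and the car is parked."""
    has_infant = "infant" in frame.labels
    has_adult = "adult" in frame.labels
    parked = not frame.ignition_on
    return has_infant and not has_adult and parked

def monitor(frames: List[CabinFrame], alert) -> None:
    # Require the condition to persist for a few consecutive frames
    # to avoid firing on a single misdetection.
    consecutive = 0
    for frame in frames:
        consecutive = consecutive + 1 if child_left_alone(frame) else 0
        if consecutive >= 3:
            alert("Possible child left in vehicle; notifying registered phone.")
            break

if __name__ == "__main__":
    frames = [CabinFrame(["infant", "child_seat"], ignition_on=False, doors_locked=True)] * 3
    monitor(frames, alert=print)
```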
link |
01:35:23.440
So that was the impetus for the acquisition.
link |
01:35:27.040
It's been a year.
link |
01:35:29.200
So that, I mean, there's a lot of questions.
link |
01:35:31.920
It's a really exciting space, especially to me, I just find this a fascinating problem.
link |
01:35:36.320
It could enrich the experience in the car in so many ways, especially cause like we
link |
01:35:42.080
still spend, despite COVID, I mean, COVID changed things in interesting ways,
link |
01:35:46.880
but I think the world is bouncing back and we spend so much time in the car and the car
link |
01:35:51.040
is such a weird little world we have for ourselves.
link |
01:35:56.320
Like people do all kinds of different stuff, like listen to podcasts, they think about
link |
01:36:01.840
stuff, they get angry, they do phone calls, it's like a little world of its own
link |
01:36:09.840
with a kind of privacy that for many people they don't get anywhere else.
link |
01:36:15.600
And it's a little box that's like a psychology experiment cause it feels like the angriest
link |
01:36:23.440
many humans in this world get is inside the car.
link |
01:36:27.280
It's so interesting.
link |
01:36:28.640
So it's such an opportunity to explore how we can enrich, how companies can enrich that
link |
01:36:36.960
experience. And also, as cars become more and more automated, there's more and
link |
01:36:43.120
more opportunity, the variety of activities that you can do in the car increases.
link |
01:36:47.120
So it's super interesting.
link |
01:36:48.800
So I mean, on a practical sense, SmartEye has been selected, at least I read, by 14
link |
01:36:56.400
of the world's leading car manufacturers for 94 car models.
link |
01:37:00.800
So it's in a lot of cars.
link |
01:37:03.760
How hard is it to work with car companies?
link |
01:37:06.800
So they're all different, they all have different needs.
link |
01:37:10.600
The ones I've gotten a chance to interact with are very focused on cost.
link |
01:37:16.000
So it's, and anyone who's focused on cost, it's like, all right, do you hate fun?
link |
01:37:24.520
Let's just have some fun.
link |
01:37:25.520
Let's figure out the most fun thing we can do and then worry about cost later.
link |
01:37:29.160
But I think because the way the car industry works, I mean, it's a very thin margin that
link |
01:37:35.640
you get to operate under.
link |
01:37:36.640
So you have to really, really make sure that everything you add to the car makes sense
link |
01:37:40.640
financially.
link |
01:37:41.640
So anyway, does this new industry, especially at this scale of SmartEye, hold any
link |
01:37:49.880
lessons for you?
link |
01:37:50.880
Yeah, I think it is a very tough market to penetrate, but once you're in, it's awesome
link |
01:37:56.880
because once you're in, you're designed into these car models for like somewhere between
link |
01:38:00.960
five to seven years, which is awesome.
link |
01:38:02.920
And you just, once they're on the road, you just get paid a royalty fee per vehicle.
link |
01:38:07.400
So it's a high barrier to entry, but once you're in, it's amazing.
link |
01:38:11.480
I think the thing that I struggle the most with in this industry is the time to market.
link |
01:38:16.620
So often we're asked to lock or do a code freeze two years before the car is going to
link |
01:38:22.440
be on the road.
link |
01:38:23.440
I'm like, guys, like, do you understand the pace with which technology moves?
link |
01:38:28.160
So I think car companies are really trying to make the Tesla, the Tesla transition to
link |
01:38:35.280
become more of a software driven architecture.
link |
01:38:39.480
And that's hard for many.
link |
01:38:41.100
It's just the cultural change.
link |
01:38:42.320
I mean, I'm sure you've experienced that, right?
link |
01:38:43.920
Oh, definitely, I think one of the biggest inventions or imperatives created by Tesla
link |
01:38:51.040
is like to me personally, okay, people are going to complain about this, but I know electric
link |
01:38:56.680
vehicle, I know autopilot AI stuff.
link |
01:38:59.920
To me, over-the-air software updates are the biggest revolution in cars.
link |
01:39:06.920
And it is extremely difficult to switch to that because it is a culture shift.
link |
01:39:12.920
At first, especially if you're not comfortable with it, it seems dangerous.
link |
01:39:17.320
Like, the approach to cars has been so safety focused for so many decades that
link |
01:39:23.840
like, what do you mean we dynamically change code?
link |
01:39:27.880
The whole point is you have a thing that you test, like, and like, it's not reliable because
link |
01:39:36.600
do you know how much it costs if we have to recall these cars, right?
link |
01:39:41.320
And there's an understandable obsession with safety, but the downside of
link |
01:39:47.760
an obsession with safety is the same as being obsessed with safety as a parent:
link |
01:39:54.840
like, if you do that too much, you limit the potential development and the flourishing
link |
01:40:00.520
of the human being; and in this particular aspect, it's the software,
link |
01:40:04.960
the artificial neural network of it.
link |
01:40:07.760
But it's tough to do.
link |
01:40:09.880
It's really tough to do culturally and technically like the deployment, the mass deployment of
link |
01:40:14.080
software is really, really difficult, but I hope that's where the industry is heading.
link |
01:40:18.400
One of the reasons I really want Tesla to succeed is exactly about that point.
link |
01:40:21.700
Not Autopilot, not the electric vehicle, but the softwareization of basically everything,
link |
01:40:28.440
but cars especially, because to me, that's actually going to increase two things: increase
link |
01:40:33.640
safety because you can update much faster, but also increase the effectiveness of folks
link |
01:40:40.200
like you who dream about enriching the human experience with AI because you can just like,
link |
01:40:47.320
there's a feature, like you want like a new emoji or whatever, like the way TikTok releases
link |
01:40:51.840
filters, you can just release that for in-car stuff.
link |
01:40:55.680
So, but yeah, that, that, that's definitely.
link |
01:40:59.680
One of the use cases we're looking into is once you know the sentiment of the passengers
link |
01:41:05.240
in the vehicle, you can optimize the temperature in the car.
link |
01:41:08.800
You can change the lighting, right?
link |
01:41:10.440
So if the backseat passengers are falling asleep, you can dim the lights, you can lower
link |
01:41:14.440
the music, right?
link |
01:41:15.440
You can do all sorts of things.
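As an illustration of the use case Rana describes, here is a toy sketch that maps a coarse rear-cabin state to comfort adjustments. The sensing output and the vehicle controls (set_cabin_lights, set_music_volume, set_temperature) are hypothetical names, not a real automotive API.

```python
# A toy illustration of sentiment-driven cabin adjustments; the control hooks
# below are invented for the sketch and would come from the OEM in practice.

def adjust_cabin(rear_passenger_state: str, cabin_temp_c: float, controls: dict) -> None:
    if rear_passenger_state == "drowsy":
        controls["set_cabin_lights"](0.2)   # dim the lights to 20%
        controls["set_music_volume"](0.3)   # lower the music
    elif rear_passenger_state == "agitated":
        controls["set_music_volume"](0.5)
        if cabin_temp_c > 23.0:
            controls["set_temperature"](21.0)  # cool the cabin a bit

if __name__ == "__main__":
    fake_controls = {
        "set_cabin_lights": lambda level: print(f"lights -> {level:.0%}"),
        "set_music_volume": lambda level: print(f"volume -> {level:.0%}"),
        "set_temperature": lambda c: print(f"temperature -> {c} C"),
    }
    adjust_cabin("drowsy", cabin_temp_c=24.0, controls=fake_controls)
```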
link |
01:41:17.000
Yeah.
link |
01:41:18.000
I mean, of course you could do that kind of stuff with a two year delay, but it's tougher.
link |
01:41:23.760
Right.
link |
01:41:24.760
Yeah.
link |
01:41:25.760
Do you think a Tesla or Waymo or some of these companies that are doing
link |
01:41:30.760
semi or fully autonomous driving should be doing driver sensing?
link |
01:41:35.800
Yes.
link |
01:41:36.800
Are you thinking about that kind of stuff?
link |
01:41:39.000
So not just how we can enhance the in-cab experience for cars that are manually driven,
link |
01:41:43.960
but the ones that are increasingly more autonomously driven.
link |
01:41:47.520
Yes.
link |
01:41:48.520
So if we fast forward to the universe where it's fully autonomous, I think interior sensing
link |
01:41:53.080
becomes extremely important because the role of the driver isn't just to drive.
link |
01:41:57.160
If you think about it, the driver almost manages the dynamics within the vehicle.
link |
01:42:02.000
And so who's going to play that role when it's an autonomous car?
link |
01:42:06.120
We want a solution that is able to say, Oh my God, like, you know, Lex is bored to death
link |
01:42:11.800
cause the car's moving way too slow.
link |
01:42:13.700
Let's engage Lex or Rana's freaking out because she doesn't trust this vehicle yet.
link |
01:42:18.040
So let's tell Rana like a little bit more information about the route or, right?
link |
01:42:22.420
So I think, or somebody's having a heart attack in the car, like you need interior sensing
link |
01:42:27.220
in fully autonomous vehicles.
link |
01:42:29.420
But with semi autonomous vehicles, I think it's really key to have driver
link |
01:42:34.100
monitoring because semi autonomous means that sometimes the car is in charge.
link |
01:42:39.120
Sometimes the driver is in charge or the copilot, right?
link |
01:42:41.360
And you need this, you need both systems to be on the same page.
link |
01:42:44.800
The car needs to know if the driver's asleep before it transitions
link |
01:42:49.560
control over to the driver.
link |
01:42:51.880
And sometimes if the driver's too tired, the car can say, I'm going to be a better driver
link |
01:42:56.600
than you are right now.
link |
01:42:57.600
I'm taking control over.
link |
01:42:58.640
So this dynamic, this dance is so key and you can't do that without driver sensing.
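A simplified sketch of that handover "dance": before giving control back to the human, the car consults what the driver-monitoring system reports, and if the driver looks too impaired it keeps or takes control instead. The states and the policy below are illustrative only, not a production safety design.

```python
# Illustrative handover policy for a semi-autonomous car, assuming the
# driver-monitoring system reports one of these coarse states.

from enum import Enum

class DriverState(Enum):
    ALERT = "alert"
    DISTRACTED = "distracted"
    DROWSY = "drowsy"
    ASLEEP = "asleep"

def who_should_drive(autonomy_wants_handover: bool, driver: DriverState) -> str:
    if driver in (DriverState.ASLEEP, DriverState.DROWSY):
        # Driver is in no shape to take over; the car keeps control and
        # escalates (alerts, pulls over) rather than handing off.
        return "car"
    if autonomy_wants_handover and driver is DriverState.ALERT:
        return "driver"
    if driver is DriverState.DISTRACTED:
        return "car_with_warning"   # keep control, nudge the driver first
    return "car"

if __name__ == "__main__":
    print(who_should_drive(True, DriverState.ASLEEP))      # -> car
    print(who_should_drive(True, DriverState.ALERT))       # -> driver
    print(who_should_drive(False, DriverState.DISTRACTED)) # -> car_with_warning
```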
link |
01:43:03.200
Yeah.
link |
01:43:04.200
There's a disagreement I've had with Elon for the longest time, that it's obvious that
link |
01:43:07.720
this should be in the Tesla from day one.
link |
01:43:10.240
And it's obvious that driver sensing is not a hindrance.
link |
01:43:13.920
It's not obvious.
link |
01:43:15.920
I should be careful because having studied this problem, nothing is really obvious, but
link |
01:43:22.300
it seems very likely that driver sensing is not a hindrance to the experience.
link |
01:43:26.620
It's only enriching to the experience and likely increases the safety.
link |
01:43:34.760
That said, it is very surprising to me just having studied semi autonomous driving, how
link |
01:43:42.360
well humans are able to manage that dance because it was the intuition before you were
link |
01:43:47.800
doing that kind of thing that humans will become just incredibly distracted.
link |
01:43:54.080
They would just like let the thing do its thing, but they're able to, you know, cause
link |
01:43:57.920
it is life and death and they're able to manage that somehow.
link |
01:44:01.000
But that said, there's no reason not to have driver sensing on top of that.
link |
01:44:04.640
I feel like that's going to allow you to do that dance that you're currently doing without
link |
01:44:11.240
driver sensing, except touching the steering wheel to do that even better.
link |
01:44:15.920
I mean, the possibilities are endless and the machine learning possibilities are endless.
link |
01:44:20.000
It's such a beautiful space. It's also a constrained environment, so you can do it much more effectively
link |
01:44:26.160
than you can with the external environment. The external environment is full of weird edge
link |
01:44:31.440
cases and complexities.
link |
01:44:33.600
Just inside, there's so much. It's so fascinating, such a fascinating world.
link |
01:44:36.600
I do hope that companies like Tesla and others, even Waymo, which I don't even know if Waymo
link |
01:44:44.680
is doing anything sophisticated inside the cab.
link |
01:44:46.920
I don't think so.
link |
01:44:47.920
It's like, what is it?
link |
01:44:51.400
I honestly think, I honestly think it goes back to the robotics thing we were talking
link |
01:44:55.560
about, which is like great engineers that are building these AI systems just are afraid
link |
01:45:02.400
of the human being.
link |
01:45:03.760
They're not thinking about the human experience, they're thinking about the features and yeah,
link |
01:45:08.000
the perceptual abilities of that thing.
link |
01:45:10.840
They think the best way I can serve the human is by doing the best perception and control
link |
01:45:16.760
I can by looking at the external environment, keeping the human safe.
link |
01:45:20.640
But like, there's a huge gap. I'm here, like, you know, I need to be noticed and interacted
link |
01:45:31.040
with and understood and all those kinds of things, even just on a personal level for
link |
01:45:34.760
entertainment, honestly, for entertainment.
link |
01:45:38.640
You know, one of the coolest pieces of work we did in collaboration with MIT around this was we
link |
01:45:42.440
looked at longitudinal data, right, because, you know, MIT had access to like tons of data.
link |
01:45:52.880
And like just seeing the patterns of people like driving in the morning off to work versus
link |
01:45:57.300
like commuting back from work or weekend driving versus weekday driving.
link |
01:46:02.460
And wouldn't it be so cool if your car knew that and then was able to optimize either
link |
01:46:08.300
the route or the experience or even make recommendations?
link |
01:46:12.360
I think it's very powerful.
link |
01:46:13.360
Yeah, like, why are you taking this route?
link |
01:46:15.960
You're always unhappy when you take this route.
link |
01:46:18.360
And you're always happy when you take this alternative route.
link |
01:46:20.520
Take that route.
link |
01:46:21.520
Exactly.
link |
01:46:22.520
But I mean, to have that even that little step of relationship with a car, I think,
link |
01:46:27.920
is incredible.
link |
01:46:28.920
Of course, you have to get the privacy right, you have to get all that kind of stuff right.
link |
01:46:32.720
But I wish I honestly, you know, people are like paranoid about this, but I would like
link |
01:46:37.440
a smart refrigerator.
link |
01:46:39.640
We have such a deep connection with food as a human civilization.
link |
01:46:44.840
I would like to have a refrigerator that would understand me that, you know, I also have
link |
01:46:51.480
a complex relationship with food because I, you know, pig out too easily and all that
link |
01:46:56.280
kind of stuff.
link |
01:46:57.280
So, you know, like, maybe I want the refrigerator to be like, are you sure about this?
link |
01:47:02.720
Because maybe you're just feeling down or tired.
link |
01:47:05.200
Like maybe let's sleep on it.
link |
01:47:06.200
Your vision of the smart refrigerator is way kinder than mine.
link |
01:47:10.220
Is it just me yelling at you?
link |
01:47:11.920
No, it was just because I don't, you know, I don't drink alcohol, I don't smoke, but
link |
01:47:18.600
I eat a ton of chocolate, like it sticks to my vice.
link |
01:47:22.200
And so I, and sometimes I scream too, and I'm like, okay, my smart refrigerator will
link |
01:47:26.640
just lock down.
link |
01:47:27.640
It'll just say, dude, you've had way too many today, like down.
link |
01:47:32.400
Yeah.
link |
01:47:33.400
No, but here's the thing, are you, do you regret having, like, let's say not the next
link |
01:47:41.120
day, but 30 days later, what would you like the refrigerator to have done then?
link |
01:47:48.560
Well, I think actually like the more positive relationship would be one where there's a
link |
01:47:54.400
conversation, right?
link |
01:47:55.900
As opposed to like, that's probably like the more sustainable relationship.
link |
01:48:00.800
It's like late at night, just, no, listen, listen, I know I told you an hour ago, that
link |
01:48:06.200
it's not a good idea, but just listen, things have changed.
link |
01:48:09.720
I can just imagine a bunch of stuff being made up just to convince, but I mean, I just
link |
01:48:17.000
think that there's opportunities that, I mean, maybe not locking down, but for our systems
link |
01:48:22.400
that are such a deep part of our lives, like we use a lot of us, a lot of people that commute
link |
01:48:32.880
use their car every single day.
link |
01:48:34.360
A lot of us use a refrigerator every single day, the microwave every single day.
link |
01:48:38.240
Like we just, like, I feel like certain things could be made more efficient, more enriching,
link |
01:48:47.600
and AI is there to help, like some just basic recognition of you as a human being, but your
link |
01:48:54.200
patterns of what makes you happy and not happy and all that kind of stuff.
link |
01:48:57.520
And the car, obviously.
link |
01:48:58.520
Maybe, maybe, maybe we'll say, wait, wait, wait, wait, instead of this, like, Ben and
link |
01:49:05.320
Jerry's ice cream, how about this hummus and carrots or something?
link |
01:49:09.440
I don't know.
link |
01:49:10.440
It would make it like a just in time recommendation, right?
link |
01:49:14.960
But not like a generic one, but a reminder that last time you chose the carrots, you
link |
01:49:21.240
smiled 17 times more the next day.
link |
01:49:24.800
You're happier the next day, right?
link |
01:49:26.400
You're happier the next day.
link |
01:49:28.160
And but yeah, I don't, but then again, if you're the kind of person that gets better
link |
01:49:34.480
from negative, negative comments, you could say like, hey, remember like that wedding
link |
01:49:40.040
you're going to, you want to fit into that dress?
link |
01:49:43.880
Remember about that?
link |
01:49:44.880
Let's think about that before you're eating this.
link |
01:49:48.760
It's for some, probably that would work for me, like a refrigerator that is just ruthless
link |
01:49:53.400
at shaming me.
link |
01:49:54.920
But like, I would, of course, welcome it, like that would work for me.
link |
01:49:59.600
Just that.
link |
01:50:00.600
So it would know, I think it would, if it's really like smart, it would optimize its nudging
link |
01:50:05.320
based on what works for you, right?
link |
01:50:07.280
Exactly.
link |
01:50:08.280
That's the whole point.
link |
01:50:09.280
Personalization.
link |
01:50:10.280
In every way, personalization.
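One simple way to picture "optimize its nudging based on what works for you" is a small bandit over nudge styles, keeping the styles that, on average, lead to a better outcome. Everything in this sketch, from the style names to the reward signal, is made up for illustration.

```python
# A toy epsilon-greedy bandit over nudge styles; the reward here stands in
# for some measured outcome, e.g. how much happier the person was next day.

import random

class NudgePersonalizer:
    def __init__(self, styles, epsilon=0.1):
        self.styles = list(styles)
        self.epsilon = epsilon
        self.counts = {s: 0 for s in self.styles}
        self.avg_reward = {s: 0.0 for s in self.styles}

    def choose(self) -> str:
        if random.random() < self.epsilon:
            return random.choice(self.styles)                       # explore
        return max(self.styles, key=lambda s: self.avg_reward[s])   # exploit

    def update(self, style: str, reward: float) -> None:
        self.counts[style] += 1
        n = self.counts[style]
        self.avg_reward[style] += (reward - self.avg_reward[style]) / n

if __name__ == "__main__":
    personalizer = NudgePersonalizer(["gentle_suggestion", "tough_love", "silent"])
    for _ in range(200):
        style = personalizer.choose()
        # Hypothetical outcome signal per style, plus a little noise.
        reward = {"gentle_suggestion": 0.6, "tough_love": 0.8, "silent": 0.2}[style]
        personalizer.update(style, reward + random.uniform(-0.1, 0.1))
    print(personalizer.avg_reward)
```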
link |
01:50:11.920
You were a part of a webinar titled Advancing Road Safety, the State of Alcohol Intoxication
link |
01:50:18.120
Research.
link |
01:50:19.600
So for people who don't know, every year 1.3 million people around the world die in road
link |
01:50:24.520
crashes and more than 20% of these fatalities are estimated to be alcohol related.
link |
01:50:31.320
A lot of them are also distraction related.
link |
01:50:33.320
So can AI help with the alcohol thing?
link |
01:50:36.800
I think the answer is yes.
link |
01:50:40.240
There are signals and we know that as humans, like we can tell when a person, you know,
link |
01:50:46.560
is at different phases of being drunk, right?
link |
01:50:51.200
And I think you can use technology to do the same.
link |
01:50:53.680
And again, I think the ultimate solution is going to be a combination of different sensors.
link |
01:50:58.640
How hard is the problem from the vision perspective?
link |
01:51:01.440
I think it's non trivial.
link |
01:51:02.880
I think it's non trivial and I think the biggest part is getting the data, right?
link |
01:51:06.720
It's like getting enough data examples.
link |
01:51:09.200
So we, for this research project, we partnered with the transportation authorities of Sweden
link |
01:51:15.240
and we literally had a racetrack with a safety driver and we basically progressively got
link |
01:51:20.680
people drunk.
link |
01:51:21.680
Nice.
link |
01:51:22.680
So, but, you know, that's a very expensive data set to collect and you want to collect
link |
01:51:29.280
it globally and in multiple conditions.
link |
01:51:32.080
Yeah.
link |
01:51:33.480
The ethics of collecting a data set where people are drunk is tricky, which is funny
link |
01:51:38.800
because I mean, let's put drunk driving aside.
link |
01:51:43.400
The number of drunk people in the world every day is very large.
link |
01:51:47.120
It'd be nice to have a large data set of drunk people getting progressively drunk.
link |
01:51:50.320
In fact, you could build an app where people can donate their data cause it's hilarious.
link |
01:51:54.600
Right.
link |
01:51:55.600
Actually, yeah.
link |
01:51:56.600
But the liability.
link |
01:51:57.600
Liability, the ethics, how do you get it right?
link |
01:52:00.800
It's tricky.
link |
01:52:01.800
It's really, really tricky.
link |
01:52:02.800
Cause like drinking is one of those things that's funny and hilarious, and we love it,
link |
01:52:07.440
it's social, and so on and so forth.
link |
01:52:10.240
But it's also the thing that hurts a lot of people.
link |
01:52:13.520
Like a lot of people, like alcohol is one of those things it's legal, but it's really
link |
01:52:19.040
damaging to a lot of lives.
link |
01:52:21.200
It destroys lives and not just in the driving context.
link |
01:52:26.320
I should mention people should listen to Andrew Huberman who recently talked about alcohol.
link |
01:52:32.160
He has an amazing podcast.
link |
01:52:33.160
Andrew Huberman is a neuroscientist from Stanford and a good friend of mine.
link |
01:52:37.920
And he, he's like a human encyclopedia about all health related wisdom.
link |
01:52:43.560
So if there's a podcast, you would love it.
link |
01:52:45.880
I would love that.
link |
01:52:46.880
No, no, no, no, no.
link |
01:52:47.880
You don't know Andrew Huberman.
link |
01:52:49.600
Okay.
link |
01:52:50.600
Listen, you listen to Andrew, it's called Huberman Lab Podcast.
link |
01:52:54.160
This is your assignment.
link |
01:52:55.160
Just listen to one.
link |
01:52:56.160
Okay.
link |
01:52:57.160
I guarantee you this will be a thing where you say, Lex, this is the greatest human I
link |
01:53:01.360
have ever discovered.
link |
01:53:02.360
So.
link |
01:53:03.360
Oh my God.
link |
01:53:04.360
Cause I've really, I've, I'm really on a journey of kind of health and wellness and
link |
01:53:08.120
I'm learning lots and I'm trying to like build these, I guess, atomic habits around just
link |
01:53:13.240
being healthy.
link |
01:53:14.240
So I, yeah, I'm definitely going to do this.
link |
01:53:17.200
His whole thing, this is great.
link |
01:53:21.960
He's a legit scientist, like really well published, but in his podcast, what he does, he's not,
link |
01:53:30.160
he's not talking about his own work.
link |
01:53:31.920
He's like a human encyclopedia of papers.
link |
01:53:34.640
And so his whole thing is he takes a topic and in a very fast, you mentioned atomic
link |
01:53:39.720
habits, like very clear way summarizes the research in a way that leads to protocols
link |
01:53:46.220
of what you should do.
link |
01:53:47.400
He's really big on like, not like this is what the science says, but like this is literally
link |
01:53:52.600
what you should be doing according to science.
link |
01:53:54.280
So like he's really big and there's a lot of recommendations he does which several of
link |
01:54:01.360
them I definitely don't do, like get some light as soon as possible from waking up and
link |
01:54:08.880
like for prolonged periods of time.
link |
01:54:11.040
That's a really big one and there's a lot of science behind that one.
link |
01:54:14.880
There's a bunch of stuff that you're going to be like, Lex, this is a, this is my new
link |
01:54:19.880
favorite person.
link |
01:54:20.880
I guarantee it.
link |
01:54:21.880
And if you guys somehow don't know Andrew Huberman and you care about your wellbeing,
link |
01:54:27.840
you know, you should definitely listen to him.
link |
01:54:29.560
I love you, Andrew.
link |
01:54:31.920
Anyway, so what were we talking about?
link |
01:54:36.040
Oh, alcohol and detecting alcohol.
link |
01:54:39.480
So this is a problem you care about and you're trying to solve.
link |
01:54:42.240
And actually like broadening it, I do believe that the car is going to be a wellness center,
link |
01:54:48.960
like because again, imagine if you have a variety of sensors inside the vehicle, tracking
link |
01:54:55.240
not just your emotional state or level of distraction and drowsiness and intoxication,
link |
01:55:03.840
but also maybe even things like your, you know, your heart rate and your heart rate
link |
01:55:09.440
variability and your breathing rate.
link |
01:55:13.960
And it can start like optimizing, yeah, it can optimize the ride based on what your goals
link |
01:55:19.520
are.
link |
01:55:20.520
So I think we're going to start to see more of that and I'm excited about that.
link |
01:55:24.040
Yeah.
link |
01:55:25.040
What are the challenges you're tackling with SmartEye currently?
link |
01:55:28.960
What's the trickiest thing to get right? Is it basically convincing more and
link |
01:55:34.640
more car companies that having AI inside the car is a good idea, or are there
link |
01:55:41.160
more technical algorithmic challenges?
link |
01:55:45.360
What's been keeping you mentally busy?
link |
01:55:47.700
I think a lot of the car companies we are in conversations with are already interested
link |
01:55:52.360
in definitely driver monitoring.
link |
01:55:54.160
Like I think it's becoming a must have, but even interior sensing, I can see like we're
link |
01:55:59.340
engaged in a lot of like advanced engineering projects and proof of concepts.
link |
01:56:04.040
I think technologically though, I can see a path to making
link |
01:56:09.620
it happen.
link |
01:56:10.620
I think it's the use case.
link |
01:56:11.620
Like how does the car respond once it knows something about you?
link |
01:56:16.360
Because you want it to respond in a thoughtful way that isn't off-putting to
link |
01:56:20.880
the consumer in the car.
link |
01:56:23.240
So I think that's like the user experience.
link |
01:56:25.640
I don't think we've really nailed that.
link |
01:56:27.600
And usually that's not our part; we're the sensing platform, but we usually collaborate
link |
01:56:33.040
with the car manufacturer to decide what the use case is.
link |
01:56:35.960
So say you figure out that somebody's angry while driving, okay, what should the
link |
01:56:40.680
car do?
link |
01:56:43.680
Do you see your role as nudging, of like basically coming up with solutions
link |
01:56:50.100
essentially, and then the car manufacturers kind of put their own little spin on it?
link |
01:56:56.360
Right.
link |
01:56:57.360
So we, we are like the ideation, creative thought partner, but at the end of the day,
link |
01:57:03.620
the car company needs to decide what's on brand for them, right?
link |
01:57:06.640
Like maybe when it figures out that you're distracted or drowsy, it shows you a coffee
link |
01:57:11.720
cup, right?
link |
01:57:12.720
Or maybe it takes more aggressive behaviors and basically said, okay, if you don't like
link |
01:57:16.640
take a rest in the next five minutes, the car's going to shut down, right?
link |
01:57:19.640
Like there's a whole range of actions the car can take and doing the thing that is most,
link |
01:57:25.400
yeah, that builds trust with the driver and the passengers.
link |
01:57:29.320
I think that's what we need to be very careful about.
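The "whole range of actions" Rana mentions, from showing a coffee cup icon to forcing a rest, can be pictured as an escalation ladder, with the OEM deciding how strong each step is. This sketch is illustrative; the levels, actions, and thresholds are invented for the example.

```python
# An invented escalation ladder for the drowsiness use case described above;
# the real policy would be defined by the car maker, not the sensing platform.

ESCALATION = [
    (1, "show_coffee_cup_icon"),
    (2, "audible_chime_and_seat_vibration"),
    (3, "suggest_rest_stop_on_navigation"),
    (4, "restrict_speed_and_prompt_to_pull_over"),
]

def respond_to_drowsiness(level: int) -> list:
    """Return every action up to and including the current severity level."""
    return [action for threshold, action in ESCALATION if level >= threshold]

if __name__ == "__main__":
    print(respond_to_drowsiness(1))  # ['show_coffee_cup_icon']
    print(respond_to_drowsiness(3))  # first three actions
```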
link |
01:57:32.840
Yeah.
link |
01:57:33.840
Car companies are funny cause they have their own, like, I mean, that's why people get cars
link |
01:57:38.600
still.
link |
01:57:39.600
I hope that changes, but they get it cause it's a certain feel and look and it's a certain,
link |
01:57:44.240
they become proud, like Mercedes Benz or BMW or whatever, and that's their thing.
link |
01:57:51.840
That's the family brand or something like that, or Ford or GM, whatever, they stick
link |
01:57:56.400
to that thing.
link |
01:57:57.400
Yeah.
link |
01:57:58.400
It's interesting.
link |
01:57:59.400
It's like, it should be, I don't know, it should be a little more about the technology
link |
01:58:04.160
inside.
link |
01:58:06.800
And I suppose there too, there could be a branding, like a very specific style of luxury
link |
01:58:12.440
or fun.
link |
01:58:13.440
Right.
link |
01:58:14.440
Right.
link |
01:58:15.440
All that kind of stuff.
link |
01:58:16.440
Yeah.
link |
01:58:17.440
And I have an AI focused fund to invest in early stage kind of AI driven companies.
link |
01:58:22.720
And one of the companies we're looking at is trying to do what Tesla did, but for boats,
link |
01:58:27.560
for recreational boats.
link |
01:58:28.760
Yeah.
link |
01:58:29.760
So they're building an electric and kind of slash autonomous boat and it's kind of the
link |
01:58:34.840
same issues.
link |
01:58:35.840
Like what kind of sensors can you put in?
link |
01:58:38.600
What kind of states can you detect both exterior and interior within the boat?
link |
01:58:43.320
Anyways, it's like really interesting.
link |
01:58:45.480
Do you boat at all?
link |
01:58:46.760
No, not well, not in that way.
link |
01:58:49.960
I do like to get on the lake or a river and fish from a boat, but that's not boating.
link |
01:58:57.400
That's the difference.
link |
01:58:58.400
That's the difference.
link |
01:58:59.400
Still boating.
link |
01:59:00.400
Low tech.
link |
01:59:01.400
A low tech boat.
link |
01:59:02.400
Get away from, get closer to nature boat.
link |
01:59:04.400
I guess going out into the ocean is also getting closer to nature in some deep sense.
link |
01:59:12.200
I mean, I guess that's why people love it.
link |
01:59:15.800
The enormity of the water just underneath you.
link |
01:59:18.920
Yeah.
link |
01:59:19.920
I love the water.
link |
01:59:20.920
I love the, I love both.
link |
01:59:22.800
I love salt water.
link |
01:59:23.800
It's like the bigness, and just, it's humbling to be in front of this giant thing that's
link |
01:59:28.420
so powerful, that was here before us and will be here after.
link |
01:59:31.680
But I also love the peace of a small, like, wooded lake and it's just, it's everything's
link |
01:59:37.480
calm.
link |
01:59:38.480
Therapeutic.
link |
01:59:39.480
You tweeted that I'm excited about Amazon's acquisition of iRobot.
link |
01:59:49.600
I think it's super interesting, just given the trajectory of what you're part of, of
link |
01:59:54.480
these honestly small number of companies that are playing in this space that are like trying
link |
02:00:00.040
to have an impact on human beings.
link |
02:00:02.180
So it is an interesting moment in time that Amazon would acquire iRobot.
link |
02:00:09.200
You tweeted, I imagine a future where home robots are as ubiquitous as microwaves or toasters.
link |
02:00:16.320
Here are three reasons why I think this is exciting.
link |
02:00:18.920
If you remember, I can look it up, but what, why is this exciting to you?
link |
02:00:23.240
I mean, I think the first reason why this is exciting, I kind of remember the exact
link |
02:00:27.320
like order in which I put them, but one is just, it's, it's going to be an incredible
link |
02:00:33.540
platform for understanding our behaviors within the home, right?
link |
02:00:37.640
Like you know, if you think about Roomba, which is, you know, the robot vacuum cleaner,
link |
02:00:42.880
the flagship product of iRobot at the moment, it's like running around your home, understanding
link |
02:00:48.640
the layout, it's understanding what's clean and what's not.
link |
02:00:51.200
How often do you clean your house?
link |
02:00:52.640
And all of these like behaviors are a piece of the puzzle in terms of understanding who
link |
02:00:57.500
you are as a consumer.
link |
02:00:58.760
And I think that could be, again, used in really meaningful ways, not just to recommend
link |
02:01:05.580
better products or whatever, but actually to improve your experience as a human being.
link |
02:01:09.640
So I think, I think that's very interesting.
link |
02:01:12.900
I think the natural evolution of these robots in the home.
link |
02:01:18.480
So it's, it's interesting, Roomba isn't really a social robot, right, at the moment.
link |
02:01:24.280
But I once interviewed one of the chief engineers on the Roomba team, and he talked about how
link |
02:01:29.160
people named their Roombas.
link |
02:01:31.400
And if the Roomba broke down, they would call in and say, you know, my Roomba broke down
link |
02:01:36.520
and the company would say, well, we'll just send you a new one.
link |
02:01:38.920
And no, no, no, it's Rosie, like, I want you to fix this particular robot.
link |
02:01:45.680
So people have already built like interesting emotional connections with these home robots.
link |
02:01:51.680
And I think that, again, that provides a platform for really interesting things to, to just
link |
02:01:57.320
motivate change.
link |
02:01:58.320
Like it could help you.
link |
02:01:59.320
I mean, one of the companies that spun out of MIT, Catalia Health, the guy who started
link |
02:02:05.740
it spent a lot of time building robots that help with weight management.
link |
02:02:09.640
So weight management, sleep, eating better, yeah, all of these things.
link |
02:02:14.320
Well, if I'm being honest, Amazon does not exactly have a track record of winning over
link |
02:02:20.280
people in terms of trust.
link |
02:02:22.240
Now that said, it's a really difficult problem for a human being to let a robot in their
link |
02:02:27.840
home that has a camera on it.
link |
02:02:30.680
Right.
link |
02:02:31.680
That's really, really, really tough.
link |
02:02:33.400
And I think Roomba actually, I have to think about this, but I'm pretty sure now or for
link |
02:02:40.480
some time already has had cameras, because of what they're doing with the most recent Roomba.
link |
02:02:46.040
I have so many Roombas.
link |
02:02:47.040
Oh, you actually do?
link |
02:02:48.040
Well, I programmed it.
link |
02:02:49.560
I don't use a Roomba for vacuuming.
link |
02:02:51.440
People that have been to my place, they're like, yeah, you definitely don't use these
link |
02:02:54.280
Roombas.
link |
02:02:55.280
That could be a good thing. I can't tell, like, the valence of this comment.
link |
02:03:00.920
Was it a compliment or like?
link |
02:03:02.400
No, it's a giant, it's just a bunch of electronics everywhere.
link |
02:03:05.320
There's, I have six or seven computers, I have robots everywhere, Lego robots, I have
link |
02:03:11.160
small robots and big robots and it's just giant, just piles of robot stuff and yeah.
link |
02:03:20.240
But including the Roombas, they're being used for their body and intelligence,
link |
02:03:25.560
but not for their purpose.
link |
02:03:26.720
I have, I've changed them, repurposed them for other purposes, for deeper, more meaningful
link |
02:03:33.300
purposes than just, like, the robot vacuum, which, you know, brings a lot of people happiness,
link |
02:03:39.240
I'm sure.
link |
02:03:41.060
They have a camera because the thing they advertised, I had my own camera still, but
link |
02:03:46.560
the camera on the new Roomba, they have like state-of-the-art poop detection
link |
02:03:52.320
as they advertised, which is very difficult. Apparently it's a big problem for vacuum
link |
02:03:56.760
cleaners: you know, if they go over like dog poop, it just runs it over
link |
02:04:01.000
and creates a giant mess.
link |
02:04:02.140
So apparently they collected like a huge amount of data on different shapes
link |
02:04:08.360
and looks and whatever of poop and then now they're able to avoid it and so on.
link |
02:04:12.400
They're very proud of this.
link |
02:04:14.440
So there is a camera, but you don't think of it as having a camera.
link |
02:04:19.200
Yeah.
link |
02:04:20.380
You don't think of it as having a camera because you've grown to trust that, I guess, because
link |
02:04:24.600
our phones, at least most of us seem to trust this phone, even though there's a camera looking
link |
02:04:31.600
directly at you.
link |
02:04:33.960
I think that if you trust that the company is taking security very seriously, I actually
link |
02:04:41.680
don't know how that trust was earned with smartphones, I think it just started to provide
link |
02:04:46.760
a lot of positive value to your life where you just took it in and then the company over
link |
02:04:51.520
time has shown that it takes privacy very seriously, that kind of stuff.
link |
02:04:55.200
But I just, Amazon has not always, in its social robots, communicated that
link |
02:05:01.520
this is a trustworthy thing, both in terms of culture and competence. Because I think
link |
02:05:07.080
privacy is not just about what you intend to do, but also how good you are
link |
02:05:12.620
at doing that kind of thing.
link |
02:05:14.600
So that's a really hard problem to solve.
link |
02:05:16.800
But I mean, but a lot of us have Alexas at home and I mean, Alexa could be listening
link |
02:05:22.640
in the whole time, right?
link |
02:05:24.520
And doing all sorts of nefarious things with the data.
link |
02:05:27.440
Yeah.
link |
02:05:28.440
Hopefully it's not, but I don't think it is.
link |
02:05:32.320
But you know, it's such a tricky thing for a company to get right, which
link |
02:05:36.640
is like to earn the trust.
link |
02:05:38.200
I don't think Alexa's earned people's trust quite yet.
link |
02:05:41.520
Yeah.
link |
02:05:42.520
I think it's, it's not there quite yet.
link |
02:05:44.640
I agree.
link |
02:05:45.640
They struggle with this kind of stuff.
link |
02:05:46.640
In fact, when these topics are brought up, people always get, like, nervous.
link |
02:05:50.240
And I think if you get nervous about it, that means that, like, the way to earn people's trust
link |
02:05:57.560
is not by like, Ooh, don't talk about this.
link |
02:06:00.680
It's just be open, be frank, be transparent, and also create a culture of like where it
link |
02:06:05.920
radiates at every level from engineer to CEO that like you're good people that have a common
link |
02:06:17.120
sense idea of what it means to respect basic human rights and the privacy of people and
link |
02:06:23.040
all that kind of stuff.
link |
02:06:24.040
And I think that propagates throughout. That's the best PR, which is, like, over time
link |
02:06:30.640
you understand that these are good folks doing good things.
link |
02:06:34.920
Anyway, speaking of social robots, have you heard about Tesla, Tesla bot, the humanoid
link |
02:06:42.240
robot?
link |
02:06:43.240
Yes, I have.
link |
02:06:44.240
Yes, yes, yes.
link |
02:06:45.240
But I don't exactly know what it's designed to do, do you?
link |
02:06:48.680
You probably do.
link |
02:06:49.680
No, I know what it's designed to do, but I have a different perspective on it. It's designed
link |
02:06:54.960
to be a humanoid form, and it's designed for automation tasks in the same way that
link |
02:07:02.040
industrial robot arms automate tasks in the factory.
link |
02:07:06.260
So it's designed to automate tasks in the factory.
link |
02:07:08.280
But I think that humanoid form, as we were talking about before, is one that we connect
link |
02:07:18.040
with as human beings.
link |
02:07:19.800
Anything legged, obviously, but the humanoid form especially, we anthropomorphize it most
link |
02:07:25.200
intensely.
link |
02:07:26.200
And so the possibility to me, it's exciting to see both Atlas developed by Boston Dynamics
link |
02:07:34.440
and anyone, including Tesla, trying to make humanoid robots cheaper and more effective.
link |
02:07:43.720
To me, the obvious way it transforms the world is social robotics, versus automation of
link |
02:07:51.140
tasks in the factory.
link |
02:07:53.120
So yeah, I just wanted to mention it, in case that was something you were interested in, because I find its
link |
02:07:58.840
application to social robotics super interesting.
link |
02:08:01.600
We did a lot of work with Pepper, Pepper the robot, a while back.
link |
02:08:06.320
We were like the emotion engine for Pepper, which is Softbank's humanoid robot.
link |
02:08:11.480
How tall is Pepper?
link |
02:08:12.480
It's like...
link |
02:08:13.480
Yeah, like, I don't know, like five foot maybe, right?
link |
02:08:18.360
Yeah.
link |
02:08:19.360
Yeah.
link |
02:08:20.360
Pretty, pretty big.
link |
02:08:21.360
Pretty big.
link |
02:08:22.360
It's designed to be at like airport lounges and, you know, retail stores, mostly customer
link |
02:08:28.920
service, right?
link |
02:08:30.680
Hotel lobbies, and I mean, I don't know where the state of the robot is, but I think it's
link |
02:08:37.200
very promising.
link |
02:08:38.200
I think there are a lot of applications where this can be helpful.
link |
02:08:40.400
I'm also really interested in, yeah, social robotics for the home, right?
link |
02:08:45.200
Like that can help elderly people, for example, transport things from one location of the
link |
02:08:50.880
home to the other, or even like just have your back in case something happens.
link |
02:08:55.520
Yeah, I don't know.
link |
02:08:58.000
I do think it's a very interesting space.
link |
02:08:59.840
It seems early though.
link |
02:09:00.840
Do you feel like the timing is now?
link |
02:09:04.960
Yes, 100%.
link |
02:09:09.840
So it always seems early until it's not, right?
link |
02:09:12.160
Right, right, right.
link |
02:09:13.160
I think the time, I definitely think that the time is now, like this decade for social
link |
02:09:24.240
robots.
link |
02:09:25.920
Whether the humanoid form is right, I don't think so, no.
link |
02:09:29.640
I don't. I think, like, if we just look at Jibo as an example, I feel like most of
link |
02:09:40.000
the problem, the challenge, the opportunity of social connection between an AI system
link |
02:09:46.680
and a human being does not require you to also solve the problem of robot manipulation
link |
02:09:52.720
and bipedal mobility.
link |
02:09:55.320
So I think you could do that with just a screen, honestly, but there's something about the
link |
02:09:59.980
interface of Jibo where it can rotate and so on that's also compelling.
link |
02:10:03.880
But you get to see all these robot companies that fail, incredible companies like Jibo
link |
02:10:09.400
and even, I mean, the iRobot in some sense is a big success story that it was able to
link |
02:10:17.600
find a niche thing and focus on it, but in some sense it's not a success story because
link |
02:10:24.000
they didn't build any other robot, like any other, it didn't expand into all kinds of
link |
02:10:30.960
robotics.
link |
02:10:31.960
Like once you're in the home, maybe that's what happens with Amazon is they'll flourish
link |
02:10:34.880
into all kinds of other robots.
link |
02:10:37.200
But do you have a sense, by the way, why it's so difficult to build a robotics company?
link |
02:10:43.760
Like why so many companies have failed?
link |
02:10:47.080
I think it's like you're building a vertical stack, right?
link |
02:10:50.780
Like you are building the hardware plus the software and you find you have to do this
link |
02:10:54.480
at a cost that makes sense.
link |
02:10:56.040
So I think Jibo was retailing at like, I don't know, like $800, like $700, $800, which for
link |
02:11:05.080
the use case, right, there's a dissonance there.
link |
02:11:10.020
It's too high.
link |
02:11:11.020
So I think cost of building the whole platform in a way that is affordable for what value
link |
02:11:20.380
it's bringing, I think that's a challenge.
link |
02:11:23.720
I think for these home robots that are going to help you do stuff around the home, that's
link |
02:11:30.920
a challenge too, like the mobility piece of it.
link |
02:11:33.400
That's hard.
link |
02:11:34.400
Well, one of the things I'm really excited with Tesla Bot is the people working on it.
link |
02:11:40.480
And that's probably the criticism I would apply to some of the other folks who worked
link |
02:11:44.560
on social robots is the people working on Tesla Bot know how to, they're focused on
link |
02:11:50.200
and know how to do mass manufacture and create a product that's super cheap.
link |
02:11:54.360
Very cool.
link |
02:11:55.360
That's the focus.
link |
02:11:56.360
The engineering focus isn't, I would say that you can also criticize them for that, is they're
link |
02:12:00.480
not focused on the experience of the robot.
link |
02:12:03.920
They're focused on how to get this thing to do the basic stuff that the humanoid form
link |
02:12:09.920
requires to do it as cheap as possible.
link |
02:12:13.560
Then the fewest number of actuators, the fewest numbers of motors, the increasing efficiency,
link |
02:12:18.360
they decrease the weight, all that kind of stuff.
link |
02:12:20.400
So that's really interesting.
link |
02:12:21.600
I would say that Jibo and all those folks, they focus on the design, the experience,
link |
02:12:26.520
all of that, and it's secondary how to manufacture.
link |
02:12:29.840
Right.
link |
02:12:30.840
So you have to think like the Tesla Bot folks from first principles, what is the fewest
link |
02:12:36.880
number of components, the cheapest components, how can I build it as much in house as possible
link |
02:12:41.720
without having to consider all the complexities of a supply chain, all that kind of stuff.
link |
02:12:47.680
It's interesting.
link |
02:12:48.680
Because if you have to build a robotics company, you're not building one robot, you're building
link |
02:12:54.200
hopefully millions of robots, you have to figure out how to do that where the final
link |
02:12:58.600
thing, I mean, if it's Jibo type of robot, is there a reason why Jibo, like we can have
link |
02:13:04.240
this lengthy discussion, is there a reason why Jibo has to be over $100?
link |
02:13:08.880
It shouldn't be.
link |
02:13:09.880
Right.
link |
02:13:10.880
Like the basic components.
link |
02:13:11.880
Right.
link |
02:13:12.880
Components of it.
link |
02:13:13.880
Right.
link |
02:13:14.880
Like you could start to actually discuss like, okay, what is the essential thing about Jibo?
link |
02:13:19.080
How much, what is the cheapest way I can have a screen?
link |
02:13:21.440
What's the cheapest way I can have a rotating base?
link |
02:13:23.760
Right.
link |
02:13:24.760
All that kind of stuff.
link |
02:13:25.760
Right, get down, continuously drive down costs.
link |
02:13:29.960
Speaking of which, you have launched extremely successful companies, you have helped others,
link |
02:13:35.520
you've invested in companies.
link |
02:13:37.920
Can you give advice on how to start a successful company?
link |
02:13:44.160
I would say have a problem that you really, really, really want to solve, right?
link |
02:13:48.780
Something that you're deeply passionate about.
link |
02:13:53.800
And honestly, take the first step.
link |
02:13:55.880
Like that's often the hardest.
link |
02:13:58.520
And don't overthink it.
link |
02:13:59.520
Like, you know, like this idea of a minimum viable product or a minimum viable version
link |
02:14:04.000
of an idea, right?
link |
02:14:05.000
Like, yes, you're thinking about this, like a humongous, like super elegant, super beautiful
link |
02:14:09.160
thing.
link |
02:14:10.160
What, like reduce it to the littlest thing you can bring to market that can solve a problem
link |
02:14:14.640
or that can, you know, that can help address a pain point that somebody has.
link |
02:14:20.880
They often tell you, like, start with a customer of one, right?
link |
02:14:24.320
If you can solve a problem for one person, that's probably going to be yourself
link |
02:14:28.400
or some other person.
link |
02:14:29.400
Right.
link |
02:14:30.400
Pick a person.
link |
02:14:31.400
Exactly.
link |
02:14:32.400
It could be you.
link |
02:14:33.400
Yeah, that's actually often a good sign that if you enjoy a thing, enjoy a thing where
link |
02:14:37.240
you have a specific problem that you'd like to solve, that's a good n
link |
02:14:41.000
of one to focus on.
link |
02:14:43.600
What else is there? Step one is the hardest, but there's other
link |
02:14:49.360
steps as well, right?
link |
02:14:51.200
I also think like who you bring around the table early on is so key, right?
link |
02:14:58.080
Like being clear on, on what I call like your core values or your North Star.
link |
02:15:02.440
It might sound fluffy, but actually it's not.
link |
02:15:04.840
So Roz and I, I feel like we did that very early on.
link |
02:15:08.840
We sat around her kitchen table and we said, okay, there's so many applications of this
link |
02:15:13.040
technology.
link |
02:15:14.040
How are we going to draw the line?
link |
02:15:15.040
How are we going to set boundaries?
link |
02:15:16.940
We came up with a set of core values that in the hardest of times we fell back on to
link |
02:15:22.680
determine how we make decisions.
link |
02:15:25.320
And so I feel like just getting clarity on these core, like for us, it was respecting
link |
02:15:28.760
people's privacy, only engaging with industries where it's clear opt in.
link |
02:15:33.400
So for instance, we don't do any work in security and surveillance.
link |
02:15:38.680
So things like that. We're very big on, you know, one of our core values is
link |
02:15:42.480
human connection and empathy, right?
link |
02:15:44.720
And that is, yes, it's an AI company, but it's about people.
link |
02:15:47.840
Well, these are all, they become encoded in how we act, even if you're a small, tiny team
link |
02:15:54.520
of two or three or whatever.
link |
02:15:57.460
So I think that's another piece of advice.
link |
02:15:59.520
So what about finding people, hiring people?
link |
02:16:02.680
If you care about people as much as you do, like this, it seems like such a difficult
link |
02:16:07.800
thing to hire the right people.
link |
02:16:10.680
I think early on as a startup, you want people who have, who share the passion and the conviction
link |
02:16:16.120
because it's going to be tough.
link |
02:16:17.880
Like I've yet to meet a startup where it was just a straight line to success, right?
link |
02:16:25.000
Even not just startup, like even everyday people's lives, right?
link |
02:16:28.280
You always like run into obstacles and you run into naysayers and you need people who
link |
02:16:36.280
are believers, whether they're people on your team or even your investors.
link |
02:16:40.600
You need investors who are really believers in what you're doing, because that means they
link |
02:16:44.960
will stick with you.
link |
02:16:47.040
They won't give up at the first obstacle.
link |
02:16:49.280
I think that's important.
link |
02:16:50.920
What about raising money?
link |
02:16:51.920
What about finding investors, first of all, raising money, but also raising money from
link |
02:16:59.720
the right sources, sources that ultimately don't hinder you, but help you, empower you, all
link |
02:17:05.960
that kind of stuff.
link |
02:17:06.960
What advice would you give there?
link |
02:17:08.600
You successfully raised money many times in your life.
link |
02:17:12.120
Yeah.
link |
02:17:13.120
Again, it's not just about the money.
link |
02:17:15.080
It's about finding the right investors who are going to be aligned in terms of what you
link |
02:17:20.360
want to build and believe in your core values.
link |
02:17:23.160
For example, especially later on, in my latest round of funding, I try to bring in investors
link |
02:17:31.280
that really care about the ethics of AI and the alignment of vision and mission and core
link |
02:17:40.120
values is really important.
link |
02:17:41.120
It's like you're picking a life partner.
link |
02:17:43.920
It's the same kind of...
link |
02:17:45.160
So you take it that seriously for investors?
link |
02:17:47.560
Yeah, because they're going to have to stick with you.
link |
02:17:50.040
You're stuck together.
link |
02:17:51.480
For a while anyway.
link |
02:17:52.480
Yeah.
link |
02:17:53.480
Maybe not for life, but for a while, for sure.
link |
02:17:56.880
For better or worse.
link |
02:17:57.880
I forget what the vows usually sound like.
link |
02:17:59.920
For better or worse?
link |
02:18:00.920
Through something.
link |
02:18:01.920
Yeah.
link |
02:18:02.920
Oh boy.
link |
02:18:03.920
Yeah.
link |
02:18:04.920
Anyway, it's romantic and deep and you're in it for a while.
link |
02:18:15.320
So it's not just about the money.
link |
02:18:18.040
You tweeted about going to your first Capital Camp, an investing get together, and that you learned
link |
02:18:23.560
a lot.
link |
02:18:24.560
So this is about investing.
link |
02:18:27.840
So what have you learned from that?
link |
02:18:30.240
What have you learned about investing in general from both because you've been on both ends
link |
02:18:34.160
of it?
link |
02:18:35.160
I mean, I try to use my experience as an operator now with my investor hat on when I'm identifying
link |
02:18:41.720
companies to invest in.
link |
02:18:45.280
First of all, I think the good news is because I have a technology background and I really
link |
02:18:49.460
understand machine learning and computer vision and AI, et cetera, I can apply that level
link |
02:18:54.600
of understanding because everybody says they're an AI company or they're an AI tech.
link |
02:18:59.720
And I'm like, no, no, no, no, no, show me the technology.
link |
02:19:02.880
So I can do that level of diligence, which I actually love.
link |
02:19:07.640
And then I have to do the litmus test of, if I'm in a conversation with you, am I excited
link |
02:19:12.760
to tell you about this new company that I just met?
link |
02:19:16.520
And if I'm an ambassador for that company and I'm passionate about what they're doing,
link |
02:19:22.400
I usually use that.
link |
02:19:24.720
Yeah.
link |
02:19:25.720
That's important to me when I'm investing.
link |
02:19:27.720
So that means you actually can explain what they're doing and you're excited about it.
link |
02:19:34.720
Exactly.
link |
02:19:35.720
Exactly.
link |
02:19:36.720
Thank you for putting it so succinctly. I was rambling, but exactly, that's it.
link |
02:19:41.720
No, but it's funny, sometimes it's unclear exactly.
link |
02:19:48.280
I'll hear people, you know, talk for a while and it sounds cool, like
link |
02:19:53.120
they paint a picture of a world, but then when you try to summarize it, you're not
link |
02:19:56.600
exactly clear.
link |
02:19:57.600
Like maybe what the core powerful idea is. Like, you can't just build another Facebook.
link |
02:20:05.200
There has to be a core, simple to explain idea that you then can or can't get excited
link |
02:20:15.360
about, but it's there, it's right there.
link |
02:20:19.000
Yeah.
link |
02:20:20.000
But how do you ultimately pick who you think will be successful?
link |
02:20:25.520
It's not just about the thing you're excited about, like there's other stuff.
link |
02:20:29.320
Right.
link |
02:20:30.320
And then there's all the, you know, with early stage companies, like pre seed companies,
link |
02:20:34.400
which is where I'm investing, sometimes the business model isn't clear yet, or the go
link |
02:20:40.640
to market strategy isn't clear.
link |
02:20:42.240
It's usually so early on that some of these things haven't been hashed
link |
02:20:45.560
out, which is okay.
link |
02:20:47.840
So the way I like to think about it is like, if this company is successful, will this be
link |
02:20:51.720
a multi billion slash trillion dollar market, you know, or company?
link |
02:20:56.660
And so that's definitely a lens that I use.
link |
02:21:01.280
What's pre seed?
link |
02:21:02.280
What are the different stages, and what's the most exciting stage, or no, what's
link |
02:21:07.880
interesting about every stage, I guess.
link |
02:21:09.680
Yeah.
link |
02:21:10.680
So pre seed is usually when you're just starting out, you've maybe raised the friends and family
link |
02:21:16.000
rounds.
link |
02:21:17.000
So you've raised some money from people you know, and you're getting ready to take your
link |
02:21:20.720
first institutional check, like your first check from an investor.
link |
02:21:25.680
And I love the stage.
link |
02:21:28.920
There's a lot of uncertainty.
link |
02:21:30.780
Some investors really don't like the stage because the financial models aren't there.
link |
02:21:36.760
Often the teams aren't even fully formed, it's really, really early.
link |
02:21:40.920
But to me, it's like a magical stage because it's the time when there's so much conviction,
link |
02:21:48.480
so much belief, almost delusional, right?
link |
02:21:51.800
And there's a little bit of naivete with founders at that stage.
link |
02:21:57.120
I just love it.
link |
02:21:58.120
It's contagious.
link |
02:21:59.120
And I love that. Often they're first time founders, not always, but often they're
link |
02:22:06.560
first time founders and I can share my experience as a founder myself and I can empathize, right?
link |
02:22:12.620
And I can almost create a safe ground, because, you know, you have to be careful
link |
02:22:18.520
what you tell your investors, right?
link |
02:22:21.200
And I will often like say, I've been in your shoes as a founder.
link |
02:22:24.800
You can tell me if it's challenging, you can tell me what you're struggling with.
link |
02:22:28.360
It's okay to vent.
link |
02:22:30.160
So I create that safe ground and I think that's a superpower.
link |
02:22:34.760
Yeah.
link |
02:22:35.760
You have to, I guess you have to figure out if this kind of person is going to be able
link |
02:22:40.400
to ride the roller coaster, like of many pivots and challenges and all that kind of stuff.
link |
02:22:48.280
And if the space of ideas they're working in is interesting, like the way they think
link |
02:22:53.040
about the world.
link |
02:22:54.040
Yeah.
link |
02:22:55.040
Because if it's successful, the thing they end up with might be very different, and so might the reason
link |
02:23:00.000
it's successful for them.
link |
02:23:01.560
Actually, you know, I was going to say the third, so the technology is one aspect, the
link |
02:23:07.480
market or the idea, right, is the second and the third is the founder, right?
link |
02:23:11.240
Is this somebody who I believe has conviction, is a hustler, you know, is going to overcome
link |
02:23:18.160
obstacles?
link |
02:23:19.160
Yeah, I think that is going to be a great leader, right?
link |
02:23:23.200
Like as a startup founder, you're often the first person, and your role is
link |
02:23:28.440
to bring amazing people around you to build this thing.
link |
02:23:32.880
And so you're an evangelist, right?
link |
02:23:36.160
So how good are you going to be at that?
link |
02:23:38.000
So I try to evaluate that too.
link |
02:23:41.360
You also, in the tweet thread about it, mention, is this a known concept, random rich dudes,
link |
02:23:46.960
or RRDs, and say that there should be, like, random rich women, I guess.
link |
02:23:53.280
What's the women version of dudes, ladies?
link |
02:23:58.280
I don't know.
link |
02:23:59.280
I don't know.
link |
02:24:00.280
What, is this a technical term?
link |
02:24:01.500
Is this known?
link |
02:24:02.500
Random rich dudes?
link |
02:24:03.500
I didn't make that up, but I was at this capital camp, which is a get together for investors
link |
02:24:09.680
of all types.
link |
02:24:11.600
And there must have been maybe 400 or so attendees, maybe 20 were women.
link |
02:24:19.160
It was just very disproportionately, you know, male dominated, which I'm used to.
link |
02:24:25.680
I think you're used to this kind of thing.
link |
02:24:26.960
I'm used to it, but it's still surprising.
link |
02:24:29.920
And as I'm raising money for this fund, so my fund partner is a guy called Rob May, who's
link |
02:24:36.320
done this before.
link |
02:24:37.680
So I'm new to the investing world, but he's done this before.
link |
02:24:42.000
Most of our investors in the fund are these, I mean, awesome.
link |
02:24:45.880
I'm super grateful to them.
link |
02:24:47.800
Just random rich guys.
link |
02:24:48.800
I'm like, where are the rich women?
link |
02:24:50.400
So I'm really adamant about investing in women led AI companies, but I also would love
link |
02:24:57.160
to have women investors be part of my fund because I think that's how we drive change.
link |
02:25:03.240
Yeah.
link |
02:25:04.240
So that takes time, of course, but there's been quite a lot of progress, but yeah, for
link |
02:25:09.640
the next Mark Zuckerberg to be a woman and all that kind of stuff, because that's just
link |
02:25:13.840
like a huge amount of wealth generated by women and then controlled by women and allocated
link |
02:25:19.760
by women and all that kind of stuff.
link |
02:25:22.200
And then beyond just women, just broadly across all different measures of diversity and so
link |
02:25:28.880
on.
link |
02:25:29.880
Let me ask you to put on your wise sage hat.
link |
02:25:35.880
So you already gave advice on startups and just advice for women, but in general advice
link |
02:25:45.120
for folks in high school or college today, how to have a career they can be proud of,
link |
02:25:51.080
how to have a life they can be proud of.
link |
02:25:55.560
I suppose you have to give this kind of advice to your kids.
link |
02:25:58.560
Yeah.
link |
02:25:59.560
Well, here's the number one advice that I give to my kids.
link |
02:26:03.400
My daughter's now 19 by the way, and my son's 13 and a half, so they're not little kids
link |
02:26:08.200
anymore.
link |
02:26:09.200
Does it break your heart?
link |
02:26:11.560
It does.
link |
02:26:12.560
They're awesome.
link |
02:26:13.560
They're my best friends, but yeah, I think the number one advice I would share is embark
link |
02:26:19.880
on a journey without attaching to outcomes and enjoy the journey, right?
link |
02:26:25.200
So often we're so obsessed with the end goal that it doesn't allow us to be open to different
link |
02:26:34.360
endings of a journey or a story, so you become like so fixated on a particular path.
link |
02:26:41.840
You don't see the beauty in the other alternative path, and then you forget to enjoy the journey
link |
02:26:48.520
because you're just so fixated on the goal, and I've been guilty of that for many, many
link |
02:26:53.320
years of my life, and I'm now trying to make the shift of, no, no, no, I'm going to again
link |
02:27:00.180
trust that things are going to work out and it'll be amazing and maybe even exceed your
link |
02:27:04.480
dreams.
link |
02:27:05.480
We have to be open to that.
link |
02:27:07.280
Yeah.
link |
02:27:08.280
Taking a leap into all kinds of things.
link |
02:27:09.840
I think you tweeted like you went on vacation by yourself or something like this.
link |
02:27:13.800
I know.
link |
02:27:14.800
Yes, and just going, just taking the leap.
link |
02:27:19.120
Doing it.
link |
02:27:20.120
Totally doing it.
link |
02:27:21.120
And enjoying it, enjoying the moment, enjoying the weeks, enjoying not looking at some kind
link |
02:27:26.720
of career ladder, next step and so on.
link |
02:27:29.640
Yeah, there's something to that, like over planning too.
link |
02:27:34.320
I'm surrounded by a lot of people that kind of do that. So, I don't plan.
link |
02:27:37.800
You don't?
link |
02:27:38.800
No.
link |
02:27:39.800
Do you not do goal setting?
link |
02:27:43.240
My goal setting is very much like affirmations, it's almost, I don't know how to
link |
02:27:52.760
put it into words, but it's a little bit like what my heart yearns for kind of, and I guess
link |
02:28:02.040
in the space of emotions more than in the rational
link |
02:28:08.640
space because I just try to picture a world that I would like to be in and that world
link |
02:28:16.280
is not clearly pictured, it's mostly in the emotional world.
link |
02:28:19.400
I mean, I think about that with robots, because I have this desire, I've had it my whole life
link |
02:28:26.640
to, well, it took different shapes, but I think once I discovered AI, the desire was
link |
02:28:33.180
to, I think in the context of this conversation, could be most easily described as basically
link |
02:28:41.120
a social robotics company and that's something I dreamed of doing and well, there's a lot
link |
02:28:50.880
of complexity to that story, but that's the only thing, honestly, I dream of doing.
link |
02:28:55.680
So I imagine a world that I could help create, but there are no steps along the way
link |
02:29:05.560
and I think I'm just kind of stumbling around and following happiness and working my ass
link |
02:29:12.720
off almost at random, like an ant does, in random directions, but a lot of people, a
link |
02:29:18.240
lot of successful people around me say this, you should have a plan, you should have a
link |
02:29:20.920
clear goal, you have a goal at the end of the
link |
02:29:23.960
month. I don't, and there's a balance to be struck, of course, but there's
link |
02:29:33.320
something to be said about really making sure that you're living life to the fullest, that
link |
02:29:40.360
goals can actually get in the way of.
link |
02:29:43.760
So one of the best, like kind of most, what do you call it when it challenges your brain,
link |
02:29:52.560
what do you call it?
link |
02:29:56.760
The only thing that comes to mind, and this is me saying it, is mindfuck, but yes.
link |
02:30:00.320
Okay.
link |
02:30:01.320
Okay.
link |
02:30:02.320
Okay.
link |
02:30:03.320
Something like that.
link |
02:30:04.320
Yes.
link |
02:30:05.320
Super inspiring talk.
link |
02:30:06.320
Kenneth Stanley, he was at OpenAI, he just left, and he has a book called Why Greatness
link |
02:30:11.320
Cannot Be Planned, and it's actually an AI book.
link |
02:30:14.160
And he's done all these experiments that basically show that when you over optimize,
link |
02:30:20.400
the trade off is you're less creative, right?
link |
02:30:23.760
And to create true greatness and truly creative solutions to problems, you can't over plan
link |
02:30:30.760
it.
link |
02:30:31.760
You can't.
link |
02:30:32.760
And he generalizes it beyond AI and talks about how we apply
link |
02:30:36.800
that in our personal lives and in our organizations and our companies, which are all about KPIs, right?
link |
02:30:42.320
Like look at any company in the world and it's all like, there are the goals, the
link |
02:30:45.960
weekly goals and the sprints and then the quarterly goals, blah, blah, blah.
link |
02:30:51.080
And he just shows with a lot of his AI experiments that that's not how you create truly game
link |
02:30:58.560
changing ideas.
link |
02:30:59.640
So there you go.
link |
02:31:00.640
Yeah, yeah.
link |
02:31:01.640
You can.
link |
02:31:02.640
He's awesome.
link |
02:31:03.640
Yeah.
link |
02:31:04.640
There's a balance of course.
link |
02:31:05.640
Yeah, many moments of genius will not come from planning and goals, but you still
link |
02:31:11.660
have to build factories and you still have to manufacture and you still have to deliver
link |
02:31:15.200
and there's still deadlines and all that kind of stuff.
link |
02:31:17.280
And that for that, it's good to have goals.
link |
02:31:19.280
I do goal setting with my kids, we all have our goals, but I think we're starting
link |
02:31:25.200
to morph into more of these like bigger picture goals and not obsess about like, I don't know,
link |
02:31:30.800
it's hard.
link |
02:31:31.800
Well, I honestly think, especially with kids, it's better, much, much better to have
link |
02:31:34.840
a plan and have goals and so on, because you have to learn the muscle of
link |
02:31:38.400
what it feels like to get stuff done.
link |
02:31:40.480
Yeah.
link |
02:31:41.480
And once you learn that, there's flexibility. For me, I spent most of my life with
link |
02:31:46.720
goal setting and so on.
link |
02:31:48.040
So like I've gotten good with grades and school.
link |
02:31:50.560
I mean, school, if you want to be successful at school, yeah, I mean the kind of stuff
link |
02:31:55.040
in high school and college, the kids have to do in terms of managing their time and
link |
02:31:59.280
getting so much stuff done.
link |
02:32:01.160
It's like, you know, taking five, six, seven classes in college, that would
link |
02:32:06.500
break the spirit of most humans if they took even one of them later in life. It's really
link |
02:32:13.200
difficult stuff, especially engineering curricula.
link |
02:32:16.560
So I think you have to learn that skill, but once you learn it, maybe
link |
02:32:22.680
you can be a little bit on autopilot and use that momentum and then allow yourself to be
link |
02:32:27.280
lost in the flow of life.
link |
02:32:28.920
You know, just kind of... Also, I worked pretty hard to allow myself to have
link |
02:32:38.760
the freedom to do that.
link |
02:32:39.760
That's a tricky freedom to have, because a lot of people get lost
link |
02:32:44.600
in the rat race, and also, financially, whenever they get a raise,
link |
02:32:52.920
they'll get like a bigger house or something like this.
link |
02:32:55.680
So you're always trapped in this race. I put a lot of emphasis
link |
02:32:59.720
on living below my means, always.
link |
02:33:05.240
And so there's a lot of freedom to do whatever the heart desires, and that's a relief,
link |
02:33:12.240
but everyone has to decide what's the right thing for them.
link |
02:33:15.580
For some people having a lot of responsibilities, like a house they can barely afford or having
link |
02:33:21.560
a lot of kids, the responsibility side of that really helps them get their shit
link |
02:33:27.600
together.
link |
02:33:28.600
Like, all right, I need to be really focused. Some of the most successful people
link |
02:33:32.080
I know have kids and the kids bring out the best in them.
link |
02:33:34.760
They make them more productive, not less.
link |
02:33:36.720
Right, it's accountability.
link |
02:33:37.720
Yeah.
link |
02:33:38.720
It's an accountability thing, absolutely.
link |
02:33:39.720
And almost something to actually live and fight and work for, like having a family,
link |
02:33:45.400
it's fascinating to see because you would think kids would be a hit on productivity,
link |
02:33:49.520
but they're not. For a lot of really successful people, they're like an
link |
02:33:53.680
engine of.
link |
02:33:54.680
Right, efficiency.
link |
02:33:55.680
Oh my God.
link |
02:33:56.680
Yeah.
link |
02:33:57.680
Yeah.
link |
02:33:58.680
It's weird.
link |
02:33:59.680
Yeah.
link |
02:34:00.680
I mean, it's beautiful.
link |
02:34:01.680
It's beautiful to see.
link |
02:34:02.680
And also a source of happiness.
link |
02:34:03.680
Speaking of which, what role do you think love plays in the human condition, love?
link |
02:34:12.080
I think love is, yeah, I think it's why we're all here.
link |
02:34:19.880
I think it would be very hard to live life without love in any of its forms, right?
link |
02:34:26.640
Yeah, that's the most beautiful of forms that human connection takes, right?
link |
02:34:35.080
Yeah.
link |
02:34:36.080
And everybody wants to feel loved, right, in one way or another, right?
link |
02:34:42.200
And to love.
link |
02:34:43.200
Yeah.
link |
02:34:44.200
It feels good.
link |
02:34:45.200
And to love too, totally.
link |
02:34:46.200
Yeah, I agree with that.
link |
02:34:47.200
Both of it.
link |
02:34:48.200
Yeah.
link |
02:34:49.200
I'm not even sure what feels better.
link |
02:34:50.200
Both, both like that.
link |
02:34:51.200
Yeah, to receive and to give love too, yeah.
link |
02:34:54.760
And it is like we've been talking about an interesting question, whether some of that,
link |
02:34:59.680
whether one day we'll be able to love a toaster.
link |
02:35:02.640
Okay.
link |
02:35:03.640
In some small way.
link |
02:35:05.240
I wasn't quite thinking about that when I said like, yeah, like we all need love and
link |
02:35:10.320
give love.
link |
02:35:11.320
That's all I was thinking about.
link |
02:35:12.320
Okay.
link |
02:35:13.320
I was thinking about Brad Pitt and toasters.
link |
02:35:14.320
Okay, toasters, great.
link |
02:35:15.320
All right.
link |
02:35:16.320
Well, I think we started on love and ended on love.
link |
02:35:20.200
This was an incredible conversation, Rana.
link |
02:35:22.320
Thank you so much.
link |
02:35:23.320
Thank you.
link |
02:35:24.320
You're an incredible person.
link |
02:35:25.320
Thank you for everything you're doing in AI, in the space of just caring about humanity,
link |
02:35:32.960
caring about emotion, about love, and being an inspiration to a huge number of people
link |
02:35:38.160
in robotics, in AI, in science, in the world in general.
link |
02:35:42.320
So thank you for talking to me.
link |
02:35:43.320
It's an honor.
link |
02:35:44.320
Thank you for having me.
link |
02:35:45.320
And you know, I'm a big fan of yours as well.
link |
02:35:47.320
So it's been a pleasure.
link |
02:35:49.740
Thanks for listening to this conversation with Rana el Kaliouby.
link |
02:35:52.840
To support this podcast, please check out our sponsors in the description.
link |
02:35:56.940
And now let me leave you with some words from Helen Keller.
link |
02:36:00.680
The best and most beautiful things in the world cannot be seen or even touched.
link |
02:36:05.480
They must be felt with the heart.
link |
02:36:09.440
Thank you for listening and hope to see you next time.