
Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars | Lex Fridman Podcast #322



link |
00:00:00.000
there's a broader question here, right? As we build socially and emotionally intelligent machines,
link |
00:00:07.920
what does that mean about our relationship with them? And then, more broadly, our relationship
link |
00:00:12.480
with one another, right? Because this machine is going to be programmed to be amazing at empathy,
link |
00:00:18.160
by definition, right? It's going to always be there for you. It's not going to get bored.
link |
00:00:23.360
I don't know how I feel about that. I think about that a lot.
link |
00:00:25.680
The following is a conversation with Rana el Kaliouby, a pioneer in the field of emotion recognition
link |
00:00:34.640
and human-centric artificial intelligence. She is the founder of Affectiva, deputy CEO of Smart Eye,
link |
00:00:41.680
author of Girl Decoded, and one of the most brilliant, kind, inspiring, and fun human beings
link |
00:00:48.240
I've gotten the chance to talk to. This is the Lex Fridman podcast. To support it,
link |
00:00:53.360
please check out our sponsors in the description. And now, dear friends, here's Rana el Kaliouby.
link |
00:01:00.560
You grew up in the Middle East in Egypt. What is a memory from that time that makes you smile?
link |
00:01:06.240
Or maybe a memory that stands out as helping your mind take shape and helping you define
link |
00:01:11.440
yourself in this world? So the memory that stands out is we used to live in my grandma's house.
link |
00:01:17.440
She used to have these mango trees in her garden in the summer, and so mango season was like July
link |
00:01:23.280
and August. And so in the summer, she would invite all my aunts and uncles and cousins,
link |
00:01:28.160
and it was just like maybe there were 20 or 30 people in the house, and she would cook all this
link |
00:01:32.800
amazing food. And us, the kids, we would go down the garden, and we would pick all these mangoes.
link |
00:01:41.200
And I don't know. I think it's just bringing people together that always stuck with me, the warmth.
link |
00:01:46.960
Around the mango tree. Yeah, around the mango tree. And it's just the joy, the joy of being
link |
00:01:52.800
together around food. And I'm a terrible cook, so I guess that memory didn't translate to me
link |
00:02:01.280
kind of doing the same. I love hosting people. Do you remember colors, smells? Is that what?
link |
00:02:07.200
How does memory work? What do you visualize? Do you visualize people's faces, smiles? Is there
link |
00:02:14.320
colors? Is there like a theme to the colors? Is it smells because of food involved?
link |
00:02:21.120
Yeah, I think that's a great question. So those Egyptian mangoes, there's a particular type
link |
00:02:26.800
that I love, and it's called Darwesi mangoes, and they're kind of, you know, they're oval,
link |
00:02:31.120
and they have a little red in them. So they're red and mango colored on the outside. So I remember
link |
00:02:37.360
that. Does red indicate like extra sweetness? Is that right? Yes. That means like it's nice and ripe
link |
00:02:45.040
and stuff. Yeah. What's like a definitive food of Egypt? You know, there's like these almost
link |
00:02:52.720
stereotypical foods in different parts of the world, like Ukraine invented borscht. Borscht is this
link |
00:03:00.400
beet soup that you put sour cream on. If you know what it is, I think you know it's delicious,
link |
00:03:09.520
but if I explain it, it's just not going to sound delicious. I feel like beet soup. This
link |
00:03:14.400
doesn't make any sense, but that's kind of, and you probably have actually seen pictures of it,
link |
00:03:18.800
because it's one of the traditional foods in Ukraine, in Russia, in different parts of the
link |
00:03:23.760
Slavic world. But it's become so cliche and stereotypical that you almost don't mention it,
link |
00:03:30.800
but it's still delicious. When I visited Ukraine, I ate that every single day. Do you make it
link |
00:03:37.120
yourself? How hard is it to make? No, I don't know. I think to make it well, like anything,
link |
00:03:41.840
like Italians, they say, well, tomato sauce is easy to make, but to make it right,
link |
00:03:48.160
that's like a generational skill. So anyway, is there something like that in Egypt? Is there
link |
00:03:53.840
a culture of food? There is. And actually, we have a similar kind of soup. It's called molokhia,
link |
00:04:00.880
and it's made of this green plant. It's somewhere between spinach and kale,
link |
00:04:06.800
and you mince it, and then you cook it in chicken broth. And my grandma used to make it,
link |
00:04:11.920
and my mom makes it really well, and I try to make it, but it's not as great. So we used to
link |
00:04:16.800
have that, and then we used to have it alongside stuffed pigeons. I'm pescatarian now, so I don't
link |
00:04:21.840
eat that anymore, but... Stuffed pigeons. Yeah, it's like, it was really yummy. It's the one thing
link |
00:04:26.640
I miss about, you know, now that I'm pescatarian, and I don't eat... The stuffed pigeons? Yeah,
link |
00:04:33.280
the stuffed pigeons. Is it, what are they stuffed with? If that doesn't bother you too much to
link |
00:04:39.120
describe? No, no, it's stuffed with a lot of like just rice, and yeah, it's just rice. Yeah, so.
link |
00:04:46.000
And you also said in your book that your first computer was an Atari,
link |
00:04:52.560
and Space Invaders was your favorite game. Is that when you first fell in love with computers?
link |
00:04:58.000
Would you say? Yeah, I would say so. Video games or just the computer itself? Just something about
link |
00:05:03.200
the machine? Ooh, this thing... It's magic in here. Yeah, I think the magical moment is definitely
link |
00:05:10.480
like playing video games with my... I have two younger sisters, and we just like had fun together,
link |
00:05:15.520
like playing games. But the other memory I have is my first code. The first code I wrote, I wrote...
link |
00:05:23.840
I drew a Christmas tree, and I'm Muslim, right? So it's kind of... It was kind of funny that I...
link |
00:05:29.600
The first thing I did was like this Christmas tree, so yeah. And that's when I realized, wow,
link |
00:05:35.840
you can write code to do all sorts of like really cool stuff. I must have been like six or seven
link |
00:05:42.160
at the time. So you can write programs, and the programs do stuff for you. That's power.
link |
00:05:49.760
If you think about it, that's empowering. That's AI. Yeah, I know what it is. I don't know if that...
link |
00:05:55.120
See, like, I don't know if many people think of it that way when they first learned to program.
link |
00:06:00.000
They just love the puzzle of it. Like, oh, this is cool. This is pretty. It's a Christmas tree,
link |
00:06:04.400
but like, it's power. It is power. Like, you... Eventually, I guess you couldn't at the time,
link |
00:06:09.840
but eventually this thing, if it's interesting enough, if it's a pretty enough Christmas tree,
link |
00:06:14.640
it can be run by millions of people and bring them joy, like that little thing. And then,
link |
00:06:19.920
because it's digital, it's easy to spread. So like, you just created something that's
link |
00:06:24.560
easily spreadable to millions of people. Totally. It's hard to think that way when you're six.
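(An illustrative aside, not from the conversation: the first program she describes, drawing a Christmas tree, might look something like the minimal Python sketch below. Her actual first code was presumably written on the Atari in whatever language it came with; this is only a modern stand-in.)

def christmas_tree(height=7):
    # Print a centered triangle of asterisks, one row per level of the tree.
    for row in range(height):
        stars = 2 * row + 1
        print(" " * (height - row - 1) + "*" * stars)
    # Add a simple one-character trunk under the tree.
    print(" " * (height - 1) + "|")

christmas_tree()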
link |
00:06:30.800
In the book, you write, I am who I am because I was raised by a particular set of parents,
link |
00:06:37.040
both modern and conservative, forward thinking and yet locked in tradition. I'm a Muslim,
link |
00:06:43.120
and I feel I'm stronger and more centered for it. I adhere to the values of my religion,
link |
00:06:48.000
even if I'm not as dutiful as I once was. And I am a new American, and I'm thriving on the
link |
00:06:54.640
energy, vitality, and entrepreneurial spirit of this great country. So let me ask you about your
link |
00:07:00.880
parents. What have you learned about life from them, especially when you were young?
link |
00:07:05.280
So both my parents, they're Egyptian, but they moved to Kuwait right out. Actually,
link |
00:07:10.160
there's a cute story about how they met. So my dad taught COBOL in the 70s. Nice.
link |
00:07:15.600
And my mom decided to learn programming. So she signed up to take his COBOL programming class.
link |
00:07:22.320
And he tried to date her, and she was like, no, no, no, I don't date. And so he's like,
link |
00:07:27.200
okay, I'll propose. And that's how they got married. Whoa, strong move.
link |
00:07:30.960
Right, exactly right. That's really impressive. So those, those COBOL guys know how to, how to
link |
00:07:38.160
impress a lady. So, so yeah, so what have you learned from them?
link |
00:07:43.520
So definitely grit. One of the core values in our family is just hard work. There were no
link |
00:07:48.880
slackers in our family. And that's something I've definitely, that's definitely stayed with me,
link |
00:07:54.160
both, both as a professional, but also in my personal life. But I also think my mom, my mom
link |
00:08:01.920
always used to like, I don't know, it was like unconditional love. Like I just knew my parents
link |
00:08:08.240
would be there for me, kind of regardless of what I chose to do. And I think that's very powerful.
link |
00:08:15.520
And they got tested on it because I kind of challenged, you know, I challenge cultural norms,
link |
00:08:20.160
and I kind of took a different path, I guess, than what's expected of, you know, a woman in
link |
00:08:26.720
the Middle East. And then they, and I, you know, they still love me, which is, which is, I'm so
link |
00:08:31.600
grateful for that. When was like a moment that was the most challenging for them? Which moment
link |
00:08:36.880
were they kind of, they had to come face to face with the fact that you're a bit of a rebel?
link |
00:08:42.880
I think the first big moment was when I had just gotten married, but I decided to go do my PhD at
link |
00:08:52.800
Cambridge University. And because my husband at the time, he's now my ex, ran a company in Cairo,
link |
00:09:00.080
he was going to stay in Egypt. So it was going to be a long distance relationship.
link |
00:09:04.080
And that's very unusual in the Middle East for a woman to just head out and kind of,
link |
00:09:09.040
you know, pursue her career. And so my dad actually, my dad and my parents in law both said,
link |
00:09:16.880
you know, we do not approve of you doing this, but now you're under the jurisdiction of your
link |
00:09:21.760
husband, so he can make the call. And luckily for me, he was supportive. He said, you know,
link |
00:09:29.600
this is your dream come true. You've always wanted to do a PhD. I'm going to support you.
link |
00:09:33.840
So I think that was the first time where, you know, I challenged the cultural norms.
link |
00:09:41.040
Was that scary? Oh my God, yes. It was totally scary.
link |
00:09:44.640
What's the biggest culture shock from there to Cambridge, to London?
link |
00:09:52.720
Well, that was also during right around September 11th. So everyone thought that there was going
link |
00:09:58.880
to be a third world war. And I, at the time, I used to wear the hijab. So I was very visibly
link |
00:10:06.720
Muslim. And so my parents just were, they were afraid for my safety. But anyways, when I got
link |
00:10:12.800
to Cambridge, because I was so scared, I decided to take off my headscarf and wear a hat instead.
link |
00:10:17.440
So I just went to class wearing this like British hat, which was, in my opinion, actually worse
link |
00:10:22.640
than just showing up in a headscarf. Because it was just so awkward, right? Like fitting in class
link |
00:10:26.880
with like all these. Trying to fit in. Yeah. So after a few weeks of doing that, I was like,
link |
00:10:33.840
to heck with that, I'm just going to go back to wearing my headscarf.
link |
00:10:37.200
Yeah, you wore the hijab. So starting in 2000 and for 12 years after. So it's always whenever
link |
00:10:46.240
you're in public, you have to wear the head covering. Can you speak to that, the hijab,
link |
00:10:52.160
maybe your mixed feelings about it? Like what does it represent in its best case?
link |
00:10:56.480
What does it represent in the worst case? Yeah. You know, I think there's a lot of,
link |
00:11:02.320
I guess I'll first start by saying I wore it voluntarily. I was not forced to wear it.
link |
00:11:06.800
And in fact, I was one of the very first women in my family to decide to put on the hijab.
link |
00:11:12.080
And my family thought it was really odd, right? Like there was, they were like,
link |
00:11:16.480
why do you want to put this on? And at its best, it's the sign of modesty, humility.
link |
00:11:22.560
Yeah. It's like me wearing a suit. People are like, why are you wearing a suit? It's a step back
link |
00:11:28.320
into some kind of tradition, a respect for tradition of sorts. So you said because it's
link |
00:11:34.160
by choice, you're kind of free to make that choice to celebrate a tradition of modesty.
link |
00:11:39.920
Exactly. And I actually like made it my own. I remember I would really match the color of my
link |
00:11:45.920
headscarf with what I was wearing. Like it was a form of self expression. And at its best,
link |
00:11:51.600
I loved wearing it. You know, I have a lot of questions around how we practice religion,
link |
00:11:57.360
and, you know, I think also it was a time where I was spending a lot of time
link |
00:12:03.680
going back and forth between the US and Egypt. And I started meeting a lot of people in the US
link |
00:12:08.400
who are just amazing people, very purpose driven, people who have very strong core values,
link |
00:12:15.280
but they're not Muslim. That's okay, right? And so that was when I just had a lot of questions.
link |
00:12:22.000
And politically, also the situation in Egypt was when the Muslim Brotherhood ran the country and I
link |
00:12:27.360
didn't agree with their ideology. It was at a time when I was going through a divorce. Like it was
link |
00:12:34.400
like just the perfect storm of political and personal conditions where I was like, this doesn't
link |
00:12:40.160
feel like me anymore. And it took a lot of courage to take it off because culturally, it's not,
link |
00:12:46.320
it's okay if you don't wear it, but it's really not okay to wear it and then take it off.
link |
00:12:51.920
But you're still, so you have to do that while still maintaining a deep core and pride in the
link |
00:12:59.120
origins, in your origin story. Totally. So still being Egyptian, still being a Muslim.
link |
00:13:06.720
Right. And being, I think generally like faith driven, but yeah.
link |
00:13:14.240
But what that means changes year by year for you. It's like a personal journey.
link |
00:13:18.880
Yeah, exactly. What would you say is the role of faith in that part of the world?
link |
00:13:23.920
Like how do you say, you mentioned it a bit in the book too.
link |
00:13:27.520
Yeah. I mean, I think there is something really
link |
00:13:31.040
powerful about just believing that there's a bigger force. You know, there's a kind of
link |
00:13:38.160
surrendering, I guess, that comes with religion and you surrender and you have this deep conviction
link |
00:13:43.280
that it's going to be okay. Right? Like the universe is out to like do amazing things for you
link |
00:13:47.920
and it's going to be okay. And there's strength to that. Like even when you're going through adversity,
link |
00:13:54.960
you just know that it's going to work out. Yeah, it gives you like an inner peace, a calmness.
link |
00:14:00.000
Exactly. Exactly. Yeah. It's faith in all the meanings of that word.
link |
00:14:05.600
Right. Faith that everything is going to be okay. And it is because time passes
link |
00:14:10.800
and time cures all things. It's like a calmness with the chaos of the world. Yeah.
link |
00:14:16.320
And also there's like a silver lining, I'm a true believer in this, that something at a specific
link |
00:14:22.400
moment in time can look like it's catastrophic and it's not what you wanted in life. But then
link |
00:14:29.360
time passes and then you look back and there's a silver lining, right? Maybe it closed a door,
link |
00:14:34.720
but it opened a new door for you. And so I'm a true believer in that, that, you know,
link |
00:14:39.840
there's a silver lining in almost anything in life, you just have to have this like,
link |
00:14:45.680
have faith or conviction that it's going to work out. So,
link |
00:14:48.240
Such a beautiful way to see a shady feeling. So if you feel shady about your
link |
00:14:52.560
current situation, I mean, it's almost always true, unless it's the cliche thing of whatever doesn't
link |
00:15:03.440
kill you makes you stronger. It does seem that over time,
link |
00:15:08.960
when you take a perspective on things that the hardest moments and periods of your life are
link |
00:15:15.760
the most meaningful. Yeah. Yeah. So over time you get to have that perspective. Right. What,
link |
00:15:24.000
what about, because you mentioned Kuwait, what about, let me ask you about war. What's the
link |
00:15:31.840
role of war and peace? Maybe even the big love and hate in that part of the world, because it
link |
00:15:38.080
does seem to be a part of the world where there's turmoil. There was turmoil, there's still turmoil.
link |
00:15:43.680
It is so unfortunate, honestly. It's, it's such a waste of human resources and, and,
link |
00:15:53.200
yeah, and human mind share. I mean, and at the end of the day, we all kind of want the same
link |
00:15:57.840
things. We want, you know, we want a human connection, we want joy, we want to feel fulfilled,
link |
00:16:03.280
we want to feel, you know, a life of purpose. And I just, I just find it baffling, honestly,
link |
00:16:11.040
that we are still having to grapple with that. I have a story to share about this. You know,
link |
00:16:17.040
I grew up in Egypt, I'm Egyptian, American now, but, but, you know, originally from Egypt.
link |
00:16:22.800
And when I first got to Cambridge, it turned out my office mate, like my PhD kind of,
link |
00:16:28.880
you know, she ended up, you know, we ended up becoming friends, but she was from Israel.
link |
00:16:33.920
And we didn't know, yeah, we didn't know how it was going to be like.
link |
00:16:37.280
Did you guys sit there just staring at each other for a bit?
link |
00:16:42.400
Actually, she, because I arrived before she did. And it turns out she emailed our PhD advisor
link |
00:16:49.600
and asked him if he thought it was going to be okay. Yeah. Oh, this is around 9/11 too.
link |
00:16:55.040
Yeah. And, and Peter, Peter Robinson, our PhD advisor was like, yeah, like,
link |
00:17:00.400
this is an academic institution, just show up. And we became super good friends. We were both
link |
00:17:05.120
um, new moms, like we both had our kids during our PhD. We were both doing artificial emotional
link |
00:17:09.600
intelligence. She was looking at speech. I was looking at the face. We just had so much in common, the culture
link |
00:17:14.400
was so similar. Our jokes were similar. It was just, I was like, why on earth are our countries?
link |
00:17:21.920
Why is there all this like war and tension? And I think it falls back to the narrative, right?
link |
00:17:26.160
If you change the narrative, like whoever creates this narrative of war, I don't know,
link |
00:17:31.360
I don't know. We should have women run the world. Yeah, that's one solution.
link |
00:17:37.520
The good women, because there's also evil women in the world. True, true. Okay.
link |
00:17:43.920
But yes, yes, there could be less war if women ran the world. The other aspect is,
link |
00:17:50.160
doesn't matter the gender, the people in power. You know, I get to see this with Ukraine
link |
00:17:56.400
and Russia, different parts of the world around that conflict now. And that's happening in Yemen
link |
00:18:02.480
as well, and everywhere else. There's these narratives told by the leaders to the populace.
link |
00:18:09.840
And those narratives take hold and everybody believes that and they have a distorted view
link |
00:18:15.520
of the humanity on the other side. In fact, especially during war, you don't even see
link |
00:18:20.960
the people on the other side as human, or as of equal intelligence, or worth, or value,
link |
00:18:29.680
as you. You tell all kinds of narratives about them being Nazis, or dumb, or whatever,
link |
00:18:38.800
whatever narrative you want to weave around that, or evil. But I think when you actually meet them
link |
00:18:46.320
face to face, you realize they're like the same. Exactly, right? It's actually a big shock for
link |
00:18:52.560
people to realize, like, that they've been essentially lied to within their country.
link |
00:19:00.000
And I kind of have faith that social media, as ridiculous as it is to say, or any kind of technology,
link |
00:19:07.280
is able to bypass the walls that governments put up, and connect people directly. And then
link |
00:19:14.240
you get to realize, like, people fall in love across different nations, and religions, and so
link |
00:19:21.280
on. And that I think ultimately can cure a lot of our ills, especially sort of in person. Just I
link |
00:19:26.880
also think that if leaders met in person, to have a conversation that would cure a lot of ills of
link |
00:19:34.000
the world, especially in private. Let me ask you about the women running, running the world.
link |
00:19:41.280
Okay. So gender does in part, perhaps shape the landscape of just our human experience.
link |
00:19:51.040
So in what ways was it limiting and in what ways was it empowering for you to be a woman in the
link |
00:19:57.920
Middle East? I think just kind of just going back to like my comment on like women running the world,
link |
00:20:03.120
I think it comes back to empathy, right, which has been a common thread throughout my
link |
00:20:07.920
my entire career. And it's this idea of human connection. Once you build common ground with a
link |
00:20:14.240
person or a group of people, you build trust, you build loyalty, you build friendship, and
link |
00:20:20.640
then you can turn that into like behavior change and motivation and persuasion. So it's like
link |
00:20:25.840
empathy and emotions are just at the center of everything we do. And I think, being
link |
00:20:33.920
from the Middle East, kind of this human connection is very strong. Like we have this running joke that
link |
00:20:40.560
if you come to Egypt for a visit, people will know everything about your life
link |
00:20:45.680
like right away, right? I have no problems asking you about your personal life. There's no like no
link |
00:20:51.520
boundaries really, no personal boundaries in terms of getting to know people. We get emotionally
link |
00:20:55.920
intimate like very, very quickly. But I think people just get to know each other like,
link |
00:21:00.560
authentically, I guess, you know, there isn't this like superficial level of getting to know people,
link |
00:21:06.240
you just try to get to know people really deeply. Is empathy part of that? Totally. Because you can put yourself
link |
00:21:10.960
in this person's shoes and kind of, yeah, imagine, you know, what challenges they're going through.
link |
00:21:17.760
And so I think I've definitely taken that with me. Generosity is another one too, like just being
link |
00:21:25.280
generous with your time and love and attention and even with your wealth, right?
link |
00:21:32.400
Even if you don't have a lot of it, you're still very generous. And I think that's another
link |
00:21:36.640
enjoying the humanity of other people. And so you think there's a useful difference between men
link |
00:21:43.360
and women in that aspect, in empathy? Or does thinking in these kinds of big general groups hinder
link |
00:21:55.520
progress? Yeah, I actually don't want to over generalize. I mean, some of the men I know
link |
00:22:01.760
are like the most empathetic humans. Yeah, I strive to be. Yeah, you're actually very empathetic.
link |
00:22:07.040
Yeah, so I don't want to over generalize. Although one of the researchers I worked with
link |
00:22:16.960
when I was at Cambridge, Professor Simon Baron-Cohen, he's Sacha Baron Cohen's cousin. Yeah.
link |
00:22:23.280
And he runs the autism research center at Cambridge. And he's written multiple books
link |
00:22:30.000
on autism. And one of his theories is the empathy scale, like the systemisers and
link |
00:22:35.280
the empathizers. And there's a disproportionate number of computer scientists and engineers who
link |
00:22:43.360
are systemisers, and perhaps not great empathizers. And then, you know, there's more men
link |
00:22:51.760
in that bucket, I guess, than women. And then there's more women in the empathizers bucket.
link |
00:22:56.560
So again, not to over generalize. I sometimes wonder about that. It's been frustrating to me
link |
00:23:02.080
how many, I guess, systemisers there are in the field of robotics. Yeah, it's actually encouraging
link |
00:23:07.840
to me because I care about, obviously, social robotics. And because of that, there's more
link |
00:23:15.440
opportunity for people that are empathic. Exactly. I totally agree. Well, right. So it's nice. Yes.
link |
00:23:22.160
So every roboticist I talk to, they don't see the human as interesting as
link |
00:23:26.240
like it is. It's not exciting. You want to avoid the human at all costs. It's a safety concern
link |
00:23:33.840
to be touching the human, which it is. But it is also an opportunity for a deep connection
link |
00:23:41.040
or collaboration or all that kind of stuff. And because most brilliant roboticists
link |
00:23:46.160
don't care about the human, it's an opportunity, right? In your case, it's a business opportunity,
link |
00:23:51.120
and in general an opportunity to explore those ideas. So in this beautiful journey to Cambridge,
link |
00:23:59.200
to, you know, the UK, and then to America, what's the moment or moments that were
link |
00:24:04.960
most transformational for you as a scientist and as a leader. So you became an exceptionally
link |
00:24:10.960
successful CEO, founder, researcher, scientist, and so on. Was there a phase shift there where
link |
00:24:21.920
like, I can be somebody, I can really do something in this world.
link |
00:24:26.640
Yeah, so I actually just kind of a little bit of background. So the reason why I moved from
link |
00:24:32.480
Cairo to Cambridge UK to do my PhD is because I had a very clear career plan. I was like,
link |
00:24:38.320
okay, I'll go abroad, get my PhD, gonna crush it in three or four years, come back to Egypt and
link |
00:24:44.480
teach. It was very clear, very well laid out. Was topic clear or no? The topic. Well, I did,
link |
00:24:51.120
I did my PhD around building artificial emotional intelligence and looking at the
link |
00:24:54.960
In your master plan ahead of time, when you're sitting by the mango tree, did you know
link |
00:24:59.040
it's going to be artificial intelligence? No, no, no, that I did not know. Although I think I kind
link |
00:25:03.920
of knew that I was going to be doing computer science, but I didn't know the specific area.
link |
00:25:10.160
But I love teaching. I mean, I still love teaching. So I just, yeah, I just wanted to go abroad,
link |
00:25:16.320
get a PhD, come back, teach. Why computer science? Can we just linger on that? Because you're such an
link |
00:25:22.240
empathic person who cares about emotion and humans and so on. Aren't computers cold and
link |
00:25:28.240
emotionless? We're changing that. Yeah, I know. But like, isn't that the... or did you see computers
link |
00:25:37.360
as having the capability to actually connect with humans? I think that was like my takeaway
link |
00:25:44.560
from my experience just growing up, like computers sit at the center of how we connect and communicate
link |
00:25:49.520
with one another, right? Or technology in general, like I remember my first experience being away
link |
00:25:54.560
from my parents, we communicated with a fax machine. But thank goodness for the fax machine,
link |
00:25:59.040
because we could send letters back and forth to each other. This was pre emails and stuff.
link |
00:26:04.720
So I think, I think there's, I think technology can be not just transformative in terms of
link |
00:26:10.320
productivity, etc. It actually does change how we connect with one another. And
link |
00:26:15.520
Can I just defend the fax machine? There's something about the haptic feel. The email
link |
00:26:22.400
is all digital. There's something really nice. I still write letters to people.
link |
00:26:27.120
There's something nice about the haptic aspect of the fax machine, because you still have to press,
link |
00:26:31.680
you still have to do something in the physical world to make this thing a reality,
link |
00:26:36.160
in a sense, right? And then it like comes out as a printout and you can actually touch it and read
link |
00:26:40.320
it. Yeah, there's something, there's something lost when it's just an email. Obviously, I wonder
link |
00:26:47.600
how we can regain some of that in the digital world, which goes to the metaverse and all those
link |
00:26:53.120
kinds of things. We'll talk about it anyway. So actually, a question on that one: do you
link |
00:26:58.080
still, do you have photo albums anymore? Do you still print photos? No, no, but I'm a minimalist.
link |
00:27:06.240
Okay. So one of the painful steps in my life was to scan all the photos
link |
00:27:11.520
and let go of them and then let go of all my books. You let go of your books? Yeah, switched
link |
00:27:18.320
to Kindle, everything Kindle. So I thought, I thought, okay, think 30 years from now, nobody's
link |
00:27:28.240
going to have books anymore. The technology of digital books is going to get better and better
link |
00:27:31.680
and better. Are you really going to be the guy that's still romanticizing physical books? Are
link |
00:27:36.320
you going to be the old man on the porch who was like, yes. So just get used to it because it felt,
link |
00:27:42.880
it still feels a little bit uncomfortable to read on a Kindle, but get used to it. I mean,
link |
00:27:50.720
I'm trying to learn new programming languages always. With technology, you have to kind of
link |
00:27:54.880
challenge yourself to adapt to it. I force myself to use TikTok now. That thing doesn't
link |
00:28:00.480
need much forcing. It pulls you in like the worst kind of, or the best kind of drug.
link |
00:28:05.920
Anyway, yeah. But I do love haptic things. There's a magic to the haptic. Even like touch
link |
00:28:14.160
screens, it's tricky to get right, to get the experience of a button. Anyway, what were we
link |
00:28:23.360
talking about? So AI, so the journey, your whole plan was to come back to Cairo and teach.
link |
00:28:29.680
Right. And then where did the plan go wrong? Yeah, exactly. Right. And then I got to Cambridge
link |
00:28:36.560
and I fell in love with the idea of research and kind of embarking on a path. Nobody's
link |
00:28:42.560
explored this path before. You're building stuff that nobody's built before and it's
link |
00:28:45.840
challenging and it's hard and there's a lot of nonbelievers. I just totally love that.
link |
00:28:50.960
And at the end of my PhD, I think it's the meeting that changed the trajectory of my life.
link |
00:28:56.320
Professor Rosalind Picard, who runs the Affective Computing group at the MIT Media Lab.
link |
00:29:02.160
I had read her book. I was like following all her research.
link |
00:29:07.520
AKA Roz. Yes, AKA Roz. And she was giving a talk at a pattern recognition conference in Cambridge
link |
00:29:16.320
and she had a couple of hours to kill. So she emailed the lab and she said,
link |
00:29:19.680
you know, if any students want to meet with me, just sign up here. And so I signed up for a slot
link |
00:29:26.960
and I spent like the weeks leading up to it preparing for this meeting.
link |
00:29:30.880
And I wanted to show her a demo of my research and everything. And we met and we ended up
link |
00:29:36.000
hitting it off like we totally clicked. And at the end of the meeting, she said,
link |
00:29:40.640
do you want to come work with me as a postdoc at MIT? And this is what I told her. I was like,
link |
00:29:45.920
okay, this would be a dream come true, but there's a husband waiting for me in Cairo.
link |
00:29:49.760
I kind of have to go back. And she said, it's fine. Just commute. And I literally started
link |
00:29:55.600
commuting between Cairo and Boston. Yeah, it was, it was a long commute. And I didn't,
link |
00:30:01.680
I did that like every few weeks, I would, you know, hop on a plane and go to Boston.
link |
00:30:06.320
But that, that changed the trajectory of my life. There was no,
link |
00:30:09.120
I kind of outgrew my dreams, right? I didn't want to go back to Egypt anymore and be faculty.
link |
00:30:16.720
Like that was no longer my dream. I had a new dream. What was the, what was it like to be at MIT?
link |
00:30:22.480
What was that culture shock? You mean America in general, but also
link |
00:30:28.560
I mean Cambridge is its own culture. So what was MIT like? And what was America like?
link |
00:30:34.000
I think, I wonder if that's similar to your experience at MIT. I was just
link |
00:30:41.200
at the Media Lab in particular, I was just, really impressed is not the right word.
link |
00:30:47.040
I didn't expect the openness to like innovation and the acceptance of taking a risk and failing.
link |
00:30:55.760
Like failure isn't really accepted back in Egypt, right? You don't want to fail.
link |
00:30:59.760
Like there's a fear of failure, which I think has been hardwired in my brain.
link |
00:31:04.880
But you get to MIT and it's okay to start things. And if they don't work out,
link |
00:31:08.400
like it's okay, you pivot to another idea. And that kind of thinking was just very new to me.
link |
00:31:13.680
That's liberating. The Media Lab, for people who don't know, MIT Media Lab is its own beautiful thing
link |
00:31:19.440
because they, I think more than other places at MIT reach for big ideas and like they try,
link |
00:31:26.320
I mean, I think, depending of course on who, but certainly with Rosalind, you try
link |
00:31:31.760
wild stuff, you try big things and crazy things and also try to take things to completion so
link |
00:31:38.560
you can demo them. So always, always, always have a demo. Like if you go, one of the sad things to
link |
00:31:45.760
me about robotics labs at MIT and there's like over 30, I think, is like usually when you show up
link |
00:31:52.720
to a robotics lab, there's not a single working robot. They're all broken. All the robots are broken,
link |
00:31:58.800
which is like the normal state of things because you're working on them. But it would be nice
link |
00:32:03.200
if we lived in a world where robotics labs had some robots functioning. One of my like favorite
link |
00:32:10.320
moments that just sticks with me, I visited Boston Dynamics and there was a, first of all,
link |
00:32:16.320
seeing so many spots, so many legged robots in one place. I'm like, I'm home. But this is where
link |
00:32:27.200
I was built. The cool thing was just to see there was this random Spot robot walking down the hall.
link |
00:32:35.280
It's probably doing mapping, but it looked like he wasn't doing anything and he was wearing key or
link |
00:32:39.440
sheet. I don't know. In my mind, they're people. They have a backstory, but this one in particular
link |
00:32:47.280
definitely has a backstory because he was wearing a cowboy hat. So I just saw a spot robot with a
link |
00:32:53.280
cowboy hat walking down the hall and there was just this feeling like there's a life. He has a life.
link |
00:33:01.360
He probably has to commute back to his family at night. There's a feeling like there's life
link |
00:33:07.600
instilled in this robot. And that's magical. I don't know. It was kind of inspiring to see.
link |
00:33:12.080
Did he say hello to you? No, it's very focused. There's a focused nature to the robot. No,
link |
00:33:18.160
no, listen, I love competence and focus and great. Like he was not going to get distracted by the
link |
00:33:24.480
shallowness of small talk. There's a job to be done and he was doing it. So anyway, the fact
link |
00:33:31.200
that it was working is a beautiful thing. And I think Media Lab really prides itself on trying to
link |
00:33:36.400
always have a thing that's working that you could show off. Yes, we used to call it demo or die.
link |
00:33:43.040
Yeah, you could not show up with PowerPoint or something. You actually had to have it working.
link |
00:33:47.440
You know what? My son, who is now 13, I don't know if this is still his lifelong goal or not,
link |
00:33:53.280
but when he was a little younger, his dream was to build an island that's just inhabited by robots,
link |
00:33:59.200
like no humans. He just wants all these robots to be connecting and having fun.
link |
00:34:02.720
And there you go. Does he have an idea of which robots he loves most? Is it
link |
00:34:10.640
Roomba-like robots? Is it humanoid robots, robot dogs, or is it not clear yet?
link |
00:34:16.800
We used to have a Jibo, which was one of the MIT Media Lab spinouts. And he used to love Jibo.
link |
00:34:22.240
The thing with a giant head. Yes, it spins. Right, exactly.
link |
00:34:25.520
You can rotate. And it's an eye. Not glowing. Right, exactly. It's like HAL 9000, but the
link |
00:34:33.360
friendly version. He loved that. And then he just loves, I think he loves all forms of robots,
link |
00:34:43.680
actually. So it embodies intelligence. Yes. I personally like elegant robots, especially.
link |
00:34:50.480
Anything that can wiggle its butt. No, that's not the definition of what I love. But that's
link |
00:34:59.760
just technically what I've been working on recently, except I have a bunch of legged
link |
00:35:03.280
robots now in Austin. And I've been doing, I've been trying to have them communicate
link |
00:35:09.360
affection with their body in different ways, just for art. For art, really. Because I love the idea
link |
00:35:16.240
of walking around with robots, as you would with a dog. I think it's inspiring to a lot of people,
link |
00:35:22.000
especially young people. Kids love robots. Kids love it. Parents, adults are scared of robots,
link |
00:35:28.640
but kids don't have this kind of weird construction of the world that's full of evil. They love
link |
00:35:33.760
cool things. Yeah. I remember when Adam was in first grade, so he must have been like seven or
link |
00:35:39.760
so. I went into his class with a whole bunch of robots and the Emotion AI demo and I asked
link |
00:35:46.160
the kids, I was like, do you, would you kids want to have a robot, you know, robot friend or robot
link |
00:35:53.120
companion? Everybody said yes. And they wanted it for all sorts of things, like to help them with
link |
00:35:57.760
their math homework and to like be a friend. So there's, it just struck me how there was no fear
link |
00:36:04.480
of robots. Whereas a lot of adults have that, like, us versus them. Yeah, none of that. Of course,
link |
00:36:12.240
you want to be very careful because you still have to look at the lessons of history and how
link |
00:36:17.680
robots can be used by the power centers of the world to abuse your rights and all that kind of
link |
00:36:22.240
stuff. But mostly it's good to enter anything new with an excitement and optimism. Speaking of
link |
00:36:30.960
Roz, what have you learned about science and life from Rosalind Picard? Oh my God, I've learned so
link |
00:36:36.400
many things about life from Roz. I think the thing I learned the most is perseverance.
link |
00:36:47.600
When I first met Roz, when she invited me to be her postdoc, we applied for a grant to the
link |
00:36:53.280
National Science Foundation to apply some of our research to autism and we got back, we were
link |
00:37:01.040
rejected. The first time you were rejected for funding. Yeah, it was, and I basically, I just took
link |
00:37:08.400
the rejection to mean, okay, we're rejected. It's done like end of story, right? And Roz was like,
link |
00:37:14.160
it's great news. They love the idea. They just don't think we can do it. So let's build it,
link |
00:37:19.520
show them and then reapply. Oh my God, that story totally stuck with me. And she's like that in
link |
00:37:28.560
every aspect of her life. She just does not take no for an answer. To reframe all negative feedback
link |
00:37:35.360
as a challenge. Yes, they liked this. Yeah, it was, it was a riot.
link |
00:37:43.200
What else about science in general, about how you see computers and
link |
00:37:48.960
also business and just everything about the world. She's a very powerful, brilliant woman
link |
00:37:54.240
like yourself. So is there some aspect of that too? Yeah, I think Roz is actually also very
link |
00:37:59.440
faith driven. She has this like deep belief and conviction. Yeah, and in the good in the world
link |
00:38:06.400
and humanity. And I think that was meeting her and her family was definitely like a defining
link |
00:38:12.880
moment for me because that was when I was like, wow, like, you can be of a different background
link |
00:38:17.360
and religion and whatever. And you can still have the same core values. So that was, that was, yeah.
link |
00:38:26.800
I'm grateful to her. So Roz, if you're listening, thank you. Yeah, she's great. She's been on this
link |
00:38:31.760
podcast before. I hope she'll be on, I'm sure she'll be on again. You were the founder and CEO of
link |
00:38:40.720
Affectiva, which is a big company that was acquired by another big company, Smart Eye. And you're now
link |
00:38:47.520
the deputy CEO of Smart Eye. So you're a powerful leader, you're a brilliant scientist.
link |
00:38:53.520
A lot of people are inspired by you. What advice would you give, especially to young women, but
link |
00:38:58.880
people in general who dream of becoming powerful leaders like yourself in a world where perhaps,
link |
00:39:04.560
in a world that perhaps doesn't give them a clear, easy path to do so, whether we're talking about
link |
00:39:16.800
Egypt or elsewhere? You know, hearing you describe me that way kind of encapsulates,
link |
00:39:27.040
I think, what I think is the biggest challenge of all, which is believing in yourself, right?
link |
00:39:31.280
I have had to like grapple with this, what I call now the Debbie Downer voice in my head.
link |
00:39:39.360
The kind of basically, it's just chattering all the time. It's basically saying, oh, no, no, no,
link |
00:39:44.240
you can't do this. Like, you're not going to raise money. You can't start a company. Like,
link |
00:39:47.440
what business do you have, like, starting a company or running a company or selling a company?
link |
00:39:51.200
Like, you name it. It's always like, and I think my biggest advice to not just women,
link |
00:39:59.040
but people who have, who are taking a new path and, you know, they're not sure,
link |
00:40:05.760
is to not let yourself, to not let your thoughts, be the biggest obstacle in your way.
link |
00:40:10.880
And I've had to like really work on myself to not be my own biggest obstacle.
link |
00:40:17.920
So you got that negative voice?
link |
00:40:19.840
Yeah.
link |
00:40:20.720
So is that?
link |
00:40:22.160
Am I the only one? I don't think I'm the only one.
link |
00:40:24.240
No, I have that negative voice. I'm not exactly sure if it's a bad thing or a good thing. I've
link |
00:40:30.960
been really torn about it because it's been a lifelong companion. It's hard to know.
link |
00:40:38.800
It's kind of a, it drives productivity and progress, but it can hold you back from
link |
00:40:45.440
taking big leaps. I think you, the best I can say is probably you have to somehow
link |
00:40:52.720
be able to control it. So turn it off when it's not useful and turn it on when it's useful.
link |
00:41:00.240
Like I have from almost like a third person perspective.
link |
00:41:02.800
Right. Somebody who's sitting there like.
link |
00:41:04.720
Yeah. Like because it is useful to, to be critical. Like after,
link |
00:41:12.800
like I just gave a talk yesterday at MIT and I was just, you know, there's so much love and it
link |
00:41:20.800
was such an incredible experience. So many amazing people I got a chance to talk to. But
link |
00:41:26.160
you know, afterwards when I, when I went home and just took this long walk, it was mostly
link |
00:41:32.000
just negative thoughts about me. Like, one is basic stuff, like I don't deserve any of it.
link |
00:41:38.800
And second is like, like, why did you, that was so dumb that you said this, that's so dumb.
link |
00:41:44.880
Like you got, you should have prepared that better. Why did you say this?
link |
00:41:48.640
But I think it's good to hear that voice out. All right. And like sit in that.
link |
00:41:56.160
And ultimately, I think you grow from that. Now, when you're making really big decisions about
link |
00:42:00.640
funding or starting a company or taking a leap to go to the UK or take a leap to go to America,
link |
00:42:09.280
to work at the Media Lab, though, yeah, you should be able to shut that off then
link |
00:42:18.960
because you should have like this weird confidence, almost like faith that you said before that
link |
00:42:25.440
everything's going to work out to take the leap of faith. Despite all the negativity. I mean,
link |
00:42:33.040
there's, there's some of that. You actually tweeted a really nice tweet thread. It says, quote,
link |
00:42:40.640
a year ago, a friend recommended I do daily affirmations and I was skeptical. But I was
link |
00:42:47.600
going through major transitions in my life. So I gave it a shot and it set me on a journey of
link |
00:42:51.920
self acceptance and self love. So what was that like? Maybe talk through this idea of
link |
00:42:59.280
affirmations and how that helped you? Yeah, because really, like, I'm just like me, I'm a kind,
link |
00:43:05.520
I'd like to think of myself as a kind person in general, but I'm kind of mean to myself sometimes.
link |
00:43:11.280
And so I've been doing journaling for almost 10 years now. I use an app called Day One and
link |
00:43:18.400
it's awesome. I just journal and I use it as an opportunity to almost have a conversation with
link |
00:43:22.480
the Debbie Downer voice in my head. It's like a rebuttal, right? Like Debbie Downer says, oh,
link |
00:43:27.040
my God, like you, you know, you won't be able to raise this round of funding. I'm like, okay,
link |
00:43:30.880
let's talk about it. I have a track record of doing X, Y and Z. I think I can do this.
link |
00:43:37.200
And it's, it's literally like, so I wouldn't, I don't know that I can shut off the voice,
link |
00:43:42.320
but I can have a conversation with it. And it just, it just, and I bring data to the table, right?
link |
00:43:49.840
Nice. So, so that was the journaling part, which I found very helpful.
link |
00:43:53.200
But the affirmations took it to a whole next level and I just love it. I'm a year
link |
00:43:59.120
into doing this. And I literally wake up in the morning and the first thing I do,
link |
00:44:04.240
I meditate first. And then, and then I write my affirmations and it's, it's the energy I want to
link |
00:44:09.600
put out in the world that hopefully will come right back to me. So I will say, I always start
link |
00:44:14.880
with my smile lights up the whole world. And I kid you not, like people in the street will stop me
link |
00:44:19.520
and say, Oh my God, like we love your smile. Like, yes. So, so my affirmations will change depending
link |
00:44:26.640
on, you know, what's happening that day. Is it funny? I know, don't judge, don't judge.
link |
00:44:31.360
No, that's not, what? Laughter is not judgment. It's just awesome. I mean, it, it's true,
link |
00:44:37.200
but you're saying affirmations somehow help. What is it that they do? Work to like
link |
00:44:44.880
remind you of the kind of person you are and the kind of person you want to be, which
link |
00:44:50.880
actually may be in inverse order: the kind of person you want to be, and that helps you become
link |
00:44:55.040
the kind of person you actually are. It just, it's, it brings intentionality to like what you're
link |
00:45:00.880
doing, right? And so, by the way, I was laughing because my affirmations, which I also do are the
link |
00:45:07.280
opposite. Oh, you do? Oh, what do you do? I don't, I don't have a, my smile lights up the whole world. Maybe
link |
00:45:14.800
I should add that because like I, I have, I just, I have, oh boy, I just, it's, it's much more stoic,
link |
00:45:23.120
like, focused, about this kind of stuff. But the joy, the emotion that's just in that
link |
00:45:31.360
little affirmation is beautiful. So maybe I should add that. I have some, I have some like
link |
00:45:36.560
focused stuff, but that's usually, but that's a cool start. That's just after all the like
link |
00:45:40.960
smiling and playful and joyful and all that, and then it's like, okay, I kick butt. Let's get you
link |
00:45:46.080
done. Right. Let's get you done affirmation. Okay, cool. So like what else is on there?
link |
00:45:52.240
Oh, what else is on there? Um, well, I, I have, I'm a, I'm, I'm a magnet for all sorts of things.
link |
00:46:00.000
So I'm an amazing people magnet. I attract like awesome people into my universe. I,
link |
00:46:05.360
so that's an actual affirmation. Yes. That's great. Yeah. So that, that's, and that, yeah,
link |
00:46:10.640
and that somehow manifests itself into like working. I think so. Yeah. Like, can you speak to like,
link |
00:46:16.720
why it feels good to do the affirmations? I honestly think it just grounds the day.
link |
00:46:24.080
And then it allows me to, instead of just like being pulled back and forth, like throughout
link |
00:46:29.920
the day, it just like grounds me like, okay, like this thing happened. It's not exactly what I wanted
link |
00:46:35.840
it to be, but I'm patient or I'm, you know, I'm, I trust that the universe will do amazing things
link |
00:46:42.240
for me, which is one of my other consistent affirmations. Or I'm an amazing mom, right? And so
link |
00:46:47.760
I can grapple with all the feelings of mom guilt that I have all the time. Or here's another one.
link |
00:46:53.760
I'm a love magnet. And I literally say, I will kind of picture the person that I'd love to end up
link |
00:46:58.800
with. And I write it all down, and it hasn't happened yet. But what do you, what do you picture?
link |
00:47:03.920
Is it Brad Pitt? Because that's what I picture. Okay. That's what you picture? Okay.
link |
00:47:08.400
Yeah. I'm running, holding hands, running together. No, more like Fight Club that,
link |
00:47:16.400
the Fight Club Brad Pitt, where he's like standing. All right, people will know. Okay. I'm sorry.
link |
00:47:20.720
I'll get off of that. Do you have a, like when you're thinking about being a love magnet in
link |
00:47:25.840
that way? Are you picturing specific people? Or is this almost like
link |
00:47:33.680
in the space of like energy? Right. It's somebody who is smart and well accomplished
link |
00:47:41.760
and successful in their life, but they're generous and they're well traveled and they want to travel
link |
00:47:47.520
the world. It's things like that. Like, they're head over heels into me. Like, I know it sounds
link |
00:47:52.480
super silly, but it's literally what I write. Yeah. And I believe it'll happen one day. Oh,
link |
00:47:56.320
you actually write it, so you don't say it out loud? No, I write it. I write all my affirmations.
link |
00:48:01.120
I do the opposite. I say it. Yeah. If I'm alone, I'll say it out loud. Yeah. I should try that.
link |
00:48:10.000
I think it's, which, what feels more powerful to you? To me, saying stuff feels
link |
00:48:19.280
more powerful. Yeah. Writing is, writing feels like I'm losing, losing the words, like losing the
link |
00:48:31.440
power of the words, maybe because I write slow. Do you hand write? No, I type. It's on this app.
link |
00:48:37.520
It's Day One, basically. And I just, I can, the best thing about it is I can look back
link |
00:48:41.520
and see like a year ago, what was I affirming, right? Also changes over time. It hasn't like
link |
00:48:50.640
changed a lot, but it, but the focus kind of changes over time. I got it. Yeah. I say the same
link |
00:48:56.240
exact thing over and over and over. Oh, you do? Okay. There's a comfort in the, in the sameness of
link |
00:49:00.720
it. Well, actually, let me jump around because let me ask you about, because I talked, all this talk
link |
00:49:06.400
about Brad Pitt, or maybe it's just going on in my head. Let me ask you about dating in
link |
00:49:11.920
general. You tweeted, are you based in Boston and single, question mark? And then you pointed to a
link |
00:49:20.240
startup singles night sponsored by a small dating app. Because I mean, this is jumping around a
link |
00:49:26.240
little bit, but since, since you mentioned, can AI help solve this dating love problem?
link |
00:49:33.840
What do you think? This problem of connection that is part of the human condition. Can AI help
link |
00:49:40.880
that you yourself are in the search for? Affirming. Maybe that's what I should affirm, like build an AI.
link |
00:49:48.240
Build an AI that finds love. I think, I think there must be a science behind
link |
00:49:56.800
that first moment you meet a person and you either have chemistry or you don't, right? Like,
link |
00:50:02.480
I guess that was the question I was asking, and you put it brilliantly: is that a science or
link |
00:50:07.040
an art? Ooh, I think there are like, there's actual chemicals that get exchanged when people,
link |
00:50:14.480
two people meet. Oh, well, I don't know about that. I like how you're changing. Yeah, yeah,
link |
00:50:20.080
changing your mind as we're describing it, but it feels that way. But what science shows us
link |
00:50:25.760
is sometimes we can explain with rigor the things that feel like magic. Right. So maybe
link |
00:50:32.240
you can remove all the magic. Maybe it's like, I honestly think, like I said, like Goodreads should
link |
00:50:38.480
be a dating app. With, like, books, I wonder, I wonder if you look at just like books or content
link |
00:50:45.840
you've consumed. I mean, that's essentially what YouTube does when it does recommendation. If you
link |
00:50:50.960
just look at your footprint of content consumed, if there's an overlap, but maybe interesting
link |
00:50:56.960
differences within the overlap, I'm sure this is a machine learning problem that's solvable.
link |
00:51:03.520
Like, this person is very likely to be not only someone there's chemistry with in the short term,
link |
00:51:10.560
but a good lifelong partner to grow together with. I bet you it's a good machine learning problem.
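(An illustrative aside, not something specified in the conversation: one minimal way to sketch the content-overlap matching idea discussed here is below, assuming each person is represented by the set of books or other content they've consumed. Jaccard similarity is just one simple choice of overlap measure, and every name and data point here is hypothetical.)

from typing import Dict, List, Set

def jaccard(a: Set[str], b: Set[str]) -> float:
    # Overlap between two content footprints, scored in [0, 1].
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def rank_matches(me: Set[str], others: Dict[str, Set[str]], top_k: int = 10) -> List[str]:
    # Return the top_k people whose consumed content overlaps most with mine.
    ranked = sorted(others, key=lambda name: jaccard(me, others[name]), reverse=True)
    return ranked[:top_k]

# Hypothetical usage: rank two candidate matches by shared books/films.
me = {"Girl Decoded", "Affective Computing", "Fight Club"}
others = {
    "person_a": {"Affective Computing", "Fight Club", "Dune"},
    "person_b": {"Cookbook", "Gardening 101"},
}
print(rank_matches(me, others, top_k=1))  # ['person_a']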
link |
00:51:15.600
You just need the data. Let's do it. Well, actually, I do think there's so much data about
link |
00:51:20.720
each of us that there ought to be a machine learning algorithm that can ingest all this
link |
00:51:24.560
data and basically say, I think the following 10 people would be interesting connections for you,
link |
00:51:30.400
right? And so Smile Dating app kind of took one particular angle, which is humor. It matches
link |
00:51:37.360
people based on their humor styles, which is one of the main ingredients of a successful
link |
00:51:42.400
relationship. Like if you meet somebody and they can make you laugh, like that's a good thing.
link |
00:51:47.200
And if you develop like internal jokes, like inside jokes and you're bantering, like that's fun.
link |
00:51:53.280
Yeah. So I think... Yeah, definitely. But yeah, the
link |
00:52:00.800
number and the rate of inside joke generation. You could probably measure that and then optimize
link |
00:52:08.000
it over the first few days. You can see. Right. And then we're just turning this into a machine
link |
00:52:11.920
learning problem. I love it. But for somebody like you, who's exceptionally successful and busy,
link |
00:52:18.160
is there a science to that aspect of dating? Is it tricky? Is there advice you can give?
link |
00:52:28.720
Oh my God, I'd give the worst advice. Well, I can tell you like I have a spreadsheet.
link |
00:52:33.520
Spreadsheet. That's great. Is that a good or a bad thing? Do you regret the spreadsheet?
link |
00:52:39.760
Well, I don't know. What's the name of the spreadsheet? Is it Love?
link |
00:52:42.400
It's the date track, dating tracker. It's very like. Love tracker. Yeah.
link |
00:52:48.400
And there's a rating system, I'm sure. Yeah, there's like weights and stuff.
link |
00:52:52.240
It's too close to home. Oh, is it? Do you also have a spreadsheet? Well, I don't have a spreadsheet,
link |
00:52:56.000
but I would, now that you say it, it seems like a good idea. Oh, no.
link |
00:53:02.560
Turning into data. I do wish that somebody else had a spreadsheet about me.
link |
00:53:09.200
Like you said, collect a lot of data about us in a way that's
link |
00:53:18.240
privacy preserving, that I own the data, I can control it, and then use that data to find,
link |
00:53:23.600
I mean, not just romantic love, but collaborators, friends, all that kind of stuff. It seems like
link |
00:53:29.440
the data is there. That's the problem social networks are trying to solve, but I think they're
link |
00:53:34.720
doing a really poor job. Even Facebook tried to get into a dating app business. And I think there's
link |
00:53:40.560
so many components to running a successful company that connects human beings. And part of that is
link |
00:53:50.240
having engineers that care about the human side, right? As you know extremely well, it's not
link |
00:53:56.560
easy to find those, but you also don't want just people that care about the human,
link |
00:54:00.640
they also have to be good engineers. So it's like, you have to find this beautiful mix. And for some
link |
00:54:06.320
reason, just empirically speaking, people have not done a good job of that, of building companies
link |
00:54:13.440
like that. It must mean that it's a difficult problem to solve. Dating apps, it seems difficult.
link |
00:54:19.920
OkCupid, Tinder, all that kind of stuff. Of course they work, but they seem
link |
00:54:28.400
to not work as well as I would imagine it's possible. With data, wouldn't you be able to find
link |
00:54:35.120
better human connection? It's like arranged marriages on steroids essentially, arranged by
link |
00:54:41.040
machine learning algorithm.
link |
00:54:42.560
Arranged by machine learning algorithm, but not a superficial one. I think a lot of the dating
link |
00:54:46.480
apps out there are just so superficial. They're just matching on high level criteria that aren't
link |
00:54:51.920
ingredients for successful partnership. But you know what's missing though too? I don't know how
link |
00:54:58.320
to fix that, the serendipity piece of it. Like how do you engineer serendipity? Like this random
link |
00:55:04.080
chance encounter and then you fall in love with the person. I don't know how a dating app can do
link |
00:55:09.040
that. So there has to be a little bit of randomness. Maybe every 10th match is just, you know,
link |
00:55:17.200
yeah, somebody that the algorithm wouldn't have necessarily recommended, but it allows for a
link |
00:55:23.200
little bit of... Well, it can also trick you into thinking of serendipity by somehow showing
link |
00:55:31.920
you a tweet of a person that it thinks you'll match well with, but do it accidentally as
link |
00:55:37.600
part of another search. And you just notice it and then you go down a rabbit hole and you
link |
00:55:43.680
connect with them outside the app too. Like you connect with
link |
00:55:49.840
this person outside the app somehow. So it creates that moment of meeting. Of course,
link |
00:55:54.480
you have to think of from an app perspective how you can turn that into a business. But I think
link |
00:55:58.800
ultimately a business that helps people fall in love in any way, like that's what Apple was about.
link |
00:56:05.680
Create products that people love. That's beautiful. I mean, you've got to make money somehow.
link |
00:56:10.400
If you help people fall in love personally with the product, find self love or
link |
00:56:17.840
another human being, you're going to make money. You're going to figure out a way to make money.
link |
00:56:22.960
I just feel like the dating apps often will optimize for something other than love.
link |
00:56:28.560
It's the same with social networks. They optimize for engagement as opposed to like a deep,
link |
00:56:33.040
meaningful connection that's ultimately grounded in personal growth, you as a human being
link |
00:56:39.040
growing and all that kind of stuff. Let me do like a pivot to a dark topic, which you open the book
link |
00:56:46.560
with. A story, because I'd like to talk to you about just emotion and artificial intelligence.
link |
00:56:56.000
I think this is a good story to start to think about emotional intelligence. You open the book
link |
00:57:01.600
with a story of a Central Florida man, Jamel Dunn, who drowned while five
link |
00:57:07.760
teenagers watched and laughed, saying things like, you're going to die. And when Jamel disappeared
link |
00:57:13.360
below the surface of the water, one of them said he just died, and the others laughed. What does this
link |
00:57:18.880
incident teach you about human nature, and perhaps about the response to it?
link |
00:57:25.120
I mean, I think this is a really, really, really sad story and it highlights what I believe is a
link |
00:57:32.560
real problem in our world today: an empathy crisis. Yeah, we're living through an
link |
00:57:39.040
empathy crisis. And I mean, we've talked about this throughout our conversation. We dehumanize
link |
00:57:47.200
each other. And unfortunately, yes, technology is bringing us together. But in a way, it's just
link |
00:57:54.000
dehumanizing. It's creating this dehumanization of the other. And I think that's a huge problem. The
link |
00:58:02.720
good news is I think the solution could be technology based. I think if we rethink the
link |
00:58:08.080
way we design and deploy our technologies, we can solve parts of this problem. But I worry about
link |
00:58:14.080
it. I mean, even with my son, a lot of his interactions are computer mediated. And I just
link |
00:58:22.640
question what that's doing to his empathy skills and you know, his ability to really connect with
link |
00:58:28.160
people. So you think it's not possible to form empathy through the digital medium?
link |
00:58:38.240
I think it is. But we have to be thoughtful about it, because the way we engage face to face,
link |
00:58:45.600
which is what we're doing right now, right, there's the nonverbal signals, which are a majority of
link |
00:58:50.240
how we communicate. Something like 90% of how we communicate is through facial expressions.
link |
00:58:55.920
You know, I'm saying something and you're nodding your head now and that creates a feedback loop.
link |
00:59:00.320
And if you break that. And now I have anxiety about it.
link |
00:59:06.000
Poor Lex. I am not scrutinizing your facial expressions during this interview.
link |
00:59:11.440
I am. Look normal, look human, nod head. Yeah, nod head in agreement.
link |
00:59:21.520
If Rana says yes, then nod head; else, don't. But don't do it too much because it might be at the wrong time.
link |
00:59:28.640
And then it will send the wrong signal. Oh, God. And make eye contact sometimes because humans
link |
00:59:34.080
appreciate that. Okay. Yeah, but something about the, especially when we say mean things in person,
link |
00:59:42.400
you get to see the pain of the other person. If you're tweeting it at a person and you have no
link |
00:59:46.720
idea how it's going to land, you're more likely to do that on social media than you are in face
link |
00:59:50.880
to face conversations. So what do you think is more important?
link |
00:59:55.840
EQ or IQ, EQ being emotional intelligence, in terms of what makes us human?
link |
01:00:08.080
I think emotional intelligence is what makes us human. It's how we connect with one another.
link |
01:00:14.800
It's how we build trust. It's how we make decisions, right? Like your emotions drive
link |
01:00:22.080
kind of what you had for breakfast, but also where you decide to live and what you want to do for
link |
01:00:26.880
the rest of your life. So I think emotions are underrated.
link |
01:00:32.800
But so emotional intelligence isn't just about the effective expression of your own emotions.
link |
01:00:39.120
It's about a sensitivity and empathy to other people's emotions and that sort of being able
link |
01:00:44.640
to effectively engage in the dance of emotions with other people.
link |
01:00:48.240
Yeah, I like that explanation. I like that kind of, yeah, thinking about it as a dance,
link |
01:00:55.120
because it is really about that. It's about sensing what state the other person's in and
link |
01:00:59.200
using that information to decide on how you're going to react. And I think it can be very powerful.
link |
01:01:06.720
Like people who are the best, most persuasive, most persuasive leaders in the world tap into,
link |
01:01:13.840
you know, they have, if you have higher EQ, you're more likely to be able to motivate people
link |
01:01:20.000
to change their behaviors. So it can be very powerful.
link |
01:01:24.960
On a more kind of technical, maybe philosophical level, you've written that emotion is universal.
link |
01:01:32.560
It seems that sort of like Chomsky says language is universal. There's a bunch of other stuff like
link |
01:01:39.040
cognition, consciousness. Seems a lot of us have these aspects. So the human mind generates all
link |
01:01:46.800
this. And so what do you think is the, they all seem to be like echoes of the same thing.
link |
01:01:53.840
What do you think emotion is exactly? Like how deep does it run? Is it a surface level thing
link |
01:02:00.640
that we display to each other? Is it just another form of language or something deep within?
link |
01:02:05.280
I think it's, it's really deep. It's how, you know, we started with memory. I think emotions
link |
01:02:11.920
play a really important role. Yeah, emotions play a very important role in how we encode
link |
01:02:17.920
memories, right? Our, our memories are often encoded, almost indexed by emotions. Yeah.
link |
01:02:24.800
Yeah, it's at this core of how, you know, our decision making engine is also heavily influenced
link |
01:02:31.840
by our emotions. So emotion is part of cognition. It's totally intermixed with the whole thing.
link |
01:02:37.600
Yes, absolutely. And in fact, when you take it away, people are unable to make decisions.
link |
01:02:42.800
They're really paralyzed. Like they can't go about their daily or their, you know, personal or
link |
01:02:47.200
professional lives. So it does seem like there's probably some interesting interweaving of emotion
link |
01:02:57.440
and consciousness. I wonder if it's possible to have, like if they're next door neighbors somehow,
link |
01:03:03.680
or if they're actually flatmates. I don't, I don't, it feels like the, the hard problem of
link |
01:03:11.840
consciousness where it's some, it feels like something to experience the thing. Like red
link |
01:03:18.160
feels like red and it's, you know, when you eat a mango, sweet, the taste, the, the, the sweetness
link |
01:03:24.400
that it feels like something to experience that sweetness, that whatever generates emotions.
link |
01:03:33.200
But then like, I feel like emotion is part of communication. It's very much about communication.
link |
01:03:39.440
And then that means it's also deeply connected to language. But then probably human intelligence
link |
01:03:49.680
is deeply connected to the collective intelligence between humans. It's not just the standalone
link |
01:03:53.920
thing. So the whole thing is really connected. So emotion is connected to language, language is
link |
01:03:59.200
connected to intelligence. And then intelligence connected to consciousness and consciousness
link |
01:04:04.080
is connected to emotion. The whole thing is, it's a beautiful mess. So
link |
01:04:12.640
Can I comment on the emotions being a communication mechanism? Because I think
link |
01:04:16.480
there are two facets of, of our emotional experiences. One is communication, right?
link |
01:04:24.400
Like we use emotions, for example, facial expressions or other nonverbal cues to connect
link |
01:04:29.760
with other human beings and with other beings in the world, right? But even if it's not a
link |
01:04:38.880
communication context, we still experience emotions and we still process emotions and
link |
01:04:43.920
we still leverage emotions to make decisions and to learn and, you know, to experience life. So
link |
01:04:50.880
it isn't always just about communication. And we learned that very early on in our,
link |
01:04:55.280
in kind of our work at Affectiva. One of the very first applications we brought to market was
link |
01:05:01.520
understanding how people respond to content, right? So if they're watching this video of
link |
01:05:05.040
ours, like, are they interested? Are they inspired? Are they bored to death? And so we
link |
01:05:09.520
watched their facial expressions. And we had, we weren't sure if people would express any emotions
link |
01:05:16.320
if they were sitting alone. Like if you're in your bed at night, watching a Netflix TV series,
link |
01:05:21.120
would we still see any emotions on your face? And we were surprised that, yes, people still
link |
01:05:25.440
emote, even if they're alone, even if you're in your car driving around, you're singing along
link |
01:05:29.680
a song and you're joyful. We'll see these expressions. So it's not just about communicating
link |
01:05:35.280
with another person. Sometimes it really is just about experiencing the world.
link |
01:05:41.680
First of all, I wonder if some of that is because we develop our intelligence and our
link |
01:05:47.920
emotional intelligence by communicating with other humans. And so when other humans disappear
link |
01:05:53.280
from the picture, we're still kind of a virtual human. The code still runs basically. Yeah,
link |
01:05:57.840
the code still runs. And but you're also kind of, you're still, there's like virtual humans,
link |
01:06:02.800
you don't have to think of it that way. But there's a kind of, when you like chuckle like,
link |
01:06:08.560
yeah, like you're, you're kind of chuckling to a virtual human. I mean, it's possible that
link |
01:06:16.400
the code has to have another human there. Because if you just grow up alone,
link |
01:06:24.240
I wonder if emotion will still be there in this visual form. So yeah, I wonder. But anyway,
link |
01:06:30.640
what can you tell from the human face about what's going on inside? So that's the problem
link |
01:06:39.120
that Affectiva first tackled, which is using computer vision, using machine learning to try to
link |
01:06:47.120
detect as many things as possible about the human face, and convert them into a prediction of
link |
01:06:54.480
categories of emotion, anger, happiness, all that kind of stuff. How hard is that problem?
link |
01:07:00.240
It's extremely hard. It's very, very hard because there is no one-to-one mapping between
link |
01:07:06.240
a facial expression and your internal states. There just isn't. There's this oversimplification
link |
01:07:11.040
of the problem where it's something like, if you are smiling, then you're happy. If you do a brow
link |
01:07:16.240
furrow, then you're angry. If you do an eyebrow raise, then you're surprised. And just think about
link |
01:07:21.280
it for a moment, you could be smiling for a whole host of reasons. You could also be happy and not
link |
01:07:26.320
be smiling. You could furrow your eyebrows because you're angry or you're confused about something
link |
01:07:34.160
or you're constipated. So I think this oversimplistic approach to inferring emotion from a facial
link |
01:07:41.360
expression is really dangerous. The solution is to incorporate as many contextual signals as you
link |
01:07:49.280
can. So for example, I'm driving a car and you can see me nodding my head and my eyes are closed
link |
01:07:57.600
and the blinking rate is changing. I'm probably falling asleep at the wheel because you know
link |
01:08:03.760
the context. You understand what the person is doing. So I think you incorporate context, or add additional channels like
link |
01:08:10.080
voice or gestures or even physiological sensors. But I think it's very dangerous to just take this
link |
01:08:17.200
over simplistic approach of smile equals happy. If you're able to, in a high resolution way,
link |
01:08:24.000
specify the context, there's certain things that are going to be somewhat reliable signals
link |
01:08:29.600
of something like drowsiness or happiness or stuff like that. I mean, when people are watching
link |
01:08:35.840
Netflix content, that problem, that's a really compelling idea that you can kind of, at least
link |
01:08:43.200
in aggregate, highlight like which part was boring, which part was exciting. How hard was
link |
01:08:49.360
that problem? That was on the scale of like difficulty. I think that's one of the easier
link |
01:08:56.720
problems to solve because it's a relatively constrained environment. You have somebody
link |
01:09:02.560
sitting in front of, initially we started with like a device in front of you like a laptop.
link |
01:09:07.200
And then we graduated to doing this on a mobile phone, which is a lot harder just because of,
link |
01:09:12.720
you know, from a computer vision perspective, the profile view of the face can be a lot more
link |
01:09:17.520
challenging. We had to figure out lighting conditions because usually people are watching
link |
01:09:22.240
content literally in their bedrooms at night, lights are dimmed. Yeah. I mean, if you're
link |
01:09:29.200
standing, it's probably going to be the looking up the nostril view. Yeah. And nobody looks good at
link |
01:09:35.680
that. I've seen data sets from that perspective. It's like, this is not a good look for anyone.
link |
01:09:42.960
Or if you're laying in bed at night, what is it? Side view or something?
link |
01:09:47.360
And half your face is like on a pillow. Actually, I would love to know, have data
link |
01:09:53.600
about like how people watch stuff in bed at night. Like, do they prop there? Is it a pillow?
link |
01:10:03.040
Like, I'm sure there's a lot of interesting dynamics there.
link |
01:10:06.480
Right. From a health and well being perspective, right? Like, it's like, oh, you're hurting,
link |
01:10:10.720
you're not. I was thinking machine learning perspective, but yes. But also, yeah. Yeah,
link |
01:10:15.280
once you have that data, you can start making all kinds of inference about health and stuff like
link |
01:10:19.600
that. Interesting. Yeah. There was an interesting thing when I was at Google that we were,
link |
01:10:24.880
it's called active authentication where you want to be able to
link |
01:10:32.560
unlock your phone without using a password. So it would use your face, but also other stuff,
link |
01:10:38.400
like the way you take the phone out of your pocket. So you use that kind of data, multimodal,
link |
01:10:44.960
with machine learning to be able to identify that it's you or likely to be you, likely not to be you,
link |
01:10:50.640
that allows you to not always have to enter the password. That was the idea. But the funny thing
link |
01:10:55.680
about that, and I just want to tell a small anecdote here, is that it was all male engineers.
link |
01:11:03.360
Except our boss, who's still one of my favorite humans, was a woman,
link |
01:11:11.920
Regina Dugan. Oh my God, I love her. She's awesome. She's the best. So, but anyway,
link |
01:11:22.640
and there was another, there's one female engineer, brilliant female engineer on the team,
link |
01:11:26.960
and she was the one that actually pointed out the fact that women often don't have pockets. Right.
link |
01:11:33.360
You know, it was like, whoa, that was not even a category in the code of like, wait a minute,
link |
01:11:38.640
you can take the phone out of some other place than your pocket. So anyway, that's a, it's a
link |
01:11:44.720
funny thing when you're considering people laying in bed watching a phone, you have to consider
link |
01:11:48.720
you know, diversity in all its forms, depending on the problem, depending on the
link |
01:11:54.800
context. Yeah, actually, this is like a very important, I think this is, you know, you probably
link |
01:11:59.600
get this all the time, like people are worried that AI is going to take over humanity and like,
link |
01:12:04.160
get rid of all the humans, the world, and I'm like, actually, that's not my biggest concern.
link |
01:12:08.160
My biggest concern is that we are building bias into these systems. And then they're like deployed
link |
01:12:14.160
at large and at scale. And before you know it, you're kind of accentuating the bias that exists
link |
01:12:20.160
in society. And yeah, I'm not, you know, I know people, it's very important to worry about that,
link |
01:12:25.840
but to me there's an emergent phenomenon here which is a very good one, because I think these
link |
01:12:34.320
systems are actually, by encoding the data that exists, they're revealing the bias in society,
link |
01:12:42.160
thereby teaching us what the bias is, and therefore we can now correct that bias within the system.
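As a toy illustration of what it can look like to let a system reveal the bias encoded in its data, here is a minimal sketch in Python that compares a classifier's error rate across groups. The groups, labels, and numbers are entirely hypothetical, and real audits use far richer metrics, but the basic move of holding a mirror up to the system's behavior is the same.

```python
# Minimal sketch (hypothetical data): audit a model's error rate per group to
# surface bias that the training data may have encoded.
from collections import defaultdict

# Hypothetical evaluation records: (group, true_label, predicted_label).
records = [
    ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 1), ("group_a", 0, 1),
    ("group_b", 1, 0), ("group_b", 0, 1), ("group_b", 1, 1), ("group_b", 0, 0),
]

errors = defaultdict(lambda: [0, 0])  # group -> [num_errors, num_examples]
for group, truth, pred in records:
    errors[group][0] += int(truth != pred)
    errors[group][1] += 1

for group, (wrong, total) in errors.items():
    print(f"{group}: error rate {wrong / total:.2f} on {total} examples")
# A large gap between groups is the cue to go back and scrutinize the data.
```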
link |
01:12:48.080
So they're almost like putting a mirror to ourselves. So I'm not,
link |
01:12:52.960
we have to be open to looking at the mirror, though. We have to be open to scrutinizing
link |
01:12:57.280
the data. And if you just take it as ground truth... Or you don't even have to look at the data, I mean,
link |
01:13:02.720
yes, the data is how you fix it, but then you just look at the behavior of the system. It's like,
link |
01:13:06.800
and you realize, holy crap, this thing is kind of racist. Like, why is that? And then you look at
link |
01:13:11.680
the data, it's like, okay. And then you start to realize that I think that's a much more effective
link |
01:13:16.240
way to, to be introspective as a society than through sort of political discourse, like AI kind
link |
01:13:23.920
of, because people are, people are for some reason more productive and rigorous in criticizing AI
link |
01:13:34.400
than they are in criticizing each other. So I think this is just a nice method for studying society
link |
01:13:39.360
and seeing which way progress lies. Anyway, what were we talking about? The problem of
link |
01:13:45.600
watching Netflix in bed, or elsewhere, and seeing which parts are exciting, which parts are boring,
link |
01:13:51.680
you're saying that's relatively constrained, because, you know, you have a captive audience,
link |
01:13:56.400
and you kind of know the context. And one thing you said that was really key is the
link |
01:14:00.800
aggregate, you're doing this in aggregate, right? Like we're looking at aggregated response of people,
link |
01:14:04.720
and so when you see a peak, say a smile peak, they're probably smiling or laughing at something
link |
01:14:11.120
that's in the content. So that was one of the first problems we were able to solve. And, and
link |
01:14:17.280
when we see the smile peak, it doesn't mean that these people are internally happy. They're just
link |
01:14:22.080
laughing at content. So it's important to, you know, call it for what it is.
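As a rough sketch of that aggregate idea, you could average per-frame smile probabilities across viewers and flag the moments that stand out. The scores below are made up and the peak rule is deliberately simplistic, a sketch of the concept rather than Affectiva's actual pipeline.

```python
# Minimal sketch (hypothetical scores): aggregate per-frame smile probabilities
# across viewers and flag "smile peaks" in the content.
import statistics

# Hypothetical smile probabilities per viewer, one value per second of content.
viewers = [
    [0.1, 0.1, 0.2, 0.8, 0.9, 0.2, 0.1, 0.1, 0.7, 0.2],
    [0.0, 0.2, 0.1, 0.7, 0.8, 0.3, 0.1, 0.2, 0.6, 0.1],
    [0.1, 0.1, 0.1, 0.9, 0.7, 0.2, 0.2, 0.1, 0.8, 0.2],
]

# Average across viewers at each moment.
aggregate = [statistics.mean(frame) for frame in zip(*viewers)]

# Flag moments well above the overall mean as likely laugh/smile peaks.
mean = statistics.mean(aggregate)
stdev = statistics.pstdev(aggregate)
peaks = [t for t, value in enumerate(aggregate) if value > mean + stdev]
print("smile peaks at seconds:", peaks)  # -> [3, 4, 8]
```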
link |
01:14:27.360
But it's still really, really useful data. I wonder how that compares to, so what like YouTube
link |
01:14:32.640
and other places use. Obviously they don't have, for the most part, that kind
link |
01:14:40.480
of data. They have the data of when people tune out, drop off. And I think that's,
link |
01:14:48.800
in aggregate for YouTube at least, a pretty powerful signal. I worry about what that leads to,
link |
01:14:55.120
because looking at YouTubers that really care about views and,
link |
01:15:02.960
you know, try to maximize the number of views, I think they, when they say that the video should
link |
01:15:09.520
be constantly interesting, it seems like a good goal. I feel like that leads to this manic pace
link |
01:15:18.880
of a video. Like the idea that I would speak at the current speed that I'm speaking, I don't know.
link |
01:15:27.200
And that every moment has to be engaging, right? Engaging. I think there's value to silence,
link |
01:15:32.960
there's value to the boring bits. I mean, some of the greatest movies ever,
link |
01:15:37.280
some of the greatest stories ever told, they have the boring bits, seemingly boring bits. I
link |
01:15:42.560
don't know. I wonder about that. Of course, it's not that the human face can capture that either,
link |
01:15:49.120
it's just giving an extra signal. You have to really, I don't know, you have to really collect
link |
01:15:57.760
deeper, long term data about what was meaningful to people. When they think 30 days from now, what
link |
01:16:07.360
they still remember, what moved them, what changed them, what helped them grow, that kind of stuff.
link |
01:16:12.160
You know, it would be a really, I don't know if there are any researchers out there who are doing
link |
01:16:16.880
this type of work. Wouldn't it be so cool to tie your emotional expressions while you're, say,
link |
01:16:23.760
listening to a podcast interview and then 30 days later interview people and say, hey,
link |
01:16:30.880
what do you remember? You've watched this 30 days ago, like what stuck with you? And then see if
link |
01:16:35.200
there's any, there ought to be, maybe there ought to be some correlation between these emotional
link |
01:16:39.280
experiences and yeah, what you, what stays with you. So the one guy listening now on the beach
link |
01:16:49.280
in Brazil, please record a video of yourself listening to this and send it to me and then
link |
01:16:54.080
I'll interview you 30 days from now. Yeah, that would be great. It would be statistically
link |
01:16:59.680
significant. Yeah, an n of one, but you know... Yeah, I think that's really fascinating. I think
link |
01:17:09.200
that kind of holds the key to a future where entertainment or content is both entertaining
link |
01:17:17.760
and, I don't know, makes you better, empowering in some way. So figuring out like
link |
01:17:28.480
showing people stuff that entertains them, but also they're happy they watched 30 days from
link |
01:17:34.560
now because they've become a better person because of it. Well, you know, okay, not to
link |
01:17:38.880
riff on this topic for too long, but I have two children, right? And I see my role as a parent
link |
01:17:44.880
as like a chief opportunity officer. Like I am responsible for exposing them to all sorts of
link |
01:17:49.920
things in the world. But often I have no way of knowing, like, what stuck. Like, what was,
link |
01:17:56.000
you know, is this actually going to be transformative, you know, for them 10 years down the line? And
link |
01:18:00.320
I wish there was a way to quantify these experiences. Like, are they, I can tell in the moment if
link |
01:18:07.040
they're engaging, right? I can tell. But it's really hard to know if they're going to remember
link |
01:18:11.840
them 10 years from now or if it's going to. Yeah, that one is weird because it seems like kids
link |
01:18:18.160
remember the weirdest things. I've seen parents do incredible stuff with their kids and they don't
link |
01:18:22.960
remember any of that. They remember some tiny, small, sweet thing a parent did. Right. Like some,
link |
01:18:28.880
I took you to like this amazing country. Yeah, whatever. And then they'll be like some,
link |
01:18:34.880
like stuffed toy you got or some or the new PlayStation or something or some, some silly
link |
01:18:40.000
little thing. So I think they're just designed that way, to mess with
link |
01:18:45.920
your head. But definitely kids are very impacted, it seems, by sort of negative events. So
link |
01:18:54.960
minimizing the number of negative events is important, but not too much, right? You can't,
link |
01:19:01.200
you can't just like, you know, there's still discipline and challenge and all those kinds of
link |
01:19:06.400
things. So you want some adversity for sure. So yeah, I mean, I'm definitely when I have kids,
link |
01:19:11.280
I'm going to drive them out into the woods. Okay. And then they have to survive and make,
link |
01:19:17.040
figure out how to make their way back home, like 20 miles out. Okay. Yeah. And after that, we can go
link |
01:19:22.960
for ice cream. Anyway, I'm working on this whole parenting thing. I haven't figured it out. Okay.
link |
01:19:28.800
Okay. What were we talking about? Yes, Affectiva, the problem
link |
01:19:37.920
of emotion detection. So there's some people, maybe we can just speak to that a little more,
link |
01:19:42.240
where there, there's folks like Lisa Feldman Barrett that, that challenged this idea that
link |
01:19:47.600
emotion could be fully detected, or even well detected from the human face that there's so
link |
01:19:55.440
much more to emotion. What do you think about ideas like hers, criticism like hers? Yeah,
link |
01:20:01.680
I actually agree with a lot of Lisa's criticism. So even my PhD work, like 20-plus years
link |
01:20:09.120
ago now. Time flies when you're having fun. I know, right? That was back when I did like
link |
01:20:16.000
dynamic Bayesian networks and that was before deep learning. That was before deep learning.
link |
01:20:21.360
Yeah. I know. Back in my day... Now you can just, like, use... Yeah. It's all the same
link |
01:20:29.600
architecture. You can apply it to anything. Yeah. Right. Right. But yeah, but, but even then I kind
link |
01:20:36.320
of, I, I did not subscribe to this like theory of basic emotions where it's just the simplistic
link |
01:20:41.360
one-to-one mapping between facial expressions and emotions. I actually think also,
link |
01:20:45.840
we're not in, in the business of trying to identify your true emotional internal state. We
link |
01:20:50.560
just want to quantify in an objective way what's showing on your face because that's an important
link |
01:20:56.240
signal. It doesn't mean it's a true reflection of your internal emotional state. So I think a lot
link |
01:21:03.200
of the, you know, I think she's, she's just trying to kind of highlight that this is not a simple
link |
01:21:08.640
problem and overly simplistic solutions are going to hurt the industry. And I subscribe to that.
link |
01:21:16.640
And I think multimodal is the way to go. Like whether it's additional context information
link |
01:21:21.680
or different modalities and channels of information, I think that's what we,
link |
01:21:26.560
that's where we ought to go. And I think, I mean, that's a big part of what she's advocating for
link |
01:21:30.640
as well. So, but there is signal in the human face. That's just definitely signal. That's a
link |
01:21:36.080
projection of emotion. There's that, that there, at least in part is the inner state is captured
link |
01:21:44.880
in some meaningful way on the human face. I think it can sometimes be a reflection or
link |
01:21:53.600
an expression of your internal state, but sometimes it's a social signal. So the, so
link |
01:21:58.880
you cannot look at the face as purely a signal of emotion. It can be a signal of cognition and it
link |
01:22:04.240
can be a signal of a social expression. And I think to disambiguate that, we have to be
link |
01:22:11.200
careful about it and we have to add additional information. Humans are fascinating, aren't
link |
01:22:17.200
they? With the whole face thing, this can mean so many things from humor to sarcasm to everything,
link |
01:22:23.760
the whole thing. Some things we can help, some things we can't help at all.
link |
01:22:28.320
In all the years of leading Affectiva, an emotion recognition company, like we talked about,
link |
01:22:33.680
what have you learned about emotion, about humans, and about AI? Big sweeping questions.
link |
01:22:44.240
Yeah, that's a big sweeping question. Well, I think the thing I learned the most is that even
link |
01:22:50.720
though like we are in the business of building AI basically, right? It always goes back to the
link |
01:23:00.160
humans, right? It's always about the humans. And so for example, the thing I'm most proud of
link |
01:23:08.560
in building Affectiva, and yeah, the thing I'm most proud of on this journey:
link |
01:23:16.080
I love the technology and I'm so proud of the solutions we've built and we've brought to market,
link |
01:23:21.040
but I'm actually most proud of the people we've like built and cultivated at the company and
link |
01:23:26.400
the culture we've created. You know, some of the people who've joined Affectiva,
link |
01:23:32.080
this was their first job. And while at Affectiva, they became American citizens and they bought
link |
01:23:39.680
their first house and they found their partner and they had their first kid, right? Like key moments
link |
01:23:45.440
in life that we got to be part of. And that's the thing I'm most proud of.
link |
01:23:52.160
So that's a great thing at a company, and I'm most proud of that, right?
link |
01:23:56.640
like celebrating humanity in general, broadly speaking. And that's a great thing to have in
link |
01:24:00.720
a company that works on AI because that's not often the thing that's celebrated in AI companies.
link |
01:24:06.800
So often just raw, great engineering, just celebrating the humanity. That's great. And
link |
01:24:12.880
especially from a leadership position. Well, what do you think about the movie,
link |
01:24:19.680
her? Let me ask you that before I talk, before I talk to you about it, because it's not
link |
01:24:24.240
Affectiva is and was not just about emotion. So I'd love to talk to you about Smart Eye.
link |
01:24:30.160
But before that, let me just jump into the movie. Her, do you think we'll have a deep
link |
01:24:38.240
meaningful connection, increasingly deep and meaningful connections, with computers?
link |
01:24:43.680
Is that a compelling thing to you? Something that's already happening? The thing I love,
link |
01:24:48.160
I love the movie Her, by the way. But the thing I love the most about this movie is that it demonstrates
link |
01:24:53.200
how technology can be a conduit for positive behavior change. So I forgot the guy's name in
link |
01:24:58.560
the movie, whatever. Theodore. Theodore. So Theodore was like really depressed, right? And
link |
01:25:05.520
he just didn't want to get out of bed. He was just like done with life, right? And Samantha, right?
link |
01:25:12.640
Samantha, yeah. She just knew him so well. She had, she was emotionally intelligent.
link |
01:25:17.200
And so she could persuade him and motivate him to change his behavior. And she got him out,
link |
01:25:22.160
and they went to the beach together. And I think that represents the promise of emotion AI. If done
link |
01:25:27.760
well, this technology can help us live happier lives, more productive lives, healthier lives,
link |
01:25:34.800
more connected lives. So that's the part that I love about the movie, obviously. It's Hollywood,
link |
01:25:40.640
so it takes a twist and whatever. But the key notion that technology with emotion AI can
link |
01:25:48.000
persuade you to be a better version of who you are, I think that's awesome.
link |
01:25:52.640
Well, what about the twist? You don't think it's good, spoiler alert,
link |
01:25:58.640
that Samantha starts feeling a bit of a distance and basically leaves Theodore?
link |
01:26:04.560
You don't think that's a good feature? You think that's a bug or a feature?
link |
01:26:11.440
Well, I think what went wrong is Theodore became really attached to Samantha. Like,
link |
01:26:15.600
I think he kind of fell in love with her. You think that's wrong?
link |
01:26:19.360
I mean, I think that, I think she was putting out the signal. This is an intimate relationship,
link |
01:26:25.200
right? There was a deep intimacy to it. Right. But what does, what does, what does that mean?
link |
01:26:30.080
What does that mean? An AI system. Right. What does that mean? Right? We're just friends.
link |
01:26:35.280
Yeah, we're just friends. Well, I think when he realized, which is such a human thing of
link |
01:26:42.320
jealousy, when he realized that Samantha was talking to like thousands of people,
link |
01:26:46.880
she's parallel dating. Yeah, that did not go well. Right.
link |
01:26:51.440
You know, that doesn't, and from a computer perspective, like that doesn't take
link |
01:26:55.040
anything away from what we have. It's like you getting jealous of Windows 98 for being used
link |
01:27:01.360
by millions of people. But it's like, it's like not liking that Alexa talks to a bunch of, you
link |
01:27:07.840
know, other families. But I think Alexa currently is just a servant. It, it tells you about the
link |
01:27:14.880
weather. It's not, it doesn't do the intimate deep connection. And I think there is something
link |
01:27:19.360
there is something really powerful about the intimacy of a connection with an AI system
link |
01:27:26.240
that would have to respect and play the human game of, of jealousy, of love, of heartbreak and
link |
01:27:33.840
all that kind of stuff, which Samantha does seem to be pretty good at. I think she, this AI system,
link |
01:27:42.640
knows what it's doing. Well, actually, let me ask you this. I don't think she was talking to anyone
link |
01:27:47.280
else. You don't think so? You think she was just done with Theodore? Yeah. Oh, yeah. And then,
link |
01:27:53.920
and she, she wanted to really put the screen. She didn't have the guts to just break it off
link |
01:27:58.720
cleanly. Okay. She wanted to put him in pain. No, I don't know. Well, she could have ghosted him.
link |
01:28:04.880
She could have. Right. I'm sorry. There's our engineers. Oh God. But I think those are really,
link |
01:28:13.440
I honestly think some of that, some of it is Hollywood, but some of that is features from
link |
01:28:18.400
an engineering perspective, not a bug. I think AI systems that can leave us,
link |
01:28:24.160
now this is for more social robotics than it is for anything that's useful. Like I hate it
link |
01:28:31.280
if Wikipedia said, you know, I need a break right now. Right, right, right, right. I'm like, no,
link |
01:28:35.840
no, I need you. But if it's just purely for companionship, then I think the ability to
link |
01:28:45.600
leave is really powerful. I don't know. I never thought of that. So that's so, so fascinating
link |
01:28:52.320
because I've always taken the human perspective, right? Like for example, we had a Jibo at home,
link |
01:28:58.400
right? And my son loved it. And then the company ran out of money. And so they had to basically
link |
01:29:03.920
shut down, like Jibo basically died, right? And it was so interesting to me because we have a lot
link |
01:29:10.320
of gadgets at home and a lot of them break and my son never cares about it, right? Like if our
link |
01:29:16.880
Alexa stopped working tomorrow, I don't think he'd really care. But when Jibo stopped working,
link |
01:29:22.000
it was traumatic. Like he got really upset. And as a parent, that like made me think about this
link |
01:29:28.480
deeply, right? Did I, was I comfortable with that? I like the connection they had because I
link |
01:29:33.440
think it was a positive relationship. But I was surprised that it affected him emotionally so
link |
01:29:41.040
much. And I think there's a broader question here, right? As we build socially and emotionally
link |
01:29:48.080
intelligent machines, what does that mean about our relationship with them? And then we're
link |
01:29:53.600
broadly our relationship with one another, right? Because this machine is going to be programmed
link |
01:29:57.760
to be amazing at empathy, by definition, right? It's going to always be there for you. It's not
link |
01:30:03.920
going to get bored. In fact, there's a chatbot in China, Xiaoice. And it's like the number
link |
01:30:11.360
two or three most popular app. And it basically is just a confidant. And you can tell it anything
link |
01:30:17.520
you want. And people use it for all sorts of things. They confide about things like domestic violence
link |
01:30:27.120
or suicide attempts or, you know, challenges at work. I don't know,
link |
01:30:33.440
I don't know if I'm, I don't know how I feel about that. I think about that a lot.
link |
01:30:37.120
Yeah, I think, first of all, it's obviously the future from my perspective. Second of all, I think there's
link |
01:30:43.520
a lot of trajectories where that becomes an exciting future. But I think everyone should feel very
link |
01:30:49.600
uncomfortable about how much they know about the company, about where the data is going,
link |
01:30:56.080
how the data is being collected. Because I think, and this is one of the lessons of social media,
link |
01:31:01.840
that I think we should demand full control and transparency of the data on those things.
link |
01:31:06.480
Plus one, totally agree. Yeah. So like, I think it's really empowering, as long as you can walk
link |
01:31:12.080
away. As long as you can delete the data, or know how the data is used; it's opt in, or at least there's
link |
01:31:18.560
clarity about what it's being used for by the company. And I think the CEO, or the leaders,
link |
01:31:24.320
are also important in that. Like, you need to be able to trust the basic humanity of the leader.
link |
01:31:29.920
Exactly. And also that that leader is not going to be a puppet of a larger machine,
link |
01:31:36.240
but they actually have a significant role in defining the culture and the way the company
link |
01:31:42.080
operates. So anyway, but we should be, we should definitely scrutinize companies on that aspect.
link |
01:31:49.520
But this, I'm personally excited about that future, but also even if you're not, it's coming.
link |
01:31:57.200
So let's figure out how to do it in the least painful and the most positive way.
link |
01:32:01.920
That's great. You're the deputy CEO of Smart Eye. Can you describe the mission of the company?
link |
01:32:08.240
What is Smart Eye? Yeah. So Smart Eye is a Swedish company. They've been in business for the last 20
link |
01:32:14.480
years. And their, their main focus, like the industry they're most focused on is the automotive
link |
01:32:20.880
industry. So bringing driver monitoring systems to basically save lives, right? So I first met
link |
01:32:29.040
the CEO, Martin Krantz. Gosh, it was right when COVID hit. It was actually the last, the last
link |
01:32:36.080
CES right before COVID. So CES 2020, right? January 2020. Yeah, January. Exactly. So we were there,
link |
01:32:43.040
met him in person. We were basically competing with each other. I think the difference was
link |
01:32:49.680
they'd been doing driver monitoring and had a lot of credibility in the automotive space. We
link |
01:32:54.480
didn't come from the automotive space, but we were using new technology like deep learning
link |
01:32:59.120
and building this emotion recognition. And you wanted to enter the automotive space. You wanted
link |
01:33:04.000
to operate in the automotive space. Exactly. It was one of the areas we were, we had just raised
link |
01:33:08.720
a round of funding to focus on bringing our technology to the automotive industry. So we met and
link |
01:33:14.240
honestly, it was the first, it was the only time I met with a CEO who had the same vision
link |
01:33:19.120
as I did. Like he basically said, yeah, our vision is to bridge the gap between humans and machines.
link |
01:33:23.120
I was like, oh my God, this is like exactly almost to the, to the word, you know, how we describe it
link |
01:33:29.120
too. And we started talking and first it was about, okay, can we align strategically here? Like how
link |
01:33:35.920
can we work together? Because we're competing, but we're also complementary. And then
link |
01:33:41.680
I think after four months of speaking almost every day on FaceTime, he was like,
link |
01:33:47.520
is your company interested in acquisition? And it was the first, I usually say no when people
link |
01:33:52.400
approach us. It was the first time that I was like, huh, yeah, I might be interested. Let's talk.
link |
01:33:58.720
Yeah. So you just hit it off. Yeah. So they're, they're a respected, very respected in the automotive
link |
01:34:05.920
sector for delivering products that are increasingly better and better. I mean,
link |
01:34:13.280
maybe you could speak to that, but it's driver sensing. It's basically having a device that's
link |
01:34:17.680
looking at the driver and it's able to tell you where the driver is looking. Correct. It's able
link |
01:34:23.360
to also detect drowsiness. Correct. It does stuff from the face and the eyes. Exactly. Like it's
link |
01:34:28.800
monitoring driver distraction and drowsiness, but they bought us so that we could expand beyond
link |
01:34:33.840
just the driver. So the driver monitoring systems usually sit, the camera sits in the steering
link |
01:34:39.440
wheel or around the steering wheel column, and it looks directly at the driver. But now
link |
01:34:44.240
we've migrated the camera position, in partnership with car companies, to the rear view mirror
link |
01:34:50.320
position. So it has a full view of the entire cabin of the car. And you can detect how many
link |
01:34:55.600
people are in the car. What are they doing? So we do activity detection, like eating or drinking or
link |
01:35:02.000
in some regions of the world smoking. We can detect if a baby's in the car seat, right? And if,
link |
01:35:09.520
unfortunately, in some cases they're forgotten, parents just leave the car and forget the kid
link |
01:35:13.840
in the car. That's an easy computer vision problem to solve, right? You can detect there's a car seat,
link |
01:35:18.960
there's a baby, you can text the parent and hopefully, again, save lives. So that was the
link |
01:35:25.440
impetus for the acquisition. It's been a year. I mean, there's a lot of questions. It's a really
link |
01:35:32.960
exciting space, especially to me. I just find it a fascinating problem. It could enrich the
link |
01:35:38.800
experience in the car in so many ways, especially because we spend still, despite COVID, I mean,
link |
01:35:45.440
COVID changed things in interesting ways. But I think the world is bouncing back and we
link |
01:35:49.760
spend so much time in the car, and the car is such a weird little world we have for ourselves.
link |
01:35:56.960
Like people do all kinds of different stuff, like listen to podcasts, they think about stuff,
link |
01:36:02.880
they get angry, they do phone calls. It's like a little world of its own with a kind of privacy
link |
01:36:12.240
that for many people, they don't get anywhere else. And it's a little box that's like a psychology
link |
01:36:20.800
experiment because it feels like the angriest many humans in this world get is inside the car.
link |
01:36:28.320
It's so interesting. So it's such an opportunity to explore how we can enrich,
link |
01:36:35.200
how companies can enrich that experience. And also, as cars become more and more automated,
link |
01:36:42.000
there's more and more opportunity, the variety of activities that you can do in the car increases.
link |
01:36:46.960
So it's super interesting. So I mean, in a practical sense, Smart Eye has been selected,
link |
01:36:54.160
at least I read, by 14 of the world's leading car manufacturers for 94 car models. So
link |
01:37:01.120
it's in a lot of cars. How hard is it to work with car companies? So they're all different.
link |
01:37:07.680
They all have different needs. The ones I've gotten a chance to interact with are very focused on
link |
01:37:13.520
cost. And anyone who's focused on cost, it's like, all right, do you hate fun?
link |
01:37:24.240
Let's just have some fun. Let's figure out the most fun thing we can do and worry
link |
01:37:28.160
about cost later. But I think because the way the car industry works, I mean, it's a very
link |
01:37:33.360
thin margin that you get to operate under. So you have to really, really make sure that
link |
01:37:38.640
everything you add to the car makes sense financially. So anyway, does this new industry,
link |
01:37:44.960
especially at the scale of Smart Eye, hold any lessons for you?
link |
01:37:51.200
Yeah, I think it is a very tough market to penetrate. But once you're in, it's awesome.
link |
01:37:56.800
Because once you're in, you're designed into these car models for like somewhere between
link |
01:38:00.880
five to seven years, which is awesome. And once they're on the road, you just get paid a royalty
link |
01:38:05.760
fee per vehicle. So it's a high barrier to entry. But once you're in, it's amazing. I think the thing
link |
01:38:12.240
that I struggle the most with in this industry is the time to market. So often we're asked to lock
link |
01:38:18.480
or do a code freeze two years before the car is going to be on the road. I'm like, guys,
link |
01:38:23.920
do you understand the pace with which technology moves? So I think car companies are really trying
link |
01:38:30.320
to make the Tesla, the Tesla transition to become more of a software driven
link |
01:38:38.400
architecture. And that's hard for many. It's just the cultural change. I mean,
link |
01:38:42.400
I'm sure you've experienced that, right? Oh, definitely. I think one of the biggest
link |
01:38:46.560
inventions or imperatives created by Tesla is like to me personally, okay, people are going to
link |
01:38:54.400
complain about this, but I know, electric vehicles, I know, Autopilot, AI stuff. To me, the
link |
01:39:01.840
over-the-air software updates are the biggest revolution in cars. And it is extremely difficult
link |
01:39:08.960
to switch to that because it is a culture shift. It at first, especially if you're not comfortable
link |
01:39:14.800
with it, it seems dangerous. Like, the approach to cars has been so safety focused for so
link |
01:39:22.080
many decades. They're like, what do you mean we dynamically change code? The whole point
link |
01:39:28.880
is you have a thing that you test, like, right? And like, it's not reliable. Because do you know
link |
01:39:37.200
how much it costs if we have to recall these cars? Right? And there's an
link |
01:39:43.120
understandable obsession with safety. But the downside of an obsession with safety
link |
01:39:48.960
is the same as being obsessed with safety as a parent. If you do that too much, you
link |
01:39:56.880
limit the potential development and flourishing of, in that case, the human being, and
link |
01:40:02.480
in this case, the software, the artificial neural network of it. But it's
link |
01:40:08.880
tough to do. It's really tough to do culturally and technically, like the deployment, the mass
link |
01:40:13.520
deployment of software is really, really difficult. But I hope that's where the industry is going.
link |
01:40:18.160
One of the reasons I really want Tesla to succeed is exactly that point, not Autopilot,
link |
01:40:22.560
not the electric vehicle, but the softwareization of basically everything, cars
link |
01:40:29.200
especially, because to me, that's actually going to increase two things: increase safety,
link |
01:40:34.560
because you can update much faster, but also increase the effectiveness of
link |
01:40:39.840
folks like you who dream about enriching the human experience with AI, because you can just like
link |
01:40:46.000
because there's a feature you want, like a new emoji or whatever, the way TikTok
link |
01:40:51.360
releases filters, you can just release that for in-car stuff. But yeah, that's
link |
01:40:57.600
definitely one of the use cases we're looking into is once you know the sentiment of the passengers
link |
01:41:05.040
in the vehicle, you can optimize the temperature in the car, you can change the lighting, right?
link |
01:41:10.240
So if the backseat passengers are falling asleep, you can dim the lights, you can lower the music,
link |
01:41:14.960
right? You can do all sorts of things. Yeah. I mean, of course, you could do that kind of
link |
01:41:19.680
stuff with a two year delay, but it's tougher. Yeah. Do you think, do you think Tesla or Waymo
link |
01:41:28.720
or some of these companies that are doing semi or fully autonomous driving should be doing
link |
01:41:34.480
driver sensing? Yes. Are you thinking about that kind of stuff? So not just how we can enhance
link |
01:41:40.800
the in-cab experience for cars that are manually driven, but the ones that are increasingly
link |
01:41:45.920
more autonomously driven? Yeah. So if we fast forward to the universe where it's fully autonomous,
link |
01:41:51.840
I think interior sensing becomes extremely important because the role of the driver
link |
01:41:56.080
isn't just to drive. If you think about it, the driver almost manages the dynamics within a vehicle.
link |
01:42:01.760
And so who's going to play that role when it's an autonomous car? We want a solution that is
link |
01:42:08.400
able to say, oh my God, Lex is bored to death because the car is moving way too slow. Let's
link |
01:42:13.680
engage Lex. Or Rana is freaking out because she doesn't trust this vehicle yet. So let's tell Rana
link |
01:42:19.280
a little bit more information about the route. So I think, or somebody's having a heart attack
link |
01:42:25.040
in the car, you need interior sensing in fully autonomous vehicles. But with semi autonomous
link |
01:42:30.480
vehicles, I think it's really key to have driver monitoring because semi autonomous means that
link |
01:42:36.400
sometimes the car is in charge, sometimes the driver is in charge or the co pilot, right? And
link |
01:42:41.360
you need both systems to be on the same page. You need to know, the car needs to know if the
link |
01:42:46.880
driver's asleep before it transitions control over to the driver. And sometimes if the driver's too
link |
01:42:53.680
tired, the car can say, I'm going to be a better driver than you are right now. I'm taking control
link |
01:42:58.160
over. So this dynamic, this dance, it's so key and you can't do that without driver sensing.
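A minimal sketch of what that handoff logic could look like follows; the signals, thresholds, and decisions here are hypothetical illustrations, not Smart Eye's or any automaker's actual system.

```python
# Minimal sketch (hypothetical signals and thresholds, not any vendor's actual
# logic): decide who should hold control during a semi-autonomous handoff.
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_closed_fraction: float  # fraction of recent frames with eyes closed
    gaze_on_road: bool           # is the driver's gaze on the road ahead?
    drowsiness_score: float      # 0.0 alert .. 1.0 asleep

def handoff_decision(state: DriverState, car_requests_handover: bool) -> str:
    """Return who should be in charge, given the sensed driver state."""
    asleep = state.eyes_closed_fraction > 0.7 or state.drowsiness_score > 0.8
    attentive = state.gaze_on_road and state.drowsiness_score < 0.3
    if asleep:
        return "car keeps control and alerts the driver"
    if car_requests_handover and attentive:
        return "hand control to the driver"
    if car_requests_handover:
        return "delay handover until the driver is attentive"
    return "no change"

print(handoff_decision(DriverState(0.9, False, 0.85), car_requests_handover=True))
# -> car keeps control and alerts the driver
```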
link |
01:43:03.200
Yeah, there's a disagreement for the longest time I've had with Elon that this is obvious,
link |
01:43:07.360
that this should be in the Tesla from day one. And it's obvious that driver sensing is not a
link |
01:43:13.040
hindrance. It's not obvious. I should be careful because having studied this problem, nothing
link |
01:43:20.800
is really obvious, but it seems very likely driver sensing is not a hindrance to an experience.
link |
01:43:26.320
It's only enriching to the experience and likely increases the safety. That said,
link |
01:43:36.000
it is very surprising to me just having studied semi autonomous driving how well humans are able
link |
01:43:43.600
to manage that dance. Because the intuition, before we studied this kind of thing, was that
link |
01:43:49.040
humans would become just incredibly distracted. They would just let the thing do its thing.
link |
01:43:57.600
But because it is life and death, they're able to manage that somehow. That said, there's no
link |
01:44:02.240
reason not to have driver sensing on top of that. I feel like that's going to allow you to do that
link |
01:44:08.560
dance that you're currently doing without driver sensing, except the steering wheel,
link |
01:44:14.000
to do that even better. I mean, the possibilities are endless and the machine learning possibilities
link |
01:44:19.280
are endless. It's such a beautiful, and also constrained, environment, so you could do things much
link |
01:44:25.200
more effectively than you can with the external environment. External environment is full of
link |
01:44:30.720
weird edge cases and complexities. There's so much, it's so fascinating, such a fascinating
link |
01:44:35.680
world. I do hope that companies like Tesla and others, even Waymo, which I don't even know if
link |
01:44:44.320
Waymo is doing anything sophisticated inside the cab. I don't think so. What is it? I honestly
link |
01:44:51.920
think it goes back to the robotics thing we were talking about, which is like great engineers
link |
01:44:58.400
that are building these AI systems are just afraid of the human being. They're not thinking about the
link |
01:45:04.640
human experience, they're thinking about the features and the perceptual abilities of that
link |
01:45:10.000
thing. They think the best way I can serve the human is by doing the best perception and control
link |
01:45:16.720
I can, by looking at the external environment, keeping the human safe. But there's a huge,
link |
01:45:21.600
I'm here. I need to be noticed and interacted with and understood and all those kinds of things,
link |
01:45:33.440
even just in a personal level for entertainment, honestly, for entertainment.
link |
01:45:38.400
You know, one of the coolest projects we did in collaboration with MIT around this was we looked
link |
01:45:42.640
at longitudinal data, right? Because MIT had access to tons of data. And just seeing the
link |
01:45:53.600
patterns of people, like driving in the morning off to work versus commuting back from work,
link |
01:45:58.640
or weekend driving versus weekday driving. And wouldn't it be so cool if your car knew that
link |
01:46:04.960
and then was able to optimize either the route or the experience or even make recommendations?
link |
01:46:12.240
I think it's very powerful. Yeah, like, why are you taking this route? You're always unhappy when
link |
01:46:16.960
you take this route. And you're always happy when you take this alternative route. Take that route
link |
01:46:21.040
instead. Exactly. I mean, to have even that little step of a relationship with a car,
link |
01:46:27.120
I think is incredible. Of course, you have to get the privacy, right? You have to get all that kind
link |
01:46:31.920
of stuff, right? But I wish I honestly, you know, people are paranoid about this, but I would like
link |
01:46:37.440
a smart refrigerator. We have such a deep connection with food as a human civilization. I would like
link |
01:46:45.520
to have a refrigerator that would understand me that, you know, I also have a complex
link |
01:46:52.320
relationship with food because, like, you know, I pig out too easily and all that kind of stuff. So
link |
01:46:57.280
you try, you know, like, maybe I want the refrigerator to be like, are you sure about
link |
01:47:02.160
this? Because maybe you're just feeling down or tired. Like, maybe let's lay off.
link |
01:47:06.320
Your vision of the smart refrigerator is way kinder than mine.
link |
01:47:10.000
Is it just me yelling at you? No, it was just because I don't, I don't, you know, I don't drink
link |
01:47:16.720
alcohol, I don't smoke, but I eat a ton of chocolate, like, that's my vice. And so,
link |
01:47:22.960
sometimes ice cream too. And I'm like, okay, my smart refrigerator will just lock down.
link |
01:47:28.080
They'll just say, dude, you've had way too many today, like, yeah.
link |
01:47:32.640
Yeah. No, but here's the thing. Are you, do you regret having like, let's say, not the next day,
link |
01:47:41.280
but 30 days later, would you, what would you, what would you like to the refrigerator to have done
link |
01:47:48.000
then? Well, I think actually, like the more positive relationship would be one where there's
link |
01:47:54.320
a conversation, right? As opposed to force. That's probably the more sustainable relationship.
link |
01:48:00.640
It's like late, late at night, just no, listen, listen, I know I told you an hour ago,
link |
01:48:05.920
that this is not a good idea, but just listen, things have changed. I can just imagine a bunch
link |
01:48:11.120
of stuff being made up just to convince. Oh my God, it's hilarious. But I mean, I just think that
link |
01:48:17.360
there's opportunities there. I mean, maybe not locking down, but for our systems that are such
link |
01:48:25.200
a deep part of our lives. Like, a lot of people that commute use
link |
01:48:32.960
their car every single day. A lot of us use a refrigerator every single day, the microwave
link |
01:48:37.040
every single day. And I feel like certain things could be made more efficient,
link |
01:48:45.680
more enriching, and AI is there to help. Like some just basic recognition of you as a human being
link |
01:48:53.760
about your patterns, what makes you happy, not happy, and all that kind of stuff. And the car,
link |
01:48:57.840
obviously, maybe will say, well, instead of this, like,
link |
01:49:03.200
Ben and Jerry's ice cream, how about this hummus and carrots or something? I don't know.
link |
01:49:11.600
Yeah, like a reminder. Just in time recommendation, right?
link |
01:49:14.640
But not like a generic one, but a reminder that last time you chose the carrots,
link |
01:49:20.880
you smiled 17 times more the next day. You were happier the next day, right?
link |
01:49:25.360
Yeah, you were, you were happier the next day. And, and, but yeah, I don't, but then again,
link |
01:49:31.760
if you're the kind of person that, that gets better from negative, negative comments, you could
link |
01:49:37.680
say like, Hey, remember, like that wedding you're going to, you want to fit into that dress?
link |
01:49:43.520
Remember that? Let's think about that before you eat this. No, I don't know. For some,
link |
01:49:49.360
probably that would work for me. Like a refrigerator that is just ruthless. It's
link |
01:49:53.600
shaming me. But I would, of course, welcome it. Like, that would work for me.
link |
01:50:00.320
Well, I don't know, I think if it's really smart, it would optimize its nudging
link |
01:50:05.280
based on what works for you, right? Exactly. That's the whole point. Personalization,
link |
01:50:09.440
in every way, deep personalization. You were a part of a webinar titled Advancing Road Safety,
link |
01:50:16.320
the State of Alcohol Intoxication Research. So for people who don't know, every year 1.3
link |
01:50:22.640
million people around the world die in road crashes. And more than 20% of these fatalities
link |
01:50:28.960
are estimated to be alcohol related. A lot of them are also distraction related. So can AI help
link |
01:50:34.720
with the alcohol thing? I think the answer is yes. There are signals, and we know that as humans,
link |
01:50:42.480
like we can tell if a person, you know, is in different phases of being drunk, right? Yeah.
link |
01:50:50.960
And I think you can use technology to do the same. And again, I think the ultimate solution is going
link |
01:50:55.360
to be a combination of different sensors. How hard is the problem from the vision perspective?
link |
01:51:01.280
I think it's non trivial. I think it's non trivial. And I think the biggest part is
link |
01:51:05.200
getting the data, right? It's like getting enough data examples. So we, for this research project,
link |
01:51:10.880
we partnered with the Transportation Authorities of Sweden. And we literally had a race track
link |
01:51:17.680
with a safety driver. And we basically progressively got people drunk. Nice. So but you know, that's
link |
01:51:25.840
a very expensive data set to collect. And you want to collect it globally and in multiple conditions.
link |
01:51:32.480
Yeah, the ethics of collecting a data set where people are drunk is tricky.
link |
01:51:37.600
Which is funny because, I mean, let's put drunk driving aside, the number of drunk people in
link |
01:51:44.400
the world every day is very large. It'd be nice to have a large data set of drunk people getting
link |
01:51:49.280
progressively drunk. In fact, you could build an app where people can donate their data because
link |
01:51:53.200
it's hilarious. Right. Actually, yeah, but the liability, the ethics, how do you
link |
01:52:00.000
get it right? It's tricky. It's really, really tricky. Because like drinking is one of those
link |
01:52:04.640
things that's funny and hilarious, and everybody loves it, it's social, and so on and so forth. But it's
link |
01:52:10.320
also the thing that hurts a lot of people, like a lot of people. Like alcohol is one of those
link |
01:52:15.200
things. It's legal, but it's really damaging to a lot of lives. It destroys lives. And not just in
link |
01:52:24.240
driving context. I should mention, people should listen to Andrew Huberman who recently
link |
01:52:30.000
talked about alcohol. He has an amazing podcast. Andrew Huberman is a neuroscientist from Stanford
link |
01:52:36.000
and a good friend of mine. Oh, cool. And he's like a human encyclopedia about all health related
link |
01:52:43.280
wisdom. So he does a podcast. You would love it. I would love that. No, no, no, no. Oh,
link |
01:52:48.080
you don't know Andrew Huberman. Okay. Listen, you'll listen to Andrew. He's called Huberman Lab
link |
01:52:53.040
Podcast. This is your assignment. Just listen to one. Okay. I guarantee you this will be a thing
link |
01:52:57.920
where you say Lex, this is the greatest human I've ever discovered. So. Oh my God. Because I've
link |
01:53:04.240
really, I'm really on a journey of kind of health and wellness and I'm learning lots and I'm trying
link |
01:53:09.440
to like build these, I guess, atomic habits around just being healthy. So yeah, I'm definitely
link |
01:53:16.160
going to do this. I'm gonna. His whole thing. This is great. He's a legit scientist, like
link |
01:53:24.960
really well published. But in his podcast, what he does, he's not talking about his own work.
link |
01:53:31.600
He's like a human encyclopedia of papers. And so he, his whole thing is he takes the topic
link |
01:53:37.440
and in a very fast, you mentioned atomic habits, like very clear way, summarizes the research
link |
01:53:44.080
in a way that leads to protocols of what you should do. He's really big on like, not like
link |
01:53:49.760
this is what the science says, but like, this is literally what you should be doing according
link |
01:53:53.760
to science. So like, he's really big and there's a lot of recommendations he does, which several of
link |
01:54:01.280
them I definitely don't do, like get sunlight as soon as possible from waking up and like for
link |
01:54:09.200
prolonged periods of time. That's a really big one. And he's, there's a lot of science behind
link |
01:54:13.760
that one. There's a bunch of stuff like various systems. You're gonna, and you're gonna be like,
link |
01:54:18.240
Lex, this is, this is my new favorite person, I guarantee it. And if you guys somehow don't know,
link |
01:54:23.680
and you're human and you care about your well being, you know, you should definitely listen to
link |
01:54:28.800
him. Love you, Andrew. Anyway, so what were we talking about? Oh, alcohol and detecting alcohol.
link |
01:54:39.280
So this is a problem you care about and you're trying to solve.
link |
01:54:42.000
And actually like broadening it, I do believe that the car is going to be a wellness center,
link |
01:54:48.720
like, because again, imagine if you have a variety of sensors inside the vehicle,
link |
01:54:53.360
tracking not just your emotional state or your
link |
01:55:00.640
level of distraction, drowsiness and intoxication, but also maybe even things like
link |
01:55:07.200
your physical, you know, your heart rate and your heart rate variability and your breathing rate.
link |
01:55:13.680
And it can start like optimizing, yeah, it can optimize the ride based on what your goals are.
link |
01:55:19.440
So I think we're going to start to see more of that. And I'm excited about that.
link |
01:55:24.320
Yeah, what are the challenges you're tackling with Smart Eye currently?
link |
01:55:28.640
What are, like, the trickiest things to get right? Is it basically convincing more and more
link |
01:55:34.720
car companies that having AI inside the car is a good idea? Or
link |
01:55:40.480
are there more technical, algorithmic challenges? What's been keeping you mentally busy?
link |
01:55:47.040
I think a lot of the car companies we are in conversations with are already interested in
link |
01:55:52.800
definitely driver monitoring. Like, I think it's becoming a must have, but even interior sensing,
link |
01:55:57.840
I can see, like, we're engaged in a lot of like advanced engineering projects and proof of concepts.
link |
01:56:04.080
I think technologically though, and even the technology, I can see a path to making it happen.
link |
01:56:10.160
I think it's the use case. Like, how does the car respond once it knows something about you?
link |
01:56:16.080
Because you want it to respond in a thoughtful way that doesn't,
link |
01:56:19.440
that isn't off putting to the consumer in the car. So I think that's like the user experience.
link |
01:56:25.520
I don't think we've really nailed that. And usually that's not our part; we're the sensing
link |
01:56:31.200
platform, but we usually collaborate with the car manufacturer to decide what the use case is. So,
link |
01:56:36.160
so say you, you figure out that somebody's angry while driving. Okay, what should the car do?
link |
01:56:40.880
You know, do you see your role as nudging, of, like,
link |
01:56:48.640
basically coming up with solutions, and then the car manufacturers
link |
01:56:54.080
kind of put their own little spin on it? Right. We are like the ideation, creative
link |
01:57:01.440
thought partner. But at the end of the day, the car company needs to decide what's on brand for
link |
01:57:06.080
them, right? Like maybe when it figures out that you're distracted or drowsy, it shows you a coffee
link |
01:57:11.680
cup, right? Or maybe it takes more aggressive behaviors and basically said, okay, if you don't
link |
01:57:16.400
like take a rest in the next five minutes, the car is going to shut down, right? Like there's a whole
link |
01:57:20.320
range of actions the car can take. And doing the thing that is most, yeah, that builds trust with
link |
01:57:27.280
the driver and the passengers. I think that's what we need to be very careful about.
link |
01:57:32.160
Yeah, car companies are funny because they have their own, like, I mean, that's why people get
link |
01:57:37.920
cars still. I hope that changes, but they get it because it's a certain feel and look and
link |
01:57:43.360
it's a certain, they become proud like Mercedes Benz or BMW or whatever. And that's their thing.
link |
01:57:51.520
That's the family brand or something like that. Or Ford or GM, whatever, they, they stick to that
link |
01:57:56.720
thing. It's interesting. So it should be, I don't know, it should be a little more about the technology
link |
01:58:04.000
inside. And I suppose there too, there could be a branding like a very specific style of luxury
link |
01:58:12.480
or fun, all that kind of stuff. Yeah. You know, I have an AI focused fund to invest in early stage
link |
01:58:21.280
kind of AI-driven companies. And one of the companies we're looking at is trying to do what
link |
01:58:25.360
Tesla did, but for boats, for recreational boats. Yeah. So they're building an electric and kind
link |
01:58:31.040
of slash autonomous boat. And it's kind of the same issues like, what kind of sensors can you put in?
link |
01:58:38.320
What kind of states can you detect both exterior and interior within the boat? Anyways, it's
link |
01:58:44.160
like really interesting. Do you boat at all? No, not well, not in that way. I do like to get on the
link |
01:58:51.120
lake or a river and fish from a boat, but that's not boating. That's different. It's still boating.
link |
01:58:59.360
Low tech, low tech boat. Get away from, get closer to nature, but I guess going out into the ocean
link |
01:59:06.960
is also getting closer to nature in some deep sense. I mean, I guess that's why people love
link |
01:59:13.600
it. The enormity of the water just underneath you. Yeah. I love the water. I love both. I love
link |
01:59:22.960
saltwater, like the big ocean, and it's just humbling to be in front of this giant thing
link |
01:59:28.080
that's so powerful, that was here before us and will be here after us. But I also love the peace of a small
link |
01:59:33.680
wooded lake, where everything's calm. You tweeted that you're excited about Amazon's acquisition
link |
01:59:47.840
of iRobot. I think it's a super interesting, just given the trajectory of which you're part of,
link |
01:59:54.160
of these honestly small number of companies that are playing in this space that are like trying to
link |
02:00:00.560
have an impact on human beings. So it is an interesting moment in time that Amazon would
link |
02:00:05.600
acquire iRobot. You tweet, I imagine a future where home robots are as ubiquitous as microwaves
link |
02:00:14.240
or toasters. Here are three reasons why I think this is exciting. If you remember, I can look
link |
02:00:20.080
it up. But why is this exciting to you? I mean, I think the first reason... I'm trying
link |
02:00:25.680
to kind of remember the exact order in which I put them. But one is just, it's going to be
link |
02:00:32.560
an incredible platform for understanding our behaviors within the home, right? Like, you know,
link |
02:00:38.480
if you think about Roomba, which is, you know, the robot vacuum cleaner, the flagship product of iRobot
link |
02:00:44.400
at the moment, it's running around your home, understanding the layout, it's understanding what's
link |
02:00:50.000
clean and what's not, how often you clean your house. And all of these behaviors
link |
02:00:54.000
are a piece of the puzzle in terms of understanding who you are as a consumer. And I think that could
link |
02:00:59.680
be, again, used in really meaningful ways, not just to recommend better products or whatever,
link |
02:01:06.720
but actually to improve your experience as a human being. So I think, I think that's very
link |
02:01:10.880
interesting. I think the natural evolution of these robots in the, in the home. So it's,
link |
02:01:18.640
it's interesting. Roomba isn't really a social robot, right? At the moment. But I once interviewed
link |
02:01:25.360
one of the chief engineers on the Roomba team, and he talked about how people named their Roombas.
link |
02:01:31.120
And if their Roomba broke down, they would call in and say, you know, my Roomba broke down and
link |
02:01:36.720
the company would say, well, we'll just send you a new one. And they'd say, no, no, no, Rosie, you have to
link |
02:01:40.560
like, yeah, I want you to fix this particular robot. So people have already built like
link |
02:01:47.920
interesting emotional connections with these home robots. And I think that again, that provides a
link |
02:01:53.920
platform for really interesting things to, to just motivate change, like it could help you. I mean,
link |
02:01:59.440
one of the companies that spun out of MIT, Catalia Health, the guy who started it spent a lot of
link |
02:02:06.480
time building robots that help with weight management. So weight management, sleep, eating
link |
02:02:11.120
better. Yeah, all of these things. Well, if I'm being honest, Amazon does not exactly have a track
link |
02:02:18.880
record of winning over people in terms of trust. Now that said, it's a really difficult problem
link |
02:02:25.760
for a human being to let a robot in their home that has a camera on it. Right. That's really,
link |
02:02:31.920
really, really tough. And I think Roomba actually, I have to think about this, but I'm pretty sure
link |
02:02:38.960
now, or for some time already, has had cameras. At least the most recent Roomba does.
link |
02:02:45.920
I have so many Roombas. Oh, you actually do? Well, I program them. I don't use a Roomba for its purpose. In
link |
02:02:50.560
fact, people that have been to my place, they're like, yeah, you definitely don't use these Roombas.
link |
02:02:56.640
That could be a good, I can't tell like the valence of this comment. Was it a compliment or like?
link |
02:03:02.240
No, it's a giant mess. It's just a bunch of electronics everywhere. I have six or seven
link |
02:03:08.240
computers, I have robots everywhere, I have Lego robots, I have small robots and big robots. It's
link |
02:03:12.800
just giant, just piles of robot stuff. And yeah, but including the Roombas, they're being used
link |
02:03:23.760
for their body and intelligence, but not for their purpose. I've changed them to repurpose them for
link |
02:03:30.880
other purposes, for deeper, more meaningful purposes than just like the Butter Robot. Yeah,
link |
02:03:36.720
which just brings a lot of people happiness, I'm sure. They have a camera because the thing they
link |
02:03:43.120
advertised, I had my own camera still, but the camera on the new Roomba, they have
link |
02:03:50.800
state-of-the-art poop detection, as they advertised. Apparently it's a
link |
02:03:55.360
big problem for vacuum cleaners: if they go over, like, dark poop, it just runs it over and
link |
02:04:01.040
creates a giant mess. So apparently they collected a huge amount of data on
link |
02:04:07.600
different shapes and looks and whatever of poop, and now it's able to avoid it and so on.
link |
02:04:12.160
They're very proud of this. So there is a camera, but you don't think of it as having a camera.
link |
02:04:19.920
Yeah, you don't think of it as having a camera because you've grown to trust it, I guess, because
link |
02:04:24.400
our phones, at least most of us seem to trust this phone, even though there's a camera looking
link |
02:04:31.600
directly at you. I think that if you trust that the company is taking security very seriously,
link |
02:04:41.360
I actually don't know how that trust was earned with smartphones. I think it just started to provide
link |
02:04:46.720
a lot of positive value into your life where you just took it in and then the company over time
link |
02:04:51.840
has shown that it takes privacy very seriously, that kind of stuff. But Amazon has not always,
link |
02:04:58.480
with its social robots, communicated that this is a trustworthy thing, both in terms of culture
link |
02:05:05.120
and competence. Because I think privacy is not just about what you intend to do,
link |
02:05:10.400
but also how good you are at doing that kind of thing. So that's a really hard problem to
link |
02:05:15.680
solve. But a lot of us have Alexas at home, and I mean, Alexa could be listening in the whole time
link |
02:05:24.000
and doing all sorts of nefarious things with the data. Hopefully it's not, and I don't think it is.
link |
02:05:32.320
But Amazon is not, it's such a tricky thing for a company to get right, which is to earn the trust.
link |
02:05:38.160
I don't think Alexa has earned people's trust quite yet. Yeah, I think it's not there quite yet.
link |
02:05:44.720
I agree, I agree. They struggle with this kind of stuff. In fact, when these topics are brought
link |
02:05:48.080
up, people always get nervous. And I think if you get nervous about it, the way to earn
link |
02:05:56.800
people's trust is not by like, ooh, don't talk about this. It's just be open, be frank, be
link |
02:06:02.560
transparent, and also create a culture of where it radiates at every level from engineer to CEO
link |
02:06:11.040
that like, you're good people that have a common sense idea of what it means to respect basic
link |
02:06:20.080
human rights and the privacy of people and all that kind of stuff. And I think that propagates
link |
02:06:25.600
throughout the, that's the best PR, which is like, over time, you understand that these are
link |
02:06:32.160
good, these are good folks doing good things. Anyway, speaking of social robots, have you
link |
02:06:39.280
heard about the Tesla Bot, the humanoid robot? Yes, I have. Yes, yes, yes. But I don't exactly
link |
02:06:45.200
know what it's designed to do. Do you? You probably do. No, I know what it's designed to do,
link |
02:06:51.840
but I have a different perspective on it. It's a humanoid form, and it's
link |
02:06:57.760
designed for automation tasks in the same way that industrial robot arms automate tasks in the
link |
02:07:05.680
factory. So it's designed to automate tasks in the factory. But I think that humanoid form,
link |
02:07:10.640
as we were talking about before, is one that we connect with as human beings. Anything legged,
link |
02:07:20.720
honestly, but the humanoid form especially, we anthropomorphize it most intensely. And so,
link |
02:07:27.600
the possibility to me, it's exciting to see both Atlas developed by Boston Dynamics and anyone,
link |
02:07:36.640
including Tesla, trying to make humanoid robots cheaper and more effective. To me, the obvious way
link |
02:07:44.720
it transforms the world is social robotics, versus automation of tasks in the factory.
link |
02:07:52.880
So, yeah, I just wanted to, in case that was something you were interested in, because I find
link |
02:07:57.840
its application to social robotics super interesting. We did a lot of work with Pepper,
link |
02:08:04.320
Pepper the Robot a while back. We were like the emotion engine for Pepper, which is SoftBank's
link |
02:08:09.840
humanoid robot. And how tall is Pepper? It's like, yeah, like, I don't know, like five foot maybe,
link |
02:08:17.760
right? Yeah, pretty big, pretty big. And it was designed to be in, like, airport lounges and retail
link |
02:08:26.800
stores, mostly customer service, right? Hotel lobbies. And I mean, I don't know where the
link |
02:08:35.520
state of the robot is, but I think it's very promising. I think there are a lot of applications
link |
02:08:39.280
where this can be helpful. I'm also really interested in, yeah, social robotics for the home,
link |
02:08:44.640
right? Like that can help elderly people, for example, transport things from one location of
link |
02:08:50.560
the home to the other, or even like just have your back in case something happens. Yeah,
link |
02:08:57.200
I don't know. I do think it's a very interesting space. It seems early though. Do you feel like
link |
02:09:01.840
the timing is now?
link |
02:09:06.000
I, yes, 100%. So it always seems early until it's not, right?
link |
02:09:11.840
Right, right, right. I think the time, I definitely think that the time is now,
link |
02:09:21.200
like this decade, for social robots, whether the humanoid form is right, I don't think so.
link |
02:09:29.520
I don't. I think, like, if we just look at Jibo as an example,
link |
02:09:36.800
I feel like most of the problem, the challenge, the opportunity of social connection between
link |
02:09:45.360
an AI system and a human being does not require you to also solve the problem of robot manipulation
link |
02:09:52.720
and mobility, bipedal mobility. So I think you could do that with just a screen, honestly,
link |
02:09:58.880
but there's something about the interface of Jibo, that it can rotate and so on, that's also compelling.
link |
02:10:03.600
But you get to see all these robot companies that fail, incredible companies like Jibo and
link |
02:10:12.320
even... I mean, iRobot in some sense is a big success story in that it was able to find
link |
02:10:20.160
a niche thing and focus on it. But in some sense, it's not a success story because they
link |
02:10:24.480
didn't build any other robot, like any other, it didn't expand into all kinds of robotics,
link |
02:10:31.360
like once you're in the home, maybe that's what happens with Amazon is they'll flourish into
link |
02:10:35.040
all kinds of other robots. But do you have a sense, by the way, why it's so difficult to
link |
02:10:42.480
build a robotics company? Like, why have so many companies failed? I think it's like you're
link |
02:10:49.360
building a vertical stack, right? Like you're building the hardware plus the software and you
link |
02:10:53.280
find you have to do this at a cost that makes sense. So I think Jibo was retailing at like,
link |
02:10:59.760
I don't know, like $700, $800, which for the use case, right? There's a dissonance there,
link |
02:11:09.120
it's too high. So I think cost of building the whole platform in a way that is affordable
link |
02:11:19.440
for what value it's bringing, I think that's the challenge. I think for these home robots
link |
02:11:24.880
that are going to help, you know, help you do stuff around the home, that's a challenge too,
link |
02:11:31.600
like the mobility piece of it. That's hard. Well, one of the things I'm really excited with
link |
02:11:36.800
Tesla Bot is the people working on it. And that's probably the criticism I would apply to
link |
02:11:43.360
some of the other folks who worked on social robots is the people working on Tesla Bot know
link |
02:11:48.880
how to, they're focused on and know how to do mass manufacture and create a product that's super
link |
02:11:53.360
cheap. Very cool. That's the focus. The engineering focus, and I would say that you can also
link |
02:11:58.640
criticize them for that, is they're not focused on the experience of the robot. They're focused
link |
02:12:04.560
on how to get this thing to do the basic stuff that the humanoid form requires to do as cheap as
link |
02:12:12.400
possible, with the fewest number of actuators, the fewest number of motors, the increased
link |
02:12:17.760
efficiency, the decreased weight, all that kind of stuff. So that's really interesting.
link |
02:12:21.520
I would say that Jibo and all those folks, they focus on the design, the experience, all of that,
link |
02:12:27.680
and it's secondary how to manufacture. But it's like, no, you have to think,
link |
02:12:32.320
like the Tesla Bot folks, from first principles: what is the fewest number of components, the
link |
02:12:38.000
cheapest components, how can I build it as much in house as possible without having to
link |
02:12:44.240
consider all the complexities of a supply chain, all that kind of stuff.
link |
02:12:47.760
Because if you have to build a robotics company, you're not building one robot, you're building
link |
02:12:54.160
hopefully millions of robots. You have to figure out how to do that. Where the final thing, I mean,
link |
02:12:59.360
if it's a Jibo type of robot, is there a reason, and we don't need to have a lengthy
link |
02:13:04.880
discussion, is there a reason why Jibo has to be over a hundred dollars?
link |
02:13:08.640
It shouldn't be. Right. Like the basic components of it.
link |
02:13:13.120
Right. Like you could start to actually discuss like, okay, what is the essential thing about
link |
02:13:17.840
Jibo? How much, what is the cheapest way I can have a screen? What's the cheapest way I can have
link |
02:13:22.640
a rotating base, all that kind of stuff. And then you get down, continuously drive down costs.
link |
02:13:29.440
Speaking of which, you have launched extremely successful companies, you have helped others,
link |
02:13:35.280
you've invested in companies. Can you give advice on how to start a successful company?
link |
02:13:43.840
I would say have a problem that you really, really, really want to solve. Right. Something
link |
02:13:48.720
that you're deeply passionate about. And honestly, take the first step. Like that's often the hardest
link |
02:13:58.080
and don't overthink it. Like, you know, like this idea of a minimum viable product or a minimum
link |
02:14:02.960
viable version of an idea. Right. Like, yes, you're thinking about this, like a humongous,
link |
02:14:07.200
like super elegant, super beautiful thing. But reduce it to the littlest thing
link |
02:14:12.480
you can bring to market that can solve a problem or that can, you know, help address
link |
02:14:18.640
a pain point that somebody has. They often tell you, like, start with a customer of one.
link |
02:14:23.920
Right. If you can solve a problem for one person, whether that's yourself or
link |
02:14:28.560
some other person. Right. Pick a person. Exactly. It could be you. Yeah. It's actually
link |
02:14:33.360
often a good sign: if you enjoy a thing where you have a specific problem that
link |
02:14:38.240
you'd like to solve, that's a good n of one to focus on. Right. What else,
link |
02:14:43.920
what else is there? So, step one is the hardest, but there are other steps as well,
link |
02:14:50.080
right? I also think like who you bring around the table early on is so key. Right. Like being
link |
02:14:58.560
clear on, on what I call like your core values or your North Star. It might sound fluffy, but
link |
02:15:03.760
actually it's not. So, and Roz and I, I feel like we did that very early on. We sat around her
link |
02:15:09.440
kitchen table and we said, okay, there's so many applications of this technology. How are we going
link |
02:15:14.000
to draw the line? How are we going to set boundaries? We came up with a set of core values
link |
02:15:18.320
that in the hardest of times we fell back on to determine how we make decisions. And so,
link |
02:15:25.440
I feel like just getting clarity on these core values. Like, for us, it was respecting people's privacy,
link |
02:15:30.320
only engaging with industries where it's clear opt in. So, for instance, we don't do any work
link |
02:15:35.360
in security and surveillance. So, things like that. We're very big on, you know,
link |
02:15:41.440
one of our core values is human connection and empathy. Right. And that is, yes, it's an AI company,
link |
02:15:46.320
but it's about people. Well, these all become encoded in how we act, even if you're
link |
02:15:53.520
a small, tiny team of two or three or whatever. So, I think that's another piece of advice.
link |
02:15:59.680
So, what about finding people, hiring people? If you care about people as much as you do, like,
link |
02:16:05.920
it seems like such a difficult thing to hire the right people.
link |
02:16:10.720
I think early on in the startup, you want people who share the passion and the
link |
02:16:15.520
conviction because, because it's going to be tough. Like, I've yet to meet a startup where it was just
link |
02:16:22.320
a straight line to success. Right. Even, even not just startup, like, even everyday people's
link |
02:16:27.360
lives, right. You always like run into obstacles and you run into naysayers and
link |
02:16:35.280
so, you need people who are believers, whether they're people on your team or even your investors,
link |
02:16:40.400
you need investors who are really believers in what you're doing, because that means they will
link |
02:16:45.120
stick with you. They won't, they won't give up at the first obstacle. I think that's important.
link |
02:16:50.640
What about raising money? What about finding investors? First of all, raising, raising money,
link |
02:16:58.160
but also raising money from the right sources, ones that ultimately don't hinder you, but, you
link |
02:17:03.920
know, help you, empower you, all that kind of stuff. What advice would you give there?
link |
02:17:08.320
You've successfully raised money many times in your life.
link |
02:17:11.360
Yeah, again, it's not just about the money. It's about finding the right investors who
link |
02:17:17.200
are going to be aligned in terms of what you want to build and believe in your core values. Like,
link |
02:17:22.960
for example, especially later on, like I, yeah, in my latest, like, round of funding,
link |
02:17:29.600
I tried to bring in investors that really care about, like, the ethics of AI, right. And
link |
02:17:35.600
the alignment of vision and mission and core values is really important. It's like you're
link |
02:17:41.360
picking a life partner, right? It's the same kind of. So you take it that seriously for investors?
link |
02:17:47.280
Yeah, because they're going to have to stick with you. You're stuck together.
link |
02:17:51.280
For a while, anyway. Yeah. Maybe not for life, but for a while, for sure.
link |
02:17:56.720
For better or worse, I forget what the vows usually sound like. For better or worse? No.
link |
02:18:00.800
Through sickness, through something. Yeah. Oh, boy. Yeah. Anyway, it's romantic and deep,
link |
02:18:11.760
and you're in it for a while. So it's not just about the money. You tweeted about going to your
link |
02:18:19.840
first capital camp, an investing get-together. And then you learned a lot. So this is about
link |
02:18:24.560
investing. So what have you learned from that? What have you learned about investing in general?
link |
02:18:32.560
From both? Because you've been on both ends of it. I mean, I try to use my experience as an operator
link |
02:18:39.120
now with my investor hat on when I'm identifying companies to invest in. First of all, I think
link |
02:18:46.320
the good news is because I have a technology background, right, and I really understand machine
link |
02:18:50.720
learning and computer vision and AI, et cetera. I can apply that level of understanding, right?
link |
02:18:56.400
Because everybody says they're an AI company or they're an AI tech. And I'm like, no, no, no, no,
link |
02:19:01.760
show me the technology. So I can do that level of diligence, which I actually love.
link |
02:19:07.360
And then I have to do the litmus test of, you know, if I'm in a conversation with you,
link |
02:19:11.920
am I excited to tell you about this new company that I just met, right? And if I'm an ambassador
link |
02:19:19.280
for that company and I'm passionate about what they're doing, I usually use that. Yeah, that's
link |
02:19:25.520
important to me when I'm investing. So that means you actually can explain what they're doing
link |
02:19:32.800
and you're excited about it. Exactly. Exactly. Thank you for putting it so succinctly.
link |
02:19:39.760
I was, like, rambling, but exactly, that's it. No, but sometimes it's funny, sometimes it's unclear
link |
02:19:45.680
exactly. I'll hear people tell me, you know, they'll talk for a while and it sounds cool, like they
link |
02:19:53.280
paint a picture of a world, but then when you try to summarize it, you're not exactly clear
link |
02:19:58.240
of what, maybe, the core powerful idea is. Like, you can't just build another Facebook;
link |
02:20:06.240
there has to be a core, simple-to-explain idea that, yeah, that then you can
link |
02:20:14.320
or can't get excited about, but it's there. It's right there. Yeah. Yeah. But like, how do you
link |
02:20:21.040
ultimately pick who you think will be successful? So it's not just about the thing you're excited
link |
02:20:27.360
about, like there's other stuff. Right. And then there's all the, you know, with early stage companies,
link |
02:20:33.040
like pre seed companies, which is where I'm investing, sometimes the, the business model
link |
02:20:38.560
isn't clear yet, or the go to market strategy isn't clear. There's usually like, it's very
link |
02:20:43.120
early on that some of these things haven't been hashed out, which is okay. So the way I like
link |
02:20:48.240
to think about it is like, if this company is successful, will this be a multi billion slash
link |
02:20:52.800
trillion dollar market, you know, or company? And, and so that's definitely a lens that I use.
link |
02:21:00.320
What's pre-seed? What are the different stages? And what's the most exciting
link |
02:21:04.880
stage, or not what's, but what's interesting about every stage, I guess.
link |
02:21:09.440
Yeah. So pre-seed is usually when you're just starting out, you've maybe raised the friends
link |
02:21:15.440
and family rounds, you've raised some money from people, you know, and you're getting ready to
link |
02:21:20.240
take your first institutional check, like your first check from an investor. And I love this stage.
link |
02:21:28.640
There's a lot of uncertainty. So some investors really don't like the stage because
link |
02:21:32.880
the financial models aren't there. Often the teams aren't even fully formed; it's really, really early.
link |
02:21:40.880
But to me, it's, it's like a magical stage because it's the time when there's so much conviction,
link |
02:21:47.600
so much belief, almost delusional, right? Yeah. And there's a little bit of naivete around
link |
02:21:53.040
with founders at that stage. And I just love it. It's contagious. And I love that
link |
02:22:03.840
often they're first-time founders, not always, but often, and I can
link |
02:22:07.920
share my experience as a founder myself and I can empathize, right? And I can almost,
link |
02:22:14.400
I create a safe ground where, because, you know, you have to be careful what you tell your investors,
link |
02:22:19.680
right? And I will, I will often like say, I've been in your shoes as a founder, you can tell me
link |
02:22:25.600
if it's challenging, you can tell me what you're struggling with, it's okay to vent.
link |
02:22:29.920
So I create that safe ground. And I think, I think that's the superpower.
link |
02:22:35.120
Yeah, I guess you have to figure out if this kind of person is going to be able
link |
02:22:40.320
to ride the roller coaster, like of many pivots and challenges and all that kind of stuff. And if
link |
02:22:48.560
the space of ideas they're working in is interesting, like the way they think about the world. Yeah,
link |
02:22:54.720
because if it's successful, the thing they end up with might be very different from the thing
link |
02:22:59.920
they started with. Actually, you know, I was going to say the third, so the technology is one
link |
02:23:06.160
aspect, the market or the idea, right, is the second and the third is the founder, right? Is
link |
02:23:11.200
this somebody who I believe has conviction, is a hustler, you know, is going to overcome
link |
02:23:18.080
obstacles. And, I think, is going to be a great leader, right? Like as a startup,
link |
02:23:24.320
as a founder, you're often, you are the first person and your role is to bring amazing people
link |
02:23:30.240
around you to build this thing. And so you're an evangelist, right? So how good are you
link |
02:23:37.120
going to be at that? So I try to evaluate that too. You also, in the tweet thread about it,
link |
02:23:43.520
mentioned random rich dudes, RRDs, is this a known concept? And you said that there should be, like,
link |
02:23:50.720
random rich women, I guess. What's the women's version
link |
02:23:57.040
of dudes? Ladies? I don't know. Is this a technical term? Is it known, random
link |
02:24:02.800
rich dudes? Well, I didn't make that up, but I was at this capital camp, which is a get together for
link |
02:24:08.400
investors of all types. And there must have been maybe 400 or so attendees, maybe 20 were women.
link |
02:24:18.080
It was just very disproportionately, you know, male dominated, which I'm used to. I think
link |
02:24:25.280
you're used to this kind of thing. I'm used to it, but it's still surprising. And as I'm raising
link |
02:24:30.080
money for this fund, my fund partner is a guy called Rob May, who's done this before. So
link |
02:24:37.040
I'm new to the investing world, but he's done this before. Most of our investors in the fund are
link |
02:24:43.280
these, I mean, awesome, and I'm super grateful to them, just random rich guys. I'm like, where are the
link |
02:24:48.880
rich women? So I'm really adamant about both investing in women-led AI companies, and I also would love
link |
02:24:56.560
to have women investors be part of my fund. Because I think that's how we drive change.
link |
02:25:02.320
Yeah. So the next... you know, that takes time, of course, but there's been quite
link |
02:25:07.360
a lot of progress. But yeah, for the next Mark Zuckerberg to be a woman and all that
link |
02:25:12.080
kind of stuff, because that's just, like, a huge amount of wealth generated by women and
link |
02:25:17.920
then controlled by women and allocated by women and all that kind of stuff. And then beyond just
link |
02:25:22.880
women, just broadly across all different measures of diversity and so on. Let me ask you to put
link |
02:25:32.000
on your wise sage hat. Okay. So we already, we already gave advice on startups and just advice
link |
02:25:42.320
for women. But in general, advice for folks in high school or college today, how to have a career
link |
02:25:49.680
they can be proud of, how to have a life they can be proud of. I suppose you have to give this kind
link |
02:25:56.720
of advice to your kids. Yeah. Well, here's the number one advice that I give to my kids.
link |
02:26:03.200
My daughter is now 19, by the way, and my son is 13 and a half. So they're not little kids anymore.
link |
02:26:09.200
But I think the advice still applies. They're awesome. They're my best friends. But yeah, I think the number
link |
02:26:17.440
one advice I would share is embark on a journey without attaching to outcomes and enjoy the journey.
link |
02:26:24.720
Right? So, you know, we're often so obsessed with the end goal that, A, it doesn't allow us to
link |
02:26:32.560
be open to different endings of a journey or a story. So you become like so fixated on a
link |
02:26:40.480
particular path, you don't see the beauty in the other alternative path. And then you forget to
link |
02:26:47.840
enjoy the journey because you're just so fixated on the goal. And I've been guilty of that for many,
link |
02:26:53.040
many years in my life. And I'm now trying to make the shift of, no, no,
link |
02:26:58.000
no, I'm gonna, again, trust that things are going to work out and it'll be amazing and maybe even
link |
02:27:03.680
exceed your dreams. We have to be open to that. Yeah. Taking, taking a leap into all kinds of
link |
02:27:09.520
things. I think you tweeted that you went on vacation by yourself or something like this,
link |
02:27:13.680
I know this. And just going, just taking the leap, doing it, and
link |
02:27:20.400
enjoying the moment, enjoying the weeks, not looking at
link |
02:27:26.160
some kind of career-ladder next step and so on. Yeah. There's something to that,
link |
02:27:32.000
like overplanning too. I'm surrounded by a lot of people that kind of do, but I don't plan.
link |
02:27:37.600
You don't. No. Do you not do goal setting?
link |
02:27:40.640
My goal setting is very... I like affirmations. It's very,
link |
02:27:50.800
it's almost, I don't know how to put it into words, but it's a little bit like
link |
02:27:58.160
what my heart yearns for, I guess in the space of emotions more than in
link |
02:28:04.960
the rational space. Cause I just try to picture a world that I would like
link |
02:28:13.280
to be in and that world is not clearly pictured. It's mostly in the emotional world. I mean,
link |
02:28:18.800
I think about that from, from robots, cause you know, I have this desire. I've had it my whole
link |
02:28:25.520
life to... well, it took different shapes, but I think once I discovered AI, the desire was,
link |
02:28:32.880
I think, in the context of this conversation, most easily described as basically a
link |
02:28:41.120
social robotics company. And that's something I dreamed of doing. And well, there's a lot,
link |
02:28:50.240
there's a lot of complexity to that story, but that's the only thing,
link |
02:28:54.400
honestly I dream of doing. So I imagine a world that, that I could help create, but it's not a,
link |
02:29:03.360
there's no steps along the way. And I think I'm just kind of stumbling around and following
link |
02:29:10.400
happiness and working my ass off in almost random directions, like an ant does.
link |
02:29:17.120
But a lot of people, a lot of successful people around me say this, you should have a plan,
link |
02:29:20.400
you should have a clear goal. You have a goal at the end of the month, you have a goal at the
link |
02:29:23.200
end of the year. I don't. And there's a balance to be struck, of course, but
link |
02:29:32.880
there's something to be said about really making sure that you're living life to the fullest
link |
02:29:40.080
that goals can actually get in the way of. So one of the best, like kind of most,
link |
02:29:46.880
what do you call it when it, like, challenges your brain? What do you call it?
link |
02:29:56.560
The only thing that comes to mind, and this is me saying it, is a mind fuck, but yes.
link |
02:30:00.080
Okay, okay. Something like that. Yes.
link |
02:30:03.760
Super inspiring talk. Kenneth Stanley, he was at OpenAI, he just left. And he has a book
link |
02:30:10.080
called Why Greatness Cannot Be Planned. And it's actually an AI book. And he's done all these
link |
02:30:15.120
experiments that basically show that when you over-optimize, the tradeoff is
link |
02:30:22.160
you're less creative, right? And to create true greatness and truly creative solutions to problems,
link |
02:30:29.360
you can't overplan it. You can't. And I thought that was, and so he generalizes it beyond AI.
link |
02:30:35.280
And he talks about how we apply that in our personal life and in our organizations and our
link |
02:30:39.760
companies, which are over-KPI'd, right? Like, look at any company in the world. And it's all like,
link |
02:30:44.400
these are the goals. These are the, you know, weekly goals and, you know, the sprints and then
link |
02:30:49.200
the quarterly goals, blah, blah, blah. And, and he just shows with a lot of his AI experiments
link |
02:30:55.040
that that's not how you create truly game changing ideas. So there you go. Yeah. Yeah.
link |
02:31:01.120
But he's awesome. Yeah, there's a balance, of course, because, yeah,
link |
02:31:06.160
many moments of genius will not come from planning and goals, but you still have to build factories
link |
02:31:12.720
and you still have to manufacture and you still have to deliver and there's still deadlines and
link |
02:31:16.400
all that kind of stuff. And for that, it's good to have goals. I do goal setting with my kids.
link |
02:31:21.280
We all have our goals, but I think we're starting to morph into more of these, like,
link |
02:31:26.560
bigger picture goals and not obsess about like, I don't know, it's hard. Well, I honestly think
link |
02:31:32.160
with, especially with kids, it's better, much, much better to have a plan and have goals and so
link |
02:31:36.000
on. Cause you have to learn the muscle of what it feels like to get stuff done.
link |
02:31:40.240
Yeah. But I think once you learn that, there's flexibility. For me, I spent most of my
link |
02:31:45.680
life with goal setting and so on. So, like, I've gotten good with grades in school. I mean,
link |
02:31:50.640
in school, if you want to be successful at school, yeah. I mean, the kind of stuff in high school
link |
02:31:55.520
and college that kids have to do in terms of managing their time and getting so much stuff done,
link |
02:32:00.960
it's like, you know, taking five, six, seven classes in college, they're like, that would break
link |
02:32:06.960
the spirit of most humans if they took one of them later in life. It's really difficult
link |
02:32:13.760
stuff, especially engineering curricula. So I think you have to learn that skill. But once
link |
02:32:20.160
you learn it, you can maybe, cause you can be a little bit on autopilot and use that
link |
02:32:24.800
momentum and then allow yourself to be lost in the flow of life, you know, just kind of be.
link |
02:32:29.520
Also, I work pretty hard to allow myself to have the freedom to do that. That's
link |
02:32:39.920
really, that's a tricky freedom to have because a lot of people get lost in the rat race,
link |
02:32:45.200
and also, like, financially, whenever they get a raise, they'll get
link |
02:32:53.040
like a bigger house or something like this, so you're always trapped in this
link |
02:32:58.240
race. I put a lot of emphasis on living below my means, always. And so there's a lot of
link |
02:33:06.800
freedom to do whatever the heart desires. But everyone has
link |
02:33:12.640
to decide what's the right thing, what's the right thing for them. For some people,
link |
02:33:16.560
having a lot of responsibilities, like a house they can barely afford, or having a lot of kids,
link |
02:33:22.240
the responsibility side of that really helps them get their shit together. Like, all right,
link |
02:33:28.720
I need to be really focused. Some of the most successful people I know have kids, and the
link |
02:33:33.280
kids bring out the best in them. They make them more productive, not less productive.
link |
02:33:36.400
Accountability, accountability thing. And almost something to actually live and
link |
02:33:41.600
fight and work for, like having a family. It's, it's fascinating to see because you would think
link |
02:33:47.440
kids would be a hit on productivity, but they're not for a lot of really successful people.
link |
02:33:52.080
They're really like an engine of efficiency. Oh my God. Yeah. It's weird. Yeah.
link |
02:33:57.680
I mean, it's beautiful. It's beautiful to see. And also a source of happiness. Speaking of which,
link |
02:34:03.120
what role do you think love plays in the human condition? Love?
link |
02:34:12.080
I think love is, is, um,
link |
02:34:14.560
yeah, I think, I think it's why we're all here. I think it would be very hard to live life without
link |
02:34:23.680
love in any of its forms, right? Yeah, those are the most beautiful forms
link |
02:34:32.960
that human connection takes, right? Yeah. I feel like everybody wants to feel loved,
link |
02:34:39.920
right? And one way or another, right? And to love. Yeah. And to love too. Totally. Yeah,
link |
02:34:45.120
I agree with that. Both of it. I'm not even sure what feels better.
link |
02:34:50.240
Both, both like that. To give and to get love too. Yeah. And it is, like we've been talking
link |
02:34:56.560
about an interesting question, whether some of that, whether one day we'll be able to love a toaster.
link |
02:35:02.000
Okay. Get some small. I wasn't quite thinking about that when I said like, love, yeah, love.
link |
02:35:09.200
That's all I was thinking about. Love and give love. Okay. I was thinking about Brad Pitt and
link |
02:35:12.480
toasters. Toasters, great. All right. Well, I think we, we started on love and ended on love.
link |
02:35:19.920
This was an incredible conversation, Rana. And thank you so much. You're an incredible person.
link |
02:35:24.640
Thank you for everything you're doing in AI, in the space of just caring about humanity,
link |
02:35:32.560
human emotion, about love, and being an inspiration to a huge number of people in robotics, in AI,
link |
02:35:39.520
in science, in the world in general. So thank you for talking to me. It's an honor.
link |
02:35:44.080
Thank you for having me. And you know, I'm a big fan of yours as well. So it's been a pleasure.
link |
02:35:49.440
Thanks for listening to this conversation with Rana el Kaliouby. To support this podcast,
link |
02:35:53.760
please check out our sponsors in the description. And now let me leave you with some words from
link |
02:35:58.720
Helen Keller. The best and most beautiful things in the world cannot be seen or even touched.
link |
02:36:05.200
They must be felt with the heart. Thank you for listening and hope to see you next time.