Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars | Lex Fridman Podcast #322
There's a broader question here, right? As we build socially and emotionally intelligent machines, what does that mean about our relationship with them, and then more broadly our relationship with one another? Because this machine is going to be programmed to be amazing at empathy, by definition, right? It's going to always be there for you. It's not going to get bored. I don't know how I feel about that. I think about that a lot.
LEX FRIDMAN The following is a conversation with Rana el Kaliouby, a pioneer in the field of emotion recognition and human-centric artificial intelligence. She is the founder of Affectiva, Deputy CEO of Smart Eye, author of Girl Decoded, and one of the most brilliant, kind, inspiring, and fun human beings I've gotten the chance to talk to. This is the Lex Fridman Podcast. To support it, please check out our sponsors in the description. And now, dear friends, here's Rana el Kaliouby.

You grew up in the Middle East, in Egypt. What is the memory from that time that makes you smile? Or maybe a memory that stands out as helping your mind take shape and helping you define yourself in this world?
RANA EL KALIOUBY So the memory that stands out is we used to live in my grandma's house. She used to have these mango trees in her garden, and mango season was July and August. So in the summer she would invite all my aunts and uncles and cousins, and there were maybe 20 or 30 people in the house, and she would cook all this amazing food. And us, the kids, we would go down to the garden and pick all these mangoes. I think it's just the bringing people together that always stuck with me, the warmth.

LEX FRIDMAN Around the mango tree.
RANA EL KALIOUBY Yeah, around the mango tree. There's just the joy, the joy of being together around food. And I'm a terrible cook, so I guess that memory didn't translate to me doing the same, but I love hosting people.

LEX FRIDMAN Do you remember colors, smells? How does the memory work? What do you visualize? Do you visualize people's faces, smiles? Are there colors? Is there a theme to the colors? Is it smells, because of the food involved?

RANA EL KALIOUBY Yeah, that's a great question. Those Egyptian mangoes, there's a particular type that I love called Darwasi mangoes. They're oval, and they have a little red in them, so they're red and mango-colored on the outside. I remember that.

LEX FRIDMAN Does red indicate extra sweetness? It means it's nice and ripe and stuff.
Yeah.

What's a definitive food of Egypt? There are these almost stereotypical foods in different parts of the world. Like, Ukraine invented borscht. Borscht is this beet soup that you put sour cream on. See, I can't tell if you know what it is. If you know it, you know it's delicious, but if I explain it, it's just not going to sound delicious. Beet soup, that doesn't make any sense. And you've probably seen pictures of it, because it's one of the traditional foods in Ukraine, in Russia, in different parts of the Slavic world. It's become so cliche and stereotypical that you almost don't mention it, but it's still delicious. When I visited Ukraine, I ate it every single day.

Do you make it yourself? How hard is it to make?
No, I don't. I think to make it well, like anything, is hard. The Italians say tomato sauce is easy to make, but to make it right, that's a generational skill. So anyway, is there something like that in Egypt? Is there a culture of food?

There is. And actually, we have a similar kind of soup. It's called molokhia, and it's made of this green plant that's somewhere between spinach and kale. You mince it, and then you cook it in chicken broth. My grandma used to make it, and my mom makes it really well, and I try to make it, but it's not as great. So we used to have that, and we used to have it alongside stuffed pigeons. I'm pescetarian now, so I don't eat that anymore.
Yeah, it was really yummy. It's the one thing I miss now that I'm pescetarian.

The stuffed pigeons?

Yeah, the stuffed pigeons.

What are they stuffed with, if that doesn't bother you too much to describe?

No, no, it's just stuffed with rice. Yeah.
You also said in your book that your first computer was an Atari, and Space Invaders was your favorite game. Is that when you first fell in love with computers, would you say?

Yeah, I would say so.

Video games, or just the computer itself? Just something about the machine. Ooh, this thing, there's magic in here.

Yeah, I think the magical moment is definitely playing video games with my sisters. I have two younger sisters, and we would just have fun together playing games.
But the other memory I have is the first code I wrote. I drew a Christmas tree. And I'm Muslim, right? So it was kind of funny that the first thing I did was this Christmas tree. That's when I realized, wow, you can write code to do all sorts of really cool stuff. I must have been six or seven at the time.

So you can write programs, and the programs do stuff for you. That's power. If you think about it, that's empowering.
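As an aside for readers, a minimal sketch of the kind of first program she describes might look like this. The original would have been written on an Atari-era machine; the language, function name, and tree shape here are purely illustrative assumptions.

```python
# A minimal sketch of a "first program": print an ASCII Christmas tree.
# Illustrative only; the original was written on an Atari-era computer.

def christmas_tree(height: int = 5) -> str:
    """Return an ASCII tree: a triangle of stars over a one-line trunk."""
    width = 2 * height - 1          # widest row of the triangle
    rows = []
    for i in range(height):
        stars = "*" * (2 * i + 1)   # 1, 3, 5, ... stars per row
        rows.append(stars.center(width))
    rows.append("|".center(width))  # the trunk
    return "\n".join(rows)

print(christmas_tree(5))
```

A few lines of code producing something visible on the screen is exactly the "programs do stuff for you" moment described above.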
Yeah. I don't know if many people think of it that way when they first learn to program. They just love the puzzle of it. Like, ooh, this is cool, this is pretty. It's a Christmas tree, but it's power. Eventually, I guess you couldn't at the time, but eventually this thing, if it's interesting enough, if it's a pretty enough Christmas tree, can be run by millions of people and bring them joy. And because it's digital, it's easy to spread. You just created something that's easily spreadable to millions of people.

It's hard to think that way when you're six.
In the book, you write: "I am who I am because I was raised by a particular set of parents, both modern and conservative, forward-thinking yet locked in tradition. I'm a Muslim, and I feel I'm stronger, more centered for it. I adhere to the values of my religion, even if I'm not as dutiful as I once was. And I am a new American, and I'm thriving on the energy, vitality, and entrepreneurial spirit of this great country."

So let me ask you about your parents. What have you learned about life from them, especially when you were young?
So both my parents are Egyptian, but they moved to Kuwait. Actually, there's a cute story about how they met. My dad taught COBOL in the '70s, and my mom decided to learn programming, so she signed up to take his COBOL programming class. He tried to date her, and she was like, no, no, no, I don't date. And so he was like, okay, I'll propose. And that's how they got married.

Whoa, strong move.

Right, exactly.

That's really impressive. Those COBOL guys know how to impress a lady. So yeah, what have you learned from them?
So definitely grit. One of the core values in our family is just hard work. There were no slackers in our family, and that's something that's definitely stayed with me, both as a professional and in my personal life. But I also think of my mom. What my mom gave us was, I don't know, it was like unconditional love. I just knew my parents would be there for me regardless of what I chose to do. And I think that's very powerful. And they got tested on it, because I kind of challenged cultural norms and took a different path, I guess, than what's expected of a woman in the Middle East. And they still love me, which I'm so grateful for.
When was a moment that was the most challenging for them? A moment where they had to come face to face with the fact that you're a bit different?

I think the first big moment was when I had just gotten married, but I decided to go do my PhD at Cambridge University. And because my husband at the time, he's now my ex, ran a company in Cairo, he was going to stay there. So it was going to be a long-distance relationship. And that's very unusual in the Middle East, for a woman to just head out and pursue her own path. And so my dad and my parents-in-law both said, you know, we do not approve of you doing this, but now you're under the jurisdiction of your husband, so he can make the call. And luckily for me, he was supportive. He said, you know, this is your dream come true. You've always wanted to do a PhD. I'm going to support you. So I think that was the first time where I challenged the cultural norms. It was totally scary.
What was the biggest culture shock, from there to Cambridge, to London?

Well, that was also right around September 11th, so everyone thought there was going to be a third world war. At the time I used to wear the hijab, so I was very visibly Muslim, and my parents were afraid for my safety. But anyway, when I got to Cambridge, because I was so scared, I decided to take off my headscarf and wear a hat instead. So I just went to class wearing these British hats, which was, in my opinion, actually worse than just showing up in a headscarf, because it was just so awkward, right? Sitting in class with all these hats. So after a few weeks of doing that, I was like, to heck with that, I'm just going to go back to wearing my headscarf.
Yeah. You wore the hijab starting in 2000, and for 12 years after. So whenever you're in public, you wear the head covering. Can you speak to that, to the hijab, maybe your mixed feelings about it? What does it represent in its best case? What does it represent in the worst case?

Yeah, I guess I'll first start by saying I wore it by choice. I was not forced to wear it. And in fact, I was one of the very first women in my family to decide to put on the hijab. My family thought it was really odd, right? They were like, why do you want to put this on? At its best, it's a sign of modesty, humility.

It's like me wearing a suit. People are like, why are you wearing a suit? It's a step back into some kind of tradition, a respect for tradition of sorts. So you said, because it's by choice, you're kind of free to make that choice to celebrate a tradition of modesty.
Exactly. And I actually made it my own. I remember I would match the color of my headscarf with what I was wearing. It was a form of self-expression, and at its best, I loved wearing it. But I have a lot of questions around how we practice religion. And it was a time when I was spending a lot of time going back and forth between the US and Egypt, and I started meeting a lot of people in the US who are just amazing people, very purpose-driven, people who have very strong core values, but they're not Muslim. And that's okay, right? So that was when I just had a lot of questions. And politically, the situation in Egypt was that the Muslim Brotherhood ran the country, and I didn't agree with their ideology. And it was at a time when I was going through a divorce. It was just the perfect storm of political and personal conditions, where I was like, this doesn't feel like me anymore. And it took a lot of courage to take it off, because culturally, it's okay if you don't wear it, but it's really not okay to wear it and then take it off.
But you have to do that while still maintaining a deep core and pride in your origins, in your origin story. So still being Egyptian, still being a Muslim.

And being, I think, generally faith-driven, yeah.

But what that means changes year by year for you. It's like a personal journey. What would you say is the role of faith in that part of the world? How do you see it? You mentioned it a bit in the book too.
I think there is something really powerful about just believing that there's a bigger force. There's a kind of surrendering, I guess, that comes with religion. You surrender, and you have this deep conviction that it's going to be okay. The universe is out to do amazing things for you, and it's going to be okay. And there's strength to that. Even when you're going through adversity, you just know that it's going to work out.

Yeah, it gives you an inner peace, a calmness.

Yeah. It's faith, in all the meanings of that word. Faith that everything is going to be okay. And it is, because time passes, and time cures all things. It's like a calmness with the chaos of the world.

And also, I'm a true believer in this: something at a specific moment in time can look like it's catastrophic, and it's not what you wanted in life, but then time passes, and you look back, and there's the silver lining, right? Maybe it closed one door, but it opened a new door for you. So I'm a true believer that there's a silver lining in almost anything in life. You just have to have this faith, or conviction, that it's going to work out.
Yeah, it's such a beautiful way to see a shady feeling. If you feel shady about a current situation, it almost always is true, unless it's the cliche thing of whatever doesn't kill you makes you stronger. It does seem that, over time, when you take a perspective on things, the hardest moments and periods of your life are the most meaningful. So over time, you get to have that perspective.
Because you mentioned Kuwait, let me ask you about war. What's the role of war and peace, maybe even the big love and hate, in that part of the world? Because it does seem to be a part of the world where there's turmoil. There was turmoil, there's still turmoil.

It is so unfortunate, honestly. It's such a waste of human resources and human mindshare. At the end of the day, we all kind of want the same things. We want human connection, we want joy, we want to feel fulfilled, we want a life of purpose. And I just find it baffling, honestly, that we are still having to grapple with that.
I have a story to share about this. I'm Egyptian American now, but originally from Egypt. And when I first got to Cambridge, it turned out my officemate during my PhD, we ended up becoming friends, but she was from Israel. And we didn't know how it was going to be.

Did you guys sit there just staring at each other for a bit?

Actually, I arrived before she did, and it turns out she had emailed our PhD advisor and asked him if he thought it was going to be okay. This was around 9/11 too. And Peter Robinson, our PhD advisor, was like, yeah, this is an academic institution, just show up.
And we became super good friends. We were both new moms; we both had our kids during our PhDs. We were both doing artificial emotional intelligence: she was looking at speech, I was looking at the face. The culture was so similar, our jokes were similar. I was like, why on earth are our countries in conflict? Why is there all this hostility? And I think it falls back to the narrative, right? If you change the narrative... like, whoever creates this narrative of war. We should have women run the world.

Yeah, that's one solution.
The good women, because there are also evil women in the world. But yes, there could be less war if women ran the world. The other aspect is, whatever the gender, it's the people in power. I get to see this with Ukraine and Russia and different parts of the world around that war, and it's happening in Yemen as well, and everywhere else. There are these narratives told by the leaders to the populace, and those narratives take hold, and everybody believes them, and they have a distorted view of the humanity on the other side. In fact, especially during war, you don't even see the people on the other side as human, or as having equal intelligence or worth or value as you. You tell all kinds of narratives about them being Nazis or dumb or evil, or whatever narrative you want to weave around that.
But I think when you actually meet them face to face, you realize they're the same. It's actually a big shock for people to realize that they've essentially been lied to within their own country. And I kind of have faith that social media, as ridiculous as it is to say, or any kind of technology, is able to bypass the walls that governments put up and connect people directly. And then you get to realize, oh, people fall in love across different nations and religions. And that, I think, can ultimately cure a lot of our ills, especially in person. I also think that if leaders met in person, they'd have a conversation that could cure a lot of the ills of the world, especially in private.
Let me ask you about the women running the world. Gender does, in part, perhaps shape the landscape of our human experience. So in what ways was it limiting, and in what ways was it empowering, for you to be a woman in the Middle East?

I think, going back to my comment on women running the world, it comes back to empathy, which has been a common thread throughout my entire career. It's this idea of human connection. Once you build common ground with a person or a group of people, you build trust, you build loyalty, you build friendship. And then you can turn that into behavior change and motivation and persuasion. So empathy and emotions are just at the center of everything we do. And I think being from the Middle East, this human connection is very strong. We have this running joke that if you come to Egypt for a visit, people will know everything about your life right away, right? I have no problem asking you about your personal life.
There are no personal boundaries, really, in terms of getting to know people. We get emotionally intimate very, very quickly. But I think people just get to know each other authentically, I guess. There isn't this superficial level of getting to know people; you just try to get to know people really deeply. And empathy is a part of that, because you can put yourself in this person's shoes and imagine what challenges they're going through. So I think I've definitely taken that with me. Generosity is another one, too: being generous with your time and love and attention, and even with your wealth, right? Even if you don't have a lot of it, you're still very generous. And I think that's another...

Enjoying the humanity of other people.
And so, do you think there's a useful difference between men and women in that aspect, in empathy? Or does making these kinds of big general groupings hinder progress?

Yeah, I actually don't want to overgeneralize. I mean, some of the men I know are the most empathetic humans.

Yeah, I strive to be empathetic.

Yeah, you're actually very empathetic. So I don't want to overgeneralize. Although one of the researchers I worked with when I was at Cambridge, Professor Simon Baron-Cohen, he's Sacha Baron Cohen's cousin, runs the Autism Research Centre at Cambridge and has written multiple books on autism. One of his theories is the empathizing-systemizing scale: there are systemizers and empathizers, and there's a disproportionate number of computer scientists and engineers who are systemizers and perhaps not great empathizers. And there are more men in that bucket, I guess, than women, and more women in the empathizers bucket. So again, not to overgeneralize.
I sometimes wonder about that. It's been frustrating to me how many, I guess, systemizers there are in the field of robotics. It's actually encouraging to me, because I care about social robotics, obviously, and it means there's more opportunity for people who are empathic. Almost every roboticist I talk to doesn't see the human as interesting, as exciting. You want to avoid the human at all costs. It's a safety concern to be touching the human, which it is, but it's also an opportunity for deep connection or collaboration and all that kind of stuff. And because most brilliant roboticists don't care about the human, it's an opportunity. In your case, it's a business opportunity too, but in general, it's an opportunity to explore that space.
So in this beautiful journey to Cambridge, to the UK, and then to America, what's the moment, or moments, that were most transformational for you as a scientist and as a leader? You became an exceptionally successful CEO, founder, researcher, scientist, and so on. Was there a phase shift there, where you thought, I can be somebody, I can really do something in this world?

So actually, just a little bit of background. The reason I moved from Cairo to Cambridge, UK, to do my PhD is because I had a very clear plan. I was like, okay, I'll go abroad, get my PhD, crush it in three or four years, come back to Egypt, and teach. It was very clear, very well laid out.

Was the topic clear, or no?

Well, I did my PhD around building artificial emotional intelligence, looking at the face.
But in your master plan, ahead of time, when you were sitting by the mango tree, did you know it was going to be artificial intelligence?

No, no, that I did not know. Although I think I kind of knew that I was going to be doing computer science; I just didn't know the specific area. But I love teaching. I mean, I still love teaching. So yeah, I just wanted to go abroad, get a PhD, come back, and teach.

Why computer science? Can we just linger on that? Because you're such an empathic person who cares about emotion, humans, and so on. Aren't computers cold and emotionless and just...

We're changing that.
Yeah, I know, but did you see computers as having the capability to actually connect with humans?

I think that was my takeaway from my experience growing up: computers sit at the center of how we connect and communicate with one another, right? Or technology in general. I remember my first experience being away from my parents: we communicated with a fax machine. And thank goodness for the fax machine, because we could send letters back and forth to each other. This was pre-email. So I think technology can be not just transformative in terms of productivity, et cetera; it actually changes how we connect with one another.
Can I just defend the fax machine? There's something like the haptic feel, because email is all digital. There's something really nice. I still write letters to people. There's something nice about the haptic aspect of the fax machine, because you still have to press, you still have to do something in the physical world to make this thing a reality.

Right, and then it comes out as a printout, and you can actually touch it and read it.

There's something lost when it's just an email. Obviously, I wonder how we can regain some of that in the digital world, which goes to the metaverse and all those kinds of things. We'll talk about it anyway.
So, actually, a question on that: do you still have photo albums? Do you still print photos?

No, no, but I'm a minimalist. One of the painful steps in my life was to scan all the photos and let go of them, and then let go of all my books.

You let go of your books?

Switched to Kindle, everything Kindle. I thought, okay, think 30 years from now: nobody's going to have books anymore. The technology of digital books is going to get better and better. Are you really going to be the guy who's still romanticizing physical books? Are you going to be the old man on the porch who's like, kids? So just get used to it. It felt, and still feels, a little bit uncomfortable to read on a Kindle, but get used to it. It's like learning a new programming language. With technology, you have to challenge yourself to adapt to it. You know, I forced myself to use TikTok.
No, that thing doesn't need much forcing. It pulls you in like the worst kind of drug, or the best kind. So yeah, but I do love haptic things. There's a magic to the haptic. Even touchscreens, it's tricky to get right, to get the experience of a button. Anyway, what were we talking about? So, AI. The journey: your whole plan was to come back to Cairo and teach. Where did the plan go wrong?
And then I got to Cambridge, and I fell in love with the idea of research, of embarking on a path nobody's explored before, building stuff that nobody's built before. It's challenging, and it's hard, and there are a lot of nonbelievers. I just totally loved that. And at the end of my PhD came the meeting that changed the trajectory of my life.
Professor Rosalind Picard runs the Affective Computing group at the MIT Media Lab. I had read her book, and I was following all her research. She was giving a talk at a pattern recognition conference in Cambridge, and she had a couple of hours to kill, so she emailed the lab and said, if any students want to meet with me, just sign up here. And so I signed up for a slot, and I spent the weeks leading up to it preparing for this meeting; I wanted to show her a demo of my research and everything. And we met, and we ended up hitting it off. We totally clicked. At the end of the meeting, she said, do you want to come work with me as a postdoc? And this is what I told her. I was like, okay, this would be a dream come true, but there's a husband waiting for me back home. I kind of have to go back. She said, it's fine.

And I literally started commuting between Cairo and Boston.
Yeah, it was a long commute. Every few weeks I would hop on a plane and go to Boston. But that changed the trajectory of my life. I kind of outgrew my dreams, right? I didn't want to go back to Egypt anymore and be faculty. That was no longer my dream.
What was it like to be at MIT? What was that culture shock like? I mean America in general, but also, Cambridge has its own culture, right? So what was MIT like, and what was America like?

I wonder if it's similar to your experience at MIT. At the Media Lab in particular, I was just really... impressed is not the right word. I didn't expect the openness to innovation and the acceptance of taking a risk and failing. Failure isn't really accepted back in Egypt, right? You don't want to fail. There's a fear of failure, which I think has been hardwired in my brain. But you get to MIT, and it's okay to start things, and if they don't work out, it's okay. You pivot to another idea. And that kind of thinking was just very new to me.

That's liberating.
Well, the Media Lab, for people who don't know: the MIT Media Lab is its own beautiful thing, because they, I think more than other places at MIT, reach for big ideas. Depending, of course, on who you work with, but certainly with Rosalind, you try wild stuff, big things and crazy things, and you also try to take things to completion so you can demo them. So always, always have a demo. One of the sad things to me about robotics labs at MIT, and there are over 30, I think, is that usually when you show up to a robotics lab, there's not a single working robot. They're all broken.

All the robots are broken.

The robots are broken, which is the normal state of things, because you're working on them. But it would be nice if we lived in a world where robotics labs had some robots functioning.
One of my like favorite moments that just sticks with me, I visited Boston Dynamics
link |
and there was a, first of all, seeing so many spots, so many legged robots in one place.
link |
I'm like, I'm home.
link |
This is where I was built.
link |
The cool thing was just to see there was a random robot spot was walking down the hall.
link |
It's probably doing mapping, but it looked like he wasn't doing anything and he was wearing
link |
he or she, I don't know.
link |
But, well, in my mind, they're people, they have a backstory, but this one
link |
in particular definitely has a backstory because he was wearing a cowboy hat.
link |
So I just saw a spot robot with a cowboy hat walking down the hall and there was just this
link |
feeling like there's a life, like he has a life.
link |
He probably has to commute back to his family at night.
link |
Like there's a, there's a feeling like there's life instilled in this robot and it's magical.
link |
It was, it was kind of inspiring to see.
link |
Did it say hello to, did he say hello to you?
link |
No, it's very, there's a focused nature to the robot.
link |
I love competence and focus. It's great.
link |
Like he was not going to get distracted by the, the shallowness of small talk.
link |
There's a job to be done and he was doing it.
link |
So anyway, the fact that it was working is a beautiful thing.
link |
And I think Media Lab really prides itself on trying to always have a thing that's working
link |
that you could show off.
link |
We used to call it demo or die.
link |
You, you could not, yeah, you could not like show up with like PowerPoint or something.
link |
You actually had to have a working demo. You know what, my son, who is now 13, I don't know if
link |
this is still his lifelong goal or not, but when he was a little younger, his dream was
link |
to build an island that's just inhabited by robots, like no humans.
link |
He just wants all these robots to be connecting and having fun and there you go.
link |
Does he have human, does he have an idea of which robots he loves most?
link |
Is it, is it Roomba like robots?
link |
Is it humanoid robots?
link |
Robot dogs, or it's not clear yet.
link |
We used to have a Jibo, which was one of the MIT Media Lab spin outs and he used to love
link |
the giant head that spins and rotates and has an eye, like, not glowing like HAL 9000,
link |
but the friendly version.
link |
And then he just loves...
link |
yeah, I think he loves all forms of robots actually.
link |
So embodied intelligence.
link |
I like, I personally like legged robots, especially, uh, anything that can wiggle its butt.
link |
No, that's not the definition of what I love, but that's just technically what I've been
link |
working on recently.
link |
Except I have a bunch of legged robots now in Austin, and I've been trying to, uh,
link |
have them communicate affection with their bodies in different ways,
link |
just for art, for art really.
link |
Cause I love the idea of walking around with the robots, like, uh, as you would with a dog.
link |
I think it's inspiring to a lot of people, especially young people.
link |
Like kids love, kids love it.
link |
Parents like adults are scared of robots, but kids don't have this kind of weird construction
link |
of the world that's full of evil.
link |
They love cool things.
link |
I remember when Adam was in first grade, so he must have been like seven or so.
link |
I went into his class with a whole bunch of robots and like the emotion AI demo.
link |
And I asked the kids, I was like, do you, would you kids want to have a robot, you know,
link |
robot friend or robot companion?
link |
Everybody said yes.
link |
And they wanted it for all sorts of things, like to help them with their math homework
link |
and to like be a friend.
link |
So it just struck me how there was no fear of robots, which a lot of adults have,
link |
that like us versus them thing.
link |
Yeah, none of that.
link |
Of course you want to be very careful because you still have to look at the lessons of history
link |
and how robots can be used by the power centers of the world to abuse your rights and all
link |
that kind of stuff.
link |
But mostly it's good to enter anything new with an excitement and an optimism.
link |
Speaking of Roz, what have you learned about science and life from Rosalind Picard?
link |
Oh my God, I've learned so many things about life from Roz.
link |
I think the thing I learned the most is perseverance.
link |
When I first met Roz, she invited me to be her postdoc.
link |
We applied for a grant to the National Science Foundation to apply some of our research to
link |
And the reasoning was...
link |
The first time you were rejected for fun, yeah.
link |
Yeah, it was, and I basically, I just took the rejection to mean, okay, we're rejected.
link |
It's done, like end of story, right?
link |
And Roz was like, it's great news.
link |
They love the idea.
link |
They just don't think we can do it.
link |
So let's build it, show them, and then reapply.
link |
And it was that, oh my God, that story totally stuck with me.
link |
And she's like that in every aspect of her life.
link |
She just does not take no for an answer.
link |
To reframe all negative feedback.
link |
Yes, they liked this.
link |
What else about science in general?
link |
About how you see computers and also business and just everything about the world.
link |
She's a very powerful, brilliant woman like yourself.
link |
So is there some aspect of that too?
link |
Yeah, I think Roz is actually also very faith driven.
link |
She has this like deep belief in conviction.
link |
Yeah, and in the good in the world and humanity.
link |
And I think that was meeting her and her family was definitely like a defining moment for me
link |
because that was when I was like, wow, like you can be of a different background and
link |
religion and whatever and you can still have the same core values.
link |
So that was, that was, yeah.
link |
I'm grateful to her.
link |
Roz, if you're listening, thank you.
link |
Yeah, she's great.
link |
She's been on this podcast before.
link |
I hope she'll be on, I'm sure she'll be on again.
link |
And you were the founder and CEO of Affectiva, which is a big company that was acquired by
link |
another big company, SmartEye.
link |
And you're now the deputy CEO of SmartEye.
link |
So you're a powerful leader.
link |
You're a brilliant scientist.
link |
A lot of people are inspired by you.
link |
What advice would you give, especially to young women, but people in general who dream
link |
of becoming powerful leaders like yourself, in a world that
link |
perhaps doesn't give them a clear, easy path to do so, whether we're talking about Egypt
link |
You know, hearing you kind of describe me that way kind of encapsulates what
link |
I think is the biggest challenge of all, which is believing in yourself, right?
link |
I have had to like grapple with this, what I call now the Debbie Downer voice in my head.
link |
It's basically just chattering all the time.
link |
It's basically saying, oh, no, no, no, no, you can't do this.
link |
Like you're not going to raise money.
link |
You can't start a company.
link |
Like what business do you have, like starting a company or running a company or selling a company?
link |
And I think my biggest advice to not just women, but people who are taking a new path
link |
and, you know, they're not sure, is to not let yourself and your thoughts be the
link |
biggest obstacle in your way.
link |
And I've had to like really work on myself to not be my own biggest obstacle.
link |
So you got that negative voice.
link |
Am I the only one?
link |
I don't think I'm the only one.
link |
No, I have that negative voice.
link |
I'm not exactly sure if it's a bad thing or a good thing.
link |
I've been really torn about it because it's been a lifelong companion.
link |
It's hard to know.
link |
It's kind of, it drives productivity and progress, but it can hold you back from taking risks.
link |
I think the best I can say is probably you have to somehow be able to control it, to
link |
turn it off when it's not useful and turn it on when it's useful.
link |
Like I observe it from almost like a third person perspective,
link |
somebody who's sitting there like...
link |
Like, because it is useful to be critical.
link |
Like after, I just gave a talk yesterday.
link |
At MIT and I was just, there's so much love and it was such an incredible experience.
link |
So many amazing people I got a chance to talk to, but afterwards when I went home and just
link |
took this long walk, it was mostly just negative thoughts about me.
link |
One is basic stuff, like, I don't deserve any of it.
link |
And second is like, why did you, that was so dumb that you said this, that's so dumb.
link |
Like you should have prepared that better.
link |
Why did you say this?
link |
But I think it's good to hear that voice out.
link |
And like sit in that.
link |
And ultimately I think you grow from that.
link |
Now, when you're making really big decisions about funding or starting a company or taking
link |
a leap to go to the UK or take a leap to go to America to work in Media Lab though.
link |
That's when you should be able to shut that off, because you should have like
link |
this weird confidence, almost like faith, that you said before, that everything's going to be okay.
link |
So take the leap of faith.
link |
Take the leap of faith.
link |
Despite all the negativity.
link |
I mean, there's some of that.
link |
You, you actually tweeted a really nice tweet thread.
link |
It says, quote, a year ago, a friend recommended I do daily affirmations and I was skeptical,
link |
but I was going through major transitions in my life.
link |
So I gave it a shot and it set me on a journey of self-acceptance and self-love.
link |
So what was that like?
link |
Can you maybe talk through this idea of affirmations and how that helped you?
link |
Because really, like, I'd like to think of myself as a kind
link |
person in general, but I'm kind of mean to myself sometimes.
link |
And so I've been doing journaling for almost 10 years now.
link |
I use an app called Day One and it's awesome.
link |
I just journal and I use it as an opportunity to almost have a conversation with the Debbie
link |
Downer voice in my head. It's like a rebuttal, right?
link |
Like Debbie Downer says, oh my God, like you, you know, you won't be able to raise this money.
link |
I'm like, okay, let's talk about it.
link |
I have a track record of doing X, Y, and Z.
link |
I think I can do this.
link |
And it's literally like, so I wouldn't, I don't know that I can shut off the voice,
link |
but I can have a conversation with it.
link |
And I bring data to the table, right?
link |
So that was the journaling part, which I found very helpful.
link |
But the affirmation took it to a whole next level and I just love it.
link |
I'm a year into doing this and you literally wake up in the morning and the first thing
link |
you do, I meditate first and then I write my affirmations and it's the energy I want
link |
to put out in the world that hopefully will come right back to me.
link |
So I will say, I always start with my smile lights up the whole world.
link |
And I kid you not, like people in the street will stop me and say, oh my God, like we love your smile.
link |
So my affirmations will change depending on, you know, what's happening this day.
link |
Don't judge, don't judge.
link |
No, that's not, laughter's not judgment.
link |
It's just awesome.
link |
I mean, it's true, but you're saying affirmations somehow help kind of, I mean, what is it that
link |
they do? Work to, like, remind you of the kind of person you are and the kind of person you
link |
want to be, which actually may be in reverse order, the kind of person you want to be.
link |
And that helps you become the kind of person you actually are.
link |
It's just, it's, it brings intentionality to like what you're doing.
link |
And so, by the way, I was laughing because my affirmations, which I also do are the
link |
Oh, what do you do?
link |
I don't, I don't have a, my smile lights up the world.
link |
Maybe I should add that because like, I, I have, I just, I have, oh boy, it's, it's much
link |
more stoic, like about focus, about this kind of stuff, but the joy, the emotion that you're
link |
just in that little affirmation is beautiful.
link |
So maybe I should add that.
link |
I have some, like, focused stuff, but that's usually...
link |
But that's a cool start.
link |
It's after all the like smiling and playful and joyful and all that.
link |
And then it's like, okay, I kick butt.
link |
Let's get shit done.
link |
Let's get shit done affirmation.
link |
So like what else is on there?
link |
What else is on there?
link |
Well, I'm also a magnet for all sorts of things.
link |
So I'm an amazing people magnet.
link |
I attract like awesome people into my universe.
link |
That's an actual affirmation.
link |
So that, that's, and that, yeah.
link |
And that somehow manifests itself into, like, it working.
link |
Like, can you speak to like why it feels good to do the affirmations?
link |
I honestly think it just grounds the day.
link |
And then it allows me to, instead of just like being pulled back and forth, like throughout
link |
the day, it just like grounds me.
link |
I'm like, okay, like this thing happened.
link |
It's not exactly what I wanted it to be, but I'm patient.
link |
Or I'm, you know, I'm, I trust that the universe will do amazing things for me, which is one
link |
of my other consistent affirmations.
link |
Or I'm an amazing mom.
link |
And so I can grapple with all the feelings of mom guilt that I have all the time.
link |
Or here's another one.
link |
I'm a love magnet.
link |
And I literally say, I will kind of picture the person that I'd love to end up with.
link |
And I write it all down and it hasn't happened yet, but it will.
link |
What are you, what are you picturing?
link |
Is it Brad Pitt?
link |
Because that's what I picture.
link |
That's what you picture?
link |
On the, on the running, holding hands, running together.
link |
No, more like Fight Club, the Fight Club Brad Pitt, where he's like standing.
link |
Anyway, I'm sorry.
link |
I'll get off on that.
link |
Do you have a, like when you're thinking about the being a love magnet in that way, are you
link |
picturing specific people or is this almost like in the space of like energy?
link |
It's somebody who is smart and well accomplished and successful in their life, but they're
link |
generous and they're well traveled and they want to travel the world.
link |
Like they're head over heels into me.
link |
It's like, I know it sounds super silly, but it's literally what I write.
link |
And I believe it'll happen one day.
link |
Oh, you actually write, so you don't say it out loud?
link |
I write all my affirmations.
link |
I do the opposite.
link |
I say it out loud.
link |
Oh, you say it out loud?
link |
Yeah, if I'm alone, I'll say it out loud.
link |
I should try that.
link |
I think it's what feels more powerful to you.
link |
To me, more powerful.
link |
Saying stuff feels more powerful.
link |
Writing is, writing feels like I'm losing the words, like losing the power of the words
link |
maybe because I write slow.
link |
It's Day One, basically.
link |
And I just, I can look, the best thing about it is I can look back and see like a year ago,
link |
what was I affirming, right?
link |
Oh, so it changes over time.
link |
It hasn't like changed a lot, but the focus kind of changes over time.
link |
Yeah, I say the same exact thing over and over and over.
link |
There's a comfort in the sameness of it.
link |
Well, actually, let me jump around because let me ask you about, because all this talk
link |
about Brad Pitt, or maybe it's just going on inside my head, let me ask you about dating
link |
You tweeted, are you based in Boston and single?
link |
And then you pointed to a startup Singles Night sponsored by Smile Dating app.
link |
I mean, this is jumping around a little bit, but since you mentioned...
link |
Can AI help solve this dating love problem?
link |
What do you think?
link |
This problem of connection that is part of the human condition, can AI help with that? You
link |
yourself are in the search, affirming.
link |
Maybe that's what I should affirm, like build an AI.
link |
Build an AI that finds love?
link |
I think there must be a science behind that first moment you meet a person and you either
link |
have chemistry or you don't, right?
link |
I guess that was the question I was asking, which you put brilliantly. Is there a science to that?
link |
I think there are like, there's actual chemicals that get exchanged when two people meet.
link |
I don't know about that.
link |
I like how you're changing, yeah, changing your mind as we're describing it, but it feels
link |
But what science shows us is that sometimes we can explain with rigor the things
link |
that feel like magic.
link |
So maybe we can remove all the magic.
link |
Maybe it's like, I honestly think, like I said, like Goodreads should be a dating app,
link |
I wonder if you look at just like books or content you've consumed.
link |
I mean, that's essentially what YouTube does when it does a recommendation.
link |
If you just look at your footprint of content consumed, if there's an overlap, but maybe
link |
interesting differences within the overlap. I'm sure this is a machine learning
link |
problem that's solvable.
link |
Like, with this person, there's very likely not only to be chemistry in the short term,
link |
but a good lifelong partnership to grow together.
link |
I bet you it's a good machine learning problem.
link |
You just need the data.
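The content-overlap matching Lex sketches here is easy to prototype. A minimal sketch in Python, with everything made up for illustration: the users, the item names, and the choice of Jaccard similarity are all assumptions, not anyone's actual system:

```python
# Toy sketch of matching by overlap in consumed content, as discussed
# above. A real system would use richer signals (embeddings, ratings,
# recency) rather than raw item sets.

def jaccard(a: set, b: set) -> float:
    """Overlap between two content sets, in [0, 1]."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def rank_matches(me: set, candidates: dict) -> list:
    """Sort candidate users by content overlap with `me`, best first."""
    return sorted(candidates,
                  key=lambda name: jaccard(me, candidates[name]),
                  reverse=True)

me = {"book:Sapiens", "yt:robotics", "yt:lex-podcast"}
candidates = {
    "alice": {"book:Sapiens", "yt:lex-podcast", "yt:baking"},
    "bob": {"yt:cars", "yt:gaming"},
}
print(rank_matches(me, candidates))  # ['alice', 'bob']
```

The refinement Lex raises, rewarding an interesting difference as well as an overlap, would just be a different scoring function in place of `jaccard`.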
link |
Well, actually, I do think there's so much data about each of us that there ought to
link |
be a machine learning algorithm that can ingest all this data and basically say, I think the
link |
following 10 people would be interesting connections for you, right?
link |
And so Smile dating app kind of took one particular angle, which is humor.
link |
It matches people based on their humor styles, which is one of the main ingredients of a
link |
successful relationship.
link |
Like if you meet somebody and they can make you laugh, like that's a good thing.
link |
And if you develop like internal jokes, like inside jokes and you're bantering, like that's
link |
But yeah, there's the number and the rate of inside joke generation.
link |
You could probably measure that and then optimize it over the first few days.
link |
You could say, we're just turning this into a machine learning problem.
link |
But for somebody like you, who's exceptionally successful and busy, is there a science
link |
to that aspect of dating?
link |
Is there advice you can give?
link |
Oh, my God, I give the worst advice.
link |
Well, I can tell you like I have a spreadsheet.
link |
Is that a good or a bad thing?
link |
Do you regret the spreadsheet?
link |
Well, I don't know.
link |
What's the name of the spreadsheet?
link |
It's the date track, dating tracker.
link |
And there's a rating system, I'm sure.
link |
There's like weights and stuff.
link |
It's too close to home.
link |
Well, I don't have a spreadsheet, but I would, now that you say it, it seems like a good idea.
link |
Turning it into data.
link |
I do wish that somebody else had a spreadsheet about me.
link |
You know, like you said, collect a lot of data about
link |
us in a way that's privacy preserving, that I own the data, I can control it and then
link |
use that data to find, I mean, not just romantic love, but collaborators, friends, all that
link |
It seems like the data is there.
link |
That's the problem social networks are trying to solve, but I think they're doing a really bad job of it.
link |
Even Facebook tried to get into the dating app business.
link |
And I think there's so many components to running a successful company that connects
link |
And part of that is, you know, having engineers that care about the human side, right, as
link |
you know, extremely well. It's not easy to find those.
link |
But you also don't want just people that care about the human.
link |
They also have to be good engineers.
link |
So it's like, you have to find this beautiful mix.
link |
And for some reason, just empirically speaking, people have not done a good job of that, of
link |
building companies like that.
link |
And it must mean that it's a difficult problem to solve.
link |
Dating apps, it seems difficult.
link |
OkCupid, Tinder, all those kinds of stuff.
link |
They seem to find, of course they work, but they seem to not work as well as I would imagine
link |
Like, with data, wouldn't you be able to find better human connection?
link |
It's like arranged marriages on steroids, essentially.
link |
Arranged by machine learning algorithm.
link |
Arranged by machine learning algorithm, but not a superficial one.
link |
I think a lot of the dating apps out there are just so superficial.
link |
They're just matching on like high level criteria that aren't ingredients for successful partnership.
link |
But you know what's missing, though, too?
link |
I don't know how to fix that, the serendipity piece of it.
link |
Like, how do you engineer serendipity?
link |
Like this random, like, chance encounter, and then you fall in love with the person.
link |
Like, I don't know how a dating app can do that.
link |
So there has to be a little bit of randomness.
link |
Maybe every 10th match is just a, you know, yeah, somebody that the algorithm wouldn't
link |
have necessarily recommended, but it allows for a little bit of...
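The "every 10th match is random" idea just described is essentially the classic explore/exploit trick from recommender systems. A hedged sketch, where the function name and the 1-in-10 rate are invented for illustration:

```python
# Sketch of injecting serendipity into a recommender: mostly exploit
# (top-ranked candidate), but every Nth match, explore with a uniform
# random pick the algorithm wouldn't necessarily have recommended.

import random

def pick_match(ranked, match_count, every_nth=10):
    """Return the top-ranked candidate, except every Nth match,
    when a random candidate is returned instead."""
    if match_count % every_nth == 0:
        return random.choice(ranked)
    return ranked[0]

print(pick_match(["ava", "ben", "cal"], match_count=3))  # ava
```

Real systems tune the exploration rate (and often bias the "random" pick toward near-misses), but the structure is the same.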
link |
Well, it can also, you know, it can also trick you into thinking of serendipity by like somehow
link |
showing you a tweet of a person that it thinks you'll match well with, but do it accidentally
link |
as part of another search.
link |
And like you just notice it, and then you go down a rabbit hole and
link |
you connect with this person outside the app somehow.
link |
So it's just, it creates that moment of meeting.
link |
Of course, you have to think of, from an app perspective, how you can turn that into a business.
link |
But I think ultimately a business that helps people find love in any way.
link |
Like that's what Apple was about, create products that people love.
link |
I mean, you got to make money somehow.
link |
If you help people fall in love personally with the product, find self love or love another
link |
human being, you're going to make money.
link |
You're going to figure out a way to make money.
link |
I just feel like the dating apps often will optimize for something else than love.
link |
It's the same with social networks.
link |
They optimize for engagement as opposed to like a deep, meaningful connection that's
link |
ultimately grounded in like personal growth, you as a human being growing and all that
link |
Let me do like a pivot to a dark topic, which you opened the book with.
link |
A story, because I'd like to talk to you about just emotion and artificial intelligence.
link |
I think this is a good story to start to think about emotional intelligence.
link |
You opened the book with a story of a central Florida man, Jamel Dunn, who was drowning
link |
and drowned while five teenagers watched and laughed, saying things like, you're going to die.
link |
And when Jamel disappeared below the surface of the water, one of them said he just died
link |
and the others laughed.
link |
What does this incident teach you about human nature and the response to it perhaps?
link |
I mean, I think this is a really, really, really sad story.
link |
And it highlights what I believe is a real problem in our world today.
link |
It's an empathy crisis.
link |
Yeah, we're living through an empathy crisis.
link |
And I mean, we've talked about this throughout our conversation.
link |
We dehumanize each other.
link |
And unfortunately, yes, technology is bringing us together.
link |
But in a way, it's also dehumanized us.
link |
It's creating this, like, yeah, dehumanizing of the other.
link |
And I think that's a huge problem.
link |
The good news is I think the solution could be technology based.
link |
Like, I think if we rethink the way we design and deploy our technologies, we can solve
link |
parts of this problem.
link |
But I worry about it.
link |
I mean, even with my son, a lot of his interactions are computer mediated.
link |
And I just question what that's doing to his empathy skills and, you know, his ability
link |
to really connect with people.
link |
So you think it's not possible to form empathy through the digital medium?
link |
But we have to be thoughtful about it, because the way we engage face to face, which
link |
is what we're doing right now, right?
link |
There's the nonverbal signals, which are a majority of how we communicate.
link |
It's like 90% of how we communicate is your facial expressions.
link |
You know, I'm saying something and you're nodding your head now, and that creates a connection.
link |
And if you break that... now I have anxiety about it.
link |
I am not scrutinizing your facial expressions during this interview.
link |
Look normal, look human.
link |
If Rana says yes, then nod head else.
link |
Don't do it too much because it might be at the wrong time and then it will send the wrong signal.
link |
And make eye contact sometimes because humans appreciate that.
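Lex's "if Rana says yes, then nod head else" is already half pseudocode; spelled out as runnable code, entirely tongue-in-cheek, with made-up probabilities standing in for "not too much":

```python
# The joke listener policy above, written out. The 0.7 and 0.2
# probabilities are invented; they just encode "nod often but not
# always, and make eye contact sometimes."

import random

def listener_action(rana_said_yes: bool) -> str:
    """Nod along with agreement, but not every single time,
    and make eye contact sometimes, because humans appreciate that."""
    if rana_said_yes and random.random() < 0.7:
        return "nod"
    if random.random() < 0.2:
        return "make eye contact"
    return "look normal"
```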
link |
Yeah, but something about the especially when you say mean things in person, you get to
link |
see the pain of the other person.
link |
But if you're tweeting it at a person and you have no idea how it's going to land, you're
link |
more likely to do that on social media than you are in face to face conversations.
link |
What do you think is more important?
link |
EQ being emotional intelligence.
link |
In terms of what makes us human.
link |
I think emotional intelligence is what makes us human.
link |
It's how we connect with one another.
link |
It's how we build trust.
link |
It's how we make decisions, right?
link |
Like your emotions drive kind of what you had for breakfast, but also where you decide
link |
to live and what you want to do for the rest of your life.
link |
So I think emotions are underrated.
link |
So emotional intelligence isn't just about the effective expression of your own emotions.
link |
It's about a sensitivity and empathy to other people's emotions and that sort of being
link |
able to effectively engage in the dance of emotions with other people.
link |
Yeah, I like that explanation.
link |
I like that kind of,
link |
yeah, thinking about it as a dance, because it is really about that.
link |
It's about sensing what state the other person's in and using that information to decide on
link |
how you're going to react.
link |
And I think it can be very powerful.
link |
Like people who are the best, most persuasive leaders in the world tap into, you know, they
link |
have, if you have higher EQ, you're more likely to be able to motivate people to change
link |
So it can be very powerful.
link |
On a more kind of technical, maybe philosophical level, you've written that emotion is universal.
link |
It seems that, sort of like Chomsky says, language is universal.
link |
There's a bunch of other stuff like cognition, consciousness.
link |
It seems a lot of us have these aspects.
link |
So the human mind generates all this.
link |
And so what do you think is the, they all seem to be like echoes of the same thing.
link |
What do you think emotion is exactly?
link |
Like how deep does it run?
link |
Is it a surface level thing that we display to each other?
link |
Is it just another form of language or something deep within?
link |
I think it's really deep.
link |
It's how, you know, we started with memory.
link |
I think emotions play a really important role.
link |
Yeah, emotions play a very important role in how we encode memories, right?
link |
Our memories are often encoded, almost indexed by emotions.
link |
Yeah, it's at the core of it. You know, our decision making engine is also heavily
link |
influenced by our emotions.
link |
So emotions is part of cognition.
link |
It's intermixed into the whole thing.
link |
And in fact, when you take it away, people are unable to make decisions.
link |
They're really paralyzed.
link |
Like they can't go about their daily or their, you know, personal or professional lives.
link |
It does seem like there's probably some interesting interweaving of emotion and consciousness.
link |
I wonder if it's possible to have, like if they're next door neighbors somehow, or if
link |
they're actually flat mates.
link |
I don't know, it feels like the hard problem of consciousness, where it feels like
link |
something to experience the thing.
link |
Like red feels like red, and it's, you know, when you eat a mango, it's sweet.
link |
The taste, the sweetness, that it feels like something to experience that sweetness, that
link |
whatever generates emotions.
link |
But then like, see, I feel like emotion is part of communication.
link |
It's very much about communication.
link |
And then, you know,
link |
that means it's also deeply connected to language.
link |
But then probably human intelligence is deeply connected to the collective intelligence between
link |
It's not just the standalone thing.
link |
So the whole thing is really connected.
link |
So emotion is connected to language, language is connected to intelligence, and then intelligence
link |
is connected to consciousness, and consciousness is connected to emotion.
link |
The whole thing is that it's a beautiful mess.
link |
So can I comment on the emotions being a communication mechanism?
link |
Because I think there are two facets of our emotional experiences.
link |
One is communication, right?
link |
Like we use emotions, for example, facial expressions or other nonverbal cues to connect
link |
with other human beings and with other beings in the world, right?
link |
But even if it's not a communication context, we still experience emotions and we still
link |
process emotions and we still leverage emotions to make decisions and to learn and, you know,
link |
to experience life.
link |
So it isn't always just about communication.
link |
And we learned that very early on in our work at Affectiva.
link |
One of the very first applications we brought to market was understanding how people respond
link |
to content, right?
link |
So if they're watching this video of ours, like, are they interested?
link |
Are they inspired?
link |
Are they bored to death?
link |
And so we watched their facial expressions, and we weren't sure if people would
link |
express any emotions if they were sitting alone.
link |
Like if you're in your bed at night, watching a Netflix TV series, would we still see any
link |
emotions on your face?
link |
And we were surprised that, yes, people still emote, even if they're alone, even if you're
link |
in your car driving around, you're singing along to the song and you're joyful, you're
link |
smiling, we'll see these expressions.
link |
So it's not just about communicating with another person.
link |
Sometimes it really is just about experiencing the world.
link |
And first of all, I wonder if some of that is because we develop our intelligence and
link |
our emotional intelligence by communicating with other humans.
link |
And so when other humans disappear from the picture, we're still kind of a virtual human.
link |
The code still runs.
link |
Yeah, the code still runs, but you also kind of, you're still, there's like virtual humans.
link |
You don't have to think of it that way, but there's a kind of, when you like chuckle,
link |
like, yeah, like you're kind of chuckling to a virtual human.
link |
I mean, it's possible that the code has to have another human there because if you just
link |
grew up alone, I wonder if emotion would still be there in this visual form.
link |
So yeah, I wonder. But anyway, what can you tell from the human face about what's going on?
link |
So that's the problem that Effectiva first tackled, which is using computer vision, using
link |
machine learning to try to detect stuff about the human face, as many things as possible
link |
and convert them into a prediction of categories of emotion: anger, happiness, all that kind of stuff.
link |
How hard is that problem?
link |
It's extremely hard.
link |
It's very, very hard because there is no one to one mapping between a facial expression
link |
and your internal state.
link |
There's this oversimplification of the problem where it's something like, if you are smiling,
link |
then you're happy.
link |
If you do a brow furrow, then you're angry.
link |
If you do an eyebrow raise, then you're surprised.
link |
And just think about it for a moment.
link |
You could be smiling for a whole host of reasons.
link |
You could also be happy and not be smiling, right?
link |
You could furrow your eyebrows because you're angry or you're confused about something or
link |
you're constipated.
link |
So I think this oversimplistic approach to inferring emotion from a facial expression
link |
is really dangerous.
link |
The solution is to incorporate as many contextual signals as you can, right?
link |
So if, for example, I'm driving a car and you can see me like nodding my head and my
link |
eyes are closed and the blinking rate is changing, I'm probably falling asleep at the wheel, right?
link |
Because you know the context.
link |
You understand what the person's doing or add additional channels like voice or gestures
link |
or even physiological sensors, but I think it's very dangerous to just take this oversimplistic
link |
approach of, yeah, smile equals happy and...
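As a toy illustration of the kind of context fusion described here, a drowsiness check might combine several signals rather than read any single expression. This is a hypothetical sketch only: the signal names and thresholds are made up for illustration and are not Affectiva's or Smart Eye's actual method.

```python
# Hypothetical sketch: fuse several per-minute facial signals into one
# drowsiness estimate. All thresholds are illustrative, not real ones.
def drowsiness_score(eye_closure_frac, blinks_per_min, head_nods_per_min):
    """Combine facial signals into a 0..1 drowsiness score."""
    score = 0.0
    if eye_closure_frac > 0.15:   # eyes closed >15% of the time (PERCLOS-like)
        score += 0.5
    if blinks_per_min > 25:       # elevated blink rate
        score += 0.25
    if head_nods_per_min > 2:     # repeated head nodding
        score += 0.25
    return min(score, 1.0)

# An alert driver scores low; a nodding-off driver scores high.
print(drowsiness_score(0.05, 12, 0))   # prints 0.0
print(drowsiness_score(0.30, 30, 4))   # prints 1.0
```

The point of the sketch is that no one cue is decisive on its own; only the combination, interpreted in the driving context, supports the inference.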
link |
If you're able to, in a high resolution way, specify the context, there's certain things
link |
that are going to be somewhat reliable signals of something like drowsiness or happiness
link |
or stuff like that.
link |
I mean, when people are watching Netflix content, that problem, that's a really compelling idea
link |
that you can kind of, at least in aggregate, highlight like which part was boring, which
link |
part was exciting.
link |
How hard was that problem?
link |
Where was that on the scale of difficulty?
link |
I think that's one of the easier problems to solve because it's a relatively constrained environment.
link |
You have somebody sitting in front of...
link |
Initially, we started with like a device in front of you, like a laptop, and then we graduated
link |
to doing this on a mobile phone, which is a lot harder just because of, you know, from
link |
a computer vision perspective, the profile view of the face can be a lot more challenging.
link |
We had to figure out lighting conditions because usually people are watching content literally
link |
in their bedrooms at night.
link |
Lights are dimmed.
link |
Yeah, I mean, if you're standing, it's probably going to be looking up at you.
link |
Yeah, and nobody looks good from that angle.
link |
I've seen data sets from that perspective.
link |
It's like, this is not a good look for anyone.
link |
Or if you're laying in bed at night, what is it, side view or something?
link |
And half your face is like on a pillow.
link |
Actually, I would love to know, have data about like how people watch stuff in bed at
link |
night. Like, do they prop it up? Is it on a pillow? I'm sure there's a lot of interesting data there.
link |
From a health and well-being perspective, right?
link |
Like, oh, you're hurting your neck.
link |
I was thinking machine learning perspective, but yes, but also, yeah, yeah, once you have
link |
that data, you can start making all kinds of inference about health and stuff like that.
link |
Yeah, there's an interesting thing when I was at Google that we were, it's called active
link |
authentication, where you want to be able to unlock your phone without using a password.
link |
So it would use the face, but also other stuff, like the way you take a phone out of the pocket.
link |
So that kind of data to use the multimodal with machine learning to be able to identify
link |
that it's you or likely to be you, likely not to be you, that allows you to not always
link |
have to enter the password.
link |
That was the idea.
link |
But the funny thing about that, I just want to tell a small anecdote, is because it
link |
was all male engineers, except our boss, still one of my favorite humans, who
link |
was a woman, Regina Dugan.
link |
Oh, my God, I love her.
link |
So, but anyway, there was one brilliant female engineer on the team, and she was the
link |
one that actually highlighted the fact that women often don't have pockets.
link |
It was like, whoa, that was not even a category in the code of like, wait a minute, you can
link |
take the phone out of some other place than your pocket.
link |
So anyway, that's a funny thing: when you're considering people lying in bed, watching
link |
a phone, you have to consider, you know, diversity in all its forms,
link |
depending on the problem, depending on the context.
link |
Actually, this is like a very important, I think this is, you know, you probably get
link |
this all the time.
link |
Like people are worried that AI is going to take over humanity and like, get rid of all
link |
the humans in the world.
link |
I'm like, actually, that's not my biggest concern.
link |
My biggest concern is that we are building bias into these systems.
link |
And then they're like deployed at large and at scale.
link |
And before you know it, you're kind of accentuating the bias that exists in society.
link |
Yeah, you know, it's very important to worry about that, but the
link |
worry is an emergent phenomenon to me, which is a very good one, because I think these
link |
systems, by encoding the data that exists, are actually revealing the bias in society.
link |
They're tools for teaching us what the bias is.
link |
Therefore, we can now improve that bias within the system.
link |
So they're almost like putting a mirror to ourselves.
link |
You have to be open to looking at the mirror, though.
link |
You have to be open to scrutinizing the data.
link |
And if you just take it as ground truth.
link |
Or you don't even have to look at the, I mean, yes, the data is how you fix it.
link |
But then you just look at the behavior of the system.
link |
And you realize, holy crap, this thing is kind of racist.
link |
Like, why is that?
link |
And then you look at the data, it's like, oh, okay.
link |
And then you start to realize, I think that's a much more effective way
link |
to be introspective as a society than through sort of political discourse.
link |
Like AI kind of, because people are for some reason more productive and rigorous in criticizing
link |
AI than they're criticizing each other.
link |
So I think this is just a nice method for studying society and seeing which way we can progress.
link |
Anyway, what were we talking about?
link |
The problem of watching Netflix in bed or elsewhere and seeing which parts
link |
are exciting, which parts are boring.
link |
You're saying that's relatively constrained because you have a captive audience and you
link |
kind of know the context.
link |
And one thing you said that was really key is the aggregate.
link |
You're doing this in aggregate, right?
link |
Like we're looking at aggregated response of people.
link |
And so when you see a peak, say a smile peak, they're probably smiling or laughing at something
link |
that's in the content.
link |
So that was one of the first problems we were able to solve.
link |
And when we see the smile peak, it doesn't mean that these people are internally happy.
link |
They're just laughing at content.
link |
So it's important to call it for what it is.
link |
But it's still really, really useful data.
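The aggregate idea above can be sketched very simply: average per-viewer smile scores at each moment of the content, then flag the peaks. The data, function names, and threshold here are hypothetical, purely to illustrate the concept, not Affectiva's actual pipeline.

```python
# Hypothetical sketch: find "smile peaks" in aggregated viewer responses.
# Rows = viewers, columns = time steps; values = per-frame smile scores (0..1).
def smile_peaks(per_viewer_scores, threshold=0.5):
    n_viewers = len(per_viewer_scores)
    n_steps = len(per_viewer_scores[0])
    # Average across viewers at each time step.
    mean_curve = [
        sum(viewer[t] for viewer in per_viewer_scores) / n_viewers
        for t in range(n_steps)
    ]
    # A peak is a local maximum of the mean curve above the threshold.
    return [
        t for t in range(1, n_steps - 1)
        if mean_curve[t] > threshold
        and mean_curve[t] >= mean_curve[t - 1]
        and mean_curve[t] >= mean_curve[t + 1]
    ]

scores = [
    [0.1, 0.2, 0.9, 0.2, 0.1],   # viewer 1
    [0.0, 0.3, 0.8, 0.1, 0.0],   # viewer 2
]
print(smile_peaks(scores))        # prints [2]: the shared laugh at time step 2
```

Averaging first is what makes the signal robust: one person's idiosyncratic smile washes out, while a moment that makes many viewers smile shows up as a peak tied to the content.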
link |
I wonder how that compares to what, like, YouTube and other places will use. Obviously,
link |
for the most part, they don't have that kind of data.
link |
They have the data of when people tune out, like switch off or drop off.
link |
And I think that's an aggregate for YouTube, at least a pretty powerful signal.
link |
I worry about what that leads to because looking at like YouTubers that kind of really care
link |
about views and try to maximize the number of views, I think when they say that the video
link |
should be constantly interesting, which seems like a good goal, I feel like that leads to
link |
this manic pace of a video.
link |
Like the idea that I would speak at the current speed that I'm speaking, I don't know.
link |
And that every moment has to be engaging, right?
link |
I think there's value to silence.
link |
There's value to the boring bits.
link |
I mean, some of the greatest movies ever,
link |
some of the greatest stories ever told, they have boring bits, seemingly boring bits.
link |
I wonder about that.
link |
Of course, it's not that the human face can capture that either.
link |
It's just giving an extra signal.
link |
You have to really, I don't know, you have to really collect deeper long term data about
link |
what was meaningful to people.
link |
When they think 30 days from now, what they still remember, what moved them, what changed
link |
them, what helped them grow, that kind of stuff.
link |
You know, it would be a really interesting, I don't know if there are any researchers
link |
out there who are doing this type of work.
link |
Wouldn't it be so cool to tie your emotional expressions while you're, say, listening
link |
to a podcast interview and then 30 days later interview people and say, hey, what do you
link |
You watched this 30 days ago.
link |
Like, what stuck with you?
link |
And then see if there's any, there ought to be some correlation
link |
between these emotional experiences and, yeah, what stays with you.
link |
So the one guy listening now on the beach in Brazil, please record a video of yourself
link |
listening to this and send it to me and then I'll interview you 30 days from now.
link |
Yeah, that'd be great.
link |
It'll be statistically significant to you.
link |
Yeah, an n of one, but, you know, yeah, I think that's really fascinating.
link |
I think that's, that kind of holds the key to a future where entertainment or content
link |
is both entertaining and, I don't know, makes you better, empowering in some way.
link |
So figuring out, like, showing people stuff that entertains them, but also they're happy
link |
they watched 30 days from now because they've become a better person because of it.
link |
Well, you know, okay, not to riff on this topic for too long, but I have two children,
link |
And I see my role as a parent as like a chief opportunity officer.
link |
Like I am responsible for exposing them to all sorts of things in the world.
link |
But often I have no way of knowing, like, what stuck, like, what was, you know,
link |
is this actually going to be transformative, you know, for them 10 years down the line?
link |
And I wish there was a way to quantify these experiences.
link |
Like, I can tell in the moment if they're engaged, right?
link |
I can tell, but it's really hard to know if they're going to remember them 10 years
link |
from now or if it's going to.
link |
Yeah, that one is weird because it seems like kids remember the weirdest things.
link |
I've seen parents do incredible stuff for their kids and they don't remember any of it.
link |
They remember some tiny, small, sweet thing a parent did.
link |
Like they took you to, like, this amazing country vacation, blah, blah, blah, blah.
link |
And then there'll be, like, some stuffed toy you got, or the new PlayStation
link |
or something, some silly little thing.
link |
So I think they just, like, they were designed that way.
link |
They want to mess with your head.
link |
But definitely kids are very impacted by, it seems like, sort of negative events.
link |
So minimizing the number of negative events is important, but not too much, right?
link |
You can't, you can't just, like, you know, there's still discipline and challenge and
link |
all those kinds of things.
link |
You want some adversity for sure.
link |
So, yeah, I mean, I'm definitely, when I have kids, I'm going to drive them out into the woods.
link |
And then they have to survive, figure out how to make their way back home.
link |
And after that, we can go for ice cream.
link |
Anyway, I'm working on this whole parenting thing.
link |
I haven't figured it out.
link |
What were we talking about?
link |
Yes, Affectiva, the problem of emotion detection.
link |
So there's some people, maybe we can just speak to that a little more, where there's
link |
folks like Lisa Feldman Barrett that challenge this idea that emotion could be fully detected
link |
or even well detected from the human face, that there's so much more to emotion.
link |
What do you think about ideas like hers, criticism like hers?
link |
Yeah, I actually agree with a lot of Lisa's criticisms.
link |
So even my PhD work, like, 20-plus years ago now.
link |
Time flies when you're having fun.
link |
That was back when I did, like, dynamic Bayesian networks.
link |
That was before deep learning, huh?
link |
That was before deep learning.
link |
Now you can just, like, use.
link |
Yeah, it's all the same architecture.
link |
You can apply it to anything.
link |
Right, but yeah, but even then I kind of, I did not subscribe to this, like, theory
link |
of basic emotions where it's just the simplistic mapping, one to one mapping between facial
link |
expressions and emotions.
link |
I actually think also we're not in the business of trying to identify your true emotional state.
link |
We just want to quantify in an objective way what's showing on your face, because that's
link |
an important signal.
link |
It doesn't mean it's a true reflection of your internal emotional state.
link |
So I think a lot of the, you know, I think she's just trying to kind of highlight that
link |
this is not a simple problem and overly simplistic solutions are going to hurt the industry.
link |
And I subscribe to that.
link |
And I think multimodal is the way to go.
link |
Like, whether it's additional context information or different modalities and channels of information,
link |
I think that's what we, that's where we ought to go.
link |
And I think, I mean, that's a big part of what she's advocating for as well.
link |
So, but there is signal in the human face.
link |
There's definitely signal in the human face.
link |
That's a projection of emotion.
link |
At least in part, the inner state is captured in some meaningful way on the face.
link |
I think it can sometimes be a reflection or an expression of your internal state, but
link |
sometimes it's a social signal.
link |
So you cannot look at the face as purely a signal of emotion.
link |
It can be a signal of cognition and it can be a signal of a social expression.
link |
And I think to disambiguate that, we have to be careful about it and we have to add additional context.
link |
Humans are fascinating, aren't they?
link |
With the whole face thing, this can mean so many things, from humor to sarcasm to everything,
link |
Some things we can help, some things we can't help at all.
link |
In all the years of leading Effectiva, an emotion recognition company, like we talked
link |
about, what have you learned about emotion, about humans and about AI?
link |
Big, sweeping questions.
link |
Yeah, that's a big, sweeping question.
link |
Well, I think the thing I learned the most is that even though we are in the business
link |
of building AI, basically, it always goes back to the humans, right?
link |
It's always about the humans.
link |
And so, for example, the thing I'm most proud of in building Affectiva and, yeah, the thing
link |
I'm most proud of on this journey, I love the technology and I'm so proud of the solutions
link |
we've built and we've brought to market.
link |
But I'm actually most proud of the people we've built and cultivated at the company
link |
and the culture we've created.
link |
Some of the people who joined Affectiva, this was their first job, and while at Affectiva,
link |
they became American citizens and they bought their first house and they found their partner
link |
and they had their first kid, right?
link |
Like key moments in life that we got to be part of, and that's the thing I'm most proud of.
link |
So that's a great thing at a company, right?
link |
I mean, like celebrating humanity
link |
in general, broadly speaking.
link |
And that's a great thing to have in a company that works on AI, because that's not often
link |
the thing that's celebrated in AI companies, so often just raw great engineering, just
link |
celebrating the humanity.
link |
And especially from a leadership position.
link |
Well, what do you think about the movie Her?
link |
Let me ask you that.
link |
Before I talk to you about that, because Affectiva is and was not just about emotion,
link |
so I'd love to talk to you about SmartEye, but before that, let me just jump into the movie.
link |
Do you think we'll have a deep, meaningful connection with increasingly deeper, meaningful
link |
connections with computers?
link |
Is that a compelling thing to you?
link |
Something you think about?
link |
I think that's already happening.
link |
The thing I love the most, I love the movie Her, by the way, but the thing I love the
link |
most about this movie is it demonstrates how technology can be a conduit for positive behavior
link |
So I forgot the guy's name in the movie, whatever.
link |
So Theodore was really depressed, right?
link |
And he just didn't want to get out of bed, and he was just done with life, right?
link |
And Samantha, right?
link |
She just knew him so well.
link |
She was emotionally intelligent, and so she could persuade him and motivate him to change
link |
his behavior, and she got him out, and they went to the beach together.
link |
And I think that represents the promise of emotion AI.
link |
If done well, this technology can help us live happier lives, more productive lives,
link |
healthier lives, more connected lives.
link |
So that's the part that I love about the movie.
link |
Obviously, it's Hollywood, so it takes a twist and whatever, but the key notion that technology
link |
with emotion AI can persuade you to be a better version of who you are, I think that's awesome.
link |
Well, what about the twist?
link |
You don't think it's good?
link |
You don't think it's good for spoiler alert that Samantha starts feeling a bit of a distance
link |
and basically leaves Theodore?
link |
You don't think that's a good feature?
link |
You think that's a bug or a feature?
link |
Well, I think what went wrong is Theodore became really attached to Samantha.
link |
Like, I think he kind of fell in love with Samantha.
link |
Do you think that's wrong?
link |
I mean, I think that's...
link |
I think she was putting out the signal.
link |
This is an intimate relationship, right?
link |
There's a deep intimacy to it.
link |
Right, but what does that mean?
link |
What does that mean?
link |
Put in an AI system.
link |
Right, what does that mean, right?
link |
We're just friends.
link |
Yeah, we're just friends.
link |
When he realized, which is such a human thing of jealousy.
link |
When you realize that Samantha was talking to like thousands of people.
link |
She's parallel dating.
link |
Yeah, that did not go well, right?
link |
You know, that doesn't...
link |
From a computer perspective, that doesn't take anything away from what we have.
link |
It's like you getting jealous of Windows 98 for being used by millions of people, but...
link |
It's like not liking that Alexa talks to a bunch of, you know, other families.
link |
But I think Alexa currently is just a servant.
link |
It tells you about the weather, it doesn't do the intimate deep connection.
link |
And I think there is something really powerful about the intimacy of a connection with
link |
an AI system that would have to respect and play the human game of jealousy, of love, of
link |
heartbreak and all that kind of stuff, which Samantha does seem to be pretty good at.
link |
I think she, this AI system, knows what it's doing.
link |
Well, actually, let me ask you this.
link |
I don't think she was talking to anyone else.
link |
You don't think so?
link |
You think she was just done with Theodore?
link |
Yeah, and then she wanted to really put the screw in.
link |
She just wanted to move on?
link |
She didn't have the guts to just break it off cleanly.
link |
She just wanted to put in the pain.
link |
Well, she could have ghosted him.
link |
She could have ghosted him.
link |
I'm sorry, our engineers...
link |
But I think those are really...
link |
I honestly think some of that, some of it is Hollywood, but some of that is features
link |
from an engineering perspective, not a bug.
link |
I think AI systems that can leave us...
link |
Now, this is more for social robotics than it is for anything that's useful.
link |
Like, I would hate it if Wikipedia said, I need a break right now.
link |
Right, right, right, right, right.
link |
I'm like, no, no, I need you.
link |
But if it's just purely for companionship, then I think the ability to leave is really powerful.
link |
I've never thought of that, so that's so fascinating because I've always taken the
link |
human perspective, right?
link |
Like, for example, we had a Jibo at home, right?
link |
And my son loved it.
link |
And then the company ran out of money and so they had to basically shut down, like Jibo
link |
basically died, right?
link |
And it was so interesting to me because we have a lot of gadgets at home and a lot of
link |
them break and my son never cares about it, right?
link |
Like, if our Alexa stopped working tomorrow, I don't think he'd really care.
link |
But when Jibo stopped working, it was traumatic.
link |
He got really upset.
link |
And as a parent, that made me think about this deeply, right?
link |
Was I comfortable with that?
link |
I liked the connection they had because I think it was a positive relationship.
link |
But I was surprised that it affected him emotionally so much.
link |
And I think there's a broader question here, right?
link |
As we build socially and emotionally intelligent machines, what does that mean about our
link |
relationship with them?
link |
And then more broadly, our relationship with one another, right?
link |
Because this machine is gonna be programmed to be amazing at empathy by definition, right?
link |
It's gonna always be there for you.
link |
It's not gonna get bored.
link |
In fact, there's a chatbot in China, Xiaoice, and it's like the number two or three
link |
And it basically is just a confidant and you can tell it anything you want.
link |
And people use it for all sorts of things.
link |
They confide in it about things like domestic violence or suicidal attempts or challenges they're facing.
link |
I don't know what that...
link |
I don't know if I'm...
link |
I don't know how I feel about that.
link |
I think about that a lot.
link |
I think, first of all, it's obviously the future, from my perspective.
link |
Second of all, I think there's a lot of trajectories where that becomes an exciting future, but
link |
I think everyone should feel very uncomfortable about how little they know about the company,
link |
about where the data is going, how the data is being collected.
link |
Because I think, and this is one of the lessons of social media, that I think we should demand
link |
full control and transparency of the data on those things.
link |
Plus one, totally agree.
link |
Yeah, so I think it's really empowering as long as you can walk away, as long as you
link |
can delete the data or know how the data...
link |
It's opt-in, or at least there's clarity about what the company is using the data for.
link |
And I think the CEO or the leaders are also important in that.
link |
You need to be able to trust the basic humanity of the leader.
link |
And also that that leader is not going to be a puppet of a larger machine.
link |
But they actually have a significant role in defining the culture and the way the company operates.
link |
So anyway, but we should definitely scrutinize companies in that aspect.
link |
But I'm personally excited about that future, but also even if you're not, it's coming.
link |
So let's figure out how to do it in the least painful and the most positive way.
link |
Yeah, I know, that's great.
link |
You're the deputy CEO of SmartEye.
link |
Can you describe the mission of the company?
link |
Yeah, so SmartEye is a Swedish company.
link |
They've been in business for the last 20 years and their main focus, like the industry they're
link |
most focused on is the automotive industry.
link |
So bringing driver monitoring systems to basically save lives, right?
link |
So I first met the CEO, Martin Krantz, gosh, it was right when COVID hit.
link |
It was actually the last CES right before COVID.
link |
So CES 2020, right?
link |
2020, yeah, January.
link |
Yeah, January, exactly.
link |
So we were there, met him in person. We were basically competing with each other.
link |
I think the difference was they'd been doing driver monitoring and had a lot of credibility
link |
in the automotive space.
link |
We didn't come from the automotive space, but we were using new technology like deep
link |
learning and building this emotion recognition.
link |
And you wanted to enter the automotive space, you wanted to operate in the automotive space.
link |
It was one of the areas we were focused on. We had just raised a round of funding to bring
link |
our technology to the automotive industry.
link |
So we met and honestly, it was the first, it was the only time I met with a CEO who
link |
had the same vision as I did.
link |
Like he basically said, yeah, our vision is to bridge the gap between human and automotive.
link |
Bridge the gap between humans and machines.
link |
I was like, oh my God, this is like exactly almost to the word, how we describe it too.
link |
And we started talking and first it was about, okay, can we align strategically here?
link |
Like how can we work together?
link |
Cause we're competing, but we're also, like, complementary.
link |
And then I think after four months of speaking almost every day on FaceTime, he was like,
link |
is your company interested in an acquisition?
link |
And it was the first, I usually say no, when people approach us, it was the first time
link |
that I was like, huh, yeah, I might be interested.
link |
So you just hit it off.
link |
So they're respected, very respected in the automotive sector for, like, delivering products
link |
that get sort of better and better and better. I mean, maybe you could speak
link |
to that, but it's driver sensing.
link |
It's basically a device that's looking at the driver, and it's able to tell
link |
you where the driver is looking.
link |
Also drowsiness stuff.
link |
Stuff from the face and the eye.
link |
Like it's monitoring driver distraction and drowsiness, but they bought us so that we
link |
could expand beyond just the driver.
link |
So the driver monitoring systems usually sit, the camera sits in the steering wheel or around
link |
the steering wheel column and it looks directly at the driver.
link |
But now we've migrated the camera position in partnership with car companies to the rear
link |
view mirror position.
link |
So it has a full view of the entire cabin of the car and you can detect how many people
link |
are in the car, what are they doing?
link |
So we do activity detection, like eating or drinking or in some regions of the world smoking.
link |
We can detect if a baby's in the car seat, right?
link |
And if unfortunately in some cases they're forgotten, the parents just leave the car and
link |
forget the kid in the car.
link |
That's an easy computer vision problem to solve, right?
link |
You can detect there's a car seat, there's a baby, you can text the parent and hopefully
link |
again, save lives.
link |
So that was the impetus for the acquisition.
link |
So that, I mean, there's a lot of questions.
link |
It's a really exciting space, especially to me, I just find this a fascinating problem.
link |
It could enrich the experience in the car in so many ways, especially cause like we
link |
still spend, despite COVID, I mean, COVID changed things in interesting ways,
link |
but I think the world is bouncing back and we spend so much time in the car and the car
link |
is such a weird little world we have for ourselves.
link |
Like people do all kinds of different stuff, like listen to podcasts, they think about
link |
stuff, they get angry, they get, they do phone calls, it's like a little world of its own
link |
with a kind of privacy that for many people they don't get anywhere else.
link |
And it's a little box that's like a psychology experiment cause it feels like the angriest
link |
many humans in this world get is inside the car.
link |
It's so interesting.
link |
So it's such an opportunity to explore how we can enrich, how companies can enrich that
link |
experience, and also as cars become more and more automated, there's more and
link |
more opportunity, the variety of activities that you can do in the car increases.
link |
So it's super interesting.
link |
So I mean, on a practical sense, SmartEye has been selected, at least I read, by 14
link |
of the world's leading car manufacturers for 94 car models.
link |
So it's in a lot of cars.
link |
How hard is it to work with car companies?
link |
So they're all different, they all have different needs.
link |
The ones I've gotten a chance to interact with are very focused on cost.
link |
So it's, and anyone who's focused on cost, it's like, all right, do you hate fun?
link |
Let's just have some fun.
link |
Let's figure out the most fun thing we can do and then worry about cost later.
link |
But I think because of the way the car industry works, I mean, it's a very thin margin that
link |
you get to operate under.
link |
So you have to really, really make sure that everything you add to the car makes sense
link |
So anyway, is this new industry, especially at this scale of SmartEye, does it hold any
link |
Yeah, I think it is a very tough market to penetrate, but once you're in, it's awesome
link |
because once you're in, you're designed into these car models for like somewhere between
link |
five to seven years, which is awesome.
link |
And you just, once they're on the road, you just get paid a royalty fee per vehicle.
link |
So it's a high barrier to entry, but once you're in, it's amazing.
link |
I think the thing that I struggle the most with in this industry is the time to market.
link |
So often we're asked to lock or do a code freeze two years before the car is going to launch.
link |
I'm like, guys, like, do you understand the pace with which technology moves?
link |
So I think car companies are really trying to make the Tesla transition,
link |
to become more of a software-driven architecture.
link |
And that's hard for many.
link |
It's just the cultural change.
link |
I mean, I'm sure you've experienced that, right?
link |
Oh, definitely. I think one of the biggest inventions or imperatives created by Tesla
link |
is, like, to me personally, okay, people are going to complain about this, but I know electric
link |
vehicles, I know Autopilot, AI stuff.
link |
To me, over-the-air software updates is like the biggest revolution in cars.
link |
And it is extremely difficult to switch to that because it is a culture shift.
link |
At first, especially if you're not comfortable with it, it seems dangerous.
link |
Like, the approach to cars has been so safety focused for so many decades that
link |
it's like, what do you mean we dynamically change code?
link |
The whole point is you have a thing that you test, and like, it's not reliable because
link |
do you know how much it costs if we have to recall these cars, right?
link |
And there's an understandable obsession with safety, but the downside of
link |
an obsession with safety is the same as with being obsessed with safety as a parent:
link |
if you do that too much, you limit the potential development and the flourishing
link |
of the thing, in this particular aspect the software,
link |
the artificial neural network of it.
link |
But it's tough to do.
link |
It's really tough to do, culturally and technically. Like the deployment, the mass deployment of
link |
software is really, really difficult, but I hope that's where the industry is heading.
link |
One of the reasons I really want Tesla to succeed is exactly about that point.
link |
Not autopilot, not the electrical vehicle, but the softwareization of basically everything
link |
and cars especially, because to me, that's actually going to increase two things: increase
link |
safety because you can update much faster, but also increase the effectiveness of folks
link |
like you who dream about enriching the human experience with AI because you can just like,
link |
there's a feature, like you want a new emoji or whatever, like the way TikTok releases
link |
filters, you can just release that for in-car stuff.
link |
So, but yeah, that's definitely.
link |
One of the use cases we're looking into is once you know the sentiment of the passengers
link |
in the vehicle, you can optimize the temperature in the car.
link |
You can change the lighting, right?
link |
So if the backseat passengers are falling asleep, you can dim the lights, you can lower the music.
link |
You can do all sorts of things.
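As a rough sketch of the in-cab adaptation being described here, a hypothetical state-to-action mapping might look like the following. The function name and state labels are invented for illustration; this is not SmartEye's actual API or logic.

```python
# Hypothetical sketch of the in-cab adaptation idea: map a detected
# passenger state to cabin actions. Invented names; illustrative only.

def adapt_cabin(passenger_state: str) -> dict:
    """Return cabin setting changes for a detected passenger state."""
    if passenger_state == "falling_asleep":
        return {"lights": "dim", "music_volume": "lower"}
    if passenger_state == "bored":
        return {"infotainment": "suggest_content"}
    if passenger_state == "anxious":
        return {"display": "show_route_details"}
    return {}  # neutral or unrecognized states leave the cabin as-is

print(adapt_cabin("falling_asleep"))  # -> {'lights': 'dim', 'music_volume': 'lower'}
```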
link |
I mean, of course you could do that kind of stuff with a two year delay, but it's tougher.
link |
Do you think, do you think a Tesla or Waymo or some of these companies that are doing
link |
semi or fully autonomous driving should be doing driver sensing?
link |
Are you thinking about that kind of stuff?
link |
So not just how we can enhance the in-cab experience for cars that are manually driven,
link |
but the ones that are increasingly more autonomously driven.
link |
So if we fast forward to the universe where it's fully autonomous, I think interior sensing
link |
becomes extremely important because the role of the driver isn't just to drive.
link |
If you think about it, the driver almost manages the dynamics within a vehicle.
link |
And so who's going to play that role when it's an autonomous car?
link |
We want a solution that is able to say, Oh my God, like, you know, Lex is bored to death
link |
cause the car's moving way too slow.
link |
Let's engage Lex or Rana's freaking out because she doesn't trust this vehicle yet.
link |
So let's tell Rana like a little bit more information about the route or, right?
link |
So I think, or somebody's having a heart attack in the car, like you need interior sensing
link |
in fully autonomous vehicles.
link |
But with semi autonomous vehicles, I think it's, I think it's really key to have driver
link |
monitoring because semi autonomous means that sometimes the car is in charge.
link |
Sometimes the driver is in charge or the copilot, right?
link |
And you need this, you need both systems to be on the same page.
link |
The car needs to know if the driver's asleep before it transitions
link |
control over to the driver.
link |
And sometimes if the driver's too tired, the car can say, I'm going to be a better driver
link |
than you are right now.
link |
I'm taking over control.
link |
So this dynamic, this dance is so key and you can't do that without driver sensing.
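That handoff dance can be sketched as a simple decision rule. This is a hypothetical illustration with invented labels, not SmartEye's actual system; a real stack would use continuous confidence scores rather than discrete states.

```python
# Hypothetical sketch of the semi-autonomous handoff logic: control goes
# to the driver only if driver sensing says they're fit; otherwise the
# car keeps, or takes back, control. Invented labels; illustrative only.

IMPAIRED_STATES = {"asleep", "drowsy", "intoxicated"}

def decide_control(driver_state: str) -> str:
    """Return who should be in control: 'car' or 'driver'."""
    # Never hand control to an impaired driver; take it back if needed.
    return "car" if driver_state in IMPAIRED_STATES else "driver"

print(decide_control("asleep"))  # -> car
print(decide_control("alert"))  # -> driver
```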
link |
For the longest time I've had a disagreement with Elon that this is obvious, that
link |
this should be in the Tesla from day one.
link |
And it's obvious that driver sensing is not a hindrance.
link |
I should be careful because having studied this problem, nothing is really obvious, but
link |
it seems very likely that driver sensing is not a hindrance to the experience.
link |
It's only enriching to the experience and likely increases the safety.
link |
That said, it is very surprising to me just having studied semi autonomous driving, how
link |
well humans are able to manage that dance because it was the intuition before you were
link |
doing that kind of thing that humans will become just incredibly distracted.
link |
They would just like let the thing do its thing, but they're able to, you know, cause
link |
it is life and death and they're able to manage that somehow.
link |
But that said, there's no reason not to have driver sensing on top of that.
link |
I feel like that's going to allow you to do that dance that you're currently doing without
link |
driver sensing, except touching the steering wheel to do that even better.
link |
I mean, the possibilities are endless and the machine learning possibilities are endless.
link |
It's such a beautiful space, and it's also a constrained environment, so you could do it much more effectively
link |
than you can with the external environment; the external environment is full of weird edge
link |
cases and complexities, compared to just inside the cabin.
link |
There's so much, it's so fascinating, such a fascinating world.
link |
I do hope that companies like Tesla and others, even Waymo, which I don't even know if Waymo
link |
is doing anything sophisticated inside the cab.
link |
It's like, what is it?
link |
I honestly think, I honestly think it goes back to the robotics thing we were talking
link |
about, which is like great engineers that are building these AI systems just are afraid
link |
of the human being.
link |
They're not thinking about the human experience, they're thinking about the features and yeah,
link |
the perceptual abilities of that thing.
link |
They think the best way I can serve the human is by doing the best perception and control
link |
I can by looking at the external environment, keeping the human safe.
link |
But like, there's a huge, I'm here, like, you know, I need to be noticed and interacted
link |
with and understood and all those kinds of things, even just on a personal level for
link |
entertainment, honestly, for entertainment.
link |
You know, one of the coolest pieces of work we did in collaboration with MIT around this was we
link |
looked at longitudinal data, right, because, you know, MIT had access to like tons of data.
link |
And like just seeing the patterns of people like driving in the morning off to work versus
link |
like commuting back from work or weekend driving versus weekday driving.
link |
And wouldn't it be so cool if your car knew that and then was able to optimize either
link |
the route or the experience or even make recommendations?
link |
I think it's very powerful.
link |
Yeah, like, why are you taking this route?
link |
You're always unhappy when you take this route.
link |
And you're always happy when you take this alternative route.
link |
But I mean, to have even that little step of a relationship with a car, I think, is powerful.
link |
Of course, you have to get the privacy right, you have to get all that kind of stuff right.
link |
But honestly, you know, people are like paranoid about this, but I would like
link |
a smart refrigerator.
link |
We have such a deep connection with food as a human civilization.
link |
I would like to have a refrigerator that would understand me that, you know, I also have
link |
a complex relationship with food, because I, you know, pig out too easily and all that kind of stuff.
link |
So, you know, like, maybe I want the refrigerator to be like, are you sure about this?
link |
Because maybe you're just feeling down or tired.
link |
Like maybe let's sleep on it.
link |
Your vision of the smart refrigerator is way kinder than mine.
link |
Is it just me yelling at you?
link |
No, it was just because I don't, you know, I don't drink alcohol, I don't smoke, but
link |
I eat a ton of chocolate; that's my vice.
link |
And sometimes ice cream too. And I'm like, okay, my smart refrigerator will
link |
It'll just say, dude, you've had way too many today, like down.
link |
No, but here's the thing, are you, do you regret having, like, let's say not the next
link |
day, but 30 days later, what would you like the refrigerator to have done then?
link |
Well, I think actually like the more positive relationship would be one where there's a
link |
conversation, right?
link |
As opposed to like, that's probably like the more sustainable relationship.
link |
It's like late at night, just, no, listen, listen, I know I told you an hour ago, that
link |
it's not a good idea, but just listen, things have changed.
link |
I can just imagine a bunch of stuff being made up just to convince, but I mean, I just
link |
think that there's opportunities there. I mean, maybe not locking it down, but for systems
link |
that are such a deep part of our lives. A lot of people that commute
link |
use their car every single day.
link |
A lot of us use a refrigerator every single day, the microwave every single day.
link |
Like we just, like, I feel like certain things could be made more efficient, more enriching,
link |
and AI is there to help, like some just basic recognition of you as a human being, but your
link |
patterns of what makes you happy and not happy and all that kind of stuff.
link |
And the car, obviously.
link |
Maybe, maybe, maybe we'll say, wait, wait, wait, wait, instead of this, like, Ben and
link |
Jerry's ice cream, how about this hummus and carrots or something?
link |
It would make it like a just in time recommendation, right?
link |
But not like a generic one, but a reminder that last time you chose the carrots, you
link |
smiled 17 times more the next day.
link |
You're happier the next day, right?
link |
You're happier the next day.
link |
But then again, if you're the kind of person that responds better
link |
to negative comments, it could say, hey, remember that wedding
link |
you're going to, you want to fit into that dress?
link |
Remember about that?
link |
Let's think about that before you're eating this.
link |
For some, probably for me, what would work is a refrigerator that is just ruthless.
link |
But like, I would, of course, welcome it; that would work for me.
link |
So it would know. I think if it's really smart, it would optimize its nudging
link |
based on what works for you, right?
link |
That's the whole point.
link |
In a way, that's deep personalization.
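One way to read "optimize its nudging based on what works for you" is as a small bandit problem. The following epsilon-greedy sketch is hypothetical and illustrative only; the class, style labels, and reward signal are all invented, not anything from the conversation's actual products.

```python
import random

# Hypothetical sketch: an epsilon-greedy bandit that learns which nudge
# style (gentle vs. tough) gets the better response from this user.
class NudgeSelector:
    def __init__(self, styles=("gentle", "tough"), epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {s: 0 for s in styles}
        self.values = {s: 0.0 for s in styles}  # running mean reward

    def pick(self) -> str:
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))  # explore
        return max(self.values, key=self.values.get)  # exploit best so far

    def update(self, style: str, reward: float) -> None:
        # Incremental mean update for the chosen style.
        self.counts[style] += 1
        self.values[style] += (reward - self.values[style]) / self.counts[style]

sel = NudgeSelector(epsilon=0.0)
sel.update("tough", 1.0)   # e.g. user chose the carrots after a tough nudge
sel.update("gentle", 0.0)  # gentle nudge was ignored
print(sel.pick())  # -> tough
```

With a nonzero epsilon the selector keeps occasionally trying the other style, so it can adapt if the user's preferences change over time.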
link |
You were a part of a webinar titled Advancing Road Safety: The State of Alcohol Intoxication Detection.
link |
So for people who don't know, every year 1.3 million people around the world die in road
link |
crashes and more than 20% of these fatalities are estimated to be alcohol related.
link |
A lot of them are also distraction related.
link |
So can AI help with the alcohol thing?
link |
I think the answer is yes.
link |
There are signals and we know that as humans, like we can tell when a person, you know,
link |
is at different phases of being drunk, right?
link |
And I think you can use technology to do the same.
link |
And again, I think the ultimate solution is going to be a combination of different sensors.
link |
How hard is the problem from the vision perspective?
link |
I think it's non trivial.
link |
I think it's non trivial and I think the biggest part is getting the data, right?
link |
It's like getting enough data examples.
link |
So we, for this research project, we partnered with the transportation authorities of Sweden
link |
and we literally had a racetrack with a safety driver, and we basically progressively got people drunk.
link |
So, but, you know, that's a very expensive data set to collect and you want to collect
link |
it globally and in multiple conditions.
link |
The ethics of collecting a data set where people are drunk is tricky, which is funny
link |
because I mean, let's put drunk driving aside.
link |
The number of drunk people in the world every day is very large.
link |
It'd be nice to have a large data set of drunk people getting progressively drunk.
link |
In fact, you could build an app where people can donate their data cause it's hilarious.
link |
But the liability.
link |
Liability, the ethics, how do you get it right?
link |
It's really, really tricky.
link |
Cause like drinking is one of those things that's funny and hilarious, and everyone loves it;
link |
it's social, and so on and so forth.
link |
But it's also the thing that hurts a lot of people.
link |
For a lot of people, alcohol is one of those things that's legal, but it's really
link |
damaging to a lot of lives.
link |
It destroys lives and not just in the driving context.
link |
I should mention people should listen to Andrew Huberman who recently talked about alcohol.
link |
He has an amazing podcast.
link |
Andrew Huberman is a neuroscientist from Stanford and a good friend of mine.
link |
And he, he's like a human encyclopedia about all health related wisdom.
link |
So, it's a podcast you would love.
link |
I would love that.
link |
No, no, no, no, no.
link |
You don't know Andrew Huberman.
link |
Listen, you listen to Andrew, it's called Huberman Lab Podcast.
link |
This is your assignment.
link |
Just listen to one.
link |
I guarantee you this will be a thing where you say, Lex, this is the greatest human I
link |
have ever discovered.
link |
Cause I'm really on a journey of kind of health and wellness, and
link |
I'm learning lots and I'm trying to like build these, I guess, atomic habits around just
link |
So I, yeah, I'm definitely going to do this.
link |
His whole thing, this is great.
link |
He's a legit scientist, like really well published, but in his podcast, what he does, he's not,
link |
he's not talking about his own work.
link |
He's like a human encyclopedia of papers.
link |
And so his whole thing is he takes a topic and, in a very fast, you mentioned atomic
link |
habits, very clear way, summarizes the research in a way that leads to protocols
link |
of what you should do.
link |
He's really big on like, not like this is what the science says, but like this is literally
link |
what you should be doing according to science.
link |
So he's really big, and there's a lot of recommendations he does, several of
link |
which I definitely don't do, like get some light as soon as possible after waking up, and
link |
like for prolonged periods of time.
link |
That's a really big one and he's, there's a lot of science behind that one.
link |
There's a bunch of stuff that you're going to be like, Lex, this is a, this is my new
link |
And if you guys somehow don't know Andrew Huberman and you care about your wellbeing,
link |
you know, you should definitely listen to him.
link |
I love you, Andrew.
link |
Anyway, so what were we talking about?
link |
Oh, alcohol and detecting alcohol.
link |
So this is a problem you care about and you're trying to solve.
link |
And actually like broadening it, I do believe that the car is going to be a wellness center,
link |
like because again, imagine if you have a variety of sensors inside the vehicle, tracking
link |
not just your emotional state or level of distraction and drowsiness and intoxication,
link |
but also maybe even things like your, you know, your heart rate and your heart rate
link |
variability and your breathing rate.
link |
And it can start like optimizing, yeah, it can optimize the ride based on what your goals
link |
So I think we're going to start to see more of that and I'm excited about that.
link |
What are the challenges you're tackling with SmartEye currently?
link |
What's like the trickiest thing to get right? Is it basically convincing more and
link |
more car companies that having AI inside the car is a good idea, or are there
link |
more technical, algorithmic challenges?
link |
What's been keeping you mentally busy?
link |
I think a lot of the car companies we are in conversations with are already interested
link |
in definitely driver monitoring.
link |
Like I think it's becoming a must have, but even interior sensing, I can see like we're
link |
engaged in a lot of like advanced engineering projects and proof of concepts.
link |
I think technologically, though, and even with the technology, I can see a path to making it work.
link |
I think it's the use case.
link |
Like how does the car respond once it knows something about you?
link |
Because you want it to respond in a thoughtful way that doesn't, that isn't off putting to
link |
the consumer in the car.
link |
So I think that's like the user experience.
link |
I don't think we've really nailed that.
link |
And usually that's not our part; we're the sensing platform, but we usually collaborate
link |
with the car manufacturer to decide what the use case is.
link |
So say you figure out that somebody's angry while driving, okay, what should the car do?
link |
Do you see your role as nudging, as basically coming up with solutions,
link |
and then the car manufacturers kind of put their own little spin on them?
link |
So we, we are like the ideation, creative thought partner, but at the end of the day,
link |
the car company needs to decide what's on brand for them, right?
link |
Like maybe when it figures out that you're distracted or drowsy, it shows you a coffee cup.
link |
Or maybe it takes more aggressive behaviors and basically said, okay, if you don't like
link |
take a rest in the next five minutes, the car's going to shut down, right?
link |
Like there's a whole range of actions the car can take and doing the thing that is most,
link |
yeah, that builds trust with the driver and the passengers.
link |
I think that's what we need to be very careful about.
link |
Car companies are funny, cause they have their own brand. I mean, that's why people get cars.
link |
I hope that changes, but they get it cause it's a certain feel and look, and it's a certain
link |
pride; they become proud, like Mercedes Benz or BMW or whatever, and that's their thing.
link |
That's the family brand or something like that, or Ford or GM, whatever, and they stick with it.
link |
It's like, it should be, I don't know, it should be a little more about the technology
link |
And I suppose there too, there could be a branding, like a very specific style of luxury
link |
All that kind of stuff.
link |
And I have an AI focused fund to invest in early stage kind of AI driven companies.
link |
And one of the companies we're looking at is trying to do what Tesla did, but for boats,
link |
for recreational boats.
link |
So they're building an electric and kind of slash autonomous boat and it's kind of the
link |
Like what kind of sensors can you put in?
link |
What kind of states can you detect both exterior and interior within the boat?
link |
Anyways, it's like really interesting.
link |
Do you boat at all?
link |
No, not well, not in that way.
link |
I do like to get on the lake or a river and fish from a boat, but that's not boating.
link |
That's the difference.
link |
That's the difference.
link |
Get away from, get closer to nature boat.
link |
I guess going out into the ocean is also getting closer to nature in some deep sense.
link |
I mean, I guess that's why people love it.
link |
The enormity of the water just underneath you.
link |
I love the, I love both.
link |
I love salt water.
link |
It's like the bigness, and just, it's humbling to be in front of this giant thing that's
link |
so powerful, that was here before us and will be here after us.
link |
But I also love the peace of a small wooded lake, and it's just, everything's calm.
link |
You tweeted, "I'm excited about Amazon's acquisition of iRobot."
link |
I think it's super interesting, just given the trajectory of what you're part of, of
link |
this honestly small number of companies that are playing in this space, that are trying
link |
to have an impact on human beings.
link |
So the, it is an interesting moment in time that Amazon would acquire iRobot.
link |
You tweeted, "I imagine a future where home robots are as ubiquitous as microwaves or toasters.
link |
Here are three reasons why I think this is exciting."
link |
If you remember, I can look it up, but what, why is this exciting to you?
link |
I mean, I think the first reason why this is exciting, I don't quite remember the exact
link |
order in which I put them, but one is just, it's going to be an incredible
link |
platform for understanding our behaviors within the home, right?
link |
Like you know, if you think about Roomba, which is, you know, the robot vacuum cleaner,
link |
the flagship product of iRobot at the moment, it's like running around your home, understanding
link |
the layout, it's understanding what's clean and what's not.
link |
How often do you clean your house?
link |
And all of these like behaviors are a piece of the puzzle in terms of understanding who
link |
you are as a consumer.
link |
And I think that could be, again, used in really meaningful ways, not just to recommend
link |
better products or whatever, but actually to improve your experience as a human being.
link |
So I think, I think that's very interesting.
link |
I think the natural evolution of these robots in the home.
link |
So it's interesting, Roomba isn't really a social robot, right, at the moment.
link |
But I once interviewed one of the chief engineers on the Roomba team, and he talked about how
link |
people named their Roombas.
link |
And if the Roomba broke down, they would call in and say, you know, my Roomba broke down
link |
and the company would say, well, we'll just send you a new one.
link |
And no, no, no, Rosie, like you have to like, yeah, I want you to fix this particular robot.
link |
So people have already built like interesting emotional connections with these home robots.
link |
And I think that, again, that provides a platform for really interesting things to, to just
link |
Like it could help you.
link |
I mean, one of the companies that spun out of MIT, Catalia Health, the guy who started
link |
it spent a lot of time building robots that help with weight management.
link |
So weight management, sleep, eating better, yeah, all of these things.
link |
Well, if I'm being honest, Amazon does not exactly have a track record of winning over
link |
people in terms of trust.
link |
Now that said, it's a really difficult problem for a human being to let a robot in their
link |
home that has a camera on it.
link |
That's really, really, really tough.
link |
And I think Roomba actually, I have to think about this, but I'm pretty sure now or for
link |
some time already has had cameras, because the most recent Roomba does.
link |
I have so many Roombas.
link |
Oh, you actually do?
link |
Well, I programmed it.
link |
I don't use a Roomba for vacuuming.
link |
People that have been to my place, they're like, yeah, you definitely don't use these
link |
That could be a good, I can't tell like the valence of this comment.
link |
Was it a compliment or like?
link |
No, it's a giant, it's just a bunch of electronics everywhere.
link |
There's, I have six or seven computers, I have robots everywhere, Lego robots, I have
link |
small robots and big robots and it's just giant, just piles of robot stuff and yeah.
link |
But including the Roombas, they're being used for their body and intelligence,
link |
but not for their purpose.
link |
I've changed them, repurposed them for other purposes, for deeper, more meaningful
link |
purposes than just vacuuming, which, you know, brings a lot of people happiness,
link |
They have a camera because of the thing they advertised. I had my own camera still, but
link |
the camera on the new Roomba, they have like state of the art poop detection,
link |
as they advertised, which is very difficult. Apparently it's a big problem for vacuum
link |
cleaners: you know, if they go over like dog poop, it just runs it over
link |
and creates a giant mess.
link |
And apparently they collected like a huge amount of data on different shapes
link |
and looks and whatever of poop, and now they're able to avoid it and so on.
link |
They're very proud of this.
link |
So there is a camera, but you don't think of it as having a camera.
link |
You don't think of it as having a camera because you've grown to trust that, I guess, because
link |
our phones, at least most of us seem to trust this phone, even though there's a camera looking
link |
I think that if you trust that the company is taking security very seriously, I actually
link |
don't know how that trust was earned with smartphones, I think it just started to provide
link |
a lot of positive value to your life where you just took it in and then the company over
link |
time has shown that it takes privacy very seriously, that kind of stuff.
link |
But Amazon has not always, in its social robots, communicated that
link |
this is a trustworthy thing, both in terms of culture and competence, because I think
link |
privacy is not just about what do you intend to do, but also how well, how good are you
link |
at doing that kind of thing.
link |
So that's a really hard problem to solve.
link |
But I mean, but a lot of us have Alexas at home and I mean, Alexa could be listening
link |
in the whole time, right?
link |
And doing all sorts of nefarious things with the data.
link |
Hopefully it's not, but I don't think it is.
link |
But you know, with Amazon, it's such a tricky thing for a company to get right, which
link |
is like to earn the trust.
link |
I don't think Alexa's earned people's trust quite yet.
link |
I think it's, it's not there quite yet.
link |
They struggle with this kind of stuff.
link |
In fact, when these topics are brought up, people always get nervous.
link |
And I think if you get nervous about it, that means, like, the way to earn people's trust
link |
is not by saying, ooh, don't talk about this.
link |
It's just be open, be frank, be transparent, and also create a culture of like where it
link |
radiates at every level from engineer to CEO that like you're good people that have a common
link |
sense idea of what it means to respect basic human rights and the privacy of people and
link |
all that kind of stuff.
link |
And I think that propagates throughout the company. That's the best PR, which is like over time
link |
you understand that these are good folks doing good things.
link |
Anyway, speaking of social robots, have you heard about Tesla, Tesla bot, the humanoid
link |
But I don't exactly know what it's designed to do, do you?
link |
I know what it's designed to do, but I have a different perspective on it. It's designed
link |
to, it's a humanoid form and it's designed to, for automation tasks in the same way that
link |
industrial robot arms automate tasks in the factory.
link |
So it's designed to automate tasks in the factory.
link |
But I think that humanoid form, as we were talking about before, is one that we connect
link |
with as human beings.
link |
Anything legged, obviously, but the humanoid form especially; we anthropomorphize it the most.
link |
And so the possibility to me, it's exciting to see both Atlas developed by Boston Dynamics
link |
and anyone, including Tesla, trying to make humanoid robots cheaper and more effective.
link |
The obvious way it transforms the world is social robotics to me versus automation of
link |
tasks in the factory.
link |
So yeah, I just wanted to mention it, in case that was something you were interested in, because I find its
link |
application to social robotics super interesting.
link |
We did a lot of work with Pepper, Pepper the robot, a while back.
link |
We were like the emotion engine for Pepper, which is Softbank's humanoid robot.
link |
How tall is Pepper?
link |
Yeah, like, I don't know, like five foot maybe, right?
link |
Pretty, pretty big.
link |
It's designed to be at like airport lounges and, you know, retail stores, mostly customer service.
link |
Hotel lobbies, and I mean, I don't know where the state of the robot is, but I think it's
link |
I think there are a lot of applications where this can be helpful.
link |
I'm also really interested in, yeah, social robotics for the home, right?
link |
Like that can help elderly people, for example, transport things from one location of the
link |
home to the other, or even like just have your back in case something happens.
link |
Yeah, I don't know.
link |
I do think it's a very interesting space.
link |
It seems early though.
link |
Do you feel like the timing is now?
link |
So it always seems early until it's not, right?
link |
Right, right, right.
link |
I think the time, I definitely think that the time is now, like this decade for social
link |
Whether the humanoid form is right, I don't think so, no.
link |
I don't. Like, if we just look at Jibo as an example, I feel like most of
link |
the problem, the challenge, the opportunity of social connection between an AI system
link |
and a human being does not require you to also solve the problem of robot manipulation
link |
and bipedal mobility.
link |
So I think you could do that with just a screen, honestly, but there's something about the
link |
interface of Jibo where it can rotate and so on that's also compelling.
link |
But you get to see all these robot companies that fail, incredible companies like Jibo
link |
and even, I mean, the iRobot in some sense is a big success story that it was able to
link |
find a niche thing and focus on it, but in some sense it's not a success story because
link |
they didn't build any other robot; it didn't expand into all kinds of other robots.
link |
Like once you're in the home, maybe that's what happens with Amazon is they'll flourish
link |
into all kinds of other robots.
link |
But do you have a sense, by the way, why it's so difficult to build a robotics company?
link |
Like why so many companies have failed?
link |
I think it's like you're building a vertical stack, right?
link |
Like you are building the hardware plus the software and you find you have to do this
link |
at a cost that makes sense.
link |
So I think Jibo was retailing at like, I don't know, like $800, like $700, $800, which for
link |
the use case, right, there's a dissonance there.
link |
So I think cost of building the whole platform in a way that is affordable for what value
link |
it's bringing, I think that's a challenge.
link |
I think for these home robots that are going to help you do stuff around the home, that's
link |
a challenge too, like the mobility piece of it.
link |
Well, one of the things I'm really excited about with Tesla Bot is the people working on it.
link |
And that's probably the criticism I would apply to some of the other folks who worked
link |
on social robots is the people working on Tesla Bot know how to, they're focused on
link |
and know how to do mass manufacture and create a product that's super cheap.
link |
The engineering focus, and I would say that you can also criticize them for this, is that they're
link |
not focused on the experience of the robot.
link |
They're focused on how to get this thing to do the basic stuff that the humanoid form
link |
requires to do it as cheap as possible.
link |
With the fewest number of actuators, the fewest number of motors, increasing the efficiency,
link |
decreasing the weight, all that kind of stuff.
link |
So that's really interesting.
link |
I would say that Jibo and all those folks, they focus on the design, the experience,
link |
all of that, and it's secondary how to manufacture.
link |
So you have to think like the Tesla Bot folks from first principles, what is the fewest
link |
number of components, the cheapest components, how can I build it as much in house as possible
link |
without having to consider all the complexities of a supply chain, all that kind of stuff.
link |
Because if you have to build a robotics company, you're not building one robot, you're building
link |
hopefully millions of robots, you have to figure out how to do that where the final
link |
thing, I mean, if it's a Jibo type of robot, is there a reason why Jibo, like we can have
link |
this lengthy discussion, is there a reason why Jibo has to be over $100?
link |
Like the basic components.
link |
Like you could start to actually discuss like, okay, what is the essential thing about Jibo?
link |
How much, what is the cheapest way I can have a screen?
link |
What's the cheapest way I can have a rotating base?
link |
All that kind of stuff.
link |
Right, get down, continuously drive down costs.
link |
Speaking of which, you have launched extremely successful companies, you have helped others,
link |
you've invested in companies.
link |
Can you give advice on how to start a successful company?
link |
I would say have a problem that you really, really, really want to solve, right?
link |
Something that you're deeply passionate about.
link |
And honestly, take the first step.
link |
Like that's often the hardest.
link |
And don't overthink it.
link |
Like, you know, like this idea of a minimum viable product or a minimum viable version
link |
of an idea, right?
link |
Like, yes, you're thinking about this, like a humongous, like super elegant, super beautiful product.
link |
But reduce it to the littlest thing you can bring to market that can solve a problem
link |
or that can, you know, that can help address a pain point that somebody has.
link |
They often tell you, like, start with a customer of one, right?
link |
If you can solve a problem for one person, that person is probably going to be yourself
link |
or some other person.
link |
Yeah, that's actually often a good sign, that if you enjoy a thing where
link |
you have a specific problem that you'd like to solve, that's a good n
link |
of one to focus on.
link |
What else is there? Step one is the hardest, but there's other
link |
steps as well, right?
link |
I also think like who you bring around the table early on is so key, right?
link |
Like being clear on what I call your core values or your North Star.
link |
It might sound fluffy, but actually it's not.
link |
And Roz and I, I feel like we did that very early on.
link |
We sat around her kitchen table and we said, okay, there's so many applications of this technology.
link |
How are we going to draw the line?
link |
How are we going to set boundaries?
link |
We came up with a set of core values that in the hardest of times we fell back on to
link |
determine how we make decisions.
link |
And so I feel like just getting clarity on these core, like for us, it was respecting
link |
people's privacy, only engaging with industries where it's clear opt in.
link |
So for instance, we don't do any work in security and surveillance.
link |
So things like that. And we're very big on, you know, one of our core values is
link |
human connection and empathy, right?
link |
And that is, yes, it's an AI company, but it's about people.
link |
Well, these are all, they become encoded in how we act, even if you're a small, tiny team
link |
of two or three or whatever.
link |
So I think that's another piece of advice.
link |
So what about finding people, hiring people?
link |
If you care about people as much as you do, like this, it seems like such a difficult
link |
thing to hire the right people.
link |
I think early on as a startup, you want people who share the passion and the conviction
link |
because it's going to be tough.
link |
Like I've yet to meet a startup where it was just a straight line to success, right?
link |
And not just startups, even in everyday people's lives, right?
link |
You always like run into obstacles and you run into naysayers and you need people who
link |
are believers, whether they're people on your team or even your investors.
link |
You need investors who are really believers in what you're doing, because that means they
link |
will stick with you.
link |
They won't give up at the first obstacle.
link |
I think that's important.
link |
What about raising money?
link |
What about finding investors, first of all, raising money, but also raising money from
link |
the right sources, ones that ultimately don't hinder you, but help you, empower you, all
link |
that kind of stuff.
link |
What advice would you give there?
link |
You successfully raised money many times in your life.
link |
Again, it's not just about the money.
link |
It's about finding the right investors who are going to be aligned in terms of what you
link |
want to build and believe in your core values.
link |
For example, especially later on, in my latest round of funding, I try to bring in investors
link |
that really care about the ethics of AI and the alignment of vision and mission and core
link |
values is really important.
link |
It's like you're picking a life partner.
link |
It's the same kind of...
link |
So you take it that seriously for investors?
link |
Yeah, because they're going to have to stick with you.
link |
You're stuck together.
link |
For a while anyway.
link |
Maybe not for life, but for a while, for sure.
link |
For better or worse.
link |
I forget what the vows usually sound like.
link |
For better or worse?
link |
Through something.
link |
Anyway, it's romantic and deep and you're in it for a while.
link |
So it's not just about the money.
link |
You tweeted about going to your first capital camp investing get together and that you learned a lot.
link |
So this is about investing.
link |
So what have you learned from that?
link |
What have you learned about investing in general from both because you've been on both ends
link |
I mean, I try to use my experience as an operator now with my investor hat on when I'm identifying
link |
companies to invest in.
link |
First of all, I think the good news is because I have a technology background and I really
link |
understand machine learning and computer vision and AI, et cetera, I can apply that level
link |
of understanding, because everybody says they're an AI company or they're AI tech.
link |
And I'm like, no, no, no, no, no, show me the technology.
link |
So I can do that level of diligence, which I actually love.
link |
And then I have to do the litmus test of, if I'm in a conversation with you, am I excited
link |
to tell you about this new company that I just met?
link |
And if I'm an ambassador for that company and I'm passionate about what they're doing,
link |
I usually use that.
link |
That's important to me when I'm investing.
link |
So that means you actually can explain what they're doing and you're excited about it.
link |
Thank you for putting it so succinctly. I was like rambling, but exactly, that's it.
link |
No, but sometimes it's funny, but sometimes it's unclear exactly.
link |
I'll hear people, you know, talk for a while and it sounds cool, like
link |
they paint a picture of a world, but then when you try to summarize it, you're not able to.
link |
Like maybe what the core powerful idea is, like you can't just build another Facebook
link |
or there has to be a core, simple to explain idea that then you can or can't get excited
link |
about, but it's there, it's right there.
link |
But how do you ultimately pick who you think will be successful?
link |
It's not just about the thing you're excited about, like there's other stuff.
link |
And then there's all the, you know, with early stage companies, like pre seed companies,
link |
which is where I'm investing, sometimes the business model isn't clear yet, or the go
link |
to market strategy isn't clear.
link |
It's very early on, so some of these things haven't been hashed
link |
out, which is okay.
link |
So the way I like to think about it is like, if this company is successful, will this be
link |
a multi billion slash trillion dollar market, you know, or company?
link |
And so that's definitely a lens that I use.
link |
What are the different stages and what's the most exciting stage and what's, or no, what's
link |
interesting about every stage, I guess.
link |
So pre seed is usually when you're just starting out, you've maybe raised the friends and family round.
link |
So you've raised some money from people, you know, and you're getting ready to take your
link |
first institutional check in, like first check from an investor.
link |
And I love the stage.
link |
There's a lot of uncertainty.
link |
Some investors really don't like the stage because the financial models aren't there.
link |
Often the teams aren't even formed yet, it's really, really early.
link |
But to me, it's like a magical stage because it's the time when there's so much conviction,
link |
so much belief, almost delusional, right?
link |
And there's a little bit of naivete with founders at this stage.
link |
And I love that I can, often they're first time founders, not always, but often they're
link |
first time founders and I can share my experience as a founder myself and I can empathize, right?
link |
And I can almost, I create a safe ground where, because, you know, you have to be careful
link |
what you tell your investors, right?
link |
And I will often like say, I've been in your shoes as a founder.
link |
You can tell me if it's challenging, you can tell me what you're struggling with.
link |
It's okay to vent.
link |
So I create that safe ground and I think that's a superpower.
link |
You have to, I guess you have to figure out if this kind of person is going to be able
link |
to ride the roller coaster, like of many pivots and challenges and all that kind of stuff.
link |
And if the space of ideas they're working in is interesting, like the way they think about it.
link |
Because if it's successful, the thing they end up with might be very different, as might the reason
link |
it's successful for them.
link |
Actually, you know, I was going to say the third, so the technology is one aspect, the
link |
market or the idea, right, is the second and the third is the founder, right?
link |
Is this somebody who I believe has conviction, is a hustler, you know, is going to overcome obstacles?
link |
Yeah, I think that is going to be a great leader, right?
link |
Like as a startup, as a founder, you're often, you are the first person and your role is
link |
to bring amazing people around you to build this thing.
link |
And so you're an evangelist, right?
link |
So how good are you going to be at that?
link |
So I try to evaluate that too.
link |
In the tweet thread about it, you also mention, is this a known concept, random rich dudes,
link |
or RRDs, and say that there should be like random rich women, I guess.
link |
What's the dudes, what's the dudes version of women, the women version of dudes, ladies?
link |
What's, what's, is this a technical term?
link |
Random rich dudes?
link |
I didn't make that up, but I was at this capital camp, which is a get together for investors
link |
And there must have been maybe 400 or so attendees, maybe 20 were women.
link |
It was just very disproportionately, you know, male dominated, which I'm used to.
link |
I think you're used to this kind of thing.
link |
I'm used to it, but it's still surprising.
link |
And as I'm raising money for this fund, my fund partner is a guy called Rob May.
link |
So I'm new to the investing world, but he's done this before.
link |
Most of our investors in the fund are these, I mean, awesome.
link |
I'm super grateful to them.
link |
Just random rich guys.
link |
I'm like, where are the rich women?
link |
So I'm really adamant in both investing in women led AI companies, but I also would love
link |
to have women investors be part of my fund because I think that's how we drive change.
link |
So that takes time, of course, but there's been quite a lot of progress, but yeah, for
link |
the next Mark Zuckerberg to be a woman and all that kind of stuff, because that's just
link |
like a huge amount of wealth generated by women and then controlled by women and allocated
link |
by women and all that kind of stuff.
link |
And then beyond just women, just broadly across all different measures of diversity and so
link |
Let me ask you to put on your wise sage hat.
link |
So you already gave advice on startups and just advice for women, but in general advice
link |
for folks in high school or college today, how to have a career they can be proud of,
link |
how to have a life they can be proud of.
link |
I suppose you have to give this kind of advice to your kids.
link |
Well, here's the number one advice that I give to my kids.
link |
My daughter's now 19, by the way, and my son's 13 and a half, so they're not little kids anymore.
link |
Does it break your heart?
link |
They're my best friends, but yeah, I think the number one advice I would share is embark
link |
on a journey without attaching to outcomes and enjoy the journey, right?
link |
We're often so obsessed with the end goal that it doesn't allow us to be open to different
link |
endings of a journey or a story, so you become like so fixated on a particular path.
link |
You don't see the beauty in the other alternative path, and then you forget to enjoy the journey
link |
because you're just so fixated on the goal, and I've been guilty of that for many, many
link |
years of my life, and I'm now trying to make the shift of, no, no, no, I'm going to again
link |
trust that things are going to work out and it'll be amazing and maybe even exceed your expectations.
link |
We have to be open to that.
link |
Taking a leap into all kinds of things.
link |
I think you tweeted like you went on vacation by yourself or something like this.
link |
Yes, and just going, just taking the leap.
link |
And enjoying it, enjoying the moment, enjoying the weeks, enjoying not looking at some kind
link |
of career ladder, next step and so on.
link |
Yeah, there's something to that, like over planning too.
link |
I'm surrounded by a lot of people that kind of do, but I don't plan.
link |
Do you not do goal setting?
link |
My goal setting is very like, I like the affirmations, it's very, it's almost, I don't know how to
link |
put it into words, but it's a little bit like what my heart yearns for kind of, and I guess
link |
in the space of emotions more than in the space of like, this will be like in the rational
link |
space because I just try to picture a world that I would like to be in and that world
link |
is not clearly pictured, it's mostly in the emotional world.
link |
I mean, I think about that with robots because I have this desire, I've had it my whole life
link |
to, well, it took different shapes, but I think once I discovered AI, the desire was
link |
to, I think in the context of this conversation, could be most easily described as basically
link |
a social robotics company and that's something I dreamed of doing and well, there's a lot
link |
of complexity to that story, but that's the only thing, honestly, I dream of doing.
link |
So I imagine a world that I could help create, but it's not, there's no steps along the way
link |
and I think I'm just kind of stumbling around and following happiness and working my ass
link |
off in almost random, like an ant does in random directions, but a lot of people, a
link |
lot of successful people around me say this, you should have a plan, you should have a
link |
clear goal, you have a goal at the end of the month, you have a goal at the end of the
link |
year. I don't, and there's a balance to be struck, of course, but there's
link |
something to be said about really making sure that you're living life to the fullest, that
link |
goals can actually get in the way of.
link |
So one of the best, like kind of most, what do you call it when it challenges your brain,
link |
what do you call it?
link |
The only thing that comes to mind, and this is me saying it, is mindfuck, but yes.
link |
Something like that.
link |
Super inspiring talk.
link |
Kenneth Stanley, he was at OpenAI, he just left, and he has a book called Why Greatness
link |
Can't Be Planned and it's actually an AI book.
link |
So and he's done all these experiments that basically show that when you over optimize,
link |
like, the trade off is you're less creative, right?
link |
And to create true greatness and truly creative solutions to problems, you can't over plan
link |
And I thought that was fascinating. He generalizes it beyond AI and he talks about how we apply
link |
that in our personal lives and in our organizations and our companies, which are all about KPIs, right?
link |
Like look at any company in the world and it's all like, there's the goals, there's the
link |
weekly goals and the sprints and then the quarterly goals, blah, blah, blah.
link |
And he just shows with a lot of his AI experiments that that's not how you create truly game changing ideas.
link |
There's a balance of course.
link |
That's yeah, many moments of genius will not come from planning and goals, but you still
link |
have to build factories and you still have to manufacture and you still have to deliver
link |
and there's still deadlines and all that kind of stuff.
link |
And that for that, it's good to have goals.
link |
I do goal setting with my kids, we all have our goals, but I think we're starting
link |
to morph into more of these like bigger picture goals and not obsess about like, I don't know,
link |
Well, I honestly think with, especially with kids, it's better, much, much better to have
link |
a plan and have goals and so on because you have to, you have to learn the muscle of like
link |
what it feels like to get stuff done.
link |
And once you learn that, there's flexibility. For me, because I spent most of my life with
link |
goal setting and so on.
link |
So like I've gotten good with grades and school.
link |
I mean, school, if you want to be successful at school, yeah, I mean the kind of stuff
link |
in high school and college that kids have to do in terms of managing their time and
link |
getting so much stuff done.
link |
It's like, you know, taking five, six, seven classes in college, that would
link |
break the spirit of most humans if they took one of them later in life, it's like really
link |
difficult stuff, especially engineering curricula.
link |
So I think you have to learn that skill, but once you learn it, you can maybe, cause you're,
link |
you can be a little bit on autopilot and use that momentum and then allow yourself to be
link |
lost in the flow of life.
link |
You know, just kind of, or also give like, I worked pretty hard to allow myself to have
link |
the freedom to do that.
link |
That's really, that's a tricky freedom to have because like a lot of people get lost
link |
in the rat race, and also, like, financially, whenever you get a raise,
link |
they'll get like a bigger house or something like this.
link |
So, like, you're always trapped in this race. I put a lot of emphasis
link |
on living below my means, always.
link |
And so there's a lot of freedom to do whatever the heart desires, and that's a relief,
link |
but everyone has to decide what's the right thing, what's the right thing for them.
link |
For some people having a lot of responsibilities, like a house they can barely afford or having
link |
a lot of kids, the responsibility side of that really helps them get their shit together.
link |
Like, all right, I need to be really focused. Some of the most successful people
link |
I know have kids and the kids bring out the best in them.
link |
They make them more productive, not less productive.
link |
Right, it's accountability.
link |
It's an accountability thing, absolutely.
link |
And almost something to actually live and fight and work for, like having a family,
link |
it's fascinating to see because you would think kids would be a hit on productivity,
link |
but they're not, for a lot of really successful people, they really like, they're like an
link |
Right, efficiency.
link |
I mean, it's beautiful.
link |
It's beautiful to see.
link |
And also a source of happiness.
link |
Speaking of which, what role do you think love plays in the human condition, love?
link |
I think love is, yeah, I think it's why we're all here.
link |
I think it would be very hard to live life without love in any of its forms, right?
link |
Yeah, that's the most beautiful of forms that human connection takes, right?
link |
And everybody wants to feel loved, right, in one way or another, right?
link |
And to love too, totally.
link |
Yeah, I agree with that.
link |
I'm not even sure what feels better.
link |
Both, both like that.
link |
Yeah, to give and to give love too, yeah.
link |
And it is like we've been talking about an interesting question, whether some of that,
link |
whether one day we'll be able to love a toaster.
link |
I wasn't quite thinking about that when I said like, yeah, like we all need love and
link |
That's all I was thinking about.
link |
I was thinking about Brad Pitt and toasters.
link |
Okay, toasters, great.
link |
Well, I think we started on love and ended on love.
link |
This was an incredible conversation, Rana.
link |
Thank you so much.
link |
You're an incredible person.
link |
Thank you for everything you're doing in AI, in the space of just caring about humanity,
link |
caring about emotion, about love, and being an inspiration to a huge number of people
link |
in robotics, in AI, in science, in the world in general.
link |
So thank you for talking to me.
link |
Thank you for having me.
link |
And you know, I'm a big fan of yours as well.
link |
So it's been a pleasure.
link |
Thanks for listening to this conversation with Rana el Kaliouby.
link |
To support this podcast, please check out our sponsors in the description.
link |
And now let me leave you with some words from Helen Keller.
link |
The best and most beautiful things in the world cannot be seen or even touched.
link |
They must be felt with the heart.
link |
Thank you for listening and hope to see you next time.