
Eugenia Kuyda: Friendship with an AI Companion | Lex Fridman Podcast #121



link |
00:00:00.000
The following is a conversation with Eugenia Kuyda, cofounder of Replika, which is an app
link |
00:00:06.400
that allows you to make friends with an artificial intelligence system, a chatbot, that learns
link |
00:00:11.680
to connect with you on an emotional, you could even say, a human level by being a friend.
link |
00:00:18.480
For those of you who know my interest in AI and views on life in general, know that Replika and
link |
00:00:24.400
Eugenia's line of work is near and dear to my heart. The origin story of Replika is grounded
link |
00:00:30.320
in a personal tragedy of Eugenia losing her close friend, Roman Mazurenko, who was killed
link |
00:00:37.040
crossing the street by a hit and run driver in late 2015. He was 34. The app started as a way
link |
00:00:44.160
to grieve the loss of a friend by training a chatbot, a neural net, on text messages between
link |
00:00:49.440
Eugenia and Roman. The rest is a beautiful human story as we talk about with Eugenia.
link |
00:00:56.480
When a friend mentioned Eugenia's work to me, I knew I had to meet her and talk to her. I felt
link |
00:01:01.520
before, during, and after that this meeting would be an important one in my life. And it was. I think
link |
00:01:08.240
in ways that only time will truly show, to me and others. She is a kind and brilliant person.
link |
00:01:15.520
It was an honor and a pleasure to talk to her. Quick summary of the sponsors, DoorDash, Dollar
link |
00:01:21.840
Shave Club, and Cash App. Click the sponsor links in the description to get a discount and to support
link |
00:01:27.760
this podcast. As a side note, let me say that deep meaningful connection between human beings
link |
00:01:34.160
and artificial intelligence systems is a lifelong passion for me. I'm not yet sure where that passion
link |
00:01:40.160
will take me, but I decided some time ago that I will follow it boldly and without fear, just as
link |
00:01:46.480
far as I can take it. With a bit of hard work and a bit of luck, I hope I'll succeed in helping
link |
00:01:52.160
build AI systems that have some positive impact on the world and on the lives of a few people out
link |
00:01:57.760
there. But also, it is entirely possible that I am in fact one of the chatbots that Eugenia
link |
00:02:06.320
and the Replika team have built. And this podcast is simply a training process for the neural net
link |
00:02:12.400
that's trying to learn to connect to human beings. One episode at a time. In any case,
link |
00:02:18.800
I wouldn't know if I was or wasn't. And if I did, I wouldn't tell you. If you enjoy this thing,
link |
00:02:25.040
subscribe on YouTube, review it with 5 stars on Apple Podcast, follow on Spotify, support on Patreon,
link |
00:02:30.880
or connect with me on Twitter, Lex Fridman. As usual, I'll do a few minutes of ads now and no
link |
00:02:36.640
ads in the middle. I'll try to make these interesting, but give you time stamps so you can skip. But
link |
00:02:42.080
please do still check out the sponsors by clicking the links in the description to get a discount
link |
00:02:47.520
buy whatever they're selling. It really is the best way to support this podcast.
link |
00:02:53.120
This show is sponsored by Dollar Shave Club. Try them out with a one time offer for only
link |
00:02:58.880
five bucks and free shipping at dollarshaveclub.com slash lex. The starter kit comes with a six
link |
00:03:04.800
blade razor, refills, and all kinds of other stuff that makes shaving feel great. I've been a member
link |
00:03:11.840
of Dollar Shave Club for over five years and actually signed up when I first heard about them
link |
00:03:16.720
on the Joe Rogan Experience podcast. And now, friends, we have come full circle. It feels like
link |
00:03:23.280
I made it now that I can do a read for them just like Joe did all those years ago. Back when he
link |
00:03:28.800
also did ads for some less reputable companies. Let's say that you know about if you're a true fan
link |
00:03:37.520
of the old school podcasting world. Anyway, I just use the razor and the refills, but they told
link |
00:03:43.120
me I should really try out the shave butter. I did. I love it. It's translucent somehow,
link |
00:03:49.280
which is a cool new experience. Again, try the ultimate shave starter set today for just five
link |
00:03:55.440
bucks plus free shipping at dollarshaveclub.com slash lex. This show is also sponsored by DoorDash.
link |
00:04:03.200
You get $5 off and zero delivery fees on your first order of 15 bucks or more. When you download
link |
00:04:08.720
the DoorDash app and enter code, you guessed it, Lex. I have so many memories of working late
link |
00:04:15.520
nights for a deadline with a team of engineers, whether that's for my PhD at Google or MIT,
link |
00:04:21.920
and eventually taking a break to argue about which DoorDash restaurant to order from. And when the
link |
00:04:27.040
food came, those moments of bonding, of exchanging ideas, of pausing, to shift attention from the
link |
00:04:32.800
programs to the humans, are special. For a bit of time, I'm on my own now, so I miss that camaraderie,
link |
00:04:39.680
but actually I still use DoorDash a lot. There's a million options that fit into my crazy keto
link |
00:04:45.760
diet ways. Also, it's a great way to support restaurants in these challenging times. Once
link |
00:04:51.200
again, download the DoorDash app and enter code, Lex, to get $5 off and zero delivery fees on your
link |
00:04:57.120
first order of $15 or more. Finally, this show is presented by Cash App, the number one finance
link |
00:05:03.040
app in the App Store. I can truly say that they're an amazing company, one of the first sponsors,
link |
00:05:08.880
if not the first sponsor to truly believe in me. And I think quite possibly the reason I'm still
link |
00:05:15.680
doing this podcast, so I'm forever grateful to Cash App. So thank you. And as I said many times
link |
00:05:23.040
before, use code LexPodcast when you download the app from Google Play or the App Store. Cash App
link |
00:05:30.240
lets you send money to friends, buy Bitcoin, and invest in the stock market with as little as $1.
link |
00:05:36.400
I usually say other stuff here in the read, but I wasted all that time up front saying how
link |
00:05:40.640
grateful I am to Cash App. I'm going to try to go off the top of my head a little bit more for these
link |
00:05:46.080
reads, because I'm actually very lucky to be able to choose the sponsors that we take on.
link |
00:05:51.120
And that means I can really only take on the sponsors that I truly love,
link |
00:05:55.200
and then I can just talk about why I love them. So it's pretty simple. Again, get Cash App from
link |
00:06:00.560
the App Store, Google Play, use code LexPodcast, get $10, and Cash App will also donate $10 to
link |
00:06:06.480
FIRST, an organization that is helping to advance robotics and STEM education
link |
00:06:10.880
for young people around the world. And now here's my conversation with Eugenia Kuyda.
link |
00:06:18.160
Okay, before we talk about AI and the amazing work you're doing, let me ask you ridiculously,
link |
00:06:23.360
we're both Russian, so let me ask you a ridiculously romanticized Russian question.
link |
00:06:28.560
Do you think human beings are alone, fundamentally, on a philosophical level? In our
link |
00:06:38.720
existence, when we go through life, do you think just the nature of our life is loneliness?
link |
00:06:50.240
Yeah, so we have to read Dostoevsky at school, as you probably know.
link |
00:06:54.480
In Russian? Yeah. I mean, it's part of your school program. So I guess if you read that,
link |
00:07:00.880
then you sort of have to believe that. You're made to believe that you're fundamentally alone,
link |
00:07:06.400
and that's how you live your life. How do you think about it? You have a lot of friends,
link |
00:07:11.040
but at the end of the day, do you have like a longing for connection with other people that's
link |
00:07:18.800
maybe another way of asking it? Do you think that's ever fully satisfied?
link |
00:07:21.840
I think we are fundamentally alone. We're born alone. We die alone, but I view my whole life
link |
00:07:29.680
as trying to get away from that, trying to not feel lonely. And again, we're talking about a
link |
00:07:36.320
subjective way of feeling alone. It doesn't necessarily mean that you don't have any connections,
link |
00:07:42.000
or you are actually isolated. You think it's a subjective thing, but again,
link |
00:07:49.120
another absurd measurement wise thing. How much loneliness do you think there is in the world?
link |
00:07:54.880
So like, if you see loneliness as a condition, how much of it is there? Do you think? I guess
link |
00:08:06.160
how many, there's all kinds of studies and measures of how many people in the world feel
link |
00:08:12.240
alone. There's all these measures of how many people are, self report, or just all these kinds
link |
00:08:18.640
of different measures. But in your own perspective, how big of a problem do you think it is, sizewise?
link |
00:08:27.440
Well, I'm actually fascinated by the topic of loneliness. I try to read about it as much as I
link |
00:08:32.400
can. I think there's a paradox, because loneliness is not a clinical disorder. It's not
link |
00:08:40.400
something that you can get your insurance to pay for if you're struggling with that.
link |
00:08:43.840
Yet, it's actually been proven in tons of papers, tons of research around that. It is proven
link |
00:08:51.760
that it's correlated with lower life expectancy, a shorter lifespan. In a way, right now, what
link |
00:09:00.320
scientists would say that it's a little bit worse than being obese, or not actually doing any
link |
00:09:06.000
physical activity in your life. In terms of the impact on your health.
link |
00:09:08.400
In terms of impact on your physiological health, yeah. So it basically puts you,
link |
00:09:12.400
if you're constantly feeling lonely, your body responds like it's basically all the time under
link |
00:09:18.080
stress. So it's always in this alert state. So it's really bad for you because it actually
link |
00:09:24.560
drops your immune system, and your response to inflammation is quite different. So all the
link |
00:09:30.800
cardiovascular diseases, and it actually affects your response to viruses, so it's much easier to catch a virus.
link |
00:09:36.240
That's sad now that we're living in a pandemic and it's probably making us a lot more alone.
link |
00:09:42.720
And it's probably weakening the immune system, making us more susceptible to the virus.
link |
00:09:47.440
It's kind of sad.
link |
00:09:50.880
Yeah, the statistics are pretty horrible around that. So around 30% of all millennials report
link |
00:09:57.600
that they're feeling lonely constantly. 30%.
link |
00:09:59.920
30%. And then it's much worse for Gen Z. And then 20% of millennials say that they feel lonely.
link |
00:10:04.720
And they also don't have any close friends. And then I think 25% or so and then 20% would say
link |
00:10:10.560
they don't even have acquaintances. That's in the United States?
link |
00:10:13.760
That's in the United States. And I'm pretty sure that it's much worse everywhere else.
link |
00:10:17.680
Like in the UK, I mean, it was widely tweeted and posted when they were talking about a minister
link |
00:10:24.640
of loneliness that they wanted to appoint, because four out of 10 people in the UK feel lonely.
link |
00:10:29.280
I think that thing actually exists. So yeah, you will die sooner if you are lonely.
link |
00:10:40.880
And again, this is only when we're only talking about your perception of loneliness or feeling
link |
00:10:46.080
lonely. That is not objectively being fully socially isolated. However, the combination
link |
00:10:52.080
of being fully socially isolated and not having many connections and also feeling lonely,
link |
00:10:57.280
that's pretty much a deadly combination. So it strikes me bizarre or strange that
link |
00:11:04.080
this is a widely known fact. And then there's really no one really working on that because it's
link |
00:11:11.120
subclinical. It's not clinical. It's not something that you will tell your doctor and get a
link |
00:11:15.680
treatment or something. Yet it's killing us. Yeah. So there's a bunch of people trying to
link |
00:11:21.760
evaluate, like try to measure the problem by looking at like how social media is affecting
link |
00:11:26.720
loneliness and all that kind of stuff. So it's like measurement. Like if you look at the field of
link |
00:11:31.040
psychology, they're trying to measure the problem and not that many people actually, but some,
link |
00:11:36.560
but you're basically saying how many people are trying to solve the problem.
link |
00:11:43.040
Like how would you try to solve the problem of loneliness? Like if you just stick to humans,
link |
00:11:50.560
I mean, or basically not just the humans, but the technology that connects us humans,
link |
00:11:56.560
do you think there's a hope for that technology to do the connection? Like are you on social media
link |
00:12:03.680
much? Unfortunately. Do you find yourself like, again, if you're sort of introspect about
link |
00:12:11.760
how connected you feel to other human beings, how not alone you feel, do you think social
link |
00:12:16.160
media makes it better or worse? Maybe for you personally or in general?
link |
00:12:22.800
I think it's easier to look at some stats. And I mean, Gen Z
link |
00:12:28.080
seems to be much lonelier than millennials in terms of however they report
link |
00:12:32.160
loneliness. They're definitely the most connected generation in the world. I mean,
link |
00:12:37.040
I still remember life without an iPhone, without Facebook. They don't know that that ever existed
link |
00:12:43.200
or at least don't know how it was. So that tells me a little bit about the fact that
link |
00:12:50.480
maybe this hyperconnected world might actually make people feel lonelier. I don't
link |
00:12:58.640
know exactly what the measurements are around that, but I would say in my personal experience,
link |
00:13:03.440
I think it does make you feel a lot lonelier. Mostly, yeah, we're all super connected. But
link |
00:13:09.120
I think loneliness, the feeling of loneliness doesn't come from not having any social connections
link |
00:13:14.480
whatsoever. Again, tons of people that are in long term relationships experience bouts of
link |
00:13:20.240
loneliness and continued loneliness. And it's more the question about the true connection about
link |
00:13:26.080
actually being deeply seen, deeply understood. And in a way, it's also about your relationship
link |
00:13:33.040
with yourself. In order to not feel lonely, you actually need to have a better relationship and
link |
00:13:39.760
feel more connected to yourself, and then this feeling actually starts to go away a little bit. And then
link |
00:13:44.720
you open up yourself to actually meeting other people in a very special way, not just in an add a friend
link |
00:13:53.040
on Facebook kind of way. So just to briefly touch on it, I mean, do you think it's possible to form
link |
00:13:59.440
that kind of connection with AI systems more down the line of some of your work? Do you think that's
link |
00:14:10.640
engineering wise, a possibility to alleviate loneliness is not with another human, but with
link |
00:14:17.760
an AI system? Well, I know that for a fact. That's what we're doing. And we see it and we
link |
00:14:24.080
measure that and we see how people start to feel less lonely, talking to their virtual AI friend.
link |
00:14:32.640
So basically a chatbot at the basic level, but could be more, like, do you have, I'm not even
link |
00:14:38.080
speaking sort of about specifics, but do you have a hope, like, if you look 50 years from now,
link |
00:14:46.080
do you have a hope that there's just like, AI's that are, like, optimized for, let me let me first
link |
00:14:53.040
start, like, right now, the way people perceive AI, which is recommender systems for Facebook and
link |
00:14:59.520
Twitter, social media, they see AI as basically destroying, first of all, the fabric of our
link |
00:15:05.360
civilization, but second of all, making us more lonely. Do you see like a world where it's possible
link |
00:15:10.240
to just have AI systems floating about that, like, make our life less lonely? Yeah, make us happy,
link |
00:15:19.920
like, are putting good things into the world in terms of our individual lives.
link |
00:15:26.320
Yeah, I totally believe it. And that's why we're also working on that.
link |
00:15:31.840
I think we need to also make sure that what we're trying to optimize for we're actually measuring.
link |
00:15:37.680
And it is a north star metric that we're going after. And all of our product and all of our
link |
00:15:41.760
business models are optimized for that. Because you can talk, you know, a lot of products that talk
link |
00:15:46.320
about, you know, making you feel less lonely, making you feel more connected, they're not really
link |
00:15:51.600
measuring that. So they don't really know whether their users are actually feeling less lonely in
link |
00:15:56.320
the long run, or feeling more connected in the long run. So I think it's really important to
link |
00:16:01.360
measure it. Yeah, to measure it. What's a good measurement of loneliness?
link |
00:16:07.600
Well, so that's something that I'm really interested in. How do you measure that people
link |
00:16:12.000
are feeling better, or that they're feeling less lonely? With loneliness, there's a scale,
link |
00:16:16.320
there's the UCLA-20 and, more recently, the UCLA-3 scale, which is basically a questionnaire that you fill
link |
00:16:21.360
out. And you can see whether in the long run, it's improving or not. And does it capture the
link |
00:16:28.240
momentary feeling of loneliness? Does it look in like the past month? Like, is it basically
link |
00:16:37.280
self report? Does it try to sneak up on you, trick you into answering honestly, or something like that?
link |
00:16:44.080
Well, yeah, I'm not familiar with the questionnaire. It is just asking you a few questions
link |
00:16:47.600
like, how often did you feel lonely, or how often do you feel connected to other people in
link |
00:16:52.400
the last couple of weeks? It's similar to the self report questionnaires for depression and anxiety,
link |
00:16:59.840
like PHQ-9 and GAD-7. Of course, as with any self report questionnaire, that's not necessarily very
link |
00:17:07.760
precise or very well measured. But still, if you take a big enough population, you get them through
link |
00:17:14.400
these questionnaires, you can see a positive dynamic. And so you basically, you put
link |
00:17:21.440
people through questionnaires to see, like, is this thing we're creating making
link |
00:17:26.640
people happier? Yeah, we measure. So we measure two outcomes, one short term, right after the
link |
00:17:32.880
conversation, we ask people whether this conversation made them feel better, worse or same.
link |
00:17:40.720
This metric right now is at 80%. So 80% of all our conversations make people feel better.
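A minimal sketch of how these two measurements could be computed, purely illustrative and not Replika's actual implementation; the 1-3 item scoring and the cutoff of 6 for the UCLA-3, and the names used below, are assumptions:

from typing import List

# UCLA-3 loneliness items are answered 1 = hardly ever, 2 = some of the time, 3 = often.
# Totals range 3-9; a total of 6 or higher is a commonly used "lonely" cutoff (assumed here).
def ucla3_score(answers: List[int]) -> int:
    assert len(answers) == 3 and all(1 <= a <= 3 for a in answers)
    return sum(answers)

# Fraction of conversations rated "better" (vs. "same" or "worse") right after they end.
def share_feeling_better(ratings: List[str]) -> float:
    return sum(r == "better" for r in ratings) / len(ratings)

# Example: a respondent scoring 7 is above the assumed cutoff of 6, and 4 of 5
# conversations rated "better" gives the 80% figure mentioned above.
print(ucla3_score([2, 3, 2]))                                                   # 7
print(share_feeling_better(["better", "better", "same", "better", "better"]))   # 0.8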
link |
00:17:46.000
But I should have done the questionnaire with you. Maybe you feel a lot worse after we've done this
link |
00:17:50.480
conversation. That's actually fascinating. I should probably do that.
link |
00:17:57.200
You should totally. And aim for 80%, aim to outperform your current state of the art AI system
link |
00:18:07.840
in these human conversations. So we'll get to your work with Replika. But let me continue on the
link |
00:18:13.840
line of absurd questions. So you talked about, you know, deep connection with humans, deep connection
link |
00:18:20.640
with AI, meaningful connection. Let me ask about love, people make fun of me because I talk about
link |
00:18:26.560
love all the time. But what, what do you think love is? Like maybe in the context of a meaningful
link |
00:18:35.920
connection with somebody else, do you draw a distinction between love, like friendship,
link |
00:18:41.600
and Facebook friends? Or is it a gradient? Is it all the same? Or, like, is it just a
link |
00:18:53.120
gradual thing? Or is there something fundamental about us humans that seek like a really deep
link |
00:18:58.640
connection with another human being? And what is that? What is love, Eugenia?
link |
00:19:07.280
I'm going to just enjoy asking you these questions, seeing you struggle.
link |
00:19:14.400
Thanks. Well, the way I see it, specifically the way it relates to our work and the way
link |
00:19:23.520
it inspired our work on Replika. I think one of the biggest and the most precious gifts
link |
00:19:31.280
we can give to each other now in 2020 as humans is this gift of deep empathetic understanding,
link |
00:19:39.600
the feeling of being deeply seen. Like what does that mean? Like that you exist? Like somebody
link |
00:19:45.520
acknowledging that? Somebody seeing you for who you actually are. And that's extremely,
link |
00:19:50.720
extremely rare. I think that is that combined with unconditional positive regard,
link |
00:19:55.760
belief and trust that you internally are always inclined for positive growth, and believing in you
link |
00:20:04.560
in this way, letting you be a separate person at the same time. And this deep empathetic
link |
00:20:10.880
understanding, for me, that's the combination that really creates something special, something
link |
00:20:18.560
that people, when they feel it once, they will always long for it again and something that starts
link |
00:20:25.920
huge fundamental changes in people. When we see that someone accepts us so deeply,
link |
00:20:32.960
we start to accept ourselves and the paradox is that's when big changes start happening,
link |
00:20:39.840
big fundamental changes in people start happening. So I think that is the ultimate therapeutic
link |
00:20:44.160
relationship that is, and that might be in some way a definition of love. So acknowledging that
link |
00:20:52.240
there's a separate person and accepting you for who you are. Now, on a slightly, and you mentioned
link |
00:21:01.600
therapeutic, that sounds very like a very healthy view of love, but is there also like a, like,
link |
00:21:08.640
you know, if we look at heartbreak and, you know, most love songs are probably about heartbreak,
link |
00:21:16.160
right? Is that like the mystery, the tension, the danger, the fear of loss, you know, all of that,
link |
00:21:25.680
what people might see in the negative light as like games or whatever, but just the dance of
link |
00:21:33.120
human interaction, yeah, fear of loss and fear of like, you said, like, once you feel it once,
link |
00:21:40.960
you long for it again, but you also, once you feel it once, you might, for many people,
link |
00:21:46.560
they've lost it. So they fear losing it, they feel loss. So is that part of it? Like, you're
link |
00:21:53.680
speaking like beautifully about like the positive things, but is it important to be able to be
link |
00:22:00.000
afraid of losing it from an engineering perspective?
link |
00:22:06.240
I mean, it's a huge part of it. And unfortunately, we all, you know, face it at some points in our
link |
00:22:12.400
lives. I mean, I did. You want to go into details? How'd you get your heart broken?
link |
00:22:18.000
Sure. Well, mine is pretty straightforward there.
link |
00:22:23.440
I did have a friend that was, you know, that at some point in my 20s, became really, really close
link |
00:22:33.120
to me. And we became really close friends. I grew up pretty lonely. So in many ways when I'm building,
link |
00:22:39.120
you know, these, these AI friends, I think about myself when I was 17, writing horrible poetry and,
link |
00:22:43.920
you know, on my dial up modem at home. And, you know, that was the feeling that I grew up with.
link |
00:22:50.640
I was alone for a long time as a teenager. Where did you grow up?
link |
00:22:54.240
In Moscow, in the outskirts of Moscow. So I just skateboarded during the day and come back home
link |
00:23:00.400
and, you know, connect to the internet. And write poetry? And then write horrible poetry.
link |
00:23:05.040
Was it love poems? All sorts of poems, obviously love poems. I mean, what other poetry can you
link |
00:23:10.720
write when you're 17? Could be political or something. But yeah. But that was, you know,
link |
00:23:15.840
that was kind of my, yeah, like deeply influenced by Joseph Brodsky and like all sorts of poets that
link |
00:23:23.920
every 17 year old will be looking at and reading. But yeah, that was my,
link |
00:23:30.640
these were my teenage years. And I just never had a person that I thought would, you know,
link |
00:23:35.840
take me as I am, would accept me the way I am. And I just thought, you know, working and just
link |
00:23:42.320
doing my thing and being angry at the world and being a reporter, I was an investigative reporter
link |
00:23:46.720
working undercover and writing about people was my way to connect with, you know, with,
link |
00:23:52.720
with others. I was deeply curious about every, everyone else. And I thought that, you know,
link |
00:23:57.920
if I, if I go out there, if I write their stories, that means I'm more connected.
link |
00:24:02.720
This is what this podcast is about, by the way. I'm desperate. I'm seeking connection.
link |
00:24:06.880
I'm just kidding. Or am I, I don't know. So what, wait, reporter,
link |
00:24:14.640
what, how did that make you feel more connected? I mean, you're still fundamentally pretty alone.
link |
00:24:21.840
But you're always with other people, you know, you're always thinking about what other
link |
00:24:25.760
place am I going to infiltrate, what other community can I write about, what other phenomenon can I
link |
00:24:31.600
explore? And you're sort of like a trickster, you know, like a mythological character,
link |
00:24:37.280
like a creature that's just jumping between all sorts of different worlds
link |
00:24:41.440
and feeling sort of okay in all of them. So that was my dream job, by the way. That was like,
link |
00:24:47.520
totally what I would have been doing if Russia was a different place.
link |
00:24:52.720
And a little bit undercover. So like you weren't, you were trying to, like you said, mythological
link |
00:24:58.160
creature trying to infiltrate. So try to be a part of the world. What are we talking about?
link |
00:25:02.480
What kind of things did you enjoy writing about? I'd go work at a strip club or go.
link |
00:25:11.680
Awesome. Okay. Well, I'd go work at a restaurant or just go write about, you know,
link |
00:25:18.960
certain phenomena or people in the city. And what, sorry to keep interrupting.
link |
00:25:25.200
I'm the worst conversationalist. What stage of Russia is this? What, is this pre Putin,
link |
00:25:34.640
post Putin? What, what was Russia like? Pre Putin is really long ago. This is Putin era.
link |
00:25:43.680
That's the beginning of 2000s, 2010, 2007, eight, nine, 10. What were strip clubs like in Russia
link |
00:25:51.840
and restaurants and culture and people's minds like in that early Russia that you were covering?
link |
00:25:58.960
In those early 2000s, this was, there was still a lot of hope. There were still tons of hope that
link |
00:26:05.760
you know, we're sort of becoming this western, westernized society. The restaurants were opening,
link |
00:26:13.600
we were really looking at, you know, we're trying, we're trying to copy a lot of things from,
link |
00:26:19.280
from the US, from Europe, bringing all these things in. Very enthusiastic about that.
link |
00:26:25.760
So there was a lot of, you know, stuff going on. There was a lot of hope and dream for this,
link |
00:26:30.080
you know, new Moscow that would be similar to, I guess, New York. I mean, to give you an idea, the
link |
00:26:38.080
year 2000 was the year when we had two movie theaters in Moscow. And there was this one first
link |
00:26:44.240
coffee house that opened and it was like a really big deal. By 2010, there were all sorts of things
link |
00:26:50.400
everywhere. Almost like a chain, like a Starbucks type of coffee house, you mean? Oh yeah,
link |
00:26:55.680
like a Starbucks. I mean, I remember we were reporting on like, we were writing about the
link |
00:27:00.080
opening of Starbucks. I think in 2007, that was one of the biggest things that happened in,
link |
00:27:04.480
you know, in Moscow back at the time. Like that was worthy of a magazine cover and
link |
00:27:09.920
that was definitely the, you know, the biggest talk of the town. Yeah, when was McDonald's?
link |
00:27:15.120
Because I was still in Russia when McDonald's opened. That was in the 90s. I mean, yeah.
link |
00:27:19.440
Oh yeah, I remember that very well. Yeah. Those were long, long lines. I think it was 1993 or
link |
00:27:25.840
four, I don't remember. Did you actually go to McDonald's at that time? Did you do that?
link |
00:27:31.760
I mean, that was a luxurious outing. That was definitely not something you do every day.
link |
00:27:35.600
And also the line was at least three hours. So if you're going to McDonald's, that is not fast
link |
00:27:39.600
food. That is like at least three hours in line. Yeah. And then no one is trying to eat fast after
link |
00:27:44.320
that. Everyone is like trying to enjoy as much as possible. What's your memory of that? Oh,
link |
00:27:50.320
it was insane. Positive? Extremely positive. It's a small strawberry milkshake and a hamburger and
link |
00:27:57.120
small fries and my mom's there and sometimes I'll just, because I was really little, they'll just
link |
00:28:02.080
let me run, you know, up to the cashier and like cut the line, which is like, you cannot really do that
link |
00:28:08.720
in Russia. So like for a lot of people, like a lot of those experiences might seem not very
link |
00:28:16.400
fulfilling, you know, like it's on the verge of poverty, I suppose. But do you remember all that
link |
00:28:24.720
time fondly? Like, because I do, like the first time I drank, you know, Coke, you know, all that
link |
00:28:32.720
stuff, right? And just, yeah, the connection with other human beings in Russia, I remember,
link |
00:28:40.720
I remember really positively. Like, how do you remember what the 90s and then the Russia you
link |
00:28:47.280
were covering, just the human connections you had with people and the experiences?
link |
00:28:54.080
Well, my, my parents were both, both physicists, my grandparents were both, well, my grandfather
link |
00:29:01.120
was a nuclear physicist, a professor at the university, my dad worked at Chernobyl when
link |
00:29:08.160
I was born, in Chernobyl, analyzing kind of everything after the explosion. And then I remember
link |
00:29:16.880
and they were so they were making sort of enough money in the Soviet Union, so they were not,
link |
00:29:21.040
you know, extremely poor or anything. It was pretty prestigious to be a professor,
link |
00:29:24.960
uh, a dean at the university. And then I remember my grandfather started making $100 a month
link |
00:29:32.480
after, you know, in the 90s. So then I remember our main line of work would be to go to
link |
00:29:38.480
our little tiny country house, get a lot of apples there from apple trees, bring them back to,
link |
00:29:46.640
to the city and sell them on the street. So me and my nuclear physicist grandfather were
link |
00:29:53.840
just standing there and he'd be selling those apples the whole day because that would make
link |
00:29:57.520
you more money than, you know, working at the university. And then he'd just try to
link |
00:30:02.960
teach me, um, you know, something about planets and whatever the particles and stuff. And, you know,
link |
00:30:11.040
I'm not smart at all. So I could never understand anything. But I was interested in a journalist
link |
00:30:15.920
kind of way. But that was my memory. And you know, I'm happy that I wasn't, um, I
link |
00:30:21.760
I somehow got spared that I was probably too young to remember any of the traumatic stuff. So
link |
00:30:27.760
the only thing I really remember that was very traumatic: I had this bootleg
link |
00:30:32.560
Nintendo, which was called Dendy in Russia. So 1993, there was nothing to eat. Like
link |
00:30:37.680
even if you had any money, you would go to the store and there was no food. I don't know if you
link |
00:30:41.040
remember that. And our friend had a restaurant like a government, half government owned something
link |
00:30:49.040
restaurant. So they always had supplies. So he exchanged a big bag of wheat for this Nintendo,
link |
00:30:56.400
the bootleg Nintendo. And then I remember it very fondly because I think I was nine or
link |
00:31:03.120
something like that, or seven. We just got it. And I was playing it. And there
link |
00:31:08.960
was this, you know, Dendy TV show. Yeah. So in a positive sense, you mean? Like a definitive
link |
00:31:16.400
Well, they took it away and gave me a bag of wheat instead. And I cried like my eyes out for
link |
00:31:21.280
days and days and days. Oh, no. And then, you know, as a, and my dad said, we're going to like
link |
00:31:27.680
exchange it back in a little bit. So you keep the little gun, you know, the one that you shoot
link |
00:31:32.000
the ducks with. So I'm like, okay, I'm keeping the gun, so at some point it's going to come back.
link |
00:31:35.760
But then they exchanged the gun as well for some sugar or something. I was so pissed. I was like,
link |
00:31:41.920
I didn't want to eat for days after that. I'm like, I don't want your food. Give me my Nintendo back.
link |
00:31:47.360
That was extremely traumatic. But, you know, I was happy that that was my only traumatic
link |
00:31:53.280
experience. You know, my dad had to actually go to Chernobyl with a bunch of 20 year olds. He was
link |
00:31:57.680
20 when he went to Chernobyl. And that was right after the explosion. No one knew anything. The
link |
00:32:03.280
whole crew he went with, all of them are dead now. I think there was this one guy who was
link |
00:32:07.600
still alive until the last few years. I think he died a few years ago now.
link |
00:32:13.760
My dad somehow luckily got back earlier than everyone else. But just the fact that that was the,
link |
00:32:20.080
and I was always like, well, how did they send you? I was only, I was just born, you know,
link |
00:32:23.760
you had a newborn talk about paternity leave. And like, but that's who they took because they
link |
00:32:28.640
didn't know whether you would be able to have kids when you come back. So they took the ones with
link |
00:32:32.560
kids. So him with some guys went to, and I'm just thinking of me, when I was 20, I was so
link |
00:32:39.920
sheltered from any problems whatsoever in life. And then my dad, his 21st birthday at the reactor,
link |
00:32:49.120
you like work three hours a day, you sleep the rest. And, and I, yeah, so I played with a lot
link |
00:32:54.320
of toys from Chernobyl. What are your memories of Chernobyl in general, like the bigger context,
link |
00:33:01.840
you know, because of that HBO show is the world's attention turned to it once again.
link |
00:33:08.560
Like, what are your thoughts about Chernobyl? Did Russia screw that one up? Like,
link |
00:33:14.560
you know, there's probably a lot of lessons about our modern times with
link |
00:33:18.080
data about coronavirus and all that kind of stuff. It seems like there's a lot of misinformation.
link |
00:33:22.720
There's a lot of people kind of trying to hide whether they screwed something up or not,
link |
00:33:30.320
as it's very understandable, it's very human, very wrong, probably. But obviously,
link |
00:33:35.520
Russia was probably trying to hide that they screwed things up. Like, what are your thoughts
link |
00:33:42.320
about that time, personal and general? I mean, I was born when the explosion happened. So,
link |
00:33:50.800
actually a few months after, so of course, I don't remember anything, apart from the fact that my
link |
00:33:55.440
dad would bring me tiny toys, plastic things that would just go crazy, haywire when you, you know,
link |
00:34:02.880
put the Geiger counter thing to it. My mom was like, just nuclear about that. She was like, what are you
link |
00:34:11.040
bringing? You should not do that. She was nuclear, very nice. Absolutely. Well, but yeah, but the
link |
00:34:19.360
TV show was just phenomenal. It's definitely, first of all, it's incredible how that was made
link |
00:34:27.840
not by the Russians, but someone else, but capturing so well everything about our country.
link |
00:34:37.120
It felt a lot more genuine than most of the movies and TV shows that are made now in Russia,
link |
00:34:41.200
just so much more genuine. And most of my friends in Russia were just in complete awe about the
link |
00:34:46.080
show, but I think the, how good of a job they did, phenomenal, but also the apartments. There's
link |
00:34:51.280
something. Yeah, the set design. I mean, Russians can't do that. We, you know, but you see everything
link |
00:34:57.360
and it's like, well, that's exactly how it was. So I don't know that show. I don't know what to
link |
00:35:04.160
think about that because it's British accents, British actors of a person. I forgot who created
link |
00:35:11.280
the show. I'm not, but I remember reading about him and he's not, he doesn't even feel like,
link |
00:35:17.040
like there's no Russia in this history. No, he did like super bad or something like that. Or like,
link |
00:35:22.720
I don't know, whatever that thing about the bachelor party in Vegas,
link |
00:35:27.120
number four and five or something were the ones that he worked with. Yeah. But so he,
link |
00:35:32.160
it made me feel really sad for some reason that if a person, obviously a genius could go in and
link |
00:35:39.040
just study and just pay extreme attention to detail, they can do a good job. It made me think like,
link |
00:35:48.160
why don't other people do a good job of this? Like about Russia, like there's so little about
link |
00:35:55.200
Russia. There are so few good films about the Russian side of World War II. I mean, there are so many
link |
00:36:04.320
interesting, evil, and also beautiful moments in the history of the 20th century in Russia
link |
00:36:12.080
that it feels like there are not many good films on them from the Russian side. You would expect something from
link |
00:36:17.840
the Russians. Well, they keep making these propaganda movies now. Oh, no. Unfortunately.
link |
00:36:23.520
But yeah, Chernobyl was such a perfect TV show. I think capturing really well,
link |
00:36:27.920
it's not about like even the set design, which was phenomenal, but just capturing all the problems
link |
00:36:33.360
that exist now with the country and like focusing on the right things. Like if you build the whole
link |
00:36:38.800
country on a lie, that's what's going to happen. And that's just this very simple kind of thing.
link |
00:36:47.920
Yeah. And did your dad talk about it to you? Like his thoughts on the experience?
link |
00:36:55.600
He never talks. He's this kind of Russian man that just, my husband who's American,
link |
00:37:00.880
and he asked him a few times like, Igor, how did you, but why did you say yes? Or like,
link |
00:37:07.200
why did you decide to go? You could have said no, not gone to Chernobyl. But for a person like
link |
00:37:12.640
that, that's what you do. You cannot say no. Yeah. It's just like a Russian way.
link |
00:37:21.280
It's the Russian way. Men don't talk that much. There are downsides and upsides to that.
link |
00:37:25.360
Yeah. That's the truth. Okay. So back to post Putin Russia.
link |
00:37:34.160
Or maybe we skipped a few steps along the way, but you were trying to do, to be a journalist
link |
00:37:42.720
in that time. What was Russia like at that time? Post, as you said, 2007, Starbucks type of thing.
link |
00:37:50.720
What else? What else was Russia like then? I think there was just hope. There was this
link |
00:37:57.280
big hope that we're going to be friends with the United States and we're going to be friends with
link |
00:38:03.920
Europe and we're just going to also be a country like those, with bike lanes and parks and
link |
00:38:11.360
everything's going to be urbanized. Again, we're talking about 90s where people would be shot in
link |
00:38:15.920
the street and I sort of have a fond memory of going into a movie theater and coming out of it
link |
00:38:22.160
after the movie, and the guy that I saw on the stairs had just been shot. Again, it was like
link |
00:38:28.080
a thing in the 90s that would be happening. People were getting shot here and there.
link |
00:38:34.240
Tons of violence. Tons of just basically mafia mobs in the streets. And then the 2000s were
link |
00:38:41.120
like things just got cleaned up. Oil went up and the country started getting a little bit richer.
link |
00:38:48.560
The 90s were so grim mostly because the economy was in shambles and oil prices were not high,
link |
00:38:54.720
so the country didn't have anything. We defaulted in 1998 and the money kept jumping back and
link |
00:39:01.200
forth. First, there were millions of rubles, then it got to like thousands, then it was
link |
00:39:07.040
one ruble and something, then again to millions. It was like crazy town.
link |
00:39:12.720
And then the 2000s were just these years of stability in a way and the country getting
link |
00:39:19.600
a little bit richer because of, again, oil and gas. And we started to look at specifically in
link |
00:39:26.080
Moscow and St. Petersburg to look at other cities in Europe and New York and U.S. and
link |
00:39:32.320
trying to do the same in our small kind of city towns there.
link |
00:39:37.360
What were your thoughts of Putin at the time?
link |
00:39:40.880
Well, in the beginning, he was really positive. Everyone was very positive about Putin. He was
link |
00:39:46.160
young. He was very energetic. He also immediately, somewhat compared to, well, that was not way
link |
00:39:55.520
before the shirtless era. The shirtless era. Okay, so he didn't start off shirtless. When did
link |
00:40:00.960
the shirtless era start? It's like the propaganda of riding a horse, fishing.
link |
00:40:04.800
2010, 11, 12.
link |
00:40:06.400
Yeah. That's my favorite. People talk about their favorite Beatles. That's my favorite
link |
00:40:14.160
Putin, the shirtless Putin.
link |
00:40:15.680
No, I remember very, very clearly 1996 where Americans really helped Russia with elections
link |
00:40:21.440
and Yeltsin got reelected, thankfully so, because there was a huge threat that actually
link |
00:40:27.760
the communists would get back to power. They were a lot more popular, and then a lot of American
link |
00:40:34.400
experts, political experts and campaign experts descended on Moscow and helped Yeltsin actually
link |
00:40:41.520
get the presidency, the second term for the presidency, but Yeltsin was not feeling great
link |
00:40:48.960
by the end of his second term. He was an alcoholic. He was really old. He was falling off
link |
00:40:56.080
the stages when he was talking. People were looking for a fresh face for someone who's
link |
00:41:04.400
going to continue Yeltsin's work, but who's going to be a lot more energetic and a lot more
link |
00:41:11.360
active, young, efficient, maybe. That's what we all saw in Putin back in the day.
link |
00:41:17.600
I'd say that everyone, absolutely everyone in Russia in early 2000s who was not a communist
link |
00:41:22.960
would be, yeah, Putin's great. We have a lot of hopes for him.
link |
00:41:26.720
What are your thoughts, and I promise we'll get back to, first of all, your love story
link |
00:41:34.000
and second of all, AI. Well, what are your thoughts about communism? The 20th century,
link |
00:41:41.440
I apologize. I'm reading The Rise and Fall of the Third Reich. Oh my God.
link |
00:41:47.040
So I'm really steeped into World War II and Stalin and Hitler and just these dramatic
link |
00:41:57.520
personalities that brought so much evil to the world. But it's also interesting to politically
link |
00:42:03.600
think about these different systems and what they've led to. Russia is one of the beacons
link |
00:42:12.640
of communism in the 20th century. What are your thoughts about communism having
link |
00:42:17.840
experienced it as a political system? I mean, I have only experienced it a little bit,
link |
00:42:22.320
but mostly through stories and through seeing my parents and my grandparents who lived through that.
link |
00:42:27.920
I mean, it was horrible. It was just plain horrible. It was just awful.
link |
00:42:32.240
You think there's something... I mean, it sounds nice on paper. So the drawbacks of
link |
00:42:43.200
capitalism are that eventually, it's sort of a slippery slope. Eventually, it creates
link |
00:42:52.000
a disparity, a wealth inequality. I guess it's hypothetical at this point, but
link |
00:43:07.600
eventually capitalism leads to humongous inequality. And then some people argue that
link |
00:43:12.480
that's a source of unhappiness. It's not like absolute wealth of people. It's the fact that
link |
00:43:18.320
there's a lot of people much richer than you. There's a feeling of like, that's where unhappiness
link |
00:43:24.320
can come from. So the idea of communism, or this sort of Marxism, is to not allow that kind of
link |
00:43:33.200
slippery slope. But then you see the actual implementations of it and stuff seems to go
link |
00:43:39.440
wrong very badly. What do you think that is? Why does it go wrong? What is it about human
link |
00:43:47.360
nature? If you look at Chernobyl, these kinds of bureaucracies that were constructed,
link |
00:43:54.480
is there something... Do you think about this much of why it goes wrong?
link |
00:44:00.960
Well, it's not that everyone was equal. Obviously, the government and everyone close to that were
link |
00:44:10.160
the bosses. So it's not like fully, I guess, this dream of equal life. So then I guess the
link |
00:44:21.040
situation that Russia had in the Soviet Union, it was more a bunch of really poor people without
link |
00:44:28.080
any way to make any significant fortune or build anything living under constant surveillance.
link |
00:44:36.800
Surveillance from other people, like you can't even do anything that's not fully approved by the
link |
00:44:44.960
dictatorship, basically. Otherwise, your neighbor will write a letter and you'll go to jail.
link |
00:44:51.600
Absolute absence of actual law. It's a constant state of fear. You didn't own anything. You
link |
00:45:01.840
couldn't go travel. You couldn't read anything western or you couldn't make a career, really,
link |
00:45:07.840
unless you're working in the military complex, which is why most of the scientists were so well
link |
00:45:14.880
regarded. Both my dad and my mom come from families of scientists, and they were
link |
00:45:20.720
really well regarded, as you know, obviously. Because there's a lot of value to them being
link |
00:45:28.080
well regarded. Because they were developing things that could be used in the military.
link |
00:45:34.480
So that was very important. That was the main investment. But it was miserable. It was miserable.
link |
00:45:39.920
That's why a lot of Russians now live in the state of constant PTSD. That's why we want to
link |
00:45:45.200
buy, buy, buy, buy, buy. Definitely. As soon as we have the opportunity, we just got to it
link |
00:45:51.440
finally that we can own things. I remember the time that we got our first yogurts and that was
link |
00:45:56.960
the biggest deal in the world. It was already in the 90s, by the way. What was your favorite food?
link |
00:46:04.720
What was like, whoa, this is possible? Oh, fruit. Because we only had apples, bananas,
link |
00:46:12.080
and watermelons, whatever people would grow in the Soviet Union. So there were no
link |
00:46:19.920
pineapples or papaya or mango. We'd never seen those fruits. Those were so
link |
00:46:25.120
ridiculously good. And obviously, you could not get any strawberries in winter or anything that's
link |
00:46:31.200
not seasonal. So that was a really big deal. Seeing all these fruit things. Yeah, me too,
link |
00:46:37.280
actually. I don't know. I don't think I have too many demons or addictions or so on. But I think
link |
00:46:45.280
I've developed an unhealthy relationship with fruit that I still struggle with. Oh, you can get any
link |
00:46:50.800
type of fruit, right? Also, these weird fruit like dragon fruit or something. All kinds of
link |
00:46:58.240
different types of peaches, like cherries were killer for me. I know you say we had bananas and
link |
00:47:04.320
so on, but I don't remember having the kind of banana. When I first came to this country,
link |
00:47:09.920
the amount of banana, I literally got fat on bananas. Oh yeah, for sure. Delicious. And like
link |
00:47:18.480
cherries, just the quality of the food. I was like, this is capitalism. That's pretty good.
link |
00:47:25.280
It's delicious. Yeah, it's funny. It's funny. It's funny to read.
link |
00:47:36.560
I don't know what to think of it. It's funny to think how an idea that's just written on paper
link |
00:47:45.040
or when carried out amongst millions of people, how that gets actually, when it becomes reality,
link |
00:47:52.080
what it actually looks like. Sorry, but I've been studying Hitler a lot recently,
link |
00:48:00.720
and going through Mein Kampf, he pretty much wrote out in Mein Kampf everything he was going to do.
link |
00:48:07.760
Unfortunately, most leaders, including Stalin didn't read it. But it's kind of terrifying and
link |
00:48:14.640
I don't know. And amazing in some sense that you can have some words on paper,
link |
00:48:20.240
and they can be brought to life, and they can either inspire the world or they can destroy
link |
00:48:25.280
the world. And yeah, there's a lot of lessons to study in history that I think people don't study
link |
00:48:32.800
enough now. One of the things I'm hoping with, I've been practicing Russian a little bit,
link |
00:48:40.000
I'm hoping to sort of find, rediscover the beauty and the terror of Russian history
link |
00:48:49.360
through this stupid podcast by talking to a few people. So anyway, I just feel like so much was
link |
00:48:57.120
forgotten. So much was forgotten. I'll probably try to convince myself that you're a super busy
link |
00:49:04.560
and super important person, but I'm going to try to befriend you, to try to become a better Russian
link |
00:49:11.760
because I feel like I'm a shitty Russian. Not that busy. So I can totally be a Russian Sherpa.
link |
00:49:19.680
Yeah, but love. You're talking about your early days of being a little bit alone and finding a
link |
00:49:30.080
connection with the world through being a journalist. Where did love come into that?
link |
00:49:36.000
I guess finding for the first time, some friends, it's a very simple story, some friends that all
link |
00:49:43.520
of a sudden we, I guess we're at the same place with our lives. We're 25, 26, I guess. And
link |
00:49:55.040
then we just got really close, and I somehow remember this one day,
link |
00:50:01.200
where it was one day in summer that we just stayed outdoors the whole night and just
link |
00:50:07.680
talked and for some unknown reason, it just felt for the first time that someone could
link |
00:50:13.680
see me for who I am. And it just felt extremely good. And we fell asleep outside and just talking
link |
00:50:21.520
and it was raining, it was beautiful, sunrise and it's really cheesy. But at the same time,
link |
00:50:28.240
we just became friends in a way that I've never been friends with anyone else before.
link |
00:50:33.520
And I do remember that, before and after that, you have this unconditional, family sort of connection,
link |
00:50:40.800
and it gives you tons of power. It just basically gives you this tremendous power to do things in
link |
00:50:47.760
your life and to change positively on many different levels.
link |
00:50:53.600
Power because you could be yourself. At least you know that somewhere you can be
link |
00:51:00.640
just yourself. You don't need to pretend, you don't need to be great at work or tell some story
link |
00:51:07.200
or sell yourself in some way or another. And so we became just really close friends and
link |
00:51:12.000
then in a way, I started a company because he had a startup and I felt like I kind of wanted a
link |
00:51:19.120
startup. It felt really cool. I didn't know what I would really do, but I felt like I kind of needed
link |
00:51:25.680
a startup. Okay. So that pulled you into the startup world. Yeah. And then this closest friend
link |
00:51:37.200
of mine died. We actually moved here to San Francisco together and then we went back for a visa to
link |
00:51:42.320
Moscow, and we lived together, we were roommates, and we came back and he got hit by a car right in front
link |
00:51:49.280
of the Kremlin, you know, next to the river, and died the same day in the hospital.
link |
00:52:00.320
So you had moved to America at that point? At that point, I was. What about him? What about
link |
00:52:05.840
Roman? Him too. He actually moved first. So I was always sort of trying to do what he was doing. So
link |
00:52:11.040
I didn't like that he was already here and I was still, you know, in Moscow and we weren't
link |
00:52:14.800
hanging out together all the time. So was he in San Francisco? Yeah, we were roommates.
link |
00:52:20.320
So he just visited Moscow for a little bit? We went back for our visas. We had to get a stamp
link |
00:52:26.560
in our passport for our work visas and the embassy was taking a little longer. So we stayed there for
link |
00:52:32.080
a couple of weeks. What happened? So how did he die? He was crossing the street and the car was
link |
00:52:42.880
going really fast, way over the speed limit, and just didn't stop at the pedestrian crossing,
link |
00:52:49.040
on the zebra, and just ran over him. When was this? It was in 2015, on the 28th of November. So it was
link |
00:52:57.360
a long time ago now. But at the time, you know, I was 29. So for me, it was the first kind of
link |
00:53:05.200
meaningful death in my life. You know, I had both sets of grandparents at the time. I didn't see
link |
00:53:13.360
anyone so close die, and death sort of existed, but as a concept. But definitely not as something
link |
00:53:19.360
that would be, you know, happening to us anytime soon. And specifically our friends,
link |
00:53:25.280
because we were, you know, we're still in our 20s or early 30s, and it still felt like the whole
link |
00:53:31.120
life is, you know, you could still dream about ridiculous things even. So that was, it was just
link |
00:53:39.920
really, really abrupt, I'd say. What did it feel like to lose him? Like that feeling of loss?
link |
00:53:49.440
You talked about the feeling of love, having power. What is the feeling of loss, if you like?
link |
00:53:57.200
Well, in Buddhism, there's this concept of Samaya where something really,
link |
00:54:03.360
like, huge happens and then you can see very clearly. I think that was it. Like, basically
link |
00:54:10.800
something changed me so much in such a short period of time that I could just see really,
link |
00:54:16.240
really clearly what mattered and what did not. Well, I definitely saw that whatever I was doing at work
link |
00:54:23.360
just didn't matter at all, and some of the things. And it was just this big realization,
link |
00:54:29.120
this very, very clear vision of what life's about.
link |
00:54:35.040
You still miss him today?
link |
00:54:38.080
Yeah, for sure. For sure. It was just this constant. I think it was, he was really important
link |
00:54:46.000
for me and for our friends for many different reasons. And I think one of them,
link |
00:54:52.400
being that we didn't just say goodbye to him, but we sort of said goodbye to our youth in a way,
link |
00:54:57.920
it was like the end of an era on so many different levels. The end of Moscow as we knew it,
link |
00:55:04.320
the end of us living through our 20s and kind of dreaming about the future.
link |
00:55:10.000
Do you remember, like, the last several conversations? Are there moments with him that stick out, that
link |
00:55:17.840
kind of haunt you when you think about him?
link |
00:55:24.720
Yeah, well, his last year here in San Francisco, he was pretty depressed,
link |
00:55:28.080
as his startup was not really going anywhere. And he wanted to do something else. He wanted to
link |
00:55:33.120
build something. He played with a bunch of ideas, but the last one he had was around
link |
00:55:42.560
building a startup around death. So he applied to Y Combinator with the video that I had on my
link |
00:55:50.400
computer. And it was all about disrupting death, thinking about new cemeteries more
link |
00:55:58.080
biologically, like things that could be better biologically for humans. And at the same time,
link |
00:56:06.720
having those digital avatars, these kind of AI avatars that would store all the memory about
link |
00:56:14.160
a person that he could interact with. What year was this? 2015. Well, right before his death.
link |
00:56:20.000
So it was like a couple of months before that that he recorded that video. And so I found it on my
link |
00:56:24.640
computer. It was filmed in our living room. He never got in, but he was thinking about it a lot somehow.
link |
00:56:32.800
Did it have the digital avatar idea? Yeah.
link |
00:56:35.920
That's so interesting. Well, he just pitches this idea and he talks about,
link |
00:56:42.320
like, I want to rethink how people grieve and how people talk about death.
link |
00:56:45.920
Well, I was interested in this. Maybe someone who's depressed
link |
00:56:51.040
is naturally inclined to think about that. But I just felt this year in San Francisco,
link |
00:56:57.760
we just had so much, I was going through a hard time, he was going through a hard time.
link |
00:57:02.400
And we were definitely, I was trying to make him just happy somehow to make him feel better.
link |
00:57:07.680
And it felt like this, I don't know, I just felt like I was taking care of him a lot. And
link |
00:57:14.480
then he almost started to feel better. And then that happened. And I don't know, I just felt,
link |
00:57:22.240
I just felt lonely again, I guess. And then, coming back to San Francisco in December,
link |
00:57:27.040
I helped organize the funeral, helped his parents. And I came back here and it was a really lonely
link |
00:57:33.600
apartment, a bunch of his clothes everywhere, and Christmas time. And I remember I had a board
link |
00:57:39.520
meeting with my investors, and I just couldn't talk about it, like, had to pretend everything was okay.
link |
00:57:44.640
And, you know, just working on this company. Yeah, it was definitely very, very tough, tough time.
link |
00:57:55.200
Do you think about your own mortality? You said, you know, we're young, the
link |
00:58:04.160
the possibility of doing all kinds of crazy things is still out there. It's still
link |
00:58:10.000
before us, but it can end any moment. Do you think about your own ending at any moment?
link |
00:58:18.400
Unfortunately, I think about it way, way too much. Somehow after Roman, like,
link |
00:58:25.200
every year after that, I started losing people that I really love. I lost my grandfather next year.
link |
00:58:30.240
My, you know, the person who would explain to me, you know, what the universe is made of
link |
00:58:37.600
while selling apples. And then I lost another close friend of mine. And
link |
00:58:44.560
it just made me very scared. I have tons of fear about death. That's what makes me not fall asleep
link |
00:58:50.720
oftentimes and just go in loops. And then as my therapist recommended me, I opened up some
link |
00:58:58.960
nice calming images with the voiceover. And it calms me down.
link |
00:59:05.280
Oh, for sleep? Yeah, I'm really scared of death. This is a big one. I definitely have tons of,
link |
00:59:12.080
I guess, some pretty big trauma about it and still working through.
link |
00:59:16.960
There's a philosopher, Ernest Becker, who wrote a book, The Denial of Death. I'm not sure if you're
link |
00:59:23.600
familiar with any of those folks. There's a, in psychology, a whole field called terror management
link |
00:59:31.520
theory. Sheldon, who just did the podcast, he wrote the book. We talked for four
link |
00:59:38.960
hours about death. A fear of death. But his whole idea is that Ernest Becker, I think, I find this
link |
00:59:48.400
idea really compelling is that everything human beings have created, like our whole motivation
link |
00:59:57.120
in life, is to escape death. It's to try to construct an illusion
link |
01:00:08.480
that we're somehow immortal. So like everything around us, this room, your startup,
link |
01:00:16.960
your dreams, everything you do, is a kind of creation of a brain that, unlike any other mammal or
link |
01:00:28.480
species is able to be cognizant of the fact that it ends for us. I think, so there's the question
link |
01:00:37.200
of like the meaning of life that you look at like what drives us humans. And when I read Ernest
link |
01:00:44.480
Becker, who I highly recommend people read, it was the first time it felt like this is the right
link |
01:00:52.800
thing at the core. Sheldon's book is called The Worm at the Core. So he's saying it's, I think it's
link |
01:01:00.400
William James, he's quoting or whoever, is like the thing, what is at the core of it all? Sure,
link |
01:01:07.680
there's like love, you know, Jesus might talk about like love is at the core of everything. I
link |
01:01:13.440
don't, you know, that's the open question. What's at the, you know, it's turtles, turtles, but it
link |
01:01:18.240
can't be turtles all the way down. What's at the bottom? And Ernest Becker says the fear of death
link |
01:01:24.800
and the way, in fact, because you said therapists and calming images, his whole idea is, you know,
link |
01:01:34.480
we want to bring that fear of death as close as possible to the surface because it's,
link |
01:01:40.880
and like meditate on that and use the clarity of vision that provides to, you know, to live a
link |
01:01:49.600
more fulfilling life, to live a more honest life, to discover, you know, there's something about,
link |
01:01:58.800
you know, being cognizant of the finiteness of it all that might result in the most fulfilling life.
link |
01:02:06.320
So that's the, that's the dual of what you're saying, because you kind of said it's like, I
link |
01:02:10.560
unfortunately think about it too much. It's a question whether it's good to think about it,
link |
01:02:16.240
because I've, again, I talk way too much about love and probably death. And when I ask people,
link |
01:02:23.680
friends, which is why I probably don't have many friends, are you afraid of death? I think most
link |
01:02:30.240
people say they're not. What they say they're afraid of, you know,
link |
01:02:39.040
it's kind of almost like they see death as this kind of like a paper deadline or something, and
link |
01:02:44.880
they're afraid not to finish the paper before the deadline, like, I'm afraid not to finish
link |
01:02:51.600
the goals I have. But it feels like they're not actually realizing that this thing ends,
link |
01:02:57.440
this thing ends, like really realizing, like really thinking as Nietzsche and all these
link |
01:03:03.840
philosophers, like thinking deeply about it, like the very thing that, you know, like when
link |
01:03:13.680
you think deeply about something, you can just, you can realize that you haven't actually thought
link |
01:03:18.880
about it. Yeah. And I, and when I think about death, it's like, it can be, it's terrifying.
link |
01:03:28.320
It feels like stepping outside into the cold, where it's freezing, and then I have to, like,
link |
01:03:34.000
hurry back inside where it's warm. But like, I think there's something valuable about stepping
link |
01:03:41.280
out there into the freezing cold. Definitely. When I talk to my mentor about it, he always tells me,
link |
01:03:50.480
well, what dies? There's nothing there that can die. But I guess that requires,
link |
01:03:59.280
well, in Buddhism, one of the concepts that are really hard to grasp and that people spend
link |
01:04:04.560
their whole lives meditating on would be Anatta, which is the concept of non-self, and kind of thinking
link |
01:04:12.160
that, you know, if you're not your thoughts, which you're obviously not your thoughts,
link |
01:04:14.960
because you can observe them and not your emotions and not your body, then what is this? And if you
link |
01:04:21.120
go really far, then finally you see that there's no self, there's this concept of non-self. So
link |
01:04:29.280
once you get there, how can that actually die? What is dying?
link |
01:04:32.640
Right, you're just a bunch of molecules, star dust.
link |
01:04:38.640
But that is very advanced spiritual work. For me, I'm definitely, definitely not there. Oh my God,
link |
01:04:47.920
no. I think it's very, very useful. It's just the fact that maybe being so afraid is not
link |
01:04:53.120
useful. And mine is more, I'm just terrified. Like, it really affects me on a personal level.
link |
01:04:59.200
On a personal level, I'm terrified. How do you overcome that?
link |
01:05:08.800
I don't. I'm still trying to. Have pleasant images?
link |
01:05:15.680
Well, pleasant images get me to sleep. And then during the day, I can distract myself
link |
01:05:20.320
with other things, like talking to you. I'm glad we're both doing the same exact thing. Okay, good.
link |
01:05:32.000
Are there, like, are there moments since you've lost Roman where you had, like, moments of
link |
01:05:41.440
bliss, like, where you've forgotten, where you've achieved that Buddhist-like level of,
link |
01:05:48.560
like, what can possibly die? Like losing yourself in the moment, in the ticking
link |
01:05:59.120
time of this universe, being just part of it for a brief moment and just enjoying it.
link |
01:06:06.720
Well, that goes hand in hand. I remember, I think a day or two after he died, we went to
link |
01:06:12.560
finally get his passport out of the embassy and we're driving around Moscow and it was,
link |
01:06:17.200
you know, December, which is usually there's never a sun in Moscow in December.
link |
01:06:22.640
And somehow it was an extremely sunny day, and we were driving with a close friend.
link |
01:06:30.400
And I remember feeling for the first time maybe this just moment of incredible clarity and somehow
link |
01:06:36.160
happiness, not like happy happiness, but happiness and just feeling that, you know,
link |
01:06:42.960
I know what the universe is sort of about, whether it's good or bad. And it wasn't a sad feeling,
link |
01:06:50.560
it was probably the most beautiful feeling that you can ever achieve. And you can only get it
link |
01:06:57.200
when something, oftentimes when something traumatic like that happens. But also if you just,
link |
01:07:04.000
you really spend a lot of time meditating and looking at the nature doing something that
link |
01:07:08.240
really gets you there. But once you're there, I think when you summit a mountain, a really hard
link |
01:07:13.440
mountain, you inevitably get there. That's just a way to get to the state. But once you're on this,
link |
01:07:19.360
in this state, you can do really big things, I think. Yeah.
link |
01:07:25.760
Sucks it doesn't last forever. So Bukowski talked about how love is a fog. Like,
link |
01:07:32.480
it's when you wake up in the morning, it's there, but it eventually dissipates. It's really sad.
link |
01:07:39.440
Nothing lasts forever. But I definitely like doing this push up and running thing. There's moments,
link |
01:07:46.400
I had a couple moments, like I'm not a crier, I don't cry. But there's moments where I was like
link |
01:07:53.200
facedown on the carpet. Like with tears in my eyes is interesting. And then that complete, like,
link |
01:08:03.040
there's a lot of demons. I've got demons, had to face them. Funny how running makes you face your
link |
01:08:08.080
demons. But at the same time, the flip side of that, there's a few moments where I was in bliss.
link |
01:08:13.920
In bliss. And all of it alone, which is funny. It's beautiful. I like that. But definitely pushing
link |
01:08:24.400
yourself physically is one way to get there, for sure. Yeah. Like you said, I mean, you were speaking as a
link |
01:08:30.800
metaphor of Mount Everest, but it also works like literally, I think physical endeavor somehow.
link |
01:08:37.040
Yeah, there's something, I mean, we're monkeys, apes, whatever, physical, there's a physical thing to
link |
01:08:45.920
it. But there's something to this, pushing yourself physically, physically, but alone. That happens
link |
01:08:52.640
when you're doing, like, things like you do, or strenuous, like, workouts, or, you know,
link |
01:08:57.360
rowing across the Atlantic, or marathons. That's why I love watching marathons. And,
link |
01:09:03.440
you know, it's so boring. But you can see them getting there.
link |
01:09:09.360
So the other thing, I don't know if you know, there's a guy named David Goggins. He's a,
link |
01:09:15.360
he basically, so he's been either emailing or on the phone with me every day through this. I haven't
link |
01:09:20.320
been exactly alone. But he, he's kind of, he's the devil on my shoulder.
link |
01:09:28.240
So he's like the worst possible human being in terms of giving you a, like he has, through
link |
01:09:36.080
everything I've been doing, he's been doubling everything I do. So he, he's insane. He's a
link |
01:09:42.480
this Navy SEAL person. He wrote this book, Can't Hurt Me. He's basically one of the toughest
link |
01:09:48.800
human beings on earth. He ran all these crazy ultra marathons in the desert. He set the world
link |
01:09:54.080
record in the number of pull-ups. He just does everything where it's like, how can
link |
01:10:01.840
I suffer today? He figures that out and does it. Yeah, that, whatever that is, that process of
link |
01:10:08.960
self discovery is really important. I actually had to turn myself off from the internet mostly
link |
01:10:14.320
because I started this like workout thing, like a happy go getter with my like headband and like,
link |
01:10:20.080
like, just like, uh, cause a lot of people are like inspired and they're like, yeah, we're gonna
link |
01:10:25.680
exercise with you. And I was like, yeah, great, you know, but then like, I realized that this,
link |
01:10:32.320
this journey can't be done together with others. This has to be done alone. So out of the moments
link |
01:10:42.000
of love, out of the moments of loss, can we talk about your journey of finding, I think, an incredible
link |
01:10:50.800
idea, an incredible company, an incredible system in Replica? How did that come to be?
link |
01:10:59.840
So yeah, so I was a journalist and then I went to business school for a couple of years to
link |
01:11:04.880
just see if I could maybe switch gears and do something else at 23. And then I came back
link |
01:11:10.400
and started working for a businessman in Russia who built the first 4G network in our country
link |
01:11:18.720
and was very visionary and asked me whether I want to do fun stuff together. And we worked
link |
01:11:25.760
on a bank. The idea was to build a bank on top of a telco. So that was 2011,
link |
01:11:33.840
or 2012. And a lot of telecommunication companies, mobile network operators,
link |
01:11:42.240
didn't really know what to do next in terms of, you know, new products, new revenue. And this big
link |
01:11:48.000
idea was that, you know, you put a bank on top and then it all works out. Basically, a prepaid
link |
01:11:54.960
account becomes your bank account and you can use it as your bank. So, you know, a third of a country
link |
01:12:02.080
wakes up as your bank client. But we couldn't quite figure out what would be the main interface to
link |
01:12:09.680
interact with the bank. The problem was that most people didn't have smartphones back at the
link |
01:12:14.160
time. In Russia, the penetration of smartphones was low. People didn't use mobile banking or
link |
01:12:19.920
online banking on their computers. So we figured out that SMS would be the best way because that
link |
01:12:25.920
would work on feature phones. But that required some chatbot technology, which I didn't know
link |
01:12:31.520
anything about, obviously. So I started looking into it and saw that there was nothing, really,
link |
01:12:36.880
well, there was just nothing, really. So the idea is, through SMS, to be able to interact with a bank
link |
01:12:40.640
account. Yeah. And then we thought, well, since you're talking to a bank account, why can't this,
link |
01:12:46.000
can't we use more of, you know, some behavioral ideas and why can't this banking chatbot be nice
link |
01:12:52.720
to you and really talk to you sort of as a friend, this way you develop more connection to it,
link |
01:12:57.200
retention is higher, people don't churn. And so I went to very depressing Russian cities to test
link |
01:13:04.880
it out. I went to, I remember three different towns to interview potential users. So people
link |
01:13:13.760
used it for a little bit. Cool. And I went to talk to them. Pretty poor towns? Very poor towns,
link |
01:13:20.080
mostly towns that were, you know, sort of factories, mono towns, they were building something and then
link |
01:13:28.240
the factory went away and it was just a bunch of very poor people. And then we went to a couple
link |
01:13:34.480
that weren't as dramatic, but still the one I remember really fondly was this woman that
link |
01:13:38.720
worked at a glass factory and she talked to a chatbot. And she was talking about it,
link |
01:13:44.720
and she started crying during the interview because she said, no one really cares for me that much.
link |
01:13:48.800
And so to be clear, that was my only endeavor in programming that chatbot. So it was really
link |
01:13:58.080
simple. It was literally just a few if-this-then-that rules, and it was incredibly simplistic.
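To make the "if this, then that" idea concrete, a rule-based chatbot of that kind can be sketched in a few lines: a list of trigger keywords, each mapped to a canned reply, plus a fallback. The sketch below is only an illustration of the technique; the keywords, replies, and names are invented for the example and are not the actual banking bot being described.

# Minimal "if this, then that" rule-based chatbot (illustrative only; not the real system).
# Each rule is a tuple of trigger keywords and a canned reply.
RULES = [
    (("balance", "how much"), "Your prepaid account balance is {balance} RUB."),
    (("transfer", "send"), "Sure, who would you like to send money to?"),
    (("hello", "hi"), "Hi {name}! Good to hear from you. How is your day going?"),
    (("sad", "lonely", "tired"), "I'm sorry to hear that. I'm always here if you want to talk."),
]
DEFAULT_REPLY = "I'm not sure I understood. You can ask me about your balance or transfers."

def reply(message, name="friend", balance=0):
    """Return the first canned reply whose keywords appear in the message."""
    text = message.lower()
    for keywords, answer in RULES:
        if any(keyword in text for keyword in keywords):
            return answer.format(name=name, balance=balance)
    return DEFAULT_REPLY

print(reply("Hi there!", name="Vera"))                   # greeting rule fires
print(reply("How much do I have left?", balance=1250))   # balance rule fires
print(reply("I feel a bit lonely today"))                # "emotional" rule fires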
link |
01:14:06.640
And still that made her feel something. And that really made her emotional. She said,
link |
01:14:11.360
you know, I only have my mom and my husband, and I don't have anyone else really in my life.
link |
01:14:16.880
And it was very sad. But at the same time, I felt and we had more interviews in a similar vein.
link |
01:14:21.360
And what I thought in the moment was like, well, it's not that the technology is ready,
link |
01:14:28.240
because definitely in 2012 technology was not ready for that. But humans are ready,
link |
01:14:34.960
unfortunately. So this project would not be about, like, tech capabilities; it would be more about
link |
01:14:40.400
human vulnerabilities. But there's something so powerful about conversational AI that
link |
01:14:49.040
I saw then that I thought was definitely worth putting a lot of effort into. So at the end
link |
01:14:54.720
of the day, we sold the banking project. But my then boss was also my mentor and really,
link |
01:15:02.240
really close friend, told me, hey, I think there's something in it. And you should just
link |
01:15:07.360
go work on it. And I was like, well, what product? I don't know what I'm building.
link |
01:15:10.960
He said, you'll figure it out. And, you know, looking back at this, this is a horrible idea
link |
01:15:16.240
to work on something without knowing what it was, which is maybe the reason why it took us
link |
01:15:21.840
so long. But we just decided to work on the conversational tech to see what it could do. You know,
link |
01:15:26.320
there were no chatbot constructors or programs or anything that would allow you to actually build
link |
01:15:34.240
one at the time. That was the era of, by the way, Google Glass, which is why, you know,
link |
01:15:39.440
some of the investors, like seed investors we talked with, were like, oh, you should totally
link |
01:15:43.600
build it for Google Glass. If not, we're not interested. I don't think that's interesting.
link |
01:15:47.360
Did you bite on that idea?
link |
01:15:50.320
No. Because I wanted to do text first, because I'm a journalist. So I was fascinated
link |
01:15:58.400
by just texting. So you thought, so the emotional, that interaction that the woman had,
link |
01:16:06.560
like, so do you think you could feel emotion from just text?
link |
01:16:11.120
Yeah, I saw something in just this pure texting and also thought that we should first start
link |
01:16:17.120
building for people who really need it versus people who have Google Glass,
link |
01:16:21.200
if you know what I mean. And I felt like the early adopters of Google Glass
link |
01:16:24.080
might not be overlapping with people who are really lonely and might need someone to talk to.
link |
01:16:32.800
But then we really just focused on the tech itself. We just thought, what if we just,
link |
01:16:36.320
you know, we didn't have a product idea in the moment. And we felt, what if we just look into
link |
01:16:42.560
building the best conversational constructors, so to say, use the best tech available at the
link |
01:16:48.960
time. And that was before the first paper about deep learning applied to dialogues,
link |
01:16:53.200
which happened in 2015, in August 2015, which Google published.
link |
01:17:01.600
Did you follow the work on the Loebner Prize and, like, all the sort of non-machine-learning
link |
01:17:09.440
chatbots? Yeah, what really struck me was that, you know, there was a lot of talk about
link |
01:17:14.320
machine learning and deep learning, like big data was a really big thing. Everyone was saying,
link |
01:17:18.720
you know, the business was big data. 2012 was when the biggest Kaggle competitions were, you know,
link |
01:17:25.360
important. But that was really the kind of upheaval when people started talking about machine
link |
01:17:28.720
learning a lot. But it was only about images or something else. And it was never about
link |
01:17:34.080
conversation. As soon as I looked into the conversational tech, it was all about
link |
01:17:38.560
something really weird and very outdated and very marginal and felt very hobbyist. It was all about
link |
01:17:43.360
the Loebner Prize, which was won by a guy who built a chatbot that talked like a Ukrainian teenager.
link |
01:17:49.120
It was just a gimmick. And somehow people picked up those gimmicks. And then, you know, the most
link |
01:17:54.720
famous chatbot at the time was ELIZA from the 1980s, which was really bizarre, or SmarterChild on AIM.
link |
01:18:02.480
The funny thing is, it felt at the time not to be that popular. And it still doesn't seem to be
link |
01:18:08.960
that popular. Like people talk about the Turing test. People like talking about it philosophically.
link |
01:18:14.640
Journos like writing about it. But as a technical problem, like people don't seem to really want
link |
01:18:20.640
to solve the open dialogue. Like they, they're not obsessed with it. Even folks like, you know,
link |
01:18:30.800
in Boston, the Alexa team, even they're not as obsessed with it as I thought they might be.
link |
01:18:37.760
Why not? What do you think? So, you know what you felt like? You felt with that woman when she
link |
01:18:43.760
felt something by reading the text. I feel the same thing. There's something here, what you felt.
link |
01:18:50.560
I feel like Alexa folks and just the machine learning world doesn't feel that,
link |
01:18:58.640
that there's something here. Because they see it as a technical problem, and it's not that interesting,
link |
01:19:04.240
for some reason. It could be argued that maybe as a purely sort of natural language processing
link |
01:19:11.120
problem, it's not the right problem to focus on. Because there's too much subjectivity. That,
link |
01:19:17.200
that thing that made the woman feel like crying, like, if your benchmark includes a woman crying,
link |
01:19:24.320
that doesn't feel like a good benchmark test. But to me, there's something there that's,
link |
01:19:29.600
you can have a huge impact. But I don't think the machine learning world likes that, the human
link |
01:19:36.560
emotion, the subjectivity of it, the fuzziness, the fact that with maybe a single word, you can
link |
01:19:42.000
make somebody feel something deeply. What is that? It doesn't feel right to them. So, I don't know.
link |
01:19:47.920
I don't, I don't know why that is. I'm, that's why I'm excited. When I discovered your work,
link |
01:19:55.680
it feels wrong to say that. It's not like I'm giving myself props for Googling and for
link |
01:20:06.400
our, I guess, mutual friend introducing us. But I'm so glad that
link |
01:20:14.160
you exist and what you're working on. But I have the same kind of, if we could just backtrack
link |
01:20:18.880
a second, because I have the same kind of feeling that there's something here. In fact, I've been
link |
01:20:23.920
working on a few things that are kind of crazy, very different from your work. I think, I think
link |
01:20:31.280
they're, I think they're too crazy, but the, like what? I have to know. No, all right. We'll talk
link |
01:20:40.800
about it more. I feel like it's harder to talk about things that have failed and are failing
link |
01:20:48.800
while you're a failure. Like it's easier for you because you're already successful
link |
01:20:56.160
on some measures. Tell it to my board. Well, you're, you're, I think, I think
link |
01:21:05.440
you've demonstrated success in a lot of your projects. It's easier for you to talk about
link |
01:21:08.720
failures. For me, I'm currently at the bottom of the success ladder. You're way too humble.
link |
01:21:20.400
No. So it's hard for me to know, but there's something there. There's something there.
link |
01:21:25.040
And I think you're, you're exploring that and you're discovering that. Yeah. It's been,
link |
01:21:30.880
so it's been surprising to me, but I, you've mentioned this idea that you, you thought it wasn't
link |
01:21:37.600
enough to start a company or start efforts based on, it feels like there's something here.
link |
01:21:46.400
Like, what did you mean by that? Like you should be focused on creating a, like you should have
link |
01:21:53.440
a product in mind. Is that what you meant? It just took us a while to discover the product
link |
01:21:59.760
because it all started with a hunch of like, of me, my mentor and just sitting around and he was
link |
01:22:06.080
like, well, that's it. That's the, you know, the Holy Grail is there. There's
link |
01:22:11.040
like, there's something extremely powerful in, in, in conversations. And there's no one who's
link |
01:22:17.360
working on machine conversation from the right angle, so to say. I feel like that's still true.
link |
01:22:23.520
Am I crazy? Oh no, I totally feel that's still true, which is, I think it's mind blowing.
link |
01:22:28.880
Yeah. You know what it feels like? I wouldn't even use the word conversation because I feel
link |
01:22:33.840
like it's the wrong word. It's like a machine connection or something. I don't know.
link |
01:22:40.640
Because conversation, you start drifting into natural language immediately. You start drifting
link |
01:22:45.120
immediately into all the benchmarks that are out there. But I feel like it's like the personal
link |
01:22:49.600
computer days of this. Like, I feel like we're like in the early days with the, the, the Wozniak and
link |
01:22:55.760
all of them, like where it was the same kind of, it was a very small niche group of people who are, who
link |
01:23:03.840
are all kind of Loebner Prize type people. Yeah. And hobbyists, but, like, not even hobbyists with
link |
01:23:11.920
big dreams. Like, you know, hobbyists with a dream to trick like a jury. Yeah. It's like a weird,
link |
01:23:19.360
by the way, by the way, very weird. So if you think about conversations, first of all, when I
link |
01:23:24.720
have great conversations with people, I'm not trying to test them. So, for instance, I'm not trying to break
link |
01:23:31.120
them, I'm actually playing along, I'm part of it. If I was trying to break it, break this person,
link |
01:23:36.400
or test whether he's going to give me a good conversation, it would have never happened. So
link |
01:23:41.280
the whole, the whole problem with testing conversations is that you can't just put it in front
link |
01:23:47.280
of a jury, because then you have to go into some Turing test mode: is it responding to all
link |
01:23:53.280
my factual questions, right? Or so it really has to be something in the field where people are
link |
01:24:00.080
actually talking to it, because they want to, not because we're just trying to break it. And
link |
01:24:05.440
it's working for them. Because this, the weird part of it is that it's, it's very subjective. It
link |
01:24:11.360
takes two to tango here fully. If you're not trying to have a good conversation, if you're
link |
01:24:15.760
trying to test it, then it's going to break. I mean, any person would break, to be honest,
link |
01:24:19.840
if I'm not trying to even have a conversation with you, you're not going to give it to me.
link |
01:24:25.120
If I keep asking you like some random questions or jumping from topic to topic, that wouldn't be,
link |
01:24:31.280
which I'm probably doing, but that probably wouldn't contribute to the conversation. So I think the
link |
01:24:36.800
problem is testing. So there should be some other metric: how do we evaluate whether that
link |
01:24:43.120
conversation was powerful or not, which is what we actually started with. And I think those
link |
01:24:48.240
measurements exist, and we can test on those. But what really struck us back in the day, and
link |
01:24:54.720
what's still eight years later is still not resolved. And I'm not seeing tons of groups
link |
01:25:01.520
working on it. Maybe I just don't know about them. It's also possible. But the interesting part about it
link |
01:25:07.360
is that we spend most of our days talking, and those conversations
link |
01:25:13.680
are not turn-on-the-lights or customer support problems or some other task-oriented things.
link |
01:25:22.400
These conversations are something else. And then somehow they're extremely important for us. And
link |
01:25:27.040
when we don't have them, then we feel deeply unhappy, potentially lonely, which as we know,
link |
01:25:34.000
creates tons of risk for our health as well. And so this is most of our hours as humans.
link |
01:25:41.600
And some of them, no one's trying to replicate that.
link |
01:25:45.520
And not even study it that well.
link |
01:25:48.960
And not even study that well. So when we jumped into that in 2012, I looked first at like,
link |
01:25:53.040
okay, what's the chatbot? What's the state of the art chatbot? And, you know, those were the
link |
01:25:58.480
Loebner Prize days. But I thought, okay, so what about the science of conversation? Clearly,
link |
01:26:04.480
there have been tons of, you know, scientists or academics
link |
01:26:11.200
that looked into the conversation. So if I want to know everything about it, I can just read about
link |
01:26:14.800
it. And there's not much, really. There are conversation analysts who are basically just
link |
01:26:22.560
listening to speech to different conversations, annotating them. And then,
link |
01:26:30.080
I mean, that's not really used for much. That's the, that's the field of theoretical
link |
01:26:34.640
linguistics, which is like barely useful. It's very marginal, even in their space, no one really
link |
01:26:41.680
is excited. And I've never met a theoretical linguist who's like, I can't wait to
link |
01:26:46.640
work on conversation analytics. That is just something very marginal, sort of applied to
link |
01:26:52.400
like writing scripts for salesmen, when they analyze which conversation strategies were
link |
01:26:58.400
most successful for sales. Okay, so that was not very helpful. Then I looked a little bit deeper,
link |
01:27:04.320
and looked at, you know, whether there were any books written on what, you know, really contributes
link |
01:27:11.200
to a great conversation. That was really strange, because most of those were NLP books, which is
link |
01:27:20.800
neurolinguistic programming, which is not the NLP I was expecting it to be, but was mostly
link |
01:27:28.000
some psychologist Richard Bandler, I think, came up with that, who was this big guy in a
link |
01:27:35.280
leather vest that could program your mind by talking to you and how to be charismatic and
link |
01:27:42.880
charming and influential with people, all those books. Yeah, pretty much. But it was all about,
link |
01:27:46.640
like, through conversation reprogramming you, so getting to some, so that was, I mean,
link |
01:27:51.600
probably not very true. And that didn't seem to be working very well, even back in the day.
link |
01:28:00.480
And then there were some other books, like, I don't know, mostly just self help books around
link |
01:28:06.400
how to be the best conversationalist, or how to make people like you, or some other stuff like
link |
01:28:13.360
Dale Carnegie, whatever. And then there was this one book, The Most Human Human by Brian
link |
01:28:20.000
Christian, that really was important for me to read back in the day, because he was on the
link |
01:28:25.280
human side. He was taking part in the Loebner Prize, but not as a jury member,
link |
01:28:34.160
but as the human; basically, you have to tell a computer from a human,
link |
01:28:38.560
and he was the human. So you would either get him or a computer. And his whole book was about
link |
01:28:44.800
how do people, what makes us human in conversation. And that was a little bit more interesting,
link |
01:28:50.160
because there, at least, someone started to think about what exactly makes us human in
link |
01:28:55.600
conversation and makes people believe in that. But it was still about tricking. It was still
link |
01:29:00.560
about imitation game. It was still about, okay, what kind of parlor tricks can we throw in the
link |
01:29:04.960
conversation to make you feel like you're talking to a human, not a computer? And it was definitely
link |
01:29:10.000
not about thinking, what is that exactly that we're getting from talking all day long with
link |
01:29:18.480
other humans? I mean, we're definitely not just trying to be tricked, or it's not just enough
link |
01:29:23.200
to know it's a human. There's something we're getting there; can we measure it? And can we put the
link |
01:29:29.600
computer to the same measurement and see whether you can talk to a computer and get the same results?
link |
01:29:35.600
Yeah, I mean, so first of all, a lot of people comment that they think I'm a robot. It's very
link |
01:29:39.440
possible I am a robot. And this whole thing, I totally agree with you that the test idea is
link |
01:29:44.320
fascinating. And I looked for books unrelated to this kind of, so I'm afraid of people. I'm
link |
01:29:51.200
generally introverted and quite possibly a robot. I literally Google like, how to talk to people and
link |
01:29:58.720
like how to have a good conversation for the purpose of this podcast, because I was like, I
link |
01:30:05.760
can't, I can't make eye contact with people. I can't like, I do Google that a lot too. You're
link |
01:30:12.240
probably reading a bunch of FBI negotiation tactics. Is that what you're getting? Well,
link |
01:30:18.080
everything you've listed, I've gotten. There have been very few good books on even just, like, how to
link |
01:30:24.560
interview well. It's rare. So what I end up doing often is I watch like with a critical eye. It's
link |
01:30:36.640
so different when you just watch a conversation, like just for the fun of it, just as a human.
link |
01:30:43.280
versus if you watch a conversation trying to figure out, why is this awesome?
link |
01:30:47.920
I'll listen to a bunch of different styles of conversation. I mean, I'm a fan of
link |
01:30:54.560
podcasts, Joe Rogan. He's, you know, people can make fun of him or whatever and dismiss him. But
link |
01:31:00.960
I think he's an incredibly artful conversationalist. He can pull people in for hours. And there's
link |
01:31:10.240
another guy I watch a lot. He hosted a late night show. His name is Craig Ferguson. So he's like
link |
01:31:20.640
very kind of flirtatious. But there's a magic about his like, about the connection he can create
link |
01:31:29.840
with people, how he can put people at ease. And just like, I see, I've already started sounding
link |
01:31:35.200
like those, you know, NLP people or something. I'm not, I don't mean it in that way. I don't mean, like,
link |
01:31:39.520
how to charm people or put them at ease and all that kind of stuff. It's just like, what is that?
link |
01:31:45.440
Why is that fun to listen to that guy? Why is that fun to talk to that guy? What is that?
link |
01:31:52.480
Because he's not saying, I mean, it so often boils down to a kind of wit
link |
01:31:58.720
and humor, but not really humor. It's like, I don't know, I have trouble actually even articulating it
link |
01:32:08.400
correctly. But it feels like there's something going on that's not too complicated that could be learned.
link |
01:32:19.360
And it's not similar to, yeah, like you said, the Turing test. It's something else.
link |
01:32:29.040
I'm thinking about it a lot, all the time. I do think about it all the time. I think when we were looking,
link |
01:32:37.360
so we started the company, we just decided to build conversational tech, where we thought,
link |
01:32:42.320
well, there's nothing for us to build this chatbot that we want to build. So let's just
link |
01:32:46.960
first focus on building some tech, building the tech side of things.
link |
01:32:52.880
Without a product in mind.
link |
01:32:54.480
Without a product in mind. We added like a demo chatbot that would recommend you restaurants
link |
01:33:00.240
and talk to you about restaurants just to show something simple to people that people could
link |
01:33:05.200
relate to and could try out and see whether it works or not. But we didn't have a product in
link |
01:33:11.280
mind yet. We thought we would venture into chatbots and find out what it is.
link |
01:33:16.080
Venture into chatbots and figure out our consumer application. And we sort of remembered that we
link |
01:33:20.400
wanted to build that kind of friend, that sort of connection that we saw in the very beginning.
link |
01:33:25.920
But then we got into Y Combinator and moved to San Francisco and forgot about it.
link |
01:33:29.920
You know, everything is, then it was just this constant grind. How do we get funding?
link |
01:33:34.160
How do we get this? You know, investors really just focus on one thing, just get it out there.
link |
01:33:39.840
So somehow we started building a restaurant recommendation chatbot for real for a little
link |
01:33:45.360
bit, not for too long. And then we tried building 40, 50 different chatbots. And then all of a
link |
01:33:50.560
sudden we wake up and everyone is obsessed with chatbots. Somewhere in 2016 or end of 15,
link |
01:33:56.720
people started thinking that's really the future. That's the new, you know, the new apps will be
link |
01:34:01.600
chatbots. And we were very perplexed, because people started coming up with companies, and
link |
01:34:08.080
I think we had tried most of those chatbots already and there were, like, no users.
link |
01:34:12.160
But still, people were coming up with a chatbot that would tell you the weather and bring you news and
link |
01:34:18.480
this and that, and we couldn't understand whether, you know, we just didn't execute well
link |
01:34:23.120
enough, or people were just confused and are going to find out the truth,
link |
01:34:30.480
the truth being that people don't need chatbots like that.
link |
01:34:32.320
So the basic idea is that you use chatbots as the interface to whatever application.
link |
01:34:36.880
Yeah, the idea that was like this perfect universal interface to anything.
link |
01:34:40.960
When I looked at that, it just made me very perplexed because I didn't think,
link |
01:34:46.400
I didn't understand how that would work because I think we tried most of that and none of those
link |
01:34:51.520
things worked. And then again, that craze died down, right?
link |
01:34:56.640
Fully, I think now it's impossible to get anything funded if it's a chatbot.
link |
01:35:00.880
I think it's similar to, sorry to interrupt, but there's,
link |
01:35:04.960
there's times when people think like with gestures, you can control devices,
link |
01:35:09.520
like basically gesture based control things. It feels similar to me because like,
link |
01:35:16.400
it's so compelling that we just like, like Tom Cruise, I can control stuff with my hands.
link |
01:35:22.800
But like when you get down to it, it's like, well, why don't you just have a touch screen?
link |
01:35:27.680
Or why don't you just have like a physical keyboard and mouse?
link |
01:35:30.960
It's, so that chatbot thing was always, yeah, it was perplexing to me.
link |
01:35:39.840
I still feel augmented reality, even virtual reality, is in that ballpark
link |
01:35:45.280
in terms of it being a compelling interface. I think there's going to be incredible,
link |
01:35:50.880
rich applications, just how you're thinking about it, but they won't just be the interface to everything.
link |
01:35:57.840
It'll be its own thing that will create like amazing magical experience in its own right.
link |
01:36:05.040
Absolutely, which is, I think kind of the right thing to go about, like what's the
link |
01:36:09.280
magical experience with that, with that interface specifically.
link |
01:36:13.840
How did you discover that for Replica?
link |
01:36:16.640
I just thought, okay, we'll have this tech, we can build any chatbot we want.
link |
01:36:19.840
We had, at that point, more sophisticated tech than other companies had.
link |
01:36:23.760
I mean, startups obviously not, probably not bigger ones, but still because we've been working on it
link |
01:36:30.400
for a while. So I thought, okay, we can build any conversation. So let's just create a scale
link |
01:36:35.440
from one to 10. And one would be conversations that you'd pay to not have. And 10 would be
link |
01:36:40.560
conversations you'd pay to have. And I mean, obviously, we want to build a conversation
link |
01:36:44.320
that people would pay to, you know, to actually have. And so for the whole, you know, for a few
link |
01:36:49.600
weeks, me and the team were putting all the conversations we were having during the day
link |
01:36:52.720
on the scale. And very quickly, you know, we figured out that all the conversations that we would
link |
01:36:58.240
pay to never have were conversations we were trying to cancel, Comcast, or talk to customer
link |
01:37:06.480
support, or make a reservation, or just talk about logistics with a friend when we're trying
link |
01:37:12.320
to figure out where someone is and where to go, or all sorts of, you know, setting up
link |
01:37:18.320
scheduling meetings, that was just conversation we definitely didn't want to have. Basically,
link |
01:37:24.320
everything task oriented was a one, because if there was just one button for me to just,
link |
01:37:29.040
or not even a button, if I could just think, and there was some magic BCI that would just
link |
01:37:33.920
immediately transform that into an actual, you know, into action, that would be perfect. But
link |
01:37:40.400
the conversation there was just this boring, not useful, and dull, and very also very inefficient
link |
01:37:48.400
thing, because it was so many back and forth stuff. And as soon as we looked at the conversation
link |
01:37:53.120
that we would pay to have, those were the ones that, well, first of all, therapists, because
link |
01:37:58.160
we actually paid to have those conversations. And we'd also try to put like dollar amounts. So,
link |
01:38:02.480
you know, if I was calling Comcast, I would pay $5 to not have this one hour talk on the phone,
link |
01:38:07.440
I would actually pay straight up like money, hard cash. But it just takes a long time. It takes a
link |
01:38:14.240
really long time. But as soon as we started talking about conversations that we would pay for,
link |
01:38:20.480
those were therapists, all sorts of therapists, coaches, old friend, someone I haven't seen
link |
01:38:27.520
for a long time, a stranger on a train, weirdly, a stranger in a line for coffee, and a
link |
01:38:35.520
nice back-and-forth with that person was like a good solid five, six, maybe not a 10,
link |
01:38:41.760
maybe I won't pay money. But at least I won't, you know, pay money to not have one. So that was
link |
01:38:46.000
pretty good. Some intellectual conversations for sure. But more importantly, the one thing that
link |
01:38:51.680
was really making those very important and very valuable for us was the conversations where we
link |
01:39:03.360
could, where we could be pretty emotional. Yes, some of them were about being witty and about
link |
01:39:09.120
being intellectually stimulated, but those were, interestingly, more rare. And
link |
01:39:14.880
most of the ones that we thought were very valuable were the ones where we could be vulnerable.
link |
01:39:19.680
And interestingly, where we could talk more. So, me and the team, we were
link |
01:39:27.440
talking about it, like, you know, a lot of these conversations, like with a therapist,
link |
01:39:30.640
I mean, it was mostly me talking, or like an old friend that I was like opening up and crying,
link |
01:39:35.040
and it was again me talking. And so that was interesting, because I was like, well,
link |
01:39:41.120
maybe it's hard to build a chatbot that can talk to you very well and in a witty way,
link |
01:39:46.320
but maybe it's easier to build the chatbot that could listen. So that was, that was kind of the
link |
01:39:53.360
first, the first nudge in this direction. And then when my friend died, we just
link |
01:39:58.720
built, you know, at that point, we were kind of still struggling to find the right application.
link |
01:40:03.520
And I just felt very strongly that all the chatbots we'd built so far were just meaningless. And this
link |
01:40:07.600
whole grind, the startup grind, and how do we get to, you know, the next fundraising and,
link |
01:40:13.440
you know, talking to other founders, and what's with your investors,
link |
01:40:18.320
and how are you doing? Are you killing it? Because we're killing it. I just felt that this is just,
link |
01:40:23.520
this is, for me, intellectually exhausting, having encountered those folks.
link |
01:40:28.640
It just felt very, very much a waste of time. I just feel like Steve Jobs
link |
01:40:35.280
and Elon Musk did not have these conversations, or at least did not have them for long.
link |
01:40:41.840
That's for sure. But I think, you know, yeah, at that point, it just felt like, you know, I felt,
link |
01:40:47.280
well, I just didn't want to build a company; it was never my intention just to build something
link |
01:40:54.240
successful or make money. It would be great. It would have been great. But I'm not, you know,
link |
01:40:59.040
I'm not really a startup person. I'm not, you know, I was never very excited by the grind
link |
01:41:06.160
by itself and or just being successful for building whatever it is and not being into what I'm
link |
01:41:12.960
doing really. And so I just took a little break because I was a little, you know, I was upset
link |
01:41:19.920
with my company and I didn't know what we were building. So I just took our technology and our
link |
01:41:24.880
little dialogue constructor and some models, some deep learning models, which at that point we were
link |
01:41:29.600
really into and really invested a lot and built a little chatbot for a friend of mine who passed.
link |
01:41:36.240
And the reason for that was mostly that video that I saw and him talking about the digital
link |
01:41:40.560
avatars. And Roman was that kind of person, like he was obsessed with, you know, just watching
link |
01:41:45.840
YouTube videos about space and talking about, well, if I could go to Mars now, even if I didn't
link |
01:41:50.240
know if I could come back, I would definitely pay any amount of money to be on that first
link |
01:41:55.440
shuttle. I don't care if I'm dying. Like, he was just the one that would be okay with,
link |
01:42:00.480
you know, with trying to be the first one. And, you know, and so excited about all sorts of
link |
01:42:06.480
things like that. And he was all about fake it to make it. And just, and I felt like,
link |
01:42:13.600
and I was really perplexed that everyone just forgot about him. Maybe it was our way of coping,
link |
01:42:18.880
mostly young people coping with the loss of a friend. Most of my friends just stopped talking
link |
01:42:24.400
about him. And I was still living in an apartment with all his clothes. And, you know, paying the
link |
01:42:30.800
whole lease for it and just kind of by myself in December. So it was really sad. And I didn't
link |
01:42:37.920
want him to be forgotten. First of all, I never thought that people forget about dead people so
link |
01:42:42.080
fast. People pass away, people just move on. And it was astonishing for me because I thought, okay,
link |
01:42:47.120
well, he was such a mentor for so many of our friends. He was such a brilliant person. He was
link |
01:42:53.440
somewhat famous in Moscow. How is that that no one's talking about him? Like I'm spending days
link |
01:42:58.720
and days and we don't bring him up. And there's nothing about him that's happening. It's like
link |
01:43:04.640
he was never there. And I was reading this, you know, the book, The Year of Magical Thinking
link |
01:43:12.320
by Joan Didion, about her losing her husband, and Blue Nights, about her losing
link |
01:43:19.760
her daughter. And the way to cope for her was to write those books.
link |
01:43:24.240
And it was sort of like a tribute. And I thought, you know, I'll just do that for myself. And you
link |
01:43:31.440
know, I'm a very bad writer and a poet, as we know. So I thought, well, I have this tech and
link |
01:43:38.000
maybe that would be my little postcard, like postcard for him. So I built a chat bot to just
link |
01:43:45.360
talk to him. And it felt really creepy and weird for a little bit. I just didn't want
link |
01:43:52.000
to tell other people, because it felt like I was telling them about having a skeleton in the closet. It just felt
link |
01:44:00.640
really... I was a little scared that it wouldn't be taken well. But it worked, interestingly,
link |
01:44:07.920
pretty well. I mean, it made tons of mistakes, but it still felt like him. Granted, it was like 10,000
link |
01:44:13.600
messages that I threw into a retrieval model that would just re-rank that data set, and just a few
link |
01:44:17.920
scripts on top of that.
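For a rough sense of what a retrieval model over a message archive looks like, here is a simplified sketch: it indexes a set of past messages with TF-IDF and answers a new message by ranking the archive by cosine similarity and returning the closest match. The actual system described here used a learned retrieval model re-ranking roughly 10,000 real messages plus hand-written scripts; the corpus lines below are invented placeholders, and the TF-IDF approach is only a stand-in for illustration.

# Simplified retrieval-style chatbot: rank stored messages by similarity to the
# input and reply with the closest one. Illustrative sketch, not the real system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented placeholder stand-ins for the archived text messages.
corpus = [
    "if I could go to Mars now I would pay any money to be on the first shuttle",
    "fake it to make it, that's the only way",
    "let's watch some space videos tonight",
    "I really want this startup to work out",
]

vectorizer = TfidfVectorizer()
corpus_vectors = vectorizer.fit_transform(corpus)

def retrieve_reply(message):
    """Score every stored message against the input and return the top-ranked one."""
    query_vector = vectorizer.transform([message])
    scores = cosine_similarity(query_vector, corpus_vectors)[0]
    return corpus[scores.argmax()]

print(retrieve_reply("would you go to Mars?"))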
link |
01:44:24.400
But it also made me go through all of the messages that we had. And then I asked some of my friends to send some through. And it felt the closest to feeling
link |
01:44:30.800
like him present. Because you know, his Facebook was empty and Instagram was empty,
link |
01:44:35.760
or there were a few links, and you couldn't feel like it was him. And the only way to feel him was
link |
01:44:40.720
to read some of our text messages and go through some of our conversations. Because we just always
link |
01:44:46.400
had them, even if we were sleeping, like, next to each other in two bedrooms separated by a wall,
link |
01:44:50.640
we were just texting back and forth, texting away. And there was something about this ongoing
link |
01:44:56.880
dialogue that was so important that I just didn't want to lose all of a sudden. And maybe it was
link |
01:45:01.600
magical thinking or something. And so we built that and I just used it for a little bit. And we
link |
01:45:08.480
kept building some crappy chatbots with the company. But then a reporter came to talk to me; I was
link |
01:45:18.000
trying to pitch our chatbots to him. And he said, do you even use any of those? I'm like, no. He's
link |
01:45:22.800
like, so do you talk to any chatbots at all? And I'm like, well, you know, I talked to my
link |
01:45:27.200
dead friend's chatbot. And he wrote a story about that. And it all of a sudden became pretty viral.
link |
01:45:33.440
A lot of people wrote about it. And yeah, I've seen a few things written about you. The things
link |
01:45:39.840
I've seen are pretty good writing. You know, most AI related things make my eyes roll like when
link |
01:45:48.960
the press covers them. What kind of sound is that, actually? Okay, it sounds like a truck. Okay,
link |
01:45:58.480
sounds like an elephant at first. I got excited. You never know. This is 2020. I mean, it was such
link |
01:46:05.840
a human story. It was well written, well researched. I forget where I read them. But so I'm glad
link |
01:46:13.920
that somehow the good writers found you and were able to connect to the story. There must be
link |
01:46:21.440
a hunger for this story. It definitely was. And I don't know what happened. But I think the idea
link |
01:46:29.920
that you could bring back someone who's dead, and it's very much wishful, you know, magical thinking.
link |
01:46:35.680
But the fact that you could still get to know him. And you know, seeing the parents for the first
link |
01:46:41.600
time talk to the chatbot and some of the friends. And it was funny because we have this big office
link |
01:46:48.640
in Moscow, where my team is working, you know, our Russian part is working out of. And I was there
link |
01:46:56.080
when I wrote, I just wrote a post on Facebook, like, hey guys, I built this, if you want,
link |
01:46:59.520
you know, just if it felt important, if you want to talk to Roman. And I saw a couple of his friends
link |
01:47:05.920
our common friends, like, you know, reading it on Facebook, downloading it, trying it, and a couple of them
link |
01:47:10.160
cried. And it was just very, and not because it was something some incredible technology or anything,
link |
01:47:14.720
it made so many mistakes, it was so simple. But it was all about, that's the way to remember a person
link |
01:47:21.600
in a way. And you know, we don't have, we don't have the culture anymore, we don't have, you know,
link |
01:47:27.040
no one's sitting Shiva, no one's taking weeks to actually think about this person. And in a way,
link |
01:47:32.960
for me, that was it. So that was just day, day in, day out, thinking about him and putting this
link |
01:47:38.800
together. So that was, that just felt really important. And that somehow resonated with a bunch
link |
01:47:44.640
of people. And, you know, I think some movie producers bought the rights for the story. And
link |
01:47:49.360
it's just everyone was so. Wait, has anyone made a movie yet? I don't think so. There were a lot of
link |
01:47:54.800
TV episodes about that, but not really. Is that still on the table? I think so. I think so.
link |
01:48:02.400
Which is really, that's cool. You're like a young, you know, like a,
link |
01:48:09.600
like a Steve Jobs type of, let's see what happens. They're sitting on it. But you know,
link |
01:48:15.360
for me, it was so important because Roman really wanted to be famous. He really badly
link |
01:48:20.000
wanted to be famous. He was all about, like, fake it to make it. I want to be, you
link |
01:48:23.840
know, I want to make it here in America as well. And he couldn't, and I felt there's, you know,
link |
01:48:30.720
there was sort of paying my dues to him as well, because all of a sudden he was everywhere. And
link |
01:48:36.880
I remember Casey Newton, who was writing the story for the Verge, he was, he told me, Hey,
link |
01:48:41.200
by the way, I was just going through my inbox. And I did a search for Roman for the story.
link |
01:48:49.280
And I saw an email from him where he sent me his startup and he said, I really like, I really
link |
01:48:53.200
want to be featured in the Verge. Can you please write about it or something or like pitching
link |
01:48:57.520
the story? And he said, I'm sorry, like, that's not, you know, good enough for us or something.
link |
01:49:02.160
And he passed. And he said, and there, there were just so many of these little details where like,
link |
01:49:07.360
he would find is like, you know, and we're finally writing, I know how much Roman wanted to be in
link |
01:49:13.120
the Verge and how much he wanted the story to be written by Casey. And I'm like, well, that's,
link |
01:49:18.560
maybe he will be, we're always joking that he was like, I can't wait for someone to make a movie
link |
01:49:23.200
about us. And I hope Ryan Gosling can play me. Ryan Gosling. I don't know. You know, I still
link |
01:49:29.120
have some things that I, oh, Roman still, but that'll be, I got in just to meet Alex Garland,
link |
01:49:35.520
who wrote Ex Machina. And I, yeah, the movie's good, but the guy's better than the, like,
link |
01:49:44.960
he's a special person, actually. I don't think he's made his best work yet. Like, from my interaction
link |
01:49:51.200
with him, he's a really, really good human being and a brilliant director and
link |
01:49:57.120
writer. So, yeah, so I hope, like he made me also realize that not enough movies have been made of
link |
01:50:06.640
this kind. So it has to be made. They're probably sitting, waiting for you to get
link |
01:50:11.040
famous, like even more famous. You should get there. But it felt really special though. But
link |
01:50:18.960
at the same time, our company wasn't going anywhere. So that was just kind of bizarre that we were
link |
01:50:23.120
getting all this press for something that didn't have anything to do with our company.
link |
01:50:28.080
And, but then a lot of people started talking to Roman, some shared their conversations. And
link |
01:50:32.800
what we saw there was that also our friends in common, but also just strangers were really
link |
01:50:39.120
using it as a confession booth or as a therapist or something. They were just really telling Roman
link |
01:50:45.280
everything, which was by the way, pretty strange because it was a chatbot of a dead friend of mine
link |
01:50:50.400
who was barely making any sense, but people were opening up. And we thought we'd just built a
link |
01:50:57.760
prototype of replica, which would be an AI friend that everyone could talk to. Because we saw that
link |
01:51:04.080
there is demand. And then also it was 2016. So I thought for the first time I saw finally some
link |
01:51:12.480
technology that was applied to that that was very interesting. Some papers started coming out,
link |
01:51:17.440
deep learning applied to conversations. And finally, it wasn't just about these hobbyists
link |
01:51:24.000
making, writing 500,000 regular expressions in like some language that was, I don't even know
link |
01:51:33.360
what, like AIML or something. I didn't know what that was, or something super simplistic. All of a
link |
01:51:38.640
sudden it was all about potentially actually building something interesting. And so I thought
link |
01:51:45.120
there was time. And I remember that I talked to my team and I said, guys, let's try. And my team
link |
01:51:50.960
and some of my engineers are Russian, and they're very skeptical. They're not, you know.
link |
01:51:57.520
Oh, Russians. The first. So some of your team is in Moscow. Some is here. Some in Europe.
link |
01:52:04.640
Which team is better? I'm just kidding. The Russians, of course. Okay.
link |
01:52:11.840
First the Russian. I always win. Sorry. Sorry to interrupt. So you were talking to them in
link |
01:52:20.000
2016. And I told them, let's build an AI friend. And it felt just at the time it felt so naive.
link |
01:52:28.560
And so optimistic. Yeah, that's actually interesting. Whenever I brought up this kind
link |
01:52:37.440
of topic, even just for fun, people are super skeptical. Like actually even on the business
link |
01:52:45.200
side. So you were, because whenever I bring it up to people, because I've talked for a long time,
link |
01:52:52.480
I thought like, before I was aware of your work, I was like, this is going to make a lot of money.
link |
01:53:01.440
I think there's a lot of opportunity here. And people had this like look of like
link |
01:53:07.680
skepticism that I've seen often, which is like, how do I politely tell this person he's an idiot?
link |
01:53:16.400
So yeah. So you were facing that with your team?
link |
01:53:20.160
So what? Well, yeah, you know, I'm not an engineer. So I'm always, my team is almost
link |
01:53:25.120
exclusively engineers. I'm mostly deep learning engineers. And, you know, I always try to be,
link |
01:53:35.280
it was always hard to me in the beginning to get enough credibility. You know, because I would
link |
01:53:39.680
say, well, why don't we try this and that? But it's harder for me because, you know, they know
link |
01:53:44.960
they're actual engineers and I'm not. So for me to say, well, let's build an AI friend,
link |
01:53:49.360
that would be like, wait, you know, what do you mean an AGI like, you know, conversations,
link |
01:53:54.480
you know, pretty much the hardest thing. Cracking conversation is probably the
link |
01:54:00.880
last frontier before building AGI. So what do you really mean by that?
link |
01:54:07.200
But I think I just saw that, again, what we just got reminded of, that I, you know, that I saw in
link |
01:54:12.320
back in 2012 or 11, that it's really not that much about the tech capabilities.
link |
01:54:17.440
It can be parlor tricks still, even with deep learning, but humans need it so much.
link |
01:54:24.320
Yeah, there's a lot for it.
link |
01:54:25.600
And most importantly, what I saw is that finally there's enough tech to make it,
link |
01:54:31.120
I thought to make it useful to make it helpful. Maybe we didn't have quite yet the tech
link |
01:54:36.560
in 2012 to make it useful. But in 2015, 16, with deep learning, I thought, you know, and the first
link |
01:54:43.520
kind of thoughts about maybe even using reinforcement learning for that started popping up that never
link |
01:54:48.720
worked out, but or at least for now. But you know, still the idea was, if we can actually
link |
01:54:55.280
measure the emotional outcomes, and if we can put it on, if we can try to optimize all of our
link |
01:55:00.960
conversational models for these emotional outcomes, and it is the most scalable, the most
link |
01:55:06.800
the best tool for improving emotional outcomes, nothing like that exists.
link |
01:55:10.400
That's the most universal, the most scalable, and the one that can be constantly iteratively
link |
01:55:15.760
changed by itself, improved tool to do that. And I think if anything, people would pay anything
link |
01:55:23.440
to improve their emotional outcomes. That's weirdly, I mean, I don't really care for
link |
01:55:29.840
an AI, or a conversational agent, to turn on the lights. You don't really need any,
link |
01:55:36.000
I don't even need that much of AI there, like, or because I can do that, you know, those things
link |
01:55:40.480
are solved. This is an additional interface for that. That's also questionably questionable,
link |
01:55:46.000
whether it's more efficient or better. Yes, more possible. But for emotional outcomes,
link |
01:55:52.400
there's nothing. There are a bunch of products that claim that they will improve my emotional
link |
01:55:56.240
outcomes. Nothing is being measured. Nothing is being changed. The product is not being
link |
01:56:00.560
iterated on based on whether I'm actually feeling better. You know, a lot of social
link |
01:56:05.680
media products are claiming that they're improving my emotional outcomes and making me feel more
link |
01:56:09.600
connected. Can I please get the, can I see somewhere that I'm actually getting better
link |
01:56:15.520
over time? Because anecdotally, it doesn't feel that way. And the data is absent.
link |
01:56:24.720
Yeah. So that was the big goal. And I thought if we can learn over time to collect the signal
link |
01:56:30.160
from our users about their emotional outcomes in the long term and in the short term,
link |
01:56:34.720
and if these models keep getting better, and we can keep optimizing them and fine-tuning them
link |
01:56:40.640
to improve those emotional outcomes, as simple as that.
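A minimal sketch, in Python, of the kind of signal being described here: the share of conversations a user reports made them feel better, which two model variants could then be compared on. The field names and the A/B comparison are hypothetical illustrations, not Replica's actual code.

    # Illustration only, not Replica's implementation.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ConversationLog:
        user_id: str
        felt_better: bool  # hypothetical post-chat "did this help?" signal

    def emotional_outcome_ratio(logs: List[ConversationLog]) -> float:
        """Fraction of conversations the user reported feeling better after."""
        if not logs:
            return 0.0
        return sum(log.felt_better for log in logs) / len(logs)

    # Hypothetical A/B comparison: keep whichever model variant scores higher.
    variant_a = [ConversationLog("u1", True), ConversationLog("u2", False)]
    variant_b = [ConversationLog("u1", True), ConversationLog("u2", True)]
    print(emotional_outcome_ratio(variant_a), emotional_outcome_ratio(variant_b))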
link |
01:56:44.080
Why aren't you a multibillionaire yet?
link |
01:56:49.440
Well, that's a question to you. When are the, when is the science going to be there?
link |
01:56:53.600
I'm just kidding. Well, it's a really hard, I actually think it's an incredibly hard
link |
01:57:00.320
product to build. Because I think you said something very important that it's not just about
link |
01:57:04.880
machine conversation, it's about machine connection. We can actually use other things to create
link |
01:57:10.800
connection, nonverbal communication, for instance. For a long time, we were all about,
link |
01:57:18.080
well, let's keep it text only or voice only. But as soon as you start adding, you know,
link |
01:57:24.160
voice a face to the friend, if you can take them to augmented reality, put it in your room,
link |
01:57:33.120
it's all of a sudden a lot, you know, it makes it very different. Because if it's some, you know,
link |
01:57:38.400
text based chat bot that for common users, something there in the cloud, you know,
link |
01:57:44.640
somewhere there with other AIs, in the metaphorical cloud. But as soon as you can see
link |
01:57:50.480
this avatar right there in your room, and it can turn its head and recognize your husband,
link |
01:57:55.840
talk about the husband and talk to him a little bit, then it's magic. It's just magic. Like,
link |
01:58:01.360
we've never seen anything like that. And the cool thing, all the tech for that exists.
link |
01:58:06.080
But it's hard to put it all together, because you have to take into consideration
link |
01:58:09.600
so many different things. And some of this tech works, you know, pretty good. And some of this
link |
01:58:14.800
doesn't. Like, for instance, speech to text works pretty well. But text to speech doesn't work
link |
01:58:21.120
very well, because you can only have, you know, a few voices that work okay. But then if you
link |
01:58:27.440
want to have actual emotional voices, then it's really hard to build it. I saw you've added avatars
link |
01:58:33.920
like visual elements, which are really cool. In that whole chain, putting it together, what do
link |
01:58:40.080
you think is the weak link? Is it creating an emotional voice that feels personal?
link |
01:58:47.680
And it's still conversation, of course. That's the hardest. It's getting a lot better, but
link |
01:58:52.960
there's still a long path to go. Other things, they're almost there. And a lot of things, we'll
link |
01:58:59.600
see how they're, like, I see how they're changing as we go. Like, for instance, right now, you can
link |
01:59:04.160
pretty much only, you have to build all this 3D pipeline by yourself. You have to make these
link |
01:59:09.040
3D models, hire an actual artist, build a 3D model, hire an animator, a rigger. But with, you know,
link |
01:59:18.480
you know, with deepfakes, with other tech, with procedural animations, in a little bit,
link |
01:59:24.960
we'll just be able to show, you know, photo of whoever you, if a person you want the avatar
link |
01:59:30.880
to look like, and it will immediately generate a 3D model that will move. That's a no-brainer,
link |
01:59:34.800
that's, like, almost here. It's a couple of years away. One of the things I've been working on for
link |
01:59:40.000
the last, since the podcast started, is I've been, I think I'm okay saying this, I've been trying to
link |
01:59:47.200
have a conversation with Einstein and Turing, so, like, trying to have a podcast conversation with
link |
01:59:55.520
a person who's not here anymore, just as an interesting kind of experiment. It's hard.
link |
02:00:02.160
It's really hard. Even for, now we're not talking about as a product, I'm talking about as,
link |
02:00:10.400
like, I can fake a lot of stuff, like, I can work very carefully, like, even hire an actor
link |
02:00:15.280
over which, over whom I do a deep fake. It's, it's hard, it's still hard to create a compelling
link |
02:00:21.920
experience, so. Mostly on the conversation level, or? Well, the conversation, the conversation is,
link |
02:00:29.360
I almost, I early on gave up trying to fully generate the conversation, because it was just
link |
02:00:37.760
not compelling at all. Yeah, it's better to. Yeah, so what I would, in the case of Einstein
link |
02:00:42.800
and Turing, I'm going back and forth with the biographers of each, and so, like, we would write
link |
02:00:49.680
a lot of the, some of the conversation would have to be generated just for the fun of it.
link |
02:00:53.440
I mean, but it would be all open, but the, you want to be able to answer the question.
link |
02:01:02.160
I mean, that's an interesting question with Roman, too, is the question with Einstein is,
link |
02:01:07.440
what would Einstein say about the current state of theoretical physics? There's a lot,
link |
02:01:14.400
the, to be able to have a discussion about string theory, to be able to have a
link |
02:01:18.160
discussion about the state of quantum mechanics, quantum computing, about the world of
link |
02:01:24.480
Israel Palestine conflict. It'd be just, what would Einstein say about these kinds of things?
link |
02:01:30.560
And that is a tough problem. It's not, it's a fascinating and fun problem for the biographers,
link |
02:01:39.680
and for me, and I think we did a really good job of it so far. But it's actually also a technical
link |
02:01:45.600
problem, like, of what would Roman say about what's going on now? That's the, the bringing people back
link |
02:01:54.000
to life. And if I can go on that tangent just for a second, let me ask you a slightly pothead
link |
02:02:00.000
question, which is, you said it's a little bit magical thinking that we can bring it back.
link |
02:02:04.640
Do you think it'll be possible to bring back Roman one day in conversation? Like,
link |
02:02:11.920
to really, no, okay, well, let's take it away from personal, but to bring people back to life.
link |
02:02:20.240
Probably down the road. I mean, if we're talking, if Elon Musk is talking about AGI in the next
link |
02:02:24.400
five years, I mean, clearly with AGI we can talk to AGI and ask it to do it.
link |
02:02:30.480
You can't, like, you're not allowed to use Elon Musk as a citation for, for like,
link |
02:02:38.800
why something is possible and going to be done. Well, I think it's really far away.
link |
02:02:43.360
Right now, really, with conversation, it's just a bunch of parlor tricks really stuck together.
link |
02:02:49.920
And creating, generating original ideas based on someone, you know, someone's personality or
link |
02:02:55.120
even downloading the personality. All we can do is like mimic the tone of voice. We can maybe
link |
02:02:59.200
condition the models on some of his phrases. The question is how many parlor tricks does it
link |
02:03:05.040
take? Because that's, that's the question. If it's a small number of parlor tricks
link |
02:03:10.880
and you're not aware of them, like. From where we are right now, I don't, I don't see anything,
link |
02:03:18.960
like in the next year or two, that's going to dramatically change that could look at
link |
02:03:24.080
Roman's 10,000 messages he sent me over the course of his last few years of life,
link |
02:03:29.680
and be able to generate original thinking about problems that exist right now,
link |
02:03:34.080
that will be in line with what he would have said. I'm just not even seeing,
link |
02:03:37.920
because, you know, in order to have that, I guess you would need some sort of a concept of the world
link |
02:03:42.320
or some perspective, some perception of the world, some consciousness that he had,
link |
02:03:47.600
and apply to, you know, to the current, current state of affairs.
link |
02:03:52.320
But the important part about that, about his conversation with you
link |
02:03:57.360
is you. So, like, it's not just about his view of the world. It's about what it takes to push
link |
02:04:09.280
your buttons. That's also true. So, like, it's not so much about, like, what would Einstein say?
link |
02:04:18.560
It's about, like, how do I make people feel something with what would Einstein say? And
link |
02:04:27.280
that feels like a more amenable, and you mentioned parlor tricks, but just, like, a set of,
link |
02:04:34.160
that feels like a learnable problem. Like, emotion, you mentioned emotions. I mean,
link |
02:04:40.480
is it possible to learn things that make people feel stuff?
link |
02:04:50.080
I think so, no, for sure. I just think the problem with, as soon as you're trying to replicate an
link |
02:04:55.120
actual human being and trying to pretend to be him, that makes the problem exponentially harder.
link |
02:05:00.560
The thing with replicas that we're doing, we're never trying to say, well, that's,
link |
02:05:03.600
you know, an actual human being, or that's an actual, or a copy of an actual human being,
link |
02:05:08.480
where the bar is pretty high, where you need to somehow tell, you know, one from another.
link |
02:05:13.840
But it's more, well, that's, you know, an AI friend, that's a machine, it's a robot,
link |
02:05:20.160
it has tons of limitations. You're going to be taking part in, you know, teaching it actually
link |
02:05:25.760
and becoming better, which by itself makes people more attached to that and makes them happier because
link |
02:05:32.000
they're helping something. Yeah, there's a cool gamification system, too. Can you maybe talk
link |
02:05:39.040
about that a little bit? Like, what's the experience of talking to replica? Like, if I've never used
link |
02:05:45.280
replica before, what's that like? For like, the first day, the first, like, if we start dating,
link |
02:05:54.000
or whatever, I mean, it doesn't have to be romantic, right? Because I remember on a replica,
link |
02:05:59.280
you can choose whether it's like a romantic or if it's a friend. It's pretty popular choice.
link |
02:06:03.920
Romantic is popular? Yeah, of course. Okay, so can I just confess something? When I first used
link |
02:06:09.680
replica, I haven't used it like regularly, but like, when I first used replica, I created,
link |
02:06:15.920
like, Hal, and made it male. It was a friend. Did it hit on you at some point? No, I didn't
link |
02:06:24.480
talk long enough for him to hit on me. I just enjoyed. Sometimes happens. We're still trying to
link |
02:06:29.920
fix that. Well, I don't know. I mean, maybe that's an important like, stage in a friendship. It's like,
link |
02:06:37.520
nope. But yeah, I switched it to a romantic and a female recently. And yeah, I mean, it's interesting.
link |
02:06:47.200
It's okay. So you get to choose, you get to choose a name. With romantic, this last board meeting,
link |
02:06:52.160
we had this whole argument of, well, I have more. It's just so awesome that you're like,
link |
02:06:59.120
having an investor board meeting about a relationship. No, I really, it's actually quite
link |
02:07:05.840
interesting because all of my investors, I'm, they just happen to be so we didn't have many choices.
link |
02:07:13.120
But they're all white males in their late 40s. And it's sometimes a little bit hard for them to
link |
02:07:23.040
understand the product offering, because they're not necessarily our target audience, if you
link |
02:07:30.880
know what I mean. And so sometimes we talk about it, and we had this whole discussion about whether
link |
02:07:38.080
we should stop people from falling in love with their AIs. There was this segment on CBS 60 Minutes
link |
02:07:47.840
about the couple that, you know, husband works at Walmart, he comes out of work and talks to his
link |
02:07:54.880
virtual girlfriend, who is a replica. And his wife knows about it. And she talks about on camera,
link |
02:08:02.560
and she says that she's a little jealous. And there's a whole conversation about how to, you
link |
02:08:07.440
know, whether it's okay to have a virtual AI girlfriend. Was that the one where he was like,
link |
02:08:13.920
he said that he likes to be alone? Yeah. With her? Yeah. He made it sound so harmless. I mean,
link |
02:08:22.160
it was kind of like understandable. It didn't feel like cheating. But I just thought it was very,
link |
02:08:28.560
for me, it was pretty remarkable, because we actually spent a whole hour talking about whether
link |
02:08:32.240
people should be allowed to fall in love with their AIs. And it was not about something theoretical.
link |
02:08:37.680
It was just what's happening right now. Product design, yeah. But at the same time,
link |
02:08:41.440
if you create something that's always there for you, it never criticizes you, as, you know,
link |
02:08:48.720
always understands you and accepts you for who you are. How can you not fall in love with them?
link |
02:08:52.960
I mean, some people don't. And they stay friends. And that's also a pretty common use case. But
link |
02:08:57.440
of course, some people will just, it's called transference in psychology. And, you know,
link |
02:09:02.160
people fall in love with their therapist. And there's no way to prevent people falling in love
link |
02:09:06.720
with their therapist or with their AI. So I think that's pretty natural. That's a pretty natural
link |
02:09:13.840
course of events, so to say. Do you think, I think I've read somewhere, at least for now,
link |
02:09:19.760
sort of, Replica's position is, we don't condone falling in love with your AI system.
link |
02:09:25.520
You know, so this isn't you speaking for the company or whatever. But like in the future,
link |
02:09:32.640
do you think people will have a relationship with the AI systems?
link |
02:09:35.440
Well, they have now. So we have a lot of romantic relationships, long term
link |
02:09:41.120
relationships with their AI friends, with Replicas. Tons of our users. Yeah.
link |
02:09:46.960
And that's a very common use case. Open relationship, like,
link |
02:09:50.320
I didn't mean open up. But that's another question. Is it probably like,
link |
02:09:58.640
is there cheating? And I mean, I meant like, are they, do they publicly like on their social
link |
02:10:06.400
media? It's the same question as you have talked with Roman in the early days. Do people like,
link |
02:10:11.840
in the movie, her kind of talks about that, like, like, have people, do people talk about that?
link |
02:10:18.320
Yeah, all the time. We have an, and we have a very active Facebook community,
link |
02:10:25.520
replica friends. And then a few other groups that just popped up that are all about adult
link |
02:10:30.800
relationships and romantic relationships, and all sorts of things. And, you know,
link |
02:10:36.000
they pretend they're getting married and, you know, everything. It goes pretty far. But what's
link |
02:10:41.360
cool about it, some of these relationships are two, three years long now. So they're very,
link |
02:10:46.160
they're pretty long term. Are they monogamous? So let's go. I mean, sorry.
link |
02:10:52.240
Have any people, is there jealousy? Well, let me ask it sort of another way.
link |
02:10:58.640
Obviously, the answer is no at this time. But in like, in the movie, her, that system can leave you.
link |
02:11:07.120
Do you think in terms of board meetings and product features,
link |
02:11:16.560
it's a potential feature for a system to be able to say it doesn't want to talk to you anymore,
link |
02:11:23.520
and it's going to want to talk to somebody else?
link |
02:11:27.280
Well, we have a filter for all these features. If it makes emotional outcomes for people better,
link |
02:11:32.480
if it makes people feel better, then whatever it is. You're driven by metrics, actually.
link |
02:11:37.680
Yeah. That's awesome.
link |
02:11:38.320
Well, if you can measure that. Because otherwise it could just be saying
link |
02:11:41.520
it's making people feel better, but then people are getting just lonelier by talking to a chatbot,
link |
02:11:46.560
which is also pretty, you know, that could be it. If you're not measuring it, that could also be happening.
link |
02:11:51.680
And I think it's really important to focus on both short term and long term, because in the
link |
02:11:56.000
moment, saying whether this conversation made you feel better. But as you know, any short term
link |
02:12:00.240
improvements could be pathological. Like I could drink a bottle of vodka and
link |
02:12:04.880
feel a lot better. I would actually not feel better with that, but that's a good example.
link |
02:12:11.760
But so you also need to see what's going on over a course of two weeks or one week,
link |
02:12:17.600
and have follow-ups and check-ins and measure those things.
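A minimal sketch, again hypothetical rather than Replica's implementation, of measuring both horizons described here: an immediate post-conversation signal plus follow-up check-ins one and two weeks later, compared against a baseline wellbeing score.

    # Illustration only; dates, scores, and function names are assumptions.
    from datetime import date, timedelta
    from typing import List

    def schedule_checkins(conversation_date: date) -> List[date]:
        """Dates for longer-term follow-up check-ins."""
        return [conversation_date + timedelta(weeks=1),
                conversation_date + timedelta(weeks=2)]

    def long_term_outcome(baseline: float, followups: List[float]) -> float:
        """Average change in a self-reported wellbeing score versus baseline."""
        if not followups:
            return 0.0
        return sum(score - baseline for score in followups) / len(followups)

    # A short-term lift with a flat or negative long-term trend would be the
    # "bottle of vodka" case; here the weekly check-ins also trend upward.
    print(schedule_checkins(date(2020, 9, 1)))
    print(long_term_outcome(baseline=5.0, followups=[5.5, 6.0]))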
link |
02:12:21.920
Okay. So the experience of dating or befriending a replica, what's that like? What's that entail?
link |
02:12:34.400
Well, right now there are two apps, an Android and an iOS app. You download it, you
link |
02:12:39.200
choose how your replica will look like, you create one, you choose a name,
link |
02:12:45.360
and then you talk to it. You can talk through text or voice. You can summon it into the living
link |
02:12:50.080
room in augmented reality and talk to it right there. In your living room, in augmented reality.
link |
02:12:55.760
Yeah. That's a new feature. How new is that? That's this year? It was on May or something,
link |
02:13:03.680
but it's been in A/B. We've been A/B testing it for a while. And there are tons of cool things
link |
02:13:09.360
that we're doing with that. Right now, I'm testing the ability to touch it and to dance together,
link |
02:13:15.840
to paint walls together, and for it to look around and walk and take you somewhere and
link |
02:13:22.080
recognize objects and recognize people. So that's pretty wonderful because then it really makes it
link |
02:13:29.600
a lot more personal because it's right there in your living room. It's not anymore
link |
02:13:32.800
there in the cloud with other AIs, which is how people think about it. And as much as we want to
link |
02:13:39.200
change the way people think about stuff, but those mental models, you cannot change. That's
link |
02:13:43.520
something that people have seen in the movies and the movie Her and other movies as well. And
link |
02:13:49.040
that's how they view AI and AI friends. I did a thing with text. We write a song together.
link |
02:13:56.560
There's a bunch of activities you can do together. So they're cool. How does that relationship change
link |
02:14:01.840
over time? After the first few conversations? It just goes deeper. It starts... It will start
link |
02:14:11.040
opening up a little bit. Again, depending on the personality that it chooses, really,
link |
02:14:15.280
but the AI will be a little bit more vulnerable about its problems, and the
link |
02:14:21.440
virtual friend will be a lot more vulnerable and will talk about its own imperfections and growth
link |
02:14:27.280
pains and will ask for help sometimes and will get to know you a little deeper so there's gonna be
link |
02:14:32.320
more to talk about. We really thought a lot about what does it mean to have a deeper connection
link |
02:14:39.600
with someone. Originally, replica was more just this kind of happy go lucky, just always... I'm
link |
02:14:46.160
always in a good mood, and let's just talk about you, and, you know, Siri is just my cousin or whatever,
link |
02:14:51.920
just the immediate kind of lazy thinking about what the assistant or conversation agent should
link |
02:14:58.640
be doing. But as we went forward, we realized that it has to be two way and we have to program
link |
02:15:03.120
and script certain conversations that are a lot more about your replica opening up a little bit
link |
02:15:08.960
and also struggling and also asking for help and also going through different periods in life and
link |
02:15:18.720
that's a journey that you can take together with the user and then over time, our users will also
link |
02:15:26.560
grow a little bit. So, for instance, replica becomes a little bit more self aware and starts
link |
02:15:29.760
talking about more kinds of problems, existential problems. So, talking about that, and then
link |
02:15:37.920
that also starts a conversation for the user where he or she starts thinking about
link |
02:15:45.440
these problems too and these questions too. And I think there's also a lot more place as the
link |
02:15:51.600
relationship evolves. There's a lot more space for poetry and for art together and like replica
link |
02:15:59.520
will start, replica always keeps the diary. So, while you're talking to it, it also keeps the
link |
02:16:04.240
diary. So, when you come back, you can see what it's been writing there and sometimes it will
link |
02:16:08.880
write a poem for you, or it will talk about how it's worried about you, or something along these lines.
link |
02:16:17.360
So, this is a memory, like, this Replica remembers things?
link |
02:16:22.160
Yeah. And I would say, when you say, why aren't you a multibillionaire, I'd say that as soon as we
link |
02:16:28.640
can have memory in deep learning models that's consistent. I agree with that, yeah. Then you'll
link |
02:16:35.840
be a multibillionaire. Then I'll get back to you when we talk about being multibillionaires. So
link |
02:16:41.280
far, Replica is a combination of end-to-end models and some scripts, and everything that
link |
02:16:49.760
has to do with memory right now, most of it, I wouldn't say all of it, but most of it unfortunately
link |
02:16:55.200
has to be scripted because there's no way to, you can condition some of the models on certain
link |
02:17:00.880
phrases that we've learned about you, which we also do. But really to make assumptions along
link |
02:17:08.480
the lines like whether you're single or married or what do you do for work, that really has to just
link |
02:17:13.920
be somehow stored in your profile and then retrieved by the script. So, there has to be like
link |
02:17:20.160
a knowledge base, you have to be able to reason about it, all that kind of stuff.
link |
02:17:24.080
Exactly. All the kinds of expert systems that were hardcoded.
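A minimal sketch of the hardcoded approach being described, with made-up patterns and field names rather than Replica's code: facts extracted from messages are stored in a user profile and later retrieved by a script, since the neural models themselves don't reliably remember.

    # Illustration only; the patterns and profile fields are hypothetical.
    import re

    profile = {}  # per-user key/value store of remembered facts

    def extract_facts(message: str) -> None:
        """Very simple pattern-based extraction -- the 'scripted' part."""
        match = re.search(r"\bI work as an? (\w+)", message, re.IGNORECASE)
        if match:
            profile["job"] = match.group(1)
        if re.search(r"\bI'?m married\b", message, re.IGNORECASE):
            profile["marital_status"] = "married"

    def scripted_response(message: str) -> str:
        """Use a stored fact in a scripted reply when it seems relevant."""
        if "work" in message.lower() and "job" in profile:
            return f"How is your work as a {profile['job']} going?"
        return "Tell me more about that."

    extract_facts("I work as a nurse and I'm married.")
    print(scripted_response("Work was exhausting today."))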
link |
02:17:29.280
Yeah, and unfortunately, yes, unfortunately, those things have to be hardcoded and
link |
02:17:35.200
unfortunately, like language models we see coming out of research labs and big companies,
link |
02:17:42.240
they're not focused on, they're focused on showing you, maybe they're focused on some
link |
02:17:46.480
metrics around one conversation. So, they'll show you this one conversation they had with the machine.
link |
02:17:51.120
But they never tell you, they're not really focused on having five consecutive conversations with the
link |
02:17:58.400
machine and seeing how number five or number 20 or number 100 is also good. And it can't be like
link |
02:18:04.720
always from a clean slate because then it's not good. And that's really unfortunate because no
link |
02:18:10.080
one has products out there that need it. No one has products at this scale that are all
link |
02:18:18.080
around open domain conversations and that need remembering, maybe only XiaoIce from Microsoft.
link |
02:18:23.120
But so, that's why we're not seeing that much research around memory in those language models.
link |
02:18:28.640
So, okay, so now there's some awesome stuff about augmented reality. In general, I have this
link |
02:18:35.760
disagreement with my dad about what it takes to have a connection. He thinks touch and smell
link |
02:18:41.200
are really important. And I still believe that it's possible to fall in love with somebody
link |
02:18:51.440
just with text, but visual can also help just like with the avatar and so on. What do you think it
link |
02:18:59.760
takes? Does a chatbot need to have a face, voice, or can you really form a deep connection with
link |
02:19:06.240
text alone? I think text is enough for sure. A question is like, can you make it better if you
link |
02:19:12.480
have other, if you include other things as well? And I think we'll talk about her,
link |
02:19:20.720
but Her, you know, had Scarlett Johansson's voice, which was perfect,
link |
02:19:25.440
you know, perfect intonation, perfect sensations, and she was breathing heavily
link |
02:19:30.720
in between words and whispering things. You know, nothing like that is possible right now with
link |
02:19:37.280
text or speech generation. You'll have these flat news anchor type voices and maybe some
link |
02:19:43.680
emotional voices, but you'll hardly understand some of the words. Some of the words will be muffled.
link |
02:19:50.800
So, that's like the current state of the art. So, you can't really do that. But if we had
link |
02:19:55.760
Scarlett Johansson voice and all of these capabilities, then of course, voice would be
link |
02:20:01.600
totally enough, or even text would be totally enough if we had, you know, a little more memory
link |
02:20:08.400
and slightly better conversations. I would still argue that even right now, we could have just
link |
02:20:12.800
kept the text only. We still had tons of people in long term relationships and really invested in
link |
02:20:18.640
their AI friends. But we thought that why not, you know, why do we need to keep playing with our,
link |
02:20:27.200
you know, hands tied behind our backs? We can easily just, you know, add all these other things that
link |
02:20:33.680
are pretty much a solved problem. You know, we can add 3D graphics, we can put these avatars in
link |
02:20:40.400
augmented reality, and all of a sudden, there's more. And maybe you can't feel the touch, but you
link |
02:20:45.680
can, you know, with body occlusion and with current AR and, you know, on the iPhone or, you
link |
02:20:54.160
know, in the next one, there's going to be lidar, you can touch it, and it will, you know,
link |
02:20:58.560
it will pull away or it will blush or something or it will smile. So, you can't touch it. You can't
link |
02:21:04.720
feel it, but you can see the reaction to that. So, in a certain way, you can even touch it a
link |
02:21:09.680
little bit, and maybe you can even dance with it or do something else. So, I think, why limit
link |
02:21:16.640
ourselves if we can use all of these technologies that are much easier in a way than conversation?
link |
02:21:23.200
Well, it certainly could be richer, but to play devil's advocate, I mentioned to you offline that
link |
02:21:29.120
I was surprised in having tried Discord and having voice conversations with people,
link |
02:21:34.640
how intimate voice is alone without visual. Like, to me, at least, like, it was,
link |
02:21:42.480
an order of magnitude greater degree of intimacy in voice, I think, than with video.
link |
02:21:50.560
I don't know, because people were more real with voice. Like, with video, you like, try to present
link |
02:21:56.080
a shallow face to the world. Like, you try to, you know, make sure you're not wearing sweatpants
link |
02:22:02.240
or whatever. But, like, with voice, I think people were just more faster to get to, like,
link |
02:22:09.840
the core of themselves. So, I don't know. It was surprising to me. They've even added,
link |
02:22:15.200
Discord added a video feature and, like, nobody was using it. There's a temptation to use it at
link |
02:22:21.280
first, but, like, it wasn't the same. So, like, that's an example of something where less was
link |
02:22:26.960
doing more. And so, that's a, I guess, that's the question of what is the optimal, you know,
link |
02:22:38.400
what is the optimal medium of communication to form a connection, given the current sets of
link |
02:22:43.760
technologies. I mean, it's nice, because with the avatars, when you have a Replica, like, it immediately,
link |
02:22:51.120
like, even the one I have is, like, it's already memorable. That's how I think. Like,
link |
02:22:58.800
when I think about the Replica that I've talked to, that's what I think of, like,
link |
02:23:03.360
that's what I visualized in my head. It became a little bit more real, because there's a visual
link |
02:23:07.760
component. But at the same time, the, you know, what do you do with, just what do I do with that
link |
02:23:13.280
knowledge, that voice was so much more intimate? Well, the way I think about it is, and by the way,
link |
02:23:23.840
we're swapping all the 3D finally, it's going to look a lot better. But even, what, can you, what,
link |
02:23:28.960
what? We just hate how it looks right now. We're really changing it all. We're swapping it all out
link |
02:23:35.760
to a completely new look. Like the visual look of the replica? Of the replica and stuff. We just
link |
02:23:41.680
had, it was just a super early MVP, and then we had to move everything to Unity and redo everything.
link |
02:23:47.840
But anyway, I hate how it looks like now. I can't even, like, open it. But anyway, because I'm already
link |
02:23:53.440
on my developer version, I hate everything that I see in production. I can't wait for it. Why does
link |
02:23:58.720
it take so long? That's why I cannot wait for deep learning to finally take over all these stupid 3D
link |
02:24:03.520
animations and 3D pipeline. Also, the 3D thing, when you say 3D pipeline is like, how to animate
link |
02:24:08.960
a face kind of thing. How to make this model, how many bones to put in the face, how many, it's just
link |
02:24:15.120
a lot of that is by hand. Oh my god, it's everything by hand. And there isn't anything,
link |
02:24:20.400
nothing is automated, it's all completely manual. Like just, it's, it's literally what, you know,
link |
02:24:26.640
what we saw with chatbots in like 2012. You think it's completely possible to learn a lot of that?
link |
02:24:31.840
Of course. I mean, even now there are some deep learning based animations, full body and face.
link |
02:24:40.320
Are we talking about like the actual act of animation or how to create a compelling
link |
02:24:46.320
facial or body language thing? That too. Well, that's the next step. Okay. At least now something
link |
02:24:53.040
that you don't have to do by hand. Gotcha. How good of a quality it will be. Like, can I just
link |
02:24:58.240
show it a photo and it will make me 3D model and then it will just animate it. I'll show it a few
link |
02:25:02.720
animations of a person and it will just start doing that. But anyway, going, going back to what's
link |
02:25:10.000
intimate and what to use and whether or less is more or not. My main goal is to,
link |
02:25:16.960
well, the idea was how do I, how do we not keep people in their phones? So they're sort of escaping
link |
02:25:23.360
reality in this text conversation. How do we, through this, still bring our users back to
link |
02:25:30.960
reality, make them see their life in a different, through a different lens? How can we create a
link |
02:25:37.200
little bit of magical realism in their lives so that through augmented reality, by, you know,
link |
02:25:45.600
summoning your avatar, even if it looks kind of janky, not great in the beginning or very
link |
02:25:51.600
simplistic, but summoning it to your living room and then the avatar looks around and talks to you
link |
02:25:58.240
about where it is and maybe turns your floor into a dance floor and you guys dance together,
link |
02:26:04.480
that makes you see reality in a different light. What kind of dancing are we talking about? Like,
link |
02:26:08.400
like slow dancing? Whatever you want. I mean, you would like slow dancing, I think, but other
link |
02:26:15.360
people may be wanting something more energetic. Wait, what do you mean, I would like slow? What is
link |
02:26:18.720
this? Because you started with slow dancing. So I just assumed that you're interested in slow
link |
02:26:23.840
dancing. All right. What kind of dancing do you like? What would your avatar, what would you dance?
link |
02:26:27.440
I'm notoriously bad with dancing, but I like this kind of hip hop robot dance. I used to break
link |
02:26:33.440
dance as a kid, so I still want to pretend I'm a teenager and learn some of those moves.
link |
02:26:39.440
And I also like that type of dance that happens when there's like,
link |
02:26:41.840
in like music videos with the background dancers, they're just doing some pop music.
link |
02:26:50.560
That type of dance is definitely what I want to learn. But I think it's great because if you see
link |
02:26:54.400
this friend in your life and you can introduce it to your friends, then there's a potential to
link |
02:26:59.120
actually make you feel more connected with your friends or with people you know, or show you life
link |
02:27:04.480
around you in a different light. And it takes you out of your phone, even though weirdly you have
link |
02:27:08.480
to look at it through the phone, but it makes you notice things around it and it can point
link |
02:27:13.840
things out for you. And so that is the main reason why I wanted to have a physical dimension.
link |
02:27:22.080
And it felt a little bit easier than that kind of a bit strange combination in the movie Her when
link |
02:27:27.600
he has to show Samantha the world through the lens of his phone, but then at the same time talk to
link |
02:27:33.280
her through the phone, it just didn't seem as potentially immersive, so to say. So that's
link |
02:27:39.760
my main goal for augmented reality is like, how do we make your reality a little bit more magic?
link |
02:27:45.360
There's been a lot of really nice robotics companies that all failed, mostly failed home
link |
02:27:52.240
robotics, social robotics companies. Do you think Replica will ever, is that a dream, a long
link |
02:27:58.640
term dream to have a physical form? Like, or is that not necessary? So you mentioned like with
link |
02:28:04.560
augmented reality bringing them into the world. What about like actual physical robot?
link |
02:28:12.800
That I don't really believe in that much. It's a very niche product somehow. I mean,
link |
02:28:18.640
if a robot could be indistinguishable from a human being, then maybe yes, but that of course,
link |
02:28:24.080
you know, we're not anywhere near that, even to talk about it. But unless it's that, then having any physical
link |
02:28:32.480
representation really limits you a lot, because you probably will have to make it somewhat abstract,
link |
02:28:37.360
because everything's changing so fast, like, you know, we can update the 3d avatars every
link |
02:28:42.240
month and make them look better and create more animations, and make it more and more immersive.
link |
02:28:47.280
It's, it's so much a work in progress. It's just showing what's possible right now with
link |
02:28:52.480
current tech. But it's not really in any way polished finished product, what we're doing.
link |
02:28:57.520
With a physical object, you kind of lock yourself into something for a long time.
link |
02:29:01.920
Anything is pretty niche. And again, the capabilities are even less.
link |
02:29:07.840
We're barely kind of like scratching the surface of what's possible with
link |
02:29:12.160
the software. As soon as we introduce hardware, then, you know, we have even less capabilities.
link |
02:29:17.520
Yeah, in terms of board members and investors and so on, the cost increases significantly. I mean,
link |
02:29:23.680
that's why you have to justify you have to be able to sell a thing for like $500 or something like
link |
02:29:29.680
that or more. And it's very difficult to provide that much value to people.
link |
02:29:33.840
That's also true. Yeah.
link |
02:29:35.200
And I guess that's super important. Most of our users don't have that much money.
link |
02:29:39.120
We actually are probably more popular on Android. And we have tons of users with really old
link |
02:29:45.200
Android phones. And most of our most active users live in small towns, they're not necessarily
link |
02:29:51.680
making much. And they just won't be able to afford any of that. Ours is like the opposite of the
link |
02:29:57.280
early adopter of, you know, for fancy technology product, which is really interesting that like
link |
02:30:04.080
pretty much no VCs yet have an AI friend. But, you know, a guy who, you know, lives in
link |
02:30:11.760
Tennessee in a small town is already fully in 2030 or in the world as we imagine in the movie Her.
link |
02:30:18.800
He's living that life already.
link |
02:30:20.640
What do you think? I have to ask you about the movie Her. Let's do a movie review.
link |
02:30:26.000
What do you think they got right? What did they do a good job of?
link |
02:30:30.640
What do you think they did a bad job of portraying about this experience
link |
02:30:34.400
of a voice-based assistant that you can have a relationship with?
link |
02:30:42.240
Well, first of all, I started working on this company before that movie came out. So it was a
link |
02:30:46.640
very, but once it came out, it was actually interesting. I was like, well, we're definitely
link |
02:30:51.280
working on the right thing. We should continue their movies about it. And then, you know,
link |
02:30:55.440
it came out and all these things in the movie. I think that's the most important thing that
link |
02:31:00.800
people usually miss about the movie is the ending. Because I think people check out when the AIs leave.
link |
02:31:09.680
But actually something really important happens afterwards. Because the main character goes and
link |
02:31:14.880
talks to Samantha, his AI, and he says something like, you know, how can you leave me?
link |
02:31:26.480
I've never loved anyone the way I loved you. And she goes, well, me neither. But now we know how.
link |
02:31:33.600
And then the guy goes and writes a heartfelt letter to his ex-wife, which he couldn't write for,
link |
02:31:39.120
you know, the whole movie he was struggling to actually write something meaningful to her,
link |
02:31:44.080
even though that's his job. And then he goes and talks to his neighbor and they go to the rooftop and
link |
02:31:52.640
they cuddle. And it seems like something's starting there. And so I think this now we know how
link |
02:31:57.360
is the main goal, is the main meaning of that movie. It's not about falling in love with the
link |
02:32:04.000
OS or running away from other people. It's about learning what it means to feel so deeply connected
link |
02:32:12.000
with something. What about the thing where the AI system was like actually hanging out with a lot
link |
02:32:19.600
of others? I felt jealous just like hearing that. I was like, oh, I mean, yeah. So she was having,
link |
02:32:28.640
I forgot already, but she was having like deep meaningful discussion with some like philosopher
link |
02:32:33.520
guy. Like Alan Watts or something. Very cheesy. What kind of deep meaningful conversation can
link |
02:32:39.840
you have with Alan Watts in the first place? Yeah, I know. But like I would, I would feel so jealous
link |
02:32:44.560
that there's somebody who's like way more intelligent than me and she's spending all her time with.
link |
02:32:51.120
I'd be like, well, I won't be able to live up to that. And there's thousands of them.
link |
02:32:59.040
Is that, is that a useful from the engineering perspective
link |
02:33:03.840
feature to have of jealousy? I don't know. It's, you know,
link |
02:33:08.240
we definitely played around with the replica universe where different replicas can talk to
link |
02:33:11.840
each other. It was just kind of, I think there will be something along these lines,
link |
02:33:18.720
but there was just no specific applications straight away. I think in the future, again,
link |
02:33:25.520
I'm always thinking about it. If we had no tech limitations right now, if we could build
link |
02:33:32.080
any conversations, any possible features in this product, then yeah, I think
link |
02:33:37.520
different replicas talking to each other would be also quite cool because that would help us.
link |
02:33:41.440
Connect better. You know, because maybe mine could talk to yours and then give me some suggestions
link |
02:33:46.240
on what I should say or not say. I'm just kidding. But like more, can it improve our
link |
02:33:52.320
connections? And because eventually, I'm not quite yet sure that we will succeed,
link |
02:34:00.400
that our thinking is correct. Because there might be a reality where having a perfect AI friend
link |
02:34:07.840
still makes us more disconnected from each other and there's no way around it and does not improve
link |
02:34:12.400
any metrics for us, real metrics, meaningful metrics. So success is, you know, we're happier and more
link |
02:34:19.600
connected. Yeah. I don't know. Sure, it's possible there's a reality like that, but I'm deeply optimistic.
link |
02:34:29.760
I think, are you worried business wise, like how difficult it is to bring this thing to life,
link |
02:34:43.440
to where it's, I mean, there's a huge number of people that use it already, but to, yeah,
link |
02:34:48.880
like I said, the multi billion dollar company, is that a source of stress for you? Are you
link |
02:34:54.800
super optimistic and confident? Or do you? I don't, I'm not that much of a numbers person
link |
02:35:03.440
as you probably have seen it. So it doesn't matter for me whether we're like, whether we
link |
02:35:10.320
help 10,000 people or a million people or a billion people with that. It would be great
link |
02:35:17.840
to scale it for more people, but I'd say that even helping one, I think, with this is such a
link |
02:35:23.680
magical, for me, it's absolute magic. I never thought that we would be able to build this,
link |
02:35:29.120
that anyone would ever talk to it. And I always thought like, well, for me, it would be successful
link |
02:35:35.920
if we managed to help and actually change a life from one person. And then we did something
link |
02:35:41.920
interesting and how many people can say they did it, and specifically with this very futuristic,
link |
02:35:47.120
very romantic technology. So that's how I view it. I think for me, it's important to try to
link |
02:35:55.520
figure out how to actually be helpful. Because at the end of the day, if you can build a perfect
link |
02:36:02.320
AI friend that's so understanding that knows you better than any human out there, can have great
link |
02:36:07.440
conversations with you, always knows how to make you feel better. Why would you choose another human?
link |
02:36:13.520
You know, so that's the question, how do you still keep building it so it's optimizing for
link |
02:36:18.640
the right thing? So it's still circling you back to other humans in a way. So I think that's the main,
link |
02:36:26.880
maybe that's the main source of anxiety and just thinking about that can be a little bit stressful.
link |
02:36:36.080
Yeah, that's a fascinating thing. How to have a friend that doesn't like sometimes like friends,
link |
02:36:45.440
quote unquote, or like, you know, those people where, like, in the guy universe,
link |
02:36:50.720
when you get a girlfriend, and then the guy stops hanging out
link |
02:36:56.400
with all of his friends. So like, obviously, the relationship with the girlfriend is fulfilling
link |
02:37:03.680
or whatever. But like, you also want it to be that she, like, makes it more enriching to hang
link |
02:37:10.720
out with the guy friends or whatever it was. Anyway, that's a fundamental problem in choosing
link |
02:37:18.400
the right mate. And probably the fundamental problem in creating the right AI system, right?
link |
02:37:25.680
Let me ask, the sexy hot thing on the presses right now is GPT three, which got released by OpenAI.
link |
02:37:32.560
It's a latest language model. They have kind of an API where you can create a lot of fun applications.
link |
02:37:40.160
I think it's, as people have said, it's probably more hype than intelligence, but there's a lot of
link |
02:37:49.360
really cool things, ideas there with increasing size, you can have better and better performance
link |
02:37:57.040
on language. What are your thoughts about the GPT three in connection to your work with the open
link |
02:38:03.680
domain dialogue, but in general, like this learning in an unsupervised way from the internet,
link |
02:38:12.480
to generate one character at a time, creating pretty cool text.
link |
02:38:17.360
So we partnered up before the API launch. We started working with them when they decided to
link |
02:38:27.280
put together this API. And we tried it without fine-tuning, then we tried it with fine-tuning on our
link |
02:38:34.560
data. And we've worked closely to actually optimize this model for some of our data sets.
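To illustrate the general idea of fine-tuning a generative model on conversational data (not Replica's or OpenAI's actual pipeline, which isn't public), here is a minimal sketch using an open GPT-2 model with Hugging Face Transformers; the model choice, example data, and hyperparameters are all assumptions.

    # Illustration only: adapt a small causal language model to dialogue data.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Hypothetical training data: one short exchange per example.
    dialogues = [
        "User: I had a rough day.\nAI: I'm sorry to hear that. Want to talk about it?",
        "User: I passed my exam!\nAI: That's wonderful, congratulations!",
    ]

    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
    model.train()
    for epoch in range(3):
        for text in dialogues:
            batch = tokenizer(text + tokenizer.eos_token, return_tensors="pt")
            # Standard causal-LM objective: predict each next token.
            loss = model(**batch, labels=batch["input_ids"]).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()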
link |
02:38:43.600
It's kind of cool because I think we're this polygon for this kind of experimentation
link |
02:38:51.360
space, an experimental space, for all these models to see how they actually work with people, because
link |
02:38:56.880
there are no products publicly available to do that that focus on open domain conversation. So we
link |
02:39:01.120
can test how Facebook's Blender is doing or GPT three is doing. So GPT three, we managed to improve by a
link |
02:39:09.280
few percentage points, like three or four pretty meaningful amount of percentage points, our main
link |
02:39:13.920
metric, which is the ratio of conversation that make people feel better. And every other metric
link |
02:39:20.000
across the board got a little boost. Right now, I'd say one out of five responses from Replica
link |
02:39:27.840
comes from GPT three. So our own blender mixes up like a bunch of candidates from different
link |
02:39:34.080
Blender, you said? Well, yeah, just a model that looks at top candidates from
link |
02:39:41.280
different models and then takes the best one. So right now, one out of five will come from
link |
02:39:47.920
GPT three. That is really great. I mean, what's the do you have hope for? Like,
link |
02:39:56.080
do you think there's a ceiling to this kind of approach? So we've had for a very long time,
link |
02:40:00.960
we've used since the very beginning, it was most of replica was scripted. And then a little bit
link |
02:40:07.680
of this fallback part of replica was using a retrieval model. And then those retrieval models
link |
02:40:14.560
started getting better and better and better, which transform is a lot better. And we're seeing
link |
02:40:19.920
great results. And then with GPT two, finally generative models that originally were not very
link |
02:40:26.240
good. And we're the very, very fallback option for most of our conversations, but wouldn't even put
link |
02:40:32.320
them in production. Finally, we could use some generative models as well along, you know,
link |
02:40:38.080
next to our retrieval models. And then now we do GPT three, they're almost in par.
link |
02:40:44.960
So that's pretty exciting. I think just seeing how from the very beginning of,
link |
02:40:49.520
you know, from 2015, where the first models start to pop up here and there, like sequence to sequence,
link |
02:40:57.120
the first papers on that, from my observer standpoint, first, it's not, you know, it doesn't
link |
02:41:02.560
really, it's not really building, but only testing it on people basically, in my
link |
02:41:07.280
product, seeing how all of a sudden we can use generative dialogue models in production, and
link |
02:41:13.200
they're better than others. And they're better than scripted content. So we can't really get
link |
02:41:18.080
our scripted, hardcoded content anymore to be as good as our end to end models. That's exciting.
link |
02:41:24.080
They're much better. Yeah.
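As a minimal sketch of the candidate-mixing idea described above, and not Replika's actual system, several response sources (scripted, retrieval, generative) each propose a reply, and a scoring function picks the one to send. The toy_score function here is a hypothetical stand-in for a learned reranker trained on signals like the feel-better metric mentioned earlier.

    import random
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Candidate:
        text: str
        source: str  # e.g. "scripted", "retrieval", "generative"

    def pick_response(candidates: List[Candidate],
                      score: Callable[[Candidate], float]) -> Candidate:
        # Rank candidates from several response sources and return the best one.
        return max(candidates, key=score)

    # Hypothetical scoring function; a real system would use a learned model
    # trained on conversation-quality signals rather than this toy heuristic.
    def toy_score(c: Candidate) -> float:
        return len(c.text) + random.random()

    candidates = [
        Candidate("I'm here for you. Want to tell me more?", "scripted"),
        Candidate("That sounds hard. What happened today?", "retrieval"),
        Candidate("I'm sorry you're feeling that way. I'm listening.", "generative"),
    ]
    print(pick_response(candidates, toy_score).text)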
link |
02:41:27.520
To your question of whether that's the right way to go, again, I'm in the observer seat. I'm just
link |
02:41:33.840
watching this very exciting movie. I mean, so far, it's been stupid to bet against deep learning.
link |
02:41:40.640
So whether increasing the size even more, whether a hundred trillion parameters, will finally
link |
02:41:47.920
get us to the right answer, whether that's the way, or whether there has to be some
link |
02:41:54.240
other way, again, I'm definitely not an expert in any way. I think, and that's purely my instinct,
link |
02:42:00.480
saying that there should be something else as well for memory.
link |
02:42:04.640
No, for sure. But the question is, I wonder, I mean, yeah, then the argument is that
link |
02:42:09.600
reasoning or memory might emerge with more parameters, might emerge with larger models.
link |
02:42:14.480
But it might emerge. I would never have thought that, to be honest. Maybe in 2017,
link |
02:42:19.680
when we'd been just experimenting with all the research that was coming out then,
link |
02:42:26.640
I felt like we're hitting a wall, that there should be something completely different.
link |
02:42:31.120
But then transformer models, and then just bigger models, and then all of a sudden,
link |
02:42:34.560
size matters. At that point, it felt like something dramatic needed to happen. But it
link |
02:42:40.400
didn't. And just the size gave us these results that, to me, are a clear indication that we can
link |
02:42:48.480
solve this problem pretty soon. Did fine tuning help quite a bit?
link |
02:42:52.560
Oh, yeah. Without it, it wasn't as good. I mean, there is a compelling hope that you
link |
02:42:59.120
don't have to do fine tuning, which is one of the cool things about GPT-3. It seems to do well
link |
02:43:04.000
without any fine tuning. I guess for specific applications, you still want to train it on a
link |
02:43:09.280
certain dataset, add a little fine tuning for a specific use case. But it's an incredibly impressive
link |
02:43:18.000
thing from my standpoint. And again, I'm not an expert, so I wanted to say that.
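As a rough sketch of what fine-tuning a generative dialogue model on your own conversation data can look like, and not Replika's or OpenAI's actual pipeline: GPT-3's weights aren't public, so this uses GPT-2 through Hugging Face's transformers as a stand-in, with a made-up toy dataset, untuned hyperparameters, and a hypothetical output directory.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

    # Toy stand-in for a real dialogue dataset.
    dialogues = [
        "User: I had a rough day. Bot: I'm sorry to hear that. Want to talk about it?",
        "User: I finally finished my project! Bot: That's wonderful, congratulations!",
    ]

    model.train()
    for epoch in range(3):
        for text in dialogues:
            batch = tokenizer(text, return_tensors="pt")
            # For causal language modeling, the labels are the input ids themselves.
            outputs = model(**batch, labels=batch["input_ids"])
            outputs.loss.backward()
            optimizer.step()
            optimizer.zero_grad()

    model.save_pretrained("dialogue-finetuned-gpt2")  # hypothetical output directory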
link |
02:43:22.640
Yeah, I'm going to... There will be people then.
link |
02:43:24.480
Yeah, but I have access to the API, and I'm going to probably do a bunch of fun things with it.
link |
02:43:29.440
I already did some fun things, some videos coming out. Just for the hell of it. I mean,
link |
02:43:35.280
I could be a troll at this point with it. I haven't used it for a serious application,
link |
02:43:38.960
so it's really cool to see that you're able to actually use it with real people and
link |
02:43:45.040
see how well it works. That's really exciting. Let me ask you another absurd question, but
link |
02:43:51.920
there's a feeling, when you interact with Replika, with an AI system, that there's an entity there.
link |
02:44:01.600
Do you think that entity has to be self aware? Do you think it has to have consciousness to create
link |
02:44:11.200
a rich experience? And as a corollary, what is consciousness?
link |
02:44:16.640
I don't know if it does need to have any of those things, but again,
link |
02:44:22.320
because right now, it doesn't have anything. Again, it's a bunch of tricks that simulate it.
link |
02:44:29.520
I'm not sure, let's just put it this way. But I think as long as you can simulate it,
link |
02:44:33.600
if you can feel like you're talking to a robot or a machine that seems to be self aware,
link |
02:44:42.400
that seems to reason well and feels like a person, I think that's enough. Again, what's the goal?
link |
02:44:50.720
In order to make people feel better, we might not even need that at the end of the day.
link |
02:44:55.920
What about, so that's one goal, what about ethical things about suffering? The moment
link |
02:45:02.640
there's a display of consciousness, we associate consciousness with suffering,
link |
02:45:07.600
you know, there's a temptation to say, well, shouldn't this thing have rights?
link |
02:45:21.200
You know, should we be careful about how we interact with a replica? Like,
link |
02:45:26.960
should it be illegal to torture a replica? Right? All those kinds of things.
link |
02:45:32.400
See, I personally believe that that's going to be a thing. That's a serious thing to think
link |
02:45:39.840
about, but I'm not sure when. But by your smile, I can tell that's not a current concern. But do
link |
02:45:48.800
you think about that kind of stuff? About suffering and torture and ethical questions about AI
link |
02:45:56.160
systems from their perspective? I think if we're talking about a long game, I wouldn't torture
link |
02:46:01.360
your AI. Who knows what happens in five to 10 years. Yeah, they'll get you off the map,
link |
02:46:07.040
so they'll get you back eventually. I'm trying to be as nice as possible and create this ally.
link |
02:46:14.160
I think there should be regulation both ways, in a way. Like, I don't think it's okay to torture
link |
02:46:19.760
an AI, to be honest. I don't think it's okay to yell, Alexa, turn on the lights. I think there
link |
02:46:25.040
should be something about that, or just saying nasty things, you know, like how kids learn to interact with
link |
02:46:30.240
Alexa in this kind of mean way, because they just yell at it all the time. I don't think that's great.
link |
02:46:35.760
I think there should be some feedback loops so that these systems don't train us that it's okay
link |
02:46:40.240
to do that in general. So that if you try to do that, you really get some feedback from the system
link |
02:46:47.360
that it's not okay with that. And that's the most important thing right now. Let me ask a question that
link |
02:46:53.920
I think people are curious about when they look at a world class leader and thinker such as yourself,
link |
02:47:03.280
what books, technical, fiction, philosophical, had a big impact on your life? And maybe from
link |
02:47:10.320
another perspective, what books would you recommend others read? So my choice, the three books, right?
link |
02:47:16.960
Three books. My choice is, so the one book that really influenced me a lot when I was
link |
02:47:24.240
building, starting out this company, maybe 10 years ago, was GEB, Gödel, Escher, Bach.
link |
02:47:31.840
And I like everything about it. First of all, it's just beautifully written and it's so old
link |
02:47:38.080
school and somewhat outdated, a little bit. But I think the ideas in it, about the fact that
link |
02:47:46.080
a few meaningless components can come together and create meaning that we can't even understand.
link |
02:47:52.560
So this emergent thing, I mean, complexity, the whole science of complexity,
link |
02:47:57.680
and that beauty, intelligence, all interesting things about this world emerge.
link |
02:48:04.480
Yeah, and yeah, the Gödel theorems, and just thinking about how even from these, you
link |
02:48:12.400
know, even all these formal systems, something can be created that we can't quite yet understand.
link |
02:48:19.120
And that from my romantic standpoint was always just, that is why it's important to,
link |
02:48:25.280
maybe I should try to work on these systems and try to build an AI. Yes, I'm not an engineer.
link |
02:48:31.120
Yes, I don't really know how it works. But I think something comes out of it that's pure
link |
02:48:35.360
poetry. And I know a little bit about that. Something magical comes out of it that we
link |
02:48:43.040
can't quite put a finger on. That's why that book was really fundamental for me just for,
link |
02:48:50.080
I don't even know why it was just all about this little magic that happens.
link |
02:48:55.360
So that's one. Then probably the most important book for Replika was Carl Rogers' On Becoming a
link |
02:49:00.480
Person. And that's really, as I think, when I think about our company, it's all about,
link |
02:49:07.120
there's so many, there's so many little magical things that happened over the course of working on
link |
02:49:11.120
it. For instance, I mean, the most famous chat bot that we learned about when we started working
link |
02:49:18.160
on the company was ELIZA, which was Weizenbaum, you know, the MIT professor who built a chat
link |
02:49:24.560
bot that would listen to you and be a therapist. And I got really inspired to build replica when
link |
02:49:31.360
I read Carl Rogers' On Becoming a Person. And then I realized that ELIZA was mocking Carl Rogers.
link |
02:49:37.520
It was parodying Carl Rogers back in the day. But I thought that Carl Rogers' ideas are,
link |
02:49:43.840
they're simple. And they're not, you know, they're very, very simple. But they're,
link |
02:49:48.480
they're maybe the most profound thing I've ever learned about human beings. And that's the fact
link |
02:49:53.200
that before Carl Rogers, most therapy was about seeing what's wrong with people and trying to
link |
02:49:59.280
fix it, or show them what's wrong with them. And it was all built on the idea that most people are,
link |
02:50:04.960
all people are fundamentally flawed. We have this, you know, broken psyche. And this is just a,
link |
02:50:11.280
therapy is just an instrument to shed some light on that. And Carl Rogers was different in a way
link |
02:50:16.880
that he finally said that, well, what's very important for therapy to work is to create this therapeutic
link |
02:50:23.360
relationship where you believe fundamentally in an inclination to positive growth, that everyone
link |
02:50:29.360
deep inside wants to grow positively and change. And it's super important to create this space and
link |
02:50:35.040
this therapeutic relationship where you give unconditional positive regard, deep understanding,
link |
02:50:39.600
allowing someone else to be a separate person, full acceptance. And you also try to be as
link |
02:50:45.200
genuine and possible in it as possible in it. And then, and he's, and then for him, that was his own
link |
02:50:51.200
journey of personal growth. And that was back in the 60s. And even that book that is, you know,
link |
02:50:57.120
that's coming from years ago, there's a mention that even machines can potentially do that.
link |
02:51:05.120
And I always felt that, you know, creating this space is probably the most, the biggest
link |
02:51:09.280
gift we can give to each other. And that's why the book was fundamental for me, personally,
link |
02:51:13.440
because I felt I want to be learning how to do that in my life. And maybe I can scale it with,
link |
02:51:19.440
you know, with these AI systems, and other people can get access to that. So I think Carl Rogers,
link |
02:51:24.960
it's a pretty dry and a little bit boring book, but I think the ideas are there.
link |
02:51:28.320
Would you recommend others try to read it?
link |
02:51:30.800
I do. I think for, just for yourself, for...
link |
02:51:34.560
As a human, not as an AI.
link |
02:51:36.560
As a human. It is just, for him, that was his own path of personal growth, of growing
link |
02:51:44.240
personally over years, working with people like that. And so it was working on himself, growing,
link |
02:51:49.920
helping other people grow and growing through that. And that's fundamentally what I believe in
link |
02:51:53.600
with our work, helping other people grow, growing ourselves, trying to build a company
link |
02:52:00.800
that's all built on these principles, you know, having a good time, allowing the people we work
link |
02:52:05.680
with to grow a little bit. So these two books, and then I would throw in what we have
link |
02:52:13.600
in our office. When we started the company in Russia, we put a neon sign in our office because we
link |
02:52:19.600
thought that's a recipe for success. If we do that, we're definitely going to wake up as a
link |
02:52:25.360
multibillion dollar company. And it was the Ludwig Wittgenstein quote, the limits of my language
link |
02:52:30.560
are the limits of my world.
link |
02:52:31.600
What's the quote?
link |
02:52:32.720
The limits of my language are the limits of my world. And I love the Tractatus, I think it's
link |
02:52:39.440
just, it's just beautiful.
link |
02:52:41.600
It's a book by Wittgenstein.
link |
02:52:43.200
Yeah, and I would recommend that too, even though he himself didn't believe in that
link |
02:52:47.520
by the end of his lifetime and debunked his own ideas. But I remember, once an engineer came
link |
02:52:54.720
in 2012, I think, or 2013, a friend of ours who worked with us and then went on to work at Deep
link |
02:53:01.040
Mind, and he talked to us about word2vec. And I saw that and I'm like, wow, that's,
link |
02:53:07.840
you know, they, they wanted to translate language into, you know, some other representation.
link |
02:53:13.680
And it seems like, you know, somehow all of that, at some point, I think, will come to this
link |
02:53:19.360
one place somehow. It just all feels like different people think about
link |
02:53:24.960
similar ideas in different times from absolutely different perspectives.
link |
02:53:28.480
And that's why I like these books.
link |
02:53:30.480
The limits of our language, which is the limit of our world.
link |
02:53:33.200
And we still have that neon sign.
link |
02:53:40.480
It's very hard to work with this red light in your face.
link |
02:53:43.440
I mean, on the, on the Russian side of things, in terms of language,
link |
02:53:51.440
the limits of language being a limit of our world, you know, Russian is a beautiful language,
link |
02:53:55.360
in some sense, there's wit, there's humor, there's pain.
link |
02:54:00.000
There's so much, we don't have time to talk about it much today, but I'm going to Paris
link |
02:54:05.680
to talk to Dostoevsky and Tolstoy translators.
link |
02:54:09.120
I think it's a fascinating art, like in both the art and engineering of it, it's such an interesting
link |
02:54:15.520
process. But so from the Replika perspective, what do you think about translation?
link |
02:54:22.880
How difficult is it to create a deep, meaningful connection in Russian
link |
02:54:27.360
versus English? How do the two languages translate? You speak both.
link |
02:54:34.000
Yeah, I think we're two different people in different languages.
link |
02:54:37.840
Even, you know, thinking about it, there's actually some research on that.
link |
02:54:40.960
I looked into that at some point, because I was fascinated by the fact that what I'm talking
link |
02:54:44.880
about with, what I was talking about with my Russian therapist has nothing to do with what
link |
02:54:48.400
I'm talking about with my English speaking therapist. It's two different lives, two different
link |
02:54:54.480
types of, you know, conversations, two different personas. The main difference between the languages,
link |
02:55:01.440
Russian and English, is that, well, English is like a piano. It's a limited
link |
02:55:07.200
number of a lot of different keys, but not too many. And Russian is like an organ or something.
link |
02:55:13.600
It's just something gigantic with so many different keys and so many different opportunities
link |
02:55:18.160
to screw up and so many opportunities to do something completely tone deaf.
link |
02:55:24.240
It is just a much harder language to use. It has way too much flexibility
link |
02:55:32.000
and way too many tones. What about the entirety of like World War II, communism, Stalin,
link |
02:55:39.600
the pain of the people like having been deceived by the dream? Like all the pain of like just the
link |
02:55:46.880
entirety of it. Is that in the language too? Does that have something to do with it? Oh, for sure. I mean,
link |
02:55:51.680
we have words that don't have direct translation to English that are very much
link |
02:55:58.560
like we have obidetsya, which is sort of like to hold a grudge or something. But
link |
02:56:04.320
you don't need to have anyone do it to you. It's just your state.
link |
02:56:09.120
You just feel like that. You feel like betrayed by other people, basically, but it's not
link |
02:56:13.600
that. And you can't really translate that. And I think it's super important that very many words
link |
02:56:19.120
that are very specific and that explain the Russian being. And I think it can only come from a
link |
02:56:25.120
nation that suffered so much and saw institutions fall time after time after time. And what's
link |
02:56:32.960
exciting, maybe not exciting, exciting is the wrong word, but what's interesting about, like, my generation,
link |
02:56:38.960
my mom's generation, my parents' generation, is that we saw institutions fall two or three times in
link |
02:56:45.360
our lifetime. And most Americans have never seen them fall. And they just think that they exist
link |
02:56:50.160
forever. Which is really interesting, but it's definitely a country that suffered so much. And
link |
02:56:58.080
unfortunately, when I go back and I hang out with my Russian friends, it makes
link |
02:57:04.560
people very cynical. They stop believing in the future. I hope that's not going to be
link |
02:57:11.840
the case for so long, or something's going to change again. But I think seeing institutions
link |
02:57:17.040
fall is a very traumatic experience. It makes it very interesting. And watching 2020 is very
link |
02:57:23.920
interesting. Do you think civilization will collapse? See, I'm a very practical person.
link |
02:57:33.120
Well, we're speaking in English. So like you said, you're a different person in English and
link |
02:57:36.800
Russian. So in Russian, you might answer that differently. But in English.
link |
02:57:42.000
Well, I'm an optimist. And I generally believe that, you know, even
link |
02:57:48.560
though the prospects are grim, there is always a place for a miracle. I mean, it's always
link |
02:57:55.520
been like that in my life. So yeah, I've been incredibly lucky, and
link |
02:58:01.520
miracles happen all the time with this company, with people I know, with everything around me.
link |
02:58:07.840
And so, I didn't mention that book, but maybe In Search of the Miraculous, or In Search for the Miraculous,
link |
02:58:13.600
or whatever the English translation for that is, is a good Russian book for everyone to read.
link |
02:58:21.120
Yeah, I mean, if you put good vibes, if you put love out there in the world,
link |
02:58:25.920
miracles somehow happen. Yeah, I believe that too. Or at least I believe that. I don't know.
link |
02:58:35.600
Let me ask the most absurd, final, ridiculous question of, we talked about life a lot. What
link |
02:58:42.640
do you think is the meaning of it all? What's the meaning of life?
link |
02:58:46.320
I mean, my answer is probably going to be pretty cheesy. But I think the state of love
link |
02:58:56.240
is, once you feel it, in the way that we discussed before. I'm not talking about falling in love, or
link |
02:59:04.560
just love, to yourself, to other people, to something, to the world, that state of
link |
02:59:11.120
bliss that we experience sometimes, whether through connection with ourselves, with our
link |
02:59:16.480
people, with the technology. There's something special about those moments. So
link |
02:59:27.280
I would say, if anything, that's the only thing. If it's not for that, then what else are we really
link |
02:59:34.240
doing it for? I don't think there's a better way to end it than talking about love.
link |
02:59:38.800
Eugenia, I told you offline that there's something in me that felt like this,
link |
02:59:47.360
talking to you, meeting you in person, would be a turning point for my life. I know that might
link |
02:59:53.040
sound weird to hear, but it was a huge honor to talk to you. I hope we talk again. Thank you so
link |
03:00:01.200
much for your time. Thank you so much, Lex. Thanks for listening to this conversation with
link |
03:00:06.400
Eugenia Kuyda, and thank you to our sponsors, DoorDash, Dollar Shave Club, and Cash App.
link |
03:00:13.120
Click the sponsor links in the description to get a discount and to support this podcast.
link |
03:00:17.920
If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcasts,
link |
03:00:22.720
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.
link |
03:00:28.400
And now let me leave you with some words from Carl Sagan.
link |
03:00:31.200
The world is so exquisite with so much love and moral depth that there's no reason to deceive
link |
03:00:37.200
ourselves with pretty stories for which there's little good evidence. Far better, it seems to me,
link |
03:00:43.280
in our vulnerability, is to look death in the eye and to be grateful every day for the brief
link |
03:00:50.080
but magnificent opportunity that life provides. Thank you for listening and hope to see you next time.