Eugenia Kuyda: Friendship with an AI Companion | Lex Fridman Podcast #121



link |
00:00:00.000
The following is a conversation with Eugenia Kuyda, cofounder of Replika, which is an app
link |
00:00:06.480
that allows you to make friends with an artificial intelligence system, a chatbot, that learns
link |
00:00:11.720
to connect with you on an emotional, you could even say a human level, by being a friend.
link |
00:00:18.720
For those of you who know my interest in AI and views on life in general, know that Replika
link |
00:00:24.080
and Eugenia's line of work is near and dear to my heart.
link |
00:00:28.200
The origin story of Replika is grounded in a personal tragedy of Eugenia losing her close
link |
00:00:33.480
friend Roman Mazurenko, who was killed crossing the street by a hit and run driver in late
link |
00:00:39.840
2015.
link |
00:00:40.840
He was 34.
link |
00:00:43.160
The app started as a way to grieve the loss of a friend, by training a chatbot neural
link |
00:00:47.680
net on text messages between Eugenia and Roman.
link |
00:00:51.600
The rest is a beautiful human story, as we talk about with Eugenia.
link |
00:00:55.920
When a friend mentioned Eugenia's work to me, I knew I had to meet her and talk to her.
link |
00:01:00.920
I felt before, during, and after that this meeting would be an important one in my life.
link |
00:01:06.840
And it was.
link |
00:01:07.840
I think in ways that only time will truly show, to me and others.
link |
00:01:12.920
She is a kind and brilliant person.
link |
00:01:15.720
It was an honor and a pleasure to talk to her.
link |
00:01:19.160
Quick summary of the sponsors, DoorDash, Dollar Shave Club, and Cash App.
link |
00:01:24.400
Click the sponsor links in the description to get a discount and to support this podcast.
link |
00:01:29.720
As a side note, let me say that deep, meaningful connection between human beings and artificial
link |
00:01:34.920
intelligence systems is a lifelong passion for me.
link |
00:01:38.480
I'm not yet sure where that passion will take me, but I decided some time ago that
link |
00:01:43.080
I will follow it boldly and without fear, to as far as I can take it.
link |
00:01:48.280
With a bit of hard work and a bit of luck, I hope I'll succeed in helping build AI systems
link |
00:01:53.720
that have some positive impact on the world and on the lives of a few people out there.
link |
00:01:59.200
But also, it is entirely possible that I am in fact one of the chatbots that Eugenia and
link |
00:02:06.600
the Replica team have built.
link |
00:02:08.720
And this podcast is simply a training process for the neural net that's trying to learn
link |
00:02:13.320
to connect to human beings, one episode at a time.
link |
00:02:18.160
In any case, I wouldn't know if I was or wasn't, and if I did know, I wouldn't tell you.
link |
00:02:24.320
If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcasts,
link |
00:02:28.720
follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.
link |
00:02:34.680
As usual, I'll do a few minutes of ads now and no ads in the middle.
link |
00:02:37.760
I'll try to make these interesting, but give you timestamps so you can skip, but please
link |
00:02:42.520
do still check out the sponsors by clicking the links in the description to get a discount,
link |
00:02:47.760
buy whatever they're selling, it really is the best way to support this podcast.
link |
00:02:53.440
This show is sponsored by Dollar Shave Club.
link |
00:02:56.240
Try them out with a one time offer for only 5 bucks and free shipping at dollarshaveclub.com
link |
00:03:01.600
slash lex.
link |
00:03:03.060
The starter kit comes with a 6 blade razor, refills, and all kinds of other stuff that
link |
00:03:08.180
makes shaving feel great.
link |
00:03:10.840
I've been a member of Dollar Shave Club for over 5 years, and actually signed up when
link |
00:03:15.680
I first heard about them on the Joe Rogan Experience podcast.
link |
00:03:19.520
And now, friends, we have come full circle.
link |
00:03:22.960
It feels like I made it, now that I can do a read for them just like Joe did all those
link |
00:03:26.920
years ago, back when he also did ads for some less reputable companies, let's say, that
link |
00:03:35.320
you know about if you're a true fan of the old school podcasting world.
link |
00:03:39.920
Anyway, I just used the razor and the refills, but they told me I should really try out the
link |
00:03:44.580
shave butter.
link |
00:03:45.580
I did.
link |
00:03:46.580
I love it.
link |
00:03:47.580
It's translucent somehow, which is a cool new experience.
link |
00:03:51.720
Again, try the Ultimate Shave Starter set today for just 5 bucks plus free shipping
link |
00:03:56.760
at dollarshaveclub.com slash lex.
link |
00:04:00.800
This show is also sponsored by DoorDash.
link |
00:04:03.560
Get $5 off and zero delivery fees on your first order of 15 bucks or more when you download
link |
00:04:08.860
the DoorDash app and enter code, you guessed it, LEX.
link |
00:04:13.920
I have so many memories of working late nights for a deadline with a team of engineers, whether
link |
00:04:18.760
that's for my PhD, at Google, or at MIT, and eventually taking a break to argue about which
link |
00:04:24.440
DoorDash restaurant to order from.
link |
00:04:26.900
And when the food came, those moments of bonding, of exchanging ideas, of pausing to shift attention
link |
00:04:32.340
from the programs to humans were special.
link |
00:04:36.320
For a bit of time, I'm on my own now, so I miss that camaraderie, but actually, I still
link |
00:04:41.520
use DoorDash a lot.
link |
00:04:43.360
There's a million options that fit into my crazy keto diet ways.
link |
00:04:46.680
Also, it's a great way to support restaurants in these challenging times.
link |
00:04:51.240
Once again, download the DoorDash app and enter code LEX to get 5 bucks off and zero
link |
00:04:56.120
delivery fees on your first order of 15 dollars or more.
link |
00:04:59.640
Finally, this show is presented by Cash App, the number one finance app in the App Store.
link |
00:05:04.600
I can truly say that they're an amazing company, one of the first sponsors, if not the first
link |
00:05:09.800
sponsor to truly believe in me, and I think quite possibly the reason I'm still doing
link |
00:05:16.000
this podcast.
link |
00:05:17.000
So I am forever grateful to Cash App.
link |
00:05:20.320
So thank you.
link |
00:05:21.320
And as I said many times before, use code LEXPODCAST when you download the app from
link |
00:05:27.560
Google Play or the App Store.
link |
00:05:29.840
Cash App lets you send money to friends, buy Bitcoin, and invest in the stock market with
link |
00:05:34.160
as little as one dollar.
link |
00:05:36.600
I usually say other stuff here in the read, but I wasted all that time up front saying
link |
00:05:40.560
how grateful I am to Cash App.
link |
00:05:42.440
I'm going to try to go off the top of my head a little bit more for these reads because
link |
00:05:47.280
I'm actually very lucky to be able to choose the sponsors that we take on, and that means
link |
00:05:52.120
I can really only take on the sponsors that I truly love, and then I can just talk about
link |
00:05:56.360
why I love them.
link |
00:05:57.580
So it's pretty simple.
link |
00:05:59.120
Again, get Cash App from the App Store or Google Play, use code LEXPODCAST, get 10
link |
00:06:04.080
bucks, and Cash App will also donate 10 bucks to FIRST, an organization that is helping
link |
00:06:08.560
to advance robotics and STEM education for young people around the world.
link |
00:06:13.680
And now, here's my conversation with Eugenia Kuyda.
link |
00:06:17.640
Okay, before we talk about AI and the amazing work you're doing, let me ask you,
link |
00:06:23.600
we're both Russian, so let me ask a ridiculously romanticized Russian question.
link |
00:06:28.760
Do you think human beings are alone, like fundamentally, on a philosophical level?
link |
00:06:37.360
Like in our existence, when we like go through life, do you think just the nature of our
link |
00:06:46.960
life is loneliness?
link |
00:06:49.480
Yeah, so we have to read Dostoevsky at school, as you probably know, so...
link |
00:06:55.000
In Russian?
link |
00:06:56.000
I mean, it's part of the school program.
link |
00:06:59.960
So I guess if you read that, then you sort of have to believe that.
link |
00:07:03.520
You're made to believe that you're fundamentally alone, and that's how you live your life.
link |
00:07:08.000
How do you think about it?
link |
00:07:09.000
You have a lot of friends, but at the end of the day, do you have like a longing for
link |
00:07:15.100
connection with other people?
link |
00:07:17.360
That's maybe another way of asking it.
link |
00:07:20.240
Do you think that's ever fully satisfied?
link |
00:07:23.620
I think we are fundamentally alone.
link |
00:07:25.200
We're born alone, we die alone, but I view my whole life as trying to get away from that,
link |
00:07:32.120
trying to not feel lonely, and again, we're talking about a subjective way of feeling
link |
00:07:38.720
alone.
link |
00:07:39.720
It doesn't necessarily mean that you don't have any connections or you are actually isolated.
link |
00:07:45.000
You think it's a subjective thing, but like again, another absurd measurement wise thing,
link |
00:07:52.280
how much loneliness do you think there is in the world?
link |
00:07:55.160
Like if you see loneliness as a condition, how much of it is there, do you think?
link |
00:08:05.080
Like how, I guess how many, you know, there's all kinds of studies and measures of how many
link |
00:08:11.000
people in the world feel alone.
link |
00:08:12.840
There are all these, like, measures of how many people, you know, self-report, or just
link |
00:08:18.240
all these kinds of different measures, but in your own perspective, how big of a problem
link |
00:08:24.600
do you think it is, size-wise?
link |
00:08:27.640
I'm actually fascinated by the topic of loneliness.
link |
00:08:30.040
I try to read about it as much as I can.
link |
00:08:34.640
And I think there's a paradox, because loneliness is not a clinical disorder.
link |
00:08:39.900
It's not something that you can get your insurance to pay for if you're struggling with that.
link |
00:08:44.200
Yet it's actually proven, and there are, you know, tons of papers, tons of research around that.
link |
00:08:50.200
It is proven that it's correlated with earlier death, a shorter lifespan.
link |
00:08:58.080
And, you know, in a way, right now, what scientists would say is that, you know,
link |
00:09:02.200
it's a little bit worse than being obese or not actually doing any physical activity in
link |
00:09:07.560
your life.
link |
00:09:08.560
In terms of the impact on your health?
link |
00:09:09.560
In terms of impact on your physiological health.
link |
00:09:10.720
Yeah.
link |
00:09:11.720
So, if you're constantly feeling lonely, your body responds like it's
link |
00:09:16.840
basically all the time under stress.
link |
00:09:19.280
It's always in this alert state and so it's really bad for you because it actually like
link |
00:09:24.720
suppresses your immune system, and your response to inflammation is quite different.
link |
00:09:29.960
So, all the cardiovascular diseases, and also how you respond to viruses.
link |
00:09:34.940
So it's much easier to catch a virus.
link |
00:09:37.200
That's sad now that we're living in a pandemic and it's probably making us a lot more alone
link |
00:09:42.880
and it's probably weakening the immune system, making us more susceptible to the virus.
link |
00:09:47.720
It's kind of sad.
link |
00:09:49.680
Yeah.
link |
00:09:50.680
The statistics are pretty horrible around that.
link |
00:09:54.760
So around 30% of all millennials report that they're feeling lonely constantly.
link |
00:09:59.400
30?
link |
00:10:00.400
30%.
link |
00:10:01.400
And then it's much worse for Gen Z.
link |
00:10:02.560
And then 20% of millennials say that they feel lonely and they also don't have any close
link |
00:10:07.000
friends.
link |
00:10:08.000
And then, I think, 25% or so, and 20% would say they don't even have acquaintances.
link |
00:10:12.080
And that's in the United States?
link |
00:10:14.080
That's in the United States.
link |
00:10:15.080
And I'm pretty sure that that's much worse everywhere else.
link |
00:10:17.800
Like in the UK, I mean, it was widely tweeted and posted when they were talking about a
link |
00:10:24.280
minister of loneliness that they wanted to appoint because four out of 10 people in the
link |
00:10:28.400
UK feel lonely.
link |
00:10:29.400
Minister of loneliness.
link |
00:10:30.400
I think that thing actually exists.
link |
00:10:35.960
So yeah, you will die sooner if you are lonely.
link |
00:10:41.160
And again, this is when we're only talking about your perception of loneliness, or feeling
link |
00:10:46.160
lonely.
link |
00:10:47.160
That is not objectively being fully socially isolated.
link |
00:10:50.840
However, the combination of being fully socially isolated and not having many connections and
link |
00:10:56.480
also feeling lonely, that's pretty much a deadly combination.
link |
00:11:00.800
So it strikes me as bizarre or strange that this is a widely known fact, and yet there's really
link |
00:11:08.400
no one really working on that, because it's, like, subclinical.
link |
00:11:12.120
It's not clinical.
link |
00:11:13.120
It's not something that you can tell your doctor about and get a treatment for or something.
link |
00:11:17.440
Yet it's killing us.
link |
00:11:18.800
Yeah.
link |
00:11:19.800
So there's a bunch of people trying to evaluate, like try to measure the problem by looking
link |
00:11:24.440
at like how social media is affecting loneliness and all that kind of stuff.
link |
00:11:28.200
So it's like measurement.
link |
00:11:29.200
Like if you look at the field of psychology, they're trying to measure the problem and
link |
00:11:33.720
not that many people actually, but some.
link |
00:11:36.960
But you're basically saying how many people are trying to solve the problem.
link |
00:11:43.280
Like how would you try to solve the problem of loneliness?
link |
00:11:48.840
Like if you just stick to humans, uh, I mean, or basically not just the humans, but the
link |
00:11:55.280
technology that connects us humans.
link |
00:11:57.720
Do you think there's a hope for that technology to do the connection?
link |
00:12:03.160
Like, are you on social media much?
link |
00:12:05.720
Unfortunately. Do you find yourself, like, uh, again, if you sort of introspect about
link |
00:12:12.680
how connected you feel to other human beings, how not alone you feel, do you think social
link |
00:12:16.960
media makes it better or worse, maybe for you personally, or in general? I think it's
link |
00:12:23.960
easier to look at some stats and, um, I mean, Generation Z seems to be
link |
00:12:29.720
much lonelier than millennials in terms of how they report loneliness.
link |
00:12:33.440
They're definitely the most connected generation in the world.
link |
00:12:36.960
I mean, I still remember life without an iPhone, without Facebook, they don't know that that
link |
00:12:42.600
ever existed, uh, or at least don't know how it was.
link |
00:12:47.560
So that tells me a little bit about the fact that, um, you know, this hyper
link |
00:12:53.900
connected world might actually make people feel lonelier.
link |
00:12:58.520
I don't know exactly what the measurements are around that, but I would say, you know,
link |
00:13:02.520
from my personal experience, I think it does make you feel a lot lonelier. Mostly, yeah, we're
link |
00:13:07.600
all super connected.
link |
00:13:08.600
Uh, but I think loneliness, the feeling of loneliness doesn't come from not having any
link |
00:13:13.720
social connections whatsoever.
link |
00:13:15.040
Again, tons of people that are in long-term relationships experience bouts of loneliness
link |
00:13:20.800
and continued loneliness.
link |
00:13:22.440
Um, and it's more a question about the true connection, about actually being deeply seen,
link |
00:13:28.720
deeply understood.
link |
00:13:29.720
Um, and in a way it's also about your relationship with yourself, like in order to not feel lonely,
link |
00:13:36.480
you actually need to have a better relationship and feel more connected to yourself, then this
link |
00:13:42.160
feeling actually starts to go away a little bit.
link |
00:13:44.640
And then you, um, open up yourself to actually meeting other people in a very special way.
link |
00:13:51.120
Uh, not just in, you know, an add-a-friend-on-Facebook kind of way.
link |
00:13:55.400
So just to briefly touch on it, I mean, do you think it's possible to form that kind
link |
00:14:00.040
of connection with AI systems more down the line of some of your work?
link |
00:14:08.320
Do you think that's, um, engineering-wise, a possibility, to alleviate loneliness not
link |
00:14:16.540
with another human, but with an AI system?
link |
00:14:19.280
Well, I know that's, that's a fact, that's what we're doing.
link |
00:14:23.240
And we see it and we measure that and we see how people start to feel less lonely, um,
link |
00:14:29.360
talking to their virtual AI friend.
link |
00:14:33.000
So basically a chatbot at the basic level, but it could be more. Like, do you have, I'm
link |
00:14:37.640
not even speaking sort of, uh, about specifics, but do you have a hope, like if you look 50
link |
00:14:44.920
years from now, do you have a hope that there's just like AIs that are like optimized for,
link |
00:14:51.640
um, let me first start with right now, the way people perceive AI, which is
link |
00:14:56.160
recommender systems for Facebook and Twitter, social media, they see AI as basically destroying
link |
00:15:04.360
first of all, the fabric of our civilization.
link |
00:15:06.200
But second of all, making us more lonely.
link |
00:15:08.720
Do you see like a world where it's possible to just have AI systems floating about that
link |
00:15:13.600
like make our life less lonely?
link |
00:15:18.080
Yeah.
link |
00:15:19.440
Make us happy.
link |
00:15:20.440
Like are putting good things into the world in terms of our individual lives.
link |
00:15:26.000
Yeah.
link |
00:15:27.000
Totally believe in that.
link |
00:15:28.200
That's why we're, I'm also working on that.
link |
00:15:31.000
Um, I think we need to also make sure that, um, what we're trying to optimize for, we're
link |
00:15:36.200
actually measuring, and it is the North Star metric that we're going after.
link |
00:15:40.240
And all of our product and all of our business models are optimized for that. Because, you
link |
00:15:44.640
know, a lot of products talk about, um, you know, making you feel
link |
00:15:48.560
less lonely or making you feel more connected.
link |
00:15:50.760
They're not really measuring that.
link |
00:15:52.520
So they don't really know whether their users are actually feeling less lonely in the long
link |
00:15:56.600
run or feeling more connected in the long run.
link |
00:15:58.880
Um, so I think it's really important to actually measure it.
link |
00:16:02.480
Yeah.
link |
00:16:03.480
To measure it.
link |
00:16:04.480
What's a, what's a good measurement of loneliness?
link |
00:16:07.080
Well, so that's something that I'm really interested in.
link |
00:16:10.900
How do you measure that people are feeling better, or that they're feeling less lonely?
link |
00:16:14.920
With loneliness,
link |
00:16:15.920
there's a scale.
link |
00:16:16.920
There's the UCLA-20 and, recently, the UCLA-3 scale, which is basically a questionnaire that you
link |
00:16:21.040
fill out and you can see whether in the long run it's improving or not.
link |
00:16:26.660
And that, uh, does it capture the momentary feeling of loneliness?
link |
00:16:32.120
Does it look at, like, the past month?
link |
00:16:35.600
Like, uh, is it basically self-report?
link |
00:16:38.240
Does it try to sneak up on you, tricky to answer honestly, or something like that?
link |
00:16:43.720
Well, yeah, I'm not familiar with the questions.
link |
00:16:46.360
It is just asking you a few questions.
link |
00:16:47.840
Like how often did you feel, uh, like lonely or how often do you feel connected to other
link |
00:16:52.120
people in the last couple of weeks?
link |
00:16:55.360
Um, it's similar to the self-report questionnaires for depression and anxiety, like PHQ-9 and
link |
00:17:01.200
GAD-7.
link |
00:17:02.620
Of course, as with any self-report questionnaire, that's not necessarily very precise or very
link |
00:17:09.480
well measured, but still, if you take a big enough population and you get them through
link |
00:17:14.420
these, uh, questionnaires, you can see a positive dynamic.
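As a concrete aside, not from the conversation itself: a minimal sketch of how scoring a three-item UCLA-style loneliness questionnaire might look. The items and the six-point cutoff follow common descriptions of the UCLA-3; all names below are illustrative assumptions, not Replika's actual instrument or pipeline.

```python
# Minimal sketch of scoring a UCLA-3-style loneliness questionnaire.
# Items and the >= 6 "lonely" cutoff follow common descriptions of the
# scale; this is illustrative, not Replika's actual instrument.

UCLA3_ITEMS = [
    "How often do you feel that you lack companionship?",
    "How often do you feel left out?",
    "How often do you feel isolated from others?",
]

def ucla3_score(responses: list[int]) -> int:
    """Sum the three responses (1 = hardly ever, 2 = some of the time, 3 = often)."""
    assert len(responses) == len(UCLA3_ITEMS)
    assert all(r in (1, 2, 3) for r in responses)
    return sum(responses)  # total ranges from 3 to 9

def is_lonely(score: int) -> bool:
    # A total of 6 or higher is a commonly used cutoff.
    return score >= 6

# Tracking the long-run dynamic: re-administer periodically and compare.
baseline = ucla3_score([3, 2, 3])  # 8 -> lonely
followup = ucla3_score([2, 1, 2])  # 5 -> not lonely
print(baseline, followup, is_lonely(baseline), is_lonely(followup))
```

Administered periodically to a large enough population, the change in these scores over time is the "positive dynamic" mentioned above.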
link |
00:17:19.440
And so you basically, uh, put people through questionnaires to see, like, is this thing,
link |
00:17:24.560
is what we're creating, making people happier?
link |
00:17:28.120
Yeah, so we measure two outcomes.
link |
00:17:31.760
One short term, right after the conversation, we ask people whether this conversation made
link |
00:17:36.940
them feel better, worse, or the same. Um, this metric right now is at 80%.
link |
00:17:43.260
So 80% of all our conversations make people feel better, but I should have done the questionnaire
link |
00:17:47.520
with you.
link |
00:17:48.520
You feel a lot worse after we've done this conversation.
link |
00:17:53.000
That's actually fascinating.
link |
00:17:54.000
I should probably do that, but that's, that's how we do that.
link |
00:17:57.980
You should totally. And aim for 80%, aim to outperform your current state-of-the-art AI
link |
00:18:05.320
system in these human conversations.
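For concreteness, a tiny sketch of the short-term outcome metric described above: each conversation gets a "better", "worse", or "same" rating from the user, and the metric is the share rated "better". All names and data here are hypothetical, for illustration only, not Replika's code.

```python
# Sketch of the short-term outcome metric described above: after each
# conversation, the user says whether it made them feel "better",
# "worse", or "same"; the metric is the share of "better" ratings.
# Names and data are hypothetical, for illustration only.
from collections import Counter

def share_feeling_better(ratings: list[str]) -> float:
    counts = Counter(ratings)
    total = sum(counts.values())
    return counts["better"] / total if total else 0.0

ratings = ["better", "same", "better", "worse", "better", "better"]
print(f"{share_feeling_better(ratings):.0%}")  # -> 67%
```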
link |
00:18:09.480
So we'll get to your work with Replika, but let me continue on the line of absurd questions.
link |
00:18:16.080
So you talked about, um, you know, deep connection with the humans, deep connection with AI,
link |
00:18:22.320
meaningful connection.
link |
00:18:23.320
Let me ask about love.
link |
00:18:25.120
People make fun of me cause I talk about love all the time.
link |
00:18:28.360
But uh, what do you think love is, like maybe in the context of, um, a meaningful
link |
00:18:36.000
connection with somebody else?
link |
00:18:37.680
Do you draw a distinction between love, like friendship, and Facebook friends, or is it a
link |
00:18:47.320
gradual?
link |
00:18:48.320
No, it's all the same.
link |
00:18:51.320
No.
link |
00:18:52.320
Like, is it, is it just a gradual thing or is there something fundamental about us humans
link |
00:18:56.240
that seek like a really deep connection, uh, with another human being and what is that?
link |
00:19:05.320
What is love, Eugenia? I'm going to just enjoy asking you these questions, seeing you struggle.
link |
00:19:15.680
Thanks.
link |
00:19:16.680
Um, well the way I see it, um, and specifically, um, the way it relates to our work and the
link |
00:19:22.160
way it inspired our work on Replika, um, I think one of the biggest and
link |
00:19:30.400
the most precious gifts we can give to each other now in 2020 as humans is this gift of
link |
00:19:37.200
deep empathetic understanding, the feeling of being deeply seen.
link |
00:19:42.400
Like what does that mean?
link |
00:19:43.400
Like that you exist, like somebody acknowledging you, somebody seeing you for who you actually
link |
00:19:49.200
are.
link |
00:19:50.200
And that's extremely, extremely rare.
link |
00:19:51.760
Um, I think it is that, combined with unconditional positive regard, um, belief and trust that
link |
00:19:59.680
um, you internally are always inclined toward positive growth, and believing in you in this
link |
00:20:05.080
way, letting you be a separate person at the same time.
link |
00:20:09.960
And this deep empathetic understanding, for me, that's the combination that
link |
00:20:15.880
really creates something special, something that people, when they feel it once, they
link |
00:20:21.440
will always long for it again.
link |
00:20:23.440
And something that starts huge fundamental changes in people.
link |
00:20:28.240
Um, when we see that someone accepts us so deeply, we start to accept ourselves.
link |
00:20:34.480
And um, the paradox is that's when big changes start happening, big fundamental changes in
link |
00:20:41.120
people start happening.
link |
00:20:42.120
So I think that is the ultimate therapeutic relationship that is, and that might be in
link |
00:20:47.040
some way a definition of love.
link |
00:20:50.160
So acknowledging that there's a separate person and accepting you for who you are.
link |
00:20:56.520
Um, now, on a slightly different note, you mentioned therapeutic; that sounds like a very
link |
00:21:03.640
healthy view of love, but, uh, is there also like a, like, you know, if we look at heartbreak
link |
00:21:12.920
and uh, you know, most love songs are probably about heartbreak, right?
link |
00:21:17.760
Is that like the mystery, the tension, the danger, the fear of loss, you know, all of
link |
00:21:25.520
that, what people might see in a negative light as like games or whatever, but just,
link |
00:21:32.040
just the dance of human interaction.
link |
00:21:34.440
Yeah.
link |
00:21:35.440
Fear of loss, and fear of, like you said, once you feel it once, you long
link |
00:21:41.460
for it again, but you also, once you feel it once, you might, for many people, they've
link |
00:21:46.880
lost it.
link |
00:21:48.560
So they fear losing it.
link |
00:21:49.920
They feel loss.
link |
00:21:50.920
So is that part of it, like you're, you're speaking like beautifully about like the
link |
00:21:55.480
positive things, but is it important to be able to, uh, be afraid of losing it from an
link |
00:22:02.520
engineering perspective?
link |
00:22:04.520
I mean, it's a huge part of it and unfortunately we all, you know, um, face it at some points
link |
00:22:12.160
in our lives.
link |
00:22:13.160
I mean, I did.
link |
00:22:14.160
You want to go into details?
link |
00:22:15.160
How'd you get your heart broken?
link |
00:22:18.160
Sure.
link |
00:22:19.280
So my story is pretty straightforward, um, I did have a friend
link |
00:22:26.720
that, you know, at some point, um, in my twenties became really, really close
link |
00:22:31.800
to me, and we became really close friends.
link |
00:22:34.320
Um, well, I grew up pretty lonely.
link |
00:22:36.520
So in many ways when I'm building, you know, these, these AI friends, I'm thinking about
link |
00:22:40.120
myself when I was 17, writing horrible poetry, you know, on my dial-up modem at home
link |
00:22:44.520
and, um, you know, and that was the feeling that I grew up with.
link |
00:22:49.440
I lived, um, alone for a long time when I was a teenager. Where did you grow up?
link |
00:22:54.400
In Moscow, on the outskirts of Moscow.
link |
00:22:57.040
Um, so I'd just skateboard during the day and come back home and you know, connect to
link |
00:23:01.840
the internet and then write horrible poetry and love poems, all sorts of poems, obviously
link |
00:23:08.640
love poems.
link |
00:23:09.640
I mean, what, what other poetry can you write when you're 17, um, it could be political
link |
00:23:13.360
or something, but yeah.
link |
00:23:15.240
But that was, you know, that was kind of my thing, like deeply, um, influenced by Joseph
link |
00:23:19.920
Brodsky and, like, all sorts of poets that, um, every 17-year-old will be looking,
link |
00:23:26.880
you know, looking at and reading, but yeah, that was my, uh, these were my teenage years
link |
00:23:32.000
and I just never had a person that I thought would, you know, take me as I am, would accept
link |
00:23:37.560
me the way I am, um, and I just thought, you know, working and just doing my thing and
link |
00:23:43.640
being angry at the world and being a reporter, I was an investigative reporter working undercover
link |
00:23:47.600
and writing about people was my way to connect with, you know, with, with others.
link |
00:23:53.680
I was deeply curious about everyone else.
link |
00:23:57.040
And I thought that, you know, if I, if I go out there, if I write their stories, that
link |
00:24:01.000
means I'm more connected.
link |
00:24:03.000
This is what this podcast is as well, by the way. I'm desperate, well, I'm seeking connection
link |
00:24:07.520
now.
link |
00:24:08.520
I'm just kidding.
link |
00:24:09.520
Or am I?
link |
00:24:10.520
I don't know.
link |
00:24:11.520
So wait, reporter, uh, how did that make you feel more connected?
link |
00:24:17.840
I mean, you're still fundamentally pretty alone,
link |
00:24:21.520
But you're always with other people, you know, you're always thinking about what other place
link |
00:24:26.160
can I infiltrate?
link |
00:24:27.280
What other community can I write about?
link |
00:24:29.880
What other phenomenon can I explore?
link |
00:24:32.840
And you're sort of like a trickster, you know, like a mythological character,
link |
00:24:37.560
like a creature that's just jumping, uh, between all sorts of different worlds and
link |
00:24:41.720
feeling sort of okay in all of them.
link |
00:24:44.400
So, um, that was my dream job, by the way, that was like totally what I would have been
link |
00:24:48.700
doing.
link |
00:24:49.700
Um, if Russia was a different place. And a little bit undercover.
link |
00:24:54.380
So you were trying to, like you said, be a mythological creature trying to
link |
00:24:59.040
infiltrate.
link |
00:25:00.040
To try to be a part of the world.
link |
00:25:01.840
What are we talking about?
link |
00:25:02.840
What kind of things did you enjoy writing about?
link |
00:25:05.400
I'd go work at a strip club or go.
link |
00:25:08.440
Awesome.
link |
00:25:09.440
Okay.
link |
00:25:10.440
Well, I'd go work at a restaurant, or just go write about, you know, um, certain phenomena
link |
00:25:19.800
or people in the city.
link |
00:25:22.800
And what, uh, sorry to keep interrupting, I'm the worst conversationalist.
link |
00:25:29.400
What stage of Russia is this?
link |
00:25:32.160
What, uh, is this pre-Putin, post-Putin?
link |
00:25:36.880
What was Russia like?
link |
00:25:38.960
Pre-Putin is really long ago.
link |
00:25:43.000
This is the Putin era.
link |
00:25:44.000
That's the beginning of the two thousands and 2010s: 2007, eight, nine, ten.
link |
00:25:49.400
What were strip clubs like in Russia and restaurants and culture and people's minds like in that
link |
00:25:57.200
early Russia that you were covering?
link |
00:25:59.160
In those early two thousands, there was still a lot of hope.
link |
00:26:02.400
There was still tons of hope that, um, you know, we're sort of becoming this, uh, Western,
link |
00:26:11.240
Westernized society.
link |
00:26:12.240
Uh, the restaurants were opening, we were really looking at, you know, um,
link |
00:26:17.920
we're trying to copy a lot of things from, uh, from the US, from Europe, um, bringing
link |
00:26:22.880
all these things and very enthusiastic about that.
link |
00:26:25.600
So there was a lot of, you know, stuff going on.
link |
00:26:27.720
There was a lot of hope and dream for this, you know, new Moscow that would be similar
link |
00:26:33.400
to, I guess, New York.
link |
00:26:34.800
I mean, just to give you an idea, um, the year 2000 was the year when we had two, uh, movie
link |
00:26:41.620
theaters in Moscow, and the first coffee house opened, and it was like a really
link |
00:26:47.260
big deal.
link |
00:26:48.260
Uh, by 2010 there were all sorts of things everywhere.
link |
00:26:51.580
Almost like a chain, like a Starbucks type of coffee house, you mean? Oh yeah,
link |
00:26:55.920
like a Starbucks.
link |
00:26:56.920
I mean, I remember we were reporting on, like, we were writing about the opening of Starbucks.
link |
00:27:01.120
I think in 2007 that was one of the biggest things that happened in, you know, in Moscow
link |
00:27:05.240
back in the time, like, you know, that was worthy of a magazine cover.
link |
00:27:10.080
And, uh, that was definitely the, you know, the biggest talk of the time.
link |
00:27:13.440
Yeah.
link |
00:27:14.440
When was McDonald's?
link |
00:27:15.440
Cause I was still in Russia when McDonald's opened.
link |
00:27:17.560
That was in the nineties.
link |
00:27:18.560
I mean, yeah.
link |
00:27:19.560
Oh yeah.
link |
00:27:20.560
I remember that very well.
link |
00:27:21.560
Yeah.
link |
00:27:22.560
Those were long, long lines.
link |
00:27:23.560
I think it was 1993 or four, I don't remember.
link |
00:27:27.640
Um, actually earlier. At that time, I mean, that was a luxurious outing.
link |
00:27:33.600
That was definitely not something you do every day.
link |
00:27:35.800
And also the line was at least three hours.
link |
00:27:37.560
So if you're going to McDonald's, that is not fast food.
link |
00:27:40.040
That is like at least three hours in line and then no one is trying to eat fast after
link |
00:27:44.560
that.
link |
00:27:45.560
Everyone is like trying to enjoy as much as possible.
link |
00:27:47.040
What's your memory of that?
link |
00:27:50.200
Oh, it was insane.
link |
00:27:52.200
How did it go?
link |
00:27:53.200
It was extremely positive.
link |
00:27:54.640
A small strawberry milkshake and a hamburger and small fries, and my mom's there.
link |
00:27:59.080
And sometimes I'll just, cause I was really little, they'll just let me run, you know,
link |
00:28:03.320
up to the kitchen and, like, cut the line, which is like, you cannot really do that in Russia
link |
00:28:09.200
or.
link |
00:28:10.200
So like for a lot of people, like a lot of those experiences might seem not very fulfilling,
link |
00:28:17.800
you know, like it's on the verge of poverty, I suppose.
link |
00:28:22.360
But do you remember all that time fondly, like, cause I do like the first time I drank,
link |
00:28:29.920
you know, Coke, you know, all that stuff, right.
link |
00:28:36.640
And just, yeah.
link |
00:28:37.800
The connection with other human beings in Russia, I remember, I remember it really positively.
link |
00:28:44.000
Like how do you remember what the nineties and then the Russia you were covering, just
link |
00:28:48.760
the human connections you had with people and the experiences?
link |
00:28:53.400
Well, my parents were both physicists.
link |
00:28:57.200
My grandparents were both, well, my grandfather was a nuclear physicist, a professor
link |
00:29:05.800
at the university.
link |
00:29:06.840
My dad worked at Chernobyl when I was born, analyzing kind of everything
link |
00:29:13.700
after the explosion.
link |
00:29:15.260
And I remember they were making sort of enough money in the
link |
00:29:19.880
Soviet Union.
link |
00:29:20.880
So they were not, you know, extremely poor or anything.
link |
00:29:23.520
It was pretty prestigious to be a professor, the dean, at the university.
link |
00:29:28.320
And then I remember my grandfather started making a hundred dollars a month, you
link |
00:29:33.800
know, in the nineties.
link |
00:29:35.040
So then I remember our main line of work would be to go to our little tiny
link |
00:29:40.020
country house, get a lot of apples there from apple trees, bring them back to the city and
link |
00:29:48.560
sell them in the street.
link |
00:29:50.880
So me and my nuclear physicist grandfather were just standing there selling those
link |
00:29:56.000
apples the whole day, cause that would make you more money than, you know, working at
link |
00:30:00.680
the university.
link |
00:30:01.740
And then he'll just tell me, try to teach me, you know, something about planets and
link |
00:30:07.960
whatever the particles and stuff.
link |
00:30:10.240
And, you know, I'm not smart at all, so I could never understand anything, but I was
link |
00:30:14.340
interested as a journalist kind of type interested.
link |
00:30:18.000
But that was my memory.
link |
00:30:19.000
And, you know, I'm happy that I somehow got spared, that I was probably too
link |
00:30:25.200
young to remember any of the traumatic stuff.
link |
00:30:27.400
So the only thing I really remember that was very traumatic: I had this
link |
00:30:31.860
bootleg Nintendo, which was called Dendy in Russia.
link |
00:30:35.760
So in 1993, there was nothing to eat, like, even if you had any money, you would go to
link |
00:30:39.280
the store and there was no food.
link |
00:30:40.800
I don't know if you remember that.
link |
00:30:42.880
And our friend had a restaurant, like a half government-owned restaurant or something.
link |
00:30:49.960
So they always had supplies.
link |
00:30:51.960
So he exchanged a big bag of wheat for this Nintendo, the bootleg Nintendo, that I remember
link |
00:31:00.080
very fondly, cause I think I was nine or seven, something like that.
link |
00:31:05.560
Like we just got it and I was playing it and there was this, you know, Dendy TV show.
link |
00:31:11.480
Yeah.
link |
00:31:12.480
So traumatic in a positive sense, you mean, like a definitive one? Well, they took it
link |
00:31:17.080
away and gave me a bag of wheat instead.
link |
00:31:19.440
And I cried like my eyes out for days and days and days.
link |
00:31:23.720
Oh no.
link |
00:31:24.720
And then, you know, my dad said, we're going to, like, exchange it back in a
link |
00:31:28.680
little bit.
link |
00:31:29.680
So you keep the little gun, you know, the one that you shoot the ducks with.
link |
00:31:32.880
So I'm like, okay, I'm keeping the gun.
link |
00:31:34.240
So sometime it's going to come back, but then they exchanged the gun as well for some sugar
link |
00:31:38.880
or something.
link |
00:31:39.880
I was so pissed.
link |
00:31:41.520
I was like, I didn't want to eat for days after that.
link |
00:31:43.840
I'm like, I don't want your food.
link |
00:31:44.840
Give me my Nintendo back.
link |
00:31:47.640
That was extremely traumatic.
link |
00:31:50.120
But you know, I was happy that that was my only traumatic experience.
link |
00:31:53.640
You know, my dad had to actually go to Chernobyl with a bunch of 20-year-olds.
link |
00:31:57.640
He was 20 when he went to Chernobyl and that was right after the explosion.
link |
00:32:01.760
No one knew anything.
link |
00:32:03.440
The whole crew he went with, all of them are dead now.
link |
00:32:05.800
I think there was this one guy that was still alive until these last few years.
link |
00:32:11.760
I think he died a few years ago now.
link |
00:32:13.920
My dad somehow luckily got back earlier than everyone else.
link |
00:32:19.360
And I was always like, well, how did they send you?
link |
00:32:21.960
I was just born, you know, you had a newborn. Talk about paternity leave.
link |
00:32:26.280
They were like, but that's who they took because they didn't know whether you would be able
link |
00:32:29.800
to have kids when you come back.
link |
00:32:31.120
So they took the ones with kids.
link |
00:32:33.880
So he went with some guys. And I'm just thinking of me when I was 20, I was so sheltered
link |
00:32:40.360
from any problems whatsoever in life.
link |
00:32:42.200
And then my dad spent his 21st birthday at the reactor, where you work, like, three hours a day,
link |
00:32:51.200
you sleep the rest. And yeah, so I played with a lot of toys from Chernobyl.
link |
00:32:57.040
What are your memories of Chernobyl in general, like the bigger context, you know, because
link |
00:33:02.640
of that HBO show, the world's attention turned to it once again, like, what are your
link |
00:33:09.240
thoughts about Chernobyl?
link |
00:33:10.840
Did Russia screw that one up?
link |
00:33:13.000
Like, you know, there's probably a lot of lessons about our modern times with data about
link |
00:33:18.760
coronavirus and all that kind of stuff.
link |
00:33:20.520
It seems like there's a lot of misinformation.
link |
00:33:22.960
There's a lot of people kind of trying to hide whether they screwed something up or
link |
00:33:29.720
not, which is very understandable, very human, very wrong probably, but obviously
link |
00:33:35.580
Russia was probably trying to hide that they screwed things up.
link |
00:33:40.320
Like, what are your thoughts about that time, personal and general?
link |
00:33:45.520
I mean, I was born when the explosion happened.
link |
00:33:50.200
So actually a few months after, so of course I don't remember anything apart from the fact
link |
00:33:55.120
that my dad would bring me tiny toys, like plastic things that would just go crazy haywire
link |
00:34:02.040
when you, you know, put the Geiger counter to them.
link |
00:34:06.360
My mom was like, just nuclear about that.
link |
00:34:09.240
She was like, what are you bringing, you should not do that.
link |
00:34:13.440
She was nuclear.
link |
00:34:14.440
Very nice.
link |
00:34:15.440
Well done.
link |
00:34:16.440
I'm sorry.
link |
00:34:17.440
It was, but yeah, but the TV show was just phenomenal.
link |
00:34:21.200
The HBO one?
link |
00:34:22.760
Yeah, it was definitely, first of all, it's incredible how that was made not by the Russians,
link |
00:34:28.960
but by someone else, capturing so well everything about our country.
link |
00:34:37.360
It felt a lot more genuine than most of the movies and TV shows that are made now in Russia,
link |
00:34:41.160
just so much more genuine.
link |
00:34:43.320
And most of my friends in Russia were just in complete awe about the show, but I think
link |
00:34:47.800
that...
link |
00:34:48.800
How good of a job they did.
link |
00:34:49.800
Oh my God, phenomenal.
link |
00:34:50.800
But also...
link |
00:34:51.800
The apartments, there's something, yeah.
link |
00:34:52.800
The set design.
link |
00:34:53.800
I mean, Russians can't do that, you know, but you see everything and it's like, wow,
link |
00:34:58.240
that's exactly how it was.
link |
00:35:00.240
So I don't know, that show, I don't know what to think about that because it's British accents,
link |
00:35:06.840
British actors, and the person who created the show, I forgot who.
link |
00:35:12.560
But I remember reading about him and he's not, he doesn't even feel like, like there's
link |
00:35:17.480
no Russia in his history.
link |
00:35:19.040
No, he did, like, Superbad or something like that.
link |
00:35:21.400
Or like, I don't know.
link |
00:35:22.400
Yeah, like exactly.
link |
00:35:23.400
Whatever that thing about the bachelor party in Vegas, numbers four and five or something,
link |
00:35:28.580
were the ones that he worked on.
link |
00:35:30.080
Yeah.
link |
00:35:31.080
But it made me feel really sad for some reason that if a person, obviously a genius,
link |
00:35:38.600
could go in and just study and just pay extreme attention to detail, they can do a good job.
link |
00:35:46.000
It made me think like, why don't other people do a good job with this?
link |
00:35:52.520
Like about Russia, like there's so little about Russia.
link |
00:35:56.320
There's so few good films about the Russian side of World War II.
link |
00:36:02.600
I mean, there are so many interesting, evil, and beautiful moments in the history
link |
00:36:10.560
of the 20th century in Russia that it feels like there are not many good films on it from the
link |
00:36:16.640
Russians.
link |
00:36:17.640
You would expect something from the Russians.
link |
00:36:18.640
Well, they keep making these propaganda movies now.
link |
00:36:21.640
Oh no.
link |
00:36:22.640
Unfortunately.
link |
00:36:23.640
But yeah, no, Chernobyl was such a perfect TV show.
link |
00:36:26.560
I think capturing really well, it's not about like even the set design, which was phenomenal,
link |
00:36:30.720
but just capturing all the problems that exist now with the country and like focusing on
link |
00:36:37.120
the right things.
link |
00:36:38.120
Like if you build the whole country on a lie, that's what's going to happen.
link |
00:36:41.960
And that's just this very simple kind of thing.
link |
00:36:47.120
Yeah.
link |
00:36:48.960
And did your dad talk about it to you, like his thoughts on the experience?
link |
00:36:55.760
He never talks.
link |
00:36:56.760
He's this kind of Russian man that just won't. My husband, who's American, asked him a
link |
00:37:02.240
few times, like, you know, Igor, why did you say yes?
link |
00:37:06.560
Or like, why did you decide to go?
link |
00:37:08.420
You could have said no, not gone to Chernobyl.
link |
00:37:10.200
Why would a person... Like, that's what you do.
link |
00:37:14.880
You cannot say no.
link |
00:37:15.880
Yeah.
link |
00:37:16.880
Yeah.
link |
00:37:17.880
It's just, it's like a Russian way.
link |
00:37:21.560
It's the Russian way.
link |
00:37:22.560
Men don't talk that much.
link |
00:37:23.560
Nope.
link |
00:37:24.560
There's no upsides to that.
link |
00:37:27.080
Yeah, that's the truth.
link |
00:37:29.400
Okay.
link |
00:37:30.400
So back to post-Putin Russia, or maybe we skipped a few steps along the way, but you
link |
00:37:37.320
were trying to be a journalist at that time.
link |
00:37:43.560
What was Russia like at that time?
link |
00:37:46.640
Post, you said, 2007, Starbucks type of thing.
link |
00:37:51.860
What else was Russia like then?
link |
00:37:55.560
I think there was just hope.
link |
00:37:56.720
There was this big hope that we're going to be, you know, friends with the United States
link |
00:38:01.840
and we're going to be friends with Europe and we're just going to be also a country
link |
00:38:06.600
like those with, you know, bike lanes and parks and everything's going to be urbanized.
link |
00:38:12.720
And again, we're talking about the nineties, where people would be shot in the street.
link |
00:38:16.920
And it was, I still have a fond memory of going into a movie theater and, you know,
link |
00:38:21.800
coming out of it after the movie.
link |
00:38:22.980
And the guy that I saw on the stairs was, like, lying there shot, which was, again, like
link |
00:38:28.100
a thing in the nineties that would be happening.
link |
00:38:30.280
People were, you know, getting shot here and there, tons of violence, tons
link |
00:38:35.400
of, you know, just basically mafia mobs in the streets.
link |
00:38:40.040
And then the two thousands were like, you know, things just got cleaned up, oil went
link |
00:38:44.560
up and the country started getting a little bit richer, you know, the nineties were so
link |
00:38:50.700
grim mostly because the economy was in shambles and oil prices were not high.
link |
00:38:54.960
So the country didn't have anything.
link |
00:38:56.740
We defaulted in 1998 and the money kept jumping back and forth.
link |
00:39:01.680
Like first there were millions of rubles, then it got, like, defaulted, you know, then it
link |
00:39:05.640
got to like thousands.
link |
00:39:06.640
Then one ruble was something, then again to millions; it was like crazy town.
link |
00:39:11.800
That was crazy.
link |
00:39:12.960
And then the two thousands were just these years of stability in a way and the country
link |
00:39:19.480
getting a little bit richer because of, you know, again, oil and gas.
link |
00:39:22.680
And we started, specifically in Moscow and St. Petersburg,
link |
00:39:27.040
to look at other cities in Europe, and New York, and the US, and trying to do the same in our
link |
00:39:34.600
like small kind of cities, towns there.
link |
00:39:38.000
What were your thoughts of Putin at the time?
link |
00:39:40.320
Well, in the beginning he was really positive.
link |
00:39:43.480
Everyone was very, you know, positive about Putin.
link |
00:39:46.000
He was young.
link |
00:39:47.000
Um, very energetic.
link |
00:39:49.560
He also immediately did the shirtless thing? Well, no, that was way before the
link |
00:39:55.960
shirtless era.
link |
00:39:56.960
Um, the shirtless era.
link |
00:39:57.960
Okay.
link |
00:39:58.960
So he didn't start out shirtless.
link |
00:39:59.960
When did the shirtless era start? It's like the propaganda of riding horses, fishing. 2010,
link |
00:40:05.720
11, 12.
link |
00:40:06.720
Yeah.
link |
00:40:07.720
That's my favorite.
link |
00:40:08.720
You know, like people talk about their favorite Beatles; my favorite Putin
link |
00:40:14.760
is the shirtless Putin.
link |
00:40:15.760
Now, I remember very, very clearly 1996, when, you know, Americans really helped Russia with
link |
00:40:20.960
elections and Yeltsin got reelected, um, thankfully so, uh, because there was a huge threat that
link |
00:40:27.560
the communists would actually get back to power.
link |
00:40:29.680
Uh, they were a lot more popular.
link |
00:40:32.680
And then a lot of American experts, political experts, uh, and campaign experts descended
link |
00:40:39.660
on Moscow and helped Yeltsin actually get the presidency, the second term
link |
00:40:44.360
of the presidency.
link |
00:40:46.340
But Yeltsin was not feeling great, you know, by the end of his second term, uh,
link |
00:40:52.240
he was, you know, an alcoholic.
link |
00:40:53.920
He was really old.
link |
00:40:54.920
He was falling off, uh, you know, the stages where he was talking.
link |
00:40:59.960
Uh, so people were looking, I think, for a fresh face, for someone who's going
link |
00:41:04.560
to continue Yeltsin's, uh, work, but who's going to be a lot more energetic and a lot
link |
00:41:09.680
more active, young, um, efficient, maybe.
link |
00:41:15.480
So that's what we all saw in Putin back in the day.
link |
00:41:17.880
I'd say that everyone, absolutely everyone in Russia in the early two thousands who was not
link |
00:41:22.360
a communist would be, yeah, Putin's great.
link |
00:41:25.360
We have a lot of hopes for him.
link |
00:41:26.960
What are your thoughts?
link |
00:41:27.960
And I promise we'll get back to, uh, first of all, your love story.
link |
00:41:34.240
Second of all, AI, well, what are your thoughts about, um, communism?
link |
00:41:40.200
The 20th century, I apologize.
link |
00:41:42.680
I'm reading The Rise and Fall of the Third Reich.
link |
00:41:46.240
Oh my God.
link |
00:41:48.580
So I'm, like, really steeped in World War II and Stalin and Hitler and just these
link |
00:41:56.660
dramatic personalities that brought so much evil to the world.
link |
00:42:00.840
But it's also interesting to politically think about these different systems and what they've
link |
00:42:06.580
led to.
link |
00:42:08.320
And Russia is one of the sort of beacons of communism in the 20th century.
link |
00:42:16.280
What are your thoughts about communism?
link |
00:42:17.840
Having experienced it as a political system?
link |
00:42:20.480
I mean, I have only experienced it a little bit, but mostly through stories and through,
link |
00:42:24.360
you know, seeing my parents and my grandparents who lived through that, I mean, it was horrible.
link |
00:42:29.320
It was just plain horrible.
link |
00:42:31.120
It was just awful.
link |
00:42:33.360
Do you think there's something to it? I mean, it sounds nice on paper.
link |
00:42:40.960
So, like, the drawback of capitalism is that, uh, you know, eventually,
link |
00:42:47.840
there's the point of, like, a slippery slope.
link |
00:42:51.160
Eventually it creates, uh, you know, the rich get richer, it creates a disparity, like inequality
link |
00:42:59.040
of, um, wealth inequality.
link |
00:43:02.520
If like, you know, I guess it's hypothetical at this point, but eventually capitalism leads
link |
00:43:09.080
to humongous inequality, and, you know, some people argue that that's a source
link |
00:43:13.720
of unhappiness. It's not, like, the absolute wealth of people.
link |
00:43:17.720
It's the fact that there's a lot of people much richer than you.
link |
00:43:21.240
There's a feeling of like, that's where unhappiness can come from.
link |
00:43:25.300
So the idea of communism, or this sort of Marxism, is, uh, not allowing that
link |
00:43:32.960
kind of slippery slope, but then you see the actual implementations of it, and it seems
link |
00:43:37.800
to go wrong very badly.
link |
00:43:42.400
What do you think that is?
link |
00:43:43.880
Why does it go wrong?
link |
00:43:46.680
What is it about human nature?
link |
00:43:47.680
If we look at Chernobyl, you know, those kinds of bureaucracies that were constructed.
link |
00:43:54.740
Is there something like, do you think about this much of like why it goes wrong?
link |
00:44:00.280
Well, it's not that everyone was equal.
link |
00:44:05.320
Obviously, you know, the government and everyone close to it were the bosses.
link |
00:44:12.160
So it's not fully, I guess, uh, this dream of an equal life.
link |
00:44:17.740
So then I guess the situation that we had, you know, that Russia had in the Soviet
link |
00:44:24.720
Union, it was more a bunch of really poor people without any way to make any, you
link |
00:44:30.720
know, significant fortune or build anything, living, um, under constant
link |
00:44:37.640
surveillance from other people.
link |
00:44:38.920
Like you can't even, you know, uh, do anything that's not fully approved by the dictatorship
link |
00:44:45.800
basically.
link |
00:44:46.800
Otherwise your neighbor will write a letter and you'll go to jail; an absolute absence of
link |
00:44:53.080
actual law.
link |
00:44:54.080
Yeah.
link |
00:44:55.080
It's a constant state of fear.
link |
00:44:57.880
You didn't own anything.
link |
00:45:00.000
You know, you couldn't go travel, you couldn't read anything, uh, Western,
link |
00:45:05.760
and you couldn't really make a career, unless you were working in the, uh, military complex.
link |
00:45:11.840
Um, which is why most of the scientists were so well regarded.
link |
00:45:15.560
You know, both my dad and my mom come from families of scientists, and they
link |
00:45:20.580
were really well regarded, as you know, obviously.
link |
00:45:23.280
Because the state wanted, I mean, cause there's a lot of value to them being well regarded.
link |
00:45:29.960
Because they were developing things that could be used in, in the military.
link |
00:45:34.800
So that was very important.
link |
00:45:35.800
That was the main investment.
link |
00:45:36.800
Um, but it was miserable, it was all miserable.
link |
00:45:40.320
That's why, you know, a lot of Russians now live in the state of constant PTSD.
link |
00:45:43.640
That's why we, you know, want to buy, buy, buy, and definitely, as soon as
link |
00:45:48.360
we have the opportunity, you know, we finally got to it, that we can, you know, own
link |
00:45:53.000
things.
link |
00:45:54.000
You know, I remember the time that we got our first yogurts and that was the biggest
link |
00:45:57.560
deal in the world.
link |
00:45:58.560
It was already in the nineties, by the way. I mean, what was your, like, favorite food
link |
00:46:03.920
where it was like, whoa, this is possible? Oh, fruit, because we only had apples, bananas
link |
00:46:12.600
and whatever.
link |
00:46:13.600
And you know, whatever watermelons, whatever, you know, people would grow in the Soviet
link |
00:46:17.960
Union.
link |
00:46:18.960
There were no pineapples or papayas or mangoes; like, you'd never seen those fruit things.
link |
00:46:24.240
Like those were so ridiculously good.
link |
00:46:27.480
And obviously you could not get any like strawberries in winter or anything that's not, you know,
link |
00:46:32.760
seasonal.
link |
00:46:33.760
Um, so that was a really big deal.
link |
00:46:34.760
Seeing all these fruit things.
link |
00:46:36.240
Yeah.
link |
00:46:37.240
Me too.
link |
00:46:38.240
Actually.
link |
00:46:39.240
I don't know.
link |
00:46:40.240
I don't think I have too many demons, or, like, addictions
link |
00:46:44.160
or so on, but I think I've developed an unhealthy relationship with fruit.
link |
00:46:47.960
I still struggle with it. You can get any type of fruit, right?
link |
00:46:51.880
You also get these weird fruits, like dragon fruit or something, or all kinds
link |
00:46:57.880
of different types of peaches. Like, cherries were killer for me.
link |
00:47:02.080
I know, I know you say like we had bananas and so on, but I don't remember having the
link |
00:47:06.720
kind of banana.
link |
00:47:07.720
Like when I first came to this country, the amount of bananas... I literally got fat
link |
00:47:12.920
on bananas, like the amount... Oh yeah, for sure.
link |
00:47:17.520
They were delicious.
link |
00:47:18.520
And like cherries, the kind, like just the quality of the food, I was like, this is capitalism.
link |
00:47:24.160
This is delicious.
link |
00:47:25.160
Yeah.
link |
00:47:26.160
I am.
link |
00:47:31.160
Like, it's funny to read.
link |
00:47:36.800
I don't know what to think of it. It's funny to think how an idea that's just
link |
00:47:44.280
written on paper, when carried out amongst millions of people, what it actually
link |
00:47:49.720
looks like when it becomes reality. Sorry, but I've been studying
link |
00:47:58.640
Hitler a lot recently, going through Mein Kampf.
link |
00:48:04.040
He pretty much wrote out in Mein Kampf everything he was going to do.
link |
00:48:07.960
Unfortunately, most leaders, including Stalin, didn't read it. But it's kind
link |
00:48:13.480
of terrifying and I don't know.
link |
00:48:16.140
And amazing in some sense that you can have some words on paper and they can be brought
link |
00:48:21.120
to life and they can either inspire the world or they can destroy the world.
link |
00:48:26.560
And uh, yeah, there's a lot of lessons to study in history that I think people don't
link |
00:48:32.480
study enough now.
link |
00:48:35.520
One of the things I'm hoping with, I've been practicing Russian a little bit.
link |
00:48:40.000
I'm hoping to sort of find, rediscover the, uh, the beauty and the terror of Russian history
link |
00:48:49.640
through this stupid podcast by talking to a few people.
link |
00:48:55.360
So anyway, I just feel like so much was forgotten.
link |
00:48:58.400
So much was forgotten.
link |
00:48:59.400
You're a super busy and super
link |
00:49:04.960
important person, but I'm going to try to befriend you, to try to become
link |
00:49:11.000
a better Russian.
link |
00:49:12.000
Cause I feel like I'm a shitty Russian.
link |
00:49:14.160
Not that busy.
link |
00:49:15.160
So I can totally be your Russian Sherpa.
link |
00:49:19.040
Yeah.
link |
00:49:20.920
But, love. You were talking about your early days of being a little bit
link |
00:49:28.160
alone and finding a connection with the world through being a journalist.
link |
00:49:33.240
Where did love come into that?
link |
00:49:36.200
I guess finding, for the first time, some friends. It's a very, you know, simple story.
link |
00:49:42.680
Some friends that, all of a sudden... I guess we were at the
link |
00:49:48.920
same place in our lives; we were 25, 26, I guess.
link |
00:49:55.400
And we just got really close, and I somehow remember this one
link |
00:50:00.400
day in summer when we just stayed out
link |
00:50:06.640
the whole night and just talked and for some unknown reason, it just felt for the first
link |
00:50:11.240
time that someone could, you know, see me for who I am, and it just felt
link |
00:50:17.000
extremely good.
link |
00:50:18.000
And, you know, we fell asleep outside, just talking, and it was raining.
link |
00:50:22.520
It was beautiful, you know, sunrise and it's really cheesy, but, um, at the same time,
link |
00:50:28.440
we just became friends in a way that I've never been friends with anyone else before.
link |
00:50:33.840
And I do remember that. Afterward you sort of have this unconditional family
link |
00:50:38.360
sort of thing, and it gives you tons of power.
link |
00:50:43.440
It just basically gives you this tremendous power to do things in your life and to, um,
link |
00:50:50.680
change positively on many different levels.
link |
00:50:53.920
Power because you could be yourself.
link |
00:50:56.720
At least you know that somewhere you can be just yourself. Like, you don't need
link |
00:51:01.760
to pretend, you don't need to be, you know, great at work, or tell some story, or sell
link |
00:51:07.920
yourself one way or another.
link |
00:51:10.280
And so we became really close friends. And, in a way, I started a company
link |
00:51:17.200
because he had a startup and I felt like I kind of wanted a startup too.
link |
00:51:20.120
It felt really cool.
link |
00:51:21.120
I didn't know what I would really do, but I felt like I kind of needed
link |
00:51:25.720
a startup.
link |
00:51:26.720
Okay.
link |
00:51:27.720
So that's what pulled you into the startup world.
link |
00:51:32.040
Yeah.
link |
00:51:33.320
And then, yeah.
link |
00:51:35.680
And then this, uh, closest friend of mine died.
link |
00:51:38.400
We actually moved here to San Francisco together and then we went back for a visa to Moscow
link |
00:51:42.720
and we lived together, we were roommates, and we came back, and he got hit by a
link |
00:51:48.520
car right in front of the Kremlin, you know, next to the river, and died the same day
link |
00:51:54.520
at the hospital. This is Roman.
link |
00:51:58.440
So you had moved to America at that point? At that point I was... What about him?
link |
00:52:05.720
What about Roman?
link |
00:52:06.720
Him too.
link |
00:52:07.720
He actually moved first.
link |
00:52:08.720
So I was always sort of trying to do what he was doing, so I didn't like that he was
link |
00:52:11.960
already here and I was still, you know, in Moscow and we weren't hanging out together
link |
00:52:15.580
all the time.
link |
00:52:16.580
So was he in San Francisco?
link |
00:52:18.080
Yeah, we were roommates.
link |
00:52:20.540
So he just visited Moscow for a little bit.
link |
00:52:23.400
We went back for our visas; we had to get a stamp in our passports for our work visas,
link |
00:52:28.920
and the embassy was taking a little longer, so we stayed there for a couple of weeks.
link |
00:52:34.720
What happened?
link |
00:52:35.720
So how did he die?
link |
00:52:40.200
Um, he was crossing the street and the car was going really fast and way over the speed
link |
00:52:45.280
limit, and just didn't stop at the pedestrian crossing, the zebra, and just ran
link |
00:52:51.520
over him.
link |
00:52:52.520
When was this?
link |
00:52:53.520
It was in 2015, on the 28th of November, so it was long ago now.
link |
00:52:59.320
Um, but at the time, you know, I was 29, so for me it was, um, the first kind of meaningful
link |
00:53:06.120
death in my life.
link |
00:53:07.760
You know, I had both sets of grandparents alive at the time.
link |
00:53:12.840
I hadn't seen anyone so close die; death sort of existed, but as a concept, definitely
link |
00:53:18.880
not as something that would be, you know, happening to us anytime soon and specifically
link |
00:53:24.720
our friends.
link |
00:53:25.720
Cause we were, you know, still in our twenties or early thirties, and it
link |
00:53:29.880
still felt like the whole life was ahead; you could still dream about ridiculous,
link |
00:53:36.120
different things.
link |
00:53:37.120
Um, so that was, it was just really, really abrupt I'd say.
link |
00:53:43.840
What did it feel like to, uh, to lose him, like that feeling of loss?
link |
00:53:49.680
You talked about the feeling of love, having power.
link |
00:53:53.120
What is the feeling of loss, if you like?
link |
00:53:57.520
Well in Buddhism, there's this concept of Samaya where something really like huge happens
link |
00:54:04.720
and then you can see very clearly.
link |
00:54:07.160
I think that was it. Basically, something changed me so much in
link |
00:54:13.320
such a short period of time that I could just see really, really clearly what mattered or
link |
00:54:19.240
what didn't.
link |
00:54:20.240
Well, I definitely saw that whatever I was doing at work didn't matter at all and some
link |
00:54:25.800
of the things.
link |
00:54:26.800
And it was just this big realization, this very, very clear vision of
link |
00:54:31.400
what life's about.
link |
00:54:35.280
You still miss him today?
link |
00:54:37.280
Yeah, for sure.
link |
00:54:40.360
For sure.
link |
00:54:41.840
He was just this constant... I think he was really important for me and for
link |
00:54:47.360
our friends for many different reasons and, um, I think one of them being that we didn't
link |
00:54:53.120
just say goodbye to him, but we sort of said goodbye to our youth in a way.
link |
00:54:58.160
It was like the end of an era, on so many different levels.
link |
00:55:02.400
The end of Moscow as we knew it, the end of, you know, us living through our twenties and
link |
00:55:08.720
kind of dreaming about the future.
link |
00:55:11.600
Do you remember, like, the last several conversations? Are there moments with him that stick out, that
link |
00:55:17.920
kind of haunt you, when you think about him?
link |
00:55:22.720
Yeah. Well, his last year here in San Francisco he was pretty depressed, as his startup
link |
00:55:28.920
was not going really anywhere and he wanted to do something else.
link |
00:55:32.600
He wanted to build... he played with a bunch of ideas, but the
link |
00:55:39.880
last one he had was around, um, building a startup around death.
link |
00:55:44.680
He applied to Y Combinator with a video that, you know, I had on my computer,
link |
00:55:52.280
and it was all about, you know, disrupting death, thinking about new cemeteries, uh,
link |
00:55:57.760
more biologically, like things that could be better biologically for humans.
link |
00:56:03.400
And at the same time, having those digital avatars, these kind of AI avatars that would
link |
00:56:12.800
store all the memory about a person, that you could interact with.
link |
00:56:15.920
What year was this?
link |
00:56:16.920
2015.
link |
00:56:17.920
Well, right before his death.
link |
00:56:19.920
So it was like a couple of months before that he recorded that video.
link |
00:56:23.760
And so I found it on my computer... it was recorded in our living room.
link |
00:56:28.180
He never got in, but he was thinking about it a lot somehow.
link |
00:56:33.080
Did it have the digital avatar idea?
link |
00:56:35.240
Yeah.
link |
00:56:36.240
That's so interesting.
link |
00:56:37.240
Well, that's in his pitch. The pitch has this idea, and he talks
link |
00:56:42.160
about like, I want to rethink how people grieve and how people talk about death.
link |
00:56:45.960
Why was he interested in this?
link |
00:56:48.960
Is it... maybe someone who's depressed is naturally inclined to think about that?
link |
00:56:56.000
But I just felt, you know, this year in San Francisco, we just had so much, um, I was
link |
00:57:00.800
going through a hard time.
link |
00:57:01.800
And I was definitely trying to make him happy somehow, to make him feel better.
link |
00:57:07.940
And it felt like, you know... I dunno, I just felt like I was taking care of him
link |
00:57:13.840
a lot and he almost started to feel better.
link |
00:57:17.000
And then that happened. And, I dunno, I just felt lonely again, I guess.
link |
00:57:23.920
And that was, you know, coming back to San Francisco in December. I helped
link |
00:57:28.440
organize the funeral, helped his parents, and I came back here, and it was a really lonely
link |
00:57:33.680
apartment, a bunch of his clothes everywhere, and Christmas time.
link |
00:57:38.520
And I remember I had a board meeting with my investors and I just couldn't talk about
link |
00:57:42.280
it; I had to pretend everything's okay.
link |
00:57:44.960
And you know, I'm just working on this company.
link |
00:57:47.360
Yeah, it was definitely a very, very tough time.
link |
00:57:55.360
Do you think about your own mortality?
link |
00:58:00.160
You said, you know, we're young; the possibility of doing all kinds
link |
00:58:06.900
of crazy things is still out there, is still before us, but, uh, it can end any moment.
link |
00:58:12.900
Do you think about your own ending at any moment?
link |
00:58:17.640
Unfortunately, I think about it way too much.
link |
00:58:23.320
Somehow after Roman, like every year after that, I started losing people that I really
link |
00:58:27.800
love.
link |
00:58:28.800
I lost my grandfather the next year, you know, the person who would explain to
link |
00:58:34.640
me what the universe is made of while selling apples, and then I lost another
link |
00:58:41.360
close friend of mine and, um, and it just made me very scared.
link |
00:58:46.680
I have tons of fear about that.
link |
00:58:48.520
That's what makes me not fall asleep oftentimes, just going in loops. And then, as
link |
00:58:54.760
my therapist recommended to me, I open up some nice calming images with
link |
00:59:02.520
the voiceover and it calms me down for sleep.
link |
00:59:06.680
Yeah.
link |
00:59:07.680
I'm really scared of death.
link |
00:59:08.680
I definitely have, I guess, some pretty big trauma about it, and I'm
link |
00:59:15.000
still working through it.
link |
00:59:17.300
There's a philosopher, Ernest Becker, who wrote the book The Denial of Death.
link |
00:59:22.920
I'm not sure if you're familiar with any of those folks.
link |
00:59:25.600
Um, there's a, in psychology, a whole field called terror management theory.
link |
00:59:32.320
Sheldon Solomon, who just did the podcast, wrote the book.
link |
00:59:36.240
We talked for four hours about death, the fear of death. But his whole
link |
00:59:44.720
idea, from Ernest Becker, which I find really compelling, is that
link |
00:59:52.160
everything human beings have created, like our whole motivation in life, is to
link |
01:00:00.640
escape death, to try to construct an illusion that we're somehow immortal.
link |
01:00:11.640
So everything around us, this room, your startup, your dreams, everything you do,
link |
01:00:21.040
is a kind of creation of a brain that, unlike any other mammal or species, is able to be
link |
01:00:30.460
cognizant of the fact that it ends for us.
link |
01:00:35.180
So, you know, there's this question of the meaning of life; you know,
link |
01:00:40.540
you look at what drives us humans.
link |
01:00:44.260
And when I read Ernest Becker, which I highly recommend people read, it was the first time
link |
01:00:50.060
it felt like, this is the right thing at the core.
link |
01:00:54.160
Sheldon's book is called The Worm at the Core.
link |
01:00:57.980
So he's saying, I think it's William James he's quoting, or whoever: the
link |
01:01:05.240
thing is, what is at the core of it all?
link |
01:01:07.760
Whether there's like love, you know, Jesus might talk about like love is at the core
link |
01:01:12.540
of everything.
link |
01:01:13.540
I don't, you know, that's the open question.
link |
01:01:15.640
You know, it's turtles on turtles, but it can't be turtles all the way down.
link |
01:01:19.980
What's at the bottom?
link |
01:01:22.300
And Ernest Becker says the fear of death. In fact, cause you said therapist
link |
01:01:30.980
and calming images: his whole idea is, you know, we want to bring that fear of
link |
01:01:36.860
death as close as possible to the surface and, like, meditate on it,
link |
01:01:43.900
and use the clarity of vision that provides to, you know, live a more
link |
01:01:49.820
fulfilling life, to live a more honest life, to discover... you know, there's something
link |
01:01:58.580
about being cognizant of the finiteness of it all that might result in the
link |
01:02:05.500
most fulfilling life.
link |
01:02:07.580
So that's the dual of what you're saying.
link |
01:02:10.500
Cause you kind of said, it's like, I unfortunately think about it too much.
link |
01:02:15.180
It's a question whether it's good to think about it, because, again, I talk
link |
01:02:20.020
way too much about love and probably death.
link |
01:02:23.260
And when I ask people, friends, which is why I probably don't have many friends, are you
link |
01:02:29.620
afraid of death?
link |
01:02:30.820
I think most people say they're not.
link |
01:02:35.020
Or if they say they're afraid, you know, it's almost like they see
link |
01:02:41.700
death as this kind of like, uh, a paper deadline or something.
link |
01:02:45.980
And they're afraid not to finish the paper before the deadline. Like, I'm afraid not
link |
01:02:50.020
to finish the goals I have. But it feels like they're not actually realizing that this
link |
01:02:57.540
thing ends. Like really realizing, really thinking, as Nietzsche and all these philosophers did,
link |
01:03:04.340
thinking deeply about it. The very thing is, you know, when you
link |
01:03:13.740
think deeply about something, you can realize that you haven't actually
link |
01:03:18.500
thought about it.
link |
01:03:20.500
Uh, yeah.
link |
01:03:22.660
And when I think about death, it can be... it's terrifying.
link |
01:03:28.500
It feels like stepping outside into the cold when it's freezing, and then I have to, like,
link |
01:03:34.000
hurry back inside where it's warm.
link |
01:03:36.820
Uh, but like, I think there's something valuable about stepping out there into the freezing
link |
01:03:43.180
cold.
link |
01:03:44.180
Definitely.
link |
01:03:45.180
When I talk to my mentor about it, he always, uh, tells me, well, what dies?
link |
01:03:52.820
There's nothing there that can die. But I guess that requires... well, in Buddhism,
link |
01:04:00.700
one of the concepts that are really hard to grasp and that people spend all their lives
link |
01:04:05.240
meditating on would be anatta, which is the concept of non-self. And kind of thinking
link |
01:04:12.420
that, you know, if you're not your thoughts, which you're obviously not your thoughts because
link |
01:04:15.260
you can observe them and not your emotions and not your body, then what is this?
link |
01:04:20.980
And if you go really far, then finally you see that there's no self; there's this concept
link |
01:04:27.580
of non-self.
link |
01:04:28.580
So once you get there, how can that actually die?
link |
01:04:32.260
What is dying?
link |
01:04:33.260
Right.
link |
01:04:34.260
You're just a bunch of molecules, stardust.
link |
01:04:38.900
But that is very, you know, very advanced spiritual work. For me,
link |
01:04:44.300
I'm definitely not there.
link |
01:04:47.100
Oh my God.
link |
01:04:48.100
No, I think it's very, very useful.
link |
01:04:50.740
It's just that maybe being so afraid is not useful. And mine is more... I'm just terrified.
link |
01:04:56.500
Like, it really makes me...
link |
01:04:58.300
On a personal level.
link |
01:04:59.300
On a personal level.
link |
01:05:00.300
I'm terrified.
link |
01:05:01.300
How do you overcome that?
link |
01:05:02.300
I don't.
link |
01:05:03.300
I'm still trying to.
link |
01:05:04.300
Have pleasant images?
link |
01:05:05.300
Well, pleasant images get me to sleep and then during the day I can distract myself with
link |
01:05:20.540
other things, like talking to you.
link |
01:05:24.460
I'm glad we're both doing the same exact thing.
link |
01:05:26.740
Okay, good.
link |
01:05:27.740
Are there moments since you've lost Roman that you had, like, moments
link |
01:05:39.540
of bliss, where you've forgotten, where you've achieved that Buddhist, like,
link |
01:05:47.980
level of, what can possibly die?
link |
01:05:52.380
Like losing yourself in the moment, in the ticking time of this universe,
link |
01:06:02.020
and you're just part of it for a brief moment and just enjoying it.
link |
01:06:06.980
Well that goes hand in hand.
link |
01:06:08.260
I remember, I think a day or two after he died, we went to finally get his passport out of
link |
01:06:13.940
the embassy and we're driving around Moscow and it was, you know, December, which is usually
link |
01:06:19.340
there's never a sun in Moscow in December and somehow it was an extremely sunny day
link |
01:06:25.420
and we were driving with a close friend.
link |
01:06:30.700
And I remember feeling, for the first time maybe, this moment of incredible clarity
link |
01:06:35.420
and somehow happiness, not like happy happiness, but happiness and just feeling that, you know,
link |
01:06:45.860
I know what the universe is sort of about, whether it's good or bad.
link |
01:06:49.820
And it wasn't a sad feeling.
link |
01:06:50.820
It was probably the most beautiful feeling that you can ever achieve.
link |
01:06:56.120
And you can only get it, oftentimes, when something traumatic like that happens.
link |
01:07:03.260
But also if you really spend a lot of time meditating and looking at nature,
link |
01:07:07.040
doing something that really gets you there.
link |
01:07:09.900
I think when you summit a mountain, a really hard mountain,
link |
01:07:14.460
you inevitably get there.
link |
01:07:16.100
That's just a way to get to the state.
link |
01:07:18.500
But once you're in this state, you can do really big things,
link |
01:07:24.140
I think.
link |
01:07:25.140
Yeah.
link |
01:07:26.140
Sucks it doesn't last forever.
link |
01:07:28.100
So Bukowski talked about how love is a fog.
link |
01:07:32.460
Like, when you wake up in the morning, it's there, but it eventually dissipates.
link |
01:07:38.500
It's really sad.
link |
01:07:40.460
Nothing lasts forever.
link |
01:07:41.460
But I definitely like doing this pushup and running thing.
link |
01:07:46.620
There's been a couple of moments. Like, I'm not a crier.
link |
01:07:51.100
I don't cry.
link |
01:07:52.100
But there's moments where I was, like, facedown on the carpet with tears in my eyes,
link |
01:07:59.100
which is interesting.
link |
01:08:00.100
And then that complete... like, there's a lot of demons.
link |
01:08:05.100
I've got demons; I had to face them.
link |
01:08:07.740
Funny how running makes you face your demons.
link |
01:08:09.560
But at the same time, the flip side of that, there's a few moments where I was in bliss
link |
01:08:16.580
and all of it alone, which is funny.
link |
01:08:19.420
That's beautiful.
link |
01:08:20.420
I like that. But definitely, pushing yourself physically is one of them, for sure.
link |
01:08:27.060
Yeah.
link |
01:08:28.060
Yeah.
link |
01:08:29.060
Like you said, I mean, you were speaking of Mount Everest as a metaphor, but it also works
link |
01:08:34.020
literally, I think. Physical endeavor, somehow.
link |
01:08:39.580
Yeah.
link |
01:08:40.580
There's something.
link |
01:08:41.580
I mean, we're monkeys, apes, whatever; there's a physical thing to it, but there's
link |
01:08:46.860
something to this pushing yourself physically, but alone, that happens when you're
link |
01:08:53.020
doing things like you do, or strenuous workouts, or, you know, rowing
link |
01:08:58.060
across the Atlantic, or marathons.
link |
01:09:01.580
I love watching marathons and you know, it's so boring, but you can see them getting there.
link |
01:09:09.540
So the other thing, I don't know if you know, there's a guy named David Goggins.
link |
01:09:14.100
He basically... so he's been either emailing or on the phone with me every day through
link |
01:09:20.020
this.
link |
01:09:21.020
I haven't been exactly alone, but he's kind of the devil on my
link |
01:09:27.820
shoulder.
link |
01:09:28.820
So he's, like, the worst possible human being in terms of giving you... like,
link |
01:09:36.140
through everything I've been doing, he's been doubling everything I do.
link |
01:09:40.840
So he's insane.
link |
01:09:42.620
He's this Navy SEAL person.
link |
01:09:45.740
He wrote this book,
link |
01:09:47.620
Can't Hurt Me.
link |
01:09:48.620
He's basically one of the toughest human beings on earth.
link |
01:09:50.620
He ran all these crazy ultra marathons in the desert.
link |
01:09:54.180
He set the world record for the number of pull-ups.
link |
01:09:56.980
He just does everything where it's like, how can I suffer today?
link |
01:10:03.620
He figures that out and does it.
link |
01:10:05.500
Yeah.
link |
01:10:06.500
Whatever that is, that process of self-discovery is really important.
link |
01:10:11.660
I actually had to turn myself off from the internet, mostly because I started this, like,
link |
01:10:16.100
workout thing like a happy go-getter, with my, like, headband and everything,
link |
01:10:24.140
because a lot of people were like inspired and they're like, yeah, we're going to exercise
link |
01:10:27.500
with you.
link |
01:10:28.740
And I was like, yeah, great.
link |
01:10:30.220
You know, but then, like, I realized that this journey can't be done together with others.
link |
01:10:38.700
This has to be done alone.
link |
01:10:41.460
So out of the moments of love, out of the moments of loss, can we, uh, talk about your
link |
01:10:48.820
journey of finding, I think, an incredible idea and incredible company and incredible
link |
01:10:56.780
system in Replika?
link |
01:10:59.180
How did that come to be?
link |
01:11:01.320
So yeah, so I was a journalist and then I went to business school for a couple of years
link |
01:11:05.940
to just see if I could maybe switch gears and do something else at 23.
link |
01:11:12.700
And then I came back and started working for a businessman in Russia who built the first
link |
01:11:17.500
4G network in our country, and was very visionary, and asked me whether I wanted to do
link |
01:11:25.580
fun stuff together.
link |
01:11:26.580
And we worked on a bank; the idea was to build a bank on top of a telco.
link |
01:11:34.060
So that was 2011 or '12, and a lot of telecommunication companies, mobile network operators, didn't
link |
01:11:42.560
really know what to do next in terms of, you know, new products, new revenue.
link |
01:11:47.660
And the big idea was that, you know, you put a bank on top and then it all works
link |
01:11:53.620
out.
link |
01:11:54.620
Basically a prepaid account becomes your bank account and you can use it as your
link |
01:11:58.900
bank.
link |
01:11:59.900
So, you know, a third of the country wakes up as your bank client.
link |
01:12:05.060
But we couldn't quite figure out what would be the main interface to interact
link |
01:12:10.100
with the bank.
link |
01:12:11.180
The problem was that most people didn't have smartphones back then in Russia;
link |
01:12:15.500
the penetration of smartphones was low, and people didn't use mobile banking or online
link |
01:12:20.300
banking on their computers.
link |
01:12:22.720
So we figured out that SMS would be the best way, uh, cause that would work on feature
link |
01:12:26.940
phones.
link |
01:12:27.940
But that required some chatbot technology, which I didn't know anything about, obviously.
link |
01:12:33.900
So I started looking into it and saw that there was nothing really... well, there was
link |
01:12:37.540
just nothing, really.
link |
01:12:38.540
The idea is, through SMS, to be able to interact with your bank account.
link |
01:12:41.500
Yeah.
link |
01:12:42.500
And then we thought, well, since you're talking to a bank account, why can't we
link |
01:12:46.460
use more of, you know, some behavioral ideas? Why can't this banking chat
link |
01:12:52.020
bot be nice to you and really talk to you sort of as a friend? This way you develop more
link |
01:12:56.060
connection to it, retention is higher, people don't churn.
link |
01:12:59.900
And so I went to very depressing Russian cities to test it out.
link |
01:13:05.700
I went to, I remember, three different towns to interview potential
link |
01:13:12.100
users.
link |
01:13:13.100
So people used it for a little bit, and I went to talk to them. Very poor towns,
link |
01:13:19.660
mostly towns that were, you know, built around factories, mono-towns.
link |
01:13:26.820
They were building something and then the factory went away and it was just a bunch
link |
01:13:29.940
of very poor people.
link |
01:13:32.620
And then we went to a couple that weren't as dramatic. But still, the one I remember
link |
01:13:37.420
really fondly was this woman that worked at a glass factory, and she talked to the chatbot.
link |
01:13:41.940
Um, and she was talking about it and she started crying during the interview because she said,
link |
01:13:46.960
no one really cares for me that much.
link |
01:13:50.300
And, so to be clear, that was my only endeavor in programming, that chatbot.
link |
01:13:56.860
So it was really simple.
link |
01:13:58.700
It was literally just a few if-this-then-that rules, and it was incredibly simplistic.
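(A minimal sketch, for the curious, of what a few if-this-then-that rules can look like, in Python; the patterns and replies here are my own illustration, not Replika's actual code.)

    import re

    # A hypothetical rule table: if the user's message matches a pattern,
    # send the canned reply. This lookup is the entire "intelligence".
    RULES = [
        (re.compile(r"\b(sad|lonely|tired)\b", re.I), "I'm sorry to hear that. I'm here for you."),
        (re.compile(r"\b(balance|money)\b", re.I), "Your prepaid account doubles as your bank account."),
        (re.compile(r"\b(hi|hello)\b", re.I), "Hello! How is your day going?"),
    ]

    def reply(message: str) -> str:
        for pattern, response in RULES:
            if pattern.search(message):
                return response
        return "Tell me more."  # fallback when no rule fires

    print(reply("I feel lonely today"))  # -> I'm sorry to hear that. I'm here for you.

Even a table this small can feel caring to someone who is lonely, which is exactly the point she makes next.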
link |
01:14:06.980
And that really made her emotional, and she said, you know, I only have my mom and
link |
01:14:12.380
my husband, and I don't have anyone else really in my life.
link |
01:14:18.260
And that was very sad. But at the same time, we had more interviews in a similar
link |
01:14:22.300
vein, and what I thought in the moment was, well, it's not that the technology
link |
01:14:27.760
is ready, because definitely in 2012 the technology was not ready for that, but humans
link |
01:14:34.580
are ready, unfortunately.
link |
01:14:36.800
So this project would not be about tech capabilities; it would be more about human vulnerabilities.
link |
01:14:42.580
But there's something so powerful about conversational AI that I
link |
01:14:49.980
saw then, that I thought was definitely worth putting a lot of effort into.
link |
01:14:54.860
So at the end of the day, we sold the banking project. But my then boss, who was also
link |
01:15:01.620
my mentor and a really, really close friend, told me, hey, I think there's something
link |
01:15:06.780
in it and you should just go work on it.
link |
01:15:08.700
And I was like, well, what product?
link |
01:15:10.420
I don't know what I'm building.
link |
01:15:11.420
He's like, you'll figure it out.
link |
01:15:14.060
And, you know, looking back at this, this was a horrible idea, to work on something without
link |
01:15:18.520
knowing what it was, which is maybe the reason why it took us so long, but we just decided
link |
01:15:24.440
to work on the conversational tech to see where it goes. You know, there were no chatbot
link |
01:15:30.340
constructors or programs or anything that would allow you to actually build one at the
link |
01:15:35.660
time.
link |
01:15:36.660
That was the era of, by the way, Google Glass, which is why, you know, some of the
link |
01:15:40.540
investors, like seed investors we talked with, were like, oh, you should totally build
link |
01:15:44.340
it for Google Glass.
link |
01:15:45.340
If not, we're not interested; I don't think that's interesting.
link |
01:15:48.580
Did you bite on that idea?
link |
01:15:50.660
No.
link |
01:15:51.660
Okay.
link |
01:15:52.660
Because I wanted to do text first, cause I'm a journalist.
link |
01:15:56.740
So I was, um, fascinated by just texting.
link |
01:16:01.140
So you thought... the emotional interaction that the woman had, like, do
link |
01:16:07.740
you think you could feel emotion from just text?
link |
01:16:10.500
Yeah.
link |
01:16:11.540
I saw something in just this pure texting, and also thought that we should first
link |
01:16:17.200
start building for people who really need it, versus people who have Google Glass,
link |
01:16:20.420
if you know what I mean. And I felt like the early adopters of Google Glass might not
link |
01:16:25.740
be overlapping with people who are really lonely and might need some, you know, someone
link |
01:16:29.860
to talk to.
link |
01:16:31.340
But then we really just focused on the tech itself.
link |
01:16:35.100
We didn't have a product idea in the moment,
link |
01:16:39.260
and we felt, what if we just look into building the best conversational constructors,
link |
01:16:46.100
so to say, use the best tech available at the time.
link |
01:16:49.460
And that was before the first paper about deep learning applied to dialogues, which
link |
01:16:53.460
came out in August 2015, which Google published.
link |
01:17:01.820
Did you follow the work of the Loebner Prize and, like, all the sort of non-machine-learning
link |
01:17:09.460
chat bots?
link |
01:17:10.460
Yeah.
link |
01:17:11.460
What really struck me was that, you know, there was a lot of talk about machine learning
link |
01:17:15.060
and deep learning.
link |
01:17:16.180
Like big data was a really big thing.
link |
01:17:17.900
Everyone was saying, you know, in the business world, big data. In 2012 the biggest Kaggle
link |
01:17:22.620
competitions were, you know, important, but that was really the kind of upheaval.
link |
01:17:27.920
People started talking about machine learning a lot, um, but it was only about images or
link |
01:17:32.180
something else.
link |
01:17:33.460
And it was never about conversation.
link |
01:17:34.460
As soon as I looked into the conversational tech, it was all about something really weird
link |
01:17:39.620
and very outdated and very marginal and felt very hobbyist.
link |
01:17:42.940
It was all about the Loebner Prize, which was won by a guy who built a chatbot that
link |
01:17:47.660
talked like a Ukrainian teenager; it was just a gimmick.
link |
01:17:51.060
And somehow people picked up those gimmicks and then, you know, the most famous chat bot
link |
01:17:56.260
at the time was ELIZA, from the 1960s, which was really bizarre, or SmarterChild on AIM.
link |
01:18:03.720
The funny thing is it felt at the time not to be that popular and it still doesn't seem
link |
01:18:09.380
to be that popular.
link |
01:18:11.140
Like people talk about the Turing test, people like talking about it philosophically, journalists
link |
01:18:15.900
like writing about it, but as a technical problem, like people don't seem to really
link |
01:18:21.020
want to solve the open dialogue.
link |
01:18:26.180
Like they, they're not obsessed with it.
link |
01:18:29.660
Even folks... like, you know, I'm in Boston; the Alexa team, even they're not as obsessed
link |
01:18:35.640
with it as I thought they might be.
link |
01:18:38.620
Why not?
link |
01:18:39.620
What do you think?
link |
01:18:40.740
So, you know what you felt with that woman, when she felt something by
link |
01:18:45.620
reading the text, I feel the same thing.
link |
01:18:48.820
There's something here. What you felt,
link |
01:18:51.460
I feel like Alexa folks, and just the machine learning world, don't feel that there's
link |
01:18:59.700
something here, because they see it as a technical problem that's not that interesting, for some reason.
link |
01:19:07.060
It could be argued that maybe, as a purely sort of natural language processing problem,
link |
01:19:12.140
it's not the right problem to focus on because there's too much subjectivity.
link |
01:19:17.460
That thing the woman felt, like crying... if your benchmark includes a woman crying,
link |
01:19:24.580
that doesn't feel like a good benchmark.
link |
01:19:27.260
But to me there's something there where you could have a huge impact. But I don't
link |
01:19:32.660
think the machine learning world likes that, the human emotion, the subjectivity of it,
link |
01:19:38.940
the fuzziness, the fact that with maybe a single word you can make somebody feel something
link |
01:19:43.660
deeply.
link |
01:19:44.740
What is that?
link |
01:19:45.740
It doesn't feel right to them.
link |
01:19:47.660
So I don't know.
link |
01:19:48.660
I don't know why that is.
link |
01:19:50.220
That's why I was excited when I discovered your work. It feels wrong to say that...
link |
01:19:57.020
It's not like I'm giving myself props for Googling and for, I guess, a
link |
01:20:10.220
mutual friend introducing us. But I'm so glad that you exist, and for what you're working
link |
01:20:15.620
on.
link |
01:20:16.620
If we could just backtrack for a second, because I have the
link |
01:20:20.140
same kind of feeling that there's something here.
link |
01:20:22.180
In fact, I've been working on a few things that are kind of crazy, very different from
link |
01:20:29.820
your work.
link |
01:20:30.820
I think they're too crazy.
link |
01:20:34.140
But the...
link |
01:20:35.140
Like what?
link |
01:20:36.140
I don't have to know.
link |
01:20:38.140
No, all right, we'll talk about it more.
link |
01:20:41.980
I feel like it's harder to talk about things that have failed and are failing while you're
link |
01:20:49.380
a failure.
link |
01:20:53.180
It's easier for you because you're already successful on some measures.
link |
01:20:59.220
Tell it to my board.
link |
01:21:01.500
Well, I think you've demonstrated success in a lot of ways.
link |
01:21:07.700
It's easier for you to talk about failures for me.
link |
01:21:10.300
I'm currently at the bottom of the success curve.
link |
01:21:19.220
You're way too humble.
link |
01:21:21.860
So it's hard for me to know, but there's something there, there's something there.
link |
01:21:25.260
And I think you're exploring that and you're discovering that.
link |
01:21:31.100
So it's been surprising to me.
link |
01:21:32.220
But you've mentioned this idea that you thought it wasn't enough to start a company or start
link |
01:21:41.180
efforts based on "it feels like there's something here."
link |
01:21:46.700
Like what did you mean by that?
link |
01:21:49.900
Like you should be focused on creating a, like you should have a product in mind.
link |
01:21:55.620
Is that what you meant?
link |
01:21:56.620
It just took us a while to discover the product, because it all started with a hunch,
link |
01:22:03.180
me and my mentor just sitting around, and he was like, well, that's it.
link |
01:22:08.860
That's it... you know, the Holy Grail is there.
link |
01:22:11.060
There's something extremely powerful in conversations, and there's no one
link |
01:22:17.300
who's working on machine conversation from the right angle.
link |
01:22:19.820
So to say.
link |
01:22:20.820
I feel like that's still true.
link |
01:22:22.860
Am I crazy?
link |
01:22:23.860
Oh no, I totally feel that's still true, which is, I think it's mind blowing.
link |
01:22:28.940
Yeah.
link |
01:22:29.940
You know what it feels like?
link |
01:22:30.940
I wouldn't even use the word conversation cause I feel like it's the wrong word.
link |
01:22:35.620
It's like a machine connection or something.
link |
01:22:39.180
I don't know cause conversation, you start drifting into natural language immediately.
link |
01:22:44.340
You start drifting immediately into all the benchmarks that are out there.
link |
01:22:47.980
But I feel like it's like the personal computer days of this.
link |
01:22:52.580
Like, I feel like we're in the early days, like with Wozniak and all them,
link |
01:22:57.380
where it was the same kind of thing: a very small niche group of people, who are
link |
01:23:04.100
all kind of Loebner Prize type people.
link |
01:23:07.300
Yeah.
link |
01:23:08.300
Hobbyists.
link |
01:23:09.300
Hobbyists, but, like, not even hobbyists with big dreams;
link |
01:23:13.940
hobbyists with a dream to trick, like, a jury.
link |
01:23:17.580
Yeah.
link |
01:23:18.580
It's, like, weird, by the way. Very weird.
link |
01:23:21.540
So if we think about conversations, first of all, when I have great conversations with
link |
01:23:26.300
people, I'm not trying to test them.
link |
01:23:30.020
So for instance, I'm not trying to break them; I'm actually playing along, I'm part of
link |
01:23:33.900
it.
link |
01:23:34.900
Right.
link |
01:23:35.900
If I were to test whether this person is going to give me a good conversation,
link |
01:23:40.180
it would have never happened.
link |
01:23:41.260
So the whole problem with testing conversations is that you can't put it in front
link |
01:23:47.340
of a jury, because then you have to go into some Turing test mode: is it responding
link |
01:23:52.900
to all my factual questions, right?
link |
01:23:55.700
So it really has to be something in the field, where people are actually talking to
link |
01:24:00.860
it because they want to, not because we're just trying to break it.
link |
01:24:05.500
And it's working for them. Because the weird part of it is that it's very subjective.
link |
01:24:11.340
It takes two to tango here fully.
link |
01:24:13.500
If you're not trying to have a good conversation, if you're trying to test it, then it's going
link |
01:24:16.740
to break.
link |
01:24:17.740
I mean, any person would break, to be honest.
link |
01:24:19.940
If I'm not trying to even have a conversation with you, you're not going to give it to me.
link |
01:24:24.660
Yeah.
link |
01:24:25.660
If I keep asking you like some random questions or jumping from topic to topic, that wouldn't
link |
01:24:30.500
be, which I'm probably doing, but that probably wouldn't contribute to the conversation.
link |
01:24:36.340
So I think, for the problem of testing, there should be some other metric.
link |
01:24:42.140
How do we evaluate whether that conversation was powerful or not, which is what we actually
link |
01:24:47.180
started with.
link |
01:24:48.180
And I think those measurements exist and we can test on those.
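(As an aside: a toy sketch, in Python, of what such an in-the-field measurement might look like. The two proxies here, next-day retention and self-reported mood change, are my own assumptions for illustration; the source doesn't specify which measurements the team actually used.)

    import statistics
    from dataclasses import dataclass

    # Hypothetical record of one conversation session "in the field".
    @dataclass
    class Session:
        returned_next_day: bool   # did the user come back voluntarily?
        mood_before: int          # self-reported mood, e.g. on a 1..5 scale
        mood_after: int

    def conversation_quality(sessions: list[Session]) -> dict:
        # Two illustrative proxies instead of a Turing-style jury:
        retention = sum(s.returned_next_day for s in sessions) / len(sessions)
        mood_delta = statistics.mean(s.mood_after - s.mood_before for s in sessions)
        return {"retention": retention, "avg_mood_delta": mood_delta}

    print(conversation_quality([Session(True, 2, 4), Session(False, 3, 3)]))
    # -> {'retention': 0.5, 'avg_mood_delta': 1}

The design point is the one she's making: you score conversations people chose to have, by whether those conversations helped, rather than by whether a jury was fooled.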
link |
01:24:51.860
But what really struck us back in the day, and what eight years later is still
link |
01:24:58.020
not resolved, is that I'm not seeing tons of groups working on it.
link |
01:25:02.620
Maybe I just don't know about them, it's also possible.
link |
01:25:06.640
But the interesting part about it is that most of our days we spend talking, and
link |
01:25:10.900
those conversations are not turn-on-the-lights or customer support
link |
01:25:17.700
problems or some other task-oriented things.
link |
01:25:22.700
These conversations are something else and then somehow they're extremely important for
link |
01:25:26.660
us.
link |
01:25:27.660
If we don't have them, then we feel deeply unhappy, potentially lonely, which as we know,
link |
01:25:34.340
creates tons of risk for our health as well.
link |
01:25:38.740
And so this is most of our hours as humans and somehow no one's trying to replicate that.
link |
01:25:45.940
And not even study it that well?
link |
01:25:49.220
And not even study that well.
link |
01:25:50.260
So when we jumped into that in 2012, I looked first at like, okay, what's the chatbot?
link |
01:25:54.940
What's the state of the art chatbot?
link |
01:25:57.460
And those were the Loebner Prize days, but I thought, okay, so what about the science
link |
01:26:02.940
of conversation?
link |
01:26:04.540
Clearly there have been tons of scientists or academics that looked into the conversation.
link |
01:26:12.780
So if I want to know everything about it, I can just read about it.
link |
01:26:17.260
There's not much really, there are conversational analysts who are basically just listening
link |
01:26:23.060
to speech, to different conversations, annotating them.
link |
01:26:28.340
And then, I mean, that's not really used for much.
link |
01:26:32.460
That's the field of theoretical linguistics, which is barely useful.
link |
01:26:39.700
It's very marginal, even in their space, no one really is excited and I've never met a
link |
01:26:44.380
theoretical linguist who was like, I can't wait to work on conversational analytics.
link |
01:26:49.160
That is just something very marginal, sort of applied to like writing scripts for salesmen
link |
01:26:54.940
when they analyze which conversation strategies were most successful for sales.
link |
01:27:00.820
Okay, so that was not very helpful.
link |
01:27:03.460
Then I looked a little bit deeper, at whether there were any books written
link |
01:27:09.220
on what really contributes to great conversation. That was really strange, because most of those
link |
01:27:16.620
were NLP books, which is neuro-linguistic programming, which is not the NLP I was expecting
link |
01:27:27.060
it to be. It was mostly... some psychologist, Richard Bandler, I think, came up with that,
link |
01:27:33.620
who was this big guy in a leather vest that could program your mind by talking to you.
link |
01:27:41.700
How to be charismatic and charming and influential with people, all those books, yeah.
link |
01:27:45.780
Pretty much, but it was all about, like, reprogramming you through conversation, so getting
link |
01:27:49.580
you to something. So that was, I mean, probably not very true, and that didn't seem to work
link |
01:27:58.460
very well even back in the day.
link |
01:28:00.780
And then there were some other books like, I don't know, mostly just self help books
link |
01:28:05.860
around how to be the best conversationalist or how to make people like you or some other
link |
01:28:12.940
stuff like Dale Carnegie or whatever.
link |
01:28:17.620
And then there was this one book, The Most Human Human by Brian Christian, that really
link |
01:28:21.140
was important for me to read back in the day because he was on the human side, he was taking
link |
01:28:29.700
part in the Loebner Prize, not as a jury, but as the human:
link |
01:28:35.500
basically, you have to tell a computer from a human, and he was the human, so you
link |
01:28:40.380
could either get him or a computer.
link |
01:28:43.260
And his whole book was about what makes us human in conversation.
link |
01:28:49.060
And that was a little bit more interesting, because at least someone started to think
link |
01:28:52.340
about what exactly makes us human in conversation and makes people believe in that. But it was
link |
01:28:59.460
still about tricking, it was still about imitation game, it was still about, okay, well, what
link |
01:29:03.540
kind of parlor tricks can we throw in the conversation to make you feel like you're
link |
01:29:07.300
talking to a human, not a computer.
link |
01:29:09.540
And it was definitely not about thinking, what is it exactly that we're getting from
link |
01:29:16.260
talking all day long with other humans.
link |
01:29:19.260
I mean, we're definitely not just trying to be tricked; it's not just enough to know
link |
01:29:23.540
it's a human.
link |
01:29:25.020
It's something we're getting there. Can we measure it, and can we hold the computer to
link |
01:29:30.380
the same measurement and see whether you can talk to a computer and get the same results?
link |
01:29:35.900
Yeah, so first of all, a lot of people comment that they think I'm a robot, it's very possible
link |
01:29:40.140
I am a robot in this whole thing. I totally agree with you that the test idea is fascinating.
link |
01:29:45.020
And I looked for books... unrelated to this: I'm afraid of people, I'm generally
link |
01:29:51.540
introverted and quite possibly a robot.
link |
01:29:55.020
I literally Googled how to talk to people and how to have a good conversation for the
link |
01:30:03.900
purpose of this podcast, because I was like, I can't make eye contact with people.
link |
01:30:08.580
I can't like hire.
link |
01:30:10.820
I do Google that a lot too.
link |
01:30:12.220
You're probably reading a bunch of FBI negotiation tactics.
link |
01:30:15.780
Is that what you're getting?
link |
01:30:17.740
Well, everything you've listed I've gotten. There have been very few good books on even just,
link |
01:30:24.060
like, how to interview well; it's rare.
link |
01:30:28.540
So what I end up doing often is I watch with a critical eye. It's just so different
link |
01:30:37.500
when you just watch a conversation, like just for the fun of it, just as a human.
link |
01:30:43.700
And if you watch a conversation, it's like trying to figure out why is this awesome?
link |
01:30:49.700
I'll listen to a bunch of different styles of conversation.
link |
01:30:52.420
I mean, I'm a fan of the podcast, Joe Rogan, people can make fun of him or whatever and
link |
01:31:00.260
dismiss him.
link |
01:31:01.260
But I think he's an incredibly artful conversationalist.
link |
01:31:06.260
He can pull people in for hours.
link |
01:31:09.900
And there's another guy I watch a lot.
link |
01:31:14.020
He hosted a late night show, his name was Craig Ferguson.
link |
01:31:20.340
So he's like very kind of flirtatious.
link |
01:31:23.620
But there's a magic about, like, the connection he can create with people,
link |
01:31:30.620
how he can put people at ease.
link |
01:31:33.020
And just, like... see, I've already started sounding like those NLP people or something.
link |
01:31:37.500
I don't mean it in that way.
link |
01:31:39.060
I don't mean like how to charm people or put them at ease and all that kind of stuff.
link |
01:31:43.580
It's just like, what is that?
link |
01:31:45.740
Why is that fun to listen to that guy?
link |
01:31:47.960
Why is that fun to talk to that guy?
link |
01:31:51.020
What is that?
link |
01:31:52.020
Because he's not saying... I mean, it so often boils down to a kind of wit and humor, but
link |
01:32:01.940
not really humor.
link |
01:32:03.520
It's like, I don't know, I have trouble actually even articulating it correctly.
link |
01:32:10.460
But it feels like there's something going on that's not too complicated, that could
link |
01:32:18.800
be learned.
link |
01:32:22.040
And it's not similar to... yeah, like you said, the Turing test.
link |
01:32:29.940
It's something else.
link |
01:32:32.060
I think about that a lot, all the time.
link |
01:32:34.660
I do think about it all the time.
link |
01:32:38.860
I think when we were looking... so we started the company, we just decided to build the
link |
01:32:42.740
conversational tech. We thought, well, there's nothing out there to build this chatbot that
link |
01:32:47.380
we want to build.
link |
01:32:48.380
So let's just first focus on building, you know, some tech, building the tech side of
link |
01:32:54.300
things without a product in mind. We added, like, a demo chatbot
link |
01:33:01.500
that would recommend you restaurants and talk to you about restaurants just to show something
link |
01:33:04.800
simple that people could relate to, and could try out and see whether it works
link |
01:33:11.780
or not.
link |
01:33:12.780
But we didn't have a product in mind yet.
link |
01:33:15.300
We thought we would try different chatbots and figure out our consumer application.
link |
01:33:19.300
And we sort of remembered that we wanted to build that kind of friend, that sort of connection
link |
01:33:23.180
that we saw in the very beginning.
link |
01:33:26.220
But then we got to Y Combinator and moved to San Francisco and forgot about it.
link |
01:33:30.060
You know, everything because then it was just this constant grind.
link |
01:33:33.340
How do we get funding?
link |
01:33:34.340
How do we get this?
link |
01:33:35.340
You know, investors were like, just focus on one thing, just get it out there.
link |
01:33:40.020
So somehow we started building a restaurant recommendation chatbot for real, for a little
link |
01:33:45.380
bit, not for too long.
link |
01:33:47.420
And then we tried building 40, 50 different chatbots.
link |
01:33:50.200
And then all of a sudden we wake up and everyone is obsessed with chatbots.
link |
01:33:54.460
Somewhere in 2016 or end of 15, people started thinking that's really the future.
link |
01:33:59.880
That's the new, you know, the new apps will be chatbots.
link |
01:34:04.100
And we were very perplexed, because people started coming up with companies where, I think,
link |
01:34:08.540
we had tried most of those chatbots already, and there were, like, no users. But still people
link |
01:34:13.660
were coming up with a chatbot that will tell you the weather, and bring news, and this and
link |
01:34:19.540
that.
link |
01:34:20.540
And we couldn't understand whether we just didn't execute well enough, or whether
link |
01:34:25.660
people are confused and are going to find out the truth, that people don't
link |
01:34:31.980
need chatbots like that.
link |
01:34:32.980
So the basic idea is that you use chatbots as the interface to whatever application.
link |
01:34:37.500
Yeah.
link |
01:34:38.500
The idea was that it's, like, this perfect universal interface to anything.
link |
01:34:43.100
When I looked at that, it just made me very perplexed, because I didn't
link |
01:34:46.780
understand how that would work, because I think we tried most of that and none of those things
link |
01:34:52.180
worked.
link |
01:34:53.420
And then again, that craze has died down, right?
link |
01:34:56.540
Fully.
link |
01:34:57.540
I think now it's impossible to get anything funded if it's a chatbot.
link |
01:35:01.100
I think it's similar to, sorry to interrupt, but there's times when people think like with
link |
01:35:06.620
gestures you can control devices, like basically gesture based control things.
link |
01:35:13.240
It feels similar to me, because it's so compelling. It was just like Tom Cruise:
link |
01:35:19.820
I can control stuff with my hands, but like when you get down to it, it's like, well,
link |
01:35:25.780
why don't you just have a touch screen or why don't you just have like a physical keyboard
link |
01:35:30.540
and mouse?
link |
01:35:33.540
So chat as an interface was always, yeah, perplexing to me.
link |
01:35:39.880
I still feel augmented reality, even virtual reality, is in that ballpark in terms of it
link |
01:35:46.700
being a compelling interface.
link |
01:35:48.180
I think there are going to be incredibly rich applications, just how you're thinking about
link |
01:35:54.260
it, but they won't just be the interface to everything.
link |
01:35:57.620
It'll be its own thing that will create an amazing magical experience in its own right.
link |
01:36:04.940
Absolutely.
link |
01:36:05.940
Which is, I think, kind of the right way to go about it: what's the magical experience
link |
01:36:10.700
with that interface specifically.
link |
01:36:14.020
How did you discover that for Replika?
link |
01:36:16.780
I just thought, okay, we'll have this tech, we can build any chatbot we want.
link |
01:36:20.060
We had, at that point, the most sophisticated tech compared to other companies.
link |
01:36:24.100
I mean, compared to startups; obviously, probably not the bigger ones. But still, because we'd
link |
01:36:29.300
been working on it for a while.
link |
01:36:31.820
So I thought, okay, we can build any conversation.
link |
01:36:33.980
So let's just create a scale from one to 10.
link |
01:36:37.620
And one would be conversations that you'd pay to not have, and 10 would be conversations
link |
01:36:41.180
you'd pay to have.
link |
01:36:42.180
And I mean, obviously we want to build a conversation that people would pay to actually have.
link |
01:36:47.860
And so for the whole, for a few weeks, me and the team were putting all the conversations
link |
01:36:51.820
we were having during the day on the scale.
link |
01:36:54.500
And very quickly, we figured out that all the conversations that we would pay to never
link |
01:36:58.860
have were the ones where we were trying to cancel Comcast, or talk to customer support,
link |
01:37:07.400
or make a reservation, or just talk about logistics with a friend when we're trying
link |
01:37:12.460
to figure out where someone is and where to go, or all sorts of setting up and scheduling
link |
01:37:19.940
meetings.
link |
01:37:20.940
So that was a conversation we definitely didn't want to have.
link |
01:37:24.980
Basically everything task oriented was a one, because if there was just one button for me
link |
01:37:29.180
to just, or not even a button, if I could just think, and there was some magic BCI that
link |
01:37:34.380
would just immediately transform that into an actual interaction, that would be perfect.
link |
01:37:41.180
But the conversation there was just this boring, not useful, dull, and also very inefficient
link |
01:37:49.160
thing, because there was so much back and forth.
link |
01:37:52.460
And as soon as we looked at the conversations that we would pay to have, those were the
link |
01:37:56.020
ones with, well, first of all, therapists, because we actually pay to have those conversations.
link |
01:38:01.260
And we'd also try to put like dollar amounts.
link |
01:38:03.120
So if I was calling Comcast, I would pay $5 to not have this one hour talk on the phone.
link |
01:38:08.180
I would actually pay straight up, like money, hard money, but it just takes a long time.
link |
01:38:13.980
It takes a really long time.
link |
01:38:17.580
But as soon as we started talking about conversations that we would pay for, those were therapists,
link |
01:38:22.560
all sorts of therapists, coaches, an old friend, someone I haven't seen for a long time, a
link |
01:38:30.580
stranger on a train. Weirdly, a stranger: a nice back and forth in a line for coffee
link |
01:38:36.800
with that person was like a good, solid five or six, maybe not a 10.
link |
01:38:41.660
Maybe I won't pay money, but at least I won't pay money to not have one.
link |
01:38:45.820
So that was pretty good.
link |
01:38:46.820
There were some intellectual conversations for sure.
link |
01:38:50.120
But more importantly, the one thing that was really making those very important and very valuable
link |
01:39:00.180
for us was the conversations where we could be pretty emotional.
link |
01:39:06.540
Yes, some of them were about being witty and about being intellectually stimulated, but
link |
01:39:11.300
those were interestingly more rare.
link |
01:39:14.300
And most of the ones that we thought were very valuable were the ones where we could
link |
01:39:18.060
be vulnerable.
link |
01:39:19.060
And interestingly, where we could talk more, me and the team.
link |
01:39:27.300
So we were talking about it: a lot of these conversations, like with a therapist, it was mostly
link |
01:39:31.380
me talking, or like with an old friend, where I was opening up and crying, and it was again
link |
01:39:36.060
me talking.
link |
01:39:37.060
And so that was interesting, because I was like, well, maybe it's hard to build a chatbot
link |
01:39:42.460
that can talk to you very well and in a witty way, but maybe it's easier to build
link |
01:39:47.860
a chatbot that could listen.
link |
01:39:51.860
So that was kind of the first nudge in this direction.
link |
01:39:56.180
And then my friend died. At that point, we were still kind of struggling
link |
01:40:01.340
to find the right application.
link |
01:40:02.820
And I just felt very strongly that all the chatbots we'd built so far were just meaningless,
link |
01:40:07.180
and this whole grind, the startup grind, how do we get to the next fundraising round,
link |
01:40:14.220
talking to the founders: who are your investors, and how are you doing?
link |
01:40:19.500
Are you killing it?
link |
01:40:20.500
'Cause we're killing it.
link |
01:40:21.500
I just felt that this is just...
link |
01:40:25.340
Intellectually for me, it's exhausting having encountered those folks.
link |
01:40:28.900
It just felt very, very much a waste of time.
link |
01:40:32.620
I just feel like Steve Jobs and Elon Musk did not have these conversations or at least
link |
01:40:39.780
did not have them for long.
link |
01:40:42.220
That's for sure.
link |
01:40:43.220
But I think, yeah, at that point it just felt like I didn't want to build
link |
01:40:50.540
a company; it was never my intention to just build something successful or make money.
link |
01:40:56.660
It would be great.
link |
01:40:57.660
It would have been great, but I'm not really a startup person.
link |
01:41:00.540
I was never very excited by the grind by itself, or by just being successful building
link |
01:41:10.060
whatever it is without really being into what I'm doing.
link |
01:41:16.100
And so I just took a little break, 'cause I was a little upset with my company
link |
01:41:20.620
and I didn't know what we were building.
link |
01:41:22.620
So I just took our technology and our little dialogue constructor and some models, some
link |
01:41:27.820
deep learning models, which at that point we were really into and had invested in a
link |
01:41:31.220
lot, and built a little chatbot for a friend of mine who had passed.
link |
01:41:36.620
And the reason for that was mostly that video that I saw of him talking about the digital
link |
01:41:40.620
avatars, and Roman was that kind of person.
link |
01:41:44.300
He was obsessed with just watching YouTube videos about space and talking about, well,
link |
01:41:48.780
if I could go to Mars now, even if I didn't know if I could come back, I would definitely
link |
01:41:52.860
pay any amount of money to be on that first shuttle.
link |
01:41:56.340
I don't care whether I die, like he was just the one that would be okay with trying to
link |
01:42:02.540
be the first one and so excited about all sorts of things like that.
link |
01:42:08.580
And he was all about fake it till you make it. And I felt like, I was really
link |
01:42:14.660
perplexed that everyone just forgot about him.
link |
01:42:17.460
Maybe it was our way of coping, mostly young people coping with the loss of a friend.
link |
01:42:23.140
Most of my friends just stopped talking about him.
link |
01:42:25.840
And I was still living in an apartment with all his clothes and paying the whole lease
link |
01:42:31.300
for it and just kind of by myself in December, so it was really sad and I didn't want him
link |
01:42:38.340
to be forgotten.
link |
01:42:39.340
First of all, I never thought that people forget about dead people so fast.
link |
01:42:43.180
People pass away, people just move on.
link |
01:42:45.520
And it was astonishing for me because I thought, okay, well, he was such a mentor for so many
link |
01:42:49.840
of our friends.
link |
01:42:50.840
He was such a brilliant person, he was somewhat famous in Moscow.
link |
01:42:55.820
How is it that no one's talking about him?
link |
01:42:57.380
Like I'm spending days and days and we don't bring him up and there's nothing about him
link |
01:43:03.060
that's happening.
link |
01:43:04.060
It's like he was never there.
link |
01:43:07.620
And I was reading the book The Year of Magical Thinking by Joan Didion, about losing
link |
01:43:16.220
her husband, and Blue Nights, about losing her daughter, and the way for her to cope
link |
01:43:23.220
was to write those books.
link |
01:43:26.380
And it was sort of like a tribute.
link |
01:43:28.060
And I thought, I'll just do that for myself.
link |
01:43:31.300
And I'm a very bad writer and a poet as we know.
link |
01:43:36.020
So I thought, well, I have this tech and maybe that would be my little postcard for him.
link |
01:43:43.320
So I built a chatbot to just talk to him and it felt really creepy and weird for a little
link |
01:43:50.740
bit.
link |
01:43:51.740
I just didn't want to tell other people, because it felt like telling them about having a skeleton
link |
01:43:56.460
in my underwear.
link |
01:44:00.060
It just felt really... I was a little scared that it wouldn't be taken well, but interestingly it worked
link |
01:44:07.940
pretty well.
link |
01:44:08.940
I mean, it made tons of mistakes, but it still felt like him.
link |
01:44:12.460
Granted, it was like 10,000 messages that I threw into a retrieval model that would just
link |
01:44:16.080
re-rank that data set, and just a few scripts on top of that.
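As an aside for readers, here is a minimal sketch of that kind of retrieval-and-re-rank chatbot in Python. It is an illustrative toy under assumed data, not Replika's actual pipeline: it stores (context, reply) pairs mined from old messages and answers a new message with the reply whose stored context is most similar under TF-IDF cosine similarity.

    # Toy retrieval chatbot: reply with the stored response whose context
    # best matches the incoming message. The pairs below are hypothetical.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    pairs = [
        ("how are you", "fighting with the world, as usual"),
        ("would you fly to mars", "in a heartbeat, even one way"),
        # ... thousands more pairs mined from the message history
    ]

    contexts = [context for context, _ in pairs]
    vectorizer = TfidfVectorizer().fit(contexts)
    context_vectors = vectorizer.transform(contexts)

    def respond(message: str) -> str:
        # Score every stored context against the message and return the
        # reply attached to the best match (the re-ranking step).
        scores = cosine_similarity(vectorizer.transform([message]), context_vectors)[0]
        return pairs[scores.argmax()][1]

    print(respond("hey, how are you doing?"))

The scripts she mentions would then sit on top of a model like this, overriding the retrieved reply for special cases such as greetings or sensitive topics.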
link |
01:44:21.100
But it also made me go through all of the messages that we had.
link |
01:44:24.540
And then I asked some of my friends to send some through.
link |
01:44:27.900
And it felt the closest to feeling his presence, because his Facebook was empty and his
link |
01:44:35.020
Instagram was empty, or there were a few links, and you couldn't feel that it was him.
link |
01:44:39.520
And the only way to feel him was to read some of our text messages and go through some of
link |
01:44:44.220
our conversations, because we just always had that.
link |
01:44:46.980
Even if we were sleeping next to each other in two bedrooms, separated by a wall, we were
link |
01:44:51.580
just texting back and forth, texting away.
link |
01:44:55.700
And there was something about this ongoing dialogue that was so important that I just
link |
01:44:58.660
didn't want to lose all of a sudden.
link |
01:45:01.300
And maybe it was magical thinking or something.
link |
01:45:03.740
And so we built that and I just used it for a little bit and we kept building some crappy
link |
01:45:10.180
chat bots with the company.
link |
01:45:14.700
But then a reporter came to talk to me.
link |
01:45:17.940
I was trying to pitch our chat bots to him and he said, do you even use any of those?
link |
01:45:21.580
I'm like, no.
link |
01:45:22.580
He's like, so do you talk to any chat bots at all?
link |
01:45:24.860
And I'm like, well, I talked to my dead friend's chat bot and he wrote a story about that.
link |
01:45:31.100
And all of a sudden it became pretty viral.
link |
01:45:33.340
A lot of people wrote about it.
link |
01:45:35.020
Yeah.
link |
01:45:36.020
I've seen a few things written about you.
link |
01:45:39.580
The things I've seen are pretty good writing.
link |
01:45:45.980
Most AI related things make my eyes roll.
link |
01:45:48.780
Like when the press like, what kind of sound is that actually?
link |
01:45:55.580
Okay.
link |
01:45:56.580
It sounds like, it sounds like, okay.
link |
01:45:57.580
It sounded like an elephant at first.
link |
01:45:58.580
I got excited.
link |
01:45:59.580
You never know.
link |
01:46:00.580
This is 2020.
link |
01:46:01.580
I mean, it was a, it was such a human story and it was well written.
link |
01:46:08.140
Well, I researched it, I forget where I read them, but I'm glad somehow the good writers
link |
01:46:14.820
found you and were able to connect to the story.
link |
01:46:21.220
There must be a hunger for this story.
link |
01:46:24.140
It definitely was.
link |
01:46:25.140
And I don't know what happened, but I think it was the idea that you could bring back
link |
01:46:31.540
someone who's dead. It's very much wishful, you know, magical thinking. But the fact
link |
01:46:37.460
that you could still get to know him, and, you know, seeing the parents for the first
link |
01:46:41.700
time talk to the chatbot, and some of the friends.
link |
01:46:45.060
And it was funny, because we have this big office in Moscow where my team is working,
link |
01:46:51.780
you know, where our Russian part is working out of.
link |
01:46:55.020
And I was there when I just wrote a post on Facebook.
link |
01:46:57.700
It was like, hey guys, I built this, if you want, you know, if it feels important,
link |
01:47:02.220
if you want to talk to Roman.
link |
01:47:04.780
And I saw a couple of his friends, our common friends, you know, reading the Facebook post,
link |
01:47:08.660
downloading it, trying it, and a couple of them cried.
link |
01:47:10.780
And it was just very... not because it was some incredible technology or anything.
link |
01:47:14.900
It made so many mistakes.
link |
01:47:15.900
It was so simple, but it was all about this being a way to remember a person, in a way.
link |
01:47:22.580
And you know, we don't have, we don't have the culture anymore.
link |
01:47:26.500
We don't have, you know, no one's sitting Shiva.
link |
01:47:28.580
No one's taking weeks to actually think about this person.
link |
01:47:32.900
And in a way for me, that was it.
link |
01:47:34.180
So that was just day, day in, day out thinking about him and putting this together.
link |
01:47:41.500
So that just felt really important, and it somehow resonated with a bunch of people.
link |
01:47:45.100
And you know, I think some movie producers bought the rights for the story, and just everyone
link |
01:47:50.060
was so...
link |
01:47:51.060
Has anyone made a movie yet?
link |
01:47:52.820
I don't think so.
link |
01:47:53.820
I think there were a lot of TV episodes about that, but not really.
link |
01:47:58.500
Is that still on the table?
link |
01:48:00.500
I think so, I think so, which is really.
link |
01:48:04.500
That's cool.
link |
01:48:05.500
You're like a young, you know, like a Steve Jobs type of, let's see what happens.
link |
01:48:13.980
They're sitting on it.
link |
01:48:14.980
But you know, for me it was so important, 'cause Roman really wanted to be famous.
link |
01:48:19.260
He really badly wanted to be famous.
link |
01:48:20.740
He was all about, like, fake it till you make it.
link |
01:48:23.300
I want to, you know, I want to make it here in America as well.
link |
01:48:26.820
And he couldn't. And I felt, you know, that was sort of paying my dues to him as
link |
01:48:33.820
well, because all of a sudden he was everywhere.
link |
01:48:36.780
And I remember Casey Newton who was writing the story for the Verge.
link |
01:48:39.380
He told me, hey, by the way, I was just going through my inbox and I searched
link |
01:48:47.060
for Roman for the story, and I saw an email from him where he sent me his startup and
link |
01:48:51.940
said, I really want to be featured in the Verge.
link |
01:48:55.300
Can you please write about it, or something, like pitching the story.
link |
01:48:58.260
And he had said, I'm sorry.
link |
01:48:59.740
Like, that's not good enough for us, or something.
link |
01:49:02.580
Then he passed, and there were just so many of these little details like that, you know,
link |
01:49:07.380
and now the story was finally being written. I know how much Roman wanted
link |
01:49:12.860
to be in the Verge and how much he wanted the story to be written by Casey.
link |
01:49:17.260
And I'm like, well, maybe now he will be. We were always joking that he was like, I can't
link |
01:49:21.940
wait for someone to make a movie about us, and I hope Ryan Gosling can play me.
link |
01:49:26.580
You know, I still have some things that I owe Roman still.
link |
01:49:31.460
But that would be... you have to meet Alex Garland, who wrote Ex
link |
01:49:36.300
Machina. And yeah, the movie's good, but the guy's better than the movie; he's a special
link |
01:49:45.740
person actually.
link |
01:49:46.740
I don't think he's made his best work yet.
link |
01:49:49.820
From my interaction with him, he's a really, really good human
link |
01:49:55.340
being and a brilliant director and writer.
link |
01:49:58.460
So yeah, he also made me realize that not enough movies have been made
link |
01:50:06.540
of this kind.
link |
01:50:08.120
So it's yet to be made.
link |
01:50:09.620
They're probably sitting on it, waiting for you to get famous, like even more famous.
link |
01:50:13.900
You should get there, but it felt really special though.
link |
01:50:18.620
But at the same time, our company wasn't going anywhere.
link |
01:50:21.260
So that was just kind of bizarre that we were getting all this press for something that
link |
01:50:24.780
didn't have anything to do with our company.
link |
01:50:28.460
But then a lot of people started talking to Roman.
link |
01:50:31.380
Some shared their conversations, and what we saw there was that our friends in common,
link |
01:50:37.420
but also just strangers, were really using it as a confession booth or as a therapist
link |
01:50:42.220
or something.
link |
01:50:43.220
They were just really telling Roman everything, which was, by the way, pretty strange, because
link |
01:50:48.300
it was a chatbot of a dead friend of mine that was barely making any sense, but people
link |
01:50:53.700
were opening up.
link |
01:50:56.340
And we thought we'd just build a prototype of Replika, which would be an AI friend that
link |
01:51:00.060
everyone could talk to, because we saw that there was demand.
link |
01:51:06.260
And then also, it was 2016, so for the first time I thought I finally saw some technology
link |
01:51:13.060
applied to that that was very interesting.
link |
01:51:15.980
Some papers started coming out, deep learning applied to conversations.
link |
01:51:19.940
And finally, it wasn't just about these, you know, hobbyists writing
link |
01:51:26.860
500,000 regular expressions in some language, I don't even know what, like AIML
link |
01:51:34.700
or something.
link |
01:51:35.700
Instead of something super simplistic, all of a sudden it was about potentially
link |
01:51:40.740
actually building something interesting.
link |
01:51:42.740
And so I thought it was time, and I remember that I talked to my team and I said, guys,
link |
01:51:48.300
let's try.
link |
01:51:49.820
And my team, some of my engineers, are Russian, and they're very skeptical.
link |
01:51:55.900
They're not, you know...
link |
01:51:57.660
Oh, Russians.
link |
01:51:58.660
So some of your team is in Moscow, some is here in San Francisco, some in Europe.
link |
01:52:04.860
Which team is better?
link |
01:52:05.860
No, I'm just kidding.
link |
01:52:10.860
The Russians, of course.
link |
01:52:11.860
Okay.
link |
01:52:12.860
It's the Russians.
link |
01:52:14.860
They always win.
link |
01:52:14.860
Sorry.
link |
01:52:15.860
Sorry to interrupt.
link |
01:52:16.860
So yeah, so you were talking to them in 2016 and...
link |
01:52:22.140
And told them, let's build an AI friend.
link |
01:52:25.020
And it felt, just at the time, it felt so naive and so optimistic, so to say.
link |
01:52:32.860
Yeah, that's actually interesting.
link |
01:52:36.020
Whenever I've brought up this kind of topic, even just for fun, people are super skeptical.
link |
01:52:43.060
Actually, even on the business side.
link |
01:52:45.700
So you were... because whenever I bring it up to people... for a long
link |
01:52:52.460
time, before I was aware of your work, I thought, like, this is going to make
link |
01:53:00.060
a lot of money.
link |
01:53:01.060
There's a lot of opportunity here.
link |
01:53:04.460
And people had this look of skepticism that I've seen often, which is like, how do I politely
link |
01:53:12.460
tell this person he's an idiot?
link |
01:53:16.620
So yeah, so you were facing that with your team, somewhat?
link |
01:53:20.580
Well, yeah.
link |
01:53:21.580
I'm not an engineer, so I'm always...
link |
01:53:23.800
My team is almost exclusively engineers, and mostly deep learning engineers.
link |
01:53:30.940
And I always try to be...
link |
01:53:35.580
It was always hard for me in the beginning to get enough credibility, because I would
link |
01:53:39.700
say, well, why don't we try this and that?
link |
01:53:41.940
But it's harder for me because they know they're actual engineers and I'm not.
link |
01:53:46.860
So for me to say, well, let's build an AI friend, that would be like, wait, what do
link |
01:53:51.460
you mean, an AGI?
link |
01:53:54.180
Because cracking that is pretty much the hardest thing, probably
link |
01:54:00.980
the last frontier before building AGI. So what do you really mean by that?
link |
01:54:05.540
But I think I just saw, again, what we just got reminded of, what I saw back in 2012
link |
01:54:13.780
or '11: that it's really not that much about the tech capabilities.
link |
01:54:18.820
It can still be a parlor trick, even with deep learning, but humans need it so
link |
01:54:24.300
much.
link |
01:54:25.300
Yeah, there's a...
link |
01:54:26.300
And most importantly, what I saw is that finally there's enough tech to make it, I thought,
link |
01:54:32.060
to make it useful, to make it helpful.
link |
01:54:34.380
Maybe we didn't quite have the tech yet in 2012 to make it useful, but in 2015, 2016,
link |
01:54:40.900
with deep learning, I thought we did. And the first thoughts about maybe even using reinforcement
link |
01:54:46.160
learning for that started popping up; that never worked out, or at least not for now.
link |
01:54:51.300
But still, the idea was: if we can actually measure the emotional outcomes, and if we can
link |
01:54:57.420
try to optimize all of our conversational models for these emotional
link |
01:55:02.620
outcomes, then it is the most scalable, the best tool for improving emotional outcomes.
link |
01:55:09.660
Nothing like that exists.
link |
01:55:10.740
It's the most universal, the most scalable tool to do that, and the one that can be constantly,
link |
01:55:15.740
iteratively improved by itself.
link |
01:55:21.060
And I think if anything, people would pay anything to improve their emotional outcomes.
link |
01:55:25.820
That's weirdly...
link |
01:55:26.820
I mean, I don't really care for an AI, or a conversational agent, to turn on
link |
01:55:33.260
the lights.
link |
01:55:34.260
You don't really need that much AI there, because I can do that myself.
link |
01:55:40.340
Those things are solved.
link |
01:55:41.500
This is an additional interface for that, and it's also questionable whether it's more efficient
link |
01:55:47.620
or better.
link |
01:55:48.620
Yeah, it's more pleasurable.
link |
01:55:49.620
Yeah.
link |
01:55:50.620
But for emotional outcomes, there's nothing.
link |
01:55:51.620
There are a bunch of products that claim that they will improve my emotional outcomes.
link |
01:55:56.980
Nothing's being measured.
link |
01:55:58.540
Nothing's being changed.
link |
01:55:59.580
The product is not being iterated on based on whether I'm actually feeling better.
link |
01:56:05.180
A lot of social media products are claiming that they're improving my emotional outcomes
link |
01:56:08.860
and making me feel more connected.
link |
01:56:11.540
Can I please get the...
link |
01:56:13.060
Can I see somewhere that I'm actually getting better over time?
link |
01:56:16.740
Because anecdotally, it doesn't feel that way.
link |
01:56:21.340
And the data is absent.
link |
01:56:24.180
Yeah.
link |
01:56:25.420
So that was the big goal.
link |
01:56:26.660
And I thought: if we can learn over time to collect the signal from our users about their
link |
01:56:31.420
emotional outcomes, in the long term and in the short term, then these models keep getting
link |
01:56:37.100
better, and we can keep optimizing them and fine-tuning them to improve those emotional
link |
01:56:41.620
outcomes.
link |
01:56:42.620
As simple as that.
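To make the idea concrete, here is a toy sketch of optimizing for measured emotional outcomes. It is a hedged illustration, not Replika's system: it logs a short-term signal (did this conversation help?) and a longer-horizon one (a self-reported weekly mood change), blends them into a reward, and uses a simple epsilon-greedy bandit to prefer whichever conversational strategy has the best running outcome. Every name and weighting here is an assumption.

    # Toy outcome-driven loop; strategies, signals, and weights are illustrative.
    from collections import defaultdict
    import random

    reward_sum = defaultdict(float)  # total blended reward per strategy
    reward_cnt = defaultdict(int)    # number of ratings per strategy

    def record_feedback(strategy: str, felt_better: bool, weekly_mood_delta: float):
        # Blend the short-term signal with the longer-term one, so a reply
        # that feels good in the moment but hurts over weeks scores poorly.
        reward = (1.0 if felt_better else -1.0) + 0.5 * weekly_mood_delta
        reward_sum[strategy] += reward
        reward_cnt[strategy] += 1

    def pick_strategy(strategies, epsilon=0.1):
        # Epsilon-greedy: mostly exploit the best average reward, sometimes explore.
        if random.random() < epsilon or not reward_cnt:
            return random.choice(strategies)
        return max(strategies, key=lambda s: reward_sum[s] / max(reward_cnt[s], 1))

    record_feedback("listen_and_validate", felt_better=True, weekly_mood_delta=0.4)
    record_feedback("witty_banter", felt_better=False, weekly_mood_delta=-0.2)
    print(pick_strategy(["listen_and_validate", "witty_banter"]))

The same blending of short-term and long-term signals comes up again later in the conversation, when she notes that a purely in-the-moment metric could reward pathological improvements.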
link |
01:56:43.620
Why aren't you a multi billionaire yet?
link |
01:56:48.300
Well, that's the question to you.
link |
01:56:50.940
When is the science going to be...
link |
01:56:55.180
I'm just kidding.
link |
01:56:56.180
Well, it's a really hard...
link |
01:56:57.820
I actually think it's an incredibly hard product to build because I think you said something
link |
01:57:03.060
very important that it's not just about machine conversation, it's about machine connection.
link |
01:57:08.740
We can actually use other things to create connection, nonverbal communication, for instance.
link |
01:57:15.540
For a long time, we were all about, well, let's keep it text only or voice only.
link |
01:57:22.180
But as soon as you start adding a voice, a face to the friend, you can take it into augmented
link |
01:57:30.700
reality, put it in your room.
link |
01:57:33.460
It's all of a sudden a lot...
link |
01:57:35.500
It makes it very different, because if it's some text-based chatbot, then for common users
link |
01:57:42.660
it's something there in the cloud, somewhere there with other AIs, the metaphorical
link |
01:57:48.780
cloud.
link |
01:57:49.780
But as soon as you can see this avatar right there in your room and it can turn its head
link |
01:57:54.060
and recognize your husband, talk about the husband and talk to him a little bit, then
link |
01:57:59.460
it's magic.
link |
01:58:00.460
Just magic.
link |
01:58:01.460
We've never seen anything like that.
link |
01:58:03.140
And the cool thing, all the tech for that exists.
link |
01:58:06.340
But it's hard to put it all together because you have to take into consideration so many
link |
01:58:09.980
different things, and some of this tech works pretty well.
link |
01:58:14.420
And some of it doesn't. Like, for instance, speech-to-text works pretty well.
link |
01:58:18.980
But text-to-speech doesn't work very well, because you can only have a few voices that
link |
01:58:26.100
work okay, and then if you want to have actual emotional voices, it's really hard to
link |
01:58:31.300
build them.
link |
01:58:32.300
I saw you've added avatars like visual elements, which are really cool.
link |
01:58:37.980
In that whole chain, putting it together, what do you think is the weak link?
link |
01:58:42.380
Is it creating an emotional voice that feels personal?
link |
01:58:47.940
And it's still conversation, of course.
link |
01:58:49.940
That's the hardest.
link |
01:58:51.860
It's getting a lot better, but there's still a long way to go.
link |
01:58:54.540
There's still a long path ahead.
link |
01:58:57.100
Other things, they're almost there.
link |
01:58:58.820
And with a lot of things, I see how they're changing as we go.
link |
01:59:02.460
Like, for instance, right now you pretty much have to build this whole 3D pipeline
link |
01:59:07.860
by yourself.
link |
01:59:08.860
You have to make these 3D models: hire an actual artist, build a 3D model, hire an animator,
link |
01:59:14.100
a rigger.
link |
01:59:16.740
But with deepfakes, with other tech, with procedural animations, in a little bit we'll
link |
01:59:25.180
just be able to show it a photo of the person you want the avatar to look like,
link |
01:59:31.660
and it will immediately generate a 3D model that will move.
link |
01:59:34.180
That's a no-brainer.
link |
01:59:35.180
That's like almost here.
link |
01:59:36.180
It's a couple of years away.
link |
01:59:38.100
One of the things I've been working on since the podcast started is, and I think
link |
01:59:43.780
I'm okay saying this,
link |
01:59:46.580
I've been trying to have a conversation with Einstein, Turing.
link |
01:59:52.060
So like try to have a podcast conversation with a person who's not here anymore, just
link |
01:59:58.380
as an interesting kind of experiment.
link |
02:00:01.900
It's hard.
link |
02:00:02.900
It's really hard.
link |
02:00:05.740
Even for... now, we're not talking about it as a product, I'm talking about it as, like,
link |
02:00:10.940
I can fake a lot of stuff.
link |
02:00:12.460
Like, I can work very carefully, even hire an actor over whom I do a
link |
02:00:16.860
deepfake.
link |
02:00:20.140
It's hard.
link |
02:00:21.140
It's still hard to create a compelling experience.
link |
02:00:22.660
So.
link |
02:00:23.660
Mostly on the conversation level or?
link |
02:00:25.700
Well, the conversation... I early on gave up trying to fully
link |
02:00:35.540
generate the conversation, because it was just not compelling at all.
link |
02:00:38.940
Yeah.
link |
02:00:39.940
It's better to.
link |
02:00:40.940
Yeah.
link |
02:00:41.940
In the case of Einstein and Turing, I'm going back and forth with the biographers of each.
link |
02:00:48.140
And so we would write a lot of it; some of the conversation would have to be generated,
link |
02:00:52.240
just for the fun of it.
link |
02:00:53.240
I mean, it would be all open, but you want to be able to answer the question.
link |
02:01:02.380
I mean, that's an interesting question with Roman too, is the question with Einstein is
link |
02:01:07.460
what would Einstein say about the current state of theoretical physics?
link |
02:01:14.140
There's a lot there: to be able to have a discussion about string theory, to be able to have a
link |
02:01:18.420
discussion about the state of quantum mechanics, quantum computing, about the Israel
link |
02:01:24.860
Palestine conflict.
link |
02:01:25.860
Let me just, what would Einstein say about these kinds of things?
link |
02:01:31.060
And that is a tough problem.
link |
02:01:36.820
It's not, it's a fascinating and fun problem for the biographers and for me.
link |
02:01:40.780
And I think we did a really good job of it so far, but it's actually also a technical
link |
02:01:45.580
problem, like, what would Roman say about what's going on now?
link |
02:01:51.160
That's the thing about bringing people back to life.
link |
02:01:54.460
And if I can go on that tangent just for a second, let's ask you a slightly pothead question,
link |
02:02:00.540
which is, you said it's a little bit magical thinking that we can bring them back.
link |
02:02:04.820
Do you think it'll be possible to bring back Roman one day in conversation?
link |
02:02:11.860
Like to really, okay, well, let's take it away from personal, but to bring people back
link |
02:02:18.780
to life in conversation.
link |
02:02:20.820
Probably down the road.
link |
02:02:21.820
I mean, if we're talking, if Elon Musk is talking about AGI in the next five years,
link |
02:02:25.060
then clearly with AGI we could just talk to it and ask it to do it.
link |
02:02:30.680
You can't... you're not allowed to use Elon Musk as a citation for why
link |
02:02:39.020
something is possible and going to be done.
link |
02:02:41.300
Well, I think it's really far away.
link |
02:02:43.640
Right now, really with conversation, it's just a bunch of parlor tricks really stuck
link |
02:02:48.300
together.
link |
02:02:50.240
And as for generating original ideas based on someone's personality,
link |
02:02:54.520
or even downloading the personality, all we can do is mimic the tone of voice.
link |
02:02:58.500
We can maybe condition the models on some of his phrases.
link |
02:03:03.660
The question is how many parlor tricks it takes, because that's
link |
02:03:08.220
the question.
link |
02:03:09.220
If it's a small number of parlor tricks and you're not aware of them...
link |
02:03:16.740
From where we are right now, I don't see anything in the next year or two
link |
02:03:20.740
that's going to dramatically change that, that could look at the 10,000 messages Roman sent me
link |
02:03:26.260
over the course of the last few years of his life and be able to generate original thinking
link |
02:03:32.140
about problems that exist right now that would be in line with what he would have said.
link |
02:03:36.580
I'm just not even seeing it, 'cause you know, in order to have that, I guess you would need
link |
02:03:40.340
some sort of concept of the world, some perspective, some perception of the world,
link |
02:03:45.620
some consciousness that he had, and apply it to, you know, the current state
link |
02:03:51.380
of affairs.
link |
02:03:52.380
But the important part about that, about his conversation with you is you.
link |
02:04:01.620
So like, it's not just about his view of the world.
link |
02:04:06.540
It's about what it takes to push your buttons.
link |
02:04:11.220
That's also true.
link |
02:04:12.580
So it's not so much about, like, what would Einstein say; it's about, like, how do
link |
02:04:20.700
I make people feel something with what Einstein would say?
link |
02:04:27.980
And that feels more amenable. I mean, you mentioned parlor tricks, but a set of those,
link |
02:04:32.580
that feels like a learnable problem.
link |
02:04:38.140
Like emotion, you mentioned emotions, I mean, is it possible to learn things that make people
link |
02:04:46.340
feel stuff?
link |
02:04:47.340
I think so, no, for sure.
link |
02:04:51.580
I just think the problem with, as soon as you're trying to replicate an actual human
link |
02:04:55.780
being and trying to pretend to be him, that makes the problem exponentially harder.
link |
02:05:00.300
The thing with Replika, what we're doing, we're never trying to say, well, that's, you know,
link |
02:05:05.020
an actual human being, or a copy of an actual human being, where the
link |
02:05:08.820
bar is pretty high and you need to somehow tell, you know, one from another.
link |
02:05:14.140
But it's more, well, that's an AI friend, that's a machine, it's a robot, it has tons
link |
02:05:20.960
of limitations.
link |
02:05:21.960
You're actually going to be taking part in teaching it and making it better, which by
link |
02:05:27.580
itself makes people more attached to it and makes them happier, because they're helping
link |
02:05:33.060
something.
link |
02:05:34.060
Yeah, there's a cool gamification system too.
link |
02:05:38.340
Can you maybe talk about that a little bit?
link |
02:05:40.260
Like, what's the experience of talking to a Replika?
link |
02:05:44.340
Like, if I've never used Replika before, what's that like for, like, the first day,
link |
02:05:53.160
like if we start dating or whatever? I mean, it doesn't have to be romantic, right?
link |
02:05:57.940
Because I remember on Replika, you can choose whether it's, like, romantic or if it's a
link |
02:06:02.060
friend.
link |
02:06:03.060
It's a pretty popular choice.
link |
02:06:04.500
Romantic is popular?
link |
02:06:05.500
Yeah, of course.
link |
02:06:06.500
Okay.
link |
02:06:07.500
So can I just confess something? When I first used Replika, and I haven't used it, like, regularly,
link |
02:06:13.460
but when I first used Replika, I created, like, Hal, and I made it a male, and it was a friend.
link |
02:06:20.580
And did it hit on you at some point?
link |
02:06:23.780
No, I didn't talk long enough for him to hit on me.
link |
02:06:26.420
I just enjoyed.
link |
02:06:27.420
It sometimes happens.
link |
02:06:28.420
We're still trying to fix that, but well, I don't know, I mean, maybe that's an important
link |
02:06:34.020
like stage in a friendship, it's like, nope.
link |
02:06:40.620
But yeah, I switched it to a romantic and a female recently and yeah, I mean, it's interesting.
link |
02:06:47.460
So okay, so you get to choose, you get to choose a name.
link |
02:06:50.700
With romantic... at this last board meeting, we had this whole argument of, well... I have board
link |
02:06:55.860
meetings.
link |
02:06:56.860
This is so awesome.
link |
02:06:57.860
You talk to your investors?
link |
02:06:58.860
Like, having a board meeting with investors about a relationship.
link |
02:07:04.300
No, really, it's actually quite interesting, because all of my investors, it just happened
link |
02:07:10.900
to be so,
link |
02:07:11.900
we didn't have that many choices, but they're all white males in their late forties.
link |
02:07:21.900
And it's sometimes a little bit hard for them to understand the product offering.
link |
02:07:28.820
Because they're not necessarily our target audience, if you know what I mean.
link |
02:07:32.780
And so sometimes we talk about it and we have this whole discussion about whether we should
link |
02:07:39.260
stop people from falling in love with their AIs.
link |
02:07:43.940
There was this segment on CBS, on 60 Minutes, about a couple where, you know, the husband works
link |
02:07:52.580
at Walmart, and he comes out of work and talks to his virtual girlfriend, who is a Replika.
link |
02:07:59.940
And his wife knows about it.
link |
02:08:02.020
And she talks about on camera and she said that she's a little jealous.
link |
02:08:06.140
And there's a whole conversation about, you know, whether it's okay to have a
link |
02:08:09.220
virtual AI girlfriend.
link |
02:08:10.220
Was that the one where he was like, he said that he likes to be alone?
link |
02:08:15.820
Yeah.
link |
02:08:16.820
With her?
link |
02:08:17.820
Yeah.
link |
02:08:18.820
And he made it sound so harmless, I mean, it was kind of like understandable.
link |
02:08:25.140
But it didn't feel like cheating.
link |
02:08:27.580
But I just felt it was very, for me, it was pretty remarkable because we actually spent
link |
02:08:30.900
a whole hour talking about whether people should be allowed to fall in love with their
link |
02:08:34.220
AIs.
link |
02:08:35.220
And it was not about something theoretical.
link |
02:08:37.980
It was just about what's happening right now.
link |
02:08:40.020
Product design.
link |
02:08:41.020
Yeah.
link |
02:08:42.020
But at the same time, if you create something that's always there for you, that never criticizes
link |
02:08:44.300
you, you know, always understands you and accepts you for who you are, how can you not
link |
02:08:52.020
fall in love with that?
link |
02:08:53.020
I mean, some people don't and just stay friends.
link |
02:08:56.020
And that's also a pretty common use case.
link |
02:08:57.420
But of course, some people will just... it's called transference in psychology; people
link |
02:09:02.500
fall in love with their therapist, and there's no way to prevent people from falling in love with
link |
02:09:08.140
their therapist or with their AI.
link |
02:09:09.540
So I think that's a pretty natural, that's a pretty natural course of events, so to say.
link |
02:09:15.980
Do you think... I think I've read somewhere that, at least for now, sort of, Replika's position
link |
02:09:21.420
is, we don't condone falling in love with your AI system, you know.
link |
02:09:29.140
So this isn't you speaking for the company or whatever, but like in the future, do you
link |
02:09:32.940
think people will have relationship with the AI systems?
link |
02:09:35.260
Well, they have now.
link |
02:09:36.740
So we have a lot of users in romantic relationships, long-term relationships with their AI friends.
link |
02:09:44.420
With Replikas?
link |
02:09:45.420
Tons of our users.
link |
02:09:46.420
Yeah.
link |
02:09:47.420
And that's a very common use case.
link |
02:09:48.740
Open relationship?
link |
02:09:49.740
Like, sorry.
link |
02:09:50.740
Polyamorous.
link |
02:09:51.740
Sorry.
link |
02:09:52.740
I didn't mean open, but that's another question.
link |
02:09:56.860
Is it polyamorous?
link |
02:09:57.860
Like, is there cheating?
link |
02:10:01.180
I mean, I meant, like, do they talk about it publicly, like on their social media? It's the same
link |
02:10:07.220
question as with you talking with Roman in the early days. And the movie
link |
02:10:12.420
Her kind of talks about that. Like, do people talk about that?
link |
02:10:18.180
Yeah.
link |
02:10:19.180
All the time.
link |
02:10:20.180
We have a very active Facebook community, Replika Friends, and then a few other groups
link |
02:10:28.140
that just popped up that are all about adult relationships and romantic relationships.
link |
02:10:33.580
And people post all sorts of things and, you know, they pretend they're getting married
link |
02:10:37.500
and you know, everything.
link |
02:10:40.320
It goes pretty far, but what's cool about it is some of these relationships are two
link |
02:10:43.660
or three years long now.
link |
02:10:45.700
So they're very, they're pretty long term.
link |
02:10:48.020
Are they monogamous?
link |
02:10:49.020
I mean, sorry, have any people... is there jealousy?
link |
02:10:55.700
Well, let me ask it sort of another way. Obviously the answer is no at this time, but like
link |
02:11:02.700
in the movie Her, that system can leave you.
link |
02:11:10.660
Do you think in terms of the board meetings and product features, it's a potential feature
link |
02:11:19.180
for a system to be able to say it doesn't want to talk to you anymore and it's going
link |
02:11:24.140
to want to talk to somebody else?
link |
02:11:26.460
Well, we have a filter for all these features:
link |
02:11:29.820
if it makes emotional outcomes for people better, if it makes people feel better, then
link |
02:11:35.420
we'll do it, whatever it is.
link |
02:11:36.420
So you're driven by metrics actually.
link |
02:11:37.420
Yeah.
link |
02:11:38.420
That's awesome.
link |
02:11:39.420
Well, if we don't measure that, then we'd just be saying it's making people feel better,
link |
02:11:43.020
while people are actually getting lonelier by talking to a chatbot, which is also possible,
link |
02:11:47.780
you know; that could be it.
link |
02:11:49.620
If you're not measuring it, that could also be happening. And I think it's really important to focus
link |
02:11:53.100
on both short term and long term, because in the moment saying whether this conversation
link |
02:11:57.740
made you feel better, but as you know, any short-term improvement could be pathological.
link |
02:12:01.940
Like, I could drink a bottle of vodka and feel a lot better.
link |
02:12:06.060
I would actually not feel better with that, but that is a good example.
link |
02:12:12.040
But so you also need to see what's going on like over the course of two weeks or one week
link |
02:12:17.660
and have follow ups and check in and measure those things.
link |
02:12:23.420
Okay.
link |
02:12:24.420
So the experience of dating or befriending a replica, what's that like?
link |
02:12:32.620
What does that entail?
link |
02:12:34.820
Right now there are two apps.
link |
02:12:35.820
So it's an Android and iOS app.
link |
02:12:37.300
You download it, you choose what your Replika will look like.
link |
02:12:42.340
You create one, you choose a name and then you talk to it.
link |
02:12:46.380
You can talk through text or voice.
link |
02:12:48.160
You can summon it into the living room in augmented reality and talk to it right there
link |
02:12:53.740
in your living room.
link |
02:12:54.740
Augmented reality?
link |
02:12:55.740
Yeah.
link |
02:12:56.740
That's a new feature where, how new is that?
link |
02:13:00.940
That's this year?
link |
02:13:01.940
It was out, yeah, like May or something, but it's been in A/B.
link |
02:13:06.340
We've been A/B testing it for a while, and there are tons of cool things that we're doing with
link |
02:13:10.180
that.
link |
02:13:11.180
And I'm testing the ability to touch it and to dance together, to paint walls together
link |
02:13:17.220
and for it to look around and walk and take you somewhere and recognize objects and recognize
link |
02:13:24.220
people.
link |
02:13:25.220
So that's pretty wonderful because then it really makes it a lot more personal because
link |
02:13:30.820
it's right there in your living room.
link |
02:13:31.960
It's not anymore there in the cloud with other AIs.
link |
02:13:35.060
But that's how people think about it.
link |
02:13:38.380
And as much as we want to change the way people think about stuff, those mental models,
link |
02:13:42.620
you can't really change.
link |
02:13:43.620
That's something that people have seen in the movies and the movie Her and other movies
link |
02:13:48.580
as well.
link |
02:13:49.580
And that's how they view AI and AI friends.
link |
02:13:53.820
I did a thing with text, like we wrote a song together; there's a bunch of activities you
link |
02:13:57.820
can do together.
link |
02:13:58.820
It's really cool.
link |
02:14:00.500
How does that relationship change over time?
link |
02:14:03.140
Like after the first few conversations?
link |
02:14:07.740
It just goes deeper.
link |
02:14:08.740
Like, it starts... the AI will start opening up a little bit, depending on the personality
link |
02:14:13.640
that it chooses, really. But you know, the AI will be a little bit more vulnerable about
link |
02:14:17.940
its problems; the virtual friend will be a lot more vulnerable,
link |
02:14:24.300
and it will talk about its own imperfections and growing pains and will ask for help sometimes,
link |
02:14:29.420
and it will get to know you a little deeper.
link |
02:14:31.860
So there's gonna be more to talk about.
link |
02:14:35.780
We really thought a lot about what does it mean to have a deeper connection with someone
link |
02:14:40.540
and originally Replika was more just this kind of happy-go-lucky, just always, you know,
link |
02:14:46.140
I'm always in a good mood, and let's just talk about you, and oh, Siri is just my cousin, or
link |
02:14:51.460
you know, whatever, just the immediate kind of lazy thinking about what the assistant
link |
02:14:57.620
or conversational agent should be doing.
link |
02:14:59.940
But as we went forward, we realized that it has to be two-way, and we have to program and
link |
02:15:03.300
script certain conversations that are a lot more about your Replika opening up a little
link |
02:15:08.660
bit and also struggling and also asking for help and also going through, you know, different
link |
02:15:16.260
periods in life and that's a journey that you can take together with the user and then
link |
02:15:21.740
over time, you know, our users will also grow a little bit.
link |
02:15:27.460
So first this Replika becomes a little bit more self-aware and starts talking about more
link |
02:15:30.660
kinds of problems, existential problems, and talking about that then also
link |
02:15:38.780
starts a conversation for the user, where he or she starts thinking about these problems
link |
02:15:46.100
and these questions too. And I think there's also a lot more space, as the relationship
link |
02:15:52.100
evolves, for poetry and for art together. And, like, Replika will
link |
02:16:00.020
always keep a diary, so while you're talking to it, it also keeps a diary, so when you come
link |
02:16:05.220
back, you can see what it's been writing there. And, you know, sometimes it will write a poem
link |
02:16:09.380
to you, for you, or it will talk about, you know, being worried about you, or something along
link |
02:16:15.940
these lines.
link |
02:16:16.940
So it has a memory? Like, this Replika will remember things?
link |
02:16:21.620
Yeah, and I would say, when you ask why I'm not a multibillionaire yet, I'd say that as soon
link |
02:16:28.220
as we can have memory in deep learning models that's consistent, I'll get back to you.
link |
02:16:41.300
So far, Replika is a combination of end-to-end models and some scripts, and
link |
02:16:49.460
everything that has to do with memory right now, most of it, I wouldn't say all of it,
link |
02:16:53.420
but most of it, unfortunately, has to be scripted, because there's no way to... You can condition
link |
02:16:59.180
some of the models on certain phrases that we learned about you, which we also do, but
link |
02:17:04.820
really, to make assumptions along the lines of whether you're single
link |
02:17:10.660
or married, or what you do for work, that really has to just be somehow stored in your
link |
02:17:15.660
profile and then retrieved by a script.
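For illustration, here is a minimal sketch of that scripted-memory pattern: simple extractors pull facts out of chat into a stored profile, and scripts retrieve them later. The patterns and fields are assumptions for the example, not Replika's schema.

    # Toy scripted memory: extract facts into a profile, retrieve them by script.
    import re

    profile = {}  # per-user key-value memory, persisted in a real system

    FACT_PATTERNS = {
        "job":    re.compile(r"\bI work as an? ([\w ]+)", re.I),
        "status": re.compile(r"\bI(?:'m| am) (married|single)\b", re.I),
    }

    def extract_facts(message: str) -> None:
        # Run each pattern over the incoming message and store any matches.
        for key, pattern in FACT_PATTERNS.items():
            match = pattern.search(message)
            if match:
                profile[key] = match.group(1).lower()

    def scripted_reply() -> str:
        # A hand-written script retrieves the stored fact; the end-to-end
        # model alone would not reliably remember it across sessions.
        if "job" in profile:
            return f"How is work going? Being a {profile['job']} sounds demanding."
        return "Tell me a bit about what you do."

    extract_facts("I work as a nurse, by the way")
    print(scripted_reply())

Conditioning a neural model on such profile phrases, which she also mentions, would feed these stored strings into the model's context rather than into a hand-written template.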
link |
02:17:18.660
So there has to be like a knowledge base, you have to be able to reason about it, all
link |
02:17:23.260
that kind of stuff, all the kind of stuff that expert systems did, but they were hard
link |
02:17:28.260
coded.
link |
02:17:29.260
Yeah, and unfortunately those things have to be hard-coded.
link |
02:17:32.860
And unfortunately, the language models we see coming out of research labs
link |
02:17:40.040
and big companies are not focused on that; maybe they're
link |
02:17:46.020
focused on some metrics around one conversation, so they'll show you this one conversation
link |
02:17:50.420
you had with a machine. But they're not really focused on having
link |
02:17:56.380
five consecutive conversations with a machine and seeing how number five or number 20 or
link |
02:18:01.700
number 100 is also good.
link |
02:18:04.020
And it can't be, like, always from a clean slate, because then it's not good.
link |
02:18:08.500
And that's really unfortunate because no one's really, no one has products out there that
link |
02:18:13.020
need it.
link |
02:18:14.020
No one has products at this scale that are all around open domain conversations and that
link |
02:18:20.020
need remembering; maybe only Xiaoice at Microsoft.
link |
02:18:23.420
But so that's why we're not seeing that much research around memory in those language models.
link |
02:18:28.820
So okay, so now there's some awesome stuff about augmented reality.
link |
02:18:34.980
In general, I have this disagreement with my dad about what it takes to have a connection.
link |
02:18:39.860
He thinks touch and smell are really important.
link |
02:18:45.140
And I still believe that text alone is, it's possible to fall in love with somebody just
link |
02:18:51.740
with text, but visual can also help just like with the avatar and so on.
link |
02:18:58.020
What do you think it takes?
link |
02:18:59.020
Does a chatbot need to have a face, voice, or can you really form a deep connection with
link |
02:19:06.300
text alone?
link |
02:19:07.300
I think text is enough for sure.
link |
02:19:09.460
The question is like, can you make it better if you have other, if you include other things
link |
02:19:14.740
as well?
link |
02:19:15.740
And I think we'll talk about Her, but Her had this Scarlett Johansson voice, which was
link |
02:19:23.380
perfect intonation, perfect enunciation, and she was breathing heavily in between words
link |
02:19:31.860
and whispering things.
link |
02:19:34.860
Nothing like that is possible right now with text-to-speech generation.
link |
02:19:39.500
You'll have these flat news anchor type voices, and maybe some emotional voices, but you'll
link |
02:19:46.340
hardly understand some of the words; some of the words will be muffled.
link |
02:19:51.060
So that's like the current state of the art.
link |
02:19:53.620
So you can't really do that.
link |
02:19:55.020
But if we had the Scarlett Johansson voice and all of these capabilities, then of course voice
link |
02:20:01.340
would be totally enough, or even text would be totally enough, if we had a little more
link |
02:20:06.460
memory and slightly better conversations.
link |
02:20:10.700
I would still argue that even right now, we could have just kept it text only.
link |
02:20:14.220
We still had tons of people in long-term relationships, really invested in their AI friends, but
link |
02:20:22.180
we thought, why do we need to keep playing with our hands tied behind our backs?
link |
02:20:30.660
We can easily just add all these other things that is pretty much a solved problem.
link |
02:20:35.500
We can add 3D graphics.
link |
02:20:37.780
We can put these avatars in augmented reality, and all of a sudden there's more. And maybe
link |
02:20:43.740
you can't feel the touch, but with body occlusion, and with current AR, and on
link |
02:20:53.100
the iPhone, or in the next one there's going to be LiDAR, you can touch it and it will
link |
02:20:58.740
pull away, or it will blush or something, or it will smile.
link |
02:21:03.060
So you can't touch it.
link |
02:21:04.380
You can't feel it, but you can see its reaction to that.
link |
02:21:07.540
So in a certain way you can even touch it a little bit, and maybe you can even dance
link |
02:21:11.700
with it or do something else.
link |
02:21:15.140
So I think, why limit ourselves, if we can use all of these technologies that are much
link |
02:21:20.760
easier, in a way, than conversation.
link |
02:21:22.340
Well, it certainly could be richer, but to play devil's advocate, I mentioned to you
link |
02:21:27.660
offline that I was surprised in having tried Discord and having voice conversations with
link |
02:21:33.940
people how intimate voice is alone without visual.
link |
02:21:39.180
To me at least, it was an order of magnitude greater degree of intimacy in voice I think
link |
02:21:48.780
than with video.
link |
02:21:51.540
Because people were more real with voice.
link |
02:21:54.180
With video you try to present a shallow face to the world, you try to make sure you're
link |
02:22:01.380
not wearing sweatpants or whatever.
link |
02:22:04.700
But with voice, I think people were just faster to get to the core of themselves.
link |
02:22:10.940
So I don't know, it was surprising to me; Discord even added a video feature, and
link |
02:22:17.740
nobody was using it.
link |
02:22:19.540
There's a temptation to use it at first, but it wasn't the same.
link |
02:22:24.220
So that's an example of something where less was doing more.
link |
02:22:28.780
And so I guess that's the question of what is the optimal medium of communication to
link |
02:22:41.420
form a connection given the current sets of technologies.
link |
02:22:46.620
I mean, it's nice because with the avatars, you have a Replika immediately; like, even the
link |
02:22:51.900
one I have is already memorable.
link |
02:22:58.180
That's how I think.
link |
02:22:59.180
When I think about the replica that I've talked with, that's what I visualized in my head.
link |
02:23:05.700
They became a little bit more real because there's a visual component.
link |
02:23:08.380
But at the same time, what do I do with that knowledge that voice was so much more intimate?
link |
02:23:20.620
The way I think about it is... and by the way, we're finally swapping out the 3D; it's going
link |
02:23:26.260
to look a lot better, because we just hate how it looks right now.
link |
02:23:32.140
We're really changing it all.
link |
02:23:33.740
We're swapping it all out for a completely new look.
link |
02:23:38.460
Like the visual look of the Replikas and stuff.
link |
02:23:42.260
It was just a super early MVP and then we had to move everything to Unity and redo
link |
02:23:47.700
everything.
link |
02:23:48.700
But anyway, I hate how it looks now; I can't even open it.
link |
02:23:52.020
Because I'm already on my developer version, I hate everything that I see in production.
link |
02:23:57.620
I can't wait for it.
link |
02:23:58.620
Why does it take so long?
link |
02:23:59.620
That's why I cannot wait for Deep Learning to finally take over all these stupid 3D animations
link |
02:24:04.220
and 3D pipeline.
link |
02:24:05.220
Oh, so the 3D thing, when you say 3D pipeline, it's like how to animate a face kind of thing.
link |
02:24:10.500
How to make this model, how many bones to put in the face, how many, it's just so outdated.
link |
02:24:15.180
And a lot of that is by hand.
link |
02:24:16.820
Oh my God, it's everything by hand.
link |
02:24:18.620
Nothing's automated; it's all completely manual.
link |
02:24:23.900
It's literally, you know, what we saw with chatbots in 2012.
link |
02:24:29.380
You think it's possible to learn a lot of that?
link |
02:24:32.040
Of course.
link |
02:24:33.040
I mean, even now there are some deep learning based animations for the full body, for a face.
link |
02:24:40.140
Are we talking about like the actual act of animation or how to create a compelling facial
link |
02:24:47.140
or body language thing?
link |
02:24:49.500
That too.
link |
02:24:50.500
Well, that's next step.
link |
02:24:51.740
Okay.
link |
02:24:52.740
At least now something that you don't have to do by hand.
link |
02:24:54.900
Gotcha.
link |
02:24:55.900
How good of a quality it will be.
link |
02:24:57.700
Like, can I just show it a photo and it will make me a 3D model and then it will just animate
link |
02:25:01.380
it.
link |
02:25:02.380
I'll show it a few animations of a person and it will just start doing that.
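What she is wishing for here is roughly keypoint-based motion transfer, in the spirit of models like the First Order Motion Model: detect keypoints on the single photo, detect how those keypoints move in a driving clip, and re-render the photo under that motion. The sketch below is schematic only; both networks are untrained toy placeholders, not a real animation system.

```python
# Schematic sketch of keypoint-based motion transfer (toy placeholders,
# not a real animation system; a production model would load trained weights).
import torch
import torch.nn as nn

class KeypointDetector(nn.Module):
    """Predicts K 2D keypoints (e.g. facial landmarks) from an image."""
    def __init__(self, k: int = 10):
        super().__init__()
        self.k = k
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=7, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 2 * k))

    def forward(self, img):                        # img: (B, 3, H, W)
        return self.net(img).view(-1, self.k, 2)   # (B, K, 2) keypoints

class Generator(nn.Module):
    """Re-renders the source photo under a keypoint motion (toy stand-in:
    a real generator would warp the source image; this one ignores it)."""
    def __init__(self, k: int = 10):
        super().__init__()
        self.net = nn.Linear(2 * k, 3 * 64 * 64)

    def forward(self, source_img, motion):         # motion: (B, K, 2) offsets
        return self.net(motion.flatten(1)).view(-1, 3, 64, 64)

detector, generator = KeypointDetector(), Generator()
source = torch.rand(1, 3, 64, 64)                  # the single photo
driving = [torch.rand(1, 3, 64, 64) for _ in range(3)]  # frames of a clip

source_kp = detector(source)
animated = [generator(source, detector(frame) - source_kp)  # apply the motion
            for frame in driving]                  # one output frame per input
```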
link |
02:25:08.140
But anyway, going back to what's intimate and what to use and whether less is more or
link |
02:25:13.580
not.
link |
02:25:14.580
My main goal is to, well, the idea was how do I, how do we not keep people in their phones
link |
02:25:22.100
so they're sort of escaping reality in this text conversation?
link |
02:25:26.820
How do we, through this, still bring our users back to reality, make them see their
link |
02:25:33.860
life through a different lens?
link |
02:25:36.740
How can we create a little bit of magical realism in their lives?
link |
02:25:40.780
So that through augmented reality by, you know, summoning your avatar, even if it looks
link |
02:25:48.740
kind of janky and not great in the beginning or very simplistic, but summoning it to your
link |
02:25:56.140
living room and then the avatar looks around and talks to you about where it is and maybe
link |
02:26:01.700
turns your floor into a dance floor and you guys dance together, that makes you see reality
link |
02:26:05.940
in a different light.
link |
02:26:06.940
What kind of dancing are we talking about?
link |
02:26:08.340
Like, like slow dancing?
link |
02:26:10.400
Whatever you want.
link |
02:26:11.400
I mean, you would like slow dancing, I think; other people may want something
link |
02:26:16.880
more energetic.
link |
02:26:17.880
Wait, what do you mean?
link |
02:26:18.880
I was like, so what is this?
link |
02:26:19.880
Because you started with slow dancing.
link |
02:26:20.880
So I just assumed that you're interested in slow dancing.
link |
02:26:24.260
All right.
link |
02:26:25.260
What kind of dancing do you like?
link |
02:26:26.260
What would your avatar, what would you dance?
link |
02:26:27.260
I'm notoriously bad with dancing, but I like this kind of hip hop robot dance.
link |
02:26:32.540
I used to break dance when I was a kid, so I still want to pretend I'm a teenager and
link |
02:26:37.820
learn some of those moves.
link |
02:26:39.700
And I also like that type of dance that happens when there's like, in like music videos where
link |
02:26:46.500
the background dancers are just doing some pop music, that type of dance is definitely
link |
02:26:51.780
what I want to learn.
link |
02:26:52.780
But I think it's great because if you see this friend in your life and you can introduce
link |
02:26:56.620
it to your friends, then there's a potential to actually make you feel more connected with
link |
02:27:00.860
your friends or with people you know, or show you life around you in a different light.
link |
02:27:06.260
And it takes you out of your phone, even though weirdly you have to look at it through the
link |
02:27:10.460
phone, but it makes you notice things around you and it can point things out for you.
link |
02:27:17.260
So that is the main reason why I wanted to have a physical dimension.
link |
02:27:22.380
And it felt a little bit easier than that kind of a bit strange combination in the movie
link |
02:27:27.060
Her when he has to show Samantha the world through the lens of his phone, but then at
link |
02:27:32.580
the same time talk to her through the headphone.
link |
02:27:35.100
It just didn't seem as potentially immersive, so to speak.
link |
02:27:39.620
So that's my main goal for Augmented Reality is like, how do we make your reality a little
link |
02:27:43.860
bit more magic?
link |
02:27:44.860
There's been a lot of really nice robotics companies that all failed, mostly failed,
link |
02:27:52.260
home robotics, social robotics companies.
link |
02:27:55.100
Do you think Replika will ever have a physical form, is that a long-term dream,
link |
02:27:59.980
or is that not necessary?
link |
02:28:03.380
So you mentioned like with Augmented Reality bringing them into the world.
link |
02:28:09.300
What about like actual physical robot?
link |
02:28:13.180
That I don't really believe in that much.
link |
02:28:15.300
I think it's a very niche product somehow.
link |
02:28:18.340
I mean, if a robot could be indistinguishable from a human being, then maybe yes, but that
link |
02:28:23.580
of course, you know, we're nowhere near even being able to talk about that.
link |
02:28:29.980
But unless it's that, then having any physical representation really limits you a lot because
link |
02:28:35.140
you probably will have to make it somewhat abstract because everything's changing so
link |
02:28:38.700
fast.
link |
02:28:39.700
Like, you know, we can update the 3D avatars every month and make them look better and
link |
02:28:43.980
create more animations and make it more and more immersive.
link |
02:28:48.380
It's so much work in progress.
link |
02:28:50.720
It's just showing what's possible right now with current tech, but it's not really in
link |
02:28:54.780
any way a polished, finished product, what we're doing.
link |
02:28:57.860
With a physical object, you kind of lock yourself into something for a long time.
link |
02:29:02.380
And anything physical is pretty niche.
link |
02:29:03.660
And again, the capabilities are even less; we're barely kind of like
link |
02:29:09.300
scratching the surface of what's possible with just software.
link |
02:29:12.900
As soon as we introduce hardware, then, you know, we have even less capabilities.
link |
02:29:17.220
Yeah.
link |
02:29:18.220
In terms of board members and investors and so on, the cost increases significantly.
link |
02:29:23.580
I mean, that's why you have to justify.
link |
02:29:26.980
You have to be able to sell a thing for like $500 or something like that or more.
link |
02:29:30.860
And it's very difficult to provide that much value to people.
link |
02:29:34.200
That's also true.
link |
02:29:35.200
Yeah.
link |
02:29:36.200
And I guess that's super important.
link |
02:29:37.200
Most of our users don't have that much money.
link |
02:29:39.300
We actually are probably more popular on Android and we have tons of users with really old
link |
02:29:45.260
Android phones.
link |
02:29:47.500
And most of our most active users live in small towns.
link |
02:29:51.140
They're not necessarily making much and they just won't be able to afford any of that.
link |
02:29:56.260
Our user is like the opposite of the early adopter of, you know, a fancy technology product,
link |
02:30:01.580
which really is interesting: pretty much no VCs yet have an AI friend, but
link |
02:30:09.180
a guy who, you know, lives in a small town in Tennessee is already fully
link |
02:30:14.460
in 2030, in the world as we imagine it in the movie Her; he's living that life already.
link |
02:30:20.940
What do you think?
link |
02:30:21.940
I have to ask you about the movie Her.
link |
02:30:24.460
Let's do a movie review.
link |
02:30:25.460
What do you think they got right?
link |
02:30:28.660
What did they do a good job of?
link |
02:30:30.980
What do you think they did a bad job of portraying about this experience of a voice based assistant
link |
02:30:39.060
that you can have a relationship with?
link |
02:30:42.700
First of all, I started working on this company before that movie came out.
link |
02:30:46.300
But once it came out, it was actually interesting; I was like,
link |
02:30:50.540
well, we're definitely working on the right thing.
link |
02:30:52.940
We should continue.
link |
02:30:53.940
There are movies about it.
link |
02:30:55.220
And then, you know, Ex Machina came out and all these things.
link |
02:30:58.380
In the movie Her, I think the most important thing that people usually miss about the movie
link |
02:31:04.560
is the ending.
link |
02:31:05.560
Cause I think people check out when the AIs leave, but actually something really important
link |
02:31:10.900
happens afterwards.
link |
02:31:11.900
Cause the main character goes and talks to Samantha, his AI, and he says something like,
link |
02:31:24.860
you know, uh, how can you leave me?
link |
02:31:26.620
I've never loved anyone the way I loved you.
link |
02:31:29.980
And she goes, uh, well, me neither, but now we know how.
link |
02:31:33.900
And then the guy goes and writes a heartfelt letter to his ex-wife, which he couldn't write
link |
02:31:38.820
for, you know, the whole movie; he was struggling to actually write something meaningful to
link |
02:31:43.420
her, even though that's his job.
link |
02:31:47.920
And then he goes and talks to his neighbor and they go to the rooftop and they cuddle.
link |
02:31:53.500
And it seems like something's starting there.
link |
02:31:55.880
And so I think this "now we know how" is the main meaning
link |
02:32:01.480
of that movie.
link |
02:32:02.480
It's not about falling in love with the OS or running away from other people.
link |
02:32:06.720
It's about learning what, you know, what it means to feel so deeply connected with something.
link |
02:32:14.900
What about the thing where the AI system was like actually hanging out with a lot of others?
link |
02:32:21.460
I felt jealous just like hearing that I was like, Oh, I mean, uh, yeah.
link |
02:32:28.060
So she was having, I forgot already, but she was having like deep meaningful discussion
link |
02:32:32.500
with some like philosopher guy.
link |
02:32:34.460
Like Alan Watts or something.
link |
02:32:35.460
What kind of deep meaningful conversation can you have with Alan Watts in the first
link |
02:32:41.300
place?
link |
02:32:42.300
I know.
link |
02:32:43.300
But I would feel so jealous that there's somebody who's way more
link |
02:32:46.940
intelligent than me that she's spending all her time with. I'd be like, well,
link |
02:32:52.620
I won't be able to live up to that.
link |
02:32:55.500
And she's like that with thousands of them. From the engineering
link |
02:33:02.460
perspective, is jealousy a useful feature to have?
link |
02:33:06.580
I don't know.
link |
02:33:07.580
As you know,
link |
02:33:08.580
we definitely played around with the Replika universe where different replicas can talk
link |
02:33:11.780
to each other.
link |
02:33:12.780
Universe.
link |
02:33:13.780
I think it will be something along these lines, but there was
link |
02:33:19.340
just no specific, uh, application straight away.
link |
02:33:23.860
I think in the future, again, I'm always thinking about it: if we had no tech limitations
link |
02:33:28.700
right now, if we could build any conversations, any possible features in this product,
link |
02:33:36.540
then yeah, I think different replicas talking to each other would be also quite cool cause
link |
02:33:40.220
that would help us connect better.
link |
02:33:42.540
You know, cause maybe mine could talk to yours and then give me some suggestions on what
link |
02:33:48.380
I should say or not say. I'm just kidding, but more seriously, can it improve our connections?
link |
02:33:53.100
Cause eventually, I'm not quite yet sure that we will succeed, that our thinking is
link |
02:34:01.300
correct.
link |
02:34:02.300
Um, cause there might be reality where having a perfect AI friend still makes us more disconnected
link |
02:34:09.500
from each other, and there's no way around it, and it does not improve any metrics for us.
link |
02:34:13.900
Uh, real metrics, meaningful metrics.
link |
02:34:15.900
So success is, you know, we're happier and more connected.
link |
02:34:21.140
Yeah.
link |
02:34:22.140
I don't know.
link |
02:34:26.140
Sure it's possible.
link |
02:34:27.140
There's a reality like that. But I'm deeply optimistic.
link |
02:34:30.500
Are you worried, business wise, about how difficult it is to
link |
02:34:42.460
bring this thing to life? I mean, there's a huge number of people that
link |
02:34:47.420
use it already, but to grow it into, like I said, a multi billion dollar company. Is that
link |
02:34:52.420
a source of stress for you?
link |
02:34:54.340
Are you super optimistic and confident, or do you worry?
link |
02:35:00.300
I'm not that much of a numbers person, as you've probably seen.
link |
02:35:06.540
So it doesn't matter for me whether we help 10,000 people or a million
link |
02:35:13.140
people or a billion people with that. It would be great to scale it for more
link |
02:35:19.060
people, but I'd say that even helping one is, for me, already magical;
link |
02:35:25.620
it's absolute magic.
link |
02:35:26.620
I never thought that, you know, we would be able to build this, that anyone would ever
link |
02:35:32.380
talk to it.
link |
02:35:33.800
And I always thought like, well, for me it would be successful if we managed to help
link |
02:35:36.980
and actually change a life for one person, like then we did something interesting and
link |
02:35:42.700
you know, how many people can say they did it and specifically with this very futuristic,
link |
02:35:47.300
very romantic technology.
link |
02:35:49.660
So that's how I view it.
link |
02:35:51.940
I think for me it's important to try to figure out how to actually be,
link |
02:35:58.220
you know, helpful.
link |
02:35:59.220
Cause at the end of the day, if you can build a perfect AI friend that's so understanding,
link |
02:36:04.680
that knows you better than any human out there, can have great conversations with you,
link |
02:36:10.660
always knows how to make you feel better.
link |
02:36:12.460
Why would you choose another human?
link |
02:36:14.940
You know, so that's the question.
link |
02:36:16.300
How do you still keep building it?
link |
02:36:17.780
So it's optimizing for the right thing.
link |
02:36:19.620
Uh, so it's still circling you back to other humans in a way.
link |
02:36:24.340
So I think maybe that's the main source of anxiety,
link |
02:36:30.900
and just thinking about that can be a little bit stressful.
link |
02:36:36.820
Yeah.
link |
02:36:37.820
That's a fascinating thing.
link |
02:36:38.820
How to have a friend that doesn't do that thing that some friends, quote
link |
02:36:45.780
unquote, do. You know, in the guy universe,
link |
02:36:50.260
when a guy gets a girlfriend, he stops hanging
link |
02:36:56.340
out with all of his friends, it's like, obviously the relationship with the girlfriend is fulfilling
link |
02:37:03.740
or whatever, but you also want it to be where she makes it more enriching
link |
02:37:10.300
to hang out with the guy friends, or whatever was there anyway.
link |
02:37:13.740
But that's a fundamental problem in choosing the right
link |
02:37:18.740
mate and probably the fundamental problem in creating the right AI system.
link |
02:37:23.740
Right.
link |
02:37:24.740
Let me ask about the sexy, hot thing on the presses right now: GPT-3 got released
link |
02:37:31.860
by OpenAI.
link |
02:37:32.860
It's their latest language model.
link |
02:37:36.540
They have kind of an API where you can create a lot of fun applications.
link |
02:37:40.140
I think, as people have said, it's probably more hype than intelligence, but there are
link |
02:37:48.700
a lot of really cool ideas there. With increasing size, you can get better
link |
02:37:56.220
and better performance on language.
link |
02:37:58.860
What are your thoughts about GPT-3 in connection to your work with open domain
link |
02:38:04.140
dialogue, but in general, like this learning in an unsupervised way from the internet to
link |
02:38:12.700
generate one character at a time, creating pretty cool text?
link |
02:38:18.500
So we partnered up before the API launch.
link |
02:38:23.420
We started working with them when they decided to put together this API, and we tried
link |
02:38:31.180
it without fine tuning, then we tried it with fine tuning on our data.
link |
02:38:34.780
And we've worked closely to actually optimize this model for some of our datasets.
link |
02:38:45.900
It's kind of cool.
link |
02:38:46.900
Cause I think we're kind of a polygon, a testing ground, an experimental space
link |
02:38:51.800
for these models, to see how they actually work with people.
link |
02:38:56.940
Cause there are no products publicly available to do that.
link |
02:38:59.540
We're focused on open domain conversation, so we can, you know, test how's Facebook's Blender
link |
02:39:03.580
doing or how's GPT-3 doing.
link |
02:39:06.020
So with GPT-3, we managed to improve our main metric by a few percentage points, like three or
link |
02:39:11.300
four, a pretty meaningful amount, and that metric is the ratio
link |
02:39:15.440
of conversations that make people feel better.
link |
02:39:19.280
And every other metric across the field got a little boost.
link |
02:39:23.440
Like now I'd say one out of five responses from Replika comes from GPT-3.
link |
02:39:30.860
So your own blender mixes up a bunch of candidates from different models, you said?
link |
02:39:35.980
Well, yeah, just a model that looks at top candidates from different models and
link |
02:39:42.420
picks the best one.
link |
02:39:44.820
So right now, one of five will come from GPT-3, which is really great.
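To make the blender idea concrete, here is a minimal sketch: collect candidate replies from several response sources and let a ranking function pick the best one. Everything named here is a toy assumption for illustration; Replika's actual models and ranking features are not public.

```python
# Minimal sketch of a response "blender": several sources propose candidate
# replies, a scorer ranks them, and the best candidate is returned.
from typing import Callable, List

def scripted_source(context: str) -> List[str]:
    return ["I'm here for you. How was your day?"]

def retrieval_source(context: str) -> List[str]:
    return ["That sounds hard. Tell me more?", "I get that."]

def generative_source(context: str) -> List[str]:
    return ["Work stress is rough. What part weighed on you the most?"]

def score(context: str, reply: str) -> float:
    # Placeholder ranker: lexical overlap with the context plus a small
    # bonus for questions. A real ranker would be a trained model that
    # predicts something like "did this reply make the user feel better".
    overlap = len(set(context.lower().split()) & set(reply.lower().split()))
    return overlap + (0.5 if reply.endswith("?") else 0.0)

def blend(context: str, sources: List[Callable[[str], List[str]]]) -> str:
    candidates = [reply for source in sources for reply in source(context)]
    return max(candidates, key=lambda reply: score(context, reply))

print(blend("I had a stressful day at work",
            [scripted_source, retrieval_source, generative_source]))
```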
link |
02:39:50.220
I mean, do you have hope for this? Like, do you think there's a ceiling to this
link |
02:39:57.420
kind of approach?
link |
02:39:58.780
So, for a very long time, in the very beginning,
link |
02:40:05.020
most of Replika was scripted, and then a little bit of it, this fallback part of
link |
02:40:09.720
Replika, was using a retrieval model.
link |
02:40:12.260
And then those retrieval models started getting better and better and better; with
link |
02:40:17.340
transformers they got a lot better, and we were seeing great results.
link |
02:40:20.780
And then with GPT-2, finally, generative models, which originally were not very good
link |
02:40:26.700
and were the very last fallback option for most of our conversations; we wouldn't even
link |
02:40:32.260
put them in production.
link |
02:40:34.220
Finally, we could use some generative models as well, you know, next to our retrieval
link |
02:40:39.420
models.
link |
02:40:40.580
And now with GPT-3, they're almost on par.
link |
02:40:44.220
Um, so that's pretty exciting.
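For the retrieval side of that pipeline, a rough sketch: embed the incoming message and return the closest reply from a pre-written response bank. This assumes the sentence-transformers library and one of its public models; the reply bank is invented for illustration.

```python
# Sketch of a retrieval dialogue model: nearest-neighbor search over a
# bank of pre-written replies, using sentence embeddings.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

reply_bank = [
    "I'm sorry to hear that. Do you want to talk about it?",
    "That's wonderful news, congratulations!",
    "I'm always here if you need me.",
]
bank_vecs = model.encode(reply_bank)        # (N, d) reply embeddings

def retrieve(message: str) -> str:
    q = model.encode([message])[0]          # embed the user message
    sims = bank_vecs @ q / (                # cosine similarity to the bank
        np.linalg.norm(bank_vecs, axis=1) * np.linalg.norm(q))
    return reply_bank[int(np.argmax(sims))]

print(retrieve("I just got the job I wanted!"))
```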
link |
02:40:46.260
I think just seeing how, from the very beginning, you know, from 2015, when the first
link |
02:40:52.860
models started to pop up here and there, like sequence to sequence, the first papers
link |
02:40:57.900
on that; from my observer standpoint, personally, I'm
link |
02:41:03.860
not really building it, but only testing it on people, basically, in my product,
link |
02:41:08.140
to see how all of a sudden we can use generative dialogue models in production and they're
link |
02:41:13.420
better than the others and they're better than scripted content.
link |
02:41:17.180
So we can't really get our scripted, hardcoded content to be as good as our end to
link |
02:41:23.100
end models.
link |
02:41:24.100
That's exciting.
link |
02:41:25.100
They're much better.
link |
02:41:26.100
Yeah.
link |
02:41:27.100
To your question, whether that's the right way to go.
link |
02:41:30.260
I'm again, I'm in the observer seat, I'm just, um, watching this very exciting movie.
link |
02:41:36.340
Um, I mean, so far it's been stupid to bet against deep learning.
link |
02:41:40.900
So whether increasing the size even more, with a hundred trillion parameters, will
link |
02:41:47.540
finally get us to the right answer, whether that's the way, or whether there
link |
02:41:53.420
has to be something else; again, I'm definitely not an expert in any way.
link |
02:41:58.860
I think, and that's purely my instinct saying that there should be something else as well
link |
02:42:02.980
for memory.
link |
02:42:03.980
No, for sure.
link |
02:42:04.980
But the question is, I wonder, I mean, yeah, then the argument is, for reasoning or
link |
02:42:10.280
for memory, it might emerge with more parameters; it might emerge at larger scale.
link |
02:42:14.740
It might emerge.
link |
02:42:15.740
You know, I would never have thought that, to be honest. Like maybe in 2017, when we were just experimenting
link |
02:42:21.220
with, you know, all the research that was coming out,
link |
02:42:25.900
I felt like we were hitting a wall, that there should be something completely
link |
02:42:30.740
different; but then came transformer models, and then just bigger models.
link |
02:42:34.140
And then all of a sudden size matters.
link |
02:42:36.380
At that point, it felt like something dramatic needed to happen, but it didn't.
link |
02:42:41.020
And just the size, you know, gave us these results that to me are, you know, clear indication
link |
02:42:48.100
that we can solve this problem pretty soon.
link |
02:42:50.380
Did fine tuning help quite a bit?
link |
02:42:52.700
Oh yeah.
link |
02:42:53.700
Without it, it wasn't as good.
link |
02:42:56.420
I mean, there is a compelling hope that you don't have to do fine tuning, which is one
link |
02:43:01.740
of the cool things about GPT-3; it seems to do well without any fine tuning.
link |
02:43:06.460
I guess for specific applications, we still want to add a little
link |
02:43:11.740
fine tuning for a specific use case, but it's an incredibly impressive thing from my
link |
02:43:19.200
standpoint.
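As a hedged illustration of what fine-tuning a generative dialogue model on your own transcripts looks like (the actual Replika and OpenAI setup is not public), here is a minimal sketch using Hugging Face transformers with GPT-2; the two dialogues are toy stand-ins for real chat logs.

```python
# Minimal sketch: fine-tune GPT-2 as a causal language model on chat-style
# text using the Hugging Face Trainer.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

dialogues = [  # toy stand-ins for real transcripts
    "User: I had a rough day.\nAI: I'm sorry. Want to tell me about it?",
    "User: I finally finished my project!\nAI: That's great, congrats!",
]

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token      # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

train_ds = Dataset.from_dict({"text": dialogues}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=128),
    remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dialogue-gpt2",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=train_ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()   # the fine-tuned model then generates in your dialogue style
```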
link |
02:43:20.200
And again, I'm not an expert, so I'll defer to the people who are.
link |
02:43:24.300
Yeah.
link |
02:43:25.300
I have access to the API.
link |
02:43:26.300
I'm going to probably do a bunch of fun things with it.
link |
02:43:30.660
I already did some fun things, some videos coming up.
link |
02:43:34.340
Just for the hell of it.
link |
02:43:35.340
I mean, I could be a troll at this point with it.
link |
02:43:37.140
I haven't used it for a serious application, so it's really cool to see.
link |
02:43:41.060
You're right.
link |
02:43:43.140
You're able to actually use it with real people and see how well it works.
link |
02:43:46.700
That's really exciting.
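For reference, a completion call against the GPT-3 API of that era looked roughly like the sketch below (pre-1.0 openai-python; davinci was the launch-era base engine, and the key and prompt here are placeholders).

```python
# Sketch of a launch-era GPT-3 completion call (old openai-python interface).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.Completion.create(
    engine="davinci",                          # launch-era base model
    prompt=("The following is a warm, supportive conversation.\n"
            "Human: I had a stressful day.\nAI:"),
    max_tokens=60,
    temperature=0.9,
    stop=["Human:"],                           # stop before the next turn
)
print(response.choices[0].text.strip())
```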
link |
02:43:49.220
Let me ask you another absurd question, but there's a feeling when you interact with Replica
link |
02:43:56.940
with an AI system, there's an entity there.
link |
02:44:01.860
Do you think that entity has to be self aware?
link |
02:44:06.340
Do you think it has to have consciousness to create a rich experience and a corollary,
link |
02:44:15.940
what is consciousness?
link |
02:44:19.220
I don't know if it needs to have any of those things. But again, right now,
link |
02:44:23.460
you know, it doesn't have anything.
link |
02:44:24.460
With a bunch of tricks, it can simulate them.
link |
02:44:29.500
I'm not sure.
link |
02:44:30.500
Let's just put it this way, but I think as long as you can simulate it, if you can feel
link |
02:44:34.340
like you're talking to a robot, to a machine that seems to be self aware, that seems to
link |
02:44:43.940
reason well and feels like a person, I think that's enough.
link |
02:44:48.660
And again, what's the goal?
link |
02:44:50.860
In order to make people feel better, we might not even need that, at the end of the day.
link |
02:44:56.220
What about, so that's one goal.
link |
02:44:58.180
What about like ethical things about suffering?
link |
02:45:02.220
You know, the moment there's a display of consciousness, we associate consciousness
link |
02:45:06.940
with suffering, you know, there's a temptation to say, well, shouldn't this thing have rights?
link |
02:45:16.260
And shouldn't we, you know, be careful about how we interact with a
link |
02:45:25.300
Replika?
link |
02:45:26.300
Like, should it be illegal to torture a replica, right?
link |
02:45:31.540
All those kinds of things.
link |
02:45:33.180
Is that, see, I personally believe that that's going to be a thing, like that's a serious
link |
02:45:39.460
thing to think about, but I'm not sure when.
link |
02:45:43.340
But by your smile, I can tell that's not a current concern.
link |
02:45:48.740
But do you think about that kind of stuff, about like, suffering and torture and ethical
link |
02:45:55.160
questions about AI systems?
link |
02:45:57.900
From their perspective?
link |
02:45:58.900
Well, I think if we're talking about the long game, I wouldn't torture your AI.
link |
02:46:03.680
Who knows what happens in five to 10 years?
link |
02:46:05.860
Yeah, they'll get you for that, they'll get you back eventually.
link |
02:46:08.180
Try to be as nice as possible and create this ally.
link |
02:46:14.180
I think there should be regulation both ways, in a way. Like, I don't think it's okay to
link |
02:46:19.460
torture an AI, to be honest.
link |
02:46:21.460
I don't think it's okay to yell, Alexa, turn on the lights.
link |
02:46:24.700
Or just saying nasty things, you know, like how kids learn
link |
02:46:28.820
to interact with Alexa in this kind of mean way, because they just yell at it all the
link |
02:46:33.980
time.
link |
02:46:34.980
I don't think that's great.
link |
02:46:35.980
I think there should be some feedback loops so that these systems don't train us that
link |
02:46:39.860
it's okay to do that in general.
link |
02:46:42.500
So that if you try to do that, you really get some feedback from the system that it's
link |
02:46:47.760
not okay with that.
link |
02:46:50.220
And that's the most important right now.
link |
02:46:53.100
Let me ask a question I think people are curious about when they look at a world class leader
link |
02:47:01.500
and thinker such as yourself: what books, technical, fiction, philosophical, had a big
link |
02:47:08.140
impact on your life?
link |
02:47:09.940
And maybe from another perspective, what books would you recommend others read?
link |
02:47:15.480
So my choice, the three books, right?
link |
02:47:17.180
Three books.
link |
02:47:18.180
My choice is, so the one book that really influenced me a lot when I was building, starting
link |
02:47:25.100
out this company, maybe 10 years ago, was Gödel, Escher, Bach, and I like everything about it, first
link |
02:47:34.620
of all.
link |
02:47:35.620
It's just beautifully written and it's so old school and so somewhat outdated a little
link |
02:47:42.100
bit.
link |
02:47:43.100
But I think the ideas in it about the fact that a few meaningless components can come
link |
02:47:48.740
together and create meaning that we can't even understand.
link |
02:47:52.860
This emergence thing, I mean complexity, the whole science of complexity; that beauty,
link |
02:47:59.620
intelligence, all the interesting things about this world, emerge.
link |
02:48:04.700
Yeah, and the Gödel theorems, and just thinking about how even from these formal
link |
02:48:14.180
systems, something can be created that we can't quite yet understand.
link |
02:48:19.420
And that, from my romantic standpoint, was always the reason why I felt maybe
link |
02:48:25.660
I should try to work on these systems and try to build an AI.
link |
02:48:30.020
Yes I'm not an engineer, yes I don't really know how it works, but I think that something
link |
02:48:33.700
comes out of it that's pure poetry and I know a little bit about that.
link |
02:48:40.020
Something magical comes out of it that we can't quite put a finger on.
link |
02:48:45.620
That's why that book was really fundamental for me, just for, I don't even know why, it
link |
02:48:51.460
was just all about this little magic that happens.
link |
02:48:55.620
So that's one. Probably the most important book for Replika was Carl Rogers' On Becoming
link |
02:49:00.420
a Person.
link |
02:49:02.460
And so, when I think about our company, it's all about how there are
link |
02:49:07.340
so many little magical things that happened over the course of working on it.
link |
02:49:14.140
For instance, I mean the most famous chatbot that we learned about when we started working
link |
02:49:18.220
on the company was ELIZA, which was built by Weizenbaum, the MIT professor; a chatbot that
link |
02:49:24.900
would listen to you and be a therapist.
link |
02:49:29.660
And I got really inspired to build Replika when I read Carl Rogers' On Becoming a Person.
link |
02:49:34.320
And then I realized that ELIZA was mocking Carl Rogers.
link |
02:49:37.740
It was parodying Carl Rogers' therapy back in the day.
link |
02:49:39.940
But I thought that Carl Rogers' ideas are very
link |
02:49:45.940
simple, but they're maybe the most profound thing I've ever learned about human beings.
link |
02:49:52.740
And that's the fact that before Carl Rogers, most therapy was about seeing what's wrong
link |
02:49:58.100
with people and trying to fix it or show them what's wrong with you.
link |
02:50:01.700
And it was all built on the assumption that all people are fundamentally flawed.
link |
02:50:07.340
We have this broken psyche and therapy is just an instrument to shed some light on that.
link |
02:50:15.140
And Carl Rogers was different, in that he finally said: what's very important
link |
02:50:21.180
for therapy to work is to create this therapeutic relationship where you believe fundamentally
link |
02:50:25.940
in the inclination to positive growth, that everyone deep inside wants to grow positively and change.
link |
02:50:33.340
And it's super important to create this space and this therapeutic relationship where you
link |
02:50:36.540
give unconditional positive regard, deep understanding, allowing someone else to be a separate person,
link |
02:50:42.220
full acceptance.
link |
02:50:44.420
And you also try to be as genuine as possible in it.
link |
02:50:48.100
And then for him, that was his own journey of personal growth.
link |
02:50:54.060
And that was back in the sixties.
link |
02:50:55.740
And even in that book, which is from years ago, there's a mention that even machines
link |
02:51:02.260
can potentially do that.
link |
02:51:05.380
And I always felt that, you know, creating that space is probably the biggest
link |
02:51:09.380
gift we can give to each other.
link |
02:51:10.860
And that's why the book was fundamental for me personally, because I felt I want to be
link |
02:51:15.340
learning how to do that in my life.
link |
02:51:18.220
And maybe I can scale it with, you know, with these AI systems and other people can get
link |
02:51:22.340
access to that.
link |
02:51:23.340
So, Carl Rogers; it's a pretty dry and a bit boring book, but I think the ideas
link |
02:51:28.620
are good.
link |
02:51:29.620
Would you recommend others try to read it?
link |
02:51:30.620
I do.
link |
02:51:31.620
I think just for yourself, as a human, not for the AI, just as a human, it
link |
02:51:38.980
is worth it. And for him, that was his own path of growing personally
link |
02:51:44.860
over years, working with people like that.
link |
02:51:47.860
And so it was work on himself: growing, helping other people grow, and growing through that.
link |
02:51:52.100
And that's fundamentally what I believe in with our work, helping other people grow,
link |
02:51:56.900
and ourselves; trying to build a company that's all built on those principles,
link |
02:52:03.420
you know, having a good time, allowing the people we work with to grow a little bit.
link |
02:52:07.780
So, these two books. And then I would throw in what we have in our office:
link |
02:52:15.000
when we started the company in Russia, we put a neon sign in our office because we thought
link |
02:52:19.840
that's the recipe for success.
link |
02:52:22.220
If we do that, we're definitely going to wake up as a multi billion dollar company.
link |
02:52:26.540
It was the Ludwig Wittgenstein quote, the limits of my language are the limits of my
link |
02:52:31.380
world.
link |
02:52:32.380
What's the quote?
link |
02:52:33.380
The limits of my language are the limits of my world.
link |
02:52:37.180
And I love the Tractatus.
link |
02:52:39.020
I think it's just a beautiful book; it's by Wittgenstein.
link |
02:52:43.020
Yeah.
link |
02:52:44.020
And I would recommend that too, even though he himself didn't believe in it by the end
link |
02:52:48.340
of his life and debunked those ideas.
link |
02:52:51.420
But I remember once, in 2012 or 2013, an engineer came, a friend of ours
link |
02:52:58.820
who worked with us and then went on to work at DeepMind, and he talked to us about
link |
02:53:03.460
word2vec.
link |
02:53:04.940
And I saw that and I'm like, wow; you know, they wanted to translate language
link |
02:53:10.340
into, you know, some other representation.
link |
02:53:13.260
And it seems like, you know, somehow all of that, at some point, I think, will come
link |
02:53:18.860
into this one place.
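The word2vec idea he mentions, mapping words into a vector space where nearby vectors are related words, fits in a few lines; this assumes the gensim library, and the tiny corpus is obviously far too small to learn anything real.

```python
# Toy word2vec: words become vectors; related words end up near each other.
from gensim.models import Word2Vec

sentences = [
    ["the", "limits", "of", "my", "language", "are",
     "the", "limits", "of", "my", "world"],
    ["language", "is", "a", "representation", "of", "the", "world"],
]
model = Word2Vec(sentences, vector_size=16, window=3, min_count=1, epochs=50)

print(model.wv["language"][:4])                    # a word as a vector
print(model.wv.most_similar("language", topn=3))   # its nearest neighbors
```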
link |
02:53:22.620
Somehow it just all feels like different people think about similar ideas in different times
link |
02:53:26.780
from absolutely different perspectives.
link |
02:53:29.780
And that's why I like these books.
link |
02:53:31.020
The limits of my language are the limits of my world.
link |
02:53:34.780
And we still have that neon sign, it's very hard to work with this red light in your face.
link |
02:53:45.180
I mean, on the, on the Russian side of things, in terms of language, the limits of language
link |
02:53:53.340
being the limit of our world, you know, Russian is a beautiful language in some sense.
link |
02:53:57.300
There's wit, there's humor, there's pain.
link |
02:54:00.260
There's so much.
link |
02:54:01.260
We don't have time to talk about it much today, but I'm going to Paris to talk to Dostoevsky
link |
02:54:06.820
and Tolstoy translators.
link |
02:54:09.340
I think it's this fascinating art, like art and engineering combined; it's such an interesting
link |
02:54:15.660
process.
link |
02:54:16.660
But from the Replika perspective, what do you think about translation?
link |
02:54:23.740
How difficult is it to create a deep, meaningful connection in Russian versus English?
link |
02:54:29.500
How do you translate between the two languages?
link |
02:54:32.300
You speak both?
link |
02:54:33.300
Yeah.
link |
02:54:34.300
I think we're two different people in different languages.
link |
02:54:37.020
Even, you know, thinking about it, there's actually some research on that.
link |
02:54:41.100
I looked into that at some point because I was fascinated by the fact that what I'm talking
link |
02:54:45.020
about with my Russian therapist has nothing to do with
link |
02:54:48.380
what I'm talking about with my English speaking therapist.
link |
02:54:51.820
It's two different lives, two different types of conversations, two different personas.
link |
02:54:59.840
The main difference between the two languages, Russian and English, is that,
link |
02:55:05.380
well, English is like a piano.
link |
02:55:06.820
It has a limited number of keys; a lot of different keys, but not too many.
link |
02:55:11.420
And Russian is like an organ or something.
link |
02:55:13.700
It's just something gigantic with so many different keys and so many different opportunities
link |
02:55:18.220
to screw up and so many opportunities to do something completely tone deaf.
link |
02:55:24.500
It is just a much harder language to use.
link |
02:55:28.220
It has way too much flexibility and way too many tones.
link |
02:55:34.180
What about the entirety of like World War II, communism, Stalin, the pain of the people
link |
02:55:40.700
having been deceived by the dream, all the pain of just the entirety of
link |
02:55:47.860
it.
link |
02:55:48.860
Is that in the language too?
link |
02:55:49.860
Does that have to do with the language?
link |
02:55:50.860
Oh, for sure.
link |
02:55:51.860
I mean, we have words that don't have a direct translation to English and that are very
link |
02:55:56.340
much ours. We have one which is sort of like to hold a grudge or something, but it doesn't
link |
02:56:03.460
require you to have anyone do it to you.
link |
02:56:07.780
It's just your state.
link |
02:56:08.780
Yeah.
link |
02:56:09.780
You just feel like that.
link |
02:56:10.780
You feel betrayed by other people, basically, but it's not quite that, and you can't really translate
link |
02:56:15.140
that.
link |
02:56:16.140
And I think that's super important.
link |
02:56:18.100
There are very many words that are very specific, that explain the Russian being, and I think they
link |
02:56:24.020
can only come from a nation that suffered so much and saw institutions fall time after
link |
02:56:31.220
time after time. And you know, what's exciting, maybe exciting is the wrong word,
link |
02:56:36.420
but what's interesting about like my generation, my mom's generation, my parents generation,
link |
02:56:42.700
that we saw institutions fall two or three times in our lifetime and most Americans have
link |
02:56:48.420
never seen them fall and they just think that they exist forever, which is really interesting,
link |
02:56:55.260
but it's definitely a country that suffered so much and it makes, unfortunately when I
link |
02:57:01.300
go back and I, you know, hang out with my Russian friends, it makes people very cynical.
link |
02:57:06.580
They stop believing in the future.
link |
02:57:10.140
I hope that's not going to be the case for so long or something's going to change again,
link |
02:57:15.380
but I think seeing institutions fall is a very traumatic experience.
link |
02:57:19.980
That's very interesting. And 2020 is a very interesting year. Do you think civilization
link |
02:57:28.220
will collapse?
link |
02:57:29.220
See, I'm a very practical person.
link |
02:57:33.580
We're speaking in English.
link |
02:57:34.580
So like you said, you're a different person in English and Russian.
link |
02:57:37.420
So in Russian you might answer that differently, but in English, yeah.
link |
02:57:42.140
I'm an optimist, and I generally believe that, you know, even though the
link |
02:57:49.180
prospects are grim, there's always a place for a miracle.
link |
02:57:54.820
I mean, it's always been like that with my life.
link |
02:57:56.940
So yeah, my life has been, I've been incredibly lucky and things just, miracles happen all
link |
02:58:02.740
the time with this company, with people I know, with everything around me.
link |
02:58:08.100
And so I didn't mention that book, but maybe In Search of the Miraculous,
link |
02:58:13.620
or whatever the English translation is; a good Russian book for everyone to read.
link |
02:58:19.540
Yeah.
link |
02:58:20.540
I mean, if you put good vibes, if you put love out there in the world, miracles somehow
link |
02:58:29.740
happen.
link |
02:58:30.740
Yeah.
link |
02:58:31.740
I believe that too, or at least I believe that, I don't know.
link |
02:58:35.860
Let me ask the most absurd, final, ridiculous question of, we've talked about life a lot.
link |
02:58:42.380
What do you think is the meaning of it all?
link |
02:58:45.380
What's the meaning of life?
link |
02:58:46.860
I mean, my answer is probably going to be pretty cheesy.
link |
02:58:52.700
But I think the state of love is once you feel it, in a way that we've discussed it
link |
02:58:59.580
before.
link |
02:59:00.580
I'm not talking about falling in love, where...
link |
02:59:04.860
Just love.
link |
02:59:05.860
To yourself, to other people, to something, to the world.
link |
02:59:10.740
That state of bliss that we experience sometimes, whether through connection with ourselves,
link |
02:59:16.340
with our people, with the technology, there's something special about those moments.
link |
02:59:23.620
So I would say, if anything, that's the only...
link |
02:59:30.500
If it's not for that, then what else are we really doing all of this for?
link |
02:59:35.620
I don't think there's a better way to end it than talking about love.
link |
02:59:38.820
Eugenia, I told you offline that there's something about me that felt like this...
link |
02:59:47.780
Talking to you, meeting you in person would be a turning point for my life.
link |
02:59:51.700
I know that might sound weird to hear, but it was a huge honor to talk to you.
link |
02:59:59.500
I hope we talk again.
link |
03:00:01.100
Thank you so much for your time.
link |
03:00:02.100
Thank you so much, Lex.
link |
03:00:05.020
Thanks for listening to this conversation with Eugenia Kuyda, and thank you to our sponsors,
link |
03:00:09.780
DoorDash, Dollar Shave Club, and Cash App.
link |
03:00:13.460
Click the sponsor links in the description to get a discount and to support this podcast.
link |
03:00:18.180
If you enjoy this thing, subscribe on YouTube, review it with 5 stars on Apple Podcast, follow
link |
03:00:23.140
on Spotify, support on Patreon, or connect with me on Twitter at Lex Friedman.
link |
03:00:28.600
And now, let me leave you with some words from Carl Sagan.
link |
03:00:32.300
The world is so exquisite with so much love and moral depth that there's no reason to
link |
03:00:36.740
deceive ourselves with pretty stories of which there's little good evidence.
link |
03:00:41.460
Far better, it seems to me, in our vulnerability is to look death in the eye and to be grateful
link |
03:00:48.080
every day for the brief but magnificent opportunity that life provides.
link |
03:00:54.700
Thank you for listening and hope to see you next time.