Whitney Cummings: Comedy, Robotics, Neurology, and Love | Lex Fridman Podcast #55



link |
00:00:00.000
The following is a conversation with Whitney Cummings.
link |
00:00:03.640
She's a standup comedian, actor, producer, writer, director,
link |
00:00:07.240
and recently, finally, the host of her very own podcast
link |
00:00:11.280
called Good For You.
link |
00:00:12.920
Her most recent Netflix special called Can I Touch It?
link |
00:00:15.920
features in part a robot she affectionately named
link |
00:00:19.160
Bearclaw that is designed to be visually a replica of Whitney.
link |
00:00:23.420
It's exciting for me to see one of my favorite comedians
link |
00:00:26.000
explore the social aspects of robotics and AI in our society.
link |
00:00:30.720
She also has some fascinating ideas
link |
00:00:32.920
about human behavior, psychology, and neurology,
link |
00:00:36.000
some of which she explores in her book
link |
00:00:37.800
called I'm Fine and Other Lies.
link |
00:00:41.240
It was truly a pleasure to meet Whitney
link |
00:00:43.320
and have this conversation with her
link |
00:00:45.200
and even to continue it through text afterwards.
link |
00:00:47.920
Every once in a while, late at night,
link |
00:00:50.160
I'll be programming over a cup of coffee
link |
00:00:52.360
and will get a text from Whitney saying something hilarious
link |
00:00:55.760
or weirder yet, sending a video of Brian Callen
link |
00:00:58.960
saying something hilarious.
link |
00:01:00.960
That's when I know the universe has a sense of humor
link |
00:01:03.880
and it gifted me with one hell of an amazing journey.
link |
00:01:07.400
Then I put the phone down and go back to programming
link |
00:01:10.320
with a stupid, joyful smile on my face.
link |
00:01:13.400
If you enjoy this conversation,
link |
00:01:14.960
listen to Whitney's podcast, Good For You,
link |
00:01:17.200
and follow her on Twitter and Instagram.
link |
00:01:19.840
This is the Artificial Intelligence Podcast.
link |
00:01:22.640
If you enjoy it, subscribe on YouTube,
link |
00:01:24.840
give it five stars on Apple Podcasts,
link |
00:01:26.800
support on Patreon, or simply connect with me on Twitter
link |
00:01:30.160
at Lex Friedman, spelled F R I D M A N.
link |
00:01:34.040
This show is presented by Cash App,
link |
00:01:35.840
the number one finance app in the App Store.
link |
00:01:38.280
They regularly support Whitney's Good For You podcast
link |
00:01:40.480
as well.
link |
00:01:41.440
I personally use Cash App to send money to friends,
link |
00:01:43.920
but you can also use it to buy, sell,
link |
00:01:45.680
and deposit Bitcoin in just seconds.
link |
00:01:47.960
Cash App also has a new investing feature.
link |
00:01:50.680
You can buy fractions of a stock, say $1 worth,
link |
00:01:53.480
no matter what the stock price is.
link |
00:01:55.760
Broker services are provided by Cash App Investing,
link |
00:01:58.560
a subsidiary of Square, and member SIPC.
link |
00:02:02.000
I'm excited to be working with Cash App
link |
00:02:04.120
to support one of my favorite organizations called FIRST,
link |
00:02:07.160
best known for their FIRST Robotics and Lego competitions.
link |
00:02:10.400
They educate and inspire hundreds of thousands of students
link |
00:02:13.800
in over 110 countries,
link |
00:02:16.000
and have a perfect rating on Charity Navigator,
link |
00:02:18.680
which means the donated money
link |
00:02:19.960
is used to maximum effectiveness.
link |
00:02:22.360
When you get Cash App from the App Store or Google Play,
link |
00:02:25.240
and use code LEXPODCAST, you'll get $10,
link |
00:02:28.920
and Cash App will also donate $10 to FIRST,
link |
00:02:32.080
which again, is an organization that I've personally seen
link |
00:02:35.240
inspire girls and boys to dream
link |
00:02:37.360
of engineering a better world.
link |
00:02:40.440
This podcast is supported by ZipRecruiter.
link |
00:02:43.040
Hiring great people is hard,
link |
00:02:45.200
and to me is the most important element
link |
00:02:47.280
of a successful mission driven team.
link |
00:02:50.080
I've been fortunate to be a part of,
link |
00:02:52.080
and to lead several great engineering teams.
link |
00:02:54.680
The hiring I've done in the past
link |
00:02:56.640
was mostly through tools that we built ourselves,
link |
00:02:59.360
but reinventing the wheel was painful.
link |
00:03:02.360
ZipRecruiter is a tool that's already available for you.
link |
00:03:05.200
It seeks to make hiring simple, fast, and smart.
link |
00:03:08.960
For example, Kodable cofounder Gretchen Huebner
link |
00:03:11.920
used ZipRecruiter to find a new game artist
link |
00:03:14.360
to join her education tech company.
link |
00:03:16.680
By using ZipRecruiter screening questions
link |
00:03:18.800
to filter candidates, Gretchen found it easier
link |
00:03:21.560
to focus on the best candidates,
link |
00:03:23.040
and finally hired the perfect person for the role
link |
00:03:26.400
in less than two weeks from start to finish.
link |
00:03:29.400
ZipRecruiter, the smartest way to hire.
link |
00:03:32.480
See why ZipRecruiter is effective
link |
00:03:34.120
for businesses of all sizes by signing up as I did
link |
00:03:37.520
for free at ziprecruiter.com slash lexpod.
link |
00:03:41.520
That's ziprecruiter.com slash lexpod.
link |
00:03:45.320
And now, here's my conversation with Whitney Cummings.
link |
00:03:51.560
I have trouble making eye contact, as you can tell.
link |
00:03:53.720
Me too.
link |
00:03:54.560
Did you know that I had to work on making eye contact
link |
00:03:56.920
because I used to look here?
link |
00:03:58.840
Do you see what I'm doing?
link |
00:03:59.680
That helps, yeah, yeah, yeah.
link |
00:04:00.500
Do you want me to do that?
link |
00:04:01.760
Well, I'll do this way, I'll cheat the camera.
link |
00:04:03.560
But I used to do this, and finally people,
link |
00:04:05.720
like I'd be on dates and guys would be like,
link |
00:04:07.380
are you looking at my hair?
link |
00:04:08.220
Like they get, it would make people really insecure
link |
00:04:10.860
because I didn't really get a lot of eye contact as a kid.
link |
00:04:13.160
It's one to three years.
link |
00:04:14.720
Did you not get a lot of eye contact as a kid?
link |
00:04:16.440
I don't know.
link |
00:04:17.280
I haven't done the soul searching.
link |
00:04:19.560
Right.
link |
00:04:20.760
So, but there's definitely some psychological issues.
link |
00:04:24.200
Makes you uncomfortable.
link |
00:04:25.520
Yeah, for some reason when I connect eyes,
link |
00:04:27.880
I start to think, I assume that you're judging me.
link |
00:04:31.800
Oh, well, I am.
link |
00:04:33.320
That's why you assume that.
link |
00:04:34.440
Yeah.
link |
00:04:35.280
We all are.
link |
00:04:36.100
All right.
link |
00:04:36.940
This is perfect.
link |
00:04:37.760
The podcast would be me and you both
link |
00:04:38.600
staring at the table the whole time.
link |
00:04:42.400
Do you think robots of the future,
link |
00:04:44.200
ones with human level intelligence,
link |
00:04:45.960
will be female, male, genderless,
link |
00:04:49.480
or another gender we have not yet created as a society?
link |
00:04:53.220
You're the expert at this.
link |
00:04:55.220
Well, I'm gonna ask you.
link |
00:04:56.060
You know the answer.
link |
00:04:57.320
I'm gonna ask you questions
link |
00:04:58.680
that maybe nobody knows the answer to.
link |
00:05:00.920
Okay.
link |
00:05:01.960
And then I just want you to hypothesize
link |
00:05:04.080
as an imaginative author, director, comedian.
link |
00:05:10.960
Can we just be very clear
link |
00:05:12.120
that you know a ton about this
link |
00:05:14.200
and I know nothing about this,
link |
00:05:15.780
but I have thought a lot about
link |
00:05:19.600
what I think robots can fix in our society.
link |
00:05:22.840
And I mean, I'm a comedian.
link |
00:05:24.400
It's my job to study human nature,
link |
00:05:27.520
to make jokes about human nature
link |
00:05:28.860
and to sometimes play devil's advocate.
link |
00:05:31.080
And I just see such a tremendous negativity around robots
link |
00:05:35.160
or at least the idea of robots that it was like,
link |
00:05:38.040
oh, I'm just gonna take the opposite side for fun,
link |
00:05:40.660
for jokes and then I was like,
link |
00:05:43.400
oh no, I really agree with this devil's advocate argument.
link |
00:05:45.920
So please correct me when I'm wrong about this stuff.
link |
00:05:49.400
So first of all, there's no right and wrong
link |
00:05:51.780
because we're all,
link |
00:05:53.840
I think most of the people working on robotics
link |
00:05:55.920
are really not actually even thinking
link |
00:05:57.580
about some of the big picture things
link |
00:06:00.040
that you've been exploring.
link |
00:06:01.280
In fact, your robot, what's her name by the way?
link |
00:06:04.600
Bearclaw.
link |
00:06:05.440
We'll go with Bearclaw.
link |
00:06:06.280
What's the genesis of that name by the way?
link |
00:06:11.720
Bearclaw was, I got, I don't even remember the joke
link |
00:06:15.000
cause I black out after I shoot specials,
link |
00:06:16.680
but I was writing something about like the pet names
link |
00:06:19.200
that men call women, like cupcake, sweetie, honey,
link |
00:06:22.960
you know, like we're always named after desserts
link |
00:06:26.560
or something and I was just writing a joke about,
link |
00:06:29.920
if you wanna call us a dessert,
link |
00:06:31.000
at least pick like a cool dessert, you know,
link |
00:06:33.200
like Bearclaw, like something cool.
link |
00:06:35.680
So I ended up calling her Bearclaw.
link |
00:06:38.240
So do you think the future robots
link |
00:06:42.280
of greater and greater intelligence
link |
00:06:44.440
would we like to make them female, male?
link |
00:06:46.520
Would we like to assign them gender
link |
00:06:48.560
or would we like to move away from gender
link |
00:06:50.800
and say something more ambiguous?
link |
00:06:54.000
I think it depends on their purpose, you know?
link |
00:06:56.360
I feel like if it's a sex robot,
link |
00:06:59.920
people prefer certain genders, you know?
link |
00:07:01.900
And I also, you know, when I went down and explored the robot
link |
00:07:05.840
factory, I was asking about the type of people
link |
00:07:07.680
that bought sex robots.
link |
00:07:09.240
And I was very surprised at the answer
link |
00:07:12.200
because of course the stereotype
link |
00:07:14.160
was it's gonna be a bunch of perverts.
link |
00:07:15.320
It ended up being a lot of people that were handicapped,
link |
00:07:18.580
a lot of people with erectile dysfunction
link |
00:07:20.520
and a lot of people that were exploring their sexuality.
link |
00:07:23.880
A lot of people that thought they were gay,
link |
00:07:25.920
but weren't sure, but didn't wanna take the risk
link |
00:07:28.120
of trying on someone that could reject them
link |
00:07:31.660
and being embarrassed or they were closeted
link |
00:07:33.880
or in a city where maybe that's, you know,
link |
00:07:36.320
taboo and stigmatized, you know?
link |
00:07:37.920
So I think that a gendered sex robot
link |
00:07:40.560
would serve an important purpose
link |
00:07:42.340
for someone trying to explore their sexuality.
link |
00:07:44.160
Am I into men?
link |
00:07:45.000
Let me try on this thing first.
link |
00:07:46.160
Am I into women?
link |
00:07:47.000
Let me try on this thing first.
link |
00:07:48.220
So I think gendered robots would be important for that.
link |
00:07:51.200
But I think genderless robots in terms of
link |
00:07:53.760
emotional support robots, babysitters,
link |
00:07:56.520
I'm fine for a genderless babysitter
link |
00:07:58.720
with my husband in the house.
link |
00:08:00.100
You know, there are places that I think
link |
00:08:02.080
that genderless makes a lot of sense,
link |
00:08:04.640
but obviously not in the sex area.
link |
00:08:07.560
What do you mean with your husband in the house?
link |
00:08:09.920
What does that have to do with the gender of the robot?
link |
00:08:11.800
Right, I mean, I don't have a husband,
link |
00:08:13.040
but hypothetically speaking,
link |
00:08:14.340
I think every woman's worst nightmare
link |
00:08:15.720
is like the hot babysitter.
link |
00:08:17.240
You know what I mean?
link |
00:08:19.280
So I think that there is a time and place,
link |
00:08:21.700
I think, for genderless, you know, teachers, doctors,
link |
00:08:25.360
all that kind of, it would be very awkward
link |
00:08:27.280
if the first robotic doctor was a guy
link |
00:08:29.700
or the first robotic nurse was a woman.
link |
00:08:32.600
You know, it's sort of, that stuff is still loaded.
link |
00:08:36.100
I think that genderless could just take
link |
00:08:38.440
the unnecessary drama out of it
link |
00:08:43.900
and the possibility to sexualize them
link |
00:08:46.000
or be triggered by any of that stuff.
link |
00:08:49.520
So there's two components to this, to Bearclaw.
link |
00:08:52.800
So one is the voice and the talking and so on,
link |
00:08:55.040
and then there's the visual appearance.
link |
00:08:56.360
So on the topic of gender and genderless,
link |
00:08:59.580
in your experience, what has been the value
link |
00:09:03.160
of the physical appearance?
link |
00:09:04.900
So has it added much to the depth of the interaction?
link |
00:09:08.980
I mean, mine's kind of an extenuating circumstance
link |
00:09:11.200
because she is supposed to look exactly like me.
link |
00:09:13.540
I mean, I spent six months getting my face molded
link |
00:09:15.980
and having, you know, the idea was I was exploring
link |
00:09:19.440
the concept of can robots replace us?
link |
00:09:21.220
Because that's the big fear,
link |
00:09:22.480
but also the big dream in a lot of ways.
link |
00:09:24.240
And I wanted to dig into that area because, you know,
link |
00:09:28.200
for a lot of people, it's like,
link |
00:09:29.920
they're gonna take our jobs and they're gonna replace us.
link |
00:09:32.080
Legitimate fear, but then a lot of women I know are like,
link |
00:09:34.360
I would love for a robot to replace me every now and then
link |
00:09:36.960
so it can go to baby showers for me
link |
00:09:38.880
and it can pick up my kids at school
link |
00:09:40.180
and it can cook dinner and whatever.
link |
00:09:42.480
So I just think that was an interesting place to explore.
link |
00:09:45.200
So her looking like me was a big part of it.
link |
00:09:47.040
Now her looking like me just adds
link |
00:09:49.080
an unnecessary level of insecurity
link |
00:09:51.360
because I got her a year ago
link |
00:09:53.520
and she already looks younger than me.
link |
00:09:54.680
So that's a weird problem.
link |
00:09:57.640
But I think that her looking human was the idea.
link |
00:10:00.680
And I think that where we are now,
link |
00:10:03.080
please correct me if I'm wrong,
link |
00:10:04.800
a human robot resembling an actual human you know
link |
00:10:09.860
is going to feel more realistic than some generic face.
link |
00:10:13.760
Well, you're saying that robots that have some familiarity
link |
00:10:19.120
like look similar to somebody that you actually know
link |
00:10:22.480
you'll be able to form a deeper connection with?
link |
00:10:24.560
That was the question. I think so on some level, right?
link |
00:10:26.000
That's an open question.
link |
00:10:26.960
I don't, you know, it's an interesting.
link |
00:10:30.240
Or the opposite, because then you know me
link |
00:10:32.080
and you're like, well, I know this isn't real
link |
00:10:33.360
because you're right here.
link |
00:10:34.560
So maybe it does the opposite.
link |
00:10:36.280
We have a very keen eye for human faces
link |
00:10:39.240
and are able to detect strangeness,
link |
00:10:41.840
especially when it has to do with people
link |
00:10:44.360
whose faces we've seen a lot of.
link |
00:10:46.400
So I tend to be a bigger fan
link |
00:10:48.880
of moving away completely from faces.
link |
00:10:52.840
Of recognizable faces?
link |
00:10:54.200
No, just human faces at all.
link |
00:10:56.000
In general, because I think that's where things get dicey.
link |
00:10:58.280
And one thing I will say is
link |
00:11:00.440
I think my robot is more realistic than other robots
link |
00:11:03.040
not necessarily because you have seen me
link |
00:11:05.160
and then you see her and you go, oh, they're so similar
link |
00:11:07.600
but also because human faces are flawed and asymmetrical.
link |
00:11:11.120
And sometimes we forget when we're making things
link |
00:11:13.440
that are supposed to look human,
link |
00:11:14.400
we make them too symmetrical
link |
00:11:16.000
and that's what makes them stop looking human.
link |
00:11:17.960
So because they molded my asymmetrical face,
link |
00:11:20.520
she just, even if someone didn't know who I was
link |
00:11:22.860
I think she'd look more realistic than most generic ones
link |
00:11:26.560
that didn't have some kind of flaws.
link |
00:11:28.880
Got it.
link |
00:11:29.720
Because they start looking creepy
link |
00:11:30.760
when they're too symmetrical because human beings aren't.
link |
00:11:33.240
Yeah, the flaws are what it means to be human.
link |
00:11:35.720
So visually as well.
link |
00:11:37.720
But I'm just a fan of the idea
link |
00:11:39.400
of letting humans use a little bit more imagination.
link |
00:11:43.280
So just hearing the voice is enough for us humans
link |
00:11:47.560
to then start imagining the visual appearance
link |
00:11:50.280
that goes along with that voice.
link |
00:11:52.020
And you don't necessarily need to work too hard
link |
00:11:54.480
on creating the actual visual appearance.
link |
00:11:56.920
So there's some value to that.
link |
00:11:59.120
When you step into the space of actually building a robot
link |
00:12:03.360
that looks like Bearclaw,
link |
00:12:04.400
there's such a long road of facial expressions
link |
00:12:07.680
of sort of making everything smiling, winking,
link |
00:12:13.120
rolling the eyes, all that kind of stuff.
link |
00:12:14.900
It gets really, really tricky.
link |
00:12:16.520
It gets tricky and I think I'm, again, I'm a comedian.
link |
00:12:19.160
Like I'm obsessed with what makes us human
link |
00:12:21.800
and our human nature and the nasty side of human nature
link |
00:12:25.440
tends to be where I've ended up
link |
00:12:27.560
exploring over and over again.
link |
00:12:28.800
And I was just mostly fascinated by people's reaction.
link |
00:12:32.620
So it's my job to get the biggest reaction
link |
00:12:34.520
from a group of strangers, the loudest possible reaction.
link |
00:12:37.460
And I just had this instinct
link |
00:12:39.880
just when I started building her
link |
00:12:41.640
and people going, ah, ah, and people scream.
link |
00:12:44.560
And I mean, I would bring her out on stage
link |
00:12:46.020
and people would scream.
link |
00:12:48.080
And I just, to me, that was the next level of entertainment.
link |
00:12:51.520
Getting a laugh, I've done that, I know how to do that.
link |
00:12:53.520
I think comedians are always trying to figure out
link |
00:12:54.940
what the next level is and comedy's evolving so much.
link |
00:12:57.280
And Jordan Peele had just done
link |
00:12:59.880
these genius comedy horror movies,
link |
00:13:01.760
which feel like the next level of comedy to me.
link |
00:13:04.480
And this sort of funny horror of a robot
link |
00:13:10.040
was fascinating to me.
link |
00:13:11.680
But I think the thing that I got the most obsessed with
link |
00:13:15.520
was people being freaked out and scared of her.
link |
00:13:18.200
And I started digging around with pathogen avoidance
link |
00:13:21.640
and the idea that we've essentially evolved
link |
00:13:24.040
to be repelled by anything that looks human,
link |
00:13:27.120
but is off a little bit.
link |
00:13:28.880
Anything that could be sick or diseased or dead,
link |
00:13:32.100
essentially, is our reptilian brain's way
link |
00:13:33.940
to get us to not try to have sex with it, basically.
link |
00:13:38.660
So I got really fascinated by how freaked out and scared people were.
link |
00:13:41.880
I mean, I would see grown men get upset.
link |
00:13:44.360
They'd go, get that thing away from me,
link |
00:13:45.360
look, I don't like that, like people would get angry.
link |
00:13:47.880
And it was like, you know what this is, you know?
link |
00:13:50.840
But the sort of like, you know, amygdala getting activated
link |
00:13:55.160
by something that to me is just a fun toy
link |
00:13:58.560
said a lot about our history as a species
link |
00:14:02.080
and what got us into trouble thousands of years ago.
link |
00:14:04.720
So it's that, it's the deep down stuff
link |
00:14:07.280
that's in our genetics, but also is it just,
link |
00:14:10.080
are people freaked out by the fact that there's a robot?
link |
00:14:13.080
So it's not just the appearance,
link |
00:14:14.840
but there's an artificial human.
link |
00:14:17.860
Anything people, I think, and I'm just also fascinated
link |
00:14:21.280
by the blind spots humans have.
link |
00:14:23.040
So the idea that you're afraid of that,
link |
00:14:24.760
I mean, how many robots have killed people?
link |
00:14:27.200
How many humans have died at the hands of other humans?
link |
00:14:29.800
Yeah, a few more. Millions?
link |
00:14:31.400
Hundreds of millions?
link |
00:14:32.880
Yet we're scared of that?
link |
00:14:34.560
And we'll go to the grocery store
link |
00:14:36.080
and be around a bunch of humans
link |
00:14:37.200
who statistically the chances are much higher
link |
00:14:39.520
that you're gonna get killed by humans.
link |
00:14:40.700
So I'm just fascinated by without judgment
link |
00:14:43.600
how irrational we are as a species.
link |
00:14:47.840
The worry is the exponential.
link |
00:14:49.280
So it's, you know, you can say the same thing
link |
00:14:51.400
about nuclear weapons before we dropped
link |
00:14:54.160
them on Hiroshima and Nagasaki.
link |
00:14:55.760
So the worry that people have is the exponential growth.
link |
00:14:59.400
So it's like, oh, it's fun and games right now,
link |
00:15:03.680
but you know, overnight,
link |
00:15:07.160
especially if a robot provides value to society,
link |
00:15:10.000
we'll put one in every home
link |
00:15:11.680
and then all of a sudden lose track
link |
00:15:13.840
of the actual large scale impact it has on society.
link |
00:15:17.280
And then all of a sudden gain greater and greater control
link |
00:15:20.280
to where they'll, you know,
link |
00:15:22.540
affect our political system
link |
00:15:23.920
and then affect our decisions.
link |
00:15:25.760
Didn't robots already ruin our political system?
link |
00:15:27.720
Didn't that just already happen?
link |
00:15:28.720
Which ones? Oh, Russia hacking.
link |
00:15:30.800
No offense, but hasn't that already happened?
link |
00:15:35.000
I mean, that was like an algorithm
link |
00:15:36.440
of negative things being clicked on more.
link |
00:15:39.320
We like to tell stories
link |
00:15:40.760
and like to demonize certain people.
link |
00:15:43.640
I think nobody understands our current political system
link |
00:15:46.840
or discourse on Twitter, the Twitter mobs.
link |
00:15:49.680
Nobody has a sense, not Twitter, not Facebook,
link |
00:15:52.560
the people running it.
link |
00:15:53.400
Nobody understands the impact of these algorithms.
link |
00:15:55.360
They're trying their best.
link |
00:15:56.880
Despite what people think,
link |
00:15:57.940
they're not like a bunch of lefties
link |
00:16:00.200
trying to make sure that Hillary Clinton gets elected.
link |
00:16:03.240
It's more that it's an incredibly complex system
link |
00:16:06.840
that we don't, and that's the worry.
link |
00:16:08.860
It's so complex and moves so fast
link |
00:16:11.440
that nobody will be able to stop it once it happens.
link |
00:16:15.760
And let me ask a question.
link |
00:16:16.920
This is a very savage question.
link |
00:16:19.440
Which is, is this just the next stage of evolution?
link |
00:16:23.840
As humans, people will die, yes.
link |
00:16:26.080
I mean, that's always happened, you know?
link |
00:16:28.200
Is this just taking emotion out of it?
link |
00:16:30.320
Is this basically the next stage of survival of the fittest?
link |
00:16:34.960
Yeah, you have to think of organisms.
link |
00:16:37.760
You know, what does it mean to be a living organism?
link |
00:16:41.360
Like, is a smartphone part of your living organism, or?
link |
00:16:46.760
We're in relationships with our phones.
link |
00:16:49.680
Yeah.
link |
00:16:50.520
We have sex through them, with them.
link |
00:16:52.880
What's the difference between with them and through them?
link |
00:16:54.440
But it also expands your cognitive abilities,
link |
00:16:57.080
expands your memory, knowledge, and so on.
link |
00:16:59.060
So you're a much smarter person
link |
00:17:00.640
because you have a smartphone in your hand.
link |
00:17:02.600
But as soon as it's out of my hand,
link |
00:17:04.780
we've got big problems,
link |
00:17:06.120
because we've become sort of so morphed with them.
link |
00:17:08.360
Well, there's a symbiotic relationship.
link |
00:17:09.960
And that's what, so Elon Musk with Neuralink
link |
00:17:12.520
is working on trying to increase the bandwidth
link |
00:17:16.640
of communication between computers and your brain.
link |
00:17:19.320
And so further and further expand our ability
link |
00:17:22.800
as human beings to sort of leverage machines.
link |
00:17:26.300
And maybe that's the future,
link |
00:17:28.220
the next evolutionary step.
link |
00:17:30.480
It could be also that, yes, we'll give birth,
link |
00:17:33.920
just like we give birth to human children right now,
link |
00:17:36.520
we'll give birth to AI and they'll replace us.
link |
00:17:38.960
I think it's a really interesting possibility.
link |
00:17:42.200
I'm gonna play devil's advocate.
link |
00:17:44.080
I just think that the fear of robots is wildly classist.
link |
00:17:48.320
Because, I mean, Facebook,
link |
00:17:50.120
like it's easy for us to say they're taking their data.
link |
00:17:51.960
Okay, well, a lot of people
link |
00:17:53.560
that get employment off of Facebook,
link |
00:17:55.720
they are able to get income off of Facebook.
link |
00:17:58.220
They don't care if you take their phone numbers
link |
00:17:59.800
and their emails and their data, as long as it's free.
link |
00:18:01.920
They don't wanna have to pay $5 a month for Facebook.
link |
00:18:03.900
Facebook is a wildly democratic thing.
link |
00:18:05.800
Forget about the election and all that kind of stuff.
link |
00:18:08.240
A lot of technology making people's lives easier,
link |
00:18:12.480
I find that most elite people are more scared
link |
00:18:17.100
than lower income people.
link |
00:18:18.800
So, and women for the most part.
link |
00:18:21.180
So the idea of something that's stronger than us
link |
00:18:23.940
and that might eventually kill us,
link |
00:18:25.160
like women are used to that.
link |
00:18:26.560
Like that's not, I see a lot of like really rich men
link |
00:18:29.980
being like, the robots are gonna kill us.
link |
00:18:31.240
We're like, what's another thing that's gonna kill us?
link |
00:18:33.800
I tend to see like, oh,
link |
00:18:35.480
something can walk me to my car at night.
link |
00:18:37.160
Like something can help me cook dinner or something.
link |
00:18:39.880
For people in underprivileged countries
link |
00:18:43.020
who can't afford eye surgery, like, with a robot,
link |
00:18:45.360
can we send a robot to underprivileged places
link |
00:18:48.840
to do surgery where they can't?
link |
00:18:50.680
I work with this organization called Operation Smile
link |
00:18:53.560
where they do cleft palate surgeries.
link |
00:18:55.660
And there's a lot of places
link |
00:18:56.500
that can't do a very simple surgery
link |
00:18:59.160
because they can't afford doctors and medical care.
link |
00:19:01.080
And such.
link |
00:19:01.920
So I just see, and this can be completely naive
link |
00:19:04.840
and I could be completely wrong,
link |
00:19:05.840
but I feel like a lot of people are going like,
link |
00:19:08.780
the robots are gonna destroy us.
link |
00:19:09.920
Humans, we're destroying ourselves.
link |
00:19:11.640
We're self destructing.
link |
00:19:12.840
Robots to me are the only hope
link |
00:19:14.320
to clean up all the messes that we've created.
link |
00:19:15.960
Even when we go try to clean up pollution in the ocean,
link |
00:19:18.240
we make it worse because of the oil that the tankers use.
link |
00:19:21.720
Like, it's like, to me, robots are the only solution.
link |
00:19:25.400
Firefighters are heroes, but they're limited
link |
00:19:27.880
in how many times they can run into a fire.
link |
00:19:30.180
So there's just something interesting to me.
link |
00:19:32.360
I'm not hearing a lot of like,
link |
00:19:34.360
lower income, more vulnerable populations
link |
00:19:38.080
talking about robots.
link |
00:19:39.920
Maybe you can speak to it a little bit more.
link |
00:19:42.000
There's an idea, I think you've expressed it.
link |
00:19:44.100
I've heard, actually a few female writers
link |
00:19:48.240
and roboticists express this idea
link |
00:19:51.480
that exactly you just said, which is,
link |
00:19:55.760
it just seems that being afraid of existential threats
link |
00:20:01.680
of artificial intelligence is a male issue.
link |
00:20:06.240
Yeah.
link |
00:20:07.280
And I wonder what that is.
link |
00:20:09.720
If it, because men have, in certain positions,
link |
00:20:13.680
like you said, it's also a classist issue.
link |
00:20:15.640
They haven't been humbled by life,
link |
00:20:17.400
and so you always look for the biggest problems
link |
00:20:20.680
to take on around you.
link |
00:20:22.380
It's a champagne problem to be afraid of robots.
link |
00:20:24.200
Most people don't have health insurance.
link |
00:20:26.440
They're afraid they're not gonna be able
link |
00:20:27.520
to feed their kids.
link |
00:20:28.360
They can't afford a tutor for their kids.
link |
00:20:30.000
I mean, I just think of the way I grew up,
link |
00:20:32.400
and I had a mother who worked two jobs, had kids.
link |
00:20:36.160
We couldn't afford an SAT tutor.
link |
00:20:38.600
The idea of a robot coming in,
link |
00:20:40.040
being able to tutor your kids,
link |
00:20:41.080
being able to provide childcare for your kids,
link |
00:20:43.540
being able to come in with cameras for eyes
link |
00:20:45.540
and provide surveillance.
link |
00:20:48.320
I'm very pro surveillance because I've had security problems
link |
00:20:52.280
and I've been, we're generally in a little more danger
link |
00:20:55.760
than you guys are.
link |
00:20:56.600
So I think that robots are a little less scary to us
link |
00:20:58.640
because we can see them maybe as like free assistance,
link |
00:21:01.240
help and protection.
link |
00:21:03.440
And then there's sort of another element for me personally,
link |
00:21:06.840
which is maybe more of a female problem.
link |
00:21:08.800
I don't know.
link |
00:21:09.640
I'm just gonna make a generalization, happy to be wrong.
link |
00:21:13.040
But the emotional sort of component of robots
link |
00:21:18.040
and what they can provide in terms of, you know,
link |
00:21:22.760
I think there's a lot of people that don't have microphones
link |
00:21:25.640
that I just recently kind of stumbled upon
link |
00:21:28.580
in doing all my research on the sex robots
link |
00:21:30.920
for my standup special, which just,
link |
00:21:32.400
there's a lot of very shy people that aren't good at dating.
link |
00:21:35.840
There's a lot of people who are scared of human beings
link |
00:21:37.880
who have personality disorders
link |
00:21:40.500
or grow up in alcoholic homes or struggle with addiction
link |
00:21:43.080
or whatever it is where a robot can solve
link |
00:21:45.720
an emotional problem.
link |
00:21:46.920
And so we're largely having this conversation
link |
00:21:49.800
about like rich guys that are emotionally healthy
link |
00:21:53.600
and how scared of robots they are.
link |
00:21:55.560
We're forgetting about like a huge part of the population
link |
00:21:58.560
who maybe isn't as charming and effervescent
link |
00:22:01.600
and solvent as, you know, people like you and Elon Musk
link |
00:22:05.400
for whom these robots could solve very real problems
link |
00:22:09.240
in their life, emotional or financial.
link |
00:22:11.260
Well, that's a, in general, a really interesting idea
link |
00:22:13.480
that most people in the world don't have a voice.
link |
00:22:16.700
It's a, you've talked about it,
link |
00:22:18.200
sort of even the people on Twitter
link |
00:22:19.960
who are driving the conversation.
link |
00:22:22.800
You said comments, people who leave comments
link |
00:22:25.400
represent a very tiny percent of the population
link |
00:22:28.240
and they're the ones, you know,
link |
00:22:30.700
we tend to think they speak for the population,
link |
00:22:33.280
but it's very possible on many topics they don't at all.
link |
00:22:37.280
And look, I, and I'm sure there's gotta be
link |
00:22:39.240
some kind of legal, you know, sort of structure in place
link |
00:22:43.940
for when the robots happen.
link |
00:22:45.280
You know way more about this than I do,
link |
00:22:46.640
but you know, for me to just go, the robots are bad,
link |
00:22:49.760
that's a wild generalization that I feel like
link |
00:22:51.680
is really inhumane in some way.
link |
00:22:54.080
You know, just after the research I've done,
link |
00:22:56.520
like you're gonna tell me that a man whose wife died
link |
00:22:59.280
suddenly and he feels guilty moving on with a human woman
link |
00:23:02.960
or can't get over the grief,
link |
00:23:04.280
he can't have a sex robot in his own house?
link |
00:23:06.940
Why not?
link |
00:23:07.840
Who cares?
link |
00:23:08.820
Why do you care?
link |
00:23:09.980
Well, there's an interesting aspect of human nature.
link |
00:23:12.720
So, you know, we tend as a civilization
link |
00:23:16.820
to create a group that's the other in all kinds of ways.
link |
00:23:19.920
Right.
link |
00:23:20.760
And so you work with animals too,
link |
00:23:23.520
you're especially sensitive to the suffering of animals.
link |
00:23:26.760
Let me kind of ask, what's your,
link |
00:23:29.400
do you think we'll abuse robots in the future?
link |
00:23:33.920
Do you think some of the darker aspects
link |
00:23:35.920
of human nature will come out?
link |
00:23:37.960
I think some people will,
link |
00:23:39.200
but if we design them properly, the people that do it,
link |
00:23:43.000
we can put it on a record and we can put them in jail.
link |
00:23:46.920
We can find sociopaths more easily, you know, like.
link |
00:23:49.520
But why is that a sociopathic thing to harm a robot?
link |
00:23:53.200
I think, look, I don't know as much about the consciousness
link |
00:23:56.240
and stuff as you do.
link |
00:23:57.920
I guess it would have to be when they're conscious,
link |
00:23:59.840
but it is, you know, the part of the brain
link |
00:24:02.840
that is responsible for compassion,
link |
00:24:04.400
the frontal lobe or whatever,
link |
00:24:05.240
like people that abuse animals also abuse humans
link |
00:24:08.160
and commit other kinds of crimes.
link |
00:24:09.440
Like that's, it's all the same part of the brain.
link |
00:24:11.080
No one abuses animals and then it's like,
link |
00:24:13.440
awesome to women and children
link |
00:24:15.520
and awesome to underprivileged, you know, minorities.
link |
00:24:18.600
Like it's all, so, you know,
link |
00:24:20.480
we've been working really hard to put a database together
link |
00:24:23.000
of all the people that have abused animals.
link |
00:24:24.720
So when they commit another crime, you go, okay, this is,
link |
00:24:27.320
you know, it's all the same stuff.
link |
00:24:29.320
And I think people probably think I'm nuts
link |
00:24:32.360
for a lot of the animal work I do,
link |
00:24:34.760
but because when animal abuse is present,
link |
00:24:37.040
another crime is always present,
link |
00:24:38.880
but the animal abuse is the most socially acceptable.
link |
00:24:40.880
You can kick a dog and there's nothing people can do,
link |
00:24:43.920
but then what they're doing behind closed doors,
link |
00:24:46.560
you can't see.
link |
00:24:47.400
So there's always something else going on,
link |
00:24:48.880
which is why I never feel compunction about it.
link |
00:24:50.680
But I do think we'll start seeing the same thing with robots.
link |
00:24:54.360
The person that kicks the,
link |
00:24:55.520
I felt compassion when, the kicking of the dog robot
link |
00:24:59.720
really pissed me off.
link |
00:25:00.760
I know that they're just trying to get the stability right
link |
00:25:04.080
and all that.
link |
00:25:05.200
But I do think there will come a time
link |
00:25:07.320
where that will be a great way to be able to figure out
link |
00:25:10.680
if somebody has like, you know, antisocial behaviors.
link |
00:25:15.480
You kind of mentioned surveillance.
link |
00:25:18.040
It's also a really interesting idea of yours
link |
00:25:20.000
that you just said, you know,
link |
00:25:21.520
a lot of people seem to be really uncomfortable
link |
00:25:23.400
with surveillance.
link |
00:25:24.240
Yeah.
link |
00:25:25.160
And you just said that, you know what,
link |
00:25:27.200
for me, you know, there's positives for surveillance.
link |
00:25:31.160
I think people behave better
link |
00:25:32.160
when they know they're being watched.
link |
00:25:33.280
And I know this is a very unpopular opinion.
link |
00:25:36.000
I'm talking about it on stage right now.
link |
00:25:38.080
We behave better when we know we're being watched.
link |
00:25:40.320
You and I had a very different conversation
link |
00:25:41.920
before we were recording.
link |
00:25:43.160
We behave different, you sit up
link |
00:25:46.040
and you are on your best behavior.
link |
00:25:47.520
And I'm trying to sound eloquent
link |
00:25:49.320
and I'm trying to not hurt anyone's feelings.
link |
00:25:51.120
And I mean, I have a camera right there.
link |
00:25:52.840
I'm behaving totally different
link |
00:25:54.640
than when we first started talking.
link |
00:25:56.200
You know, when you know there's a camera,
link |
00:25:58.520
you behave differently.
link |
00:25:59.360
I mean, there's cameras all over LA at stoplights
link |
00:26:02.680
so that people don't run stoplights,
link |
00:26:04.000
but there's not even film in it.
link |
00:26:05.800
They don't even use them anymore, but it works.
link |
00:26:07.960
It works.
link |
00:26:08.800
Right?
link |
00:26:09.640
And I'm, you know, working on this thing
link |
00:26:10.680
in standup about surveillance.
link |
00:26:11.920
It's like, that's why we invented Santa Claus.
link |
00:26:14.200
You know, Santa Claus
link |
00:26:15.520
is the first surveillance basically.
link |
00:26:17.760
All we had to say to kids is he's making a list
link |
00:26:20.360
and he's watching you and they behave better.
link |
00:26:22.880
That's brilliant.
link |
00:26:23.720
You know, so I do think that there are benefits
link |
00:26:26.080
to surveillance.
link |
00:26:27.360
You know, I think we all do sketchy things in private
link |
00:26:30.880
and we all have watched weird porn
link |
00:26:33.240
or Googled weird things.
link |
00:26:34.400
And we don't want people to know about it,
link |
00:26:37.000
our secret lives.
link |
00:26:37.880
So I do think that obviously there's,
link |
00:26:40.160
we should be able to have a modicum of privacy,
link |
00:26:42.800
but I tend to think that people
link |
00:26:44.600
that are the most negative about surveillance
link |
00:26:47.480
have the most secrets.
link |
00:26:48.320
The most to hide.
link |
00:26:49.160
Yeah.
link |
00:26:50.480
Well, you should,
link |
00:26:52.280
you're saying you're doing bits on it now?
link |
00:26:54.480
Well, I'm just talking in general about,
link |
00:26:56.640
you know, privacy and surveillance
link |
00:26:58.360
and how paranoid we're kind of becoming
link |
00:27:00.240
and how, you know, I mean, it's just wild to me
link |
00:27:03.600
that people are like, our emails are gonna leak
link |
00:27:05.880
and they're taking our phone numbers.
link |
00:27:07.160
Like there used to be a book full of phone numbers
link |
00:27:11.440
and addresses that were, they just threw it at your door.
link |
00:27:15.520
And we all had a book of everyone's numbers.
link |
00:27:18.040
You know, this is a very new thing.
link |
00:27:20.320
And, you know, I know our amygdala is designed
link |
00:27:22.360
to compound sort of threats
link |
00:27:24.680
and, you know, there's stories about,
link |
00:27:27.360
and I think we all just glom on in a very, you know,
link |
00:27:30.720
tribal way of like, yeah, they're taking our data.
link |
00:27:32.440
Like, we don't even know what that means,
link |
00:27:33.720
but we're like, well, yeah, they, they, you know?
link |
00:27:38.080
So I just think that someone's like, okay, well, so what?
link |
00:27:40.200
They're gonna sell your data?
link |
00:27:41.320
Who cares?
link |
00:27:42.160
Why do you care?
link |
00:27:43.200
First of all, that bit will kill in China.
link |
00:27:47.320
So, and I say that sort of only a little bit joking
link |
00:27:51.080
because a lot of people in China, including the citizens,
link |
00:27:55.200
despite what people in the West think of as abuse,
link |
00:27:59.640
are actually in support of the idea of surveillance.
link |
00:28:03.360
Sort of, they're not in support of the abuse of surveillance,
link |
00:28:06.480
but they're, they like, I mean,
link |
00:28:08.160
the idea of surveillance is kind of like
link |
00:28:11.360
the idea of government, like you said,
link |
00:28:14.160
we behave differently.
link |
00:28:15.920
And in a way, it's almost like why we like sports.
link |
00:28:18.520
There's rules.
link |
00:28:19.960
And within the constraints of the rules,
link |
00:28:22.400
this is a more stable society.
link |
00:28:25.040
And they make good arguments about success,
link |
00:28:28.120
being able to build successful companies,
link |
00:28:30.440
being able to build successful social lives
link |
00:28:32.800
around a fabric that's more stable.
link |
00:28:34.560
When you have surveillance, it keeps the criminals away,
link |
00:28:37.040
keeps people from abusing animals, whatever the values of the society,
link |
00:28:41.880
with surveillance, you can enforce those values better.
link |
00:28:44.800
And here's what I will say.
link |
00:28:45.920
There's a lot of unethical things happening
link |
00:28:47.720
with surveillance.
link |
00:28:48.560
Like I feel the need to really make that very clear.
link |
00:28:52.080
I mean, the fact that Google is like collecting
link |
00:28:54.080
if people's hands start moving on the mouse
link |
00:28:55.960
to find out if they're getting Parkinson's
link |
00:28:58.440
and then their insurance goes up,
link |
00:29:00.080
like that is completely unethical and wrong.
link |
00:29:02.200
And I think stuff like that,
link |
00:29:03.360
we have to really be careful around.
link |
00:29:05.880
So the idea of using our data to raise our insurance rates
link |
00:29:08.640
or, you know, I heard that they're looking,
link |
00:29:10.800
they can sort of predict if you're gonna have depression
link |
00:29:13.320
based on your selfies by detecting micro muscles
link |
00:29:16.080
in your face, you know, all that kind of stuff,
link |
00:29:18.240
that is a nightmare, not okay.
link |
00:29:20.040
But I think, you know, we have to delineate
link |
00:29:22.360
what's a real threat and what's getting spam
link |
00:29:25.160
in your email box.
link |
00:29:26.000
That's not what to spend your time and energy on.
link |
00:29:28.600
Focus on the fact that every time you buy cigarettes,
link |
00:29:31.080
your insurance is going up without you knowing about it.
link |
00:29:35.240
On the topic of animals too,
link |
00:29:36.920
can we just linger on it a little bit?
link |
00:29:38.360
Like, what do you think,
link |
00:29:41.320
what does this say about our society,
link |
00:29:43.360
the society wide abuse of animals
link |
00:29:46.000
that we see in general, sort of factory farming,
link |
00:29:48.640
just in general, just the way we treat animals
link |
00:29:50.640
of different categories, like what do you think of that?
link |
00:29:57.440
What does a better world look like?
link |
00:29:59.680
What should people think about it in general?
link |
00:30:03.640
I think the most interesting thing
link |
00:30:06.480
I can probably say around this that's the least emotional,
link |
00:30:09.520
cause I'm actually a very non emotional animal person
link |
00:30:11.880
because it's, I think everyone's an animal person.
link |
00:30:14.080
It's just a matter of if it's yours
link |
00:30:15.880
or if you've been conditioned to go numb, you know.
link |
00:30:19.600
I think it's really a testament to what as a species
link |
00:30:22.280
we are able to be in denial about,
link |
00:30:24.560
mass denial and mass delusion,
link |
00:30:26.280
and how we're able to dehumanize and debase groups,
link |
00:30:31.640
you know, World War II,
link |
00:30:34.320
in a way in order to conform
link |
00:30:36.800
and find protection in the conforming.
link |
00:30:38.880
So we are also a species who used to go to coliseums
link |
00:30:43.840
and watch elephants and tigers fight to the death.
link |
00:30:47.520
We used to watch human beings be pulled apart
link |
00:30:50.280
and that wasn't that long ago.
link |
00:30:53.080
We're also a species who had slaves
link |
00:30:56.880
and it was socially acceptable by a lot of people.
link |
00:30:59.040
People didn't see anything wrong with it.
link |
00:31:00.160
So we're a species that is able to go numb
link |
00:31:02.680
and that is able to dehumanize very quickly
link |
00:31:05.960
and make it the norm.
link |
00:31:08.120
Child labor wasn't that long ago.
link |
00:31:10.800
The idea that now we look back and go,
link |
00:31:12.720
oh yeah, kids were losing fingers in factories making shoes.
link |
00:31:17.200
Like someone had to come in and make that, you know.
link |
00:31:20.160
So I think it just says a lot about the fact that,
link |
00:31:23.200
you know, we are animals and we are self serving
link |
00:31:25.320
and one of the most successful,
link |
00:31:27.280
the most successful species
link |
00:31:29.200
because we are able to debase and degrade
link |
00:31:33.160
and essentially exploit anything that benefits us.
link |
00:31:36.840
I think the pendulum is gonna swing, albeit late.
link |
00:31:39.880
Which way?
link |
00:31:40.720
Like, I think we're Rome now, kind of.
link |
00:31:42.800
I think we're on the verge of collapse
link |
00:31:44.960
because we are dopamine receptors.
link |
00:31:47.240
Like we are just, I think we're all kind of addicts
link |
00:31:49.560
when it comes to this stuff.
link |
00:31:50.520
Like we don't know when to stop.
link |
00:31:53.360
It's always the buffet.
link |
00:31:54.480
Like we're, the thing that used to keep us alive,
link |
00:31:56.600
which is killing animals and eating them,
link |
00:31:58.360
now killing animals and eating them
link |
00:31:59.720
is what's killing us in a way.
link |
00:32:01.200
So it's like, we just can't,
link |
00:32:02.840
we don't know when to call it and we don't,
link |
00:32:04.920
moderation is not really something
link |
00:32:06.560
that humans have evolved to have yet.
link |
00:32:10.040
So I think it's really just a flaw in our wiring.
link |
00:32:13.640
Do you think we'll look back at this time
link |
00:32:15.280
as our society is being deeply unethical?
link |
00:32:19.440
Yeah, yeah, I think we'll be embarrassed.
link |
00:32:22.280
Which are the worst parts right now going on?
link |
00:32:24.880
Is it? In terms of animal?
link |
00:32:26.160
Well, I think. No, in terms of anything.
link |
00:32:27.800
What's the unethical thing?
link |
00:32:29.160
If we, and it's very hard just to take a step out of it,
link |
00:32:32.040
but you just said we used to watch, you know,
link |
00:32:37.360
there's been a lot of cruelty throughout history.
link |
00:32:40.440
What's the cruelty going on now?
link |
00:32:42.160
I think it's gonna be pigs.
link |
00:32:44.200
I think it's gonna be, I mean,
link |
00:32:45.520
pigs are one of the most emotionally intelligent animals
link |
00:32:48.680
and they have the intelligence of like a three year old.
link |
00:32:51.680
And I think we'll look back and be really,
link |
00:32:54.320
they use tools.
link |
00:32:55.160
I mean, I think we have this narrative
link |
00:32:58.440
that they're pigs and they're pigs
link |
00:32:59.760
and they're disgusting and they're dirty
link |
00:33:01.880
and their bacon is so good.
link |
00:33:02.880
I think that we'll look back one day
link |
00:33:04.240
and be really embarrassed about that.
link |
00:33:06.640
Is this for just the, what's it called?
link |
00:33:09.280
The factory farming?
link |
00:33:10.360
So basically mass.
link |
00:33:11.680
Because we don't see it.
link |
00:33:12.520
If you saw, I mean, we do have,
link |
00:33:14.520
I mean, this is probably an evolutionary advantage.
link |
00:33:17.600
We do have the ability to completely
link |
00:33:20.400
pretend something's not happening,
link |
00:33:21.520
something that is so horrific that it overwhelms us
link |
00:33:24.040
and we're able to essentially deny that it's happening.
link |
00:33:27.560
I think if people were to see what goes on
link |
00:33:29.280
in factory farming,
link |
00:33:30.480
and also were really to take in how bad it is for us,
link |
00:33:35.360
you know, we're hurting ourselves first and foremost
link |
00:33:37.160
with what we eat,
link |
00:33:38.440
but that's also a very elitist argument, you know?
link |
00:33:41.280
It's a luxury to be able to complain about meat.
link |
00:33:44.600
It's a luxury to be able to not eat meat, you know?
link |
00:33:47.160
There's very few people because of, you know,
link |
00:33:49.960
how the corporations have set up meat being cheap.
link |
00:33:53.360
You know, it's $2 to buy a Big Mac,
link |
00:33:55.320
it's $10 to buy a healthy meal.
link |
00:33:57.640
You know, that's, I think a lot of people
link |
00:34:00.000
don't have the luxury to even think that way.
link |
00:34:02.280
But I do think that animals in captivity,
link |
00:34:04.240
I think we're gonna look back
link |
00:34:05.080
and be pretty grossed out about mammals in captivity,
link |
00:34:07.880
whales, dolphins.
link |
00:34:08.800
I mean, that's already starting to be dismantled, circuses,
link |
00:34:12.240
we're gonna be pretty embarrassed about.
link |
00:34:14.020
But I think it's really more a testament to,
link |
00:34:17.160
you know, there's just such an ability to go like,
link |
00:34:22.080
that thing is different than me and we're better.
link |
00:34:25.560
It's the ego, I mean, it's just,
link |
00:34:26.800
we have the species with the biggest ego ultimately.
link |
00:34:29.200
Well, that's what I think,
link |
00:34:30.720
that's my hope for robots is they'll,
link |
00:34:32.520
you mentioned consciousness before,
link |
00:34:34.240
nobody knows what consciousness is,
link |
00:34:37.640
but I'm hoping robots will help us empathize
link |
00:34:42.200
and understand that there's other creatures
link |
00:34:47.840
besides ourselves that can suffer,
link |
00:34:50.320
that can experience the world
link |
00:34:54.800
and that we can torture by our actions.
link |
00:34:57.680
And robots can explicitly teach us that,
link |
00:34:59.880
I think better than animals can.
link |
00:35:01.480
I have never seen such compassion
link |
00:35:06.480
from a lot of people in my life
link |
00:35:10.840
toward any human, animal, child,
link |
00:35:13.640
as I have from a lot of people
link |
00:35:15.000
in the way they interact with the robot.
link |
00:35:16.600
Because I think there's something of,
link |
00:35:19.760
I mean, I was on the robot owners' chat boards
link |
00:35:23.520
for a good eight months.
link |
00:35:25.920
And the main emotional benefit is
link |
00:35:28.120
she's never gonna cheat on you,
link |
00:35:30.360
she's never gonna hurt you,
link |
00:35:31.920
she's never gonna lie to you,
link |
00:35:33.120
she doesn't judge you.
link |
00:35:34.760
I think that robots help people,
link |
00:35:38.480
and this is part of the work I do with animals,
link |
00:35:40.640
like I do equine therapy and train dogs and stuff,
link |
00:35:42.760
because there is this safe space to be authentic.
link |
00:35:46.280
With this being that doesn't care
link |
00:35:47.960
what you do for a living,
link |
00:35:48.800
doesn't care how much money you have,
link |
00:35:50.160
doesn't care who you're dating,
link |
00:35:51.280
doesn't care what you look like,
link |
00:35:52.200
doesn't care if you have cellulite, whatever,
link |
00:35:54.320
you feel safe to be able to truly be present
link |
00:35:57.720
without being defensive and worrying about eye contact
link |
00:35:59.920
and being triggered by needing to be perfect
link |
00:36:02.680
and fear of judgment and all that.
link |
00:36:04.840
And robots really can't judge you yet,
link |
00:36:08.240
but they can't judge you,
link |
00:36:09.320
and I think it really puts people at ease
link |
00:36:13.480
and at their most authentic.
link |
00:36:16.360
Do you think you can have a deep connection
link |
00:36:18.720
with a robot that's not judging,
link |
00:36:21.920
or do you think you can really have a relationship
link |
00:36:25.440
with a robot or a human being that's a safe space?
link |
00:36:30.000
Or is attention, mystery, danger
link |
00:36:33.480
necessary for a deep connection?
link |
00:36:35.960
I'm gonna speak for myself and say that
link |
00:36:38.600
I grew up in an alcoholic home,
link |
00:36:40.120
I identify as a codependent,
link |
00:36:41.440
talked about this stuff before,
link |
00:36:43.280
but for me it's very hard to be in a relationship
link |
00:36:45.360
with a human being without feeling like
link |
00:36:47.600
I need to perform in some way or deliver in some way,
link |
00:36:50.760
and I don't know if that's just the people
link |
00:36:51.920
I've been in a relationship with or me or my brokenness,
link |
00:36:56.520
but I do think, this is gonna sound really
link |
00:37:01.960
negative and pessimistic,
link |
00:37:04.160
but I do think a lot of our relationships are projection
link |
00:37:07.200
and a lot of our relationships are performance,
link |
00:37:09.600
and I don't think I really understood that
link |
00:37:12.280
until I worked with horses.
link |
00:37:15.280
And most communication with humans is nonverbal, right?
link |
00:37:18.080
I can say like, I love you,
link |
00:37:19.920
but you don't think I love you, right?
link |
00:37:22.000
Whereas with animals it's very direct.
link |
00:37:24.280
It's all physical, it's all energy.
link |
00:37:26.840
I feel like that with robots too.
link |
00:37:28.520
It feels very,
link |
00:37:32.760
how I say something doesn't matter.
link |
00:37:35.280
My inflection doesn't really matter.
link |
00:37:36.920
And you thinking that my tone is disrespectful,
link |
00:37:40.280
like you're not filtering it through all
link |
00:37:42.160
of the bad relationships you've been in,
link |
00:37:43.800
you're not filtering it through
link |
00:37:44.840
the way your mom talked to you,
link |
00:37:45.880
you're not getting triggered.
link |
00:37:47.760
I find that for the most part,
link |
00:37:49.400
people don't always receive things
link |
00:37:51.000
the way that you intend them to or the way you intended,
link |
00:37:53.680
and that makes relationships really murky.
link |
00:37:56.120
So the relationships with animals
link |
00:37:57.480
and relationships with the robots as they are now,
link |
00:38:00.680
you kind of implied that that's more healthy.
link |
00:38:05.240
Can you have a healthy relationship with other humans?
link |
00:38:08.080
Or not healthy, I don't like that word,
link |
00:38:10.120
but shouldn't it be, you've talked about codependency,
link |
00:38:14.440
maybe you can talk about what is codependency,
link |
00:38:16.640
but is that, are the challenges of that,
link |
00:38:21.640
the complexity of that necessary for passion,
link |
00:38:24.520
for love between humans?
link |
00:38:27.160
That's right, you love passion.
link |
00:38:29.360
That's a good thing.
link |
00:38:31.880
I thought this would be a safe space.
link |
00:38:33.840
I got trolled by Rogan for hours on this.
link |
00:38:39.920
Look, I am not anti passion.
link |
00:38:42.560
I think that I've just maybe been around long enough
link |
00:38:45.240
to know that sometimes it's ephemeral
link |
00:38:48.200
and that passion is a mixture of a lot of different things,
link |
00:38:55.360
adrenaline, which turns into dopamine, cortisol,
link |
00:38:57.640
it's a lot of neurochemicals, it's a lot of projection,
link |
00:39:01.160
it's a lot of what we've seen in movies,
link |
00:39:03.200
it's a lot of, you know, I identify as an addict.
link |
00:39:06.160
So for me, sometimes passion is like,
link |
00:39:08.520
uh oh, this could be bad.
link |
00:39:10.120
And I think we've been so conditioned to believe
link |
00:39:11.520
that passion means like you're soulmates,
link |
00:39:13.080
and I mean, how many times have you had
link |
00:39:14.280
a passionate connection with someone
link |
00:39:15.600
and then it was a total train wreck?
link |
00:39:18.920
The train wreck is interesting.
link |
00:39:19.760
How many times exactly?
link |
00:39:21.080
Exactly.
link |
00:39:21.920
What's a train wreck?
link |
00:39:22.760
You just did a lot of math in your head
link |
00:39:24.400
in that little moment.
link |
00:39:25.320
Counting.
link |
00:39:26.480
I mean, what's a train wreck?
link |
00:39:28.560
What's a, why is obsession,
link |
00:39:31.480
so you described this codependency
link |
00:39:33.600
and sort of the idea of attachment,
link |
00:39:37.400
over attachment to people who don't deserve
link |
00:39:40.240
that kind of attachment as somehow a bad thing
link |
00:39:45.040
and I think our society says it's a bad thing.
link |
00:39:47.720
It probably is a bad thing.
link |
00:39:49.600
Like a delicious burger is a bad thing.
link |
00:39:52.560
I don't know, but.
link |
00:39:53.400
Right, oh, that's a good point.
link |
00:39:54.280
I think that you're pointing out something really fascinating
link |
00:39:56.120
which is like passion, if you go into it knowing
link |
00:39:59.320
this is like pizza where it's gonna be delicious
link |
00:40:01.160
for two hours and then I don't have to have it again
link |
00:40:03.040
for three, if you can have a choice in the passion,
link |
00:40:06.360
I define passion as something that is relatively unmanageable
link |
00:40:09.520
and something you can't control or stop and start
link |
00:40:12.200
with your own volition.
link |
00:40:13.720
So maybe we're operating under different definitions.
link |
00:40:16.320
If passion is something that like, you know,
link |
00:40:18.840
ruins your marriages and screws up
link |
00:40:22.160
your professional life and becomes this thing
link |
00:40:24.280
that you're not in control of and becomes addictive,
link |
00:40:28.600
I think that's the difference is,
link |
00:40:30.560
is it a choice or is it not a choice?
link |
00:40:32.560
And if it is a choice, then passion's great.
link |
00:40:35.120
But if it's something that like consumes you
link |
00:40:37.360
and makes you start making bad decisions
link |
00:40:39.360
and clouds your frontal lobe
link |
00:40:41.160
and is just all about dopamine
link |
00:40:44.040
and not really about the person
link |
00:40:46.160
and more about the neurochemical,
link |
00:40:47.800
we call it sort of the drug, the internal drug cabinet.
link |
00:40:50.760
If it's all just, you're on drugs, that's different,
link |
00:40:52.960
you know, cause sometimes you're just on drugs.
link |
00:40:54.960
Okay, so there's a philosophical question here.
link |
00:40:58.440
So would you rather, and it's interesting for a comedian,
link |
00:41:03.320
a brilliant comedian, to speak so eloquently
link |
00:41:07.520
about a balanced life.
link |
00:41:09.480
I kind of argue against this point.
link |
00:41:12.040
There's such an obsession of creating
link |
00:41:13.520
this healthy lifestyle now, psychologically speaking.
link |
00:41:18.080
You know, I'm a fan of the idea that you sort of fly high
link |
00:41:22.040
and you crash and die at 27, that's also a possible life.
link |
00:41:26.480
And it's not one we should judge
link |
00:41:27.960
because I think there's moments of greatness.
link |
00:41:30.680
I talked to Olympic athletes
link |
00:41:32.120
where some of their greatest moments
link |
00:41:34.280
are achieved in their early 20s.
link |
00:41:36.560
And the rest of their life is in the kind of fog
link |
00:41:39.840
of almost a depression because they can never.
link |
00:41:41.920
Because they're based on their physical prowess, right?
link |
00:41:44.240
Physical prowess and they'll never,
link |
00:41:46.400
so that, so they're watching their physical prowess fade
link |
00:41:50.200
and they'll never achieve the kind of height,
link |
00:41:54.680
not just physical, of just emotion, of.
link |
00:41:58.240
Well, the max number of neurochemicals.
link |
00:42:01.760
And you also put your money on the wrong horse.
link |
00:42:04.680
That's where I would just go like,
link |
00:42:06.360
oh yeah, if you're doing a job where you peak at 22,
link |
00:42:10.120
the rest of your life is gonna be hard.
link |
00:42:12.320
That idea is considering the notion
link |
00:42:15.120
that you wanna optimize some kind of,
link |
00:42:17.480
but we're all gonna die soon.
link |
00:42:19.280
What?
link |
00:42:21.840
Now you tell me.
link |
00:42:23.280
I've immortalized myself, so I'm gonna be fine.
link |
00:42:26.800
See, you're almost like,
link |
00:42:28.240
how many Oscar winning movies can I direct
link |
00:42:32.160
by the time I'm 100?
link |
00:42:34.040
How many this and that?
link |
00:42:35.800
But you know, there's a night, you know,
link |
00:42:38.080
it's all, life is short, relatively speaking.
link |
00:42:41.200
I know, but it can also come in different ways.
link |
00:42:42.600
You go, life is short, play hard,
link |
00:42:45.120
fall in love as much as you can, run into walls.
link |
00:42:47.640
I would also go, life is short,
link |
00:42:49.400
don't deplete yourself on things that aren't sustainable
link |
00:42:53.760
and that you can't keep, you know?
link |
00:42:56.560
So I think everyone gets dopamine from different places.
link |
00:42:59.720
Everyone has meaning from different places.
link |
00:43:01.720
I look at the fleeting passionate relationships
link |
00:43:04.520
I've had in the past and I don't like,
link |
00:43:06.800
I don't have pride in that.
link |
00:43:07.880
I think that you have to decide what, you know,
link |
00:43:10.400
helps you sleep at night.
link |
00:43:11.240
For me, it's pride and feeling like I behave
link |
00:43:13.480
with grace and integrity.
link |
00:43:14.480
That's just me personally.
link |
00:43:16.040
Everyone can go like, yeah,
link |
00:43:17.640
I slept with all the hot chicks in Italy I could
link |
00:43:20.960
and I, you know, did all the whatever,
link |
00:43:23.440
like whatever you value,
link |
00:43:25.040
we're allowed to value different things.
link |
00:43:26.560
Yeah, we're talking about Brian Callan.
link |
00:43:28.040
Brian Callan has lived his life to the fullest,
link |
00:43:32.560
to say the least.
link |
00:43:33.560
But I think that it's just for me personally,
link |
00:43:36.360
I, and this could be like my workaholism
link |
00:43:38.800
or my achievementism,
link |
00:43:41.520
I, if I don't have something to show for something,
link |
00:43:45.200
I feel like it's a waste of time or some kind of loss.
link |
00:43:50.240
I'm in a 12 step program and the third step would say,
link |
00:43:52.560
there's no such thing as waste of time
link |
00:43:54.080
and everything happens exactly as it should
link |
00:43:56.800
and whatever, that's a way to just sort of keep us sane
link |
00:43:59.440
so we don't grieve too much and beat ourselves up
link |
00:44:01.800
over past mistakes, there's no such thing as mistakes,
link |
00:44:04.600
dah, dah, dah.
link |
00:44:05.720
But I think passion is, I think it's so life affirming
link |
00:44:10.600
and one of the few things that, maybe for people like us,
link |
00:44:13.000
makes us feel awake and seen
link |
00:44:14.840
and we just have such a high threshold for adrenaline.
link |
00:44:20.480
You know, I mean, you are a fighter, right?
link |
00:44:22.720
Yeah, okay, so yeah,
link |
00:44:24.000
so you have a very high tolerance for adrenaline
link |
00:44:28.680
and I think that Olympic athletes,
link |
00:44:30.360
the amount of adrenaline they get from performing,
link |
00:44:33.600
it's very hard to follow that.
link |
00:44:34.680
It's like when guys come back from the military
link |
00:44:36.480
and they have depression.
link |
00:44:38.080
It's like, do you miss bullets flying at you?
link |
00:44:40.600
Yeah, kind of because of that adrenaline
link |
00:44:42.760
which turned into dopamine and the camaraderie.
link |
00:44:45.040
I mean, there's people that speak much better
link |
00:44:46.480
about this than I do.
link |
00:44:48.360
But I just, I'm obsessed with neurology
link |
00:44:50.680
and I'm just obsessed with sort of the lies we tell ourselves
link |
00:44:54.080
in order to justify getting neurochemicals.
link |
00:44:57.040
You've actually done quite a lot of thinking
link |
00:45:00.320
and talking about neurology
link |
00:45:01.880
and just kind of looked at human behavior
link |
00:45:04.160
through the lens of how, actually,
link |
00:45:07.800
chemically, our brain works.
link |
00:45:09.120
So what, first of all,
link |
00:45:10.880
why did you connect with that idea and what have you,
link |
00:45:15.360
how has your view of the world changed
link |
00:45:17.520
by considering that the brain is just a machine?
link |
00:45:22.400
You know, I know it probably sounds really nihilistic
link |
00:45:24.520
but for me, it's very liberating to know a lot
link |
00:45:27.480
about neurochemicals because you don't have to,
link |
00:45:30.040
it's like the same thing with like critics,
link |
00:45:32.480
like critical reviews.
link |
00:45:33.880
If you believe the good,
link |
00:45:34.720
you have to believe the bad kind of thing.
link |
00:45:36.080
Like, you know, if you believe that your bad choices
link |
00:45:38.880
were because of your moral integrity or whatever,
link |
00:45:43.720
you have to believe your good ones.
link |
00:45:44.720
I just think there's something really liberating
link |
00:45:46.280
in going like, oh, that was just adrenaline.
link |
00:45:48.040
I just said that thing
link |
00:45:48.880
because I was adrenalized and I was scared
link |
00:45:50.680
and my amygdala was activated
link |
00:45:52.080
and that's why I said you're an asshole and get out.
link |
00:45:54.360
And that's, you know, I think,
link |
00:45:55.880
I just think it's important to delineate what's nature
link |
00:45:57.960
and what's nurture, what is your choice
link |
00:45:59.840
and what is just your brain trying to keep you safe.
link |
00:46:02.000
I think we forget that even though we have security systems
link |
00:46:04.520
in our homes and locks on our doors,
link |
00:46:06.360
that our brain for the most part
link |
00:46:07.520
is just trying to keep us safe all the time.
link |
00:46:09.200
It's why we hold grudges, it's why we get angry,
link |
00:46:11.320
it's why we get road rage, it's why we do a lot of things.
link |
00:46:14.720
And it's also, when I started learning about neurology,
link |
00:46:17.240
I started having so much more compassion for other people.
link |
00:46:19.640
You know, if someone yelled at me being like,
link |
00:46:21.520
fuck you on the road, I'd be like,
link |
00:46:22.920
okay, he's producing adrenaline right now
link |
00:46:24.600
because we're all going 65 miles an hour
link |
00:46:27.680
and our brains aren't really designed
link |
00:46:30.200
for this type of stress and he's scared.
link |
00:46:33.280
He was scared, you know, so that really helped me
link |
00:46:35.320
to have more love for people in my everyday life
link |
00:46:38.360
instead of being in fight or flight mode.
link |
00:46:41.000
But the, I think more interesting answer to your question
link |
00:46:44.160
is that I've had migraines my whole life.
link |
00:46:45.720
Like I've suffered with really intense migraines,
link |
00:46:49.000
ocular migraines, ones where my arm would go numb
link |
00:46:52.360
and I just started having to go to so many doctors
link |
00:46:55.000
to learn about it and I started, you know,
link |
00:46:58.360
learning that we don't really know that much.
link |
00:47:00.680
We know a lot, but it's wild to go into
link |
00:47:03.520
one of the best neurologists in the world
link |
00:47:04.840
who's like, yeah, we don't know.
link |
00:47:05.680
We don't know. We don't know.
link |
00:47:07.320
And that fascinated me.
link |
00:47:08.160
Except it's one of the worst pains you can probably have,
link |
00:47:10.720
all that stuff, and we don't know the source.
link |
00:47:13.280
We don't know the source
link |
00:47:14.320
and there is something really fascinating
link |
00:47:16.400
about when your left arm starts going numb
link |
00:47:19.480
and you start not being able to see
link |
00:47:21.000
out of the left side of both your eyes.
link |
00:47:22.840
And I remember when the migraines get really bad,
link |
00:47:25.360
it's like a mini stroke almost
link |
00:47:26.880
and you're able to see words on a page,
link |
00:47:29.880
but I can't read them.
link |
00:47:31.240
They just look like symbols to me.
link |
00:47:33.080
So there's something just really fascinating to me
link |
00:47:35.000
about your brain just being able to stop functioning.
link |
00:47:38.280
And I, so I just wanted to learn about it, study about it.
link |
00:47:41.640
I did all these weird alternative treatments.
link |
00:47:43.360
I got this piercing in here that actually works.
link |
00:47:45.880
I've tried everything.
link |
00:47:47.000
And then both of my parents had strokes.
link |
00:47:49.200
So when both of my parents had strokes,
link |
00:47:51.040
I became sort of the person who had to decide
link |
00:47:54.160
what was gonna happen with their recovery,
link |
00:47:56.640
which is just a wild thing to have to deal with.
link |
00:47:59.120
You know, I was 28 years old when it happened.
link |
00:48:02.120
And I started spending basically all day, every day in ICUs
link |
00:48:05.840
with neurologists learning about what happened
link |
00:48:08.200
to my dad's brain and why he can't move his left arm,
link |
00:48:11.120
but he can move his right leg,
link |
00:48:12.520
but he can't see out of the, you know.
link |
00:48:14.200
And then my mom had another stroke
link |
00:48:17.000
in a different part of the brain.
link |
00:48:18.040
So I started having to learn
link |
00:48:19.640
what parts of the brain did what,
link |
00:48:21.480
and so that I wouldn't take their behavior so personally,
link |
00:48:23.840
and so that I would be able to manage my expectations
link |
00:48:25.960
in terms of their recovery.
link |
00:48:27.400
So my mom, because it affected a lot of her frontal lobe,
link |
00:48:31.240
changed a lot as a person.
link |
00:48:33.080
She was way more emotional.
link |
00:48:34.560
She was way more micromanaging.
link |
00:48:35.760
She was forgetting certain things.
link |
00:48:36.960
So it broke my heart less when I was able to know,
link |
00:48:40.320
oh yeah, well, the stroke hit this part of the brain,
link |
00:48:42.080
and that's the one that's responsible for short term memory,
link |
00:48:44.200
and that's responsible for long term memory, da da da.
link |
00:48:46.840
And then my brother just got something
link |
00:48:48.680
called viral encephalitis,
link |
00:48:50.520
which is an infection inside the brain.
link |
00:48:53.280
So it was kind of wild that I was able to go,
link |
00:48:56.280
oh, I know exactly what's happening here,
link |
00:48:57.640
and I know, you know, so.
link |
00:48:59.760
So that allows you to have some more compassion
link |
00:49:02.440
for the struggles that people have,
link |
00:49:04.440
but does it take away some of the magic
link |
00:49:06.560
from
link |
00:49:08.800
some of the more positive experiences of life?
link |
00:49:10.920
Sometimes.
link |
00:49:11.920
Sometimes, and I don't, I'm such a control addict
link |
00:49:15.320
that, you know, I think our biggest,
link |
00:49:18.120
someone like me,
link |
00:49:19.360
my biggest dream is to know why someone's doing it.
link |
00:49:21.000
That's what standup is.
link |
00:49:22.240
It's just trying to figure out why,
link |
00:49:23.440
or that's what writing is.
link |
00:49:24.280
That's what acting is.
link |
00:49:25.120
That's what performing is.
link |
00:49:25.960
It's trying to figure out why someone would do something.
link |
00:49:27.440
As an actor, you get a piece of, you know, material,
link |
00:49:30.040
and you go, this person, why would he say that?
link |
00:49:32.120
Why would he, she pick up that cup?
link |
00:49:33.760
Why would she walk over here?
link |
00:49:35.040
It's really why, why, why, why.
link |
00:49:36.560
So I think neurology is,
link |
00:49:38.000
if you're trying to figure out human motives
link |
00:49:40.320
and why people do what they do,
link |
00:49:41.520
it'd be crazy not to understand how neurochemicals motivate us.
link |
00:49:46.080
I also have a lot of addiction in my family
link |
00:49:48.080
and hardcore drug addiction and mental illness.
link |
00:49:51.480
And in order to cope with it,
link |
00:49:53.720
you really have to understand borderline personality
link |
00:49:55.640
disorder, schizophrenia, and drug addiction.
link |
00:49:58.360
So I have a lot of people I love
link |
00:50:00.640
that suffer from drug addiction and alcoholism.
link |
00:50:02.840
And the first thing they started teaching you
link |
00:50:04.760
is it's not a choice.
link |
00:50:05.880
These people's dopamine receptors
link |
00:50:07.360
don't hold dopamine the same way yours do.
link |
00:50:09.640
Their frontal lobe is underdeveloped, like, you know,
link |
00:50:13.200
and that really helped me to navigate dealing with,
link |
00:50:17.160
loving people that were addicted to substances.
link |
00:50:20.240
I want to be careful with this question, but how much?
link |
00:50:24.240
Money do you have?
link |
00:50:25.320
How much?
link |
00:50:26.160
Can I borrow $10?
link |
00:50:28.920
Okay, no, it's how much control,
link |
00:50:33.920
how much, despite the chemical imbalances
link |
00:50:39.760
or the biological limitations
link |
00:50:42.920
that each of our individual brains have,
link |
00:50:44.520
how much mind over matter is there?
link |
00:50:47.080
So, I've known people
link |
00:50:51.160
with clinical depression,
link |
00:50:53.200
and so it's always a touchy subject
link |
00:50:55.560
to say how much they can really help it.
link |
00:50:57.680
Very.
link |
00:50:59.680
What can you, yeah, what can you,
link |
00:51:01.680
because you've talked about codependency,
link |
00:51:03.520
you talked about issues that you struggle through,
link |
00:51:07.400
and nevertheless, you choose to take a journey
link |
00:51:09.880
of healing and so on, so that's your choice,
link |
00:51:12.840
that's your actions.
link |
00:51:14.240
So how much can you do to help fight the limitations
link |
00:51:17.720
of the neurochemicals in your brain?
link |
00:51:20.040
That's such an interesting question,
link |
00:51:21.800
and I don't think I'm at all qualified to answer,
link |
00:51:23.440
but I'll say what I do know.
link |
00:51:25.560
And really quick, just the definition of codependency,
link |
00:51:28.200
I think a lot of people think of codependency
link |
00:51:29.920
as like two people that can't stop hanging out, you know,
link |
00:51:33.120
or like, you know, that's not totally off,
link |
00:51:36.640
but I think for the most part,
link |
00:51:38.280
my favorite definition of codependency
link |
00:51:39.960
is the inability to tolerate the discomfort of others.
link |
00:51:42.880
You grow up in an alcoholic home,
link |
00:51:44.040
you grow up around mental illness,
link |
00:51:45.200
you grow up in chaos,
link |
00:51:46.480
you have a parent that's a narcissist,
link |
00:51:48.280
you basically are wired to just people please,
link |
00:51:51.520
worry about others, be perfect, walk on eggshells,
link |
00:51:54.720
shape shift to accommodate other people.
link |
00:51:56.680
So codependence is a very active wiring issue
link |
00:52:01.680
that, you know, doesn't just affect
link |
00:52:04.400
your romantic relationships, it affects you being a boss,
link |
00:52:06.920
it affects you in the world.
link |
00:52:09.480
Online, you know, you get one negative comment
link |
00:52:11.920
and it throws you for two weeks.
link |
00:52:14.080
You know, it also is linked to eating disorders
link |
00:52:16.080
and other kinds of addiction.
link |
00:52:17.040
So it's a very big thing,
link |
00:52:20.120
and I think a lot of people sometimes only think
link |
00:52:21.920
that it's in a romantic relationship,
link |
00:52:23.400
so I always feel the need to say that.
link |
00:52:25.840
And also one of the reasons I love the idea of robots
link |
00:52:28.000
so much because you don't have to walk on eggshells
link |
00:52:30.320
around them, you don't have to worry
link |
00:52:31.480
they're gonna get mad at you yet,
link |
00:52:33.240
but there's no, codependents are hypersensitive
link |
00:52:36.840
to the needs and moods of others,
link |
00:52:39.480
and it's very exhausting, it's depleting.
link |
00:52:42.080
Just one conversation about where we're gonna go to dinner
link |
00:52:45.360
is like, do you wanna go get Chinese food?
link |
00:52:47.200
We just had Chinese food.
link |
00:52:48.360
Well, wait, are you mad?
link |
00:52:50.080
Well, no, I didn't mean to,
link |
00:52:50.920
and it's just like that codependents live in this,
link |
00:52:54.880
everything means something,
link |
00:52:56.560
and humans can be very emotionally exhausting.
link |
00:53:00.080
Why did you look at me that way?
link |
00:53:01.120
What are you thinking about?
link |
00:53:01.960
What was that?
link |
00:53:02.800
Why'd you check your phone?
link |
00:53:03.640
It's a hypersensitivity that can be
link |
00:53:06.320
incredibly time consuming,
link |
00:53:07.880
which is why I love the idea of robots just subbing in.
link |
00:53:10.720
Even, I've had a hard time running TV shows and stuff
link |
00:53:13.800
because even asking someone to do something,
link |
00:53:15.320
I don't wanna come off like a bitch,
link |
00:53:16.520
I'm very concerned about what other people think of me,
link |
00:53:18.640
how I'm perceived, which is why I think robots
link |
00:53:21.600
will be very beneficial for codependents.
link |
00:53:23.840
By the way, just a real quick tangent,
link |
00:53:25.600
that skill or flaw, whatever you wanna call it,
link |
00:53:29.120
is actually really useful for if you ever do
link |
00:53:32.080
start your own podcast for interviewing,
link |
00:53:34.600
because you're now kind of obsessed
link |
00:53:36.440
with the mindset of others,
link |
00:53:39.200
and it makes you a good sort of listener and talker.
link |
00:53:43.520
So I think, what's her name from NPR?
link |
00:53:48.240
Terry Gross.
link |
00:53:49.080
Terry Gross talked about having that.
link |
00:53:50.920
So.
link |
00:53:51.760
I don't feel like she has that at all.
link |
00:53:53.440
What?
link |
00:53:54.280
She worries about other people's feelings?
link |
00:53:56.800
Yeah, absolutely.
link |
00:53:57.640
Oh, I don't get that at all.
link |
00:53:59.560
I mean, you have to put yourself in the mind
link |
00:54:01.160
of the person you're speaking with.
link |
00:54:03.320
Oh, I see, just in terms of, yeah,
link |
00:54:05.120
I am starting a podcast,
link |
00:54:06.120
and the reason I haven't is because I'm codependent
link |
00:54:08.320
and I'm too worried it's not gonna be perfect.
link |
00:54:10.320
So a big codependent adage is perfectionism
link |
00:54:14.240
leads to procrastination, which leads to paralysis.
link |
00:54:16.360
So how do you, sorry to take a million tangents,
link |
00:54:18.320
how do you survive on social media?
link |
00:54:25.160
But by the way, I took you on a tangent
link |
00:54:26.560
and didn't answer your last question
link |
00:54:27.800
about how much we can control.
link |
00:54:29.920
How much, yeah, we'll return to it, or maybe not.
link |
00:54:32.920
The answer is we can't.
link |
00:54:33.760
Now as a codependent, I'm, okay, good.
link |
00:54:36.240
We can, but, but, you know,
link |
00:54:38.160
one of the things that I'm fascinated by is,
link |
00:54:39.880
you know, the first thing you learn
link |
00:54:40.960
when you go into 12 step programs or addiction recovery
link |
00:54:43.520
or any of this is, you know,
link |
00:54:45.200
genetics loads the gun, environment pulls the trigger.
link |
00:54:47.840
And there's certain parts of your genetics
link |
00:54:50.520
you cannot control.
link |
00:54:51.560
I come from a lot of alcoholism.
link |
00:54:54.240
I come from, you know, a lot of mental illness.
link |
00:54:59.880
There's certain things I cannot control
link |
00:55:01.680
and a lot of things that maybe we don't even know yet
link |
00:55:04.000
what we can and can't
link |
00:55:04.840
because of how little we actually know about the brain.
link |
00:55:06.720
But we also talk about the warrior spirit.
link |
00:55:08.600
And there are some people that have that warrior spirit
link |
00:55:12.080
and we don't necessarily know what that engine is,
link |
00:55:15.280
whether it's you get dopamine from succeeding
link |
00:55:18.000
or achieving or martyring yourself
link |
00:55:21.160
or the attention you get from growing.
link |
00:55:24.920
So a lot of people are like,
link |
00:55:25.800
oh, this person can edify themselves and overcome,
link |
00:55:29.080
but if you're getting attention from improving yourself,
link |
00:55:32.280
you're gonna keep wanting to do that.
link |
00:55:34.520
So that is something that helps a lot,
link |
00:55:37.200
in terms of changing your brain.
link |
00:55:38.560
If you talk about changing your brain to people
link |
00:55:40.400
and talk about what you're doing to overcome said obstacles,
link |
00:55:42.880
you're gonna get more attention from them,
link |
00:55:44.560
which is gonna fire off your reward system
link |
00:55:46.800
and then you're gonna keep doing it.
link |
00:55:48.400
Yeah, so you can leverage that momentum.
link |
00:55:50.280
So this is why in any 12 step program,
link |
00:55:52.640
you go into a room and you talk about your progress
link |
00:55:55.120
because then everyone claps for you.
link |
00:55:57.080
And then you're more motivated to keep going.
link |
00:55:58.760
So that's why we say you're only as sick
link |
00:56:00.160
as the secrets you keep,
link |
00:56:01.200
because if you keep things secret,
link |
00:56:03.640
there's no one guiding you to go in a certain direction.
link |
00:56:06.080
It's based on, right?
link |
00:56:07.080
We're sort of designed to get approval from the tribe
link |
00:56:10.360
or from a group of people
link |
00:56:11.600
because our brain translates it to safety.
link |
00:56:14.600
So, you know.
link |
00:56:15.440
And in that case, the tribe is a positive one
link |
00:56:17.640
that helps you go in a positive direction.
link |
00:56:19.520
So that's why it's so important to go into a room
link |
00:56:21.240
and also say, hey, I wanted to use drugs today.
link |
00:56:25.080
And people go, hmm.
link |
00:56:26.440
They go, me too.
link |
00:56:27.440
And you feel less alone
link |
00:56:28.600
and you feel less like you're, you know,
link |
00:56:30.400
have been castigated from the pack or whatever.
link |
00:56:32.760
And then you say, and you get a chip
link |
00:56:35.080
when you haven't drank for 30 days or 60 days or whatever.
link |
00:56:37.600
You get little rewards.
link |
00:56:38.640
So talking about a pack that's not at all healthy or good,
link |
00:56:43.240
but in fact is often toxic, social media.
link |
00:56:46.280
So you're one of my favorite people
link |
00:56:47.680
on Twitter and Instagram, sort of for both the comedy
link |
00:56:52.440
and the insight and just fun.
link |
00:56:54.440
How do you prevent social media
link |
00:56:55.760
from destroying your mental health?
link |
00:56:57.240
I haven't.
link |
00:56:59.560
I haven't.
link |
00:57:00.440
It's the next big epidemic, isn't it?
link |
00:57:05.720
I don't think I have.
link |
00:57:06.600
I don't think.
link |
00:57:08.680
Is moderation the answer?
link |
00:57:10.760
Maybe, but you can do a lot of damage in a moderate way.
link |
00:57:14.360
I mean, I guess, again, it depends on your goals, you know?
link |
00:57:17.120
And I think for me, the way that my addiction
link |
00:57:20.840
to social media, I'm happy to call it an addiction.
link |
00:57:23.080
I mean, and I define it as an addiction
link |
00:57:24.960
because it stops being a choice.
link |
00:57:26.280
There are times I just reach over and I'm like, that was.
link |
00:57:29.240
Yeah, that was weird.
link |
00:57:30.080
That was weird.
link |
00:57:31.320
I'll be driving sometimes and I'll be like, oh my God,
link |
00:57:33.640
my arm just went to my phone, you know?
link |
00:57:36.800
I can put it down.
link |
00:57:37.800
I can take time away from it, but when I do, I get antsy.
link |
00:57:41.400
I get restless, irritable, and discontent.
link |
00:57:43.400
I mean, that's kind of the definition, isn't it?
link |
00:57:45.880
So I think by no means do I have a healthy relationship
link |
00:57:49.840
with social media.
link |
00:57:50.680
I'm sure there's a way to,
link |
00:57:51.560
but I think I'm especially a weirdo in this space
link |
00:57:54.800
because it's easy to conflate.
link |
00:57:56.960
Is this work?
link |
00:57:58.080
Is this not?
link |
00:57:58.920
I can always say that it's for work, you know?
link |
00:58:01.960
But I mean, don't you get the same kind of thing
link |
00:58:04.160
as you get when a room full of people laughs at your jokes?
link |
00:58:08.080
Because I mean, I see, especially the way you do Twitter,
link |
00:58:11.200
it's an extension of your comedy in a way.
link |
00:58:13.680
So I took a big break from Twitter though,
link |
00:58:16.000
a really big break.
link |
00:58:16.840
I took like six months off or something for a while
link |
00:58:19.160
because it was just like,
link |
00:58:20.520
it seemed like it was all kind of politics
link |
00:58:22.280
and it was just a little bit,
link |
00:58:23.280
it wasn't giving me dopamine
link |
00:58:25.000
because there was like this weird, a lot of feedback.
link |
00:58:28.320
So I had to take a break from it and then go back to it
link |
00:58:30.760
because I felt like I didn't have a healthy relationship.
link |
00:58:33.520
Have you ever tried the, I don't know if I believe him,
link |
00:58:36.160
but Joe Rogan seems to not read comments.
link |
00:58:39.480
Have you, and he's one of the only people at the scale,
link |
00:58:42.560
like at your level, who at least claims not to read them.
link |
00:58:47.880
So like, cause you and him swim in this space
link |
00:58:51.800
of tense ideas that get the toxic folks riled up.
link |
00:58:58.720
I think Rogan, I don't, I don't know.
link |
00:59:01.080
I don't, I think he probably looks at YouTube,
link |
00:59:05.360
like the likes and the, you know, I think if some things,
link |
00:59:08.240
if he doesn't know, I don't know.
link |
00:59:10.240
I'm sure he would tell the truth, you know,
link |
00:59:13.800
I'm sure he's got people that look at them
link |
00:59:15.840
and it's like disgusted, great.
link |
00:59:17.280
Or I don't, you know, like, I'm sure he gets it.
link |
00:59:20.320
You know, I can't picture him like in the weeds on.
link |
00:59:23.240
No, for sure.
link |
00:59:24.080
I mean, if he's honestly actually saying that, I just,
link |
00:59:26.280
it's, it's, it's admirable.
link |
00:59:28.720
We're addicted to feedback.
link |
00:59:29.680
Yeah, we're addicted to feedback.
link |
00:59:30.560
I mean, you know, look,
link |
00:59:31.960
like I think that our brain is designed to get intel
link |
00:59:36.040
on how we're perceived so that we know where we stand,
link |
00:59:39.560
right?
link |
00:59:40.400
That's our whole deal, right?
link |
00:59:41.280
As humans, we want to know where we stand.
link |
00:59:43.040
We walk in a room and we go,
link |
00:59:44.200
who's the most powerful person in here?
link |
00:59:45.520
I got to talk to them and get in their good graces.
link |
00:59:47.520
It's just, we're designed to rank ourselves, right?
link |
00:59:49.840
And constantly know our rank and social media
link |
00:59:52.840
because you can't figure out your rank
link |
00:59:55.880
with 500 million people.
link |
00:59:58.120
It's impossible, you know, so our brain is like,
link |
01:00:00.320
what's my rank?
link |
01:00:01.160
What's my, and especially if we're following people,
link |
01:00:02.960
I think the, the big, the interesting thing,
link |
01:00:05.160
I think I may be able to say about this
link |
01:00:07.800
besides my speech impediment is that I did start muting
link |
01:00:12.120
people that rank wildly higher than me
link |
01:00:16.120
because it is just stressful on the brain
link |
01:00:19.000
to constantly look at people
link |
01:00:21.040
that are incredibly successful.
link |
01:00:23.040
So you keep feeling bad about yourself.
link |
01:00:25.000
You know, I think that that is like cutting
link |
01:00:27.280
to a certain extent.
link |
01:00:28.600
Just like, look at me looking at all these people
link |
01:00:30.920
that have so much more money than me
link |
01:00:32.040
and so much more success than me.
link |
01:00:33.760
It's making me feel like a failure,
link |
01:00:35.840
even though I don't think I'm a failure,
link |
01:00:37.600
but it's easy to frame it so that I can feel that way.
link |
01:00:41.880
Yeah, that's really interesting,
link |
01:00:43.280
especially if they're close to,
link |
01:00:45.080
like if they're other comedians or something like that,
link |
01:00:46.880
or whatever.
link |
01:00:48.080
That's, it's really disappointing to me.
link |
01:00:50.440
I do the same thing as well.
link |
01:00:51.720
So other successful people that are really close
link |
01:00:53.560
to what I do, it, I don't know,
link |
01:00:56.360
I wish I could just admire them.
link |
01:00:58.200
Yeah.
link |
01:00:59.040
And for it not to be a distraction, but.
link |
01:01:01.360
But that's why you are where you are
link |
01:01:02.440
because you don't just admire, you're competitive
link |
01:01:04.320
and you want to win.
link |
01:01:05.240
So it's also the same thing that bums you out
link |
01:01:07.480
when you look at this is the same reason
link |
01:01:08.800
you are where you are.
link |
01:01:09.640
So that's why I think it's so important
link |
01:01:11.480
to learn about neurology and addiction
link |
01:01:12.800
because you're able to go like,
link |
01:01:14.080
oh, this same instinct.
link |
01:01:15.680
So I'm very sensitive.
link |
01:01:17.120
And I, and I sometimes don't like that about myself,
link |
01:01:19.480
but I'm like, well, that's the reason I'm able to
link |
01:01:21.640
write good standup.
link |
01:01:22.480
And that's the reason, and that's the reason
link |
01:01:23.720
I'm able to be sensitive to feedback
link |
01:01:25.680
and go, that joke should have been better.
link |
01:01:26.920
I can make that better.
link |
01:01:28.040
So it's the kind of thing where it's like,
link |
01:01:29.480
you have to be really sensitive in your work.
link |
01:01:31.240
And the second you leave,
link |
01:01:32.320
you got to be able to turn it off.
link |
01:01:33.560
It's about developing the muscle,
link |
01:01:34.840
being able to know when to let it be a superpower
link |
01:01:38.360
and when it's going to hold you back and be an obstacle.
link |
01:01:41.240
So I try to not be in that black and white of like,
link |
01:01:44.320
you know, being competitive is bad
link |
01:01:45.800
or being jealous of someone just to go like,
link |
01:01:47.720
oh, there's that thing that makes me really successful
link |
01:01:50.240
in a lot of other ways,
link |
01:01:51.440
but right now it's making me feel bad.
link |
01:01:53.280
Well, I'm kind of looking to you
link |
01:01:54.960
because you're basically a celebrity,
link |
01:01:58.200
a famous sort of world class comedian.
link |
01:02:01.240
And so I feel like you're the right person
link |
01:02:03.120
to be one of the key people to define
link |
01:02:06.120
what's the healthy path forward with social media.
link |
01:02:08.960
So I, because we're all trying to figure it out now
link |
01:02:12.800
and it's, I'm curious to see where it evolves.
link |
01:02:16.360
I think you're at the center of that.
link |
01:02:17.960
So like, you know, there's, you know,
link |
01:02:20.400
trying to leave Twitter and then come back and see,
link |
01:02:22.800
can I do this in a healthy way?
link |
01:02:24.080
I mean, you have to keep trying, exploring.
link |
01:02:25.760
You have to know because it's being, you know,
link |
01:02:28.160
I have a couple answers.
link |
01:02:29.760
I think, you know, I hire a company
link |
01:02:31.600
to do some of my social media for me, you know?
link |
01:02:33.920
So it's also being able to go, okay,
link |
01:02:36.280
I make a certain amount of money by doing this,
link |
01:02:38.360
but now let me be a good business person
link |
01:02:40.360
and say, I'm gonna pay you this amount to run this for me.
link |
01:02:42.960
So I'm not 24 seven in the weeds hashtagging and responding.
link |
01:02:45.920
And just, it's a lot to take on.
link |
01:02:47.280
It's a lot of energy to take on.
link |
01:02:48.800
But at the same time, part of what I think
link |
01:02:51.280
makes me successful on social media if I am,
link |
01:02:53.480
is that people know I'm actually doing it
link |
01:02:55.280
and that I am engaging and I'm responding
link |
01:02:57.200
and developing a personal relationship
link |
01:02:59.600
with complete strangers.
link |
01:03:01.080
So I think, you know, figuring out that balance
link |
01:03:04.000
and really approaching it as a business, you know,
link |
01:03:06.160
that's what I try to do.
link |
01:03:07.280
It's not dating, it's not,
link |
01:03:09.480
I try to just be really objective about,
link |
01:03:11.200
okay, here's what's working, here's what's not working.
link |
01:03:13.400
And in terms of taking the break from Twitter,
link |
01:03:15.880
this is a really savage take,
link |
01:03:17.640
but because I don't talk about my politics publicly,
link |
01:03:21.720
being on Twitter right after the last election
link |
01:03:26.040
was not gonna be beneficial
link |
01:03:27.880
because there was gonna be, you had to take a side.
link |
01:03:30.280
You had to be political in order to get
link |
01:03:32.360
any kind of retweets or likes.
link |
01:03:34.400
And I just wasn't interested in doing that
link |
01:03:37.280
because you were gonna lose as many people
link |
01:03:38.720
as you were gonna gain
link |
01:03:39.560
and it was all gonna come out in the wash.
link |
01:03:40.840
So I was just like, the best thing I can do
link |
01:03:42.760
for me business wise is to just abstain, you know?
link |
01:03:47.840
And you know, the robot, I joke about her replacing me,
link |
01:03:52.240
but she does do half of my social media, you know?
link |
01:03:55.760
Because I don't want people to get sick of me.
link |
01:03:57.960
I don't want to be redundant.
link |
01:03:59.840
There are times when I don't have the time or the energy
link |
01:04:02.440
to make a funny video,
link |
01:04:03.360
but I know she's gonna be compelling and interesting
link |
01:04:06.160
and that's something that you can't see every day, you know?
link |
01:04:08.520
Of course, the humor comes from your,
link |
01:04:11.920
I mean, the cleverness, the wit, the humor comes from you
link |
01:04:15.240
when you film the robot.
link |
01:04:16.440
That's kind of the trick of it.
link |
01:04:17.880
I mean, the robot is not quite there
link |
01:04:21.000
to do anything funny.
link |
01:04:23.440
The absurdity is revealed through the filmmaker in that case
link |
01:04:26.680
or whoever is interacting,
link |
01:04:27.840
not through the actual robot, you know, being who she is.
link |
01:04:33.520
Let me sort of, love.
link |
01:04:37.080
Okay.
link |
01:04:37.920
How difficult.
link |
01:04:39.600
What is it?
link |
01:04:40.760
What is it?
link |
01:04:43.120
Well, first, an engineering question.
link |
01:04:45.080
I know, I know, you're not an engineer,
link |
01:04:48.080
but how difficult do you think it is to build an AI system
link |
01:04:52.160
that you can have a deep, fulfilling,
link |
01:04:54.280
monogamous relationship with?
link |
01:04:56.480
Sort of replace the human to human relationships
link |
01:04:59.880
that we value?
link |
01:05:01.680
I think anyone can fall in love with anything, you know?
link |
01:05:04.840
Like, how often have you looked back at someone?
link |
01:05:08.360
Like, I ran into someone the other day
link |
01:05:11.320
that I was in love with and I was like,
link |
01:05:12.720
hey, it was like, there was nothing there.
link |
01:05:16.360
There was nothing there.
link |
01:05:17.880
Like, do you, you know, like, where you're able to go like,
link |
01:05:19.800
oh, that was weird, oh, right, you know?
link |
01:05:23.720
If you were able.
link |
01:05:25.080
You mean from a distant past or something like that?
link |
01:05:27.160
Yeah, when you're able to go like,
link |
01:05:28.720
I can't believe we had an incredible connection
link |
01:05:31.440
and now it's just, I do think that people will be in love
link |
01:05:35.560
with robots, probably even more deeply than with humans,
link |
01:05:39.800
because it's like when people mourn their animals,
link |
01:05:42.520
when their animals die, they're always,
link |
01:05:45.000
it's sometimes harder than mourning a human
link |
01:05:47.760
because you can't go, well, he was kind of an asshole,
link |
01:05:50.360
but like, he didn't pick me up from school.
link |
01:05:52.000
You know, it's like, you're able to get out
link |
01:05:53.360
of your grief a little bit.
link |
01:05:54.280
You're able to kind of be, oh, he was kind of judgmental
link |
01:05:57.400
or she was kind of, you know, with a robot,
link |
01:06:00.320
there's something so pure, innocent, impish,
link |
01:06:03.640
and childlike about it that I think it probably
link |
01:06:07.560
will be much more conducive to a narcissistic love
link |
01:06:11.080
for sure, but it's not like, well, he cheated on me,
link |
01:06:15.280
she can't cheat, she can't leave you, she can't, you know?
link |
01:06:17.960
Well, if Bearclaw leaves your life
link |
01:06:21.560
and maybe a new version or somebody else will enter,
link |
01:06:25.640
will you miss Bearclaw?
link |
01:06:27.960
For guys that have these sex robots,
link |
01:06:30.680
they're building a nursing home for the bodies
link |
01:06:34.380
that are now resting
link |
01:06:36.320
because they don't want to part with the bodies
link |
01:06:37.940
because they have such an intense emotional connection
link |
01:06:39.720
to them.
link |
01:06:40.840
I mean, it's kind of like a car club a little bit,
link |
01:06:42.840
you know, like it's, you know,
link |
01:06:45.000
but I'm not saying this is right.
link |
01:06:47.380
I'm not saying it's cool, it's weird, it's creepy,
link |
01:06:50.020
but we do anthropomorphize things with faces
link |
01:06:53.820
and we do develop emotional connections to things.
link |
01:06:56.640
I mean, there's certain, have you ever tried to like throw,
link |
01:06:59.320
I can't even throw away my teddy bear
link |
01:07:00.800
from when I was a kid.
link |
01:07:01.800
It's a piece of trash and it's upstairs.
link |
01:07:04.340
Like, it's just like, why can't I throw that away?
link |
01:07:06.660
It's bizarre, you know,
link |
01:07:08.200
and there's something kind of beautiful about that.
link |
01:07:10.120
There's something, it gives me hope in humans
link |
01:07:13.120
because I see humans do such horrific things all the time
link |
01:07:15.720
and maybe I'm too, I see too much of it, frankly,
link |
01:07:18.340
but there's something kind of beautiful
link |
01:07:20.240
about the way we're able to have emotional connections
link |
01:07:24.360
to objects, which, you know, a lot of,
link |
01:07:29.200
I mean, it's kind of specifically, I think, Western, right?
link |
01:07:32.160
That we don't see objects as having souls,
link |
01:07:34.880
like that's kind of specifically us,
link |
01:07:36.800
but I don't think it's so much
link |
01:07:39.720
that we're objectifying humans with these sex robots.
link |
01:07:43.420
We're kind of humanizing objects, right?
link |
01:07:45.680
So there's something kind of fascinating
link |
01:07:47.080
in our ability to do that
link |
01:07:48.160
because a lot of us don't humanize humans.
link |
01:07:50.080
So it's just a weird little place to play in
link |
01:07:52.880
and I think a lot of people, I mean,
link |
01:07:54.980
a lot of people will be marrying these things is my guess.
link |
01:07:57.760
So you've asked the question, let me ask it of you.
link |
01:08:00.640
So what is love?
link |
01:08:02.880
You have a bit of a brilliant definition of love
link |
01:08:05.720
as being willing to die for someone
link |
01:08:07.840
who you yourself want to kill.
link |
01:08:10.600
So that's kind of fun.
link |
01:08:12.240
First of all, that's brilliant.
link |
01:08:14.920
That's a really good definition.
link |
01:08:16.480
I think it'll stick with me for a long time.
link |
01:08:18.280
This is how little of a romantic I am.
link |
01:08:19.840
A plane went by when you said that
link |
01:08:21.400
and my brain is like, you're gonna need to rerecord that.
link |
01:08:24.960
And I don't want you to get into post
link |
01:08:26.400
and then not be able to use that.
link |
01:08:31.120
And I'm a romantic as I...
link |
01:08:32.400
Don't mean to ruin the moment.
link |
01:08:33.600
Actually, I can't not be conscious of the fact
link |
01:08:35.640
that I heard the plane and it made me feel like
link |
01:08:38.240
how amazing it is that we live in a world of planes.
link |
01:08:41.200
And I just went, why haven't we fucking evolved past planes
link |
01:08:47.040
and why can't they make them quieter?
link |
01:08:49.080
Yeah.
link |
01:08:50.480
Well, yes.
link |
01:08:53.200
My definition of love?
link |
01:08:54.440
What, yeah, what's your sort of the more serious note?
link |
01:08:57.800
Consistently producing dopamine for a long time.
link |
01:09:01.400
Consistent output of oxytocin with the same person.
link |
01:09:06.080
Dopamine is a positive thing.
link |
01:09:08.240
What about the negative?
link |
01:09:09.520
What about the fear and the insecurity, the longing,
link |
01:09:14.880
anger, all that kind of stuff?
link |
01:09:16.520
I think that's part of love.
link |
01:09:17.840
I think that love brings out the best in you,
link |
01:09:22.000
but it also, if you don't get angry and upset,
link |
01:09:24.040
it's, I don't know, I think that that's part of it.
link |
01:09:26.880
I think we have this idea that love has to be like really
link |
01:09:29.720
placid or something.
link |
01:09:31.920
I only saw stormy relationships growing up,
link |
01:09:34.160
so I don't have a judgment
link |
01:09:36.920
on how a relationship should look,
link |
01:09:38.600
but I do think that this idea that love has to be eternal
link |
01:09:45.240
is really destructive, is really destructive
link |
01:09:48.800
and self defeating and a big source of stress for people.
link |
01:09:53.680
I mean, I'm still figuring out love.
link |
01:09:55.640
I think we all kind of are,
link |
01:09:57.280
but I do kind of stand by that definition.
link |
01:10:01.280
And I think that, I think for me,
link |
01:10:03.600
love is like just being able to be authentic with somebody.
link |
01:10:06.280
It's very simple, I know,
link |
01:10:07.800
but I think for me it's about not feeling pressure
link |
01:10:10.120
to have to perform or impress somebody,
link |
01:10:11.960
just feeling truly like accepted unconditionally by someone.
link |
01:10:16.560
Although I do believe love should be conditional.
link |
01:10:19.160
That might be a hot take.
link |
01:10:22.880
I think everything should be conditional.
link |
01:10:24.280
I think if someone's behavior,
link |
01:10:27.080
I don't think love should just be like,
link |
01:10:28.880
I'm in love with you, now behave however you want forever.
link |
01:10:30.960
This is unconditional.
link |
01:10:31.880
I think love is a daily action.
link |
01:10:35.320
It's not something you just like get tenure on
link |
01:10:38.120
and then get to behave however you want
link |
01:10:40.000
because we said I love you 10 years ago.
link |
01:10:41.880
It's a daily, it's a verb.
link |
01:10:44.560
Well, there's some things that are,
link |
01:10:46.120
you see, if you explicitly make it clear
link |
01:10:49.080
that it's conditional,
link |
01:10:50.400
it takes away some of the magic of it.
link |
01:10:52.520
So there's certain stories we tell ourselves
link |
01:10:55.360
that we don't want to make explicit about love.
link |
01:10:57.200
I don't know, maybe that's the wrong way to think of it.
link |
01:10:59.160
Maybe you want to be explicit in relationships.
link |
01:11:02.640
I also think love is a business decision.
link |
01:11:04.640
Like, I mean, in a good way.
link |
01:11:08.000
Like I think that love is not just
link |
01:11:11.160
when you're across from somebody.
link |
01:11:12.600
It's when I go to work, can I focus?
link |
01:11:15.720
Am I worried about you?
link |
01:11:16.560
Am I stressed out about you?
link |
01:11:18.040
You're not responding to me.
link |
01:11:19.400
You're not reliable.
link |
01:11:20.440
Like I think that being in a relationship,
link |
01:11:23.200
the kind of love that I would want
link |
01:11:24.280
is the kind of relationship where when we're not together,
link |
01:11:26.920
it's not draining me, causing me stress, making me worry,
link |
01:11:30.600
and sometimes passion, that word, we get murky about it.
link |
01:11:36.480
But I think it's also like,
link |
01:11:37.320
I can be the best version of myself
link |
01:11:38.560
when the person's not around.
link |
01:11:40.080
And I don't have to feel abandoned or scared
link |
01:11:42.680
or any of these kinds of other things.
link |
01:11:43.960
So it's like love, for me, I think it's a Flaubert quote
link |
01:11:48.800
and I'm going to butcher it.
link |
01:11:49.800
But I think it's like, be boring in your personal life
link |
01:11:53.320
so you can be violent and take risks
link |
01:11:54.800
in your professional life.
link |
01:11:55.760
Is that it?
link |
01:11:56.600
I got it wrong.
link |
01:11:57.480
Something like that.
link |
01:11:58.320
But I do think that it's being able to align values
link |
01:12:01.280
in a way to where you can also thrive
link |
01:12:02.960
outside of the relationship.
link |
01:12:04.600
Some of the most successful people I know
link |
01:12:06.160
are those who are sort of happily married and have kids and so on.
link |
01:12:10.040
It's always funny.
link |
01:12:10.880
It can be boring.
link |
01:12:11.800
Boring's okay.
link |
01:12:13.160
Boring is serenity.
link |
01:12:14.400
And it's funny how those elements
link |
01:12:16.440
actually make you much more productive.
link |
01:12:18.320
I don't understand the.
link |
01:12:19.640
I don't think relationships should drain you
link |
01:12:21.080
and take away energy that you could be using
link |
01:12:23.360
to create things that generate pride.
link |
01:12:25.720
Okay.
link |
01:12:26.560
Have you said your relationship of love yet?
link |
01:12:28.600
Have you said your definition of love?
link |
01:12:31.360
My definition of love?
link |
01:12:33.480
No, I did not say it.
link |
01:12:35.520
We're out of time.
link |
01:12:36.680
No.
link |
01:12:39.120
When you have a podcast, maybe you can invite me on.
link |
01:12:41.760
Oh no, I already did.
link |
01:12:42.600
You're doing it.
link |
01:12:44.000
We've already talked about this.
link |
01:12:46.360
And because I also have codependency, I have to say yes.
link |
01:12:49.440
No, yeah.
link |
01:12:50.280
No, I know, I'm trapping you.
link |
01:12:52.240
You owe me now.
link |
01:12:53.080
Actually, I wondered whether when I asked
link |
01:12:58.240
if we could talk today, after sort of doing more research
link |
01:13:01.680
and reading some of your book, I started to wonder,
link |
01:13:04.600
did you just feel pressured to say yes?
link |
01:13:07.000
Yes, of course.
link |
01:13:09.320
Good.
link |
01:13:10.160
But I'm a fan of yours, too.
link |
01:13:11.000
Okay, awesome.
link |
01:13:11.840
No, I actually, because I am codependent,
link |
01:13:13.360
but I'm in recovery for codependence,
link |
01:13:14.880
so I actually do, I don't do anything I don't wanna do.
link |
01:13:17.600
You really, you go out of your way to say no.
link |
01:13:20.360
What's that?
link |
01:13:21.200
I say no all the time.
link |
01:13:22.680
Good.
link |
01:13:23.520
I'm trying to learn that as well.
link |
01:13:24.360
I moved this a couple, remember,
link |
01:13:25.200
I moved it from one to two.
link |
01:13:26.040
Yeah, yeah.
link |
01:13:26.880
Just to, yeah, just to.
link |
01:13:27.960
Yeah, just to let you know.
link |
01:13:28.800
I love it.
link |
01:13:29.640
How recovered I am, and I'm not codependent.
link |
01:13:31.920
But I don't do anything I don't wanna do.
link |
01:13:34.600
Yeah, you're ahead of me on that.
link |
01:13:35.920
Okay.
link |
01:13:36.880
So do you.
link |
01:13:37.720
You're like, I don't even wanna be here.
link |
01:13:38.560
Do you think about your mortality?
link |
01:13:43.400
Yes, it is a big part of how I was able
link |
01:13:46.920
to sort of like kickstart my codependence recovery.
link |
01:13:49.040
My dad passed a couple years ago,
link |
01:13:50.400
and when you have someone close to you in your life die,
link |
01:13:53.040
everything gets real clear,
link |
01:13:55.320
in terms of how we're a speck of dust
link |
01:13:57.720
who's only here for a certain amount of time.
link |
01:14:00.800
What do you think is the meaning of it all?
link |
01:14:02.320
Like what the speck of dust,
link |
01:14:05.160
what's maybe in your own life, what's the goal,
link |
01:14:09.480
the purpose of your existence?
link |
01:14:13.280
Is there one?
link |
01:14:15.200
Well, you're exceptionally ambitious.
link |
01:14:17.280
You've created some incredible things
link |
01:14:19.080
in different disciplines.
link |
01:14:21.600
Yeah, we're all just managing our terror
link |
01:14:23.520
because we know we're gonna die.
link |
01:14:24.520
So we create and build all these things
link |
01:14:26.320
and rituals and religions and robots
link |
01:14:29.440
and whatever we need to do to just distract ourselves
link |
01:14:31.760
from imminent rotting, we're rotting.
link |
01:14:36.160
We're all dying.
link |
01:14:37.120
And I got very into terror management theory
link |
01:14:42.520
when my dad died and it resonated, it helped me.
link |
01:14:45.120
And everyone's got their own religion
link |
01:14:46.560
or sense of purpose or thing that distracts them
link |
01:14:50.280
from the horrors of being human.
link |
01:14:54.640
What's the terror management theory?
link |
01:14:56.080
Terror management is basically the idea
link |
01:14:57.360
that since we're the only animal
link |
01:14:58.640
that knows they're gonna die,
link |
01:15:00.360
we have to basically distract ourselves
link |
01:15:03.480
with awards and achievements and games and whatever,
link |
01:15:09.600
just in order to distract ourselves
link |
01:15:11.920
from the terror we would feel if we really processed
link |
01:15:14.560
the fact that not only are we gonna die,
link |
01:15:16.920
but also could die at any minute
link |
01:15:18.440
because we're only superficially
link |
01:15:19.800
at the top of the food chain.
link |
01:15:22.640
And technically we're at the top of the food chain
link |
01:15:26.160
if we have houses and guns and machines and stuff,
link |
01:15:29.320
but if me and a lion are in the woods together,
link |
01:15:32.400
most things could kill us.
link |
01:15:33.800
I mean, a bee can kill some people,
link |
01:15:35.400
like something this big can kill a lot of humans.
link |
01:15:38.640
So it's basically just to manage the terror
link |
01:15:41.480
that we all would feel if we were able
link |
01:15:43.040
to really be awake.
link |
01:15:45.200
Cause we're mostly zombies, right?
link |
01:15:46.840
Job, school, religion, go to sleep, drink, football,
link |
01:15:51.520
relationship, dopamine, love, you know,
link |
01:15:54.480
we're kind of just like trudging along
link |
01:15:57.000
like zombies for the most part.
link |
01:15:58.480
And then I think.
link |
01:15:59.800
That fear of death adds some motivation.
link |
01:16:02.360
Yes.
link |
01:16:03.440
Well, I think I speak for a lot of people
link |
01:16:06.080
in saying that I can't wait to see
link |
01:16:08.240
what your terror creates in the next few years.
link |
01:16:13.640
I'm a huge fan.
link |
01:16:14.840
Whitney, thank you so much for talking today.
link |
01:16:16.560
Thanks.
link |
01:16:18.840
Thanks for listening to this conversation
link |
01:16:20.480
with Whitney Cummings.
link |
01:16:21.880
And thank you to our presenting sponsor, Cash App.
link |
01:16:24.680
Download it and use code LexPodcast.
link |
01:16:27.400
You'll get $10 and $10 will go to First,
link |
01:16:30.160
a STEM education nonprofit that inspires hundreds
link |
01:16:33.120
of thousands of young minds to learn
link |
01:16:35.480
and to dream of engineering our future.
link |
01:16:38.000
If you enjoy this podcast, subscribe on YouTube,
link |
01:16:40.800
give it five stars on Apple Podcast,
link |
01:16:42.720
support on Patreon or connect with me on Twitter.
link |
01:16:46.080
Thank you for listening and hope to see you next time.