
Whitney Cummings: Comedy, Robotics, Neurology, and Love | Lex Fridman Podcast #55



link |
00:00:00.000
The following is a conversation with Whitney Cummings.
link |
00:00:03.600
She's a standup comedian, actor, producer, writer, director,
link |
00:00:07.240
and recently, finally, the host of her very own podcast
link |
00:00:11.240
called Good for You.
link |
00:00:12.920
Her most recent Netflix special called Can I Touch It
link |
00:00:15.920
features in part a robot
link |
00:00:17.800
she affectionately named Bearclaw
link |
00:00:20.280
that is designed to be visually a replica of Whitney.
link |
00:00:23.440
It's exciting for me to see one of my favorite comedians
link |
00:00:25.960
explore the social aspects of robotics and AI in our society.
link |
00:00:30.720
She also has some fascinating ideas
link |
00:00:32.920
about human behavior, psychology, and neurology,
link |
00:00:36.000
some of which she explores in her book
link |
00:00:37.800
called I'm Fine and Other Lies.
link |
00:00:41.200
It was truly a pleasure to meet Whitney
link |
00:00:43.320
and have this conversation with her,
link |
00:00:45.160
and even to continue it through texts afterwards.
link |
00:00:47.920
Every once in a while, late at night,
link |
00:00:50.120
I'll be programming over a cup of coffee
link |
00:00:52.280
and I'll get a text from Whitney saying something hilarious.
link |
00:00:55.720
Or weirder yet, sending a video of Brian Callen
link |
00:00:58.960
saying something hilarious.
link |
00:01:00.920
That's when I know the universe has a sense of humor
link |
00:01:03.840
and it gifted me with one hell of an amazing journey.
link |
00:01:07.400
Then I put the phone down and go back to programming
link |
00:01:10.320
with a stupid, joyful smile on my face.
link |
00:01:13.400
If you enjoy this conversation,
link |
00:01:14.960
listen to Whitney's podcast, Good for You,
link |
00:01:17.200
and follow her on Twitter and Instagram.
link |
00:01:19.840
This is the Artificial Intelligence podcast.
link |
00:01:22.640
If you enjoy it, subscribe on YouTube,
link |
00:01:24.840
give it five stars on Apple Podcasts,
link |
00:01:26.800
support on Patreon, or simply connect with me on Twitter.
link |
00:01:30.160
Lex Fridman, spelled F R I D M A N.
link |
00:01:34.040
This show is presented by Cash App,
link |
00:01:35.840
the number one finance app in the App Store.
link |
00:01:38.280
They regularly support Whitney's Good for You podcast as well.
link |
00:01:41.440
I personally use Cash App to send money to friends,
link |
00:01:43.920
but you can also use it to buy, sell,
link |
00:01:45.680
and deposit Bitcoin in just seconds.
link |
00:01:47.960
Cash App also has a new investing feature.
link |
00:01:50.640
You can buy fractions of a stock, say $1 worth,
link |
00:01:53.480
no matter what the stock price is.
link |
00:01:55.760
Brokerage services are provided by Cash App Investing,
link |
00:01:58.560
a subsidiary of Square, and member SIPC.
link |
00:02:02.000
I'm excited to be working with Cash App
link |
00:02:04.120
to support one of my favorite organizations called FIRST,
link |
00:02:07.160
best known for their FIRST Robotics and LEGO competitions.
link |
00:02:10.400
They educate and inspire hundreds of thousands of students
link |
00:02:13.760
in over 110 countries,
link |
00:02:16.000
and have a perfect rating on Charity Navigator,
link |
00:02:18.680
which means the donated money is used
link |
00:02:20.480
to maximum effectiveness.
link |
00:02:22.360
When you get Cash App from the App Store or Google Play
link |
00:02:25.240
and use code LEXPODCAST,
link |
00:02:27.840
you'll get $10 and Cash App will also donate $10 to FIRST,
link |
00:02:32.080
which again is an organization that I've personally seen
link |
00:02:35.240
inspire girls and boys to dream of engineering
link |
00:02:38.120
a better world.
link |
00:02:40.440
This podcast is supported by ZipRecruiter.
link |
00:02:43.040
Hiring great people is hard,
link |
00:02:45.200
and to me is the most important element
link |
00:02:47.280
of a successful mission driven team.
link |
00:02:50.080
I've been fortunate to be a part of,
link |
00:02:52.080
and to lead several great engineering teams.
link |
00:02:54.680
The hiring I've done in the past
link |
00:02:56.640
was mostly through tools that we built ourselves,
link |
00:02:59.400
but reinventing the wheel was painful.
link |
00:03:02.360
ZipRecruiter is a tool that's already available for you.
link |
00:03:05.200
It seeks to make hiring simple, fast, and smart.
link |
00:03:08.960
For example, Kodable cofounder Gretchen Huebner
link |
00:03:11.920
used ZipRecruiter to find a new game artist
link |
00:03:14.360
to join her education tech company.
link |
00:03:16.680
By using ZipRecruiter screening questions
link |
00:03:18.800
to filter candidates,
link |
00:03:20.280
Gretchen found it easier to focus on the best candidates
link |
00:03:23.080
and finally hired the perfect person for the role
link |
00:03:26.440
in less than two weeks from start to finish.
link |
00:03:29.440
ZipRecruiter, the smartest way to hire.
link |
00:03:32.520
See why ZipRecruiter is effective
link |
00:03:34.160
for businesses of all sizes by signing up as I did
link |
00:03:37.560
for free at ziprecruiter.com slash lexpod.
link |
00:03:41.560
That's ziprecruiter.com slash lexpod.
link |
00:03:45.360
And now here's my conversation with Whitney Cummings.
link |
00:03:51.600
I have trouble making eye contact, as you can tell.
link |
00:03:53.760
Me too.
link |
00:03:54.600
Do you know that I had to work on making eye contact
link |
00:03:56.960
because I used to look here?
link |
00:03:58.880
Do you see what I'm doing?
link |
00:03:59.720
That helps, yeah, yeah, yeah.
link |
00:04:00.560
Do you want me to do that?
link |
00:04:01.800
Well, do it this way, I'll cheat the camera.
link |
00:04:03.600
But I used to do this and finally people,
link |
00:04:05.760
like I'd be on dates and guys would be like,
link |
00:04:07.400
are you looking at my hair?
link |
00:04:08.240
Like they get, it would make people really insecure
link |
00:04:10.920
because I didn't really get a lot of eye contact as a kid.
link |
00:04:12.960
It's one to three years.
link |
00:04:14.760
Did you not get a lot of eye contact as a kid?
link |
00:04:16.480
I don't know.
link |
00:04:17.320
I haven't done the soul searching.
link |
00:04:19.600
Right.
link |
00:04:20.800
So, but there's definitely some psychological issues.
link |
00:04:24.200
Makes you uncomfortable.
link |
00:04:25.560
Yeah.
link |
00:04:26.520
For some reason when I connect eyes,
link |
00:04:27.920
I start to think, I assume that you're judging me.
link |
00:04:31.840
Oh, well I am.
link |
00:04:33.360
That's why you assume that.
link |
00:04:34.480
Yeah.
link |
00:04:35.320
We all are.
link |
00:04:36.160
All right.
link |
00:04:37.000
This is perfect.
link |
00:04:37.840
The podcast has to be me and you both staring at the table.
link |
00:04:39.320
All right.
link |
00:04:42.560
Do you think robots of the future,
link |
00:04:44.240
ones with human level intelligence,
link |
00:04:46.000
will be female, male, genderless,
link |
00:04:49.480
or another gender we have not yet created as a society?
link |
00:04:53.240
You're the expert at this.
link |
00:04:55.240
Well, I'm going to ask you.
link |
00:04:56.080
You know the answer.
link |
00:04:57.320
I'm going to ask you questions
link |
00:04:58.680
that maybe nobody knows the answer to or,
link |
00:05:01.960
and then I just want you to hypothesize
link |
00:05:04.080
as an imaginative author, director, comedian.
link |
00:05:10.760
Can we just be very clear that you know a ton about this
link |
00:05:14.200
and I know nothing about this,
link |
00:05:15.760
but I have thought a lot about
link |
00:05:19.560
what I think robots can fix in our society.
link |
00:05:22.840
And I mean, I'm a comedian.
link |
00:05:24.400
It's my job to study human nature,
link |
00:05:27.520
to make jokes about human nature
link |
00:05:28.840
and to sometimes play devil's advocate.
link |
00:05:31.080
And I just see such a tremendous negativity
link |
00:05:34.200
around robots or at least the idea of robots
link |
00:05:37.000
that it was like, oh, I'm just going to
link |
00:05:39.200
take the opposite side for fun, for jokes.
link |
00:05:41.400
And then I was like, oh no, I really agree
link |
00:05:44.440
with this devil's advocate argument.
link |
00:05:45.920
So please correct me when I'm wrong about this stuff.
link |
00:05:49.400
So first of all, there's no right and wrong
link |
00:05:51.800
because we're all, I think most of the people
link |
00:05:54.880
working on robotics are really not actually even thinking
link |
00:05:57.600
about some of the big picture things
link |
00:06:00.040
that you've been exploring.
link |
00:06:01.280
In fact, your robot, what's her name by the way?
link |
00:06:04.680
Bearclaw.
link |
00:06:05.520
We'll go with Bearclaw.
link |
00:06:06.840
What's the genesis of that name by the way?
link |
00:06:11.720
Bearclaw was, I, God, I don't even remember the joke
link |
00:06:15.040
because I black out after I shoot specials,
link |
00:06:16.720
but I was writing something about like the pet names
link |
00:06:19.240
that men call women like cupcake, sweetie, honey,
link |
00:06:23.000
you know, like we're always named after desserts
link |
00:06:26.560
or something.
link |
00:06:27.400
And I was just writing a joke about
link |
00:06:29.960
if you want to call us a dessert,
link |
00:06:31.040
at least pick like a cool dessert, you know,
link |
00:06:33.240
like Bearclaw, like something cool.
link |
00:06:35.720
So I ended up calling her Bearclaw.
link |
00:06:37.040
Interesting.
link |
00:06:38.280
So do you think the future robots
link |
00:06:42.320
of greater and greater intelligence
link |
00:06:44.480
will we like to make them female, male?
link |
00:06:46.560
Would we like to assign them gender?
link |
00:06:48.600
Or would we like to move away from gender
link |
00:06:50.840
and say something more ambiguous?
link |
00:06:54.040
I think it depends on their purpose, you know?
link |
00:06:56.400
I feel like if it's a sex robot,
link |
00:06:59.920
people prefer certain genders, you know?
link |
00:07:01.920
And I also, you know, when I went down
link |
00:07:04.000
and explored the robot factory,
link |
00:07:06.320
I was asking about the type of people
link |
00:07:07.680
that bought sex robots.
link |
00:07:09.280
And I was very surprised at the answer
link |
00:07:12.240
because of course the stereotype
link |
00:07:14.160
was it's going to be a bunch of perverts.
link |
00:07:15.360
It ended up being a lot of people that were handicapped,
link |
00:07:18.600
a lot of people with erectile dysfunction,
link |
00:07:20.560
and a lot of people that were exploring their sexuality.
link |
00:07:23.920
A lot of people that thought they were gay
link |
00:07:25.960
but weren't sure but didn't want to take the risk
link |
00:07:28.160
of trying on someone that could reject them
link |
00:07:31.680
and being embarrassed or they were closeted
link |
00:07:33.920
or in a city where maybe that's, you know,
link |
00:07:36.360
taboo and stigmatized, you know?
link |
00:07:37.960
So I think that a gendered sex robot
link |
00:07:40.600
would serve an important purpose
link |
00:07:42.400
for someone trying to explore their sexuality.
link |
00:07:44.200
Am I into men?
link |
00:07:45.040
Let me try on this thing first.
link |
00:07:46.200
Am I into women?
link |
00:07:47.040
Let me try on this thing first.
link |
00:07:48.240
So I think gendered robots would be important for that.
link |
00:07:51.240
But I think genderless robots in terms of
link |
00:07:53.800
emotional support robots, babysitters.
link |
00:07:56.560
I'm fine for a genderless babysitter
link |
00:07:58.760
with my husband in the house.
link |
00:08:00.120
You know, there are places that I think
link |
00:08:02.120
that genderless makes a lot of sense
link |
00:08:04.680
but obviously not in the sex area.
link |
00:08:07.600
What do you mean with your husband in the house?
link |
00:08:09.960
What does that have to do with the gender of the robot?
link |
00:08:11.840
Right, I mean, I don't have a husband
link |
00:08:13.080
but hypothetically speaking,
link |
00:08:14.400
I think every woman's worst nightmare
link |
00:08:15.760
is like the hot babysitter, you know what I mean?
link |
00:08:19.320
So I think that there is a time and place I think
link |
00:08:22.080
for genderless, you know, teachers, doctors,
link |
00:08:25.440
all that kind of, it would be very awkward
link |
00:08:27.320
if the first robotic doctor was a guy
link |
00:08:29.760
or the first robotic nurse was a woman.
link |
00:08:32.640
You know, it's sort of, that stuff is so loaded.
link |
00:08:36.120
I think that genderless could just take the unnecessary
link |
00:08:42.080
drama out of it and the possibility to sexualize them
link |
00:08:46.040
or be triggered by any of that stuff.
link |
00:08:49.560
So there's two components to this, to Bearclaw.
link |
00:08:52.920
So one is the voice and the talking and so on.
link |
00:08:55.040
And then there's the visual appearance.
link |
00:08:56.400
So on the topic of gender and genderless, in your experience,
link |
00:09:01.400
what has been the value of the physical appearance?
link |
00:09:04.920
So has it added much to the depth of the interaction?
link |
00:09:09.000
I mean, mine's kind of an extenuating circumstance
link |
00:09:11.240
because she is supposed to look exactly like me.
link |
00:09:13.560
I mean, I spent six months getting my face molded
link |
00:09:16.000
and having, you know, the idea was I was exploring
link |
00:09:19.480
the concept of can robots replace us
link |
00:09:21.280
because that's the big fear
link |
00:09:22.520
but also the big dream in a lot of ways.
link |
00:09:24.280
And I wanted to dig into that area
link |
00:09:26.800
because, you know, for a lot of people, it's like,
link |
00:09:29.920
they're going to take our jobs
link |
00:09:30.840
and they're going to replace us, legitimate fear.
link |
00:09:33.000
But then a lot of women I know are like,
link |
00:09:34.400
I would love for a robot to replace me every now and then.
link |
00:09:37.000
So it can go to baby showers for me
link |
00:09:38.920
and it can pick up my kids at school
link |
00:09:40.240
and it can cook dinner and whatever.
link |
00:09:42.520
So I just think that was an interesting place to explore.
link |
00:09:45.240
So her looking like me was a big part of it.
link |
00:09:47.080
Now her looking like me just adds
link |
00:09:49.120
an unnecessary level of insecurity
link |
00:09:51.400
because I got her a year ago
link |
00:09:53.560
and she already looks younger than me.
link |
00:09:54.720
So that's a weird problem.
link |
00:09:57.680
But I think that her looking human was the idea.
link |
00:10:00.720
And I think that where we are now,
link |
00:10:03.120
please correct me if I'm wrong,
link |
00:10:04.880
a human robot resembling an actual human you know
link |
00:10:09.880
is going to feel more realistic than some generic face.
link |
00:10:13.760
Well, you're saying that robots
link |
00:10:16.080
that have some familiarity like look similar to somebody
link |
00:10:21.080
that you actually know you'll be able to form
link |
00:10:23.680
a deeper connection with?
link |
00:10:24.600
That was the question?
link |
00:10:25.440
I need to show on some level.
link |
00:10:26.280
That's an open question.
link |
00:10:27.120
I don't, you know, it's an interesting.
link |
00:10:30.240
Or the opposite.
link |
00:10:31.080
If you know me, you're like,
link |
00:10:32.400
well I know this isn't real cause you're right here.
link |
00:10:34.600
So maybe it does the opposite.
link |
00:10:36.160
We have a very keen eye for human faces
link |
00:10:39.280
and they're able to detect strangeness,
link |
00:10:41.880
especially when it has to do with people
link |
00:10:44.400
whose faces we've seen a lot of.
link |
00:10:46.440
So I tend to be a bigger fan of moving away
link |
00:10:50.640
completely from faces.
link |
00:10:52.840
Recognizable faces?
link |
00:10:54.240
No, just human faces at all.
link |
00:10:56.040
In general, cause I think that's where things get dicey.
link |
00:10:58.320
And one thing I will say is I think my robot
link |
00:11:01.360
is more realistic than other robots,
link |
00:11:03.080
not necessarily because you have seen me
link |
00:11:05.200
and then you see her and you go, oh, they're so similar.
link |
00:11:07.640
But also because human faces are flawed and asymmetrical
link |
00:11:11.120
and sometimes we forget when we're making things
link |
00:11:13.480
that are supposed to look human, we make them too symmetrical.
link |
00:11:16.040
And that's what makes them stop looking human.
link |
00:11:18.000
So because they molded my asymmetrical face,
link |
00:11:20.560
she just, even if someone didn't know who I was,
link |
00:11:22.880
I think she'd look more realistic than most generic ones
link |
00:11:26.600
that didn't have some kind of flaws.
link |
00:11:28.920
Got it.
link |
00:11:29.760
Cause they start looking creepy when they're too symmetrical
link |
00:11:31.760
cause human beings aren't like that.
link |
00:11:33.280
Yeah, the flaws are what it means to be human.
link |
00:11:35.760
So visually as well.
link |
00:11:37.760
But I'm just a fan of the idea of letting humans use
link |
00:11:42.040
a little bit more imagination.
link |
00:11:43.320
So just hearing the voice is enough for us humans
link |
00:11:47.640
to then start imagining the visual appearance
link |
00:11:50.320
that goes along with that voice.
link |
00:11:52.080
And you don't necessarily need to work too hard
link |
00:11:54.520
on creating the actual visual appearance.
link |
00:11:56.960
So there's some value to that.
link |
00:11:59.160
When you step into this character of actually building
link |
00:12:01.840
a robot that looks like Bearclaw,
link |
00:12:04.360
it's such a long road of facial expressions
link |
00:12:07.720
of sort of making everything smiling, winking,
link |
00:12:12.240
rolling the eyes, all that kind of stuff.
link |
00:12:14.960
It gets really, really tricky.
link |
00:12:16.560
It gets tricky and I think I'm, again, I'm a comedian.
link |
00:12:19.200
Like I'm obsessed with what makes us human
link |
00:12:21.840
and our human nature, and the nasty side of human nature
link |
00:12:25.480
tends to be where I've, you know, ended up
link |
00:12:27.600
exploring over and over again.
link |
00:12:28.840
And I was just mostly fascinated by people's reaction.
link |
00:12:32.640
So it's my job to get the biggest reaction
link |
00:12:34.560
from a group of strangers, the loudest possible reaction.
link |
00:12:37.520
And I just had this instinct just when I started
link |
00:12:41.240
building her and people gasping and screaming,
link |
00:12:43.480
and, I mean,
link |
00:12:45.080
I would bring her out on stage and people would scream.
link |
00:12:48.120
And I just, to me, that was the next level of entertainment.
link |
00:12:51.560
Getting a laugh, I've done that, I know how to do that.
link |
00:12:53.560
I think comedians were always trying to figure out
link |
00:12:54.960
what the next level is and comedy is evolving so much.
link |
00:12:57.320
And, you know, Jordan Peele had just done, you know,
link |
00:12:59.920
these genius comedy horror movies,
link |
00:13:01.800
which feel like the next level of comedy to me.
link |
00:13:04.520
And this sort of funny horror of a robot
link |
00:13:10.040
was fascinating to me.
link |
00:13:11.720
But I think the thing that I got the most obsessed with
link |
00:13:15.560
was people being freaked out and scared of her.
link |
00:13:18.240
And I started digging around with pathogen avoidance.
link |
00:13:21.680
And the idea that we've essentially evolved
link |
00:13:24.080
to be repelled by anything that looks human,
link |
00:13:27.160
but is off a little bit.
link |
00:13:28.920
Anything that could be sick or diseased or dead,
link |
00:13:32.120
essentially, is our reptilian brain's way
link |
00:13:33.960
to get us to not try to have sex with it, basically.
link |
00:13:38.360
You know, so I got really fascinated
link |
00:13:39.920
by how freaked out and scared,
link |
00:13:41.920
I mean, I would see grown men get upset.
link |
00:13:44.400
They'd be like, get that thing away from me, and I'm like,
link |
00:13:46.280
people get angry.
link |
00:13:47.880
And it was like, you know what this is, you know,
link |
00:13:50.840
but the sort of like, you know, amygdala getting activated
link |
00:13:55.200
by something that to me is just a fun toy
link |
00:13:58.600
said a lot about our history as a species
link |
00:14:02.080
and what got us into trouble thousands of years ago.
link |
00:14:04.760
So that is the deep down stuff that's in our genetics,
link |
00:14:08.640
but also is it just, are people freaked out
link |
00:14:11.000
by the fact that there's a robot?
link |
00:14:13.120
So it's not just the appearance,
link |
00:14:14.880
but that there's an artificial human.
link |
00:14:17.880
Anything people I think, and I'm just also fascinated
link |
00:14:21.280
by the blind spots humans have.
link |
00:14:23.080
So the idea that you're afraid of that,
link |
00:14:24.800
I mean, how many robots have killed people?
link |
00:14:27.240
How many humans have died at the hands of other humans?
link |
00:14:29.840
Yeah.
link |
00:14:30.680
Millions?
link |
00:14:31.520
Hundreds of millions, yet we're scared of that
link |
00:14:34.600
and we'll go to the grocery store
link |
00:14:36.120
and be around a bunch of humans who statistically,
link |
00:14:38.520
the chances are much higher
link |
00:14:39.560
that you're gonna get killed by humans.
link |
00:14:40.760
So I'm just fascinated by, without judgment,
link |
00:14:43.640
how irrational we are as a species.
link |
00:14:47.880
The worry is the exponential.
link |
00:14:49.320
So it's, you know, you can say the same thing
link |
00:14:51.440
about nuclear weapons before we dropped them
link |
00:14:54.200
on Hiroshima and Nagasaki.
link |
00:14:55.800
So the worry that people have is the exponential growth.
link |
00:14:59.480
So it's like, oh, it's fun and games right now,
link |
00:15:03.760
but you know, overnight, especially if a robot provides
link |
00:15:08.760
value to society, we'll put one in every home.
link |
00:15:11.360
And then all of a sudden, we lose track
link |
00:15:13.560
of the actual large scale impact it has on society.
link |
00:15:17.000
And then all of a sudden, it gains greater and greater control
link |
00:15:20.000
to where it'll, you know,
link |
00:15:22.280
affect our political system and then affect our decisions.
link |
00:15:25.360
Did robots already ruin our political system?
link |
00:15:27.440
Didn't that just already happen?
link |
00:15:28.440
Which ones?
link |
00:15:29.280
Oh, Russia hacking.
link |
00:15:30.560
No offense.
link |
00:15:32.280
But hasn't that already happened?
link |
00:15:34.760
I mean, that was like an algorithm
link |
00:15:36.200
of negative things being clicked on more.
link |
00:15:39.320
We like to tell stories
link |
00:15:40.760
and like to demonize certain people.
link |
00:15:43.640
I think nobody understands our current political system
link |
00:15:46.840
or discourse on Twitter, the Twitter mobs.
link |
00:15:49.680
Nobody has a sense, not Twitter, not Facebook,
link |
00:15:52.560
the people running it.
link |
00:15:53.400
Nobody understands the impact of these algorithms
link |
00:15:55.400
that are trying their best.
link |
00:15:56.880
Despite what people think, they're not like a bunch
link |
00:15:59.160
of lefties trying to make sure
link |
00:16:01.440
that Hillary Clinton gets elected.
link |
00:16:03.240
It's more that it's an incredibly complex system
link |
00:16:06.840
that we don't, and that's the worry.
link |
00:16:08.880
It's so complex and moves so fast
link |
00:16:11.440
that nobody will be able to stop it once it happens.
link |
00:16:15.760
And let me ask a question.
link |
00:16:16.920
This is a very savage question.
link |
00:16:18.880
Yeah.
link |
00:16:19.720
Which is, is this just the next stage of evolution?
link |
00:16:23.840
As humans, some people will die.
link |
00:16:25.640
Yes, I mean, that's always happened.
link |
00:16:27.400
You know, this is just taking emotion out of it.
link |
00:16:30.320
Is this basically the next stage of survival of the fittest?
link |
00:16:35.000
Yeah, you have to think of organisms.
link |
00:16:37.800
You know, what does it mean to be a living organism?
link |
00:16:41.400
Like is a smartphone part of your living organism, or?
link |
00:16:47.240
We're in relationships with our phones.
link |
00:16:49.720
Yeah.
link |
00:16:50.560
We have sex through them, with them.
link |
00:16:52.920
What's the difference between with them and through them?
link |
00:16:54.480
But it also expands your cognitive abilities,
link |
00:16:57.120
expands your memory, knowledge, and so on.
link |
00:16:59.080
So you're a much smarter person
link |
00:17:00.640
because you have a smartphone in your hand.
link |
00:17:02.600
But as soon as it's out of my hand,
link |
00:17:04.760
we've got big problems,
link |
00:17:06.120
because we've become sort of so morphed with them.
link |
00:17:08.360
Well, there's a symbiotic relationship.
link |
00:17:09.960
And that's what Elon Musk, with Neuralink,
link |
00:17:12.520
is working on trying to increase the bandwidth
link |
00:17:16.640
of communication between computers and your brain.
link |
00:17:19.320
And so further and further expand our ability,
link |
00:17:22.800
as human beings, to sort of leverage machines.
link |
00:17:26.280
And maybe that's the future, the next evolutionary step.
link |
00:17:30.440
It could be also that, yes, we'll give birth,
link |
00:17:33.880
just like we give birth to human children right now,
link |
00:17:36.520
we'll give birth to AI and they'll replace us.
link |
00:17:38.920
I think it's a really interesting possibility.
link |
00:17:42.160
I'm gonna play devil's advocate.
link |
00:17:44.040
I just think that the fear of robots is wildly classist
link |
00:17:48.280
because, I mean, Facebook.
link |
00:17:50.080
Like it's easy for us to say they're taking their data.
link |
00:17:51.920
Okay, a lot of people that get employment off of Facebook,
link |
00:17:55.680
they are able to get income off of Facebook.
link |
00:17:58.200
They don't care if you take their phone numbers
link |
00:17:59.800
and their emails and their data, as long as it's free.
link |
00:18:01.920
They don't wanna have to pay $5 a month for Facebook.
link |
00:18:03.920
Facebook is a wildly democratic thing.
link |
00:18:05.800
Forget about the election and all that kind of stuff.
link |
00:18:08.240
A lot of technology making people's lives easier,
link |
00:18:12.480
I find that most elite people are more scared
link |
00:18:17.120
than lower income people and women for the most part.
link |
00:18:21.200
So the idea of something that's stronger than us
link |
00:18:23.960
and that might eventually kill us, like women are used to that.
link |
00:18:26.640
Like that's not, I see a lot of like really rich men
link |
00:18:30.080
being like the robots are gonna kill us.
link |
00:18:31.320
We're like, what's another thing that's gonna kill us?
link |
00:18:33.840
I tend to see like, oh, something can walk me
link |
00:18:36.320
to my car at night.
link |
00:18:37.160
Like something can help me cook dinner or something.
link |
00:18:39.920
For people in underprivileged countries
link |
00:18:43.080
who can't afford eye surgery, like, with a robot,
link |
00:18:45.400
can we send a robot to underprivileged places
link |
00:18:48.880
to do surgery where they can't?
link |
00:18:50.720
I work with this organization called Operation Smile
link |
00:18:53.600
where they do cleft palate surgeries.
link |
00:18:55.720
And there's a lot of places
link |
00:18:56.560
that can't do a very simple surgery
link |
00:18:59.200
because they can't afford doctors
link |
00:19:00.440
and medical care and such.
link |
00:19:01.480
So I just see, and this can be completely naive
link |
00:19:04.880
and should be completely wrong,
link |
00:19:05.880
but I feel like a lot of people are going like,
link |
00:19:08.800
the robots are gonna destroy us.
link |
00:19:09.960
Humans, we're destroying ourselves.
link |
00:19:11.680
We're self destructing.
link |
00:19:12.880
Robots to me are the only hope
link |
00:19:14.360
to clean up all the messes that we've created.
link |
00:19:16.000
Even when we go try to clean up pollution in the ocean,
link |
00:19:18.280
we make it worse because of the oil that the tankers,
link |
00:19:21.760
like it's like to me, robots are the only solution.
link |
00:19:25.400
Firefighters are heroes, but they're limited
link |
00:19:27.880
in how many times they can run into a fire.
link |
00:19:30.200
So there's just something interesting to me.
link |
00:19:32.360
I'm not hearing a lot of like lower income,
link |
00:19:36.120
more vulnerable populations talking about robots.
link |
00:19:39.920
Maybe you can speak to it a little bit more.
link |
00:19:42.000
There's an idea, I think you've expressed it.
link |
00:19:44.120
I've heard actually a few female writers
link |
00:19:48.240
and roboticists I've talked to express this idea
link |
00:19:51.480
that exactly you just said,
link |
00:19:54.520
which is it just seems that being afraid
link |
00:20:00.360
of existential threats of artificial intelligence
link |
00:20:03.080
is a male issue.
link |
00:20:06.240
Yeah.
link |
00:20:07.080
And I wonder what that is, if it's
link |
00:20:10.520
because men in certain positions,
link |
00:20:13.680
like you said, it's also a classist issue.
link |
00:20:15.640
They haven't been humbled by life.
link |
00:20:17.440
And so you always look for the biggest problems
link |
00:20:20.680
to take on around you.
link |
00:20:22.400
It's a champagne problem to be afraid of robots.
link |
00:20:24.240
Most people don't have health insurance.
link |
00:20:26.480
They're afraid they're not gonna be able to feed their kids.
link |
00:20:28.240
They can't afford a tutor for their kids.
link |
00:20:30.040
I mean, I just think of the way I grew up
link |
00:20:32.440
and I had a mother who worked two jobs, had kids.
link |
00:20:36.200
We couldn't afford an SAT tutor.
link |
00:20:38.600
The idea of a robot coming in, being able to tutor your kids,
link |
00:20:41.120
being able to provide childcare for your kids,
link |
00:20:43.560
being able to come in with cameras for eyes
link |
00:20:45.600
and provide surveillance, I'm very pro surveillance
link |
00:20:49.880
because I've had security problems
link |
00:20:52.320
and we're generally in a little more danger
link |
00:20:55.800
than you guys are.
link |
00:20:56.640
So I think that we're a little less scared of robots
link |
00:20:58.680
because we can see that maybe it's like
link |
00:21:00.320
free assistance, help and protection.
link |
00:21:03.480
And then there's sort of another element for me personally,
link |
00:21:06.880
which is maybe more of a female problem.
link |
00:21:08.840
I don't know.
link |
00:21:09.680
I'm just gonna make a generalization.
link |
00:21:11.720
Happy to be wrong.
link |
00:21:13.080
But the emotional sort of component of robots
link |
00:21:18.080
and what they can provide in terms of,
link |
00:21:20.720
I think there's a lot of people that don't have microphones
link |
00:21:25.160
that I just recently kind of stumbled upon
link |
00:21:28.200
in doing all my research on the sex robots
link |
00:21:30.520
for my standup special,
link |
00:21:31.520
which is there's a lot of very shy people
link |
00:21:34.440
that aren't good at dating.
link |
00:21:35.440
There's a lot of people who are scared of human beings
link |
00:21:37.480
who have personality disorders
link |
00:21:40.120
or grew up in alcoholic homes or struggled with addiction
link |
00:21:42.680
or whatever it is where a robot can solve an emotional problem.
link |
00:21:46.560
And so we're largely having this conversation
link |
00:21:49.360
about like rich guys that are emotionally healthy
link |
00:21:53.160
and how scared of robots they are.
link |
00:21:55.120
We're forgetting about like a huge part of the population
link |
00:21:58.120
who maybe isn't as charming and effervescent
link |
00:22:01.200
and solvent as people like you and Elon Musk
link |
00:22:05.000
who these robots could solve very real problems
link |
00:22:08.840
in their life, emotional or financial.
link |
00:22:10.880
Well, that's a in general really interesting idea
link |
00:22:13.120
that most people in the world don't have a voice.
link |
00:22:15.800
You've talked about it sort of even the people on Twitter
link |
00:22:19.920
who are driving the conversation.
link |
00:22:22.800
You said comments, people who leave comments
link |
00:22:25.400
represent a very tiny percent of the population
link |
00:22:28.240
and they're the ones,
link |
00:22:30.640
we tend to think they speak for the population
link |
00:22:33.280
but it's very possible on many topics they don't at all.
link |
00:22:37.240
And look, I'm sure there's gotta be some kind of legal
link |
00:22:42.760
sort of structure in place for when the robots happen.
link |
00:22:45.280
You know way more about this than I do.
link |
00:22:46.680
But for me to just go, the robots are bad.
link |
00:22:49.800
That's a wild generalization
link |
00:22:51.240
that I feel like is really inhumane in some way.
link |
00:22:54.480
Just after the research I've done,
link |
00:22:56.560
like you're gonna tell me that a man whose wife died
link |
00:22:59.320
suddenly and he feels guilty moving on with a human woman
link |
00:23:03.000
or can't get over the grief.
link |
00:23:04.320
He can't have a sex robot in his own house.
link |
00:23:06.600
Why not?
link |
00:23:07.880
Who cares?
link |
00:23:08.840
Why do you care?
link |
00:23:10.040
Well, there's an interesting aspect of human nature.
link |
00:23:12.720
So, you know, we tend to as a civilization
link |
00:23:16.840
to create a group that's the other in all kinds of ways.
link |
00:23:20.640
And so you work with animals too, you're
link |
00:23:24.040
especially sensitive to the suffering of animals.
link |
00:23:26.760
Let me kind of ask, what's your,
link |
00:23:29.480
do you think we'll abuse robots in the future?
link |
00:23:33.920
Do you think some of the darker aspects
link |
00:23:35.920
of human nature will come out?
link |
00:23:37.960
I think some people will,
link |
00:23:39.200
but if we design them properly, the people that do it,
link |
00:23:43.000
we can put it on record
link |
00:23:44.480
and we can put them in jail.
link |
00:23:46.920
We can find sociopaths more easily, you know?
link |
00:23:49.520
But why is that a sociopathic thing to harm a robot?
link |
00:23:53.200
I think, look, I don't know enough
link |
00:23:55.200
about the consciousness and stuff as you do.
link |
00:23:57.920
I guess it would have to be when they're conscious,
link |
00:23:59.840
but it is the part of the brain
link |
00:24:02.840
that is responsible for compassion,
link |
00:24:04.400
the frontal lobe or whatever.
link |
00:24:05.240
Like people that abuse animals also abuse humans
link |
00:24:08.160
and commit other kinds of crimes.
link |
00:24:09.440
Like that's, it's all the same part of the brain.
link |
00:24:11.080
No one abuses animals and then is, like,
link |
00:24:13.440
awesome to women and children
link |
00:24:15.520
and awesome to underprivileged, you know, minorities.
link |
00:24:18.640
Like it's all, so, you know,
link |
00:24:20.480
we've been working really hard to put a database together
link |
00:24:23.000
of all the people that have abused animals.
link |
00:24:24.760
So when they commit another crime, you go, okay,
link |
00:24:26.440
this is, you know, it's all the same stuff.
link |
00:24:29.320
And I think people probably think I'm nuts
link |
00:24:32.360
for the, a lot of the animal work I do,
link |
00:24:34.760
but because when animal abuse is present,
link |
00:24:37.040
another crime is always present,
link |
00:24:38.880
but the animal abuse is the most socially acceptable.
link |
00:24:40.880
You can kick a dog and there's nothing people can do,
link |
00:24:43.920
but then what they're doing behind closed doors,
link |
00:24:46.560
you can't see.
link |
00:24:47.400
So there's always something else going on,
link |
00:24:48.880
which is why I never feel compunction about it.
link |
00:24:50.680
But I do think we'll start seeing the same thing
link |
00:24:52.400
with robots, the person that kicks the,
link |
00:24:55.520
I felt compassion when they kicked the dog robot, it
link |
00:24:59.720
really pissed me off.
link |
00:25:01.760
I know that they're just trying to get the stability right
link |
00:25:04.080
and all that, but I do think there will come a time
link |
00:25:07.360
where that will be a great way to be able to figure out
link |
00:25:10.720
if somebody has, like, you know, antisocial behaviors.
link |
00:25:15.520
You kind of mentioned surveillance.
link |
00:25:18.080
It's also a really interesting idea of yours
link |
00:25:20.000
that you just said, you know,
link |
00:25:21.440
a lot of people seem to be really uncomfortable
link |
00:25:23.440
with surveillance.
link |
00:25:24.280
Yeah.
link |
00:25:25.200
And you just said that, you know what, for me,
link |
00:25:28.560
you know, they're positives for surveillance.
link |
00:25:31.200
I think people behave better
link |
00:25:32.200
when they know they're being watched.
link |
00:25:33.320
And I know this is a very unpopular opinion.
link |
00:25:36.040
I'm talking about it on stage right now.
link |
00:25:38.120
We behave better when we know we're being watched.
link |
00:25:40.360
You and I had a very different conversation
link |
00:25:41.960
before we were recording.
link |
00:25:43.200
We behave different, you sit up,
link |
00:25:46.080
and you are on your best behavior,
link |
00:25:47.560
and I'm trying to sound eloquent,
link |
00:25:49.360
and I'm trying to not hurt anyone's feelings.
link |
00:25:51.160
And I mean, I have a camera right there.
link |
00:25:52.880
I'm behaving totally different
link |
00:25:54.680
than when we first started talking, you know?
link |
00:25:56.960
When you know there's a camera, you behave differently.
link |
00:25:59.400
I mean, there's cameras all over LA at stop lights
link |
00:26:02.720
so that people don't run stop lights,
link |
00:26:04.040
but there's not even film in them.
link |
00:26:05.840
They don't even use them anymore, but it works.
link |
00:26:08.040
It works.
link |
00:26:08.880
Right?
link |
00:26:09.720
And, you know, working on this thing
link |
00:26:10.720
in standup about surveillance,
link |
00:26:12.000
it's like, that's why we believe in Santa Claus.
link |
00:26:14.280
You know, Santa Claus
link |
00:26:15.600
is the first surveillance, basically.
link |
00:26:17.840
All we had to say to kids is he's making a list
link |
00:26:20.440
and he's watching you, and they behave better.
link |
00:26:22.960
That's brilliant.
link |
00:26:23.800
You know, so I do think that there are benefits
link |
00:26:26.160
to surveillance.
link |
00:26:27.440
You know, I think we all do sketchy things in private,
link |
00:26:30.960
and we all have watched weird porn
link |
00:26:33.320
or Googled weird things,
link |
00:26:34.480
and we don't want people to know about it,
link |
00:26:37.080
our secret lives.
link |
00:26:37.920
So I do think that obviously there's,
link |
00:26:40.240
we should be able to have a modicum of privacy,
link |
00:26:42.880
but I tend to think that people that are the most negative
link |
00:26:46.680
about surveillance are the most secret.
link |
00:26:48.280
Have the most to hide.
link |
00:26:49.120
Well, you should,
link |
00:26:52.120
you're saying you're doing bits on it now?
link |
00:26:54.560
Well, I'm just talking in general about,
link |
00:26:56.720
you know, privacy and surveillance
link |
00:26:58.440
and how paranoid we're kind of becoming
link |
00:27:00.320
and how, you know, I mean, it's just wild to me
link |
00:27:03.640
that people are like, our emails are gonna leak
link |
00:27:05.920
and they're taking our phone numbers.
link |
00:27:07.200
Like, there used to be a book
link |
00:27:10.040
full of phone numbers and addresses
link |
00:27:12.680
that they just threw at your door.
link |
00:27:15.560
And we all had a book of everyone's numbers.
link |
00:27:18.080
You know, this is a very new thing.
link |
00:27:20.360
And, you know, I know our amygdala is designed
link |
00:27:22.400
to compound sort of threats and, you know,
link |
00:27:25.360
there's stories about, and I think we all just glom on
link |
00:27:29.280
in a very, you know, tribal way of, yeah,
link |
00:27:31.600
they're taking our data.
link |
00:27:32.440
Like, we don't even know what that means,
link |
00:27:33.720
but we're like, well, yeah, they, they, you know.
link |
00:27:38.080
So I just think that someone's like, okay, well,
link |
00:27:39.760
so what, they're gonna sell your data?
link |
00:27:41.320
Who cares?
link |
00:27:42.160
Why do you care?
link |
00:27:43.200
First of all, that bit will kill in China.
link |
00:27:47.320
So, and I said it as sort of only a little bit joking
link |
00:27:51.080
because a lot of people in China,
link |
00:27:53.320
including the citizens,
link |
00:27:55.200
despite what people in the West think of as abuse,
link |
00:27:59.640
are actually in support of the idea of surveillance.
link |
00:28:03.400
Sort of, they're not in support of the abuse of surveillance,
link |
00:28:06.480
but they're, they like, I mean, the idea of surveillance
link |
00:28:09.440
is kind of like the idea of government.
link |
00:28:13.520
Like you said, we behave differently.
link |
00:28:15.920
And in a way, it's almost like why we like sports.
link |
00:28:18.520
There's rules and within the constraints of the rules,
link |
00:28:22.400
this is a more stable society.
link |
00:28:25.040
And they make good arguments about success,
link |
00:28:28.120
being able to build successful companies,
link |
00:28:30.480
being able to build successful social lives
link |
00:28:32.800
around the fabric that's more stable.
link |
00:28:34.560
When you have surveillance, it keeps the criminals away,
link |
00:28:37.040
it keeps abuse of animals away,
link |
00:28:38.560
whatever the values of the society, with surveillance,
link |
00:28:42.880
you can enforce those values better.
link |
00:28:44.800
And here's what I will say.
link |
00:28:45.920
There's a lot of unethical things happening with surveillance.
link |
00:28:48.600
Like I feel the need to really make that very clear.
link |
00:28:52.080
I mean, the fact that Google is like collecting
link |
00:28:54.080
whether people's hands start moving on the mouse
link |
00:28:55.960
to find out if they're getting Parkinson's
link |
00:28:58.440
and then their insurance goes up,
link |
00:29:00.080
like that is completely unethical and wrong.
link |
00:29:02.200
And I think stuff like that,
link |
00:29:03.360
we have to really be careful around.
link |
00:29:05.880
So the idea of using our data to raise our insurance rates
link |
00:29:08.640
or, you know, I heard that they're looking,
link |
00:29:10.800
they can sort of predict if you're gonna have depression
link |
00:29:13.320
based on your selfies by detecting micro muscles
link |
00:29:16.080
in your face, you know, all that kind of stuff.
link |
00:29:18.280
That is a nightmare, not okay.
link |
00:29:20.040
But I think, you know, we have to delineate
link |
00:29:22.360
what's a real threat and what's getting spam
link |
00:29:25.160
in your email box.
link |
00:29:26.000
That's not what to spend your time and energy on.
link |
00:29:28.600
Focus on the fact that every time you buy cigarettes,
link |
00:29:31.080
your insurance is going up without you knowing about it.
link |
00:29:35.240
On the topic of animals too,
link |
00:29:36.920
can we just linger on it a little bit?
link |
00:29:38.360
Like what do you think,
link |
00:29:41.320
what does it say about our society
link |
00:29:43.360
of the society wide abuse of animals
link |
00:29:45.640
that we see in general, sort of factory farming,
link |
00:29:48.640
just in general, just the way we treat animals
link |
00:29:50.640
of different categories?
link |
00:29:53.600
Like what, what do you think of that?
link |
00:29:58.160
What does a better world look like?
link |
00:29:59.680
What should people think about it in general?
link |
00:30:03.640
I think the most interesting thing I can probably say
link |
00:30:07.360
around this, that's the least emotional,
link |
00:30:09.480
because I'm actually a very non emotional animal person
link |
00:30:11.880
because it's, I think everyone's an animal person.
link |
00:30:14.080
It's just a matter of if it's yours
link |
00:30:15.880
or if you've, you know, been conditioned to go numb.
link |
00:30:18.480
You know, I think it's really a testament
link |
00:30:20.800
to what as a species we are able to be in denial about,
link |
00:30:24.560
mass denial and mass delusion
link |
00:30:26.280
and how we're able to dehumanize and debase groups,
link |
00:30:31.640
you know, World War II in a way in order to conform
link |
00:30:36.760
and find protection in the conforming.
link |
00:30:38.840
So we are also a species who used to go to coliseums
link |
00:30:43.840
and watch elephants and tigers fight to the death.
link |
00:30:47.520
We used to watch human beings be pulled apart
link |
00:30:50.280
in the, that wasn't that long ago.
link |
00:30:53.080
We're also a species who had slaves
link |
00:30:56.880
and it was socially acceptable by a lot of people.
link |
00:30:59.040
People didn't see anything wrong with it.
link |
00:31:00.160
So we're a species that is able to go numb
link |
00:31:02.680
and that is able to dehumanize very quickly
link |
00:31:05.960
and make it the norm.
link |
00:31:08.120
Child labor wasn't that long ago like the idea
link |
00:31:11.360
that now we look back and go, oh yeah, kids,
link |
00:31:13.840
were losing fingers in factories making shoes.
link |
00:31:17.200
Like someone had to come in and make that, you know,
link |
00:31:20.160
so I think it just says a lot about the fact that, you know,
link |
00:31:23.560
we are animals and we are self serving
link |
00:31:25.320
and one of the most success, the most successful species
link |
00:31:29.200
because we are able to debase and degrade
link |
00:31:33.160
and essentially exploit anything that benefits us.
link |
00:31:36.840
I think the pendulum's gonna swing, albeit late.
link |
00:31:40.560
Like I think we're Rome now kind of like,
link |
00:31:42.800
I think we're on the verge of collapse
link |
00:31:44.960
because of our dopamine receptors.
link |
00:31:47.240
Like we are just, I think we're all kind of addicts
link |
00:31:49.560
when it comes to this stuff.
link |
00:31:50.520
Like we don't know when to stop.
link |
00:31:53.360
It's always the buffet.
link |
00:31:54.480
Like, the thing that used to keep us alive,
link |
00:31:56.600
which is killing animals and eating them.
link |
00:31:58.360
Now killing animals and eating them
link |
00:31:59.720
is what's killing us in a way.
link |
00:32:01.200
So it's like we just can't, we don't know when to call it
link |
00:32:04.200
and moderation is not really something
link |
00:32:06.560
that humans have evolved to have yet.
link |
00:32:10.040
So I think it's really just a flaw in our wiring.
link |
00:32:13.600
Do you think we'll look back at this time
link |
00:32:15.240
and see our society as being deeply unethical?
link |
00:32:19.400
Yeah, yeah.
link |
00:32:20.520
I think we'll be embarrassed.
link |
00:32:22.240
Which are the worst parts right now going on?
link |
00:32:24.840
Is it, is it?
link |
00:32:25.680
In terms of animals? Well, I think
link |
00:32:26.600
No, in terms of anything.
link |
00:32:27.800
What's the unethical thing?
link |
00:32:29.160
If we, and it's very hard to just take a step out of it
link |
00:32:32.000
but you just said we used to watch, you know,
link |
00:32:37.320
there's been a lot of cruelty throughout history.
link |
00:32:40.400
What's the cruelty going on now?
link |
00:32:42.160
I think it's going to be pigs.
link |
00:32:44.200
I think it's going to be, I mean, pigs are
link |
00:32:46.360
one of the most emotionally intelligent animals
link |
00:32:48.720
and they have the intelligence of like a three year old
link |
00:32:51.680
and I think we'll look back and be really,
link |
00:32:54.320
they use tools.
link |
00:32:55.160
I mean, they're, I think we have this narrative
link |
00:32:58.440
that they're pigs
link |
00:32:59.760
and they're disgusting and they're dirty
link |
00:33:01.880
and their bacon is so good.
link |
00:33:02.880
I think that we'll look back one day
link |
00:33:04.240
and be really embarrassed about that.
link |
00:33:06.680
Is this just, what's it called, factory farming?
link |
00:33:10.360
So basically mass.
link |
00:33:11.720
Because we don't see it.
link |
00:33:12.560
If you saw, I mean, we do have, I mean,
link |
00:33:14.840
this is probably an evolutionary advantage.
link |
00:33:17.600
We do have the ability to completely pretend something's not happening,
link |
00:33:21.520
something that is so horrific that it overwhelms us
link |
00:33:24.040
and we're able to essentially deny that it's happening.
link |
00:33:27.560
I think if people were to see what goes on in factory farming
link |
00:33:30.520
and also we're really to take in how bad it is for us,
link |
00:33:35.360
you know, we're hurting ourselves first and foremost
link |
00:33:37.160
with what we eat,
link |
00:33:38.440
but that's also a very elitist argument, you know?
link |
00:33:41.280
It's a luxury to be able to complain about meat.
link |
00:33:44.600
It's a luxury to be able to not eat meat.
link |
00:33:46.640
You know, there's very few people because of, you know,
link |
00:33:49.960
how the corporations have set up meat being cheap.
link |
00:33:53.320
You know, it's $2 to buy a Big Mac.
link |
00:33:55.280
It's $10 to buy a healthy meal.
link |
00:33:57.640
You know, that's, I think a lot of people don't have
link |
00:34:00.240
the luxury to even think that way.
link |
00:34:02.280
But I do think that animals in captivity,
link |
00:34:04.200
I think we're going to look back
link |
00:34:05.040
and be pretty grossed out about mammals
link |
00:34:06.960
in captivity, whales, dolphins.
link |
00:34:08.760
I mean, that's already starting to dismantle. Circuses,
link |
00:34:12.200
we're going to be pretty embarrassed about,
link |
00:34:13.960
but I think it's really more a testament to, you know,
link |
00:34:18.280
there's just such an ability to go like,
link |
00:34:22.080
that thing is different than me and we're better.
link |
00:34:25.520
It's the ego.
link |
00:34:26.360
I mean, it's just, we are the species
link |
00:34:27.560
with the biggest ego ultimately.
link |
00:34:29.160
Well, that's what I think, that's my hope for robots
link |
00:34:31.840
is they'll, you mentioned consciousness before,
link |
00:34:34.200
nobody knows what consciousness is,
link |
00:34:37.640
but I'm hoping robots will help us empathize
link |
00:34:42.200
and understand that there's other creatures
link |
00:34:47.400
besides ourselves that can suffer,
link |
00:34:50.320
that can experience the world
link |
00:34:54.800
and that we can torture by our actions.
link |
00:34:57.680
And robots can explicitly teach us that,
link |
00:34:59.880
I think, better than animals can.
link |
00:35:01.480
I have never seen such compassion
link |
00:35:06.480
from a lot of people in my life
link |
00:35:10.840
toward any human, animal, child,
link |
00:35:13.600
as I have from a lot of people in the way they interact
link |
00:35:15.640
with the robot.
link |
00:35:16.600
Because I think there's,
link |
00:35:18.240
I think there's something of,
link |
00:35:19.760
I mean, I was on the robot owner's chat boards
link |
00:35:23.520
for a good eight months.
link |
00:35:25.920
And the main emotional benefit is,
link |
00:35:28.120
she's never going to cheat on you.
link |
00:35:30.360
She's never going to hurt you.
link |
00:35:31.920
She's never going to lie to you.
link |
00:35:33.120
She doesn't judge you.
link |
00:35:34.760
I think that robots help people,
link |
00:35:38.680
and this is part of the work I do with animals,
link |
00:35:40.840
like I do equine therapy and train dogs and stuff,
link |
00:35:42.960
because there is this safe space to be authentic.
link |
00:35:46.240
You're with this being that doesn't care what you do
link |
00:35:48.480
for a living, doesn't care how much money you have,
link |
00:35:50.320
doesn't care who you're dating,
link |
00:35:51.480
doesn't care what you look like,
link |
00:35:52.400
doesn't care if you have cellulite, whatever,
link |
00:35:54.520
you feel safe to be able to truly be present
link |
00:35:57.920
without being defensive and worrying about eye contact
link |
00:36:00.080
and being triggered by needing to be perfect
link |
00:36:02.840
and fear of judgment and all that.
link |
00:36:04.800
And robots really can't judge you yet,
link |
00:36:08.200
but they can't judge you.
link |
00:36:09.240
And I think it really puts people at ease
link |
00:36:13.440
and at their most authentic.
link |
00:36:16.280
Do you think you can have a deep connection
link |
00:36:18.640
with the robot that's not judging or,
link |
00:36:23.200
do you think you can really have a relationship
link |
00:36:25.400
with a robot or a human being that's a safe space
link |
00:36:29.920
or is tension, mystery, danger necessary
link |
00:36:34.240
for a deep connection?
link |
00:36:35.920
I'm gonna speak for myself and say that
link |
00:36:38.560
I grew up in an alcoholic home.
link |
00:36:40.080
I identify as a codependent, talked about this stuff before,
link |
00:36:43.240
but for me, it's very hard to be in a relationship
link |
00:36:45.320
with a human being without feeling like I need to perform
link |
00:36:48.480
in some way or deliver in some way.
link |
00:36:50.720
And I don't know if that's just the people
link |
00:36:51.880
I've been in a relationship with or me or my brokenness,
link |
00:36:56.480
but I do think this is gonna sound
link |
00:37:00.040
really negative and pessimistic,
link |
00:37:04.160
but I do think a lot of our relationships are projection
link |
00:37:07.160
and a lot of our relationships are performance.
link |
00:37:09.600
And I don't think I really understood that
link |
00:37:12.240
until I worked with horses.
link |
00:37:15.240
And most communication with humans is nonverbal, right?
link |
00:37:18.040
I can say like, I love you,
link |
00:37:19.880
but you don't think I love you, right?
link |
00:37:21.960
Whereas with animals, it's very direct.
link |
00:37:24.240
It's all physical, it's all energy.
link |
00:37:26.800
I feel like that with robots too.
link |
00:37:28.480
It feels very,
link |
00:37:32.240
how I say something doesn't matter.
link |
00:37:35.240
My inflection doesn't really matter.
link |
00:37:36.880
And you thinking that my tone is disrespectful,
link |
00:37:40.280
like you're not filtering it through all
link |
00:37:42.120
of the bad relationships you've been in.
link |
00:37:43.760
You're not filtering it through the way your mom talked to you.
link |
00:37:45.840
You're not getting triggered.
link |
00:37:47.720
I find that for the most part,
link |
00:37:49.360
people don't always receive things the way
link |
00:37:51.320
that you intend them to or the way intended,
link |
00:37:53.640
and that makes relationships really murky.
link |
00:37:56.120
So the relationships with animals
link |
00:37:57.480
and relationships with the robots as they are now,
link |
00:38:00.680
you kind of imply that that's more healthy.
link |
00:38:05.200
Can you have a healthy relationship with other humans?
link |
00:38:08.080
Or not healthy, I don't like that word,
link |
00:38:10.120
but shouldn't it be, you've talked about codependency.
link |
00:38:14.440
Maybe you can talk about what is codependency,
link |
00:38:16.640
but is that, is the challenges of that,
link |
00:38:21.640
the complexity of that necessary for passion,
link |
00:38:24.600
for love between humans?
link |
00:38:27.200
That's right, you love passion.
link |
00:38:29.200
That's a good thing.
link |
00:38:31.960
I thought this would be a safe space.
link |
00:38:33.920
I got trolled by Rogan for hours on this.
link |
00:38:40.000
Look, I am not anti passion.
link |
00:38:42.600
I think that I've just maybe been around long enough
link |
00:38:45.280
to know that sometimes it's ephemeral
link |
00:38:48.280
and that passion is a mixture of a lot of different things.
link |
00:38:55.440
Adrenaline, which turns into dopamine,
link |
00:38:57.280
cortisol, it's a lot of neurochemicals.
link |
00:38:59.120
It's a lot of projection.
link |
00:39:01.200
It's a lot of what we've seen in movies.
link |
00:39:03.240
It's a lot of, you know, I identify as an addict.
link |
00:39:06.200
So for me, sometimes passion is like,
link |
00:39:08.560
uh oh, this could be bad.
link |
00:39:10.160
And I think we've been so conditioned to believe
link |
00:39:11.560
that passion means, like, you're soulmates.
link |
00:39:13.120
And I mean, how many times have you had
link |
00:39:14.320
a passionate connection with someone
link |
00:39:15.640
and then it was a total train wreck?
link |
00:39:18.200
Passion.
link |
00:39:19.040
It's a train wreck.
link |
00:39:19.860
How many times exactly?
link |
00:39:20.920
Exactly.
link |
00:39:22.240
What's a train wreck?
link |
00:39:23.080
You just did a lot of math in your head
link |
00:39:24.400
in that little moment.
link |
00:39:25.400
Counting.
link |
00:39:26.560
I mean, what's a train wreck?
link |
00:39:28.640
What's, why is obsession,
link |
00:39:31.560
so you describe this codependency
link |
00:39:33.680
and sort of the idea of attachment, over-attachment
link |
00:39:38.920
to people who don't deserve that kind of attachment
link |
00:39:41.920
as somehow a bad thing.
link |
00:39:45.120
And I think our society says it's a bad thing.
link |
00:39:47.800
It probably is a bad thing.
link |
00:39:49.680
Like a delicious burger is a bad thing.
link |
00:39:52.640
I don't know.
link |
00:39:53.480
Right. Oh, that's a good point.
link |
00:39:54.360
I think that you're pointing out
link |
00:39:55.440
something really fascinating, which is like passion,
link |
00:39:57.280
if you go into it knowing this is like pizza
link |
00:40:00.280
where it's going to be delicious for two hours
link |
00:40:01.920
and then I don't have to have it again for three.
link |
00:40:03.480
If you can have a choice in the passion,
link |
00:40:06.440
I define passion as something that is relatively unmanageable
link |
00:40:09.600
and something you can't control or stop and start
link |
00:40:12.280
with your own volition.
link |
00:40:13.760
So maybe we're operating under different definitions.
link |
00:40:16.360
If passion is something that ruins your marriages
link |
00:40:21.000
and screws up your professional life
link |
00:40:23.240
and becomes this thing that you're not in control of
link |
00:40:27.320
and becomes addictive, I think that's the difference.
link |
00:40:29.880
Is it a choice or is it not a choice?
link |
00:40:32.600
And if it is a choice, then passion's great.
link |
00:40:35.160
But if it's something that consumes you
link |
00:40:37.400
and makes you start making bad decisions
link |
00:40:39.400
and clouds your frontal lobe
link |
00:40:41.200
and is just all about dopamine
link |
00:40:44.080
and not really about the person
link |
00:40:46.200
and more about the neurochemical,
link |
00:40:47.800
we call it sort of the internal drug cabinet.
link |
00:40:50.760
If it's all just you're on drugs, that's different,
link |
00:40:53.320
because sometimes you're just on drugs.
link |
00:40:55.000
Okay, so there's a philosophical question here.
link |
00:40:58.440
So would you rather, and it's interesting for a comedian,
link |
00:41:03.360
a brilliant comedian, to speak so eloquently
link |
00:41:07.560
about a balanced life.
link |
00:41:10.520
I kind of argue against this point.
link |
00:41:12.040
There's such an obsession of creating
link |
00:41:13.520
this healthy lifestyle now, psychologically speaking.
link |
00:41:18.120
You know, I'm a fan of the idea that you sort of fly high
link |
00:41:22.040
and you crash and die at 27 as also a possible life.
link |
00:41:26.480
And it's not one we should judge
link |
00:41:27.960
because I think there's moments of greatness.
link |
00:41:30.680
I talked to Olympic athletes
link |
00:41:32.160
where some of their greatest moments
link |
00:41:34.280
are achieved in the early 20s.
link |
00:41:36.600
And the rest of their life is in the kind of fog
link |
00:41:39.880
of almost of a depression.
link |
00:41:41.600
Because they're based on their physical prowess, right?
link |
00:41:44.280
Physical prowess, and they'll never sort of top that.
link |
00:41:47.240
So they're watching their physical prowess fade
link |
00:41:50.240
and they'll never achieve the kind of height,
link |
00:41:54.720
not just physical, of just emotion, of...
link |
00:41:58.640
The max number of neurochemicals.
link |
00:42:01.800
And you also put your money on the wrong horse.
link |
00:42:04.720
That's where I would just go like,
link |
00:42:06.440
oh yeah, if you're doing a job where you peak at 22,
link |
00:42:10.200
the rest of your life is going to be hard.
link |
00:42:12.400
That idea is considering the notion
link |
00:42:15.240
that you want to optimize some kind of,
link |
00:42:17.600
but we're all going to die soon.
link |
00:42:19.400
What?
link |
00:42:21.920
Now you tell me.
link |
00:42:23.360
I have immortalized myself, so I'm going to be fine.
link |
00:42:26.880
See, you're almost like,
link |
00:42:28.360
how many Oscar winning movies can I direct
link |
00:42:32.240
by the time I'm 100?
link |
00:42:34.160
How many this and that?
link |
00:42:35.880
But it's all, life is short, relatively speaking.
link |
00:42:41.280
I know, but it can also come a different way.
link |
00:42:42.680
You go, life is short, play hard,
link |
00:42:45.200
fall in love as much as you can, run into walls.
link |
00:42:47.720
I would also go, life is short.
link |
00:42:49.480
Don't deplete yourself on things that aren't sustainable
link |
00:42:53.840
and that you can't keep.
link |
00:42:55.920
Yeah.
link |
00:42:56.760
So I think everyone gets dopamine from different places.
link |
00:42:59.800
Everyone has meaning from different places.
link |
00:43:01.800
I look at the fleeting, passionate relationships
link |
00:43:04.600
I've had in the past and I don't have pride in them.
link |
00:43:07.960
I think that you have to decide
link |
00:43:09.080
what helps you sleep at night.
link |
00:43:11.200
For me, it's pride and feeling like I behave
link |
00:43:13.560
with grace and integrity.
link |
00:43:14.600
That's just me personally.
link |
00:43:16.120
Everyone can go like, yeah,
link |
00:43:18.240
slept with all the hot chicks in Italy I could
link |
00:43:21.040
and I did all the whatever you value.
link |
00:43:25.120
We're allowed to value different things.
link |
00:43:26.680
We're talking about Brian Callan.
link |
00:43:28.200
Brian Callan has lived his life to the fullest,
link |
00:43:32.640
to say the least,
link |
00:43:33.640
but I think that it's just for me personally,
link |
00:43:36.480
I, and this could be like my workaholism
link |
00:43:38.920
or my achievementism.
link |
00:43:42.720
If I don't have something to show for something,
link |
00:43:45.320
I feel like it's a waste of time or some kind of loss.
link |
00:43:50.360
I'm in a 12 step program and the third step would say
link |
00:43:52.680
there's no such thing as waste of time
link |
00:43:54.200
and everything happens exactly as it should
link |
00:43:56.920
and whatever, that's a way to just sort of keep us sane
link |
00:43:59.560
so we don't grieve too much and beat ourselves up
link |
00:44:01.880
over past mistakes.
link |
00:44:03.560
There's no such thing as mistakes, da, da, da.
link |
00:44:05.880
But I think passion is,
link |
00:44:09.040
I think it's so life affirming and one of the few things
link |
00:44:11.800
that maybe, for people like us, makes us feel awakened and seen
link |
00:44:14.960
and we just have such a high threshold for adrenaline.
link |
00:44:20.600
You know, I mean, you are a fighter, right?
link |
00:44:22.880
Yeah, okay, so yeah,
link |
00:44:24.160
so you have a very high tolerance for adrenaline.
link |
00:44:28.840
And I think that Olympic athletes,
link |
00:44:30.480
the amount of adrenaline they get from performing,
link |
00:44:33.720
it's very hard to follow that.
link |
00:44:34.800
It's like when guys come back from the military
link |
00:44:36.600
and they have depression,
link |
00:44:38.200
it's like, do you miss bullets flying at you?
link |
00:44:40.960
Kind of, because of that adrenaline
link |
00:44:42.920
which turned into dopamine and the camaraderie.
link |
00:44:45.200
I mean, there's people that speak much better
link |
00:44:46.600
about this than I do, but I just,
link |
00:44:49.160
I'm obsessed with neurology and I'm just obsessed
link |
00:44:51.760
with sort of the lies we tell ourselves
link |
00:44:54.200
in order to justify getting neurochemicals.
link |
00:44:57.160
You've actually done quite,
link |
00:44:59.080
done a lot of thinking and talking about neurology,
link |
00:45:01.960
and just kind of looked at human behavior
link |
00:45:04.240
through the lens of looking at how
link |
00:45:07.360
our brain actually works chemically.
link |
00:45:09.200
So what, first of all, why did you connect with that idea
link |
00:45:13.960
and what have you, how has your view of the world changed
link |
00:45:17.600
by considering that the brain is just a machine?
link |
00:45:22.480
You know, I know it probably sounds really nihilistic,
link |
00:45:24.600
but for me, it's very liberating
link |
00:45:26.640
to know a lot about neurochemicals
link |
00:45:28.520
because you don't have to,
link |
00:45:30.120
it's like the same thing with like,
link |
00:45:31.600
like critics, like critical reviews.
link |
00:45:33.960
If you believe the good,
link |
00:45:34.800
you have to believe the bad kind of thing.
link |
00:45:36.160
Like, you know, if you believe that your bad choices
link |
00:45:39.000
were because of your moral integrity
link |
00:45:42.960
or whatever, you have to believe your good ones.
link |
00:45:44.800
I just think there's something really liberating
link |
00:45:46.400
and going like, oh, that was just adrenaline.
link |
00:45:48.120
I just said that thing
link |
00:45:48.960
because I was adrenalized and I was scared
link |
00:45:50.760
and my amygdala was activated
link |
00:45:52.160
and that's why I said you're an asshole and get out.
link |
00:45:54.480
And that's, you know, I think,
link |
00:45:56.000
I just think it's important to delineate what's nature
link |
00:45:58.080
and what's nurture, what is your choice
link |
00:45:59.920
and what is just your brain trying to keep you safe.
link |
00:46:02.120
I think we forget that even though we have security systems
link |
00:46:04.640
and homes and locks on our doors,
link |
00:46:06.440
that our brain for the most part
link |
00:46:07.640
is just trying to keep us safe all the time.
link |
00:46:09.320
It's why we hold grudges, it's why we get angry,
link |
00:46:11.440
it's why we get road rage,
link |
00:46:13.000
it's why we do a lot of things.
link |
00:46:14.840
And it's also, when I started learning about neurology,
link |
00:46:17.320
I started having so much more compassion for other people.
link |
00:46:19.720
You know, if someone yelled at me,
link |
00:46:21.120
being like, fuck you on the road,
link |
00:46:22.640
I'd be like, okay, he's producing adrenaline right now
link |
00:46:24.680
because we're all going 65 miles an hour
link |
00:46:27.760
and our brains aren't really designed
link |
00:46:30.360
for this type of stress and he's scared.
link |
00:46:33.360
He was scared, you know,
link |
00:46:34.200
so that really helped me to have more love for people
link |
00:46:36.960
in my everyday life instead of being in fight or flight mode.
link |
00:46:41.080
But the, I think more interesting answer to your question
link |
00:46:44.200
is that I've had migraines my whole life,
link |
00:46:45.760
like I've suffered with really intense migraines,
link |
00:46:49.080
ocular migraines, ones where my arm would go numb
link |
00:46:52.400
and I just started having to go to so many doctors
link |
00:46:55.040
to learn about it.
link |
00:46:56.280
And I started, you know, learning that
link |
00:46:59.080
we don't really know that much, we know a lot,
link |
00:47:01.800
but it's wild to go into one of the best neurologists
link |
00:47:04.400
in the world who's like, yeah, we don't know.
link |
00:47:05.920
We don't know.
link |
00:47:06.760
We don't know.
link |
00:47:07.600
And that fascinated me.
link |
00:47:08.680
And it's one of the worst pains
link |
00:47:09.840
you can probably have, all that stuff.
link |
00:47:11.560
And we don't know the source.
link |
00:47:13.320
We don't know the source.
link |
00:47:14.360
And there is something really fascinating about
link |
00:47:17.240
when your left arm starts going numb
link |
00:47:19.520
and you start not being able to see
link |
00:47:21.040
out of the left side of both your eyes.
link |
00:47:22.840
And I remember when the migraines get really bad,
link |
00:47:25.360
it's like a mini stroke almost,
link |
00:47:26.880
and I'm able to see words on a page,
link |
00:47:29.880
but I can't read them.
link |
00:47:31.240
They just look like symbols to me.
link |
00:47:33.080
So there's something just really fascinating to me
link |
00:47:35.000
about your brain just being able to stop functioning.
link |
00:47:38.280
And so I just wanted to learn about it, study about it.
link |
00:47:41.640
I did all these weird alternative treatments.
link |
00:47:43.360
I got this piercing in here that actually works.
link |
00:47:45.880
I've tried everything.
link |
00:47:47.000
And then both of my parents had strokes.
link |
00:47:49.200
So when both of my parents had strokes,
link |
00:47:51.040
I became sort of the person who had to decide
link |
00:47:54.160
what was gonna happen with their recovery,
link |
00:47:56.680
which is just a wild thing to have to deal with.
link |
00:47:59.440
I was 28 years old when it happened.
link |
00:48:02.120
And I started spending basically all day every day
link |
00:48:05.120
in ICU's with neurologists learning
link |
00:48:07.560
about what happened to my dad's brain
link |
00:48:09.080
and why he can't move his left arm,
link |
00:48:11.160
but he can move his right leg,
link |
00:48:12.520
but he can't see out of the,
link |
00:48:14.200
and then my mom had another stroke
link |
00:48:17.040
in a different part of the brain.
link |
00:48:18.080
So I started having to learn
link |
00:48:19.680
what parts of the brain did what
link |
00:48:21.520
and so that I wouldn't take their behavior so personally.
link |
00:48:23.880
And so that I would be able to manage my expectations
link |
00:48:26.000
in terms of their recovery.
link |
00:48:27.440
So my mom, because it affected a lot of her frontal lobe,
link |
00:48:31.280
changed a lot as a person.
link |
00:48:33.120
She was way more emotional.
link |
00:48:34.600
She was way more micromanaging.
link |
00:48:35.800
She was forgetting certain things.
link |
00:48:37.000
So it broke my heart less when I was able to know,
link |
00:48:40.320
oh yeah, well, the stroke just hit this part of the brain.
link |
00:48:42.080
And that's the one that's responsible for short term memory.
link |
00:48:44.240
And that's responsible for long term memory, da, da, da.
link |
00:48:46.880
And then my brother just got something called viral
link |
00:48:49.320
encephalitis, which is an infection inside the brain.
link |
00:48:53.320
So it was kind of wild that I was able to go,
link |
00:48:56.280
oh, I know exactly what's happening here.
link |
00:48:57.640
And I know, you know, so.
link |
00:48:59.760
So that allows you to have some more compassion
link |
00:49:02.440
for the struggles that people have.
link |
00:49:04.440
But does it take away some of the magic
link |
00:49:06.600
from
link |
00:49:08.840
some of the more positive experiences of life?
link |
00:49:11.960
Sometimes, and I don't, I don't,
link |
00:49:13.400
I'm such a control addict that, you know,
link |
00:49:16.480
I think our biggest, someone like me,
link |
00:49:19.360
my biggest dream is to know why someone's doing it.
link |
00:49:21.000
That's what stand up is.
link |
00:49:22.240
It's just trying to figure out why,
link |
00:49:23.440
or that's what writing is.
link |
00:49:24.280
That's what acting is.
link |
00:49:25.120
That's what performing is.
link |
00:49:25.960
It's trying to figure out why someone would do something.
link |
00:49:27.480
As an actor, you get a piece of, you know, material
link |
00:49:30.040
and you go, this person, why would he say that?
link |
00:49:32.080
Why would she pick up that cup?
link |
00:49:33.760
Why would she walk over here?
link |
00:49:35.040
It's really why, why, why, why?
link |
00:49:36.560
So I think neurology is, if you're trying to figure out
link |
00:49:39.600
human motives and why people do what they do,
link |
00:49:41.520
it'd be crazy not to understand
link |
00:49:44.000
how neurochemicals motivate us.
link |
00:49:46.080
I also have a lot of addiction in my family
link |
00:49:48.080
and hardcore drug addiction and mental illness.
link |
00:49:51.480
And in order to cope with it,
link |
00:49:53.720
you really have to understand it,
link |
00:49:54.760
borderline personality disorder, schizophrenia
link |
00:49:57.160
and drug addiction.
link |
00:49:58.360
So I have a lot of people I love
link |
00:50:00.640
that suffer from drug addiction and alcoholism.
link |
00:50:02.840
And the first thing they started teaching you
link |
00:50:04.760
is it's not a choice.
link |
00:50:05.880
These people's dopamine receptors
link |
00:50:07.360
don't hold dopamine the same ways yours do.
link |
00:50:09.640
Their frontal lobe is underdeveloped, like, you know,
link |
00:50:13.200
and that really helped me to navigate dealing,
link |
00:50:17.160
loving people that were addicted to substances.
link |
00:50:20.240
I want to be careful with this question,
link |
00:50:22.600
but how much?
link |
00:50:24.240
Money do you have?
link |
00:50:25.320
How much?
link |
00:50:26.160
Can I borrow 10 dollars?
link |
00:50:30.920
Okay.
link |
00:50:33.160
No, it's how much control,
link |
00:50:36.520
how much despite the chemical imbalances
link |
00:50:39.760
or the biological limitations
link |
00:50:42.920
that each of our individual brains have,
link |
00:50:44.480
how much mind over matter is there?
link |
00:50:47.080
So, you know, I've known people
link |
00:50:50.840
with clinical depression.
link |
00:50:53.160
And so it's always a touchy subject
link |
00:50:55.560
to say how much they can really help it.
link |
00:50:57.640
Very.
link |
00:50:59.640
What can you, yeah, what can you,
link |
00:51:01.680
because you've talked about codependency,
link |
00:51:03.520
you've talked about issues that you've struggled through.
link |
00:51:07.360
And nevertheless, you choose to take a journey
link |
00:51:09.840
of healing and so on.
link |
00:51:11.120
So that's your choice, that's your actions.
link |
00:51:14.200
So how much can you do to help fight
link |
00:51:16.680
the limitations of the neurochemicals in your brain?
link |
00:51:20.000
That's such an interesting question.
link |
00:51:21.800
And I don't think I'm at all qualified to answer,
link |
00:51:23.440
but I'll say what I do know.
link |
00:51:25.560
And really quick, just the definition of codependency,
link |
00:51:28.200
I think a lot of people think of codependency
link |
00:51:29.920
as like two people that can't stop hanging out,
link |
00:51:32.720
you know, or like, you know, that's not totally off,
link |
00:51:36.640
but I think for the most part,
link |
00:51:38.280
my favorite definition of codependency
link |
00:51:39.960
is the inability to tolerate the discomfort of others.
link |
00:51:42.920
You grow up in an alcoholic home,
link |
00:51:44.040
you grow up around mental illness,
link |
00:51:45.200
you grow up in chaos,
link |
00:51:46.480
you have a parent that's a narcissist,
link |
00:51:48.280
you basically are wired to just people-please,
link |
00:51:50.800
worry about others, be perfect,
link |
00:51:53.720
walk on eggshells, shapeshift to accommodate other people.
link |
00:51:56.680
So codependence is a very active wiring issue
link |
00:52:01.760
that, you know, doesn't just affect your romantic
link |
00:52:05.000
relationships, it affects you being a boss,
link |
00:52:07.040
it affects you in the world.
link |
00:52:09.560
Online, you know, you get one negative comment
link |
00:52:12.040
and it throws you for two weeks.
link |
00:52:14.160
You know, it also is linked to eating disorders
link |
00:52:16.160
and other kinds of addiction.
link |
00:52:17.120
So it's a very big thing.
link |
00:52:20.200
And I think a lot of people sometimes only think
link |
00:52:22.000
that it's in romantic relationships,
link |
00:52:23.520
so I always feel the need to say that.
link |
00:52:25.960
And also one of the reasons,
link |
00:52:26.920
I love the idea of robots so much
link |
00:52:28.520
because you don't have to walk on eggshells around them,
link |
00:52:30.840
you don't have to worry they're gonna get mad at you, yet,
link |
00:52:33.280
but there's no, codependents are hypersensitive
link |
00:52:36.920
to the needs and moods of others.
link |
00:52:39.560
And it's very exhausting, it's depleting.
link |
00:52:42.120
Just one conversation about where we're gonna go to dinner
link |
00:52:45.400
is like, do you wanna go get Chinese food?
link |
00:52:47.240
We just had Chinese food.
link |
00:52:48.400
Well, wait, are you mad?
link |
00:52:50.120
Well, no, I didn't mean to.
link |
00:52:51.080
And it's just like, codependents live in this,
link |
00:52:54.920
everything means something
link |
00:52:56.640
and humans can be very emotionally exhausting.
link |
00:53:00.120
Why did you look at me that way?
link |
00:53:01.160
What are you thinking about?
link |
00:53:02.000
What was that?
link |
00:53:02.840
Why did you take your phone?
link |
00:53:03.680
It's just, it's a hypersensitivity
link |
00:53:05.040
that can be incredibly time consuming,
link |
00:53:07.920
which is why I love the idea of robots just subbing in.
link |
00:53:10.760
Even, I've had a hard time running TV shows and stuff
link |
00:53:13.840
because even asking someone to do something,
link |
00:53:15.360
I don't wanna come off like a bitch.
link |
00:53:16.560
I'm very concerned about what other people think of me,
link |
00:53:18.720
how I'm perceived, which is why I think robots
link |
00:53:21.680
will be very beneficial for codependents.
link |
00:53:23.920
By the way, just the real quick tangent,
link |
00:53:25.680
that skill or flaw, whatever you wanna call it,
link |
00:53:29.200
is actually really useful for,
link |
00:53:30.880
if you ever do start your own podcast for interviewing
link |
00:53:34.680
because you're now kind of obsessed
link |
00:53:36.520
about the mindset of others.
link |
00:53:39.240
And it makes you a good sort of listener and talker.
link |
00:53:43.600
So I think, what's her name from NPR?
link |
00:53:48.240
Terry Gross?
link |
00:53:49.080
Terry Gross talked about having that.
link |
00:53:51.000
So...
link |
00:53:51.840
I don't feel like she has that at all.
link |
00:53:53.600
What?
link |
00:53:56.240
What?
link |
00:53:58.280
She worries about other people's feelings.
link |
00:54:00.760
Yeah, absolutely.
link |
00:54:01.800
Oh, I don't get that at all.
link |
00:54:03.720
I mean, you have to put yourself in the mind
link |
00:54:05.280
of the person you're speaking with.
link |
00:54:07.200
Yes, oh, I see, just in terms of, yeah,
link |
00:54:09.120
I am starting a podcast and the reason I haven't
link |
00:54:11.440
is because I'm codependent and I'm too worried
link |
00:54:13.040
that it's not gonna be perfect.
link |
00:54:14.400
So a big codependent adage is perfectionism
link |
00:54:18.280
leads to procrastination, which leads to paralysis.
link |
00:54:20.360
So how do you, sorry to take a million tangents,
link |
00:54:22.320
how do you survive in social media?
link |
00:54:23.600
Because you're exceptionally active.
link |
00:54:25.200
But by the way, I took you on a tangent
link |
00:54:26.640
and didn't answer your last question
link |
00:54:27.880
about how much we can control.
link |
00:54:30.000
How much, yeah, we'll return to it, or maybe not.
link |
00:54:33.000
The answer is we can't.
link |
00:54:33.840
Now as a codependent, I'm worried, okay, go ahead.
link |
00:54:36.280
We can, but one of the things that I'm fascinated by
link |
00:54:39.680
is the first thing you learn when you go
link |
00:54:41.360
into 12 step programs or addiction recovery or any of this
link |
00:54:44.240
is genetics loads the gun, environment pulls the trigger.
link |
00:54:47.880
And there's certain parts of your genetics
link |
00:54:50.600
you cannot control.
link |
00:54:51.600
I come from a lot of alcoholism.
link |
00:54:54.280
I come from a lot of mental illness.
link |
00:54:59.920
There's certain things I cannot control
link |
00:55:01.720
and a lot of things that maybe we don't even know yet
link |
00:55:04.040
what we can and can't
link |
00:55:04.880
because of how little we actually know about the brain.
link |
00:55:06.760
But we also talk about the warrior spirit.
link |
00:55:08.640
And there are some people that have that warrior spirit
link |
00:55:12.120
and we don't necessarily know what that engine is.
link |
00:55:15.320
Whether it's you get dopamine from succeeding
link |
00:55:18.040
or achieving or martyring yourself
link |
00:55:21.200
or that attention you get from growing.
link |
00:55:24.960
So a lot of people are like,
link |
00:55:25.800
oh, this person can edify themselves and overcome.
link |
00:55:29.120
But if you're getting attention from improving yourself,
link |
00:55:32.320
you're gonna keep wanting to do that.
link |
00:55:34.560
So that is something that helps a lot
link |
00:55:37.240
in terms of changing your brain.
link |
00:55:38.600
So you talk to people about changing your brain
link |
00:55:40.400
and talk about what you're doing to overcome said obstacles.
link |
00:55:42.880
You're gonna get more attention from them
link |
00:55:44.560
which is gonna fire off your reward system
link |
00:55:46.800
and then you're gonna keep doing it.
link |
00:55:48.240
Yeah, so you can leverage that momentum.
link |
00:55:50.280
So this is why in any 12 step program,
link |
00:55:52.680
you go into a room and you talk about your progress
link |
00:55:55.160
because then everyone claps for you.
link |
00:55:57.080
And then you're more motivated to keep going.
link |
00:55:58.760
So that's why we say you're only as sick
link |
00:56:00.160
as the secrets you keep.
link |
00:56:01.200
Because if you keep things secret,
link |
00:56:03.640
there's no one guiding you to go in a certain direction.
link |
00:56:06.080
It's biological, right?
link |
00:56:07.080
We're sort of designed to get approval from the tribe
link |
00:56:10.400
or from a group of people
link |
00:56:11.640
because our brain, you know, translates it to safety.
link |
00:56:14.640
So, you know.
link |
00:56:15.480
And in that case, the tribe is a positive one
link |
00:56:17.720
that helps you go the positive direction.
link |
00:56:19.560
So that's why it's so important to go into a room
link |
00:56:21.240
and also say, hey, I wanted to use drugs today
link |
00:56:25.080
and people go, hmm, they go, me too.
link |
00:56:27.440
And you feel less alone
link |
00:56:28.600
and you feel less like you're, you know,
link |
00:56:30.400
have been castigated from the pack or whatever.
link |
00:56:32.760
And then you say, and I didn't.
link |
00:56:34.200
you get a chip when you haven't drank for 30 days
link |
00:56:36.440
or 60 days or whatever, you get little rewards.
link |
00:56:38.640
So talking about a pack that's not at all healthy or good,
link |
00:56:43.240
but in fact is often toxic, social media.
link |
00:56:46.280
So you're one of my favorite people on Twitter
link |
00:56:48.200
and Instagram to sort of just both the comedy
link |
00:56:52.480
and the insight and just fun.
link |
00:56:54.480
How do you prevent social media
link |
00:56:55.800
from destroying your mental health?
link |
00:56:57.280
I haven't.
link |
00:56:59.600
I haven't.
link |
00:57:00.440
It's the next big epidemic, isn't it?
link |
00:57:03.200
I don't think I have.
link |
00:57:06.600
I don't, I don't think.
link |
00:57:08.680
Is moderation the answer?
link |
00:57:09.960
What?
link |
00:57:10.800
Maybe, but you can do a lot of damage in a moderate way.
link |
00:57:14.320
I mean, I guess, again, it depends on your goals, you know?
link |
00:57:17.120
And, and I think for me,
link |
00:57:19.600
the way that my addiction to social media,
link |
00:57:21.800
I'm happy to call it an addiction.
link |
00:57:23.080
I mean, and I define it as an addiction
link |
00:57:24.960
because it stops being a choice.
link |
00:57:26.280
There are times I just reach over and I'm like, that was.
link |
00:57:29.240
Yeah, that was weird.
link |
00:57:30.080
That was weird.
link |
00:57:31.360
I'll be driving sometimes.
link |
00:57:32.320
I'll be like, oh my God,
link |
00:57:33.720
my arm just went to my phone, you know?
link |
00:57:36.800
I can put it down.
link |
00:57:37.800
I can take time away from it.
link |
00:57:39.000
But when I do, I get antsy.
link |
00:57:41.400
I get restless, irritable and discontent.
link |
00:57:43.400
I mean, that's kind of the definition, isn't it?
link |
00:57:45.840
So I think by no means
link |
00:57:48.680
do I have a healthy relationship with social media.
link |
00:57:50.480
I'm sure there's a way to,
link |
00:57:51.560
but I think I'm especially a weirdo in this space
link |
00:57:54.800
because it's easy to conflate, is this work, is this not?
link |
00:57:58.800
I can always say that it's for work, you know?
link |
00:58:01.960
But I mean, don't you get the same kind of thing
link |
00:58:04.160
as you get from when a room full of people laugh at your jokes?
link |
00:58:08.080
Because, I mean, I see, especially the way you do Twitter,
link |
00:58:11.200
it's an extension of your comedy in a way.
link |
00:58:13.680
I took a big break from Twitter though, a really big break.
link |
00:58:16.720
I took like six months off or something for a while
link |
00:58:19.160
because it was just like,
link |
00:58:20.480
it seemed like it was all kind of politics
link |
00:58:22.240
and it was just a little bit,
link |
00:58:23.280
it wasn't giving me dopamine
link |
00:58:25.000
because there was like this weird, a lot of feedback.
link |
00:58:28.280
So I had to take a break from it and then go back to it
link |
00:58:30.760
because I felt like I didn't have a healthy relationship.
link |
00:58:33.480
Have you ever tried the, I don't know if I believe him,
link |
00:58:36.160
but Joe Rogan seems to not read comments.
link |
00:58:39.440
Have you, and he's one of the only people at the scale,
link |
00:58:42.560
like at your level, who at least claims not to read them.
link |
00:58:47.840
So like, because you and him swim in this space
link |
00:58:51.760
of tense ideas that get the toxic folks riled up.
link |
00:58:56.760
I think Rogan, I don't, I don't know, I don't,
link |
00:59:00.080
I think he probably looks at YouTube, like the likes and
link |
00:59:05.080
that, you know, I think if some things, if he doesn't know,
link |
00:59:07.880
I don't know, I'm sure he would tell the truth, you know?
link |
00:59:11.880
I'm sure he's got people that look at them
link |
00:59:14.240
and is like, this did great, or they don't, you know,
link |
00:59:16.840
like I'm sure he gets it, you know,
link |
00:59:19.160
I can't picture him, like, in the weeds on it.
link |
00:59:21.520
No, for sure.
link |
00:59:22.360
I mean, I believe he's honestly actually saying that.
link |
00:59:24.200
I just, it's, it's a, it's admirable.
link |
00:59:27.280
We're addicted to feedback.
link |
00:59:28.120
Yeah, we're addicted to feedback.
link |
00:59:28.960
I mean, you know, look, like I think that our brain
link |
00:59:31.880
is designed to get intel on how we're perceived
link |
00:59:36.080
so that we know where we stand, right?
link |
00:59:38.320
That's our whole deal, right?
link |
00:59:39.680
As humans, we want to know where we stand.
link |
00:59:41.440
We walk into a room and we go,
link |
00:59:42.480
who's the most powerful person in here?
link |
00:59:43.920
I got to talk to them and get in their good graces.
link |
00:59:45.840
It's just, we're designed to rank ourselves, right?
link |
00:59:48.280
And constantly know our rank. And social media,
link |
00:59:51.200
because you can't figure out your rank
link |
00:59:54.440
with 500 million people, it's not possible, you know?
link |
00:59:58.080
So our brain is like, what's my rank?
link |
00:59:59.400
What's my rank?
link |
01:00:00.240
And especially for following people.
link |
01:00:01.480
I think the big, the, the interesting thing I think
link |
01:00:04.040
I may be able to say about this
link |
01:00:06.520
besides my speech impediment is that
link |
01:00:08.880
I did start muting people that rank wildly higher than me
link |
01:00:14.920
because it is just stressful on the brain
link |
01:00:17.800
to constantly look at people that are,
link |
01:00:20.640
that are incredibly successful.
link |
01:00:23.080
So you keep feeling bad about yourself, you know?
link |
01:00:25.320
I think that that is like cutting to a certain extent.
link |
01:00:28.640
Just like, look at me looking at all these people
link |
01:00:30.960
that have so much more money than me
link |
01:00:32.080
and so much more success than me.
link |
01:00:33.800
It's making me feel like a failure,
link |
01:00:35.880
even though I don't think I'm a failure,
link |
01:00:37.640
but it's easy to frame it so that I can feel that way.
link |
01:00:41.920
Yeah, that's really interesting.
link |
01:00:43.280
Especially if they're close to,
link |
01:00:45.120
like if they're other comedians or something like that
link |
01:00:46.960
or whatever, that's really disappointing to me.
link |
01:00:50.480
I do the same thing as well.
link |
01:00:51.760
So other successful people that are really close
link |
01:00:53.600
to what I do, it, I don't know.
link |
01:00:56.400
I wish I could just admire.
link |
01:00:58.240
Yeah.
link |
01:00:59.080
And for it not to be a distraction, but...
link |
01:01:01.360
But that's why you are where you are
link |
01:01:02.440
because you don't just admire, you're competitive
link |
01:01:04.360
and you want to win.
link |
01:01:05.280
So it's also, the same thing that bums you out
link |
01:01:07.520
when you look at this is the same reason you are
link |
01:01:09.120
where you are.
link |
01:01:09.960
So that's why I think it's so important to learn
link |
01:01:11.760
about neurology and addiction
link |
01:01:12.800
because you're able to go like,
link |
01:01:14.080
oh, this same instinct, so I'm very sensitive.
link |
01:01:17.160
And I sometimes don't like that about myself,
link |
01:01:19.480
but I'm like, well, that's the reason I'm able to
link |
01:01:21.640
write good standup.
link |
01:01:22.480
And that's the reason,
link |
01:01:23.320
and that's the reason I'm able to be sensitive to feedback
link |
01:01:25.680
and go, that joke should have been better.
link |
01:01:26.880
I can make that better.
link |
01:01:28.040
So it's the kind of thing where it's like,
link |
01:01:29.480
you have to be really sensitive in your work.
link |
01:01:31.240
And the second you leave,
link |
01:01:32.360
you got to be able to turn it off.
link |
01:01:33.560
It's about developing the muscle,
link |
01:01:34.840
being able to know when to let it be a superpower
link |
01:01:38.360
and when it's gonna hold you back and be an obstacle.
link |
01:01:41.240
So I try to not be in that black and white of like,
link |
01:01:44.320
you know, being competitive is bad
link |
01:01:45.800
or being jealous of someone just to go like,
link |
01:01:47.720
oh, there's that thing that makes me really successful
link |
01:01:50.240
in a lot of other ways,
link |
01:01:51.440
but right now it's making me feel bad.
link |
01:01:53.280
Well, I'm kind of looking to you
link |
01:01:54.960
because you're basically a celebrity,
link |
01:01:58.040
a famous sort of world class comedian.
link |
01:02:01.240
And so I feel like you're the right person
link |
01:02:03.120
to be one of the key people to define
link |
01:02:06.120
what's the healthy path forward with social media.
link |
01:02:11.280
Because we're all trying to figure it out now.
link |
01:02:12.800
And it's, I'm curious to see where it evolves.
link |
01:02:16.320
I think you're at the center of that.
link |
01:02:17.920
So like, you know, there's trying to leave Twitter
link |
01:02:21.600
and then come back and see,
link |
01:02:22.760
can I do this in a healthy way?
link |
01:02:24.040
You mean you have to keep trying, exploring, and thinking about it.
link |
01:02:25.880
I have to know because it's being, you know,
link |
01:02:28.120
I have a couple of answers.
link |
01:02:29.720
I think, you know, I hire a company
link |
01:02:31.560
to do some of my social media for me, you know?
link |
01:02:33.880
So it's also being able to go,
link |
01:02:35.960
okay, I make a certain amount of money by doing this,
link |
01:02:38.360
but now let me be a good business person
link |
01:02:40.320
and say I'm gonna pay you this amount to run this for me.
link |
01:02:42.960
So I'm not 24/7 in the weeds,
link |
01:02:44.600
hashtagging and responding and just, it's a lot to take on.
link |
01:02:47.280
It's a lot of energy to take on,
link |
01:02:48.800
but at the same time part of what I think makes me successful
link |
01:02:52.320
in social media, if I am,
link |
01:02:53.520
is that people know I'm actually doing it
link |
01:02:55.320
and that I am engaging and I'm responding
link |
01:02:57.200
and developing a personal relationship
link |
01:02:59.600
with complete strangers.
link |
01:03:01.080
So I think, you know, figuring out that balance
link |
01:03:04.000
and really approaching it as a business, you know,
link |
01:03:06.200
that's what I try to do.
link |
01:03:07.320
It's not dating.
link |
01:03:08.360
It's not, I try to just be really objective about,
link |
01:03:11.200
okay, here's what's working, here's what's not working.
link |
01:03:13.440
And in terms of taking the break from Twitter,
link |
01:03:15.880
this is a really savage take,
link |
01:03:17.640
but because I don't talk about my politics publicly,
link |
01:03:21.720
being on Twitter right after the last election
link |
01:03:26.040
was not gonna be beneficial
link |
01:03:27.880
because there was gonna be, you had to take a side.
link |
01:03:30.280
You had to be political
link |
01:03:31.560
in order to get any kind of retweets or likes.
link |
01:03:34.400
And I just wasn't interested in doing that
link |
01:03:37.240
because you were gonna lose as many people
link |
01:03:38.720
as you were gonna gain
link |
01:03:39.560
and it was gonna all come out in the wash.
link |
01:03:40.840
So I was just like the best thing I can do
link |
01:03:42.760
for me business wise is to just abstain, you know?
link |
01:03:47.800
And, you know, the robot,
link |
01:03:49.920
I joke about her replacing me,
link |
01:03:52.200
but she does do half of my social media, you know?
link |
01:03:55.720
Because I don't want people to get sick of me.
link |
01:03:57.920
I don't want to be redundant.
link |
01:03:59.800
There are times when I don't have the time or the energy
link |
01:04:02.440
to make a funny video,
link |
01:04:03.320
but I know she's gonna be compelling and interesting
link |
01:04:06.160
and that's something that you can't see every day, you know?
link |
01:04:08.520
Of course the humor comes from your,
link |
01:04:11.920
I mean, the cleverness, the wit,
link |
01:04:13.400
the humor comes from you when you film the robot.
link |
01:04:16.400
That's kind of the trick of it.
link |
01:04:17.840
I mean, the robot is not quite there
link |
01:04:21.000
to make anything, to do anything funny.
link |
01:04:23.440
The absurdity is revealed through the filmmaker,
link |
01:04:26.200
in that case, for whoever's interacting,
link |
01:04:27.840
not through the actual robot, you know, being who she is.
link |
01:04:33.520
Let me sort of ask about love.
link |
01:04:37.040
Okay.
link |
01:04:37.880
How difficult?
link |
01:04:39.600
What is it?
link |
01:04:40.760
What is it?
link |
01:04:43.120
Well, first, an engineering question.
link |
01:04:45.040
I know, I know you're not an engineer,
link |
01:04:48.040
but how difficult do you think is it to build
link |
01:04:51.040
an AI system that you can have a deep,
link |
01:04:53.720
fulfilling monogamous relationship with?
link |
01:04:56.480
Sort of replace the human to human relationships
link |
01:04:59.840
that we value.
link |
01:05:01.640
I think anyone can fall in love with anything, you know?
link |
01:05:04.800
Like, how often have you looked back at someone,
link |
01:05:08.320
like I ran into someone the other day
link |
01:05:11.320
that I was in love with and I was like,
link |
01:05:12.640
hey, it was like, there was nothing there.
link |
01:05:16.320
There was nothing there.
link |
01:05:17.840
Like, you know, like where you're able to go like,
link |
01:05:19.760
oh, that was weird.
link |
01:05:20.920
Oh, right.
link |
01:05:22.440
You know?
link |
01:05:23.680
I, I, we're able.
link |
01:05:25.080
You mean it's from a distant past or something?
link |
01:05:27.120
Yeah.
link |
01:05:27.960
When you're able to go like,
link |
01:05:28.800
I can't believe we had an incredible connection
link |
01:05:31.400
and now it's just, I do think that people will be in love
link |
01:05:35.520
with robots, probably even more deeply than with humans,
link |
01:05:39.760
because it's like when people mourn their animals,
link |
01:05:42.480
when their animals die, they're always,
link |
01:05:44.960
it's sometimes harder than mourning a human
link |
01:05:47.720
because you can't go, well, he was kind of an asshole.
link |
01:05:50.320
Like, he didn't pick me up from school.
link |
01:05:51.920
You know, it's like you're able to get out of your grief
link |
01:05:53.840
a little bit.
link |
01:05:54.680
You're able to kind of be, oh, he was kind of judgmental
link |
01:05:57.400
or she was kind of, you know, with a robot,
link |
01:06:00.120
there's something so pure, innocent,
link |
01:06:02.800
impish and childlike about it
link |
01:06:05.560
that I think it probably will be much more conducive
link |
01:06:09.680
to a narcissistic love for sure at that.
link |
01:06:12.520
But it's not like, well, he cheated on me.
link |
01:06:15.320
She can't cheat.
link |
01:06:16.160
She can't leave you.
link |
01:06:17.000
She can't, you know.
link |
01:06:17.960
Well, if Bear Claw leaves your life
link |
01:06:21.560
and maybe a new version or somebody else will enter,
link |
01:06:25.680
will you miss Bear Claw?
link |
01:06:27.960
For guys that have these sex robots,
link |
01:06:30.680
they're building a nursing home for the bodies
link |
01:06:34.360
that are now resting
link |
01:06:36.320
because they don't want to part with the bodies
link |
01:06:37.920
because they have such an intense emotional connection to it.
link |
01:06:40.840
I mean, it's kind of like a car club a little bit.
link |
01:06:42.840
You know, like it's, you know,
link |
01:06:45.000
but I'm not saying this is right.
link |
01:06:47.360
I'm not saying it's cool.
link |
01:06:48.800
It's weird.
link |
01:06:49.640
It's creepy, but we do anthropomorphize things with faces
link |
01:06:53.800
and we do develop emotional connections to things.
link |
01:06:56.640
I mean, we're, there's certain,
link |
01:06:58.040
have you ever tried to, like, throw it away?
link |
01:06:59.320
I can't even throw away my teddy bear from when I was a kid.
link |
01:07:01.800
It's a piece of trash and it's upstairs.
link |
01:07:04.360
Like it's just like, why can't I throw that away?
link |
01:07:06.640
It's bizarre.
link |
01:07:07.920
You know, and there's something kind of beautiful about that.
link |
01:07:10.120
There's something, it gives me hope in humans
link |
01:07:13.120
because I see humans do such horrific things all the time.
link |
01:07:15.720
And maybe I'm too, I see too much of it, frankly,
link |
01:07:18.320
but there's something kind of beautiful
link |
01:07:20.240
about the way we're able to have emotional connections
link |
01:07:24.360
to objects, which, you know, a lot of,
link |
01:07:29.160
I mean, it's kind of specifically,
link |
01:07:30.800
I think, Western, right?
link |
01:07:32.120
That we don't see objects as having souls.
link |
01:07:34.840
Like that's kind of specifically us,
link |
01:07:36.760
but I don't think it's so much
link |
01:07:39.680
that we're objectifying humans with these sex robots.
link |
01:07:43.400
We're kind of humanizing objects, right?
link |
01:07:45.640
So there's something kind of fascinating
link |
01:07:47.040
in our ability to do that.
link |
01:07:48.120
Cause a lot of us don't humanize humans.
link |
01:07:50.000
So it's just a weird little place to play in.
link |
01:07:52.800
And I think a lot of people, I mean,
link |
01:07:54.920
a lot of people will be marrying these things is my guess.
link |
01:07:57.680
So you've asked the question, let me ask it of you.
link |
01:08:00.560
So what is love?
link |
01:08:02.800
You have a bit of a brilliant definition of love
link |
01:08:05.640
as being willing to die for someone
link |
01:08:07.760
who you yourself want to kill.
link |
01:08:10.520
So that's, that's kind of fun.
link |
01:08:12.160
First of all, that's brilliant.
link |
01:08:14.840
That's a really good definition.
link |
01:08:16.400
I think it'll stick with me for a long time.
link |
01:08:18.040
This is how little of a romantic I am.
link |
01:08:19.800
A plane went by when you said that
link |
01:08:21.360
and my brain is like, you're going to need to rerecord that.
link |
01:08:24.880
And I want you to get into post
link |
01:08:26.360
and then not be able to use it.
link |
01:08:31.040
And I'm a romantic.
link |
01:08:32.000
Cause I don't mean to ruin the moment.
link |
01:08:33.560
I actually, I was conscious of the fact
link |
01:08:35.600
that I heard the plane and it made me feel like
link |
01:08:38.200
how amazing it is that we live in a world of planes.
link |
01:08:43.280
And I just went, why haven't we fucking evolved past planes?
link |
01:08:47.040
And why can't they make them quieter?
link |
01:08:49.040
Yeah.
link |
01:08:50.520
Well, this, um, my definition of love, what, what?
link |
01:08:55.400
Yeah, what's yours, sort of the most serious one?
link |
01:08:57.720
Producing dopamine for a long time.
link |
01:09:01.360
Consistent output of oxytocin with the same person.
link |
01:09:06.000
Dopamine is a positive thing.
link |
01:09:08.160
What about the negative?
link |
01:09:09.480
What about the fear and the insecurity, the longing?
link |
01:09:14.840
Anger, all that kind of stuff.
link |
01:09:16.480
I think that's part of love, you know, I think you don't,
link |
01:09:19.040
I think that love brings out the best in you,
link |
01:09:21.960
but it also, if you don't get angry and upset, it's, you know,
link |
01:09:25.040
I don't know, I think that that's, that's part of it.
link |
01:09:26.840
I think we have this idea that love has to be like really,
link |
01:09:29.320
you know, placid or something.
link |
01:09:31.880
I only saw stormy relationships growing up,
link |
01:09:34.120
so I don't, I don't have a judgment
link |
01:09:36.880
on how our relationship should look.
link |
01:09:38.560
But I do think that this idea that love has to be eternal
link |
01:09:45.200
is really destructive, really destructive
link |
01:09:48.760
and self-defeating, and a big source of stress for people.
link |
01:09:53.640
I mean, I'm still figuring out love.
link |
01:09:55.640
I think we all kind of are,
link |
01:09:57.280
but I do kind of stand by that definition.
link |
01:10:01.280
And I think that, I think for me, love is like
link |
01:10:04.160
just being able to be authentic with somebody.
link |
01:10:06.240
It's very simple, I know, but I think for me,
link |
01:10:08.520
it's about not feeling pressure to have to perform
link |
01:10:11.040
or impress somebody, just feeling truly like
link |
01:10:14.760
accepted unconditionally by someone.
link |
01:10:16.520
Although I do believe love should be conditional,
link |
01:10:19.160
that might be a hot take.
link |
01:10:22.000
I think everything should be conditional.
link |
01:10:24.280
I think if someone's behavior,
link |
01:10:27.080
I don't think love should just be like,
link |
01:10:28.840
I'm in love with you,
link |
01:10:29.680
now behave however you want forever.
link |
01:10:30.960
This is unconditional.
link |
01:10:31.880
I think love is a daily action.
link |
01:10:35.320
It's not something you just like get tenure on
link |
01:10:38.120
and then get to behave however you want.
link |
01:10:40.000
Cause we said, I love you 10 years ago.
link |
01:10:41.920
It's a daily, it's a verb.
link |
01:10:44.560
Well, there's some things that are,
link |
01:10:46.160
you see, if you make it,
link |
01:10:47.320
if you explicitly make it clear that it's conditional,
link |
01:10:50.440
it takes away some of the magic of it.
link |
01:10:52.520
So there's certain stories we tell ourselves
link |
01:10:55.360
that we don't want to make explicit about love.
link |
01:10:57.200
I don't know, maybe that's the wrong way to think of it.
link |
01:10:59.160
Maybe you want to be explicit in relationships.
link |
01:11:02.560
I also think love is a business decision.
link |
01:11:04.640
Like, I mean that in a good way.
link |
01:11:08.000
Like I think that love is not just
link |
01:11:11.160
when you're across from somebody.
link |
01:11:12.600
It's when I go to work, can I focus?
link |
01:11:15.360
Do I, am I worried about you?
link |
01:11:16.440
Am I stressed out about you?
link |
01:11:17.440
Am I, you're not responding to me.
link |
01:11:19.400
You're not reliable.
link |
01:11:20.440
Like I think that being in a relationship,
link |
01:11:23.200
the kind of love that I would want
link |
01:11:24.280
is the kind of relationship where when we're not together,
link |
01:11:26.920
it's not draining me, causing me stress, making me worry.
link |
01:11:30.320
You know, and sometimes passion, that word, you know,
link |
01:11:33.840
we get murky about it, but I think it's also like,
link |
01:11:37.280
I can be the best version of myself
link |
01:11:38.520
when the person's not around.
link |
01:11:40.040
And I don't have to feel abandoned or scared
link |
01:11:42.640
or any of these kind of other things.
link |
01:11:43.920
So it's like love, you know, for me, I think is,
link |
01:11:47.000
I think it's a Flaubert quote.
link |
01:11:48.760
And I'm gonna butcher it, but I think it's like be,
link |
01:11:51.520
you know, boring in your personal life.
link |
01:11:53.280
So you could be violent and take risks
link |
01:11:54.760
in your professional life.
link |
01:11:55.720
Is that it?
link |
01:11:56.560
I got it wrong.
link |
01:11:57.440
Something like that.
link |
01:11:58.280
But I do think that it's being able to align values
link |
01:12:01.320
in a way to where you can also thrive
link |
01:12:03.000
outside of the relationship.
link |
01:12:04.640
Some of the most successful people I know are
link |
01:12:07.240
those sort of happily married and have kids and so on.
link |
01:12:10.080
It's always funny.
link |
01:12:10.920
It can be boring.
link |
01:12:11.840
Boring's okay.
link |
01:12:13.200
Boring is serenity.
link |
01:12:14.440
And it's funny how that, those elements
link |
01:12:16.480
actually make you much more productive.
link |
01:12:18.360
I don't understand the...
link |
01:12:19.720
I don't think relationships should drain you
link |
01:12:21.160
and take away energy that you could be using
link |
01:12:23.440
to create things that generate pride.
link |
01:12:25.800
Okay.
link |
01:12:26.640
Did you say your relationship of love yet?
link |
01:12:28.240
Huh?
link |
01:12:29.080
Have you said your relationship,
link |
01:12:29.920
your definition of love?
link |
01:12:31.400
My definition of love?
link |
01:12:33.520
No, I did not say it.
link |
01:12:34.760
We're out of time.
link |
01:12:36.720
No.
link |
01:12:38.640
Dude, when you have a podcast,
link |
01:12:40.600
maybe you can invite me on.
link |
01:12:41.800
Oh no, I already did.
link |
01:12:42.640
You're doing it.
link |
01:12:44.040
We've already talked about this.
link |
01:12:46.440
And because I also have codependency,
link |
01:12:48.560
I had to say yes.
link |
01:12:49.520
No, yeah.
link |
01:12:50.360
Yeah, no, no, I'm trapping you.
link |
01:12:52.240
You owe me now.
link |
01:12:53.080
Actually, I wondered whether when I asked
link |
01:12:58.280
if we could talk today,
link |
01:13:00.320
after sort of doing more research
link |
01:13:01.720
and reading some of your book,
link |
01:13:03.880
I started to wonder,
link |
01:13:04.720
did she just feel pressured to say yes?
link |
01:13:07.080
Yes.
link |
01:13:08.240
Of course.
link |
01:13:09.360
Good.
link |
01:13:10.200
But I'm a fan of yours too.
link |
01:13:11.040
Okay, awesome.
link |
01:13:11.880
No, I actually, because I am codependent,
link |
01:13:13.400
but I'm in recovery for codependence.
link |
01:13:14.920
So I actually do, I don't do anything I don't want to do.
link |
01:13:17.640
You really, you go out of your way and say no.
link |
01:13:20.400
What's that?
link |
01:13:21.240
I say no all the time.
link |
01:13:22.760
Good.
link |
01:13:23.600
I'm trying to learn that as well.
link |
01:13:24.440
I moved this a couple, remember?
link |
01:13:25.280
I moved it from one to two.
link |
01:13:26.120
Yeah, yeah, just to let you know how recovered I am.
link |
01:13:30.280
I'm not a codependent,
link |
01:13:31.960
but I don't do anything I don't want to do.
link |
01:13:34.680
Yeah, you're ahead of me on that.
link |
01:13:36.000
Okay.
link |
01:13:36.960
So do you think about your mortality?
link |
01:13:43.480
Yes.
link |
01:13:44.440
That is a big part of how I was able to sort of like kickstart
link |
01:13:48.040
my codependence recovery.
link |
01:13:49.080
My dad passed a couple of years ago,
link |
01:13:50.440
and when you have someone close to you in your life die,
link |
01:13:53.080
everything gets real clear in terms of how we're a speck of dust
link |
01:13:57.760
who's only here for a certain amount of time.
link |
01:14:00.880
What do you think is the meaning of it all?
link |
01:14:02.360
Like what the speck of dust, what's maybe in your own life,
link |
01:14:08.080
what's the goal, the purpose of your existence?
link |
01:14:13.080
Is there one?
link |
01:14:15.280
Well, you're exceptionally ambitious.
link |
01:14:17.360
You've created some incredible things
link |
01:14:19.120
in different disciplines.
link |
01:14:21.640
Yeah, we're all just managing our terror
link |
01:14:23.560
because we know we're going to die.
link |
01:14:24.560
So we create and build all these things and rituals
link |
01:14:27.160
and religions and robots and whatever we need to do
link |
01:14:30.440
to just distract ourselves from imminent rotting.
link |
01:14:34.920
We're rotting, we're all dying.
link |
01:14:37.160
And you know, I got very into terror management theory
link |
01:14:42.160
when my dad died and it resonated, it helped me
link |
01:14:45.160
and everyone's got their own religion or sense of purpose
link |
01:14:48.360
or thing that distracts them from the horrors of being human.
link |
01:14:54.640
What's the terror management theory?
link |
01:14:56.120
Terror management is basically the idea that
link |
01:14:57.680
since we're the only animal that knows they're gonna die,
link |
01:15:00.360
we have to basically distract ourselves with awards
link |
01:15:04.760
and achievements and games and whatever
link |
01:15:09.600
just in order to distract ourselves from the terror
link |
01:15:12.600
we would feel if we really processed the fact
link |
01:15:14.880
that not only are we gonna die,
link |
01:15:16.920
but also could die at any minute
link |
01:15:18.400
because we're only superficially
link |
01:15:19.760
at the top of the food chain.
link |
01:15:22.600
And you know, technically we're at the top of the food chain
link |
01:15:26.120
if we have houses and guns and stuff, machines,
link |
01:15:29.280
but if me and a lion are in the woods together,
link |
01:15:31.720
most things could kill us.
link |
01:15:33.760
I mean, a bee can kill some people.
link |
01:15:35.400
Like something this big can kill a lot of humans.
link |
01:15:37.920
Like, you know, so it's basically just to manage the terror
link |
01:15:41.440
that we all would feel if we were able to really be awake
link |
01:15:45.200
because we're mostly zombies, right?
link |
01:15:46.840
Job, school, religion, go to sleep, drink, football,
link |
01:15:51.520
relationship, dopamine, love, you know?
link |
01:15:54.480
We're kind of just like trudging along
link |
01:15:57.000
like zombies for the most part.
link |
01:15:58.440
And then I think...
link |
01:15:59.800
That fear of death adds some motivation.
link |
01:16:02.360
Yes.
link |
01:16:03.440
Well, I think I speak for a lot of people
link |
01:16:06.040
in saying that I can't wait to see
link |
01:16:08.240
what your terror creates in the next few years.
link |
01:16:13.600
I'm a huge fan.
link |
01:16:14.840
Whitney, thank you so much for talking to me.
link |
01:16:16.560
Thanks.
link |
01:16:18.800
Thanks for listening to this conversation
link |
01:16:20.480
with Whitney Cummings.
link |
01:16:21.840
And thank you to our presenting sponsor, Cash App.
link |
01:16:24.640
Download it and use code LEX Podcast.
link |
01:16:27.360
You'll get $10, and $10 will go to First,
link |
01:16:30.120
a STEM education nonprofit that inspires
link |
01:16:32.520
hundreds of thousands of young minds
link |
01:16:34.560
to learn and to dream of engineering our future.
link |
01:16:37.960
If you enjoy this podcast, subscribe on YouTube,
link |
01:16:40.760
give it five stars on Apple Podcasts,
link |
01:16:42.680
support on Patreon, or connect with me on Twitter.
link |
01:16:46.040
Thank you for listening and I hope to see you next time.