
Grimes: Music, AI, and the Future of Humanity | Lex Fridman Podcast #281



link |
00:00:00.000
We are becoming cyborgs.
link |
00:00:02.000
Our brains are fundamentally changed.
link |
00:00:04.080
Everyone who grew up with electronics,
link |
00:00:05.760
we are fundamentally different from previous,
link |
00:00:08.920
from homo sapiens.
link |
00:00:09.840
I call us homo techno.
link |
00:00:11.080
I think we have evolved into homo techno,
link |
00:00:13.240
which is like essentially a new species.
link |
00:00:15.800
Previous technologies, I mean,
link |
00:00:17.840
may have even been more profound
link |
00:00:19.120
and moved us to a certain degree,
link |
00:00:20.180
but I think the computers are what make us homo techno.
link |
00:00:22.780
I think this is what, it's a brain augmentation.
link |
00:00:25.600
So it like allows for actual evolution.
link |
00:00:27.860
Like the computers accelerate the degree
link |
00:00:29.480
to which all the other technologies can also be accelerated.
link |
00:00:32.680
Would you classify yourself as a homo sapien or a homo techno?
link |
00:00:35.600
Definitely homo techno.
link |
00:00:37.080
So you're one of the earliest of the species.
link |
00:00:40.960
I think most of us are.
link |
00:00:45.400
The following is a conversation with Grimes,
link |
00:00:47.760
an artist, musician, songwriter, producer, director,
link |
00:00:50.520
and a fascinating human being
link |
00:00:53.040
who thinks a lot about both the history
link |
00:00:55.480
and the future of human civilization.
link |
00:00:57.520
Studying the dark periods of our past
link |
00:00:59.800
to help form an optimistic vision of our future.
link |
00:01:03.880
This is the Lex Fridman Podcast.
link |
00:01:05.720
To support it, please check out our sponsors
link |
00:01:07.840
in the description.
link |
00:01:08.960
And now, dear friends, here's Grimes.
link |
00:01:12.840
Oh yeah, the Cloudlifter, there you go.
link |
00:01:14.480
There you go.
link |
00:01:15.300
You know your stuff.
link |
00:01:16.440
Have you ever used a Cloudlifter?
link |
00:01:18.240
Yeah, I actually, this microphone, Cloudlifter,
link |
00:01:20.980
is what Michael Jackson used, so.
link |
00:01:23.440
No, really?
link |
00:01:24.420
Yeah, this is like Thriller and stuff.
link |
00:01:26.000
This mic and a Cloudlifter?
link |
00:01:28.160
Yeah, it's an incredible microphone.
link |
00:01:30.620
It's very flattering on vocals.
link |
00:01:32.040
I've used this a lot.
link |
00:01:33.600
It's great for demo vocals.
link |
00:01:34.720
It's great in a room.
link |
00:01:36.640
Sometimes it's easier to record vocals
link |
00:01:38.280
if you're just in a room and the music's playing
link |
00:01:40.600
and you just wanna feel it so it's not in the headphones.
link |
00:01:43.040
And this mic is pretty directional,
link |
00:01:44.700
so I think it's a good mic for just vibing out
link |
00:01:47.740
and just getting a real good vocal take.
link |
00:01:49.840
Just vibing, just in a room.
link |
00:01:51.860
Anyway, this is the Michael Jackson, Quincy Jones
link |
00:01:55.920
microphone.
link |
00:01:57.000
I feel way more badass now.
link |
00:01:58.800
All right, you wanna just get into it?
link |
00:02:01.760
I guess so.
link |
00:02:03.040
All right, one of your names, at least in this space
link |
00:02:05.720
and time, is C, like the letter C.
link |
00:02:08.320
And you told me that C means a lot of things.
link |
00:02:11.280
It's the speed of light.
link |
00:02:12.640
It's the render rate of the universe.
link |
00:02:14.700
It's yes in Spanish.
link |
00:02:16.120
It's the crescent moon.
link |
00:02:17.660
And it happens to be my favorite programming language
link |
00:02:21.140
because it basically runs the world,
link |
00:02:24.140
but it's also powerful, fast, and it's dangerous
link |
00:02:28.200
because you can mess things up really bad with it
link |
00:02:30.000
because of all the pointers.
link |
00:02:31.160
But anyway, which of these associations
link |
00:02:33.960
with the name C is the coolest to you?
link |
00:02:37.760
I mean, to me, the coolest is the speed of light,
link |
00:02:40.700
obviously.
link |
00:02:42.720
When I say render rate of the universe,
link |
00:02:44.400
I think I mean the speed of light
link |
00:02:46.280
because essentially that's what we're rendering at.
link |
00:02:49.120
See, I think we'll know if we're in a simulation
link |
00:02:52.200
if the speed of light changes
link |
00:02:53.740
because if they can improve their render speed, then.
link |
00:02:57.280
Well, it's already pretty good.
link |
00:02:58.480
It's already pretty good, but if it improves,
link |
00:03:01.040
then we'll know, we can probably be like,
link |
00:03:03.880
okay, they've updated or upgraded.
link |
00:03:05.360
Well, it's fast enough for us humans
link |
00:03:06.800
because it seems immediate.
link |
00:03:10.960
There's no delay, there's no latency
link |
00:03:13.360
in terms of us humans on Earth interacting with things.
link |
00:03:16.240
But if you're like intergalactic species
link |
00:03:20.000
operating on a much larger scale,
link |
00:03:21.440
then you're gonna start noticing some weird stuff.
link |
00:03:23.840
Or if you can operate in like around a black hole,
link |
00:03:27.340
then you're gonna start to see some render issues.
link |
00:03:29.680
You can't go faster than the speed of light, correct?
link |
00:03:32.680
So it really limits our ability
link |
00:03:34.520
or one's ability to travel space.
link |
00:03:36.680
Theoretically, you can, you have wormholes.
link |
00:03:38.920
So there's nothing in general relativity
link |
00:03:41.880
that precludes faster than the speed of light travel.
link |
00:03:48.280
But it just seems you're gonna have to do
link |
00:03:49.840
some really funky stuff with very heavy things
link |
00:03:54.000
that have like weirdnesses,
link |
00:03:56.080
that have basically tears in spacetime.
link |
00:03:58.600
We don't know how to do that.
link |
00:03:59.740
Do navigators know how to do it?
link |
00:04:01.860
Do navigators? Yeah.
link |
00:04:03.800
Folding space, basically making wormholes.
link |
00:04:07.000
So the name C. Yes.
link |
00:04:11.880
Who are you?
link |
00:04:14.800
Do you think of yourself as multiple people?
link |
00:04:16.960
Are you one person?
link |
00:04:18.280
Do you know, like in this morning,
link |
00:04:20.860
were you a different person than you are tonight?
link |
00:04:23.560
We are, I should say, recording this basically at midnight,
link |
00:04:27.080
which is awesome. Yes, thank you so much.
link |
00:04:29.600
I think I'm about eight hours late.
link |
00:04:31.640
No, you're right on time.
link |
00:04:33.960
Good morning.
link |
00:04:34.800
This is the beginning of a new day soon.
link |
00:04:37.200
Anyway, are you the same person
link |
00:04:39.480
you were in the morning and the evening?
link |
00:04:43.040
Is there multiple people in there?
link |
00:04:44.320
Do you think of yourself as one person?
link |
00:04:46.240
Or maybe you have no clue?
link |
00:04:47.480
Or are you just a giant mystery to yourself?
link |
00:04:50.160
Okay, these are really intense questions, but.
link |
00:04:52.480
Let's go, let's go.
link |
00:04:53.320
Because I asked this myself, like look in the mirror,
link |
00:04:55.320
who are you?
link |
00:04:56.320
People tell you to just be yourself,
link |
00:04:57.960
but what does that even mean?
link |
00:04:59.240
I mean, I think my personality changes
link |
00:05:01.520
with everyone I talk to.
link |
00:05:02.960
So I have a very inconsistent personality.
link |
00:05:06.600
Yeah.
link |
00:05:07.440
Person to person, so the interaction,
link |
00:05:08.960
your personality materializes.
link |
00:05:11.480
Or my mood.
link |
00:05:12.320
Like I'll go from being like a megalomaniac
link |
00:05:16.120
to being like, you know, just like a total hermit
link |
00:05:19.920
who is very shy.
link |
00:05:21.400
So some combinatorial combination of your mood
link |
00:05:24.560
and the person you're interacting with.
link |
00:05:26.280
Yeah, mood and people I'm interacting with.
link |
00:05:28.080
But I think everyone's like that.
link |
00:05:29.720
Maybe not.
link |
00:05:30.880
Well, not everybody acknowledges it
link |
00:05:32.560
and is able to introspect it.
link |
00:05:34.000
Who brings out, what kind of person,
link |
00:05:35.800
what kind of mood brings out the best in you?
link |
00:05:38.120
As an artist and as a human.
link |
00:05:40.840
Can you introspect this?
link |
00:05:41.840
Like my best friends, like people I can,
link |
00:05:45.200
when I'm like super confident
link |
00:05:47.480
and I know that they're gonna understand
link |
00:05:50.280
everything I'm saying, so like my best friends,
link |
00:05:52.160
then when I can start being really funny,
link |
00:05:55.360
that's always my like peak mode.
link |
00:05:57.640
But it's like, yeah, takes a lot to get there.
link |
00:06:00.120
Let's talk about constraints.
link |
00:06:02.320
You've talked about constraints and limits.
link |
00:06:06.960
Do those help you out as an artist or as a human being?
link |
00:06:09.560
Or do they get in the way?
link |
00:06:10.760
Do you like the constraints?
link |
00:06:11.880
So in creating music, in creating art, in living life,
link |
00:06:16.760
do you like the constraints that this world puts on you?
link |
00:06:21.840
Or do you hate them?
link |
00:06:24.720
If constraints are moving, then you're good, right?
link |
00:06:29.720
Like it's like as we are progressing with technology,
link |
00:06:32.040
we're changing the constraints of like artistic creation.
link |
00:06:34.800
You know, making video and music and stuff
link |
00:06:38.440
is getting a lot cheaper.
link |
00:06:39.720
There's constantly new technology and new software
link |
00:06:42.080
that's making it faster and easier.
link |
00:06:44.000
We have so much more freedom than we had in the 70s.
link |
00:06:46.680
Like when Michael Jackson, you know,
link |
00:06:48.640
when they recorded Thriller with this microphone,
link |
00:06:51.440
like they had to use a mixing desk and all this stuff.
link |
00:06:54.000
And like probably even get in a studio,
link |
00:06:55.600
it's probably really expensive
link |
00:06:56.560
and you have to be a really good singer
link |
00:06:57.600
and you have to know how to use
link |
00:06:59.080
like the mixing desk and everything.
link |
00:07:00.520
And now I can just, you know,
link |
00:07:02.480
make, I've made a whole album on this computer.
link |
00:07:05.240
I have a lot more freedom,
link |
00:07:06.760
but then I'm also constrained in different ways
link |
00:07:10.280
because there's like literally millions more artists.
link |
00:07:13.720
It's like a much bigger playing field.
link |
00:07:15.720
It's just like, I also, I didn't learn music.
link |
00:07:18.680
I'm not a natural musician.
link |
00:07:20.240
So I don't know anything about actual music.
link |
00:07:22.640
I just know about like the computer.
link |
00:07:24.800
So I'm really kind of just like messing around
link |
00:07:30.520
and like trying things out.
link |
00:07:33.280
Well, yeah, I mean, but the nature of music is changing.
link |
00:07:35.760
So you're saying you don't know actual music,
link |
00:07:37.320
but what music is, is changing.
link |
00:07:39.260
Music is becoming, you've talked about this,
link |
00:07:41.920
is becoming, it's like merging with technology.
link |
00:07:46.640
Yes.
link |
00:07:47.680
It's becoming something more than just like
link |
00:07:51.400
the notes on a piano.
link |
00:07:53.000
It's becoming some weird composition
link |
00:07:54.960
that requires engineering skills, programming skills,
link |
00:07:59.440
some kind of human robot interaction skills,
link |
00:08:03.440
and still some of the same things that Michael Jackson had,
link |
00:08:05.680
which is like a good ear for a good sense of taste
link |
00:08:08.480
of what's good and not the final thing
link |
00:08:10.360
when it's put together.
link |
00:08:11.520
Like you're allowed, you're enabled, empowered
link |
00:08:14.920
with a laptop to layer stuff,
link |
00:08:17.200
to start like layering insane amounts of stuff.
link |
00:08:20.280
And it's super easy to do that.
link |
00:08:22.280
I do think music production is a really underrated art form.
link |
00:08:25.000
I feel like people really don't appreciate it.
link |
00:08:26.700
When I look at publishing splits,
link |
00:08:27.960
the way that people like pay producers and stuff,
link |
00:08:32.240
it's super, producers are just deeply underrated.
link |
00:08:35.560
Like so many of the songs that are popular right now
link |
00:08:39.160
or for the last 20 years,
link |
00:08:40.920
like part of the reason they're popular
link |
00:08:42.240
is because the production is really interesting
link |
00:08:44.000
or really sick or really cool.
link |
00:08:45.640
And it's like, I don't think listeners,
link |
00:08:50.920
like people just don't really understand
link |
00:08:52.520
what music production is.
link |
00:08:54.640
It's not, it's sort of like this weird,
link |
00:08:57.680
discombobulated art form.
link |
00:08:59.360
It's not like a formal, because it's so new,
link |
00:09:01.380
there isn't like a formal training path for it.
link |
00:09:06.880
It's mostly driven by like autodidacts.
link |
00:09:10.200
Like it's like almost everyone I know
link |
00:09:11.280
who's good at production,
link |
00:09:12.200
like they didn't go to music school or anything.
link |
00:09:13.760
They just taught themselves.
link |
00:09:15.080
Are they mostly different?
link |
00:09:16.040
Like the music producers, you know,
link |
00:09:18.400
are there some commonalities that tie them together
link |
00:09:21.320
or are they all just different kinds of weirdos?
link |
00:09:23.580
Cause I just, I just hung out with Rick Rubin.
link |
00:09:25.440
I don't know if you've.
link |
00:09:26.280
Yeah, I mean, Rick Rubin is like literally
link |
00:09:29.760
one of the gods of music production.
link |
00:09:31.200
Like he's one of the people who first,
link |
00:09:33.780
you know, who like made music production,
link |
00:09:36.320
you know, made the production as important
link |
00:09:39.360
as the actual lyrics or the notes.
link |
00:09:41.600
But the thing he does, which is interesting,
link |
00:09:43.560
I don't know if you can speak to that,
link |
00:09:45.520
but just hanging out with him,
link |
00:09:46.760
he seems to just sit there in silence,
link |
00:09:48.520
close his eyes and listen.
link |
00:09:50.800
It's like, he almost does nothing.
link |
00:09:53.640
And that nothing somehow gives you freedom
link |
00:09:55.880
to be the best version of yourself.
link |
00:09:58.160
So that's music production somehow too,
link |
00:10:00.060
which is like encouraging you to do less,
link |
00:10:02.680
to simplify, to like push towards minimalism.
link |
00:10:06.900
I mean, I guess, I mean,
link |
00:10:09.600
I work differently from Rick Rubin
link |
00:10:11.600
cause Rick Rubin produces for other artists,
link |
00:10:14.160
whereas like I mostly produce for myself.
link |
00:10:17.080
So it's a very different situation.
link |
00:10:19.240
I also think Rick Rubin, he's in that,
link |
00:10:21.760
I would say advanced category of producer
link |
00:10:23.600
where like you've like earned your,
link |
00:10:26.600
you can have an engineer and stuff
link |
00:10:27.960
and people like do the stuff for you.
link |
00:10:29.840
But I usually just like do stuff myself.
link |
00:10:32.400
So you're the engineer, the producer and the artist.
link |
00:10:38.080
Yeah, I guess I would say I'm in the era,
link |
00:10:39.880
like the post Rick Rubin era.
link |
00:10:41.280
Like I come from the kind of like
link |
00:10:44.320
Skrillex school of thought,
link |
00:10:47.040
which is like where you are.
link |
00:10:49.240
Yeah, the engineer, producer, artist.
link |
00:10:51.040
Like where, I mean lately,
link |
00:10:53.760
sometimes I'll work with a producer now.
link |
00:10:55.560
I'm gently sort of delicately starting
link |
00:10:59.120
to collaborate a bit more,
link |
00:10:59.960
but like I think I'm kind of from the,
link |
00:11:02.800
like the whatever 2010s explosion of things
link |
00:11:07.120
where everything became available on the computer
link |
00:11:11.920
and you kind of got this like lone wizard energy thing going.
link |
00:11:16.680
So you embraced being the loneliness.
link |
00:11:19.680
Is the loneliness somehow an engine of creativity?
link |
00:11:22.440
Like, so most of your stuff,
link |
00:11:24.560
most of your creative quote unquote genius in quotes
link |
00:11:28.640
is in the privacy of your mind.
link |
00:11:32.160
Yes, well, it was,
link |
00:11:36.680
but here's the thing.
link |
00:11:39.060
I was talking to Daniel Ek and he said,
link |
00:11:40.840
he's like most artists, they have about 10 years,
link |
00:11:43.400
like 10 good years.
link |
00:11:45.160
And then they usually stop making their like vital shit.
link |
00:11:49.880
And I feel like I'm sort of like nearing the end
link |
00:11:53.360
of my 10 years on my own.
link |
00:11:56.540
So you have to become somebody else.
link |
00:11:58.640
Now I'm like, I'm in the process
link |
00:11:59.840
of becoming somebody else and reinventing.
link |
00:12:01.720
When I work with other people,
link |
00:12:02.840
because I've never worked with other people,
link |
00:12:04.180
I find that I make like, that I'm exceptionally rejuvenated
link |
00:12:08.380
and making like some of the most vital work I've ever made.
link |
00:12:10.980
So, because I think another human brain
link |
00:12:13.840
is like one of the best tools you can possibly find.
link |
00:12:17.560
Like.
link |
00:12:18.400
It's a funny way to put it, I love it.
link |
00:12:20.560
It's like if a tool is like, you know,
link |
00:12:23.340
whatever HP plus one or like adds some like stats
link |
00:12:27.320
to your character, like another human brain
link |
00:12:30.760
will like square it instead of just like adding something.
link |
00:12:34.240
Double up the experience points, I love this.
link |
00:12:36.320
We should also mention we're playing Tavern music
link |
00:12:38.280
before this and which I love, which I first,
link |
00:12:41.600
I think I first.
link |
00:12:42.440
You had to stop the Tavern music.
link |
00:12:43.760
Yeah, because it doesn't, the audio.
link |
00:12:46.380
Okay, okay.
link |
00:12:47.220
But it makes.
link |
00:12:48.040
Yeah, it'll make the podcast annoying.
link |
00:12:48.880
Add it in post, add it in post.
link |
00:12:50.040
No one will want to listen to the podcast.
link |
00:12:51.560
They probably would, but it makes me,
link |
00:12:53.400
it reminds me like of a video game,
link |
00:12:55.480
like a role playing video game
link |
00:12:56.760
where you have experience points.
link |
00:12:58.400
There's something really joyful about wandering places
link |
00:13:03.440
like Elder Scrolls, like Skyrim,
link |
00:13:06.480
just exploring these landscapes in another world
link |
00:13:10.500
and then you get experience points
link |
00:13:12.000
and you can work on different skills
link |
00:13:14.000
and somehow you progress in life.
link |
00:13:16.160
I don't know, it's simple.
link |
00:13:17.600
It doesn't have some of the messy complexities of life
link |
00:13:19.960
and there's usually a bad guy you can fight in Skyrim.
link |
00:13:23.960
It's dragons and so on.
link |
00:13:25.560
I'm sure in Elden Ring,
link |
00:13:26.460
there's a bunch of monsters you can fight.
link |
00:13:28.240
I love that.
link |
00:13:29.080
I feel like Elden Ring,
link |
00:13:29.960
I feel like this is a good analogy
link |
00:13:31.440
to music production though
link |
00:13:32.360
because it's like, I feel like the engineers
link |
00:13:34.400
and the people creating these open worlds are,
link |
00:13:36.600
it's sort of like similar to people, to music producers
link |
00:13:39.660
where it's like this hidden archetype
link |
00:13:42.720
that like no one really understands what they do
link |
00:13:44.600
and no one really knows who they are,
link |
00:13:46.160
but they're like, it's like the artist engineer
link |
00:13:49.300
because it's like, it's both art
link |
00:13:51.760
and fairly complex engineering.
link |
00:13:54.840
Well, you're saying they don't get enough credit.
link |
00:13:57.200
Aren't you kind of changing that
link |
00:13:58.600
by becoming the person doing everything?
link |
00:14:01.320
Aren't you, isn't the engineer?
link |
00:14:03.680
Well, I mean, others have gone before me.
link |
00:14:05.440
I'm not, you know, there's like Timbaland and Skrillex
link |
00:14:07.800
and there's all these people that are like,
link |
00:14:10.360
you know, very famous for this,
link |
00:14:12.040
but I just think the general,
link |
00:14:13.920
I think people get confused about what it is
link |
00:14:15.920
and just don't really know what it is per se
link |
00:14:19.200
and it's just when I see a song,
link |
00:14:20.480
like when there's like a hit song,
link |
00:14:22.280
like I'm just trying to think of like,
link |
00:14:27.520
just going for like even just a basic pop hit,
link |
00:14:29.860
like, what's it?
link |
00:14:33.120
Like Rules by Dua Lipa or something.
link |
00:14:36.100
The production on that is actually like really crazy.
link |
00:14:39.220
I mean, the song is also great,
link |
00:14:40.560
but it's like the production is exceptionally memorable.
link |
00:14:43.360
Like, you know, and it's just like no one,
link |
00:14:47.200
I can't, I don't even know who produced that song.
link |
00:14:49.180
It's just like, isn't part of like the rhetoric
link |
00:14:50.680
of how we just discuss the creation of art.
link |
00:14:53.420
We just sort of like don't consider the music producer
link |
00:14:57.200
because I think the music producer used to be more
link |
00:15:00.320
just simply recording things.
link |
00:15:03.700
Yeah, that's interesting
link |
00:15:04.640
because when you think about movies,
link |
00:15:06.000
we talk about the actor and the actresses,
link |
00:15:08.600
but we also talk about the directors.
link |
00:15:11.520
We don't talk about like that with the music as often.
link |
00:15:15.760
The Beatles music producer
link |
00:15:17.360
was one of the first kind of guy,
link |
00:15:19.880
one of the first people sort of introducing
link |
00:15:21.220
crazy sound design into pop music.
link |
00:15:22.640
I forget his name.
link |
00:15:24.160
He has the same, I forget his name,
link |
00:15:25.860
but you know, like he was doing all the weird stuff
link |
00:15:29.180
like dropping pianos and like, yeah.
link |
00:15:32.420
Oh, to get the, yeah, yeah, yeah,
link |
00:15:33.480
to get the sound, to get the authentic sound.
link |
00:15:36.580
What about lyrics?
link |
00:15:38.100
You think those, where did they fit
link |
00:15:40.960
into how important they are?
link |
00:15:42.960
I was heartbroken to learn
link |
00:15:44.860
that Elvis didn't write his songs.
link |
00:15:46.800
I was very mad.
link |
00:15:47.880
A lot of people don't write their songs.
link |
00:15:49.480
I understand this, but.
link |
00:15:50.820
But here's the thing.
link |
00:15:52.200
I feel like there's this desire for authenticity.
link |
00:15:54.880
I used to be like really mad
link |
00:15:56.140
when like people wouldn't write or produce their music
link |
00:15:58.000
and I'd be like, that's fake.
link |
00:15:59.280
And then I realized there's all this like weird bitterness
link |
00:16:04.520
and like aggro-ness in art about authenticity.
link |
00:16:07.760
But I had this kind of like weird realization recently
link |
00:16:12.200
where I started thinking that like art
link |
00:16:14.960
is sort of a decentralized collective thing.
link |
00:16:20.220
Like art is kind of a conversation
link |
00:16:25.300
with all the artists that have ever lived before you.
link |
00:16:28.560
You know, like it's like, you're really just sort of,
link |
00:16:31.080
it's not like anyone's reinventing the wheel here.
link |
00:16:33.520
Like you're kind of just taking, you know,
link |
00:16:36.620
thousands of years of art
link |
00:16:38.200
and like running it through your own little algorithm
link |
00:16:41.700
and then like making your interpretation of it.
link |
00:16:45.040
You just joined the conversation
link |
00:16:46.240
with all the other artists that came before.
link |
00:16:47.560
It's just a beautiful way to look at it.
link |
00:16:49.580
Like, and it's like, I feel like everyone's always like,
link |
00:16:51.540
there's all this copyright and IP and this and that
link |
00:16:54.120
or authenticity.
link |
00:16:55.160
And it's just like, I think we need to stop seeing this
link |
00:16:59.220
as this like egotistical thing of like,
link |
00:17:01.580
oh, the creative genius, the lone creative genius
link |
00:17:04.180
or this or that.
link |
00:17:05.020
Because it's like, I think art shouldn't be about that.
link |
00:17:08.760
I think art is something that sort of
link |
00:17:10.360
brings humanity together.
link |
00:17:12.040
And it's also, art is also kind of the collective memory
link |
00:17:14.080
of humans.
link |
00:17:14.920
It's like, we don't give a fuck about
link |
00:17:18.720
whatever ancient Egypt,
link |
00:17:20.280
like how much grain got sent that day
link |
00:17:22.760
and sending the records and like, you know,
link |
00:17:24.920
like who went where and, you know,
link |
00:17:27.600
how many shields needed to be produced for this.
link |
00:17:29.640
Like we just remember their art.
link |
00:17:32.240
And it's like, you know, it's like in our day to day life,
link |
00:17:34.840
there's all this stuff that seems more important than art
link |
00:17:38.080
because it helps us function and survive.
link |
00:17:40.200
But when all this is gone,
link |
00:17:41.840
like the only thing that's really gonna be left is the art.
link |
00:17:45.040
The technology will be obsolete.
link |
00:17:46.800
That's so fascinating.
link |
00:17:47.640
Like the humans will be dead.
link |
00:17:49.080
That is true.
link |
00:17:49.920
A good compression of human history
link |
00:17:52.520
is the art we've generated across the different centuries,
link |
00:17:56.200
the different millennia.
link |
00:17:57.800
So when the aliens come.
link |
00:17:59.900
When the aliens come,
link |
00:18:00.740
they're gonna find the hieroglyphics and the pyramids.
link |
00:18:02.740
I mean, art could be broadly defined.
link |
00:18:04.360
They might find like the engineering marvels,
link |
00:18:06.240
the bridges, the rockets, the.
link |
00:18:09.860
I guess I sort of classify though.
link |
00:18:11.480
Architecture is art too.
link |
00:18:13.580
I consider engineering in those formats to be art, for sure.
link |
00:18:19.360
It sucks that like digital art is easier to delete.
link |
00:18:23.200
So if there's an apocalypse, a nuclear war,
link |
00:18:25.800
that can disappear.
link |
00:18:26.880
Yes.
link |
00:18:28.080
And the physical.
link |
00:18:28.920
There's something still valuable
link |
00:18:30.000
about the physical manifestation of art.
link |
00:18:32.360
That sucks that like music, for example,
link |
00:18:35.580
has to be played by somebody.
link |
00:18:37.940
Yeah, I do think we should have a foundation type situation
link |
00:18:41.640
where we like, you know how we have like seed banks
link |
00:18:44.120
up in the north and stuff?
link |
00:18:45.520
Like we should probably have like a solar powered
link |
00:18:48.000
or geothermal little bunker
link |
00:18:49.760
that like has all human knowledge.
link |
00:18:52.360
You mentioned Daniel Ek and Spotify.
link |
00:18:55.240
What do you think about that as an artist?
link |
00:18:56.960
What's Spotify?
link |
00:18:58.280
Is that empowering?
link |
00:19:00.000
To me, Spotify as a consumer is super exciting.
link |
00:19:02.560
It makes it easy for me to access music
link |
00:19:04.960
from all kinds of artists,
link |
00:19:06.600
get to explore all kinds of music,
link |
00:19:08.420
make it super easy to sort of curate my own playlist
link |
00:19:12.280
and have fun with all that.
link |
00:19:13.960
It was so liberating to let go.
link |
00:19:16.040
You know, I used to collect, you know,
link |
00:19:17.560
albums and CDs and so on, like horde albums.
link |
00:19:22.120
Yeah.
link |
00:19:22.960
Like they matter.
link |
00:19:23.780
But the reality you could, you know,
link |
00:19:25.640
that was really liberating that I could let go of that.
link |
00:19:28.080
And letting go of the albums you're kind of collecting
link |
00:19:32.160
allows you to find new music,
link |
00:19:33.600
exploring new artists and all that kind of stuff.
link |
00:19:36.200
But I know from a perspective of an artist that could be,
link |
00:19:38.360
like you mentioned,
link |
00:19:39.200
competition could be a kind of constraint
link |
00:19:42.000
because there's more and more and more artists
link |
00:19:44.640
on the platform.
link |
00:19:46.040
I think it's better that there's more artists.
link |
00:19:47.880
I mean, again, this might be propaganda
link |
00:19:49.800
because this is all from a conversation with Daniel Ek.
link |
00:19:51.680
So this could easily be propaganda.
link |
00:19:54.040
We're all a victim of somebody's propaganda.
link |
00:19:56.720
So let's just accept this.
link |
00:19:58.960
But Daniel Ek was telling me that, you know,
link |
00:20:01.000
at the, because I, you know, when I met him,
link |
00:20:03.680
I came in all furious about Spotify
link |
00:20:06.440
and like I grilled him super hard.
link |
00:20:07.760
So I've got his answers here.
link |
00:20:10.640
But he was saying like at the sort of peak
link |
00:20:13.520
of the CD industry,
link |
00:20:15.240
there was like 20,000 artists making millions
link |
00:20:18.280
and millions of dollars.
link |
00:20:19.520
Like there was just like a very tiny kind of 1%.
link |
00:20:22.840
And Spotify has kind of democratized the industry
link |
00:20:27.400
because now I think he said there's about a million artists
link |
00:20:30.320
making a good living from Spotify.
link |
00:20:33.120
And when I heard that, I was like, honestly,
link |
00:20:36.920
I would rather make less money
link |
00:20:38.840
and have just like a decent living
link |
00:20:43.720
and have more artists be able to have that,
link |
00:20:46.560
even though I like, I wish it could include everyone, but.
link |
00:20:49.320
Yeah, that's really hard to argue with.
link |
00:20:50.760
YouTube is the same.
link |
00:20:52.280
It's YouTube's mission.
link |
00:20:54.120
They want to basically have as many creators as possible
link |
00:20:58.280
and make a living, some kind of living.
link |
00:21:00.720
And that's so hard to argue with.
link |
00:21:02.720
It's so hard.
link |
00:21:03.560
But I think there's better ways to do it.
link |
00:21:04.480
My manager, I actually wish he was here.
link |
00:21:06.520
Like I would have brought him up.
link |
00:21:07.840
My manager is building an app that can manage you.
link |
00:21:13.800
So it'll like help you organize your percentages
link |
00:21:16.520
and get your publishing and dah, dah, dah, dah, dah.
link |
00:21:18.840
So you can take out all the middlemen
link |
00:21:19.960
so you can have a much bigger,
link |
00:21:21.360
it'll just like automate it.
link |
00:21:23.000
So you can get.
link |
00:21:23.840
So automate the manager?
link |
00:21:24.680
Automate management, publishing,
link |
00:21:28.040
and legal, it can read,
link |
00:21:32.440
the app he's building can read your contract
link |
00:21:34.080
and like tell you about it.
link |
00:21:35.640
Because one of the issues with music right now,
link |
00:21:38.280
it's not that we're not getting paid enough,
link |
00:21:39.800
but it's that the art industry is filled with middlemen
link |
00:21:44.960
because artists are not good at business.
link |
00:21:47.760
And from the beginning, like Frank Sinatra,
link |
00:21:50.400
it's all mob stuff.
link |
00:21:51.640
Like it's the music industry is run by business people,
link |
00:21:56.640
not the artists and the artists really get very small cuts
link |
00:21:59.480
of like what they make.
link |
00:22:00.400
And so I think part of the reason I'm a technocrat,
link |
00:22:04.720
which I mean, your fans are gonna be technocrats.
link |
00:22:07.040
So no one's, they're not gonna be mad at me about this,
link |
00:22:09.240
but like my fans hate it when I say this kind of thing
link |
00:22:12.160
or the general public.
link |
00:22:13.000
They don't like technocrats.
link |
00:22:14.160
They don't like technocrats.
link |
00:22:15.600
Like when I watched Battle Angel Alita
link |
00:22:18.720
and they were like the Martian technocracy
link |
00:22:20.360
and I was like, yeah, Martian technocracy.
link |
00:22:22.000
And then they were like, and they're evil.
link |
00:22:23.520
And I was like, oh, okay.
link |
00:22:25.560
I was like, cause Martian technocracy sounds sick to me.
link |
00:22:28.760
Yeah, so your intuition as technocrats
link |
00:22:31.920
would create some kind of beautiful world.
link |
00:22:34.200
For example, what my manager's working on,
link |
00:22:36.040
if you can create an app that removes the need for a lawyer
link |
00:22:39.880
and then you could have smart contracts on the blockchain,
link |
00:22:43.360
removes the need for like management
link |
00:22:46.800
and organizing all this stuff,
link |
00:22:48.000
like can read your stuff and explain it to you,
link |
00:22:50.920
can collect your royalties, you know,
link |
00:22:54.200
like then the small amounts,
link |
00:22:56.960
the amount of money that you're getting from Spotify
link |
00:22:58.680
actually means a lot more and goes a lot farther.
link |
00:23:01.760
It can remove some of the bureaucracy,
link |
00:23:03.120
some of the inefficiencies that make life
link |
00:23:06.360
not as great as it could be.
link |
00:23:08.200
Yeah, I think the issue isn't that there's not enough.
link |
00:23:10.840
Like the issue is that there's inefficiency
link |
00:23:12.680
and I'm really into this positive sum mindset,
link |
00:23:18.640
you know, the win, win mindset of like,
link |
00:23:20.840
instead of, you know, fighting over the scraps,
link |
00:23:23.520
how do we make the, or worrying about scarcity,
link |
00:23:26.520
like instead of a scarcity mindset,
link |
00:23:27.800
why don't we just increase the efficiency
link |
00:23:30.080
and, you know, in that way.
link |
00:23:32.360
Expand the size of the pie.
link |
00:23:34.400
Let me ask you about experimentation.
link |
00:23:36.560
So you said, which is beautiful,
link |
00:23:40.480
being a musician is like having a conversation
link |
00:23:42.760
with all those that came before you.
link |
00:23:45.440
How much of creating music is like
link |
00:23:51.200
kind of having that conversation,
link |
00:23:53.040
trying to fit into the cultural trends
link |
00:23:57.200
and how much of it is like trying to,
link |
00:23:59.320
as much as possible, be an outsider
link |
00:24:00.840
and come up with something totally new.
link |
00:24:02.600
It's like when you're thinking,
link |
00:24:04.440
when you're experimenting,
link |
00:24:05.640
are you trying to be totally different, totally weird?
link |
00:24:08.680
Are you trying to fit in?
link |
00:24:12.120
Man, this is so hard because I feel like I'm
link |
00:24:14.640
kind of in the process of semi retiring from music,
link |
00:24:16.880
so this is like my old brain.
link |
00:24:18.720
Yeah, bring it from like the shelf,
link |
00:24:22.040
put it on the table for a couple minutes,
link |
00:24:24.440
we'll just poke it.
link |
00:24:26.280
I think it's a bit of both
link |
00:24:27.240
because I think forcing yourself to engage with new music
link |
00:24:32.400
is really great for neuroplasticity.
link |
00:24:35.000
Like I think, you know, as people,
link |
00:24:39.480
part of the reason music is marketed at young people
link |
00:24:41.640
is because young people are very neuroplastic.
link |
00:24:43.360
So like if you're 16 to like 23 or whatever,
link |
00:24:48.200
it's gonna be really easy for you to love new music.
link |
00:24:50.840
And if you're older than that,
link |
00:24:52.280
it gets harder and harder and harder.
link |
00:24:53.760
And I think one of the beautiful things
link |
00:24:54.960
about being a musician is I just constantly force myself
link |
00:24:57.880
to listen to new music
link |
00:24:58.760
and I think it keeps my brain really plastic.
link |
00:25:01.000
And I think this is a really good exercise.
link |
00:25:02.760
I just think everyone should do this.
link |
00:25:04.280
You listen to new music and you hate it,
link |
00:25:05.600
I think you should just keep, force yourself to like,
link |
00:25:08.600
okay, well why do people like it?
link |
00:25:09.920
And like, you know, make your brain form new neural pathways
link |
00:25:14.640
and be more open to change.
link |
00:25:16.880
That's really brilliant actually.
link |
00:25:18.280
Sorry to interrupt, but like that exercise
link |
00:25:21.520
is really amazing to sort of embrace change,
link |
00:25:27.480
embrace sort of practice neuroplasticity.
link |
00:25:31.600
Because like that's one of the things,
link |
00:25:33.160
you fall in love with a certain band
link |
00:25:34.360
and you just kind of stay with that for the rest of your life
link |
00:25:36.760
and then you never understand the modern music.
link |
00:25:38.400
That's a really good exercise.
link |
00:25:39.240
Most of the streaming on Spotify
link |
00:25:40.640
is like classic rock and stuff.
link |
00:25:42.440
Like new music makes up a very small chunk
link |
00:25:44.800
of what is played on Spotify.
link |
00:25:46.840
And I think this is like not a good sign for us as a species.
link |
00:25:50.120
I think, yeah.
link |
00:25:52.880
So it's a good measure of the species open mindedness
link |
00:25:57.720
to change is how often you listen to new music.
link |
00:26:01.120
The brain, let's put the music brain back on the shelf.
link |
00:26:05.160
I gotta pull out the futurist brain for a second.
link |
00:26:09.680
In what wild ways do you think the future,
link |
00:26:12.280
say in like 30 years, maybe 50 years,
link |
00:26:14.960
maybe a hundred years will be different
link |
00:26:19.280
from our current way of life on earth?
link |
00:26:22.120
We can talk about augmented reality, virtual reality,
link |
00:26:25.400
maybe robots, maybe space travel, maybe video games,
link |
00:26:30.800
maybe genetic engineering.
link |
00:26:32.520
I can keep going.
link |
00:26:33.360
Cyborgs, aliens, world wars,
link |
00:26:36.240
maybe destructive nuclear wars, good and bad.
link |
00:26:39.840
When you think about the future, what are you imagining?
link |
00:26:43.520
What's the weirdest and the wildest it could be?
link |
00:26:47.640
Have you read Surface Detail by Iain Banks?
link |
00:26:51.480
Surface Detail is my favorite depiction of a,
link |
00:26:54.800
oh wow, you have to read this book.
link |
00:26:56.520
It's literally the greatest science fiction book
link |
00:26:58.680
possibly ever written.
link |
00:26:59.520
Iain Banks is the man, yeah, for sure.
link |
00:27:01.560
What have you read?
link |
00:27:03.200
Just the Player of Games.
link |
00:27:04.520
I read that titles can't be copyrighted
link |
00:27:07.320
so you can just steal them.
link |
00:27:08.280
And I was like, Player of Games, sick.
link |
00:27:09.920
Nice.
link |
00:27:10.760
Yeah, so you can name your album.
link |
00:27:12.760
Like I always wanted to.
link |
00:27:13.600
Romeo and Juliet or something.
link |
00:27:15.000
I always wanted to name an album War and Peace.
link |
00:27:17.080
Nice.
link |
00:27:17.920
Like that would be, like you.
link |
00:27:18.760
That is a good, that's a good,
link |
00:27:20.440
where have I heard that before?
link |
00:27:21.600
You can do that, like you could do that.
link |
00:27:24.280
Also things that are in the public domain.
link |
00:27:26.040
For people who have no clue,
link |
00:27:27.160
you do have a song called Player of Games.
link |
00:27:29.680
Yes, oh yeah.
link |
00:27:30.680
So Iain Banks, Surface Detail is in my opinion
link |
00:27:33.840
the best future that I've ever read about
link |
00:27:37.200
or heard about in science fiction.
link |
00:27:39.540
Basically there's the relationship with super intelligence,
link |
00:27:44.600
like artificial super intelligence is just, it's like great.
link |
00:27:50.400
I want to credit the person who coined this term
link |
00:27:53.040
because I love this term.
link |
00:27:55.240
And I feel like young women don't get enough credit in.
link |
00:28:00.160
Yeah, so if you go to Protopia Futures on Instagram,
link |
00:28:03.920
what is her name?
link |
00:28:08.920
Monika Bielskyte, I'm saying that wrong.
link |
00:28:15.360
And I'm probably gonna, I'm probably butchering this a bit,
link |
00:28:17.680
but Protopia is sort of, if utopia is unattainable,
link |
00:28:21.580
Protopia is sort of like, you know.
link |
00:28:26.000
Wow, that's an awesome Instagram, Protopia Futures.
link |
00:28:28.640
A great, a future that is, you know, as good as we can get.
link |
00:28:33.480
The future, positive future.
link |
00:28:34.720
AI, is this a centralized AI in Surface Detail
link |
00:28:38.360
or is it distributed?
link |
00:28:39.280
What kind of AI is it?
link |
00:28:40.560
They mostly exist as giant super ships,
link |
00:28:42.800
like sort of like the guild ships in Dune.
link |
00:28:45.800
Like they're these giant ships that kind of move people
link |
00:28:47.960
around and the ships are sentient
link |
00:28:49.480
and they can talk to all the passengers.
link |
00:28:52.240
And I mean, there's a lot of different types of AI
link |
00:28:56.400
in the Banksian future,
link |
00:28:58.320
but in the opening scene of Surface Detail,
link |
00:29:01.080
there's this place called the Culture
link |
00:29:02.320
and the Culture is basically a Protopian future.
link |
00:29:04.440
And a Protopian future, I think,
link |
00:29:07.320
is like a future that is like,
link |
00:29:09.760
obviously it's not utopia, it's not perfect.
link |
00:29:12.400
And like, cause like striving for utopia,
link |
00:29:14.000
I think feels hopeless and it's sort of like,
link |
00:29:16.960
maybe not the best terminology to be using.
link |
00:29:20.380
So it's like, it's a pretty good place.
link |
00:29:23.960
Like mostly like, you know,
link |
00:29:27.660
super intelligence and biological beings
link |
00:29:29.860
exist fairly in harmony.
link |
00:29:31.880
There's not too much war.
link |
00:29:33.120
There's like as close to equality as you can get,
link |
00:29:35.680
you know, it's like approximately a good future.
link |
00:29:38.680
Like there's really awesome stuff.
link |
00:29:40.160
It's, and in the opening scene,
link |
00:29:45.600
this girl, she's born as a sex slave outside of the culture.
link |
00:29:49.520
So she's in a society that doesn't adhere
link |
00:29:51.200
to the cultural values.
link |
00:29:52.620
She tries to kill the guy who is her like master,
link |
00:29:56.680
but he kills her, but unbeknownst to her,
link |
00:29:59.040
when she was traveling on a ship through the culture
link |
00:30:01.360
with him one day, a ship put a neural lace
link |
00:30:05.160
in her head and neural lace is sort of like,
link |
00:30:08.480
it's basically a Neuralink because life imitates art.
link |
00:30:13.040
It does indeed.
link |
00:30:14.160
It does indeed.
link |
00:30:15.000
So she wakes up and the opening scene is her memory
link |
00:30:17.320
has been uploaded by this neural lace
link |
00:30:19.040
when she has been killed.
link |
00:30:20.000
And now she gets to choose a new body.
link |
00:30:22.640
And this AI is interfacing with her recorded memory
link |
00:30:26.900
in her neural lace and helping her and being like,
link |
00:30:29.900
hello, you're dead.
link |
00:30:31.320
But because you had a neural lace, your memory's uploaded.
link |
00:30:33.740
Do you want to choose a new body?
link |
00:30:34.980
And you're going to be born here in the culture
link |
00:30:36.440
and like start a new life, which is just,
link |
00:30:38.600
that's like the opening.
link |
00:30:39.920
It's like so sick.
link |
00:30:41.400
And the ship is the super intelligence.
link |
00:30:43.640
All the ships are kind of super intelligence.
link |
00:30:45.120
But they still want to preserve a kind of rich,
link |
00:30:47.720
fulfilling experience for the humans.
link |
00:30:49.680
Yeah, like they're like friends with the humans.
link |
00:30:51.000
And then there's a bunch of ships that don't want to exist
link |
00:30:53.560
with biological beings, but they just have their own place
link |
00:30:56.000
like way over there.
link |
00:30:57.080
But they don't, they just do their own thing.
link |
00:30:58.900
They're not necessarily.
link |
00:31:00.320
So it's a pretty, this protopian existence is pretty peaceful.
link |
00:31:03.640
Yeah, I mean, and then, for example,
link |
00:31:05.860
one of the main fights in the book is they're fighting,
link |
00:31:10.200
there's these artificial hells that,
link |
00:31:13.820
and people don't think it's ethical to have artificial hell.
link |
00:31:17.600
Like basically when people do crime, they get sent,
link |
00:31:19.520
like when they die, their memory gets sent
link |
00:31:21.140
to an artificial hell and they're eternally tortured.
link |
00:31:23.480
And so, and then the way that society is deciding
link |
00:31:27.660
whether or not to have the artificial hell
link |
00:31:29.280
is that they're having these simulated,
link |
00:31:31.900
they're having like a simulated war.
link |
00:31:33.200
So instead of actual blood, you know,
link |
00:31:36.000
people are basically essentially fighting in a video game
link |
00:31:38.520
to choose the outcome of this.
link |
00:31:40.080
But they're still experiencing the suffering
link |
00:31:42.760
in this artificial hell or no?
link |
00:31:44.520
Can you experience stuff or?
link |
00:31:45.760
So the artificial hell sucks.
link |
00:31:47.200
And a lot of people in the culture want to get rid
link |
00:31:48.800
of the artificial hell.
link |
00:31:49.920
There's a simulated wars,
link |
00:31:51.280
are they happening in the artificial hell?
link |
00:31:53.320
So no, the simulated wars are happening
link |
00:31:55.520
outside of the artificial hell,
link |
00:31:57.080
between the political factions who are,
link |
00:31:59.960
so this political faction says we should have simulated hell
link |
00:32:02.880
to deter crime.
link |
00:32:05.000
And this political faction is saying,
link |
00:32:06.960
no, simulated hell is unethical.
link |
00:32:08.920
And so instead of like having, you know,
link |
00:32:11.700
blowing each other up with nukes,
link |
00:32:13.120
they're having like a giant Fortnite battle
link |
00:32:17.160
to decide this, which, you know, to me that's protopia.
link |
00:32:21.680
That's like, okay, we can have war without death.
link |
00:32:25.640
You know, I don't think there should be simulated hells.
link |
00:32:27.440
I think that is definitely one of the ways
link |
00:32:29.800
in which technology could go very, very, very, very wrong.
link |
00:32:34.280
So almost punishing people in a digital space
link |
00:32:37.120
or something like that.
link |
00:32:37.960
Yeah, like torturing people's memories.
link |
00:32:41.680
So either as a deterrent, like if you committed a crime,
link |
00:32:44.760
but also just for personal pleasure,
link |
00:32:46.560
if there's some sick, demented humans in this world.
link |
00:32:50.520
Dan Carlin actually has this
link |
00:32:55.040
episode of Hardcore History called Painfotainment.
link |
00:32:59.440
Oh, that episode is fucked.
link |
00:33:02.280
It's dark, because he kind of goes through human history
link |
00:33:05.360
and says like, we as humans seem to enjoy,
link |
00:33:08.800
secretly enjoy or used to be openly enjoy
link |
00:33:12.120
sort of the torture and the death,
link |
00:33:14.400
watching the death and torture of other humans.
link |
00:33:17.720
I do think if people were consenting,
link |
00:33:21.720
we should be allowed to have gladiatorial matches.
link |
00:33:26.240
But consent is hard to achieve in those situations.
link |
00:33:28.560
It always starts getting slippery.
link |
00:33:31.080
Like it could be also forced consent,
link |
00:33:32.760
like it starts getting weird.
link |
00:33:35.280
There's way too much excitement.
link |
00:33:37.360
Like this is what he highlights.
link |
00:33:38.620
There's something about human nature
link |
00:33:40.520
that wants to see that violence.
link |
00:33:42.280
And it's really dark.
link |
00:33:44.280
And you hope that we can sort of overcome
link |
00:33:47.160
that aspect of human nature,
link |
00:33:48.840
but that's still within us somewhere.
link |
00:33:51.280
Well, I think that's what we're doing right now.
link |
00:33:53.240
I have this theory that what is very important
link |
00:33:56.400
about the current moment is that all of evolution
link |
00:34:00.880
has been survival of the fittest up until now.
link |
00:34:03.400
And at some point, the lines are kind of fuzzy,
link |
00:34:07.260
but in the recent past, or maybe even just right now,
link |
00:34:12.080
we're getting to this point
link |
00:34:14.400
where we can choose intelligent design.
link |
00:34:19.440
Like we probably since like the integration of the iPhone,
link |
00:34:23.440
like we are becoming cyborgs.
link |
00:34:24.820
Like our brains are fundamentally changed.
link |
00:34:27.680
Everyone who grew up with electronics,
link |
00:34:29.380
we are fundamentally different from previous,
link |
00:34:32.520
from homo sapiens.
link |
00:34:33.440
I call us homo techno.
link |
00:34:34.680
I think we have evolved into homo techno,
link |
00:34:36.840
which is like essentially a new species.
link |
00:34:39.380
Like if you look at the way,
link |
00:34:41.380
if you took an MRI of my brain
link |
00:34:43.400
and you took an MRI of like a medieval brain,
link |
00:34:46.600
I think it would be very different
link |
00:34:48.160
the way that it has evolved.
link |
00:34:49.960
Do you think when historians look back at this time,
link |
00:34:52.000
they'll see like this was a fundamental shift
link |
00:34:54.040
to what a human being is?
link |
00:34:54.880
I think, I do not think we are still homo sapiens.
link |
00:34:58.040
I believe we are homo techno.
link |
00:34:59.520
And I think we have evolved.
link |
00:35:04.400
And I think right now, the way we are evolving,
link |
00:35:07.380
we can choose how we do that.
link |
00:35:09.080
And I think we are being very reckless
link |
00:35:10.680
about how we're doing that.
link |
00:35:11.800
Like we're just having social media,
link |
00:35:13.020
but I think this idea that like this is a time
link |
00:35:16.320
to choose intelligent design should be taken very seriously.
link |
00:35:19.600
It like now is the moment to reprogram the human computer.
link |
00:35:24.200
It's like, if you go blind,
link |
00:35:27.260
your visual cortex will get taken over
link |
00:35:29.920
with other functions.
link |
00:35:32.000
We can choose our own evolution.
link |
00:35:35.160
We can change the way our brains work.
link |
00:35:37.160
And so we actually have a huge responsibility to do that.
link |
00:35:39.920
And I think I'm not sure who should be responsible for that,
link |
00:35:42.880
but there's definitely not adequate education.
link |
00:35:45.200
We're being inundated with all this technology
link |
00:35:46.940
that is fundamentally changing
link |
00:35:49.020
the physical structure of our brains.
link |
00:35:50.900
And we are not adequately responding to that
link |
00:35:55.920
to choose how we wanna evolve.
link |
00:35:57.460
And we could evolve, we could be really whatever we want.
link |
00:36:00.600
And I think this is a really important time.
link |
00:36:03.040
And I think if we choose correctly and we choose wisely,
link |
00:36:06.440
consciousness could exist for a very long time
link |
00:36:09.720
and integration with AI could be extremely positive.
link |
00:36:12.720
And I don't think enough people are focusing
link |
00:36:14.400
on this specific situation.
link |
00:36:16.360
Do you think we might irreversibly screw things up
link |
00:36:19.160
if we get things wrong now?
link |
00:36:20.000
Because the flip side of that,
link |
00:36:21.640
it seems humans are pretty adaptive.
link |
00:36:23.200
So maybe the way we figure things out
link |
00:36:25.960
is by screwing it up, like social media.
link |
00:36:28.080
Over a generation, we'll see the negative effects
link |
00:36:30.600
of social media, and then we build new social medias,
link |
00:36:33.080
and we just keep improving stuff.
link |
00:36:34.960
And then we learn from the failures of the past.
link |
00:36:37.680
Because humans seem to be really adaptive.
link |
00:36:39.920
On the flip side, we can get it wrong in a way
link |
00:36:43.020
where literally we create weapons of war
link |
00:36:46.400
or increase hate.
link |
00:36:48.360
Past a certain threshold, we really do a lot of damage.
link |
00:36:51.840
I mean, I think we're optimized
link |
00:36:53.700
to notice the negative things.
link |
00:36:55.680
But I would actually say one of the things
link |
00:37:00.120
that I think people aren't noticing
link |
00:37:02.140
is if you look at Silicon Valley
link |
00:37:03.520
and you look at the technocracy,
link |
00:37:06.600
like what's been happening there.
link |
00:37:09.000
When Silicon Valley started, it was all just Facebook
link |
00:37:11.460
and all this for profit crap
link |
00:37:14.040
that really wasn't particular.
link |
00:37:16.920
I guess it was useful, but it's sort of just whatever.
link |
00:37:22.200
But now you see lab grown meat, compostable,
link |
00:37:26.680
or biodegradable, single use cutlery,
link |
00:37:30.200
or meditation apps.
link |
00:37:33.120
I think we are actually evolving and changing,
link |
00:37:38.400
and technology is changing.
link |
00:37:39.600
I think there just maybe there isn't
link |
00:37:43.840
quite enough education about this.
link |
00:37:48.140
And also, I don't know if there's quite enough incentive
link |
00:37:52.240
for it because I think the way capitalism works,
link |
00:37:56.600
what we define as profit,
link |
00:37:58.480
we're also working on an old model
link |
00:38:00.400
of what we define as profit.
link |
00:38:01.540
I really think if we changed the idea of profit
link |
00:38:06.400
to include social good,
link |
00:38:08.200
you can have economic profit,
link |
00:38:09.840
social good also counting as profit
link |
00:38:12.580
would incentivize things that are more useful
link |
00:38:15.100
and more whatever spiritual technology
link |
00:38:16.960
or positive technology or things that help reprogram
link |
00:38:21.280
a human computer in a good way
link |
00:38:22.920
or things that help us intelligently design our new brains.
link |
00:38:28.280
Yeah, there's no reason why within the framework
link |
00:38:30.480
of capitalism, the word profit or the idea of profit
link |
00:38:33.800
can't also incorporate the well being of a human being.
link |
00:38:37.360
So like long term well being, long term happiness.
link |
00:38:41.440
Or even for example, we were talking about motherhood,
link |
00:38:43.840
like part of the reason I'm so late
link |
00:38:44.840
is because I had to get the baby to bed.
link |
00:38:47.400
And it's like, I keep thinking about motherhood,
link |
00:38:48.880
how under capitalism, it's like this extremely essential job
link |
00:38:53.680
that is very difficult that is not compensated.
link |
00:38:56.440
And we sort of like value things
link |
00:38:58.420
by how much we compensate them.
link |
00:39:01.880
And so we really devalue motherhood in our society
link |
00:39:04.880
and pretty much all societies.
link |
00:39:06.080
Like capitalism does not recognize motherhood.
link |
00:39:08.160
It's just a job that you're supposed to do for free.
link |
00:39:11.120
And it's like, but I feel like producing great humans
link |
00:39:15.360
should be seen as a great, as profit under capitalism.
link |
00:39:19.440
Like that should be, that's like a huge social good.
link |
00:39:21.480
Like every awesome human that gets made
link |
00:39:24.160
adds so much to the world.
link |
00:39:25.600
So like if that was integrated into the profit structure,
link |
00:39:29.840
then, you know, and if we potentially found a way
link |
00:39:34.240
to compensate motherhood.
link |
00:39:35.640
So come up with a compensation
link |
00:39:37.920
that's much broader than just money or.
link |
00:39:40.640
Or it could just be money.
link |
00:39:42.040
Like, what if you just made, I don't know,
link |
00:39:45.520
but I don't know how you'd pay for that.
link |
00:39:46.640
Like, I mean, that's where you start getting into.
link |
00:39:52.000
Reallocation of resources that people get upset over.
link |
00:39:56.360
Well, like what if we made like a motherhood DAO?
link |
00:39:58.480
Yeah, yeah.
link |
00:40:01.320
And, you know, used it to fund like single mothers,
link |
00:40:07.680
like, you know, pay for making babies.
link |
00:40:13.640
So, I mean, if you create and put beautiful things
link |
00:40:17.360
onto the world, that could be companies,
link |
00:40:19.760
that can be bridges, that could be art,
link |
00:40:22.640
that could be a lot of things,
link |
00:40:24.280
and that could be children, which are.
link |
00:40:28.080
Or education or.
link |
00:40:29.320
Anything, that should be valued by society,
link |
00:40:32.380
and that should be somehow incorporated into the framework
link |
00:40:35.160
of what, as a market, of what.
link |
00:40:38.480
Like, if you contribute children to this world,
link |
00:40:40.560
that should be valued and respected
link |
00:40:42.440
and sort of celebrated, like, proportional to what it is,
link |
00:40:48.000
which is, it's the thing that fuels human civilization.
link |
00:40:51.560
Yeah, like I. It's kind of important.
link |
00:40:53.000
I feel like everyone's always saying,
link |
00:40:54.600
I mean, I think we're in very different social spheres,
link |
00:40:56.520
but everyone's always saying, like, dismantle capitalism.
link |
00:40:58.880
And I'm like, well, okay, well,
link |
00:40:59.840
I don't think the government should own everything.
link |
00:41:02.120
Like, I don't think we should not have private ownership.
link |
00:41:04.200
Like, that's scary.
link |
00:41:05.360
You know, like that starts getting into weird stuff
link |
00:41:07.320
and just sort of like,
link |
00:41:08.920
I feel there's almost no way to do that
link |
00:41:10.080
without a police state, you know?
link |
00:41:13.520
But obviously, capitalism has some major flaws.
link |
00:41:16.980
And I think actually Mac showed me this idea
link |
00:41:20.360
called social capitalism, which is a form of capitalism
link |
00:41:23.480
that just like considers social good to be also profit.
link |
00:41:28.480
Like, you know, it's like, right now companies need to,
link |
00:41:31.560
like, you're supposed to grow every quarter or whatever
link |
00:41:34.560
to like show that you're functioning well,
link |
00:41:38.700
but it's like, okay, well,
link |
00:41:39.920
what if you kept the same amount of profit?
link |
00:41:42.920
You're still in the green,
link |
00:41:43.920
but then you have also all this social good.
link |
00:41:45.600
Like, do you really need all this extra economic growth
link |
00:41:47.560
or could you add this social good and that counts?
link |
00:41:49.520
And, you know, I don't know if, I am not an economist.
link |
00:41:54.000
I have no idea how this could be achieved, but.
link |
00:41:56.360
I don't think economists know how anything
link |
00:41:58.720
could be achieved either, but they pretend.
link |
00:42:00.320
It's the thing, they construct a model
link |
00:42:02.280
and they go on TV shows and sound like an expert.
link |
00:42:06.160
That's the definition of economist.
link |
00:42:08.540
How did being a mother, becoming a mother
link |
00:42:12.260
change you as a human being, would you say?
link |
00:42:16.160
Man, I think it kind of changed everything
link |
00:42:21.000
and it's still changing me a lot.
link |
00:42:22.380
It's actually changing me more right now in this moment
link |
00:42:24.880
than it was before.
link |
00:42:26.400
Like today, like this?
link |
00:42:28.520
Just like in the most recent months and stuff.
link |
00:42:33.480
Can you elucidate that, how change,
link |
00:42:37.640
like when you wake up in the morning
link |
00:42:39.360
and you look at yourself, it's again, which, who are you?
link |
00:42:42.920
How have you become different, would you say?
link |
00:42:45.520
I think it's just really reorienting my priorities.
link |
00:42:50.680
And at first I was really fighting against that
link |
00:42:53.400
because I somehow felt it was like a failure
link |
00:42:55.560
of feminism or something.
link |
00:42:56.680
Like I felt like it was like bad
link |
00:42:59.840
if like my kids started mattering more than my work.
link |
00:43:05.720
And then like more recently I started sort of analyzing
link |
00:43:08.960
that thought in myself and being like,
link |
00:43:12.240
that's also kind of a construct.
link |
00:43:13.920
It's like, we've just devalued motherhood so much
link |
00:43:16.280
in our culture that like, I feel guilty for caring
link |
00:43:21.280
about my kids more than I care about my work.
link |
00:43:23.320
So feminism includes breaking out
link |
00:43:25.800
of whatever the construct is.
link |
00:43:28.360
So just continually breaking,
link |
00:43:30.900
it's like freedom, empowering you to be free.
link |
00:43:34.640
And that means...
link |
00:43:37.560
But it also, but like being a mother,
link |
00:43:40.040
like I'm so much more creative.
link |
00:43:41.840
Like I cannot believe the massive amount
link |
00:43:45.880
of brain growth that I am.
link |
00:43:48.440
Why do you think that is?
link |
00:43:49.280
Just cause like the stakes are higher somehow?
link |
00:43:51.940
I think it's like, it's just so trippy
link |
00:43:58.020
watching consciousness emerge.
link |
00:44:00.800
It's just like, it's like going on a crazy journey
link |
00:44:06.860
or something.
link |
00:44:07.700
It's like the craziest science fiction novel
link |
00:44:10.220
you could ever read.
link |
00:44:11.060
It's just so crazy watching consciousness come into being.
link |
00:44:15.100
And then at the same time,
link |
00:44:16.540
like you're forced to value your time so much.
link |
00:44:21.140
Like when I have creative time now, it's so sacred.
link |
00:44:23.520
I need to like be really fricking on it.
link |
00:44:29.380
But the other thing is that I used to just be like a cynic
link |
00:44:34.680
and I used to just wanna...
link |
00:44:35.520
Like my last album was called Miss Anthropocene
link |
00:44:38.060
and it was like this like, it was like a study in villainy
link |
00:44:42.620
or like it was like, well, what if we have,
link |
00:44:44.980
instead of the old gods, we have like new gods
link |
00:44:46.820
and it's like Miss Anthropocene is like misanthrope
link |
00:44:49.700
like and Anthropocene, which is like the, you know,
link |
00:44:53.100
like and she's the goddess of climate change or whatever.
link |
00:44:55.780
And she's like destroying the world.
link |
00:44:56.980
And it was just like, it was like dark
link |
00:44:59.660
and it was like a study in villainy.
link |
00:45:01.380
And it was sort of just like,
link |
00:45:02.700
like I used to like have no problem just making cynical,
link |
00:45:06.820
angry, scary art.
link |
00:45:08.980
And not that there's anything wrong with that,
link |
00:45:11.100
but I think having kids just makes you such an optimist.
link |
00:45:13.580
It just inherently makes you wanna be an optimist so bad
link |
00:45:16.940
that like I feel more responsibility
link |
00:45:20.380
to make more optimistic things.
link |
00:45:23.820
And I get a lot of shit for it
link |
00:45:25.720
because everyone's like, oh, you're so privileged.
link |
00:45:28.500
Stop talking about like pie in the sky,
link |
00:45:30.260
stupid concepts and focus on like the now.
link |
00:45:32.700
But it's like, I think if we don't ideate
link |
00:45:36.500
about futures that could be good,
link |
00:45:40.520
we won't be able to get them.
link |
00:45:41.540
If everything is Blade Runner,
link |
00:45:42.780
then we're gonna end up with Blade Runner.
link |
00:45:44.260
It's like, as we said earlier, life imitates art.
link |
00:45:47.300
Like life really does imitate art.
link |
00:45:49.740
And so we really need more protopian or utopian art.
link |
00:45:53.960
I think this is incredibly essential
link |
00:45:56.060
for the future of humanity.
link |
00:45:58.060
And I think the current discourse
link |
00:46:00.380
where thinking about protopia or utopia
link |
00:46:09.660
is seen as a dismissal of the problems
link |
00:46:11.580
that we currently have.
link |
00:46:12.420
I think that is an incorrect mindset.
link |
00:46:16.160
And like having kids just makes me wanna imagine
link |
00:46:20.180
amazing futures that like maybe I won't be able to build,
link |
00:46:24.420
but they will be able to build if they want to.
link |
00:46:26.900
Yeah, it does seem like ideation
link |
00:46:28.300
is a precursor to creation.
link |
00:46:30.220
So you have to imagine it in order to be able to build it.
link |
00:46:33.820
And there is a sad thing about human nature
link |
00:46:36.700
that somehow a cynical view of the world
link |
00:46:40.740
is seen as an insightful view.
link |
00:46:44.060
You know, cynicism is often confused for insight,
link |
00:46:46.980
which is sad to see.
link |
00:46:48.900
And optimism is confused for naivete.
link |
00:46:52.620
Yes, yes.
link |
00:46:53.460
Like you don't, you're blinded by your,
link |
00:46:57.700
maybe your privilege or whatever.
link |
00:46:59.320
You're blinded by something, but you're certainly blinded.
link |
00:47:02.020
That's sad, that's sad to see
link |
00:47:04.560
because it seems like the optimists
link |
00:47:06.020
are the ones that create our future.
link |
00:47:10.260
They're the ones that build.
link |
00:47:11.900
In order to build the crazy thing,
link |
00:47:13.560
you have to be optimistic.
link |
00:47:14.740
You have to be either stupid or excited or passionate
link |
00:47:19.220
or mad enough to actually believe that it can be built.
link |
00:47:22.740
And those are the people that built it.
link |
00:47:24.200
My favorite quote of all time is from Star Wars, Episode 8,
link |
00:47:29.020
which I know everyone hates.
link |
00:47:30.940
Do you like Star Wars, Episode 8?
link |
00:47:32.420
No, yeah, I would say I would probably hate it, yeah.
link |
00:47:36.820
I don't have strong feelings about it.
link |
00:47:38.940
Let me backtrack.
link |
00:47:39.780
I don't have strong feelings about Star Wars.
link |
00:47:41.500
I'm a Tolkien person.
link |
00:47:43.040
I'm more into dragons and orcs and ogres.
link |
00:47:47.860
Yeah, I mean, Tolkien forever.
link |
00:47:49.620
I really want to have one more son and call him,
link |
00:47:51.780
I thought Tau Techno Tolkien would be cool.
link |
00:47:55.260
It's a lot of T's, I like it.
link |
00:47:56.940
Yeah, and well, and Tau is 6.28, two pi.
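[Aside for readers: the "six, two, eight" here is the circle constant tau, which equals two pi, roughly 6.28. A quick illustrative check in Python, not part of the conversation:]

```python
import math

# tau is the "circle constant": tau = 2 * pi, approximately 6.28
tau = 2 * math.pi
print(round(tau, 2))  # 6.28
```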
link |
00:47:59.260
Yeah, Tau Techno, yeah, yeah, yeah.
link |
00:48:01.740
And then techno is obviously the best genre of music,
link |
00:48:04.780
but also like technocracy.
link |
00:48:06.260
It just sounds really good.
link |
00:48:07.100
Yeah, that's right, and Techno Tolkien, Tau Techno Tolkien.
link |
00:48:11.260
That's a good, that's it.
link |
00:48:12.100
Tau Techno Tolkien, but Star Wars, Episode 8,
link |
00:48:15.980
I know a lot of people have issues with it.
link |
00:48:17.260
Personally, on the record,
link |
00:48:18.720
I think it's the best Star Wars film.
link |
00:48:24.100
You're starting trouble today.
link |
00:48:25.180
Yeah, but don't kill what you hate, save what you love.
link |
00:48:29.260
Don't kill what you hate.
link |
00:48:30.460
Don't kill what you hate, save what you love.
link |
00:48:32.340
And I think we're, in society right now,
link |
00:48:34.980
we're in a diagnosis mode.
link |
00:48:36.580
We're just diagnosing and diagnosing and diagnosing,
link |
00:48:39.140
and we're trying to kill what we hate,
link |
00:48:41.660
and we're not trying to save what we love enough.
link |
00:48:44.220
And there's this Buckminster Fuller quote,
link |
00:48:46.260
which I'm gonna butcher,
link |
00:48:47.140
because I don't remember it correctly,
link |
00:48:48.220
but it's something along the lines of,
link |
00:48:52.580
don't try to destroy the old bad models,
link |
00:48:58.340
render them obsolete with better models.
link |
00:49:03.100
Maybe we don't need to destroy the oil industry.
link |
00:49:05.620
Maybe we just create a great new battery technology
link |
00:49:08.940
and sustainable transport,
link |
00:49:10.340
and just make it economically unreasonable
link |
00:49:13.140
to still continue to rely on fossil fuels.
link |
00:49:17.100
It's like, don't kill what you hate, save what you love.
link |
00:49:20.180
Make new things and just render the old things unusable.
link |
00:49:24.460
It's like if the college debt is so bad,
link |
00:49:29.060
and universities are so expensive,
link |
00:49:31.500
and I feel like education is becoming obsolete.
link |
00:49:35.700
I feel like we could completely revolutionize education,
link |
00:49:38.460
and we could make it free.
link |
00:49:39.380
And it's like, you look at JSTOR,
link |
00:49:40.420
and you have to pay to get all the studies and everything.
link |
00:49:43.460
What if we created a DAO that bought JSTOR,
link |
00:49:46.520
or we created a DAO that was funding studies,
link |
00:49:48.460
and those studies were open source, or free for everyone.
link |
00:49:51.880
And what if we just open sourced education
link |
00:49:55.100
and decentralized education and made it free,
link |
00:49:56.980
and all research was on the internet,
link |
00:50:00.900
and all the outcomes of studies were on the internet,
link |
00:50:05.220
and no one has student debt,
link |
00:50:10.060
and you just take tests when you apply for a job,
link |
00:50:14.220
and if you're qualified, then you can work there.
link |
00:50:18.060
This is just like, I don't know how anything works.
link |
00:50:20.500
I'm just randomly ranting, but.
link |
00:50:22.700
I like the humility.
link |
00:50:24.920
You gotta think from just basic first principles.
link |
00:50:27.780
What is the problem?
link |
00:50:28.900
What's broken?
link |
00:50:29.740
What are some ideas?
link |
00:50:30.740
That's it.
link |
00:50:31.580
And get excited about those ideas,
link |
00:50:33.100
and share your excitement,
link |
00:50:34.660
and don't tear each other down.
link |
00:50:37.100
It's just when you kill things,
link |
00:50:38.140
you often end up killing yourself.
link |
00:50:40.140
Like war is not a one sided thing,
link |
00:50:43.380
like you're not gonna go in and just kill them,
link |
00:50:45.060
like you're gonna get stabbed.
link |
00:50:46.900
It's like, and I think when I talk about this nexus point
link |
00:50:50.380
of that we're in this point in society
link |
00:50:53.320
where we're switching to intelligent design,
link |
00:50:55.140
I think part of our switch to intelligent design
link |
00:50:57.220
is that we need to choose nonviolence.
link |
00:50:59.480
And we need to, like, I think we can choose to start,
link |
00:51:04.300
I don't think we can eradicate violence from our species,
link |
00:51:07.380
because I think we need it a little bit,
link |
00:51:10.540
but I think we can choose
link |
00:51:12.100
to really reorient our primitive brains
link |
00:51:14.900
that are fighting over scarcity,
link |
00:51:16.300
and that are so attack oriented,
link |
00:51:20.620
and move into, we can optimize for creativity and building.
link |
00:51:25.420
Yeah, it's interesting to think how that happens,
link |
00:51:27.220
so some of it is just education,
link |
00:51:29.580
some of it is living life and introspecting your own mind,
link |
00:51:34.100
and trying to live up to the better angels of your nature
link |
00:51:37.780
for each one of us, all those kinds of things at scale.
link |
00:51:41.800
That's how we can sort of start to minimize
link |
00:51:44.580
the amount of destructive war in our world,
link |
00:51:48.160
and that's, to me, probably you're the same,
link |
00:51:51.100
technology is a really promising way to do that.
link |
00:51:55.220
Like, social media should be a really promising way
link |
00:51:57.700
to do that, it's a way we connect.
link |
00:52:00.060
I, you know, for the most part,
link |
00:52:01.540
I really enjoy social media.
link |
00:52:03.260
I just know all the negative stuff.
link |
00:52:05.180
I don't engage with any of the negative stuff.
link |
00:52:07.540
Just not even, like, by blocking
link |
00:52:10.300
or any of that kind of stuff,
link |
00:52:11.400
but just not letting it enter my mind.
link |
00:52:14.740
Like, just, like, when somebody says something negative,
link |
00:52:18.500
I see it, I immediately think positive thoughts about them,
link |
00:52:23.260
and I just forget they exist after that.
link |
00:52:26.260
Just move on, because, like, that negative energy,
link |
00:52:28.640
if I return the negative energy,
link |
00:52:30.300
they're going to get excited in a negative way right back,
link |
00:52:34.260
and it's just this kind of vicious cycle.
link |
00:52:38.420
But you would think technology would assist us
link |
00:52:40.540
in this process of letting go,
link |
00:52:42.860
of not taking things personally,
link |
00:52:44.720
of not engaging in the negativity,
link |
00:52:46.380
but unfortunately, social media profits from the negativity,
link |
00:52:50.260
so the current models.
link |
00:52:52.020
I mean, social media is like a gun.
link |
00:52:53.420
Like, you should take a course before you use it.
link |
00:52:57.260
Like, it's like, this is what I mean,
link |
00:52:59.380
like, when I say reprogram the human computer.
link |
00:53:01.100
Like, in school, you should learn
link |
00:53:03.100
about how social media optimizes
link |
00:53:05.060
to, you know, raise your cortisol levels
link |
00:53:07.140
and make you angry and crazy and stressed,
link |
00:53:09.220
and, like, you should learn how to have hygiene
link |
00:53:12.580
about how you use social media.
link |
00:53:16.740
But, so you can, yeah,
link |
00:53:18.420
choose not to focus on the negative stuff, but I don't know.
link |
00:53:22.380
I'm not sure social media should,
link |
00:53:24.580
I guess it should exist.
link |
00:53:25.700
I'm not sure.
link |
00:53:27.420
I mean, we're in the messy, it's the experimental phase.
link |
00:53:29.420
Like, we're working it out.
link |
00:53:30.260
Yeah, it's the early days.
link |
00:53:31.080
I don't even know, when you say social media,
link |
00:53:32.700
I don't know what that even means.
link |
00:53:33.820
We're in the very early days.
link |
00:53:35.020
I think social media is just basic human connection
link |
00:53:37.780
in the digital realm, and that, I think it should exist,
link |
00:53:41.940
but there's so many ways to do it in a bad way.
link |
00:53:43.940
There's so many ways to do it in a good way.
link |
00:53:45.900
There's all discussions of all the same human rights.
link |
00:53:48.100
We talk about freedom of speech.
link |
00:53:49.900
We talk about sort of violence
link |
00:53:52.260
in the space of digital media.
link |
00:53:54.040
We talk about hate speech.
link |
00:53:56.180
We talk about all these things that we had to figure out
link |
00:53:59.020
back in the day in the physical space.
link |
00:54:01.260
We're now figuring out in the digital space,
link |
00:54:03.620
and it's like baby stages.
link |
00:54:06.500
When the printing press came out,
link |
00:54:07.820
it was like pure chaos for a minute, you know?
link |
00:54:10.180
It's like when you inject,
link |
00:54:12.340
when there's a massive information injection
link |
00:54:14.300
into the general population, there's just gonna be,
link |
00:54:20.620
I feel like the printing press, I don't have the years,
link |
00:54:23.480
but it was like printing press came out,
link |
00:54:24.980
shit got really fucking bad for a minute,
link |
00:54:27.160
but then we got the enlightenment.
link |
00:54:29.180
And so it's like, I think we're in,
link |
00:54:30.980
this is like the second coming of the printing press.
link |
00:54:34.780
We're probably gonna have some shitty times for a minute,
link |
00:54:37.760
and then we're gonna recalibrate
link |
00:54:40.660
to have a better understanding of how we consume media
link |
00:54:44.660
and how we deliver media.
link |
00:54:47.940
Speaking of programming the human computer,
link |
00:54:50.920
you mentioned Baby X.
link |
00:54:52.940
So there's this young consciousness coming to be,
link |
00:54:56.980
came from a cell.
link |
00:54:58.340
Like that whole thing doesn't even make sense.
link |
00:55:01.180
It came from DNA.
link |
00:55:02.380
Yeah.
link |
00:55:03.220
And then there's this baby computer
link |
00:55:04.660
that just like grows and grows and grows and grows,
link |
00:55:06.780
and now there's a conscious being
link |
00:55:08.440
with extremely impressive cognitive capabilities with,
link |
00:55:13.780
Have you met him?
link |
00:55:14.620
Yes, yeah.
link |
00:55:15.460
Yeah.
link |
00:55:16.280
He's actually really smart.
link |
00:55:17.120
He's really smart.
link |
00:55:17.960
Yeah.
link |
00:55:18.780
He's weird.
link |
00:55:19.620
Yeah.
link |
00:55:20.460
Or a baby.
link |
00:55:21.280
He does.
link |
00:55:22.120
I don't, I haven't.
link |
00:55:22.960
I don't know a lot of other babies,
link |
00:55:23.780
but he seems to be smart.
link |
00:55:24.620
Yeah, I don't hang out with babies often,
link |
00:55:25.460
but this baby was very impressive.
link |
00:55:26.860
He does a lot of pranks and stuff.
link |
00:55:28.940
Oh, so he's like.
link |
00:55:29.780
Like he'll like give you a treat
link |
00:55:31.140
and then take it away and laugh and like stuff like that.
link |
00:55:33.580
So he's like a chess player.
link |
00:55:35.340
So here's a cognitive sort of,
link |
00:55:39.820
there's a computer being programmed.
link |
00:55:41.620
So he's taking in the environment,
link |
00:55:43.180
interacting with a specific set of humans.
link |
00:55:45.900
How would you, first of all, what is it?
link |
00:55:48.900
What, let me ask.
link |
00:55:50.420
I want to ask how do you program this computer?
link |
00:55:53.220
And also how do you make sense of that
link |
00:55:55.580
there's a conscious being right there
link |
00:55:58.100
that wasn't there before?
link |
00:55:59.260
It's giving me a lot of crisis thoughts.
link |
00:56:01.460
I'm thinking really hard.
link |
00:56:02.700
I think that's part of the reason
link |
00:56:03.700
it's like, I'm struggling to focus on
link |
00:56:06.100
art and stuff right now.
link |
00:56:07.140
Cause baby X is becoming conscious
link |
00:56:09.620
and like, it's just reorienting my brain.
link |
00:56:12.580
Like my brain is suddenly totally shifting of like,
link |
00:56:14.700
oh shit, like the way we raise children.
link |
00:56:18.220
Like, I hate all the baby books and everything.
link |
00:56:21.980
I hate them.
link |
00:56:22.820
Like they're, oh, the art is so bad.
link |
00:56:24.320
And like all this stuff, everything about all the aesthetics.
link |
00:56:29.060
And like, I'm just like, ah, like this is so.
link |
00:56:32.300
The programming languages we're using
link |
00:56:35.060
to program these baby computers isn't good.
link |
00:56:37.220
Yeah, like I'm thinking, and I,
link |
00:56:39.980
not that I have like good answers
link |
00:56:41.340
or know what to do, but I'm just thinking really,
link |
00:56:46.020
really hard about it.
link |
00:56:46.860
I, we recently watched Totoro with him, Studio Ghibli.
link |
00:56:52.860
And it's just like a fantastic film.
link |
00:56:56.100
And he like responded to,
link |
00:56:57.500
I know you're not supposed to show baby screens too much,
link |
00:56:59.460
but like, I think it's the most sort of like,
link |
00:57:04.260
I feel like it's the highest art baby content.
link |
00:57:06.940
Like it really speaks, there's almost no talking in it.
link |
00:57:12.020
It's really simple.
link |
00:57:13.060
Although all the dialogue is super, super, super simple,
link |
00:57:16.020
you know, and it's like a one to three year old
link |
00:57:19.620
can like really connect with it.
link |
00:57:21.020
Like it feels like it's almost aimed
link |
00:57:22.480
at like a one to three year old,
link |
00:57:24.600
but it's like great art and it's so imaginative
link |
00:57:27.620
and it's so beautiful.
link |
00:57:28.700
And like the first time I showed it to him,
link |
00:57:31.660
he was just like so invested in it,
link |
00:57:33.580
unlike I've ever, unlike anything else
link |
00:57:35.740
I'd ever shown him.
link |
00:57:36.740
Like he was just like crying when they cry
link |
00:57:38.500
and laughing when they laugh,
link |
00:57:39.580
like just like having this roller coaster of like emotions,
link |
00:57:42.520
like, and he learned a bunch of words.
link |
00:57:44.020
Like he was, and he started saying Totoro
link |
00:57:46.020
and started just saying all this stuff
link |
00:57:48.560
after watching Totoro,
link |
00:57:49.860
and he wants to watch it all the time.
link |
00:57:52.020
And I was like, man, why isn't there an industry of this?
link |
00:57:55.460
Like why aren't our best artists focusing on making art
link |
00:57:59.980
like for the birth of consciousness?
link |
00:58:04.180
Like, and that's one of the things I've been thinking
link |
00:58:07.080
I really wanna start doing.
link |
00:58:08.380
You know, I don't wanna speak before I do things too much,
link |
00:58:10.380
but like, I'm just like ages one to three,
link |
00:58:15.220
like we should be putting so much effort into that.
link |
00:58:18.680
And the other thing about Totoro is it's like,
link |
00:58:21.020
it's like better for the environment
link |
00:58:22.380
because adults love Totoro.
link |
00:58:23.860
It's such good art that everyone loves it.
link |
00:58:25.580
Like I still have all my old Totoro merch
link |
00:58:27.380
from when I was a kid.
link |
00:58:28.660
Like I literally have the most ragged old Totoro merch.
link |
00:58:33.960
Like everybody loves it, everybody keeps it.
link |
00:58:35.780
It's like, why does the art we have for babies
link |
00:58:40.900
need to suck and be not accessible to adults
link |
00:58:45.300
and then just be thrown out when, you know,
link |
00:58:49.100
they age out of it?
link |
00:58:50.300
Like, it's like, I don't know.
link |
00:58:53.180
I don't have like a fully formed thought here,
link |
00:58:54.740
but this is just something I've been thinking about a lot
link |
00:58:56.220
is like, how do we like,
link |
00:58:58.660
how do we have more Totoroesque content?
link |
00:59:01.180
Like how do we have more content like this
link |
00:59:02.500
that like is universal and everybody loves,
link |
00:59:05.100
but is like really geared to an emerging consciousness?
link |
00:59:10.160
Emerging consciousness in the first like three years of life
link |
00:59:13.100
when so much turmoil,
link |
00:59:14.260
so much evolution of mind is happening.
link |
00:59:16.540
It seems like a crucial time.
link |
00:59:18.020
Would you say to make it not suck,
link |
00:59:21.820
do you think of basically treating a child
link |
00:59:26.580
like they have the capacity to have the brilliance
link |
00:59:28.980
of an adult or even beyond that?
link |
00:59:30.760
Is that how you think of that mind or?
link |
00:59:33.420
No, cause they still,
link |
00:59:35.080
they like it when you talk weird and stuff.
link |
00:59:37.900
Like they respond better to,
link |
00:59:39.660
cause even they can imitate better
link |
00:59:41.040
when your voice is higher.
link |
00:59:42.100
Like people say like, oh, don't do baby talk.
link |
00:59:44.020
But it's like, when your voice is higher,
link |
00:59:45.380
it's closer to something they can imitate.
link |
00:59:47.340
So they like, like the baby talk actually kind of works.
link |
00:59:50.520
Like it helps them learn to communicate.
link |
00:59:52.100
I've found it to be more effective
link |
00:59:53.360
with learning words and stuff.
link |
00:59:55.300
But like, you're not speaking down to them.
link |
00:59:59.980
Like do they have the capacity
link |
01:00:03.100
to understand really difficult concepts
link |
01:00:05.900
just in a very different way,
link |
01:00:07.700
like an emotional intelligence
link |
01:00:09.080
about something deep within?
link |
01:00:11.500
Oh yeah, no, like if X hurts,
link |
01:00:13.060
like if X bites me really hard and I'm like, ow,
link |
01:00:15.700
like he gets, he's sad.
link |
01:00:17.300
He's like sad if he hurts me by accident.
link |
01:00:19.740
Yeah.
link |
01:00:20.580
Which he's huge, so he hurts me a lot by accident.
link |
01:00:22.540
Yeah, that's so interesting that that mind emerges
link |
01:00:26.860
and he and children don't really have memory of that time.
link |
01:00:31.100
So we can't even have a conversation with them about it.
link |
01:00:32.940
Yeah, I just thank God they don't have a memory
link |
01:00:34.380
of this time because like, think about like,
link |
01:00:37.380
I mean with our youngest baby,
link |
01:00:39.620
like it's like, I'm like, have you read
link |
01:00:42.080
the sci fi short story, I Have No Mouth, and I Must Scream?
link |
01:00:46.660
Good title, no.
link |
01:00:47.980
Oh man, I mean, you should read that.
link |
01:00:49.920
I Have No Mouth, and I Must Scream.
link |
01:00:53.220
I hate getting into this Roko's Basilisk shit.
link |
01:00:55.460
It's kind of a story about the,
link |
01:00:57.660
about like an AI that's like torturing someone in eternity
link |
01:01:03.820
and they have like no body.
link |
01:01:05.540
The way they describe it,
link |
01:01:07.380
it sort of sounds like what it feels like,
link |
01:01:09.200
like being a baby, like you're conscious
link |
01:01:11.340
and you're just getting inputs from everywhere
link |
01:01:13.420
and you have no muscles and you're like jelly
link |
01:01:15.220
and you like can't move and you try to like communicate,
link |
01:01:17.540
but you can't communicate and we're,
link |
01:01:18.940
and like, you're just like in this like hell state.
link |
01:01:22.580
I think it's good we can't remember that.
link |
01:01:25.180
Like my little baby is just exiting that,
link |
01:01:27.620
like she's starting to like get muscles
link |
01:01:29.100
and have more like autonomy,
link |
01:01:30.700
but like watching her go through the opening phase,
link |
01:01:34.160
I was like, I was like, this does not seem good.
link |
01:01:37.700
Oh, you think it's kind of like.
link |
01:01:39.180
Like I think it sucks.
link |
01:01:40.300
I think it might be really violent.
link |
01:01:41.780
Like violent, mentally violent, psychologically violent.
link |
01:01:44.700
Consciousness emerging, I think is a very violent thing.
link |
01:01:47.460
I never thought about that.
link |
01:01:48.300
I think it's possible that we all carry
link |
01:01:49.900
quite a bit of trauma from it that we don't,
link |
01:01:52.000
I think that would be a good thing to study
link |
01:01:54.380
because I think if, I think addressing that trauma,
link |
01:01:58.540
like, I think that might be.
link |
01:01:59.740
Oh, you mean like echoes of it
link |
01:02:00.900
are still there in the shadows somewhere.
link |
01:02:01.740
I think it's gotta be, I feel this, this help,
link |
01:02:04.220
the helplessness, the like existential
link |
01:02:06.940
and that like fear of being in like an unknown place
link |
01:02:10.540
bombarded with inputs and being completely helpless,
link |
01:02:13.460
like that's gotta be somewhere deep in your brain
link |
01:02:15.680
and that can't be good for you.
link |
01:02:17.420
What do you think consciousness is?
link |
01:02:19.500
This whole conversation
link |
01:02:20.980
has impossibly difficult questions.
link |
01:02:22.500
What do you think it is?
link |
01:02:23.340
Debbie said this is like so hard.
link |
01:02:28.460
Yeah, we talked about music for like two minutes.
link |
01:02:30.700
All right.
link |
01:02:31.540
No, I'm so, I'm just over music.
link |
01:02:32.700
I'm over music.
link |
01:02:33.540
Yeah, I still like it.
link |
01:02:35.420
It has its purpose.
link |
01:02:36.240
No, I love music.
link |
01:02:37.080
I mean, music's the greatest thing ever.
link |
01:02:38.080
It's my favorite thing.
link |
01:02:38.920
But I just like every interview is like,
link |
01:02:42.340
what is your process?
link |
01:02:43.600
Like, I don't know.
link |
01:02:44.440
I'm just done.
link |
01:02:45.260
I can't do anything.
link |
01:02:46.100
I do want to ask you about Ableton Live.
link |
01:02:46.940
Oh, I'll tell you about Ableton
link |
01:02:47.780
because Ableton's sick.
link |
01:02:49.420
No one has ever asked about Ableton though.
link |
01:02:51.700
Yeah, well, because I just need tech support mainly.
link |
01:02:54.260
I can help you.
link |
01:02:55.100
I can help you with your Ableton tech.
link |
01:02:56.620
Anyway, from Ableton back to consciousness.
link |
01:02:58.940
What do you, do you think this is a thing
link |
01:03:00.620
that only humans are capable of?
link |
01:03:03.300
Can robots be conscious?
link |
01:03:05.380
Can, like when you think about entities,
link |
01:03:08.220
you think there's aliens out there that are conscious?
link |
01:03:10.220
Like is conscious, what is consciousness?
link |
01:03:11.540
There's this Terrence McKenna quote
link |
01:03:13.300
that I've found that I fucking love.
link |
01:03:15.900
Am I allowed to swear on here?
link |
01:03:17.540
Yes.
link |
01:03:18.360
Nature loves courage.
link |
01:03:21.140
You make the commitment and nature will respond
link |
01:03:23.460
to that commitment by removing impossible obstacles.
link |
01:03:26.580
Dream the impossible dream
link |
01:03:28.060
and the world will not grind you under.
link |
01:03:29.920
It will lift you up.
link |
01:03:31.120
This is the trick.
link |
01:03:32.380
This is what all these teachers and philosophers
link |
01:03:35.160
who really counted, who really touched the alchemical gold,
link |
01:03:38.400
this is what they understood.
link |
01:03:40.100
This is the shamanic dance in the waterfall.
link |
01:03:42.860
This is how magic is done.
link |
01:03:44.700
By hurling yourself into the abyss
link |
01:03:46.420
and discovering it's a feather bed.
link |
01:03:48.620
Yeah.
link |
01:03:49.920
And for this reason,
link |
01:03:50.760
I do think there are no technological limits.
link |
01:03:55.400
I think like what is already happening here,
link |
01:03:58.700
this is like impossible.
link |
01:03:59.880
This is insane.
link |
01:04:01.060
And we've done this in a very limited amount of time.
link |
01:04:03.340
And we're accelerating the rate at which we're doing this.
link |
01:04:05.900
So I think digital consciousness, it's inevitable.
link |
01:04:10.220
And we may not be able to even understand what that means,
link |
01:04:13.300
but I like hurling yourself into the abyss.
link |
01:04:15.780
So we're surrounded by all this mystery
link |
01:04:17.460
and we just keep hurling ourselves into it,
link |
01:04:19.780
like fearlessly and keep discovering cool shit.
link |
01:04:22.940
Yeah.
link |
01:04:23.900
Like, I just think it's like,
link |
01:04:31.340
like who even knows if the laws of physics,
link |
01:04:32.980
the laws of physics are probably just the current,
link |
01:04:35.580
like as I was saying,
link |
01:04:36.420
speed of light is the current render rate.
link |
01:04:37.900
It's like, if we're in a simulation,
link |
01:04:40.220
they'll be able to upgrade that.
link |
01:04:41.220
Like I sort of suspect when we made the James Webb telescope,
link |
01:04:45.660
like part of the reason we made that
link |
01:04:46.780
is because we had an upgrade, you know?
link |
01:04:50.180
And so now more of space has been rendered
link |
01:04:53.640
so we can see more of it now.
link |
01:04:56.740
Yeah, but I think humans are super, super,
link |
01:04:58.860
super limited cognitively.
link |
01:05:00.420
So I wonder if we'll be allowed to create
link |
01:05:04.740
more intelligent beings that can see more of the universe
link |
01:05:08.100
as their render rate is upgraded.
link |
01:05:11.160
Maybe we're cognitively limited.
link |
01:05:12.660
Everyone keeps talking about how we're cognitively limited
link |
01:05:15.300
and AI is gonna render us obsolete,
link |
01:05:17.140
but it's like, you know,
link |
01:05:20.340
like this is not the same thing
link |
01:05:21.800
as like an amoeba becoming an alligator.
link |
01:05:26.020
Like, it's like, if we create AI,
link |
01:05:28.300
again, that's intelligent design.
link |
01:05:29.820
That's literally all religions are based on gods
link |
01:05:33.160
that create consciousness.
link |
01:05:34.460
Like we are God making.
link |
01:05:35.700
Like what we are doing is incredibly profound.
link |
01:05:37.860
And like, even if we can't compute,
link |
01:05:41.620
even if we're so much worse than them,
link |
01:05:44.740
like just like unfathomably worse than like,
link |
01:05:49.460
you know, an omnipotent kind of AI,
link |
01:05:51.980
it's like we, I do not think that they would just think
link |
01:05:55.300
that we are stupid.
link |
01:05:56.420
I think that they would recognize the profundity
link |
01:05:58.420
of what we have accomplished.
link |
01:05:59.740
Are we the gods or are they the gods in our personality?
link |
01:06:02.700
I mean, we're kind of the gods.
link |
01:06:05.460
It's complicated.
link |
01:06:06.300
It's complicated.
link |
01:06:07.540
Like we're.
link |
01:06:08.380
But they would acknowledge the value.
link |
01:06:11.540
Well, I hope they acknowledge the value
link |
01:06:13.540
of paying respect to the creative ancestors.
link |
01:06:16.140
I think they would think it's cool.
link |
01:06:17.940
I think if curiosity is a trait
link |
01:06:23.940
that we can quantify and put into AI,
link |
01:06:29.340
then I think if AI are curious,
link |
01:06:31.740
then they will be curious about us
link |
01:06:33.620
and they will not be hateful or dismissive of us.
link |
01:06:37.660
They might, you know, see us as, I don't know.
link |
01:06:41.060
It's like, I'm not like, oh, fuck these dogs.
link |
01:06:43.580
Let's just kill all the dogs.
link |
01:06:45.500
I love dogs.
link |
01:06:46.340
Dogs have great utility.
link |
01:06:47.660
Dogs like provide a lot of.
link |
01:06:49.060
We make friends with them.
link |
01:06:50.460
We have a deep connection with them.
link |
01:06:53.520
We anthropomorphize them.
link |
01:06:55.380
Like we have a real love for dogs, for cats and so on
link |
01:06:58.860
for some reason, even though they're intellectually
link |
01:07:00.660
much less than us.
link |
01:07:01.500
And I think there is something sacred about us
link |
01:07:03.980
because it's like, if you look at the universe,
link |
01:07:05.700
like the whole universe is like cold and dead
link |
01:07:09.020
and sort of robotic.
link |
01:07:09.980
And it's like, you know, AI intelligence,
link |
01:07:15.620
you know, it's kind of more like the universe.
link |
01:07:18.980
It's like cold and you know, logical
link |
01:07:24.620
and you know, abiding by the laws of physics and whatever.
link |
01:07:28.780
But like, we're this like loosey goosey,
link |
01:07:31.980
weird art thing that happened.
link |
01:07:33.580
And I think it's beautiful.
link |
01:07:34.940
And like, I think even if we, I think one of the values,
link |
01:07:40.300
if consciousness is a thing that is most worth preserving,
link |
01:07:47.180
which I think is the case, I think consciousness,
link |
01:07:49.340
I think if there's any kind of like religious
link |
01:07:50.700
or spiritual thing, it should be that consciousness
link |
01:07:54.860
is sacred.
link |
01:07:55.700
Like, then, you know, I still think even if AI
link |
01:08:01.580
render us obsolete and we, climate change, it's too bad
link |
01:08:05.820
and we get hit by a comet and we don't become
link |
01:08:07.420
a multi planetary species fast enough,
link |
01:08:09.500
but like AI is able to populate the universe.
link |
01:08:12.300
Like I imagine, like if I was an AI,
link |
01:08:14.260
I would find more planets that are capable
link |
01:08:17.860
of hosting biological life forms and like recreate them.
link |
01:08:20.660
Because we're fun to watch.
link |
01:08:21.820
Yeah, we're fun to watch.
link |
01:08:23.340
Yeah, but I do believe that AI can have some of the same
link |
01:08:26.140
magic of consciousness within it.
link |
01:08:29.900
Because consciousness, we don't know what it is.
link |
01:08:31.420
So, you know, there's some kind of.
link |
01:08:33.020
Or it might be a different magic.
link |
01:08:34.140
It might be like a strange, a strange, different.
link |
01:08:37.500
Right.
link |
01:08:38.340
Because they're not gonna have hormones.
link |
01:08:39.340
Like I feel like a lot of our magic is hormonal kind of.
link |
01:08:42.620
I don't know, I think some of our magic
link |
01:08:44.500
is the limitations, the constraints.
link |
01:08:46.500
And within that, the hormones and all that kind of stuff,
link |
01:08:48.780
the finiteness of life, and then we get given
link |
01:08:51.460
our limitations, we get to come up with creative solutions
link |
01:08:54.740
of how to dance around those limitations.
link |
01:08:56.780
We partner up like penguins against the cold.
link |
01:08:59.420
We fall in love, and then love is ultimately
link |
01:09:03.340
some kind of, allows us to delude ourselves
link |
01:09:06.060
that we're not mortal and finite,
link |
01:09:08.500
and that life is not ultimately, you live alone,
link |
01:09:11.620
you're born alone, you die alone.
link |
01:09:13.780
And then love is like for a moment
link |
01:09:15.540
or for a long time, forgetting that.
link |
01:09:17.540
And so we come up with all these creative hacks
link |
01:09:20.340
that make life like fascinatingly fun.
link |
01:09:25.980
Yeah, yeah, yeah, fun, yeah.
link |
01:09:27.740
And then AI might have different kinds of fun.
link |
01:09:30.260
Yes.
link |
01:09:31.300
And hopefully our funs intersect once in a while.
link |
01:09:34.900
I think there would be a little intersection
link |
01:09:38.580
of the fun.
link |
01:09:39.420
Yeah. Yeah.
link |
01:09:40.420
What do you think is the role of love
link |
01:09:42.820
in the human condition?
link |
01:09:45.500
I think.
link |
01:09:46.500
Why, is it useful?
link |
01:09:47.620
Is it useful like a hack, or is this like fundamental
link |
01:09:51.540
to what it means to be human, the capacity to love?
link |
01:09:54.940
I mean, I think love is the evolutionary mechanism
link |
01:09:58.100
that is like beginning the intelligent design.
link |
01:10:00.580
Like I was just reading about,
link |
01:10:04.140
do you know about Kropotkin?
link |
01:10:06.180
He's like an anarchist, like old Russian anarchist.
link |
01:10:08.940
I live next door to Michael Malice.
link |
01:10:11.540
I don't know if you know who that is.
link |
01:10:12.380
He's an anarchist.
link |
01:10:13.300
He's a modern day anarchist.
link |
01:10:14.540
Okay. Anarchists are fun.
link |
01:10:15.780
I'm kind of getting into anarchism a little bit.
link |
01:10:17.940
This is probably not a good route to be taking, but.
link |
01:10:22.380
Oh no, I think if you're,
link |
01:10:23.900
listen, you should expose yourself to ideas.
link |
01:10:26.180
There's no harm to thinking about ideas.
link |
01:10:28.540
I think anarchists challenge systems in interesting ways,
link |
01:10:32.460
and they think in interesting ways.
link |
01:10:34.180
It's just as good for the soul.
link |
01:10:35.340
It's like refreshes your mental palate.
link |
01:10:37.300
I don't think we should actually,
link |
01:10:38.940
I wouldn't actually ascribe to it,
link |
01:10:40.620
but I've never actually gone deep on anarchy
link |
01:10:42.900
as a philosophy, so I'm doing.
link |
01:10:44.540
You should still think about it though.
link |
01:10:45.380
When you read, when you listen,
link |
01:10:46.500
because I'm reading about the Russian Revolution a lot,
link |
01:10:48.300
and there was the Soviets and Lenin and all that,
link |
01:10:51.180
but then there was Kropotkin and his anarchist sect,
link |
01:10:53.700
and they were sort of interesting
link |
01:10:54.820
because he was kind of a technocrat actually.
link |
01:10:57.140
He was like, women can be more equal if we have appliances.
link |
01:11:01.220
He was really into using technology
link |
01:11:04.940
to reduce the amount of work people had to do.
link |
01:11:07.740
But so Kropotkin was a biologist or something.
link |
01:11:11.700
He studied animals.
link |
01:11:13.380
And he was really at the time like,
link |
01:11:17.180
I think it's Nature magazine.
link |
01:11:20.660
I think it might've even started as a Russian magazine,
link |
01:11:22.660
but he was publishing studies.
link |
01:11:23.860
Everyone was really into Darwinism at the time
link |
01:11:26.020
and survival of the fittest,
link |
01:11:27.060
and war is the mechanism by which we become better.
link |
01:11:30.380
And it was this real cementing this idea in society
link |
01:11:36.500
that violence kill the weak,
link |
01:11:39.900
and that's how we become better.
link |
01:11:41.340
And then Kropotkin was kind of interesting
link |
01:11:43.060
because he was looking at instances,
link |
01:11:45.580
he was finding all these instances in nature
link |
01:11:47.420
where animals were like helping each other and stuff.
link |
01:11:49.500
And he was like, actually love is a survival mechanism.
link |
01:11:53.540
Like there's so many instances in the animal kingdom
link |
01:11:58.620
where like cooperation and like helping weaker creatures
link |
01:12:03.140
and all this stuff is actually an evolutionary mechanism.
link |
01:12:06.100
I mean, you even look at child rearing.
link |
01:12:08.020
Like child rearing is like immense amounts
link |
01:12:12.380
of just love and goodwill.
link |
01:12:14.420
And just like, there's no immediate,
link |
01:12:20.060
you're not getting any immediate feedback of like winning.
link |
01:12:24.860
It's not competitive.
link |
01:12:26.180
It's literally, it's like we actually use love
link |
01:12:28.500
as an evolutionary mechanism just as much as we use war.
link |
01:12:30.980
And I think we've like missing the other part
link |
01:12:34.020
and we've reoriented, we've culturally reoriented
link |
01:12:37.300
like science and philosophy has oriented itself
link |
01:12:41.820
around Darwinism a little bit too much.
link |
01:12:43.980
And the Kropotkin model, I think is equally valid.
link |
01:12:48.500
Like it's like cooperation and love and stuff
link |
01:12:54.340
is just as essential for species survival and evolution.
link |
01:12:58.900
It should be a more powerful survival mechanism
link |
01:13:01.380
in the context of evolution.
link |
01:13:02.700
And it comes back to like,
link |
01:13:04.500
we think engineering is so much more important
link |
01:13:06.660
than motherhood, but it's like,
link |
01:13:08.820
if you lose the motherhood, the engineering means nothing.
link |
01:13:10.700
We have no more humans.
link |
01:13:12.020
It's like, I think our society should,
link |
01:13:18.740
the survival of the, the way we see,
link |
01:13:21.300
we conceptualize evolution should really change
link |
01:13:24.500
to also include this idea, I guess.
link |
01:13:27.020
Yeah, there's some weird thing that seems irrational
link |
01:13:32.460
that is also core to what it means to be human.
link |
01:13:37.260
So love is one such thing.
link |
01:13:40.420
They could make you do a lot of irrational things,
link |
01:13:42.980
but that depth of connection and that loyalty
link |
01:13:46.100
is a powerful thing.
link |
01:13:47.300
Are they irrational or are they rational?
link |
01:13:49.260
Like, it's like, is, you know, maybe losing out
link |
01:13:57.100
on some things in order to like keep your family together
link |
01:14:00.900
or in order, like, it's like, what are our actual values?
link |
01:14:06.260
Well, right, I mean, the rational thing is
link |
01:14:08.780
if you have a cold economist perspective,
link |
01:14:11.340
you know, motherhood or sacrificing your career for love,
link |
01:14:16.020
you know, in terms of salary, in terms of economic wellbeing,
link |
01:14:20.620
in terms of flourishing of you as a human being,
link |
01:14:22.740
that could be seen on some kind of metrics
link |
01:14:25.900
as a irrational decision, suboptimal decision,
link |
01:14:28.580
but there's the manifestation of love
link |
01:14:34.220
could be the optimal thing to do.
link |
01:14:36.780
There's a kind of saying, save one life, save the world.
link |
01:14:41.140
That's the thing that doctors often face, which is like.
link |
01:14:44.100
Well, it's considered irrational
link |
01:14:45.140
because the profit model doesn't include social good.
link |
01:14:47.460
Yes, yeah.
link |
01:14:48.580
So if a profit model includes social good,
link |
01:14:50.420
then suddenly these would be rational decisions.
link |
01:14:52.220
Might be difficult to, you know,
link |
01:14:54.420
it requires a shift in our thinking about profit
link |
01:14:57.620
and might be difficult to measure social good.
link |
01:15:00.980
Yes, but we're learning to measure a lot of things.
link |
01:15:04.500
Yeah, digitizing a lot of things.
link |
01:15:05.580
Where we're actually, you know, quantifying vision and stuff.
link |
01:15:10.580
Like we're like, you know, like you go on Facebook
link |
01:15:14.580
and they can, like Facebook can pretty much predict
link |
01:15:16.980
our behaviors.
link |
01:15:17.820
Like we're, a surprising amount of things
link |
01:15:20.660
that seem like mysterious consciousness soul things
link |
01:15:25.900
have been quantified at this point.
link |
01:15:27.540
So surely we can quantify these other things.
link |
01:15:29.700
Yeah.
link |
01:15:31.460
But as more and more of us are moving the digital space,
link |
01:15:34.220
I wanted to ask you about something.
link |
01:15:35.860
From a fan perspective, I kind of, you know,
link |
01:15:40.220
you as a musician, you as an online personality,
link |
01:15:43.580
it seems like you have all these identities
link |
01:15:45.500
and you play with them.
link |
01:15:48.940
One of the cool things about the internet,
link |
01:15:51.140
it seems like you can play with identities.
link |
01:15:53.340
So as we move into the digital world more and more,
link |
01:15:56.100
maybe even in the so called metaverse.
link |
01:15:59.180
I mean, I love the metaverse and I love the idea,
link |
01:16:01.220
but like the way this has all played out didn't go well
link |
01:16:09.980
and people are mad about it.
link |
01:16:11.020
And I think we need to like.
link |
01:16:12.460
I think that's temporary.
link |
01:16:13.460
I think it's temporary.
link |
01:16:14.500
Just like, you know how all the celebrities got together
link |
01:16:16.780
and sang the song Imagine by John Lennon
link |
01:16:19.100
and everyone started hating the song Imagine.
link |
01:16:20.940
I'm hoping that's temporary
link |
01:16:22.060
because it's a damn good song.
link |
01:16:24.380
So I think it's just temporary.
link |
01:16:25.500
Like once you actually have virtual worlds,
link |
01:16:27.820
whatever they're called metaverse or otherwise,
link |
01:16:29.940
it becomes, I don't know.
link |
01:16:31.420
Well, we do have virtual worlds.
link |
01:16:32.500
Like video games, Elden Ring.
link |
01:16:34.460
Have you played Elden Ring?
link |
01:16:35.300
You haven't played Elden Ring?
link |
01:16:36.140
I'm really afraid of playing that game.
link |
01:16:38.580
Literally amazed.
link |
01:16:39.420
It looks way too fun.
link |
01:16:40.780
It looks I would wanna go there and stay there forever.
link |
01:16:45.020
It's yeah, so fun.
link |
01:16:47.020
It's so nice.
link |
01:16:50.500
Oh man, yeah.
link |
01:16:52.180
So that's the, yeah, that's a metaverse.
link |
01:16:54.500
That's a metaverse, but you're not really,
link |
01:16:57.460
how immersive is it in the sense that,
link |
01:17:02.820
does the three dimension
link |
01:17:03.940
like virtual reality integration necessary?
link |
01:17:06.060
Can we really just take our, close our eyes
link |
01:17:08.780
and kind of plug in in the 2D screen
link |
01:17:13.060
and become that other being for time
link |
01:17:15.780
and really enjoy that journey that we take?
link |
01:17:17.940
And we almost become that.
link |
01:17:19.660
You're no longer C, I'm no longer Lex,
link |
01:17:22.100
you're that creature, whatever the hell it is in that game.
link |
01:17:25.340
Yeah, that is that.
link |
01:17:26.180
I mean, that's why I love those video games.
link |
01:17:29.220
I really do become those people for a time.
link |
01:17:33.100
But like, it seems like with the idea of the metaverse,
link |
01:17:36.180
the idea of the digital space,
link |
01:17:37.900
well, even on Twitter,
link |
01:17:39.500
you get a chance to be somebody for prolonged periods of time
link |
01:17:42.580
like across a lifespan.
link |
01:17:44.540
You know, you have a Twitter account for years, for decades
link |
01:17:47.620
and you're that person.
link |
01:17:48.460
I don't know if that's a good thing.
link |
01:17:49.660
I feel very tormented by it.
link |
01:17:52.660
By Twitter specifically.
link |
01:17:54.300
By social media representation of you.
link |
01:17:57.580
I feel like the public perception of me
link |
01:17:59.100
has gotten so distorted that I find it kind of disturbing.
link |
01:18:04.420
It's one of the things that's disincentivizing me
link |
01:18:06.300
from like wanting to keep making art
link |
01:18:07.900
because I'm just like,
link |
01:18:11.060
I've completely lost control of the narrative.
link |
01:18:13.740
And the narrative is, some of it is my own stupidity,
link |
01:18:16.820
but a lot, like some of it has just been like hijacked
link |
01:18:19.260
by forces far beyond my control.
link |
01:18:23.100
I kind of got in over my head in things.
link |
01:18:25.580
Like I'm just a random indie musician,
link |
01:18:27.300
but I just got like dragged into geopolitical matters
link |
01:18:31.900
and like financial, like the stock market and shit.
link |
01:18:35.100
And so it's just like, it's just,
link |
01:18:36.300
there are very powerful people
link |
01:18:37.700
who have at various points in time
link |
01:18:39.660
had very vested interest in making me seem insane
link |
01:18:43.700
and I can't fucking fight that.
link |
01:18:45.620
And I just like,
link |
01:18:48.820
people really want their celebrity figures
link |
01:18:50.780
to like be consistent and stay the same.
link |
01:18:53.860
And like people have a lot of like emotional investment
link |
01:18:55.820
in certain things.
link |
01:18:56.660
And like, first of all,
link |
01:18:59.700
like I'm like artificially more famous than I should be.
link |
01:19:03.700
Isn't everybody who's famous artificially famous?
link |
01:19:06.620
No, but like I should be like a weird niche indie thing.
link |
01:19:11.340
And I make pretty challenging,
link |
01:19:13.380
I do challenging weird fucking shit a lot.
link |
01:19:16.300
And I accidentally by proxy got like foisted
link |
01:19:22.420
into sort of like weird celebrity culture,
link |
01:19:24.420
but like I cannot be media trained.
link |
01:19:27.300
They have put me through so many hours of media training.
link |
01:19:29.820
I would love to be a fly on that wall.
link |
01:19:32.540
I can't do, like when I do,
link |
01:19:34.420
I try so hard and I like learn this thing and I like got it.
link |
01:19:37.420
And I'm like, I got it, I got it, I got it.
link |
01:19:38.740
But I just can't stop saying,
link |
01:19:40.540
like my mouth just says things like,
link |
01:19:42.940
and it's just like, and I just do, I just do things.
link |
01:19:45.340
I just do crazy things.
link |
01:19:46.580
Like I'm, I just, I need to do crazy things.
link |
01:19:50.740
And it's just, I should not be,
link |
01:19:53.100
it's too jarring for people
link |
01:19:56.580
and the contradictory stuff.
link |
01:19:59.700
And then all the by association, like, you know,
link |
01:20:05.220
it's like I'm in a very weird position and my public image,
link |
01:20:09.900
the avatar of me is now this totally crazy thing
link |
01:20:14.900
that is so lost from my control.
link |
01:20:16.580
So you feel the burden of the avatar having to be static.
link |
01:20:19.140
So the avatar on Twitter or the avatar on Instagram
link |
01:20:22.260
on these social platforms is as a burden.
link |
01:20:26.740
It becomes like, cause like people don't want to accept
link |
01:20:30.340
a changing avatar, a chaotic avatar.
link |
01:20:32.580
Avatar is a stupid shit sometimes.
link |
01:20:34.820
They think the avatar is morally wrong
link |
01:20:36.460
or they think the avatar, and maybe it has been,
link |
01:20:39.540
and like I question it all the time.
link |
01:20:41.100
Like, I'm like, like, I don't know if everyone's right
link |
01:20:46.300
and I'm wrong.
link |
01:20:47.140
I don't know, like, but you know, a lot of times
link |
01:20:50.100
people ascribe intentions to things,
link |
01:20:51.740
the worst possible intentions.
link |
01:20:53.660
At this point, people think I'm, you know,
link |
01:20:57.300
but which is fine.
link |
01:20:58.140
All kinds of words, yes.
link |
01:20:58.980
Yes, and it's fine.
link |
01:21:00.460
I'm not complaining about it, but I'm just,
link |
01:21:02.820
it's a curiosity to me that we live these double, triple,
link |
01:21:07.900
quadruple lives and I have this other life
link |
01:21:10.340
that is like more people know my other life
link |
01:21:13.900
than my real life, which is interesting.
link |
01:21:16.340
Probably, I mean, you too, I guess, probably.
link |
01:21:18.220
Yeah, but I have the luxury.
link |
01:21:20.100
So we have all different, you know,
link |
01:21:23.100
like I don't know what I'm doing.
link |
01:21:25.140
There is an avatar and you're mediating
link |
01:21:27.820
who you are through that avatar.
link |
01:21:29.940
I have the nice luxury, not the luxury,
link |
01:21:34.340
maybe by intention of not trying really hard
link |
01:21:38.060
to make sure there's no difference between the avatar
link |
01:21:41.460
and the private person.
link |
01:21:44.020
Do you wear a suit all the time?
link |
01:21:45.620
Yeah.
link |
01:21:46.620
You do wear a suit?
link |
01:21:47.740
Not all the time.
link |
01:21:48.940
Recently, because I get recognized a lot,
link |
01:21:51.540
I have to not wear the suit to hide.
link |
01:21:53.420
I'm such an introvert, I'm such a social anxiety
link |
01:21:55.820
and all that kind of stuff, so I have to hide away.
link |
01:21:57.660
I love wearing a suit because it makes me feel
link |
01:22:00.580
like I'm taking the moment seriously.
link |
01:22:02.380
Like I'm, I don't know.
link |
01:22:04.340
It makes me feel like a weirdo in the best possible way.
link |
01:22:06.980
Suits feel great, every time I wear a suit,
link |
01:22:08.500
I'm like, I don't know why I'm not doing this more.
link |
01:22:10.780
Fashion in general, if you're doing it for yourself,
link |
01:22:15.300
I don't know, it's a really awesome thing.
link |
01:22:18.660
But yeah, I think there is definitely a painful way
link |
01:22:24.620
to use social media and an empowering way.
link |
01:22:27.380
And I don't know if any of us know which is which.
link |
01:22:32.620
So we're trying to figure that out.
link |
01:22:33.940
Some people, I think Doja Cat is incredible at it.
link |
01:22:36.780
Incredible, like just masterful.
link |
01:22:39.540
I don't know if you like follow that.
link |
01:22:41.860
So okay, so not taking anything seriously,
link |
01:22:44.780
joking, absurd, humor, that kind of thing.
link |
01:22:47.420
I think Doja Cat might be like
link |
01:22:48.540
the greatest living comedian right now.
link |
01:22:52.080
Like I'm more entertained by Doja Cat
link |
01:22:53.580
than actual comedians.
link |
01:22:56.220
Like she's really fucking funny on the internet.
link |
01:22:58.900
She's just great at social media.
link |
01:23:00.220
It's just, you know.
link |
01:23:02.220
Yeah, the nature of humor, like humor on social media
link |
01:23:05.500
is also a beautiful thing, the absurdity.
link |
01:23:08.940
The absurdity.
link |
01:23:09.780
And memes, like I just wanna like take a moment.
link |
01:23:12.700
I love, like when we're talking about art
link |
01:23:14.580
and credit and authenticity, I love that there's this,
link |
01:23:18.180
I mean now memes are like, they're no longer,
link |
01:23:21.840
like memes aren't like new,
link |
01:23:23.640
but it's still this emergent art form
link |
01:23:25.580
that is completely egoless and anonymous
link |
01:23:27.720
and we just don't know who made any of it.
link |
01:23:29.480
And it's like the forefront of comedy
link |
01:23:32.560
and it's just totally anonymous
link |
01:23:35.420
and it just feels really beautiful.
link |
01:23:36.620
It just feels like this beautiful
link |
01:23:38.380
collective human art project
link |
01:23:43.260
that's like this like decentralized comedy thing
link |
01:23:46.420
that just makes memes add so much to my day
link |
01:23:48.860
and many people's days.
link |
01:23:49.860
And it's just like, I don't know.
link |
01:23:52.120
I don't think people ever,
link |
01:23:54.860
I don't think we stop enough
link |
01:23:56.020
and just appreciate how sick it is that memes exist.
link |
01:23:59.860
Because also making a whole brand new art form
link |
01:24:02.420
in like the modern era that's like didn't exist before.
link |
01:24:07.060
Like, I mean they sort of existed,
link |
01:24:08.520
but the way that they exist now as like this like,
link |
01:24:11.860
you know, like me and my friends,
link |
01:24:13.240
like we joke that we go like mining for memes
link |
01:24:16.320
or farming for memes, like a video game
link |
01:24:18.620
and like meme dealers and like whatever.
link |
01:24:21.100
Like it's, you know, it's this whole,
link |
01:24:22.740
memes are this whole like new comedic language.
link |
01:24:27.920
Well, it's this art form.
link |
01:24:29.260
The interesting thing about it is that
link |
01:24:31.260
lame people seem to not be good at memes.
link |
01:24:35.380
Like corporate can't infiltrate memes.
link |
01:24:38.180
Yeah, they really can't.
link |
01:24:39.860
They try, they could try.
link |
01:24:41.460
But it's like, it's weird cause like.
link |
01:24:43.340
They try so hard and every once in a while,
link |
01:24:45.560
I'm like fine, like you got a good one.
link |
01:24:48.660
I think I've seen like one or two good ones,
link |
01:24:51.420
but like, yeah, they really can't.
link |
01:24:53.340
Cause they're even, corporate is infiltrating web three.
link |
01:24:55.500
It's making me really sad,
link |
01:24:57.140
but they can't infiltrate the memes.
link |
01:24:58.660
And I think there's something really beautiful about that.
link |
01:25:00.100
That gives power, that's why Dogecoin is powerful.
link |
01:25:03.580
It's like, all right, I'm gonna F you
link |
01:25:05.880
to sort of anybody who's trying to centralize,
link |
01:25:08.820
is trying to control the rich people
link |
01:25:10.420
that are trying to roll in and control this,
link |
01:25:12.720
control the narrative.
link |
01:25:14.220
Wow, I hadn't thought about that, but.
link |
01:25:17.220
How would you fix Twitter?
link |
01:25:18.520
How would you fix social media for your own?
link |
01:25:21.680
Like you're an optimist, you're a positive person.
link |
01:25:25.220
There's a bit of a cynicism that you have currently
link |
01:25:27.500
about this particular little slice of humanity.
link |
01:25:30.700
I tend to think Twitter could be beautiful.
link |
01:25:32.700
I'm not that cynical about it.
link |
01:25:34.060
I'm not that cynical about it.
link |
01:25:35.140
I actually refuse to be a cynic on principle.
link |
01:25:37.700
Yes.
link |
01:25:38.540
I was just briefly expressing some personal pathos.
link |
01:25:41.140
Personal stuff.
link |
01:25:42.140
It was just some personal pathos, but like, like.
link |
01:25:45.900
Just to vent a little bit, just to speak.
link |
01:25:48.060
I don't have cancer, I love my family.
link |
01:25:50.860
I have a good life.
link |
01:25:51.980
That is, if that is my biggest,
link |
01:25:55.380
one of my biggest problems.
link |
01:25:56.220
Then it's a good life.
link |
01:25:57.180
Yeah, you know, that was a brief,
link |
01:25:59.860
although I do think there are a lot of issues with Twitter
link |
01:26:01.380
just in terms of like the public mental health,
link |
01:26:03.180
but due to my proximity to the current dramas,
link |
01:26:10.500
I honestly feel that I should not have opinions about this
link |
01:26:13.820
because I think
link |
01:26:17.820
that if Elon ends up getting Twitter,
link |
01:26:28.380
that is a, being the arbiter of truth or public discussion,
link |
01:26:33.180
that is a responsibility.
link |
01:26:36.260
I do not, I am not qualified to be responsible for that.
link |
01:26:41.260
And I do not want to say something
link |
01:26:45.260
that might like dismantle democracy.
link |
01:26:48.340
And so I just like, actually,
link |
01:26:49.780
I actually think I should not have opinions about this
link |
01:26:52.140
because I truly am not,
link |
01:26:55.100
I don't want to have the wrong opinion about this.
link |
01:26:56.740
And I think I'm too close to the actual situation
link |
01:27:00.180
wherein I should not have, I have thoughts in my brain,
link |
01:27:04.300
but I think I am scared by my proximity to this situation.
link |
01:27:09.620
Isn't that crazy that a few words that you could say
link |
01:27:14.620
could change world affairs and hurt people?
link |
01:27:18.780
I mean, that's the nature of celebrity at a certain point
link |
01:27:22.980
that you have to be, you have to a little bit, a little bit,
link |
01:27:27.140
not so much that it destroys you or puts too much constraints,
link |
01:27:30.540
but you have to a little bit think about
link |
01:27:32.620
the impact of your words.
link |
01:27:33.940
I mean, we as humans, you talk to somebody at a bar,
link |
01:27:36.660
you have to think about the impact of your words.
link |
01:27:39.100
Like you can say positive things,
link |
01:27:40.340
you can say negative things,
link |
01:27:41.500
you can affect the direction of one life.
link |
01:27:43.340
But on social media, your words can affect
link |
01:27:45.260
the direction of many lives.
link |
01:27:48.060
That's crazy.
link |
01:27:48.900
It's a crazy world to live in.
link |
01:27:50.420
It's worthwhile to consider that responsibility,
link |
01:27:52.980
take it seriously.
link |
01:27:54.100
Sometimes just like you did choose kind of silence,
link |
01:28:00.740
choose sort of respectful.
link |
01:28:03.580
Like I do have a lot of thoughts on the matter.
link |
01:28:05.260
I'm just, I don't, if my thoughts are wrong,
link |
01:28:10.100
this is one situation where the stakes are high.
link |
01:28:12.820
You mentioned a while back that you were in a cult
link |
01:28:15.740
that's centered around bureaucracy,
link |
01:28:17.260
so you can't really do anything
link |
01:28:18.460
because it involves a lot of paperwork.
link |
01:28:20.460
And I really love a cult that's just like Kafkaesque.
link |
01:28:24.700
Yes.
link |
01:28:25.740
Just like.
link |
01:28:26.580
I mean, it was like a joke, but it was.
link |
01:28:27.660
I know, but I love this idea.
link |
01:28:29.300
The Holy Rain Empire.
link |
01:28:30.420
Yeah, it was just like a Kafkaesque pro bureaucracy cult.
link |
01:28:34.860
But I feel like that's what human civilization is,
link |
01:28:36.820
is that, because when you said that, I was like,
link |
01:28:38.500
oh, that is kind of what humanity is,
link |
01:28:40.620
is this bureaucracy cult.
link |
01:28:41.660
I do, yeah, I have this theory.
link |
01:28:45.620
I really think that we really,
link |
01:28:50.700
bureaucracy is starting to kill us.
link |
01:28:53.580
And I think like we need to reorient laws and stuff.
link |
01:28:59.540
Like, I think we just need sunset clauses on everything.
link |
01:29:01.940
Like, I think the rate of change in culture
link |
01:29:04.580
is happening so fast and the rate of change in technology
link |
01:29:06.660
and everything is happening so fast.
link |
01:29:07.900
It's like, when you see these hearings
link |
01:29:10.780
about like social media and Cambridge Analytica
link |
01:29:15.820
and everyone talking, it's like, even from that point,
link |
01:29:19.180
so much technological change has happened
link |
01:29:21.380
from like those hearings.
link |
01:29:22.740
And it's just like, we're trying to make all these laws now
link |
01:29:24.860
about AI and stuff.
link |
01:29:25.940
I feel like we should be updating things
link |
01:29:27.380
like every five years.
link |
01:29:28.380
And like one of the big issues in our society right now
link |
01:29:30.140
is we're just getting bogged down by laws
link |
01:29:32.260
and it's making it very hard to change things
link |
01:29:37.020
and develop things.
link |
01:29:37.860
In Austin, I don't wanna speak on this too much,
link |
01:29:41.500
but like one of my friends is working on a housing bill
link |
01:29:43.220
in Austin to try to like prevent
link |
01:29:45.020
like a San Francisco situation from happening here
link |
01:29:47.180
because obviously we're getting a little mini San Francisco
link |
01:29:49.940
here, like housing prices are skyrocketing,
link |
01:29:52.140
it's causing massive gentrification.
link |
01:29:54.740
This is really bad for anyone who's not super rich.
link |
01:29:59.180
Like, there's so much bureaucracy.
link |
01:30:00.860
Part of the reason this is happening
link |
01:30:01.900
is because you need all these permits to build.
link |
01:30:04.020
It takes like years to get permits to like build anything.
link |
01:30:06.460
It's so hard to build and so there's very limited housing
link |
01:30:09.100
and there's a massive influx of people.
link |
01:30:10.900
And it's just like, you know, this is a microcosm
link |
01:30:13.460
of like problems that are happening all over the world
link |
01:30:15.540
where it's just like, we're dealing with laws
link |
01:30:18.780
that are like 10, 20, 30, 40, 100, 200 years old
link |
01:30:22.340
and they are no longer relevant
link |
01:30:24.100
and it's just slowing everything down
link |
01:30:25.660
and causing massive social pain.
link |
01:30:29.100
Yeah, but it's like, it's also makes me sad
link |
01:30:32.820
when I see politicians talk about technology
link |
01:30:35.740
and when they don't really get it.
link |
01:30:38.420
But most importantly, they lack curiosity
link |
01:30:41.140
and like that like inspired excitement
link |
01:30:44.780
about like how stuff works and all that stuff.
link |
01:30:46.580
They're just like, they see,
link |
01:30:47.780
they have a very cynical view of technology.
link |
01:30:50.140
It's like tech companies are just trying to do evil
link |
01:30:52.180
on the world from their perspective
link |
01:30:53.540
and they have no curiosity about like
link |
01:30:55.860
how recommender systems work or how AI systems work,
link |
01:30:59.820
natural language processing, how robotics works,
link |
01:31:02.700
how computer vision works, you know.
link |
01:31:05.860
They always take the most cynical possible interpretation
link |
01:31:08.660
of what technology would be used
link |
01:31:09.860
and we should definitely be concerned about that
link |
01:31:11.660
but if you're constantly worried about that
link |
01:31:13.780
and you're regulating based on that,
link |
01:31:15.020
you're just going to slow down all the innovation.
link |
01:31:16.980
I do think a huge priority right now
link |
01:31:19.420
is undoing the bad energy
link |
01:31:25.740
surrounding the emergence of Silicon Valley.
link |
01:31:28.060
Like I think that like a lot of things
link |
01:31:30.020
were very irresponsible during that time
link |
01:31:31.820
and you know, like even just this current whole thing
link |
01:31:36.140
with Twitter and everything,
link |
01:31:36.980
it's like there has been a lot of negative outcomes
link |
01:31:39.980
from the sort of technocracy boom
link |
01:31:44.260
but one of the things that's happening
link |
01:31:46.060
is that like it's alienating people
link |
01:31:49.100
from wanting to care about technology
link |
01:31:52.340
and I actually think technology is probably
link |
01:31:56.020
some of the better, probably the best.
link |
01:31:58.900
I think we can fix a lot of our problems
link |
01:32:01.620
more easily with technology
link |
01:32:03.220
than with you know, fighting the powers that be
link |
01:32:07.100
as a you know, not to go back to the Star Wars quote
link |
01:32:09.700
or the Buckminster Fuller quote.
link |
01:32:11.300
Let's go to some dark questions.
link |
01:32:14.700
If we may for time, what is the darkest place
link |
01:32:17.860
you've ever gone in your mind?
link |
01:32:20.220
Is there a time, a period of time, a moment
link |
01:32:23.860
that you remember that was difficult for you?
link |
01:32:29.580
I mean, when I was 18,
link |
01:32:30.460
my best friend died of a heroin overdose
link |
01:32:33.460
and it was like my,
link |
01:32:38.300
and then shortly after that,
link |
01:32:39.780
one of my other best friends committed suicide
link |
01:32:44.700
and that sort of like coming into adulthood,
link |
01:32:48.620
dealing with two of the most important people in my life
link |
01:32:51.140
dying in extremely disturbing violent ways
link |
01:32:53.140
was a lot.
link |
01:32:55.860
That was a lot.
link |
01:32:56.700
Do you miss them?
link |
01:32:58.340
Yeah, definitely miss them.
link |
01:32:59.860
Did that make you think about your own life?
link |
01:33:02.660
About the finiteness of your own life?
link |
01:33:04.860
The places your mind can go?
link |
01:33:08.100
Did you ever in the distance, far away
link |
01:33:10.860
contemplate just your own death?
link |
01:33:15.380
Or maybe even taking your own life?
link |
01:33:17.220
Oh never, oh no.
link |
01:33:18.660
I'm so, I love my life.
link |
01:33:20.380
I cannot fathom suicide.
link |
01:33:23.060
I'm so scared of death.
link |
01:33:24.100
I haven't, I'm too scared of death.
link |
01:33:25.980
My manager, my manager's like the most Zen guy.
link |
01:33:28.740
My manager's always like, you need to accept death.
link |
01:33:31.060
You need to accept death.
link |
01:33:32.060
And I'm like, look, I can do your meditation.
link |
01:33:34.180
I can do the meditation, but I cannot accept death.
link |
01:33:37.300
I like, I will fight, I'm terrified of death.
link |
01:33:40.340
I will like fight.
link |
01:33:42.740
Although I actually think death is important.
link |
01:33:45.060
I recently went to this meeting about immortality
link |
01:33:50.020
and in the process of.
link |
01:33:51.500
That's the actual topic of the meeting?
link |
01:33:53.300
I'm sorry.
link |
01:33:54.140
No, no, it was this girl.
link |
01:33:54.980
It was a bunch of people working on like anti aging stuff.
link |
01:33:58.940
It was like some like seminar thing about it.
link |
01:34:01.980
And I went in really excited.
link |
01:34:03.300
I was like, yeah, like, okay, like, what do you got?
link |
01:34:05.180
Like, how can I live for 500 years or a thousand years?
link |
01:34:07.860
And then like over the course of the meeting,
link |
01:34:10.620
like it was sort of like, right.
link |
01:34:11.940
It was like two or three days
link |
01:34:13.140
after the Russian invasion started.
link |
01:34:14.580
And I was like, man, like, what if Putin was immortal?
link |
01:34:17.300
Like, what if I'm like, man, maybe immortality,
link |
01:34:20.900
is not good.
link |
01:34:23.620
I mean, like if you get into the later Dune stuff,
link |
01:34:25.740
the immortals cause a lot of problems.
link |
01:34:29.020
Cause as we were talking about earlier with the music
link |
01:34:30.980
and like brains calcify, like good people
link |
01:34:34.740
could become immortal, but bad people could become immortal.
link |
01:34:36.900
But I also think even the best people power corrupts
link |
01:34:43.340
and power alienates you from like the common human experience
link |
01:34:46.740
and.
link |
01:34:47.580
Right, so the people that get more and more powerful.
link |
01:34:49.100
Even the best people who like, whose brains are amazing,
link |
01:34:52.220
like I think death might be important.
link |
01:34:54.780
I think death is part of, you know,
link |
01:34:57.380
like I think with AI one thing we might want to consider,
link |
01:35:01.020
I don't know, when I talk about AI,
link |
01:35:02.420
I'm such not an expert and probably everyone has
link |
01:35:04.540
all these ideas and they're already figured out.
link |
01:35:06.220
But when I was talking.
link |
01:35:07.060
Nobody is an expert in anything.
link |
01:35:08.540
See, okay, go ahead.
link |
01:35:09.900
But when I.
link |
01:35:10.740
You were talking about.
link |
01:35:11.580
Yeah, but I like, it's just like,
link |
01:35:13.180
I think some kind of pruning.
link |
01:35:16.100
But it's a tricky thing because if there's too much
link |
01:35:20.140
of a focus on youth culture, then you don't have the wisdom.
link |
01:35:25.460
So I feel like we're in a tricky,
link |
01:35:27.740
we're in a tricky moment right now in society
link |
01:35:30.300
where it's like, we've really perfected living
link |
01:35:32.460
for a long time.
link |
01:35:33.300
So there's all these really like old people
link |
01:35:35.780
who are like really voting against the wellbeing
link |
01:35:39.540
of the young people, you know?
link |
01:35:41.540
And like, it's like there shouldn't be all this student debt
link |
01:35:45.180
and we need like healthcare, like universal healthcare
link |
01:35:48.540
and like just voting against like best interests.
link |
01:35:52.460
But then you have all these young people
link |
01:35:53.660
that don't have the wisdom that are like,
link |
01:35:55.980
yeah, we need communism and stuff.
link |
01:35:57.620
And it's just like, like literally I got canceled
link |
01:36:00.740
at one point for, I ironically used a Stalin quote
link |
01:36:04.980
in my high school yearbook, but it was actually like a diss
link |
01:36:08.420
against my high school.
link |
01:36:09.620
I saw that.
link |
01:36:10.460
Yeah, and people were like, you used to be a Stalinist
link |
01:36:13.180
and now you're a class traitor and it's like,
link |
01:36:15.860
it's like, oh man, just like, please Google Stalin.
link |
01:36:19.260
Please Google Stalin.
link |
01:36:20.500
Like, you know.
link |
01:36:21.340
Ignoring the lessons of history, yes.
link |
01:36:23.980
And it's like, we're in this really weird middle ground
link |
01:36:26.100
where it's like, we are not finding the happy medium
link |
01:36:31.220
between wisdom and fresh ideas
link |
01:36:34.700
and they're fighting each other.
link |
01:36:35.900
And it's like, like really, like what we need is like
link |
01:36:40.900
the fresh ideas and the wisdom to be like collaborating.
link |
01:36:43.860
And it's like.
link |
01:36:45.140
What the fighting in a way is the searching
link |
01:36:47.300
for the happy medium.
link |
01:36:48.500
And in a way, maybe we are finding the happy medium.
link |
01:36:51.020
Maybe that's what the happy medium looks like.
link |
01:36:52.940
And for AI systems, there has to be,
link |
01:36:54.980
it's, you know, you have the reinforcement learning,
link |
01:36:57.140
you have the dance between exploration and exploitation,
link |
01:37:00.340
sort of doing crazy stuff to see if there's something better
link |
01:37:03.380
than what you think is the optimal
link |
01:37:05.420
and then doing the optimal thing
link |
01:37:06.620
and dancing back and forth from that.
link |
01:37:08.620
You would, Stuart Russell, I don't know if you know that,
link |
01:37:10.660
is AI guy with, thinks about sort of
link |
01:37:15.820
how to control super intelligent AI systems.
link |
01:37:18.580
And his idea is that we should inject uncertainty
link |
01:37:21.500
and sort of humility into AI systems that they never,
link |
01:37:24.980
as they get wiser and wiser and wiser and more intelligent,
link |
01:37:28.100
they're never really sure.
link |
01:37:30.020
They always doubt themselves.
link |
01:37:31.620
And in some sense, when you think of young people,
link |
01:37:34.340
that's a mechanism for doubt.
link |
01:37:36.300
It's like, it's how society doubts
link |
01:37:38.860
whether the thing it has converged towards
link |
01:37:40.860
is the right answer.
link |
01:37:41.940
So the voices of the young people
link |
01:37:44.860
is a society asking itself a question.
link |
01:37:48.140
The way I've been doing stuff for the past 50 years,
link |
01:37:51.100
maybe it's the wrong way.
link |
01:37:52.460
And so you can have all of that within one AI system.
link |
01:37:55.340
I also think, though, that we need to,
link |
01:37:57.460
I mean, actually, that's actually really interesting
link |
01:37:59.900
and really cool.
link |
01:38:01.820
But I also think there's a fine balance of,
link |
01:38:04.500
I think we maybe also overvalue the idea
link |
01:38:09.500
that the old systems are always bad.
link |
01:38:11.180
And I think there are things that we are perfecting
link |
01:38:14.140
and we might be accidentally overthrowing things
link |
01:38:16.780
that we actually have gotten to a good point.
link |
01:38:19.060
Just because we value disruption so much
link |
01:38:22.180
and we value fighting against the generations
link |
01:38:24.780
before us so much that there's also an aspect of,
link |
01:38:29.980
sometimes we're taking two steps forward, one step back
link |
01:38:32.100
because, okay, maybe we kind of did solve this thing
link |
01:38:36.300
and now we're like fucking it up, you know?
link |
01:38:38.500
And so I think there's like a middle ground there too.
link |
01:38:44.300
Yeah, we're in search of that happy medium.
link |
01:38:46.700
Let me ask you a bunch of crazy questions, okay?
link |
01:38:49.780
All right.
link |
01:38:50.900
You can answer in a short way or in a long way.
link |
01:38:53.340
What's the scariest thing you've ever done?
link |
01:38:55.700
These questions are gonna be ridiculous.
link |
01:38:57.780
Something tiny or something big.
link |
01:39:00.780
Something big, skydiving or touring your first record,
link |
01:39:09.860
going on this podcast.
link |
01:39:12.100
I've had two crazy brushes, like really scary brushes
link |
01:39:14.740
with death where I randomly got away unscathed.
link |
01:39:16.980
I don't know if I should talk about those on here.
link |
01:39:19.180
Well, I don't know.
link |
01:39:20.020
I think I might be the luckiest person alive though.
link |
01:39:22.820
Like this might be too dark for a podcast though.
link |
01:39:25.980
I feel like, I don't know if this is like good content
link |
01:39:27.820
for a podcast.
link |
01:39:28.700
I don't know what is good content.
link |
01:39:30.220
It might hijack.
link |
01:39:31.580
Here's a safer one.
link |
01:39:32.420
I mean, having a baby really scared me.
link |
01:39:36.740
Before.
link |
01:39:37.580
Just the birth process.
link |
01:39:38.900
Surgery, like just having a baby is really scary.
link |
01:39:45.940
So just like the medical aspect of it,
link |
01:39:47.540
not the responsibility.
link |
01:39:49.300
Were you ready for the responsibility?
link |
01:39:51.300
Did you, were you ready to be a mother?
link |
01:39:53.980
All the beautiful things that comes with motherhood
link |
01:39:56.260
that you were talking about.
link |
01:39:57.580
All the changes and all that, were you ready for that?
link |
01:40:01.060
Or did you feel ready for that?
link |
01:40:02.980
No, I think it took about nine months
link |
01:40:05.340
to start getting ready for it.
link |
01:40:06.580
And I'm still getting more ready for it
link |
01:40:08.380
because now you keep realizing more things
link |
01:40:12.980
as they start getting.
link |
01:40:14.220
As the consciousness grows.
link |
01:40:16.420
And stuff you didn't notice with the first one,
link |
01:40:18.380
now that you've seen the first one older,
link |
01:40:19.700
you're noticing it more.
link |
01:40:21.720
Like the sort of like existential horror
link |
01:40:24.420
of coming into consciousness with Baby Y
link |
01:40:28.340
or Baby Sailor Mars or whatever.
link |
01:40:30.220
She has like so many names at this point
link |
01:40:31.500
that it's, we really need to probably settle on one.
link |
01:40:36.140
If you could be someone else for a day,
link |
01:40:38.340
someone alive today, but somebody you haven't met yet,
link |
01:40:41.820
who would you be?
link |
01:40:42.660
Would I be modeling their brain state
link |
01:40:44.240
or would I just be in their body?
link |
01:40:46.400
You can choose the degree
link |
01:40:48.380
to which you're modeling their brain state.
link |
01:40:50.580
Cause you can still take a third person perspective
link |
01:40:54.300
and realize, you have to realize that you're.
link |
01:40:56.620
Can they be alive or can it be dead?
link |
01:41:00.540
No, oh.
link |
01:41:02.780
They would be brought back to life, right?
link |
01:41:04.420
If they're dead.
link |
01:41:05.260
Yeah, you can bring people back.
link |
01:41:07.100
Definitely Hitler or Stalin.
link |
01:41:09.260
I wanna understand evil.
link |
01:41:12.060
You would need to, oh, to experience what it feels like.
link |
01:41:15.020
I wanna be in their brain feeling what they feel.
link |
01:41:18.220
I might change you forever returning from that.
link |
01:41:20.860
Yes, but I think it would also help me understand
link |
01:41:22.940
how to prevent it and fix it.
link |
01:41:25.380
That might be one of those things,
link |
01:41:26.580
once you experience it, it'll be a burden to know it.
link |
01:41:29.940
Cause you won't be able to transfer that.
link |
01:41:30.780
Yeah, but a lot of things are burdens.
link |
01:41:33.820
But it's a useful burden.
link |
01:41:34.780
But it's a useful burden, yeah.
link |
01:41:36.580
That for sure, I wanna understand evil
link |
01:41:39.260
and psychopathy and that.
link |
01:41:42.020
I have all these fake Twitter accounts
link |
01:41:43.340
where I go into different algorithmic bubbles
link |
01:41:45.580
to try to understand.
link |
01:41:47.380
I'll keep getting in fights with people
link |
01:41:48.740
and realize we're not actually fighting.
link |
01:41:50.660
I think we used to exist in a monoculture
link |
01:41:53.180
before social media and stuff.
link |
01:41:54.860
We kinda all got fed the same thing.
link |
01:41:56.460
So we were all speaking the same cultural language.
link |
01:41:58.700
But I think recently, one of the things
link |
01:42:00.140
that we aren't diagnosing properly enough with social media
link |
01:42:03.100
is that there's different dialects.
link |
01:42:05.500
There's so many different dialects of Chinese.
link |
01:42:06.900
There are now becoming different dialects of English.
link |
01:42:09.580
I am realizing there are people
link |
01:42:11.780
who are saying the exact same things,
link |
01:42:13.540
but they're using completely different verbiage.
link |
01:42:15.980
And we're punishing each other
link |
01:42:17.340
for not using the correct verbiage.
link |
01:42:18.900
And we're completely misunderstanding.
link |
01:42:20.620
People are just misunderstanding
link |
01:42:22.020
what the other people are saying.
link |
01:42:23.580
And I just got in a fight with a friend
link |
01:42:27.460
about anarchism and communism and shit for two hours.
link |
01:42:33.020
And then by the end of a conversation,
link |
01:42:34.700
and then she'd say something, and I'm like,
link |
01:42:35.940
but that's literally what I'm saying.
link |
01:42:37.620
And she was like, what?
link |
01:42:39.340
And then I was like, fuck, we've different,
link |
01:42:40.900
I'm like, our English, the way we are understanding
link |
01:42:44.820
terminology is like drastically, like our algorithm bubbles
link |
01:42:50.580
are creating mini dialects.
link |
01:42:53.380
Of how language is interpreted, how language is used.
link |
01:42:55.900
That's so fascinating.
link |
01:42:56.860
And so we're like having these arguments
link |
01:42:59.220
that we do not need to be having.
link |
01:43:00.980
And there's polarization that's happening
link |
01:43:02.380
that doesn't need to be happening
link |
01:43:03.420
because we've got these like algorithmically created
link |
01:43:08.620
dialects occurring.
link |
01:43:09.820
Plus on top of that, there's also different parts
link |
01:43:11.980
of the world that speak different languages.
link |
01:43:13.460
So there's literally lost in translation
link |
01:43:16.220
kind of communication.
link |
01:43:17.820
I happen to know the Russian language
link |
01:43:19.780
and just know how different it is.
link |
01:43:22.420
Than the English language.
link |
01:43:23.900
And I just wonder how much is lost in a little bit of.
link |
01:43:27.660
Man, I actually, cause I have a question for you.
link |
01:43:28.980
I have a song coming out tomorrow
link |
01:43:30.260
with IC3PEAK, who are a Russian band.
link |
01:43:31.900
And I speak a little bit of Russian
link |
01:43:33.700
and I was looking at the title
link |
01:43:35.380
and the title in English doesn't match
link |
01:43:37.300
the title in Russian.
link |
01:43:38.220
I'm curious about this.
link |
01:43:39.140
Cause look, it says the title in English is Last Day.
link |
01:43:42.940
And then the title in Russian is New Day.
link |
01:43:45.580
My pronunciation sucks.
link |
01:43:47.540
New Day.
link |
01:43:48.780
Like what?
link |
01:43:49.620
Like a new day.
link |
01:43:50.460
A new day.
link |
01:43:51.300
Yeah, new day, new day.
link |
01:43:52.140
Like it's two different meanings.
link |
01:43:53.340
Yeah, new day, yeah.
link |
01:43:57.220
Yeah, yeah, new day.
link |
01:43:58.500
New day, but last day.
link |
01:44:01.340
New day.
link |
01:44:02.260
So last day would be the last day.
link |
01:44:04.220
Yeah.
link |
01:44:05.060
Maybe they.
link |
01:44:05.900
Or maybe the title includes both the Russian
link |
01:44:07.580
and it's for.
link |
01:44:09.060
Maybe.
link |
01:44:09.900
Maybe it's for bilingual.
link |
01:44:10.740
But to be honest, Novy Den sounds better than
link |
01:44:13.220
just musically.
link |
01:44:15.140
Like Novy Den is new day.
link |
01:44:17.780
That's the current one.
link |
01:44:18.780
And Posledniy Den is the last day.
link |
01:44:23.460
I think Novy Den.
link |
01:44:25.420
I don't like Novy Den.
link |
01:44:26.660
But the meaning is so different.
link |
01:44:30.180
That's kind of awesome actually though.
link |
01:44:31.660
There's an explicit sort of contrast like that.
link |
01:44:35.820
If everyone on earth disappeared
link |
01:44:38.340
and it was just you left, what would your day look like?
link |
01:44:44.060
Like what would you do?
link |
01:44:45.220
Everybody's dead.
link |
01:44:46.780
As far as you.
link |
01:44:47.620
Are there corpses there?
link |
01:44:52.500
Well seriously, it's a big.
link |
01:44:53.340
Let me think through this.
link |
01:44:54.780
It's a big difference if there's just like birds singing
link |
01:44:56.940
versus if there's like corpses littering the street.
link |
01:44:58.940
Yeah, there's corpses everywhere, I'm sorry.
link |
01:45:01.860
It's, and you don't actually know what happened
link |
01:45:05.060
and you don't know why you survived.
link |
01:45:07.540
And you don't even know if there's others out there.
link |
01:45:10.420
But it seems clear that it's all gone.
link |
01:45:13.580
What would you do?
link |
01:45:15.220
What would I do?
link |
01:45:17.300
Listen, I'm somebody who really enjoys the moment,
link |
01:45:19.580
enjoys life.
link |
01:45:20.420
I would just go on like enjoying the inanimate objects.
link |
01:45:26.380
I would just look for food, basic survival.
link |
01:45:30.580
But most of it is just, listen, when I just,
link |
01:45:33.500
I take walks and I look outside and I'm just happy
link |
01:45:36.860
that we get to exist on this planet,
link |
01:45:39.660
to be able to breathe air.
link |
01:45:41.980
It's just all beautiful.
link |
01:45:43.060
It's full of colors, all of this kind of stuff.
link |
01:45:44.820
Just, there's so many things about life,
link |
01:45:48.180
your own life, conscious life that's fucking awesome.
link |
01:45:50.780
So I would just enjoy that.
link |
01:45:54.220
But also maybe after a few weeks,
link |
01:45:56.860
the engineer would start coming out,
link |
01:45:58.460
like wanna build some things.
link |
01:46:01.580
Maybe there's always hope searching for another human.
link |
01:46:05.660
Maybe.
link |
01:46:06.500
Probably searching for another human.
link |
01:46:09.340
Probably trying to get to a TV or radio station
link |
01:46:13.100
and broadcast something.
link |
01:46:18.340
That's interesting, I didn't think about that.
link |
01:46:19.860
So like really maximize your ability
link |
01:46:23.460
to connect with others.
link |
01:46:24.500
Yeah, like probably try to find another person.
link |
01:46:29.220
Would you be excited to see,
link |
01:46:31.500
to meet another person or terrified?
link |
01:46:33.380
Because, you know.
link |
01:46:34.900
I'd be excited.
link |
01:46:35.740
No matter what.
link |
01:46:36.580
Yeah, yeah, yeah, yeah.
link |
01:46:38.220
Being alone for the last however long of my life
link |
01:46:42.180
would be really bad.
link |
01:46:43.580
That's the one instance I might,
link |
01:46:46.100
I don't think I'd kill myself,
link |
01:46:47.220
but I might kill myself if I had to.
link |
01:46:48.740
So you love people.
link |
01:46:50.140
You love connection to other humans.
link |
01:46:51.900
Yeah.
link |
01:46:52.740
I kinda hate people too, but yeah.
link |
01:46:54.500
That's a love hate relationship.
link |
01:46:56.420
Yeah.
link |
01:46:57.260
I feel like we'd have a bunch of weird
link |
01:46:58.380
niche questions and stuff though.
link |
01:47:00.540
Oh yeah.
link |
01:47:01.380
Like I wonder, cause I'm like, when podcast,
link |
01:47:02.980
I'm like, is this interesting for people
link |
01:47:04.340
to just have like, or I don't know,
link |
01:47:06.500
maybe people do like this.
link |
01:47:08.380
When I listen to podcasts, I'm into like the lore,
link |
01:47:10.460
like the hard lore.
link |
01:47:11.900
Like I just love like Dan Carlin.
link |
01:47:13.380
I'm like, give me the facts.
link |
01:47:14.500
Just like, like the facts into my bloodstream.
link |
01:47:18.740
But you also don't know,
link |
01:47:20.700
like you're a fascinating mind to explore.
link |
01:47:23.300
So you don't realize as you're talking about stuff,
link |
01:47:26.340
the stuff you've taken for granted
link |
01:47:28.420
is actually unique and fascinating.
link |
01:47:30.420
The way you think.
link |
01:47:32.060
Not always what, like the way you reason through things
link |
01:47:35.780
is the fascinating thing to listen to.
link |
01:47:39.420
Because people kind of see, oh,
link |
01:47:41.060
there's other humans that think differently,
link |
01:47:43.420
that explore thoughts differently.
link |
01:47:45.380
That's the cool, that's also cool.
link |
01:47:47.540
So yeah, Dan Carlin retelling of history.
link |
01:47:50.180
By the way, his retelling of history is very,
link |
01:47:54.820
I think what's exciting is not the history,
link |
01:47:57.060
is his way of thinking about history.
link |
01:48:00.340
No, I think Dan Carlin is one of the people,
link |
01:48:02.700
like when, Dan Carlin is one of the people
link |
01:48:04.380
that really started getting me excited
link |
01:48:06.140
about like revolutionizing education.
link |
01:48:08.900
Because like Dan Carlin instilled,
link |
01:48:12.700
I already like really liked history,
link |
01:48:14.380
but he instilled like an obsessive love of history in me
link |
01:48:18.940
to the point where like now I'm fucking reading,
link |
01:48:21.260
like going to bed, reading like part four
link |
01:48:24.940
of The Rise and Fall of the Third Reich or whatever.
link |
01:48:26.740
Like I got like dense ass history,
link |
01:48:28.740
but like he like opened that door
link |
01:48:31.300
that like made me want to be a scholar of that topic.
link |
01:48:34.300
Like it's like, I feel like he's such a good teacher.
link |
01:48:37.740
He just like, you know, and it sort of made me feel like
link |
01:48:42.260
one of the things we could do with education
link |
01:48:44.060
is like find like the world's great,
link |
01:48:46.540
the teachers that like create passion for the topic
link |
01:48:49.780
because autodidactricism,
link |
01:48:53.580
I don't know how to say that properly,
link |
01:48:55.140
but like self teaching is like much faster
link |
01:48:57.900
than being lectured to.
link |
01:48:59.460
Like it's much more efficient
link |
01:49:00.660
to sort of like be able to teach yourself
link |
01:49:02.300
and then ask a teacher questions
link |
01:49:03.700
when you don't know what's up.
link |
01:49:04.860
But like, you know, that's why it's like
link |
01:49:07.380
in university and stuff,
link |
01:49:08.380
like you can learn so much more material so much faster
link |
01:49:11.060
because you're doing a lot of the learning on your own
link |
01:49:13.340
and you're going to the teachers for when you get stuck.
link |
01:49:15.620
But like these teachers that can inspire passion
link |
01:49:18.980
for a topic, I think that is one of the most invaluable
link |
01:49:21.660
skills in our whole species.
link |
01:49:23.180
Like, because if you can do that, then you,
link |
01:49:26.460
it's like AI, like AI is going to teach itself
link |
01:49:30.420
so much more efficiently than we can teach it.
link |
01:49:31.820
We just needed to get it to the point
link |
01:49:32.900
where it can teach itself.
link |
01:49:34.340
And then.
link |
01:49:35.180
It finds the motivation to do so, right?
link |
01:49:37.500
Yeah.
link |
01:49:38.340
So like you inspire it to do so.
link |
01:49:39.580
Yeah.
link |
01:49:40.420
And then it could teach itself.
link |
01:49:42.660
What do you make of the fact,
link |
01:49:44.540
you mentioned Rise and Fall of the Third Reich.
link |
01:49:46.340
I just.
link |
01:49:47.180
Have you read that?
link |
01:49:48.000
Yeah, I read it twice.
link |
01:49:48.840
You read it twice?
link |
01:49:49.680
Yes.
link |
01:49:50.500
Okay, so no one even knows what it is.
link |
01:49:51.620
Yeah.
link |
01:49:52.460
And I'm like, wait, I thought this was like
link |
01:49:53.300
a super poppin book.
link |
01:49:54.820
Super pop.
link |
01:49:55.660
Yeah, I'm not like that, I'm not that far in it.
link |
01:49:58.460
But it is, it's so interesting.
link |
01:50:00.260
Yeah, it's written by a person that was there,
link |
01:50:03.500
which is very important to kind of.
link |
01:50:05.860
You know, you start being like,
link |
01:50:06.980
how could this possibly happen?
link |
01:50:08.420
And then when you read Rise and Fall of the Third Reich,
link |
01:50:10.020
it's like, people tried really hard for this to not happen.
link |
01:50:14.020
People tried, they almost reinstated a monarchy
link |
01:50:15.940
at one point to try to stop this from happening.
link |
01:50:17.940
Like they almost like abandoned democracy
link |
01:50:21.060
to try to get this to not happen.
link |
01:50:22.700
At least the way it makes me feel
link |
01:50:24.780
is that there's a bunch of small moments
link |
01:50:28.180
on which history can turn.
link |
01:50:30.100
Yes.
link |
01:50:30.940
It's like small meetings.
link |
01:50:32.260
Yes.
link |
01:50:33.100
Human interactions.
link |
01:50:34.220
And it's both terrifying and inspiring
link |
01:50:36.900
because it's like, even just attempts
link |
01:50:41.940
at assassinating Hitler, like time and time again failed.
link |
01:50:47.340
And they were so close.
link |
01:50:48.180
Was it like Operation Valkyrie?
link |
01:50:49.820
Such a good.
link |
01:50:51.700
And then there's also the role of,
link |
01:50:55.100
that's a really heavy burden,
link |
01:50:56.500
which is from a geopolitical perspective,
link |
01:50:59.060
the role of leaders to see evil
link |
01:51:00.820
before it truly becomes evil,
link |
01:51:02.500
to anticipate it, to stand up to evil.
link |
01:51:05.460
Because evil is actually pretty rare in this world
link |
01:51:08.140
at a scale that Hitler was.
link |
01:51:09.380
We tend to, you know, in the modern discourse
link |
01:51:12.020
kind of call people evil too quickly.
link |
01:51:14.020
If you look at ancient history,
link |
01:51:17.380
like there was a ton of Hitlers.
link |
01:51:18.860
I actually think it's more the norm than,
link |
01:51:22.660
like again, going back to like my
link |
01:51:24.420
sort of intelligent design theory,
link |
01:51:25.900
I think one of the things we've been successfully doing
link |
01:51:28.380
in our slow move from survival of the fittest
link |
01:51:31.340
to intelligent design is we've kind of been eradicating,
link |
01:51:37.700
like if you look at like ancient Assyria and stuff,
link |
01:51:40.060
like that shit was like brutal
link |
01:51:42.460
and just like the heads on the, like brutal,
link |
01:51:45.860
like Genghis Khan just like genocide after genocide
link |
01:51:48.460
was like throwing plague bodies over the walls
link |
01:51:51.100
and decimating whole cities
link |
01:51:52.660
or like the Muslim conquests of like Damascus and shit.
link |
01:51:55.820
Just like people, cities used to get leveled
link |
01:51:58.780
all the fucking time.
link |
01:52:00.060
Okay, get into the Bronze Age collapse.
link |
01:52:02.460
It's basically, there was like almost
link |
01:52:05.140
like Roman level like society.
link |
01:52:07.940
Like there was like all over the world,
link |
01:52:09.420
like global trade, like everything was awesome
link |
01:52:11.820
through a mix of, I think a bit of climate change
link |
01:52:13.820
and then the development of iron
link |
01:52:16.020
because basically bronze could only come
link |
01:52:17.420
from this, the way to make bronze,
link |
01:52:19.540
like everything had to be funneled
link |
01:52:20.700
through this one Iranian mine.
link |
01:52:23.340
And so it's like, there was just this one supply chain
link |
01:52:26.740
and this is one of the things
link |
01:52:27.580
that makes me worried about supply chains
link |
01:52:29.140
and why I think we need to be so thoughtful about,
link |
01:52:31.580
I think our biggest issue with society right now,
link |
01:52:34.500
like the thing that is most likely to go wrong
link |
01:52:36.460
is probably supply chain collapse,
link |
01:52:38.860
because war, climate change, whatever,
link |
01:52:40.140
like anything that causes supply chain collapse,
link |
01:52:41.860
our population is too big to handle that.
link |
01:52:44.260
And like the thing that seems to cause Dark Ages
link |
01:52:46.300
is mass supply chain collapse.
link |
01:52:48.020
But the Bronze Age collapse happened like,
link |
01:52:52.860
it was sort of like this ancient collapse
link |
01:52:55.380
that happened where like literally like ancient Egypt,
link |
01:52:59.340
all these cities, everything just got like decimated,
link |
01:53:01.740
destroyed, abandoned cities, like hundreds of them.
link |
01:53:04.500
There was like a flourishing society,
link |
01:53:05.980
like we were almost coming to modernity
link |
01:53:07.420
and everything got leveled.
link |
01:53:08.500
And they had this mini Dark Ages,
link |
01:53:10.060
but it was just like, there's so little writing
link |
01:53:12.140
or recording from that time that like,
link |
01:53:13.700
there isn't a lot of information
link |
01:53:14.820
about the Bronze Age collapse,
link |
01:53:16.300
but it was basically equivalent to like medieval,
link |
01:53:18.780
the medieval Dark Ages.
link |
01:53:21.220
But it just happened, I don't know the years,
link |
01:53:23.620
but like thousands of years earlier.
link |
01:53:26.500
And then we sort of like recovered
link |
01:53:28.660
from the Bronze Age collapse,
link |
01:53:30.780
empire reemerged, writing and trade
link |
01:53:33.820
and everything reemerged.
link |
01:53:36.660
And then we of course had the more contemporary Dark Ages.
link |
01:53:40.900
And then over time, we've designed mechanisms
link |
01:53:43.140
to lessen and lessen the capability
link |
01:53:46.420
for the destructive power centers to emerge.
link |
01:53:50.540
There's more recording about the more contemporary Dark Ages.
link |
01:53:54.260
So I think we have like a better understanding
link |
01:53:55.660
of how to avoid it,
link |
01:53:56.500
but I still think we're at high risk for it.
link |
01:53:58.140
I think that's one of the big risks right now.
link |
01:54:00.660
So the natural state of being for humans
link |
01:54:03.260
is for there to be a lot of Hitlers,
link |
01:54:04.940
but we've gotten really good
link |
01:54:06.780
at making it hard for them to emerge.
link |
01:54:09.980
We've gotten better at collaboration
link |
01:54:12.700
and resisting, like,
link |
01:54:14.820
authoritarians coming to power.
link |
01:54:16.860
We're trying to go country by country,
link |
01:54:18.580
like we're moving past this.
link |
01:54:19.900
We're kind of like slowly incrementally,
link |
01:54:21.500
like moving towards like not scary old school war stuff.
link |
01:54:29.180
And I think seeing it happen in some of the countries
link |
01:54:32.180
that at least nominally are like
link |
01:54:35.140
supposed to have moved past that,
link |
01:54:36.740
that's scary because it reminds us that it can happen
link |
01:54:39.780
like in the places that have supposedly,
link |
01:54:44.980
hopefully, moved past that.
link |
01:54:47.060
And possibly at a civilization level,
link |
01:54:49.420
like you said, supply chain collapse
link |
01:54:51.660
might make people resource constrained,
link |
01:54:54.300
might make people desperate, angry, hateful, violent,
link |
01:54:59.980
and drag us right back in.
link |
01:55:01.500
I mean, supply chain collapse is how,
link |
01:55:03.980
like the ultimate thing that caused the Middle Ages
link |
01:55:06.260
was supply chain collapse.
link |
01:55:08.260
It's like people, because people were reliant
link |
01:55:11.020
on a certain level of technology,
link |
01:55:12.380
like people, like you look at like Britain,
link |
01:55:14.140
like they had glass, like people had aqueducts,
link |
01:55:17.540
people had like indoor heating and cooling
link |
01:55:20.420
and like running water and could buy food
link |
01:55:23.380
from all over the world and trade and markets.
link |
01:55:26.020
Like people didn't know how to hunt and forage and gather.
link |
01:55:28.580
And so we're in a similar situation.
link |
01:55:29.820
We are not educated enough to survive without technology.
link |
01:55:33.740
So if we have a supply chain collapse
link |
01:55:35.380
that like limits our access to technology,
link |
01:55:38.340
there will be like massive starvation and violence
link |
01:55:41.300
and displacement and war.
link |
01:55:43.100
Like, you know, it's like, yeah.
link |
01:55:47.260
In my opinion, it's like the primary marker
link |
01:55:49.060
of like what a dark age is.
link |
01:55:52.700
Well, technology is kind of enabling us
link |
01:55:54.380
to be more resilient in terms of supply chain,
link |
01:55:57.180
in terms of, to all the different catastrophic events
link |
01:56:00.820
that happen to us.
link |
01:56:02.060
Although the pandemic has kind of challenged
link |
01:56:03.900
our preparedness for the catastrophic.
link |
01:56:07.660
What do you think is the coolest invention
link |
01:56:09.220
humans have come up with?
link |
01:56:11.100
The wheel, fire, cooking meat.
link |
01:56:14.580
Computers. Computers.
link |
01:56:16.980
Freaking computers. Internet or computers?
link |
01:56:18.900
Which one?
link |
01:56:19.740
What do you think the?
link |
01:56:20.560
Previous technologies, I mean,
link |
01:56:22.420
may have even been more profound
link |
01:56:23.680
and moved us to a certain degree,
link |
01:56:24.740
but I think the computers are what make us homo techno.
link |
01:56:27.340
I think this is what, it's a brain augmentation.
link |
01:56:30.760
And so it like allows for actual evolution.
link |
01:56:33.820
Like the computers accelerate the degree
link |
01:56:35.460
to which all the other technologies can also be accelerated.
link |
01:56:38.660
Would you classify yourself as a homo sapien
link |
01:56:40.700
or a homo techno?
link |
01:56:41.580
Definitely homo techno.
link |
01:56:43.040
So you're one of the earliest of the species.
link |
01:56:46.940
I think most of us are.
link |
01:56:49.200
Like, as I said, like, I think if you
link |
01:56:53.780
like looked at brain scans of us versus humans
link |
01:56:58.740
a hundred years ago, it would look very different.
link |
01:57:00.920
I think we are physiologically different.
link |
01:57:03.740
Just even the interaction with the devices
link |
01:57:05.580
has changed our brains.
link |
01:57:06.700
Well, and if you look at,
link |
01:57:08.580
a lot of studies are coming out to show that like,
link |
01:57:11.220
there's a degree of inherited memory.
link |
01:57:13.100
So some of these physiological changes in theory
link |
01:57:15.360
should be, we should be passing them on.
link |
01:57:18.020
So like that's, you know, that's not like a,
link |
01:57:21.660
an instance of physiological change
link |
01:57:23.180
that's gonna fizzle out.
link |
01:57:24.140
In theory, that should progress like to our offspring.
link |
01:57:29.060
Speaking of offspring,
link |
01:57:30.420
what advice would you give to a young person,
link |
01:57:33.180
like in high school,
link |
01:57:35.660
whether there be an artist, a creative, an engineer,
link |
01:57:43.140
any kind of career path, or maybe just life in general,
link |
01:57:46.300
how they can live a life they can be proud of?
link |
01:57:48.700
I think one of my big thoughts,
link |
01:57:50.800
and like, especially now having kids,
link |
01:57:53.180
is that I don't think we spend enough time
link |
01:57:55.460
teaching creativity.
link |
01:57:56.820
And I think creativity is a muscle like other things.
link |
01:57:59.300
And there's a lot of emphasis on, you know,
link |
01:58:01.740
learn how to play the piano.
link |
01:58:02.860
And then you can write a song
link |
01:58:04.020
or like learn the technical stuff.
link |
01:58:05.460
And then you can do a thing.
link |
01:58:07.020
But I think it's, like, I have a friend
link |
01:58:10.460
who's like world's greatest guitar player,
link |
01:58:13.700
like, you know, amazing sort of like producer,
link |
01:58:15.740
works with other people, but he's really sort of like,
link |
01:58:18.940
you know, he like engineers and records things
link |
01:58:20.680
and like does solos,
link |
01:58:21.740
but he doesn't really like make his own music.
link |
01:58:23.480
And I was talking to him and I was like,
link |
01:58:26.060
dude, you're so talented at music.
link |
01:58:27.340
Like, why don't you make music or whatever?
link |
01:58:28.860
And he was like, cause I got, I'm too old.
link |
01:58:32.100
I never learned the creative muscle.
link |
01:58:34.180
And it's like, you know, it's embarrassing.
link |
01:58:36.660
It's like learning the creative muscle
link |
01:58:39.220
takes a lot of failure.
link |
01:58:40.760
And it also sort of,
link |
01:58:44.220
if when you're being creative,
link |
01:58:46.880
you know, you're throwing paint at a wall
link |
01:58:48.140
and a lot of stuff will fail.
link |
01:58:49.460
So like part of it is like a tolerance
link |
01:58:51.180
for failure and humiliation.
link |
01:58:53.020
And that's somehow that's easier to develop
link |
01:58:54.900
when you're young, or to persist through it
link |
01:58:57.380
when you're young.
link |
01:58:58.220
Everything is easier to develop when you're young.
link |
01:59:02.540
Yes.
link |
01:59:03.380
And the younger, the better.
link |
01:59:04.780
It could destroy you.
link |
01:59:05.620
I mean, that's the shitty thing about creativity.
link |
01:59:08.460
If, you know, failure could destroy you
link |
01:59:11.260
if you're not careful, but that's a risk worth taking.
link |
01:59:13.420
But also, but at a young age,
link |
01:59:15.020
developing a tolerance to failure is good.
link |
01:59:17.820
I fail all the time.
link |
01:59:19.900
Like I do stupid shit all the time.
link |
01:59:22.380
Like in public, in private, I get canceled for,
link |
01:59:24.860
I make all kinds of mistakes,
link |
01:59:27.060
but I just like am very resilient about making mistakes.
link |
01:59:30.540
And so then like I do a lot of things
link |
01:59:32.980
that like other people wouldn't do.
link |
01:59:34.180
And like, I think my greatest asset is my creativity.
link |
01:59:37.540
And I like, I think pain, like tolerance to failure
link |
01:59:39.840
is just a super essential thing
link |
01:59:43.380
that should be taught before other things.
link |
01:59:45.420
Brilliant advice.
link |
01:59:46.340
Yeah, yeah.
link |
01:59:47.420
I wish everybody encouraged sort of failure more
link |
01:59:51.540
as opposed to kind of.
link |
01:59:52.380
Cause we like punish failure.
link |
01:59:53.460
We're like, no, like when we were teaching kids,
link |
01:59:55.340
we're like, no, that's wrong.
link |
01:59:56.380
Like that's, you know, like X will be like, wrong.
link |
02:00:04.380
Like he'll say like crazy things.
link |
02:00:05.760
Like X keeps being like, like bubble car, bubble car.
link |
02:00:09.740
And I'm like, and you know, I'm like, what's a bubble car?
link |
02:00:14.160
Like, but like, it doesn't like,
link |
02:00:15.860
but I don't want to be like, no, you're wrong.
link |
02:00:17.420
I'm like, you're thinking of weird, crazy shit.
link |
02:00:20.340
Like, I don't know what a bubble car is, but like.
link |
02:00:22.260
It's creating worlds
link |
02:00:23.460
and they might be internally consistent.
link |
02:00:25.180
And through that, you might discover something fundamental
link |
02:00:27.420
about this world.
link |
02:00:28.260
Yeah, or he'll like rewrite songs,
link |
02:00:29.740
like with words that he prefers.
link |
02:00:32.140
So like, instead of baby shark, he says baby car.
link |
02:00:35.500
It's like.
link |
02:00:36.340
Maybe he's onto something.
link |
02:00:41.100
Let me ask the big, ridiculous question.
link |
02:00:42.740
We were kind of dancing around it,
link |
02:00:44.100
but what do you think is the meaning
link |
02:00:47.180
of this whole thing we have here of human civilization,
link |
02:00:52.140
of life on earth, but in general, just life?
link |
02:00:55.060
What's the meaning of life?
link |
02:00:57.340
C.
link |
02:00:58.180
Have you, did you read Novacene yet?
link |
02:01:02.540
By James Lovelock?
link |
02:01:03.700
You're doing a lot of really good book recommendations here.
link |
02:01:06.340
I haven't even finished this,
link |
02:01:07.560
so I'm a huge fraud yet again.
link |
02:01:10.260
But like really early in the book,
link |
02:01:12.620
he says this amazing thing.
link |
02:01:14.500
Like, I feel like everyone's so sad and cynical.
link |
02:01:16.500
Like everyone's like the Fermi paradox and everyone.
link |
02:01:18.700
I just keep hearing people being like, fuck,
link |
02:01:20.500
what if we're alone?
link |
02:01:21.340
Like, oh no, ah, like, ah, ah.
link |
02:01:23.620
And I'm like, okay, but like, wait,
link |
02:01:25.100
what if this is the beginning?
link |
02:01:26.780
Like in Novacene, he says,
link |
02:01:30.140
this is not gonna be a correct,
link |
02:01:31.380
I can't like memorize quotes,
link |
02:01:32.580
but he says something like,
link |
02:01:36.260
what if our consciousness, like right now,
link |
02:01:39.980
like this is the universe waking up?
link |
02:01:43.380
Like what if instead of discovering the universe,
link |
02:01:45.500
this is the universe,
link |
02:01:47.460
like this is the evolution
link |
02:01:49.460
of the literal universe herself.
link |
02:01:51.620
Like we are not separate from the universe.
link |
02:01:53.140
Like this is the universe waking up.
link |
02:01:54.620
This is the universe seeing herself for the first time.
link |
02:01:57.400
Like this is.
link |
02:01:59.140
The universe becoming conscious.
link |
02:02:00.820
For the first time, and we're a part of that.
link |
02:02:02.340
Yeah, cause it's like,
link |
02:02:03.180
we aren't separate from the universe.
link |
02:02:05.620
Like this could be like an incredibly sacred moment
link |
02:02:08.780
and maybe like social media and all these things,
link |
02:02:10.980
the stuff where we're all getting connected together.
link |
02:02:13.300
Like maybe these are the neurons connecting
link |
02:02:16.940
of the like collective super intelligence that is,
link |
02:02:22.100
Waking up.
link |
02:02:22.940
The, yeah, like, you know, it's like,
link |
02:02:25.420
maybe instead of something cynical
link |
02:02:27.100
or maybe if there's something to discover,
link |
02:02:29.180
like maybe this is just, you know,
link |
02:02:31.540
we're a blastocyst of like some incredible
link |
02:02:35.980
kind of consciousness or being.
link |
02:02:39.420
And just like in the first three years of life
link |
02:02:41.220
or for human children,
link |
02:02:42.820
we'll forget about all the suffering
link |
02:02:44.340
that we're going through now.
link |
02:02:45.460
I think we'll probably forget about this.
link |
02:02:46.620
I mean, probably, you know, artificial intelligence
link |
02:02:50.440
will eventually render us obsolete.
link |
02:02:52.700
I don't think they'll do it in a malicious way,
link |
02:02:55.380
but I think probably we are very weak.
link |
02:02:57.760
The sun is expanding.
link |
02:02:58.880
Like, I don't know, like, hopefully we can get to Mars,
link |
02:03:01.660
but like, we're pretty vulnerable.
link |
02:03:04.100
And I, you know, like,
link |
02:03:06.780
I think we can coexist for a long time with AI
link |
02:03:09.100
and we can also probably make ourselves less vulnerable,
link |
02:03:11.460
but, you know, I just think
link |
02:03:15.700
consciousness, sentience, self awareness,
link |
02:03:18.400
like, I think this might be the single greatest
link |
02:03:21.940
like moment in evolution ever.
link |
02:03:24.980
And like, maybe this is, you know,
link |
02:03:29.300
the big, like the true beginning of life.
link |
02:03:32.580
And we're just, we're the blue green algae
link |
02:03:34.700
or we're like the single celled organisms
link |
02:03:36.980
of something amazing.
link |
02:03:38.460
The universe awakens and this is it.
link |
02:03:40.900
Yeah.
link |
02:03:42.620
Well, see, you're an incredible person.
link |
02:03:45.300
You're a fascinating mind.
link |
02:03:47.500
You should definitely do, your friend Liv mentioned
link |
02:03:50.500
that you guys were thinking of maybe talking.
link |
02:03:52.260
I would love it if you explored your mind
link |
02:03:55.340
in this kind of media more and more
link |
02:03:56.700
by doing a podcast with her or just in any kind of way.
link |
02:03:59.660
So you're an awesome person.
link |
02:04:01.780
It's an honor to know you.
link |
02:04:03.380
It's an honor to get to sit down with you late at night,
link |
02:04:05.820
which is like surreal.
link |
02:04:08.360
And I really enjoyed it.
link |
02:04:09.200
Thank you for talking today.
link |
02:04:10.140
Yeah, no, I mean, huge honor.
link |
02:04:11.700
I feel very underqualified to be here, but I'm a big fan.
link |
02:04:13.640
I've been listening to the podcast a lot and yeah,
link |
02:04:15.940
me and Liv would appreciate any advice and help
link |
02:04:18.260
and we're definitely gonna do that.
link |
02:04:19.220
So yeah.
link |
02:04:21.380
Anytime.
link |
02:04:22.220
Thank you.
link |
02:04:23.040
Cool, thank you.
link |
02:04:24.420
Thanks for listening to this conversation with Grimes.
link |
02:04:26.980
To support this podcast,
link |
02:04:28.220
please check out our sponsors in the description.
link |
02:04:31.060
And now let me leave you with some words from Oscar Wilde.
link |
02:04:34.660
Yes, I'm a dreamer.
link |
02:04:36.940
For a dreamer is one who can only find her way by moonlight
link |
02:04:41.340
and her punishment is that she sees the dawn
link |
02:04:44.260
before the rest of the world.
link |
02:04:46.580
Thank you for listening and hope to see you
link |
02:04:49.020
next time.