
Grimes: Music, AI, and the Future of Humanity | Lex Fridman Podcast #281



link |
00:00:00.000
We are becoming cyborgs.
link |
00:00:01.240
Like our brains are fundamentally changed.
link |
00:00:04.080
Everyone who grew up with electronics,
link |
00:00:05.760
we are fundamentally different from previous,
link |
00:00:08.920
from Homo sapiens.
link |
00:00:09.840
I call us Homo Techno.
link |
00:00:11.080
I think we have evolved into Homo Techno,
link |
00:00:13.240
which is like essentially a new species.
link |
00:00:15.800
Previous technologies, I mean,
link |
00:00:17.840
may have even been more profound
link |
00:00:19.120
and moved us to a certain degree,
link |
00:00:20.160
but I think the computers are what make us Homo Techno.
link |
00:00:22.760
I think this is what, it's a brain augmentation.
link |
00:00:25.600
So it like allows for actual evolution.
link |
00:00:27.840
Like the computers accelerate the degree
link |
00:00:29.480
to which all the other technologies
link |
00:00:31.120
can also be accelerated.
link |
00:00:32.720
Would you classify yourself as a Homo sapien
link |
00:00:34.760
or a Homo Techno?
link |
00:00:35.640
Definitely Homo Techno.
link |
00:00:37.120
So you're one of the earliest of the species.
link |
00:00:40.960
I think most of us are.
link |
00:00:45.440
The following is a conversation with Grimes,
link |
00:00:47.800
an artist, musician, songwriter, producer, director,
link |
00:00:50.560
and a fascinating human being
link |
00:00:53.080
who thinks a lot about both the history
link |
00:00:55.520
and the future of human civilization.
link |
00:00:57.560
Studying the dark periods of our past
link |
00:00:59.840
to help form an optimistic vision of our future.
link |
00:01:03.920
This is the Lex Fridman podcast.
link |
00:01:05.760
To support it, please check out our sponsors
link |
00:01:07.880
in the description.
link |
00:01:09.040
And now, dear friends, here's Grimes.
link |
00:01:12.880
Oh yeah, the Cloudlifter, there you go.
link |
00:01:14.520
There you go.
link |
00:01:15.360
You know your stuff.
link |
00:01:16.440
Have you ever used the Cloudlifter?
link |
00:01:18.280
Yeah, I actually, this microphone and Cloudlifter
link |
00:01:21.000
is what Michael Jackson used, so.
link |
00:01:23.480
No, really?
link |
00:01:24.440
Yeah, this is like Thriller and stuff.
link |
00:01:26.040
This mic and the Cloudlifter.
link |
00:01:27.640
And that, yeah, it's an incredible microphone.
link |
00:01:30.640
It's very flattering on vocals.
link |
00:01:32.040
I've used this a lot.
link |
00:01:33.600
It's great for demo vocals.
link |
00:01:34.720
It's great in a room.
link |
00:01:36.640
Sometimes it's easier to record vocals
link |
00:01:38.280
if you're just in a room and the music's playing
link |
00:01:40.600
and you just want to feel it
link |
00:01:41.920
so it's not in the headphones.
link |
00:01:43.040
And this mic is pretty directional,
link |
00:01:44.680
so I think it's a good mic for just vibing out
link |
00:01:47.720
and just getting a real good vocal take.
link |
00:01:49.840
Just vibing, just in a room.
link |
00:01:51.880
Anyway, this is the Michael Jackson, Quincy Jones.
link |
00:01:55.920
Microphone.
link |
00:01:57.000
I feel way more badass now.
link |
00:01:58.800
All right, you want to just get into it?
link |
00:02:01.760
I guess so.
link |
00:02:03.040
All right, one of your names,
link |
00:02:04.760
at least in this space and time, is C, like the letter C.
link |
00:02:08.320
And you told me that C means a lot of things.
link |
00:02:11.240
It's the speed of light.
link |
00:02:12.600
It's the render rate of the universe.
link |
00:02:14.680
It's yes in Spanish.
link |
00:02:16.120
It's the crescent moon.
link |
00:02:17.640
And it happens to be my favorite programming language
link |
00:02:21.120
because it basically runs the world,
link |
00:02:24.120
but it's also powerful, fast, and it's dangerous
link |
00:02:28.200
because you can mess things up really bad with it
link |
00:02:30.000
because of all the pointers.
link |
00:02:31.160
But anyway, which of these associations
link |
00:02:34.000
with the name C is the coolest to you?
link |
00:02:37.760
I mean, to me, the coolest is the speed of light, obviously,
link |
00:02:41.640
or the render rate.
link |
00:02:42.720
When I say render rate of the universe,
link |
00:02:44.400
I think I mean the speed of light
link |
00:02:46.320
because essentially that's what we're rendering at.
link |
00:02:49.120
See, I think we'll know if we're in a simulation,
link |
00:02:52.200
if the speed of light changes
link |
00:02:53.720
because if they can improve their render speed, then...
link |
00:02:57.240
Well, it's already pretty good.
link |
00:02:58.440
It's already pretty good, but if it improves,
link |
00:03:01.000
then we'll know, we can probably be like,
link |
00:03:03.880
okay, they've updated or upgraded.
link |
00:03:05.320
Well, it's fast enough for us humans
link |
00:03:06.760
because it seems immediate.
link |
00:03:10.960
There's no delay, there's no latency
link |
00:03:13.320
in terms of us humans on Earth interacting with things.
link |
00:03:16.240
But if you're like intergalactic species
link |
00:03:20.000
operating on a much larger scale,
link |
00:03:21.440
then you're gonna start noticing some weird stuff.
link |
00:03:23.880
Or if you can operate in like around a black hole,
link |
00:03:27.360
then you're gonna start to see some render issues.
link |
00:03:29.720
You can't go faster than the speed of light, correct?
link |
00:03:32.720
So it really limits our ability
link |
00:03:34.560
or one's ability to travel space.
link |
00:03:36.760
Theoretically, you can, you have wormholes.
link |
00:03:38.960
So there's nothing in general relativity
link |
00:03:41.920
that precludes faster than speed of light travel,
link |
00:03:48.320
but it just seems you're gonna have to do
link |
00:03:49.880
some really funky stuff with very heavy things
link |
00:03:54.040
that have like weirdnesses,
link |
00:03:56.120
that have basically tears in space time.
link |
00:03:58.640
We don't know how to do that.
link |
00:03:59.760
Dune navigators know how to do it.
link |
00:04:01.880
Dune navigators?
link |
00:04:03.120
Yeah, folding space, basically making wormholes.
link |
00:04:07.000
So the name C.
link |
00:04:09.520
Yes.
link |
00:04:11.880
Who are you?
link |
00:04:14.880
Do you think of yourself as multiple people?
link |
00:04:17.000
Are you one person?
link |
00:04:18.320
Do you know like in this morning
link |
00:04:20.880
where you're a different person than you are tonight?
link |
00:04:23.600
We are, I should say, recording this
link |
00:04:25.880
basically at midnight, which is awesome.
link |
00:04:27.880
Yes, thank you so much.
link |
00:04:29.640
I think I'm about eight hours late.
link |
00:04:31.720
No, you're right on time.
link |
00:04:34.000
Good morning.
link |
00:04:34.840
This is the beginning of a new day soon.
link |
00:04:37.240
Anyway, are you the same person
link |
00:04:39.520
you were in the morning in the evening?
link |
00:04:41.960
Do you, as there are multiple people in there,
link |
00:04:44.360
do you think of yourself as one person?
link |
00:04:46.280
Or maybe you have no clue?
link |
00:04:47.520
Or are you just a giant mystery to yourself?
link |
00:04:50.200
Okay, these are really intense questions, but...
link |
00:04:52.480
Let's go, let's go.
link |
00:04:53.320
Because I asked this myself,
link |
00:04:54.560
like look in the mirror, who are you?
link |
00:04:56.360
People tell you to just be yourself,
link |
00:04:58.000
but what does that even mean?
link |
00:04:59.680
I mean, I think my personality changes
link |
00:05:01.560
with everyone I talk to.
link |
00:05:03.000
So I have a very inconsistent personality, yeah.
link |
00:05:07.200
Person to person, so the interaction,
link |
00:05:09.000
your personality materializes.
link |
00:05:11.480
Or my mood, like I'll go from being like a megalomaniac
link |
00:05:16.160
to being like, you know, just like a total hermit
link |
00:05:19.960
who is very shy.
link |
00:05:21.440
So some combinatorial combination of your mood
link |
00:05:24.600
and the person you're interacting with.
link |
00:05:26.360
Yeah, mood and people I'm interacting with.
link |
00:05:28.120
But I think everyone's like that, maybe not.
link |
00:05:30.920
Well, not everybody acknowledges it
link |
00:05:32.640
and is able to introspect it.
link |
00:05:34.080
Who brings up, what kind of person,
link |
00:05:35.840
what kind of mood brings out the best in you?
link |
00:05:38.200
As an artist and as a human, can you introspect this?
link |
00:05:41.880
Like my best friends, like people I can,
link |
00:05:45.280
when I'm like super confident
link |
00:05:47.560
and I know that they're gonna understand
link |
00:05:50.360
everything I'm saying, so like my best friends,
link |
00:05:52.240
then when I can start being really funny,
link |
00:05:55.440
that's always my like peak mode.
link |
00:05:57.720
But it's like, yeah, takes a lot to get there.
link |
00:06:00.200
Let's talk about constraints.
link |
00:06:02.400
You've talked about constraints and limits.
link |
00:06:07.040
Do those help you out as an artist or as a human being?
link |
00:06:09.640
Or do they get in the way?
link |
00:06:10.840
Do you like the constraints?
link |
00:06:11.960
So in creating music and creating art and living life,
link |
00:06:16.800
do you like the constraints that this world puts on you?
link |
00:06:21.840
Or do you hate them?
link |
00:06:24.760
If constraints are moving, then you're good, right?
link |
00:06:29.760
Like it's like, as we are progressing with technology,
link |
00:06:32.080
we're changing the constraints of like artistic creation,
link |
00:06:34.840
you know, making video and music and stuff
link |
00:06:38.320
is getting a lot cheaper.
link |
00:06:39.760
There's constantly new technology and new software
link |
00:06:42.120
that's making it faster and easier.
link |
00:06:44.040
We have so much more freedom than we had in the 70s,
link |
00:06:46.720
like when Michael Jackson, you know,
link |
00:06:48.680
when they recorded Thriller with this microphone,
link |
00:06:51.480
like they had to use a mixing desk and all this stuff.
link |
00:06:54.000
And like probably even getting a studio
link |
00:06:55.640
is probably really expensive
link |
00:06:56.600
and you have to be a really good singer
link |
00:06:57.640
and you have to know how to use like the mixing desk
link |
00:07:00.160
and everything.
link |
00:07:01.000
And now I can just, you know,
link |
00:07:02.520
I've made a whole album on this computer.
link |
00:07:05.280
I have a lot more freedom,
link |
00:07:06.800
but then I'm also constrained in different ways
link |
00:07:10.280
because there's like literally millions more artists.
link |
00:07:13.840
It's like a much bigger playing field.
link |
00:07:15.720
It's just like, I also, I didn't learn music.
link |
00:07:18.720
I'm not a natural musician.
link |
00:07:20.280
So I don't know anything about actual music.
link |
00:07:22.680
I just know about like the computer.
link |
00:07:24.840
So I'm really kind of just like messing around
link |
00:07:30.560
and like trying things out.
link |
00:07:33.320
Well, yeah, I mean, but the nature of music is changing.
link |
00:07:35.800
So you're saying you don't know actual music,
link |
00:07:37.360
but music is changing.
link |
00:07:39.280
Music is becoming, you've talked about this,
link |
00:07:41.960
it's becoming, it's like merging with technology.
link |
00:07:46.680
Yes.
link |
00:07:47.680
It's becoming something more than just like
link |
00:07:51.440
the notes on a piano.
link |
00:07:53.040
It's becoming some weird composition
link |
00:07:55.000
that requires engineering skills, programming skills,
link |
00:07:59.480
some kind of human robot interaction skills,
link |
00:08:03.480
and still some of the same things that Michael Jackson had,
link |
00:08:05.720
which is like a good ear, a good sense of taste
link |
00:08:08.480
of what's good and not in the final thing
link |
00:08:10.360
that is put together.
link |
00:08:11.520
Like you're allowed, you're enabled, empowered
link |
00:08:14.920
with a laptop to layer stuff,
link |
00:08:17.200
to start like layering insane amounts of stuff.
link |
00:08:20.280
And it's super easy to do that.
link |
00:08:22.280
I do think music production is a really underrated art form.
link |
00:08:25.000
I feel like people really don't appreciate it.
link |
00:08:26.680
When I look at publishing splits,
link |
00:08:27.960
the way that people like pay producers and stuff,
link |
00:08:32.240
it's super, producers are just deeply underrated.
link |
00:08:35.560
Like so many of the songs that are popular right now
link |
00:08:39.200
or for the last 20 years,
link |
00:08:40.960
like part of the reason they're popular
link |
00:08:42.240
is because the production is really interesting
link |
00:08:44.040
or really sick or really cool.
link |
00:08:45.680
And it's like, I don't think listeners,
link |
00:08:50.960
like people just don't really understand
link |
00:08:52.560
what music production is.
link |
00:08:54.680
It's not, it's sort of like this weird,
link |
00:08:57.720
discombobulated art form.
link |
00:08:59.400
It's not like a formal, because it's so new,
link |
00:09:01.400
there isn't like a formal training
link |
00:09:03.280
or a path for it.
link |
00:09:06.920
It's mostly driven by like autodidacts.
link |
00:09:10.240
Like it's like almost everyone I know
link |
00:09:11.320
who's good at production,
link |
00:09:12.240
like I didn't go to music school or anything.
link |
00:09:13.800
They just taught themselves.
link |
00:09:15.120
Are they mostly different?
link |
00:09:16.080
Like the music producers, you know,
link |
00:09:18.440
are there some commonalities that tie them together
link |
00:09:21.360
or are they all just different kinds of weirdos?
link |
00:09:23.640
Cause I just, I just saw that with Rick Rubin.
link |
00:09:25.480
I don't know if you've.
link |
00:09:26.320
Yeah, I mean, Rick Rubin is like literally
link |
00:09:29.800
one of the gods of music production.
link |
00:09:31.240
Like he's one of the people who first, you know,
link |
00:09:34.160
who like made music production, you know,
link |
00:09:37.600
made the production as important
link |
00:09:39.360
as the actual lyrics or the notes.
link |
00:09:41.600
But the thing he does, which is interesting,
link |
00:09:43.560
I don't know if you can speak to that,
link |
00:09:45.520
but just hanging out with him,
link |
00:09:46.720
he seems to just sit there in silence,
link |
00:09:48.520
close his eyes and listen.
link |
00:09:50.800
It's like, he almost does nothing.
link |
00:09:53.640
And that nothing somehow gives you freedom
link |
00:09:55.880
to be the best version of yourself.
link |
00:09:58.160
So that's music production somehow too,
link |
00:10:00.040
which is like encouraging you to do less,
link |
00:10:02.640
to simplify, to like push towards minimalism.
link |
00:10:06.880
I mean, I guess, I mean,
link |
00:10:09.560
I work differently from Rick Rubin
link |
00:10:11.560
because Rick Rubin produces for other artists,
link |
00:10:14.120
whereas like I mostly produce for myself.
link |
00:10:17.040
So it's a very different situation.
link |
00:10:19.360
I also think Rick Rubin, he's in that,
link |
00:10:21.720
I would say advanced category of producer
link |
00:10:23.560
where like you've like earned your,
link |
00:10:26.560
you can have an engineer and stuff
link |
00:10:27.920
and people like do the stuff for you.
link |
00:10:29.840
But I usually just like do stuff myself.
link |
00:10:32.400
So you're the engineer, the producer and the artist.
link |
00:10:38.080
Yeah, I guess I would say I'm in the era,
link |
00:10:39.880
like the post Rick Rubin era,
link |
00:10:41.280
like I come from the kind of like
link |
00:10:44.320
Skrillex school of thought,
link |
00:10:47.040
which is like where you are.
link |
00:10:49.240
Yeah, the engineer, producer, artist.
link |
00:10:51.040
Like, I mean, lately,
link |
00:10:53.760
sometimes I'll work with a producer now.
link |
00:10:55.560
I'm gently sort of delicately starting to collaborate
link |
00:10:59.600
a little bit more,
link |
00:11:00.440
but like, I think I'm kind of from the,
link |
00:11:02.800
like the whatever 2010s explosion of things
link |
00:11:07.120
where everything became available on the computer
link |
00:11:11.920
and you kind of got this like lone wizard energy thing going.
link |
00:11:16.680
So you embraced being the loneliness.
link |
00:11:19.680
Is the loneliness somehow an engine of creativity?
link |
00:11:22.440
Like, so most of your stuff,
link |
00:11:24.560
most of your creative, quote unquote, genius
link |
00:11:28.640
is in the privacy of your mind.
link |
00:11:32.120
Yes.
link |
00:11:33.480
Well, it was, but here's the thing.
link |
00:11:39.040
I was talking to Daniel Ek and he said,
link |
00:11:40.840
he's like most artists, they have about 10 years,
link |
00:11:43.400
like 10 good years.
link |
00:11:45.160
And then they usually stop making their like vital shit.
link |
00:11:49.840
And I feel like I'm sort of like nearing the end
link |
00:11:53.360
of my 10 years on my own.
link |
00:11:56.520
And so you have to become somebody else.
link |
00:11:58.640
Now I'm like,
link |
00:11:59.480
I'm in the process of becoming somebody else
link |
00:12:01.000
and reinventing when I work with other people
link |
00:12:02.880
because I've never worked with other people.
link |
00:12:04.200
I find that I make like,
link |
00:12:06.400
that I'm exceptionally rejuvenated
link |
00:12:08.400
and making like some of the most vital work I've ever made.
link |
00:12:11.000
So, because I think another human brain is like
link |
00:12:14.120
one of the best tools you can possibly find.
link |
00:12:17.560
Like, it's a funny way to put it, I love it.
link |
00:12:20.600
It's like, if a tool is like, you know,
link |
00:12:23.360
whatever HP plus one or like adds some like stats
link |
00:12:27.360
to your character, like another human brain
link |
00:12:30.800
will like square it instead of just like adding something.
link |
00:12:34.240
Double up the experience points.
link |
00:12:35.640
I love this.
link |
00:12:36.480
We should also mention we're playing Tavern music
link |
00:12:38.320
before this and which I love, which I first one,
link |
00:12:41.600
I think I first.
link |
00:12:42.440
You have to stop the Tavern music.
link |
00:12:43.800
Yeah, because it doesn't, the audio.
link |
00:12:46.440
Okay, okay.
link |
00:12:47.280
But it makes.
link |
00:12:48.120
Yeah, it'll make the podcast going.
link |
00:12:48.960
Add it in post, add it in post.
link |
00:12:50.040
No one will want to listen to the podcast.
link |
00:12:51.600
It probably would, but it makes me,
link |
00:12:53.440
it reminds me like a video game,
link |
00:12:55.480
like a role playing video game
link |
00:12:56.760
where you have experience points.
link |
00:12:58.400
There's something really joyful about wandering places
link |
00:13:03.440
like Elder Scrolls, like Skyrim,
link |
00:13:06.480
just exploring these landscapes in another world.
link |
00:13:10.520
And then you get experience points
link |
00:13:12.000
and you can work on different skills
link |
00:13:13.960
and somehow you progress in life.
link |
00:13:16.120
And I don't know, it's simple.
link |
00:13:17.600
It doesn't have some of the messy complexities of life.
link |
00:13:19.960
And there's usually a bad guy.
link |
00:13:21.280
You can fight in Skyrim, it's dragons and so on.
link |
00:13:25.560
I'm sure in Elden Ring,
link |
00:13:26.480
there's a bunch of monsters you can fight.
link |
00:13:28.280
I love that.
link |
00:13:29.120
I feel like Elden Ring,
link |
00:13:29.960
I feel like this is a good analogy to music production though
link |
00:13:32.400
because it's like, I feel like the engineers
link |
00:13:34.440
and the people creating these open worlds
link |
00:13:36.240
are, it's sort of like similar to people,
link |
00:13:38.760
to music producers,
link |
00:13:39.680
whereas it's like this hidden archetype
link |
00:13:42.760
that like no one really understands what they do
link |
00:13:44.640
and no one really knows who they are,
link |
00:13:46.200
but they're like, it's like the artist engineer
link |
00:13:49.320
because it's like, it's both art
link |
00:13:51.760
and fairly complex engineering.
link |
00:13:54.840
Well, you're saying they don't get enough credit.
link |
00:13:57.200
Aren't you kind of changing that
link |
00:13:58.600
by becoming the person doing everything?
link |
00:14:01.320
Aren't you, isn't the engineer?
link |
00:14:03.680
Well, I mean, others have gone before me.
link |
00:14:05.440
I'm not, you know, there's like Timbaland and Skrillex
link |
00:14:07.800
and there's all these people that are like,
link |
00:14:10.360
you know, very famous for this.
link |
00:14:12.040
But I just think the general,
link |
00:14:13.920
I think people get confused about what it is
link |
00:14:15.920
and just don't really know what it is per se.
link |
00:14:19.200
And it's just when I see a song,
link |
00:14:20.480
like when there's like a hit song,
link |
00:14:22.280
like I'm just trying to think of like,
link |
00:14:27.520
just going for like even just a basic pop hit,
link |
00:14:29.840
like was it like New Rules by Dua Lipa or something?
link |
00:14:36.080
The production on that is actually like really crazy.
link |
00:14:39.200
I mean, the song is also great,
link |
00:14:40.560
but it's like the production is exceptionally memorable.
link |
00:14:43.360
Like, you know, and it's just like no one,
link |
00:14:47.200
I can't, I don't even know who produced that song.
link |
00:14:49.160
It just like isn't part of like the rhetoric
link |
00:14:50.680
of how we just discuss the creation of art.
link |
00:14:53.440
We just sort of like don't consider the music producer
link |
00:14:57.200
because I think the music producer used to be more,
link |
00:15:00.320
just simply recording things.
link |
00:15:03.680
Yeah, that's interesting.
link |
00:15:04.640
Cause when you think about movies,
link |
00:15:06.040
we talk about the actor and the actresses,
link |
00:15:08.600
but we also talk about the director.
link |
00:15:10.440
Directors, yeah.
link |
00:15:11.520
We don't talk about like that with the music as often.
link |
00:15:14.440
The Beatles music producer was one of the first kind of,
link |
00:15:19.360
got one of the first people sort of introducing
link |
00:15:21.240
crazy sound design into pop music.
link |
00:15:22.640
I forget his name.
link |
00:15:24.160
He has the same, I forget his name, but, you know,
link |
00:15:28.000
like he was doing all the weird stuff,
link |
00:15:29.160
like dropping pianos and like, yeah.
link |
00:15:32.400
Oh, to get the, yeah, yeah, yeah, yeah.
link |
00:15:33.480
To get the sound, to get the authentic sound.
link |
00:15:36.560
What about lyrics?
link |
00:15:38.080
You think those, where do they fit in, how important are they?
link |
00:15:42.960
I was heartbroken to learn that Elvis didn't write his songs.
link |
00:15:46.800
I was very mad.
link |
00:15:47.880
A lot of people don't write their songs.
link |
00:15:49.520
I understand this, but.
link |
00:15:50.840
But here's the thing.
link |
00:15:52.200
I feel like there's this desire for authenticity.
link |
00:15:54.880
I used to be like really mad
link |
00:15:56.120
when like people wouldn't write or produce their music.
link |
00:15:58.000
And I'd be like, that's fake.
link |
00:15:59.240
And then I realized there's all this like weird bitterness
link |
00:16:04.520
and like acrimony in art about authenticity.
link |
00:16:07.760
But I had this kind of like weird realization recently
link |
00:16:10.800
where I started thinking that like,
link |
00:16:14.480
art is sort of a decentralized collective thing.
link |
00:16:20.240
Like art is kind of a conversation with all the artists
link |
00:16:26.720
that have ever lived before you, you know?
link |
00:16:29.080
Like it's like, you're really just sort of,
link |
00:16:31.120
it's not like anyone's reinventing the wheel here.
link |
00:16:33.560
Like you're kind of just taking, you know,
link |
00:16:36.680
thousands of years of art
link |
00:16:38.240
and like running it through your own little algorithm
link |
00:16:41.720
and then like making your like your interpretation of it.
link |
00:16:45.080
You just joined the conversation
link |
00:16:46.280
with all the other artists that came before.
link |
00:16:47.600
It's such a beautiful way to look at it.
link |
00:16:49.600
Like, and it's like, I feel like everyone's always like,
link |
00:16:51.560
there's always copyright and IP and this and that
link |
00:16:54.120
or authenticity.
link |
00:16:55.200
And it's just like, I think we need to stop seeing this
link |
00:16:59.240
as this like egotistical thing of like,
link |
00:17:01.600
oh, the creative genius, the lone creative genius
link |
00:17:04.200
or this or that.
link |
00:17:05.040
It's like, I think art isn't, shouldn't be about that.
link |
00:17:08.760
I think art is something
link |
00:17:09.600
that sort of brings humanity together.
link |
00:17:12.040
And it's also art is also kind of the collective memory
link |
00:17:14.080
of humans.
link |
00:17:14.920
It's like, we don't, like we don't give a fuck
link |
00:17:17.920
about whatever ancient Egypt,
link |
00:17:20.280
like how much grain got sent that day
link |
00:17:22.760
and sending the records and like, you know, like
link |
00:17:25.760
who went where and, you know,
link |
00:17:27.600
how many shields needed to be produced for this.
link |
00:17:29.640
Like we just remember their art.
link |
00:17:32.240
And it's like, you know, it's like in our day to day life,
link |
00:17:34.840
there's all this stuff that seems more important than art
link |
00:17:38.080
because it helps us function and survive.
link |
00:17:40.200
But when all this is gone, like the only thing
link |
00:17:42.840
that's really going to be left is the art.
link |
00:17:45.040
The technology will be obsolete.
link |
00:17:46.800
That's so fascinating.
link |
00:17:47.640
Like the humans will be dead.
link |
00:17:49.080
That is true.
link |
00:17:49.920
A good compression of human history is the art
link |
00:17:53.000
we've generated across the different centuries
link |
00:17:56.200
of different millennia.
link |
00:17:57.800
So when the aliens come.
link |
00:17:59.920
When the aliens come,
link |
00:18:00.760
they're going to find the hieroglyphics and the pyramids.
link |
00:18:02.760
I mean, art could be broadly defined.
link |
00:18:04.360
They might find like the engineering marvels,
link |
00:18:06.240
the bridges, the rockets, the...
link |
00:18:09.840
I guess I sort of classify though.
link |
00:18:11.480
Architecture is art.
link |
00:18:12.840
Yes.
link |
00:18:13.680
I consider engineering in those formats to be art, for sure.
link |
00:18:19.360
It sucks that like digital art is easier to delete.
link |
00:18:23.200
So if there's an apocalypse, a nuclear war
link |
00:18:25.800
that can disappear and the physical,
link |
00:18:28.840
there's something still valuable
link |
00:18:30.000
about the physical manifestation of art.
link |
00:18:32.360
That sucks that like music, for example,
link |
00:18:35.600
has to be played by somebody.
link |
00:18:37.960
Yeah, I mean, I do think we should have
link |
00:18:40.200
a foundation type situation where we like,
link |
00:18:42.840
you know how we have like seed banks up in the North
link |
00:18:44.760
and stuff?
link |
00:18:45.600
Like we should probably have like a solar powered
link |
00:18:48.040
or geothermal little bunker
link |
00:18:49.800
that like has all human knowledge.
link |
00:18:52.400
You mentioned Daniel Ek and Spotify.
link |
00:18:55.280
What do you think about that as an artist?
link |
00:18:57.000
What's Spotify?
link |
00:18:58.280
Is that empowering?
link |
00:18:59.760
Like to me, Spotify as a consumer is super exciting.
link |
00:19:02.600
It makes it easy for me to access music
link |
00:19:05.000
from all kinds of artists,
link |
00:19:06.640
get to explore all kinds of music,
link |
00:19:08.480
make it super easy to sort of curate my playlist
link |
00:19:12.320
and have fun with all that.
link |
00:19:14.000
It was so liberating to let go.
link |
00:19:16.080
You know, I used to collect albums and CDs and so on.
link |
00:19:19.400
Like I would like hoard albums.
link |
00:19:22.160
Yeah.
link |
00:19:23.000
Like they matter.
link |
00:19:23.840
But the reality you can, you know,
link |
00:19:25.680
that was really liberating.
link |
00:19:26.920
I can let go of that and letting go of the albums
link |
00:19:30.520
you're kind of collecting
link |
00:19:32.160
allows you to find new music,
link |
00:19:33.600
exploring new artists and all that kind of stuff.
link |
00:19:36.200
But I know from a perspective of an artist that could be,
link |
00:19:38.400
like you mentioned,
link |
00:19:39.240
competition could be a kind of constraint
link |
00:19:42.040
because there's more and more and more artists
link |
00:19:45.000
on the platform.
link |
00:19:46.080
I think it's better that there's more artists.
link |
00:19:47.920
I mean, again, this might be propaganda
link |
00:19:49.840
because this is all from a conversation with Daniel Ek.
link |
00:19:51.720
So this could easily be propaganda, like.
link |
00:19:54.080
We're all a victim of somebody's propaganda.
link |
00:19:56.720
So let's just accept this.
link |
00:19:58.960
But Daniel Ek was telling me that, you know, at the,
link |
00:20:01.720
cause I, you know, when I met him, I like,
link |
00:20:04.640
I came in all furious about Spotify
link |
00:20:06.480
and like I grilled him super hard.
link |
00:20:07.800
So I've got his answers here.
link |
00:20:10.680
But he was saying like at the sort of peak of the CD industry,
link |
00:20:15.280
there was like 20,000 artists
link |
00:20:17.480
making millions and millions of dollars.
link |
00:20:19.560
Like there was just like a very tiny kind of 1%.
link |
00:20:22.880
And Spotify has kind of democratized the industry
link |
00:20:27.440
because now I think he said there's about a million
link |
00:20:29.720
artists making a good living from Spotify.
link |
00:20:33.160
And when I heard that, I was like, honestly,
link |
00:20:36.920
I would rather make less money
link |
00:20:38.840
and have just like a decent living
link |
00:20:42.720
and have more artists be able to have that.
link |
00:20:46.560
Even though I like, I wish it could include everyone, but.
link |
00:20:49.320
Yeah, that's really hard to argue with.
link |
00:20:50.760
YouTube is the same, that's YouTube's mission.
link |
00:20:54.120
They wanna basically have as many creators as possible
link |
00:20:58.280
make a living, some kind of living.
link |
00:21:00.720
And that's so hard to argue with.
link |
00:21:03.400
But I think there's better ways to do it.
link |
00:21:04.480
My manager, I actually wish he was here.
link |
00:21:06.320
I like, I would have brought him up.
link |
00:21:07.840
My manager is building an app that can manage you.
link |
00:21:13.800
So it'll like help you organize your percentages
link |
00:21:16.520
and get your publishing and da, da, da, da, da.
link |
00:21:18.840
So you can take out all the middle men
link |
00:21:20.000
so you can have a much bigger, it'll just like automate it.
link |
00:21:23.040
So you can get.
link |
00:21:23.880
So automate the manager?
link |
00:21:24.720
Automate, automate managing, management, publishing.
link |
00:21:29.400
Like and legal, it can read,
link |
00:21:32.480
the app he's building can read your contract
link |
00:21:34.120
and like tell you about it.
link |
00:21:35.680
Because one of the issues with music right now,
link |
00:21:38.320
it's not that we're not getting paid enough,
link |
00:21:39.840
but it's that the art industry is filled with middle men
link |
00:21:45.000
because artists are not good at business.
link |
00:21:47.760
And from the beginning, like Frank Sinatra,
link |
00:21:50.400
it's all mob stuff.
link |
00:21:51.640
Like it's the music industry is run by business people,
link |
00:21:56.720
not the artists.
link |
00:21:57.560
And the artists really get very small cuts
link |
00:21:59.520
of like what they make.
link |
00:22:00.440
And so I think part of the reason I'm a technocrat,
link |
00:22:04.760
which I mean, your fans are gonna be technocrats.
link |
00:22:07.080
So no one's, they're not gonna be mad at me about this,
link |
00:22:09.280
but like my fans hate it when I say this kind of thing.
link |
00:22:12.200
Or the general public.
link |
00:22:13.040
They don't like technocrats.
link |
00:22:14.240
They don't like technocrats.
link |
00:22:15.640
Like when I watched Alita: Battle Angel,
link |
00:22:18.800
and they were like, the Martian technocracy.
link |
00:22:20.440
And I was like, yeah, Martian technocracy.
link |
00:22:22.080
And then they were like, and they're evil.
link |
00:22:23.600
And I was like, oh, okay.
link |
00:22:25.640
I was like, because Martian technocracy sounds sick to me.
link |
00:22:28.840
Yeah, so your intuition as technocrats
link |
00:22:32.000
would create some kind of beautiful world.
link |
00:22:34.280
For example, what my manager's working on,
link |
00:22:36.160
if you can create an app that removes the need for a lawyer
link |
00:22:39.960
and then you could have smart contracts on the blockchain,
link |
00:22:43.440
removes the need for like management
link |
00:22:46.880
and organizing all the stuff.
link |
00:22:48.080
Like can read your stuff and explain it to you,
link |
00:22:51.000
can collect your royalties.
link |
00:22:54.240
Like then the small amounts,
link |
00:22:57.000
the amount of money that you're getting from Spotify
link |
00:22:58.720
actually means a lot more and goes a lot farther.
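As an aside for readers: the core of the automation described here, collecting royalties and splitting them without middlemen, is just deterministic percentage splitting. A minimal Python sketch, where all names and percentages are invented for illustration (a real system would read the shares from contracts or an on-chain registry):

```python
# Hypothetical royalty splitter: divide an incoming streaming payment
# among rights holders according to agreed ownership fractions.
# The holder names and share values below are made up for illustration.

def split_royalties(payment_cents, shares):
    """Split a payment (in cents) by ownership fractions.

    shares: dict mapping rights holder -> fraction; fractions must sum to 1.
    Returns a dict mapping rights holder -> payout in cents.
    Any rounding remainder from integer division goes to the first holder,
    so no cents are ever lost or invented.
    """
    if abs(sum(shares.values()) - 1.0) > 1e-9:
        raise ValueError("shares must sum to 1.0")
    payouts = {holder: int(payment_cents * frac)
               for holder, frac in shares.items()}
    remainder = payment_cents - sum(payouts.values())
    first = next(iter(payouts))
    payouts[first] += remainder
    return payouts

# Example: a 70/20/10 split of a $1,000.00 payment.
print(split_royalties(100_000, {"artist": 0.7, "producer": 0.2, "label": 0.1}))
```

The point of the remainder line is the property a middleman normally guarantees by hand: the payouts always sum exactly to the payment.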
link |
00:23:01.800
They can remove some of the bureaucracy,
link |
00:23:03.160
some of the inefficiencies that make life
link |
00:23:06.400
not as great as it could be.
link |
00:23:08.240
Yeah, I think the issue isn't that there's not enough.
link |
00:23:10.880
Like the issue is that there's inefficiency.
link |
00:23:12.720
And I'm really into this positive-sum mindset,
link |
00:23:18.880
the win-win mindset of like,
link |
00:23:20.840
instead of fighting over the scraps,
link |
00:23:23.520
how do we make the, or worrying about scarcity,
link |
00:23:26.520
like instead of a scarcity mindset,
link |
00:23:27.800
why don't we just increase the efficiency
link |
00:23:30.080
and in that way.
link |
00:23:32.360
Expand the size of the pie.
link |
00:23:34.400
Let me ask you about experimentation.
link |
00:23:36.560
So you said, which is beautiful,
link |
00:23:38.620
being a musician is like having a conversation
link |
00:23:42.860
with all those that came before you.
link |
00:23:45.500
How much of creating music is like,
link |
00:23:51.220
kind of having that conversation,
link |
00:23:53.100
trying to fit into the cultural trends
link |
00:23:57.300
and how much of it is like trying to as much as possible
link |
00:24:00.180
be an outsider and come up with something totally new.
link |
00:24:02.660
Like when you're thinking, when you're experimenting,
link |
00:24:05.700
are you trying to be totally different, totally weird?
link |
00:24:08.740
Are you trying to fit in?
link |
00:24:12.140
Man, this is so hard.
link |
00:24:13.180
Cause I feel like I'm kind of in the process
link |
00:24:15.540
of semi retiring from music.
link |
00:24:16.940
So this is like my old brain.
link |
00:24:18.780
Yeah, bring it from like the shelf,
link |
00:24:22.100
put it on the table for a couple of minutes.
link |
00:24:24.500
We'll just poke it.
link |
00:24:26.340
I think it's a bit of both,
link |
00:24:27.300
because I think forcing yourself to engage with new music
link |
00:24:32.460
is really great for neural plasticity.
link |
00:24:35.060
Like I think as people,
link |
00:24:39.540
part of the reason music is marketed at young people
link |
00:24:41.740
is cause young people are very neural plastic.
link |
00:24:43.420
So like if you're 16 to like 23 or whatever,
link |
00:24:48.060
it's gonna be really easy for you to love new music.
link |
00:24:50.900
And if you're older than that,
link |
00:24:52.300
it gets harder and harder and harder.
link |
00:24:53.820
And I think one of the beautiful things
link |
00:24:55.020
about being a musician is I just constantly force myself
link |
00:24:57.940
to listen to new music.
link |
00:24:58.780
And I think it keeps my brain really plastic.
link |
00:25:01.020
And I think this is a really good exercise.
link |
00:25:02.820
I just think everyone should do this.
link |
00:25:04.340
You listen to new music and you hate it.
link |
00:25:05.660
I think you should just keep forcing yourself to like,
link |
00:25:08.620
okay, well, why do people like it?
link |
00:25:09.980
And like, you know,
link |
00:25:11.820
make your brain form new neural pathways
link |
00:25:14.700
and be more open to change.
link |
00:25:16.900
That's really brilliant actually.
link |
00:25:18.300
Sorry to interrupt,
link |
00:25:19.140
but like that exercise is really amazing
link |
00:25:24.900
to sort of embrace change,
link |
00:25:27.500
embrace sort of practice on your plasticity.
link |
00:25:31.620
Because like that's one of the things you've,
link |
00:25:33.220
you fall in love with a certain band
link |
00:25:34.420
and you just kind of stay with that for the rest of life.
link |
00:25:36.780
And then you never understand the modern music.
link |
00:25:38.420
That's a really good exercise.
link |
00:25:39.260
Most of the streaming on Spotify
link |
00:25:40.620
is like classic rock and stuff.
link |
00:25:42.420
Like new music makes up a very small chunk
link |
00:25:44.780
of what is played on Spotify.
link |
00:25:46.820
And I think this is like not a good sign for us as a species.
link |
00:25:50.140
I think, yeah.
link |
00:25:52.860
So it's a good measure of the species open mindedness
link |
00:25:57.740
to change as how often you listen to new music.
link |
00:26:01.140
The brain, let's put the music brain back on the shelf.
link |
00:26:05.180
I got to pull out the futurist brain for a second.
link |
00:26:09.700
In what wild ways do you think the future?
link |
00:26:12.300
Say in like 30 years, maybe 50 years,
link |
00:26:14.980
maybe a hundred years will be different
link |
00:26:18.340
from like from our current way of life on earth.
link |
00:26:22.140
We can talk about augmented reality, virtual reality,
link |
00:26:25.420
maybe robots, maybe space travel,
link |
00:26:28.780
maybe video games, maybe genetic engineering.
link |
00:26:32.540
I can keep going, cyborgs, aliens, world wars,
link |
00:26:36.260
maybe destructive nuclear wars, good and bad.
link |
00:26:40.300
When you think about the future, what are you imagining?
link |
00:26:43.540
What's the weirdest and the wildest it could be?
link |
00:26:47.620
Have you read Surface Detail by Iain Banks?
link |
00:26:51.460
Surface Detail is my favorite depiction of a,
link |
00:26:54.780
oh, wow, you have to read this book.
link |
00:26:56.540
It's literally the greatest science fiction book
link |
00:26:58.660
possibly ever.
link |
00:26:59.660
Iain Banks is the man, yeah, for sure.
link |
00:27:01.580
What have you read?
link |
00:27:03.180
Just The Player of Games.
link |
00:27:04.500
I read that titles can't be copyrighted
link |
00:27:07.340
so you can just steal them.
link |
00:27:08.300
And I was like, Player of Games, sick.
link |
00:27:09.940
Nice.
link |
00:27:10.780
Yeah, so you could name your album.
link |
00:27:12.740
Like I always wanted to.
link |
00:27:13.580
Romeo and Juliet or something?
link |
00:27:14.980
I always wanted to name an album War and Peace.
link |
00:27:17.060
Nice.
link |
00:27:17.900
Like that would be like you.
link |
00:27:18.740
That is a good, that's a good,
link |
00:27:20.420
where have I heard that before?
link |
00:27:21.580
You can do that, like you could do that.
link |
00:27:24.300
All those things that are in the public domain.
link |
00:27:26.060
For people who have no clue, you do have a song called
link |
00:27:28.820
Player of Games.
link |
00:27:29.660
Yes.
link |
00:27:30.500
Oh yeah.
link |
00:27:31.340
So Iain Banks' Surface Detail is, in my opinion,
link |
00:27:33.860
the best future that I've ever read about
link |
00:27:37.220
or heard about in science fiction.
link |
00:27:39.540
Basically there's the relationship with super intelligence,
link |
00:27:44.620
like artificial super intelligence is just,
link |
00:27:48.020
it's like great.
link |
00:27:50.380
I want to credit the person who coined this term
link |
00:27:53.060
because I love this term.
link |
00:27:55.260
And I feel like young women don't get enough credit in.
link |
00:28:00.180
Yeah, so if you go to Protopia Futures on Instagram,
link |
00:28:03.940
what is her name?
link |
00:28:05.460
Monika Bielskyte, I'm saying that wrong.
link |
00:28:15.380
And I'm probably gonna,
link |
00:28:16.620
I'm probably butchering this a bit,
link |
00:28:17.700
but Protopia is sort of, if Utopia is unattainable,
link |
00:28:21.580
Protopia is sort of like, you know.
link |
00:28:25.980
Wow, that's an awesome Instagram, Protopia Futures.
link |
00:28:28.660
A great, a future that is, you know, as good as we can get.
link |
00:28:33.460
The future, positive future.
link |
00:28:34.740
AI, is this a centralized AI in Surface Detail
link |
00:28:38.340
or is it distributed?
link |
00:28:39.260
What kind of AI is it?
link |
00:28:40.580
They mostly exist as giant super ships,
link |
00:28:42.780
like sort of like the Guild ships in Dune.
link |
00:28:45.780
Like they're these giant ships
link |
00:28:46.820
that kind of move people around and the ships are sentient.
link |
00:28:49.460
And they can talk to all the passengers.
link |
00:28:52.220
And I mean, there's a lot of different types of AI
link |
00:28:56.420
in the Banksian future,
link |
00:28:58.340
but in the opening scene of Surface Detail,
link |
00:29:01.060
there's this place called the Culture.
link |
00:29:02.340
And the Culture is basically a Protopian Future.
link |
00:29:04.460
And a Protopian Future, I think, is like a future
link |
00:29:08.100
that is like, obviously it's not Utopia, it's not perfect.
link |
00:29:12.380
And like, cause like striving for Utopia,
link |
00:29:14.020
I think feels hopeless and it's sort of like,
link |
00:29:16.980
maybe not the best terminology to be using.
link |
00:29:20.100
So it's like, it's a pretty good place.
link |
00:29:23.980
Like mostly like, you know,
link |
00:29:27.660
super intelligence and biological beings exist
link |
00:29:30.620
fairly in harmony.
link |
00:29:31.900
There's not too much war.
link |
00:29:33.140
There's like as close to equality as you can get.
link |
00:29:35.700
You know, it's like approximately a good future.
link |
00:29:38.700
Like there's really awesome stuff.
link |
00:29:40.180
It's, and in the opening scene,
link |
00:29:45.180
this girl, she's born as a sex slave outside of the culture.
link |
00:29:49.540
So she's in a society that doesn't adhere
link |
00:29:51.220
to the cultural values.
link |
00:29:52.660
She tries to kill the guy who is her like master,
link |
00:29:56.740
but he kills her.
link |
00:29:57.980
But unbeknownst to her,
link |
00:29:59.060
when she was traveling on a ship
link |
00:30:00.740
through the culture with him one day,
link |
00:30:03.380
a ship put a neural lace in her head.
link |
00:30:05.740
And neural lace is sort of like,
link |
00:30:08.500
it's basically a Neuralink,
link |
00:30:11.300
cause life imitates art.
link |
00:30:13.100
It does indeed.
link |
00:30:13.940
It does indeed.
link |
00:30:14.900
So she wakes up and the opening scene is her memory
link |
00:30:17.340
has been uploaded by this neural lace
link |
00:30:19.060
when she's been killed.
link |
00:30:20.060
And now she gets to choose a new body
link |
00:30:22.660
and this AI is interfacing
link |
00:30:25.420
with her recorded memory in her neural lace
link |
00:30:28.140
and helping her and being like, hello, you're dead.
link |
00:30:31.340
But because you had a neural lace,
link |
00:30:32.660
your memory is uploaded.
link |
00:30:33.780
Do you want to choose a new body?
link |
00:30:35.020
And you're going to be born here in the culture
link |
00:30:36.500
and like start a new life,
link |
00:30:38.140
which is just that's like the opening.
link |
00:30:39.940
It's like so sick.
link |
00:30:41.420
And the ship is the super intelligence.
link |
00:30:43.700
All the ships are kind of super intelligence.
link |
00:30:45.100
But they still want to preserve
link |
00:30:46.940
a kind of rich fulfilling experience for the humans.
link |
00:30:49.700
Yeah, like they're like friends with the humans.
link |
00:30:51.020
And then there's a bunch of ships
link |
00:30:51.980
that don't want to exist with biological beings,
link |
00:30:54.460
but they just have their own place like way over there.
link |
00:30:57.140
But they don't, they just do their own thing.
link |
00:30:58.940
They're not necessarily.
link |
00:31:00.340
So it's a pretty,
link |
00:31:01.660
this Protopian existence is pretty peaceful.
link |
00:31:03.660
Yeah. I mean, and then, and then for example,
link |
00:31:05.900
one of the main fights in the book is they're fighting,
link |
00:31:10.220
there's these artificial hells
link |
00:31:11.780
that and people don't think it's ethical
link |
00:31:16.700
to have artificial hell.
link |
00:31:17.620
Like basically when people do crime,
link |
00:31:18.860
they get sent, like when they die,
link |
00:31:20.260
their memory gets sent to an artificial hell
link |
00:31:21.940
and they're eternally tortured.
link |
00:31:23.340
And so, and then the way that society is deciding
link |
00:31:27.660
whether or not to have the artificial hell
link |
00:31:29.300
is that they're having these simulated,
link |
00:31:31.900
they're having like a simulated war.
link |
00:31:33.220
So instead of actual blood, you know,
link |
00:31:35.980
people are basically essentially fighting in a video game
link |
00:31:38.500
to choose the outcome of this.
link |
00:31:40.100
But they're still experiencing the suffering
link |
00:31:42.820
in this artificial hell or no, can you experience stuff?
link |
00:31:45.780
So the artificial hell sucks.
link |
00:31:47.300
And a lot of people in the culture
link |
00:31:48.380
want to get rid of the artificial hell.
link |
00:31:49.940
There's a simulated wars,
link |
00:31:51.340
are they happening in the artificial hell?
link |
00:31:53.340
So the simulated wars are happening
link |
00:31:55.540
outside of the artificial hell,
link |
00:31:57.140
between the political factions who are,
link |
00:31:59.980
so this political faction says
link |
00:32:01.740
we should have simulated hell to deter crime.
link |
00:32:05.060
And this political faction is saying,
link |
00:32:06.980
no, simulated hell is unethical.
link |
00:32:08.980
And so instead of like having, you know,
link |
00:32:11.700
blowing each other up with nukes,
link |
00:32:13.100
they're having like a giant Fortnite battle to decide this.
link |
00:32:18.940
Which, you know, to me, that's protopia.
link |
00:32:21.660
That's like, okay, we can have war without death.
link |
00:32:25.620
You know, I don't think there should be simulated hells.
link |
00:32:27.420
I think that is definitely one of the ways
link |
00:32:29.820
in which technology could go very, very, very, very wrong.
link |
00:32:34.300
So almost punishing people in the digital space
link |
00:32:37.100
or something like that?
link |
00:32:37.940
Or torturing people's memories?
link |
00:32:41.660
So either as a deterrent, like if you committed a crime,
link |
00:32:44.780
but also just for personal pleasure,
link |
00:32:46.540
if there's some segmented humans in this world.
link |
00:32:50.260
Dan Carlin actually has this
link |
00:32:55.060
episode of Hardcore History on Painfotainment.
link |
00:32:59.460
Oh, that episode is fucked.
link |
00:33:02.300
Is dark, because he kind of goes through human history
link |
00:33:05.380
and says like, we as humans seem to secretly enjoy
link |
00:33:09.980
or used to openly enjoy sort of the torture
link |
00:33:13.700
and the death, watching the death and torture
link |
00:33:16.180
of other humans.
link |
00:33:17.700
I do think if people were consenting,
link |
00:33:21.740
we should be allowed to have gladiatorial matches.
link |
00:33:26.260
But consent is hard to achieve in those situations.
link |
00:33:28.580
It always starts getting slippery.
link |
00:33:31.060
Like it could be also forced, like it starts getting weird.
link |
00:33:34.260
Yeah, yeah.
link |
00:33:35.260
There's way too much excitement.
link |
00:33:37.300
Like this is what he highlights.
link |
00:33:38.620
There's something about human nature
link |
00:33:40.500
that wants to see that violence.
link |
00:33:42.260
And it's really dark.
link |
00:33:44.300
And you hope that we can sort of overcome
link |
00:33:47.140
that aspect of human nature,
link |
00:33:48.820
but that's still within us somewhere.
link |
00:33:51.260
Well, I think that's what we're doing right now.
link |
00:33:53.220
I have this theory that what is very important
link |
00:33:56.380
about the current moment is that all of evolution
link |
00:34:00.860
has been survival of the fittest up until now.
link |
00:34:03.380
And at some point, it's kind of the lines are kind of fuzzy,
link |
00:34:07.220
but in the recent past or maybe even just right now,
link |
00:34:12.060
we're getting to this point where we can choose
link |
00:34:17.740
intelligent design.
link |
00:34:19.380
Like we probably since like the integration of the iPhone,
link |
00:34:23.380
like we are becoming cyborgs.
link |
00:34:24.780
Like our brains are fundamentally changed.
link |
00:34:27.620
Everyone who grew up with electronics,
link |
00:34:29.300
we are fundamentally different from previous, from Homo sapiens.
link |
00:34:33.380
I call us Homo Techno.
link |
00:34:34.620
I think we have evolved into Homo Techno,
link |
00:34:36.780
which is like essentially a new species.
link |
00:34:39.340
Like if you look at the way,
link |
00:34:41.340
if you took an MRI of my brain,
link |
00:34:43.340
and you took an MRI of like a medieval brain,
link |
00:34:46.540
I think it would be very different,
link |
00:34:48.100
the way that it has evolved.
link |
00:34:49.900
Do you think when historians look back at this time,
link |
00:34:51.940
they'll see like this was a fundamental shift
link |
00:34:53.980
to what a human being is?
link |
00:34:54.860
I think, I do not think we are still Homo sapiens.
link |
00:34:57.980
I believe we are Homo Techno.
link |
00:34:59.460
And I think we have evolved and I think right now,
link |
00:35:05.260
the way we are evolving, we can choose how we do that.
link |
00:35:08.980
And I think we are being very reckless
link |
00:35:10.580
about how we're doing that.
link |
00:35:11.740
Like we're just having social media,
link |
00:35:12.940
but I think this idea that like this is a time
link |
00:35:16.260
to choose intelligent design should be taken very seriously.
link |
00:35:19.540
It like now is the moment to reprogram the human computer.
link |
00:35:22.580
You know, it's like if you go blind,
link |
00:35:27.220
your visual cortex will get taken over with other functions.
link |
00:35:31.940
We can choose our own evolution.
link |
00:35:35.140
We can change the way our brains work.
link |
00:35:37.140
And so we actually have a huge responsibility to do that.
link |
00:35:39.900
And I think I'm not sure who should be responsible for that,
link |
00:35:42.820
but there's definitely not adequate education.
link |
00:35:45.140
We're being inundated with all this technology
link |
00:35:46.900
that is fundamentally changing
link |
00:35:48.980
the physical structure of our brains.
link |
00:35:50.860
And we are not adequately responding to that
link |
00:35:55.860
to choose how we want to evolve.
link |
00:35:57.420
And we could evolve, we could be really whatever we want.
link |
00:36:00.540
And I think this is a really important time.
link |
00:36:02.980
And I think if we choose correctly
link |
00:36:04.260
and we choose wisely, consciousness could exist
link |
00:36:07.620
for a very long time.
link |
00:36:09.700
And integration with AI could be extremely positive.
link |
00:36:12.660
And I don't think enough people are focusing
link |
00:36:14.340
on this specific situation.
link |
00:36:16.340
So you think we might irreversibly screw things up
link |
00:36:18.660
if we get things wrong now?
link |
00:36:19.940
Because the flip side of that
link |
00:36:21.620
seems humans are pretty adaptive.
link |
00:36:23.140
So maybe the way we figure things out
link |
00:36:25.940
is by screwing it up, like social media.
link |
00:36:28.060
Over a generation, we'll see the negative effects
link |
00:36:30.540
of social media and then we build new social medias
link |
00:36:33.020
and we just keep improving stuff.
link |
00:36:34.940
And then we learn the failure from the failures of the past.
link |
00:36:37.660
Because humans seem to be really adaptive.
link |
00:36:39.900
On the flip side, we can get it wrong in a way
link |
00:36:42.980
where literally we create weapons of war
link |
00:36:46.380
or increase hate past a certain threshold
link |
00:36:49.620
we really do a lot of damage.
link |
00:36:51.900
I mean, I think we're optimized
link |
00:36:53.740
to notice the negative things.
link |
00:36:55.700
But I would actually say one of the things
link |
00:37:00.180
that I think people aren't noticing
link |
00:37:02.180
is if you look at Silicon Valley
link |
00:37:03.580
and you look at whatever the technocracy,
link |
00:37:06.660
like what's been happening there,
link |
00:37:08.420
like it's like when Silicon Valley started,
link |
00:37:10.140
it was all just like Facebook
link |
00:37:11.500
and all this like for profit crap
link |
00:37:14.100
that really wasn't particular.
link |
00:37:16.980
I guess it was useful,
link |
00:37:17.900
but it's sort of just like whatever.
link |
00:37:22.220
But now you see like lab grown meat,
link |
00:37:24.740
like compostable or biodegradable single use cutlery
link |
00:37:30.220
or like meditation apps.
link |
00:37:33.180
I think we are actually evolving
link |
00:37:37.380
and changing and technology is changing.
link |
00:37:39.620
I think they're just maybe there isn't quite enough
link |
00:37:44.620
education about this.
link |
00:37:48.180
And also, I don't know if there's like quite enough
link |
00:37:51.220
incentive for it because I think the way capitalism works,
link |
00:37:56.660
what we define as profit,
link |
00:37:58.540
we're also working on an old model
link |
00:38:00.420
of what we define as profit.
link |
00:38:01.580
I really think if we changed the idea of profit
link |
00:38:06.420
to include social good,
link |
00:38:08.260
you can have like economic profit, social good,
link |
00:38:10.780
also counting as profit would incentivize things
link |
00:38:14.140
that are more useful and more whatever spiritual technology
link |
00:38:17.020
or like positive technology or things
link |
00:38:20.340
that help reprogram the human computer in a good way
link |
00:38:22.980
or things that help us intelligently design our new brains.
link |
00:38:28.340
Yeah, there's no reason why within the framework
link |
00:38:30.540
of capitalism, the word profit or the idea of profit
link |
00:38:33.860
can't also incorporate the wellbeing of a human being.
link |
00:38:37.420
So like long-term wellbeing, long-term happiness.
link |
00:38:41.500
Or even for example,
link |
00:38:42.980
we were talking about motherhood, like part of the reason
link |
00:38:44.340
I'm so late is because I had to get the baby to bed.
link |
00:38:47.380
And it's like, I keep thinking about motherhood,
link |
00:38:48.900
how under capitalism, it's like this extremely essential
link |
00:38:53.300
job that is very difficult, that is not compensated.
link |
00:38:56.460
And we sort of like value things
link |
00:38:58.420
by how much we compensate them.
link |
00:39:01.900
And so we really devalue motherhood in our society
link |
00:39:04.860
and pretty much all societies.
link |
00:39:06.100
Like capitalism does not recognize motherhood.
link |
00:39:08.140
It's just a job that you're supposed to do for free.
link |
00:39:11.140
And it's like, but I feel like producing great humans
link |
00:39:15.380
should be seen as a great, as profit under capitalism.
link |
00:39:19.460
Like that should be, that's like a huge social good.
link |
00:39:21.500
Like every awesome human that gets made
link |
00:39:24.140
adds so much to the world.
link |
00:39:25.580
So like if that was integrated into the profit structure,
link |
00:39:29.860
then, you know, and if we potentially found a way
link |
00:39:34.260
to compensate motherhood.
link |
00:39:35.660
So come up with a compensation
link |
00:39:37.940
that's much broader than just money.
link |
00:39:40.180
Or it could just be money.
link |
00:39:42.060
Like what if you just made, I don't know,
link |
00:39:45.500
but I don't know how you'd pay for that.
link |
00:39:46.660
Like, I mean, that's where you start getting into.
link |
00:39:52.180
Reallocation of resources that people get upset over.
link |
00:39:56.340
Well, like what if we made like a motherhood DAO?
link |
00:39:58.620
Yeah, yeah.
link |
00:40:01.220
You know, and, you know, used it to fund like single mothers,
link |
00:40:06.220
like, you know, pay for making babies.
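For readers curious what a "motherhood DAO" could even mean mechanically: at its simplest, a shared pool that distributes contributions to registered recipients. This is an entirely hypothetical sketch (a real DAO would live on-chain with governance and membership rules; all names here are invented):

```python
# Toy model of a donation pool that pays out evenly to registered
# recipients. Purely illustrative, not an actual on-chain contract.

class MotherhoodDAO:
    def __init__(self):
        self.pool = 0          # total undistributed funds, in cents
        self.recipients = []   # registered recipient names

    def donate(self, amount):
        self.pool += amount

    def register(self, name):
        self.recipients.append(name)

    def distribute(self):
        """Split the pool evenly; any indivisible remainder stays pooled."""
        if not self.recipients:
            return {}
        share = self.pool // len(self.recipients)
        payouts = {name: share for name in self.recipients}
        self.pool -= share * len(self.recipients)
        return payouts

dao = MotherhoodDAO()
dao.register("alice")
dao.register("bea")
dao.donate(101)
print(dao.distribute())  # {'alice': 50, 'bea': 50}, with 1 cent left pooled
```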
link |
00:40:13.660
So I mean, if you create and put beautiful things
link |
00:40:17.380
out into the world, that could be companies,
link |
00:40:19.780
that can be bridges, that could be art,
link |
00:40:22.660
that could be a lot of things, and that could be children,
link |
00:40:26.740
which are, or education or anything,
link |
00:40:29.940
that could just be valued by society.
link |
00:40:32.420
And that should be somehow incorporated into the framework
link |
00:40:35.180
of what, sort of as a market of what,
link |
00:40:38.500
like if you contribute children to this world,
link |
00:40:40.540
that should be valued and respected
link |
00:40:42.460
and sort of celebrated, like proportional to what it is,
link |
00:40:47.980
which is it's the thing that fuels human civilization.
link |
00:40:51.540
Yeah, like, it's kind of important.
link |
00:40:52.980
I feel like everyone's always saying,
link |
00:40:54.580
I mean, I think we're in very different social spheres,
link |
00:40:56.540
but everyone's always saying like, dismantle capitalism.
link |
00:40:58.860
And I'm like, well, okay,
link |
00:40:59.700
well, I don't think the government should own everything.
link |
00:41:02.140
Like, I don't think we should not have private ownership.
link |
00:41:04.220
Like that's scary.
link |
00:41:05.380
You know, like that starts getting into weird stuff
link |
00:41:07.340
and just sort of like,
link |
00:41:08.940
I feel like there's almost no way to do that
link |
00:41:10.100
without a police state, you know?
link |
00:41:13.540
But obviously capitalism has some major flaws.
link |
00:41:17.020
And I think actually Mac showed me this idea
link |
00:41:20.420
called social capitalism,
link |
00:41:22.180
which is a form of capitalism
link |
00:41:23.540
that just like considers social good to be also profit.
link |
00:41:28.620
Like, you know, it's like right now,
link |
00:41:30.220
companies need to, like you're supposed to grow every quarter
link |
00:41:33.620
or whatever to like show that you're functioning well.
link |
00:41:38.740
But it's like, okay, well,
link |
00:41:39.940
what if you kept the same amount of profit?
link |
00:41:42.940
You're still in the green,
link |
00:41:43.940
but then you have also all the social good.
link |
00:41:45.620
Like, do you really need all this extra economic growth
link |
00:41:47.580
or could you add this social good and that counts?
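The accounting idea being described can be made concrete with a toy blended metric. This is not an actual economic model, just an illustration of the claim: if social good counts toward "profit," a company with flat economic profit but growing social impact still shows growth. The weight and units are invented assumptions:

```python
# Toy "social capitalism" metric: treat measured social good (expressed
# in the same units as profit, e.g. dollars of estimated social value)
# as partially counting toward a company's headline number.
# The 0.5 weighting is an arbitrary illustrative assumption.

def blended_profit(economic_profit, social_good_value, weight=0.5):
    """Combine economic profit with weighted social-good value."""
    return economic_profit + weight * social_good_value

# Flat economic profit across two quarters, but growing social good:
q1 = blended_profit(1_000_000, 200_000)   # 1,100,000.0
q2 = blended_profit(1_000_000, 600_000)   # 1,300,000.0
print(q2 > q1)  # True: the blended measure still shows quarterly growth
```

The hard part, which the code sidesteps entirely, is measuring social good in comparable units in the first place.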
link |
00:41:49.540
And, you know, I don't know if I am not an economist.
link |
00:41:53.980
I have no idea how this could be achieved, but it's just...
link |
00:41:56.420
I don't think economists know
link |
00:41:57.820
how anything could be achieved either,
link |
00:41:59.580
but they pretend it's the thing.
link |
00:42:01.100
They construct a model and they go on TV shows
link |
00:42:04.300
and sound like an expert.
link |
00:42:06.180
That's the definition of an economist.
link |
00:42:08.540
How did becoming a mother
link |
00:42:12.260
change you as a human being, would you say?
link |
00:42:16.140
Man, I think it kind of changed everything
link |
00:42:21.020
and it's still changing me a lot.
link |
00:42:22.380
It's actually changing me more right now in this moment
link |
00:42:24.900
than it was before.
link |
00:42:26.380
Like today, like this...
link |
00:42:28.500
Just like in the most recent months and stuff.
link |
00:42:33.220
Can you elucidate that change?
link |
00:42:37.580
Like when you wake up in the morning
link |
00:42:39.340
and you look at yourself, it's again, which, who are you?
link |
00:42:42.900
How have you become different, would you say?
link |
00:42:45.500
I think it's just really reorienting my priorities.
link |
00:42:50.660
And at first I was really fighting against that
link |
00:42:53.420
because I somehow felt it was like a failure
link |
00:42:55.580
of feminism or something.
link |
00:42:56.700
Like I felt like it was bad
link |
00:42:59.860
if my kids started mattering more than my work.
link |
00:43:05.700
And then more recently,
link |
00:43:07.460
I started sort of analyzing that thought in myself
link |
00:43:10.940
and being like, that's also kind of a construct.
link |
00:43:13.940
It's like we've just devalued motherhood so much
link |
00:43:16.340
in our culture that I feel guilty
link |
00:43:18.820
for caring about my kids more than I care about my work.
link |
00:43:23.340
So feminism includes breaking out of whatever the construct is.
link |
00:43:28.340
So just continually breaking,
link |
00:43:30.900
it's like freedom, empowering you to be free.
link |
00:43:34.620
And that means...
link |
00:43:37.580
But it also, but like being a mother,
link |
00:43:40.060
like I'm so much more creative.
link |
00:43:41.860
Like I cannot believe the massive amount of brain growth
link |
00:43:48.140
that I am.
link |
00:43:48.980
What do you think that is?
link |
00:43:49.820
Just cause like the stakes are higher somehow?
link |
00:43:51.980
I think it's like, it's just so trippy
link |
00:43:57.980
watching consciousness emerge.
link |
00:44:00.780
It's just like going on a crazy journey or something.
link |
00:44:07.340
It's like the craziest science fiction novel you could ever read.
link |
00:44:10.860
It's just so crazy watching consciousness come into being.
link |
00:44:15.060
And then at the same time,
link |
00:44:16.500
like you're forced to value your time so much.
link |
00:44:21.100
Like when I have creative time now, it's so sacred.
link |
00:44:23.500
I need to like be really fricking on it.
link |
00:44:29.340
But the other thing is that I used to just be like a cynic
link |
00:44:34.660
and I used to just wanna,
link |
00:44:35.460
like my last album was called Miss Anthropocene.
link |
00:44:38.060
And it was like this like, it was like a study in villainy.
link |
00:44:42.140
Like, or like it was like, well, what if, you know,
link |
00:44:44.380
we have instead of the old gods, we have like new gods.
link |
00:44:46.780
And it's like Miss Anthropocene is like misanthrope
link |
00:44:49.700
plus Anthropocene, which is like the, you know,
link |
00:44:53.100
and she's the goddess of climate change or whatever.
link |
00:44:55.780
And she's like destroying the world.
link |
00:44:56.980
And it was just like, it was like dark
link |
00:44:59.660
and it was like a study in villainy.
link |
00:45:01.380
And it was sort of just like, like,
link |
00:45:02.900
I used to like have no problem just making cynical,
link |
00:45:06.820
angry, scary art.
link |
00:45:08.980
And not that there's anything wrong with that,
link |
00:45:11.100
but I think having kids just makes you such an optimist.
link |
00:45:13.580
It just inherently makes you wanna be an optimist so bad
link |
00:45:16.940
that like I feel more of a responsibility
link |
00:45:20.380
to make more optimistic things.
link |
00:45:23.820
And I get a lot of shit for it cause everyone's like,
link |
00:45:27.380
oh, you're so privileged.
link |
00:45:28.500
Stop talking about like pie in the sky,
link |
00:45:30.260
stupid concepts and focus on like the now.
link |
00:45:32.700
But it's like, I think if we don't ideate
link |
00:45:36.500
about futures that could be good,
link |
00:45:40.540
we won't be able to get them.
link |
00:45:41.540
If everything is Blade Runner,
link |
00:45:42.780
then we're gonna end up with Blade Runner.
link |
00:45:44.260
It's like, as we said earlier, life imitates art.
link |
00:45:47.260
Like life really does imitate art.
link |
00:45:49.700
And so we really need more protopian or utopian art.
link |
00:45:53.940
I think this is incredibly essential
link |
00:45:56.060
for the future of humanity.
link |
00:45:58.020
And I think the current discourse, where
link |
00:46:03.220
thinking about protopia or utopia
link |
00:46:09.620
is seen as a dismissal of the problems
link |
00:46:11.540
that we currently have.
link |
00:46:12.380
I think that is an incorrect mindset.
link |
00:46:16.100
And like having kids just makes me wanna imagine
link |
00:46:20.140
amazing futures that like maybe I won't be able to build
link |
00:46:24.340
but they will be able to build if they want to.
link |
00:46:26.860
Yeah, it does seem like ideation is a precursor
link |
00:46:29.180
to creation.
link |
00:46:30.500
You have to imagine it in order to be able to build it.
link |
00:46:33.740
And there is a sad thing about human nature
link |
00:46:36.660
that somehow a cynical view of the world
link |
00:46:40.700
is seen as an insightful view.
link |
00:46:44.340
Cynicism is often confused for insight,
link |
00:46:46.940
which is sad to see.
link |
00:46:48.860
And optimism is confused for naivete.
link |
00:46:52.580
Yes, yes.
link |
00:46:53.420
Like you don't, you're blinded by your,
link |
00:46:57.620
maybe your privilege or whatever.
link |
00:46:59.260
You're blinded by something but you're certainly blinded.
link |
00:47:01.980
That's sad.
link |
00:47:03.740
That's sad to see because it seems like the optimists
link |
00:47:05.940
are the ones that create our future.
link |
00:47:10.220
They're the ones that build.
link |
00:47:11.860
In order to build the crazy thing,
link |
00:47:13.500
you have to be optimistic.
link |
00:47:14.700
You have to be either stupid or excited or passionate
link |
00:47:19.180
or mad enough to actually believe that it can be built.
link |
00:47:22.660
And those are the people that built it.
link |
00:47:24.140
My favorite quote of all time is from Star Wars Episode 8,
link |
00:47:28.980
which I know everyone hates.
link |
00:47:30.900
Do you like Star Wars Episode 8?
link |
00:47:32.380
No, I probably would say I would probably hate it.
link |
00:47:35.340
Yeah, I don't have strong feelings about it.
link |
00:47:38.860
Let me backtrack.
link |
00:47:39.700
I don't have strong feelings about Star Wars.
link |
00:47:41.540
I just want to cut.
link |
00:47:42.380
I'm a Tolkien person.
link |
00:47:43.220
I'm more into dragons and orcs and ogres and.
link |
00:47:47.860
Yeah, I mean, Tolkien forever.
link |
00:47:49.620
I really want to have one more son and call him,
link |
00:47:51.780
I thought Tau Tekno Tolkien would be cool.
link |
00:47:55.260
It's a lot of teas, I like it.
link |
00:47:56.940
Yeah, and Tau is 6.28, two pi.
link |
00:47:59.260
Yeah, Tau took, yeah.
link |
00:48:01.740
And then Tekno is obviously the best genre of music,
link |
00:48:04.780
but also like technocracy.
link |
00:48:06.220
It just sounds really good.
link |
00:48:07.140
Yeah, that's right, and Tekno Tolkien, Tau Tekno Tolkien.
link |
00:48:11.180
Tau Tekno Tolkien, but Star Wars Episode 8,
link |
00:48:15.900
I know a lot of people have issues with it personally.
link |
00:48:18.020
For the record, I think it's the best Star Wars film.
link |
00:48:24.020
You're starting trouble today.
link |
00:48:25.060
Yeah.
link |
00:48:25.580
So what?
link |
00:48:26.300
But don't kill what you hate, save what you love.
link |
00:48:29.180
Don't kill what you hate.
link |
00:48:30.380
Don't kill what you hate, save what you love.
link |
00:48:32.220
And I think we're in society right now,
link |
00:48:34.860
we're in a diagnosis mode.
link |
00:48:36.500
We're just diagnosing and diagnosing and diagnosing.
link |
00:48:39.140
And we're trying to kill what we hate,
link |
00:48:41.580
and we're not trying to save what we love enough.
link |
00:48:44.140
And there's this Buckminster Fuller quote,
link |
00:48:46.180
which I'm going to butcher
link |
00:48:47.060
because I don't remember it correctly,
link |
00:48:48.140
but it's something along the lines of,
link |
00:48:52.500
don't try to destroy the old bad models,
link |
00:48:58.260
render them obsolete with better models.
link |
00:49:03.020
Maybe we don't need to destroy the oil industry.
link |
00:49:05.540
Maybe we just create a great new battery technology
link |
00:49:08.900
and sustainable transport and just make it economically
link |
00:49:12.380
unreasonable to still continue to rely on fossil fuels.
link |
00:49:17.060
It's like, don't kill what you hate, save what you love.
link |
00:49:19.980
Make new things and just render the old things unusable.
link |
00:49:24.420
It's like, if the college debt is so bad
link |
00:49:28.980
and universities are so expensive,
link |
00:49:31.460
I feel like education is becoming obsolete.
link |
00:49:35.660
I feel like we could completely revolutionize education
link |
00:49:38.460
and we could make it free.
link |
00:49:39.340
And it's like, you look at JSTOR
link |
00:49:40.380
and you have to pay to get all the studies and everything.
link |
00:49:43.460
What if we created a DAO that bought JSTOR
link |
00:49:46.500
or we created a DAO that was funding studies
link |
00:49:48.460
and those studies were free for everyone?
link |
00:49:51.860
And what if we just open sourced education
link |
00:49:55.060
and decentralized education and made it free?
link |
00:49:56.980
And all research was on the internet
link |
00:50:00.860
and all the outcomes of studies are on the internet
link |
00:50:05.180
and no one has student debt.
link |
00:50:10.020
And you just take tests when you apply for a job
link |
00:50:14.180
and if you're qualified, then you can work there.
link |
00:50:18.060
This is just like, I don't know how anything works.
link |
00:50:20.460
I'm just randomly ranting, but...
link |
00:50:22.700
I like the humility.
link |
00:50:24.900
You gotta think from just basic first principles.
link |
00:50:27.580
Like what is the problem?
link |
00:50:28.900
What's broken?
link |
00:50:29.740
What are some ideas?
link |
00:50:30.740
That's it.
link |
00:50:31.580
And get excited about those ideas
link |
00:50:33.100
and share your excitement and don't tear each other down.
link |
00:50:36.260
I'm like...
link |
00:50:37.100
It's just when you kill things,
link |
00:50:38.140
you often end up killing yourself.
link |
00:50:40.100
Like war is not a one sided thing,
link |
00:50:43.340
like you're not gonna go in and just kill them.
link |
00:50:45.020
Like you're gonna get stabbed.
link |
00:50:46.860
It's like, and I think that when I talk about
link |
00:50:49.460
this nexus point of that we're in this point in society
link |
00:50:53.260
where we're switching to intelligent design,
link |
00:50:55.100
I think part of our switch to intelligent design
link |
00:50:57.180
is that we need to choose nonviolence
link |
00:50:59.460
and we need to...
link |
00:51:00.780
I think we can choose to start...
link |
00:51:04.340
I don't think we can eradicate violence from our species
link |
00:51:07.420
because I think we need it a little bit,
link |
00:51:10.580
but I think we can choose to really reorient
link |
00:51:13.260
our primitive brains that are fighting over scarcity
link |
00:51:16.340
and that are so attack oriented
link |
00:51:20.660
and move into it.
link |
00:51:22.060
We can optimize for creativity and building.
link |
00:51:25.460
Yeah, it's interesting to think how that happens.
link |
00:51:27.260
So some of it is just education.
link |
00:51:29.580
Some of it is living life and introspecting your own mind
link |
00:51:34.100
and trying to live up to the better angels of your nature
link |
00:51:37.780
for each one of us, all those kinds of things at scale.
link |
00:51:41.820
That's how we can sort of start to minimize
link |
00:51:44.580
the amount of destructive war in our world.
link |
00:51:48.180
And that's, to me, I'd probably hear the same.
link |
00:51:51.100
Technology is a really promising way to do that.
link |
00:51:55.220
Like social media should be a really promising way to do that.
link |
00:51:58.260
It's a way to reconnect.
link |
00:52:00.780
For the most part, I really enjoy social media.
link |
00:52:03.260
I just know all the negative stuff.
link |
00:52:05.220
I don't engage with any of the negative stuff.
link |
00:52:07.540
Just not even by blocking or any of that kind of stuff,
link |
00:52:11.380
but just not letting it enter my mind.
link |
00:52:16.340
When somebody says something negative, I see it.
link |
00:52:20.180
I immediately think positive thoughts about them
link |
00:52:23.260
and I just forget they exist after that.
link |
00:52:26.300
Just move on, because that negative energy,
link |
00:52:28.700
if I return the negative energy, they're
link |
00:52:31.940
going to get excited in a negative way right back.
link |
00:52:34.300
And it's just this kind of vicious cycle.
link |
00:52:38.460
But you would think technology would assist us
link |
00:52:40.580
in this process of letting go, of not taking things personally,
link |
00:52:44.780
of not engaging in the negativity.
link |
00:52:46.380
But unfortunately, social media profits from the negativity.
link |
00:52:50.260
So the current models.
link |
00:52:52.060
I mean, social media is like a gun.
link |
00:52:53.780
You should take a course before you use it.
link |
00:52:57.620
It's like, this is what I mean, when
link |
00:52:59.740
I say reprogram the human computer.
link |
00:53:01.420
In school, you should learn about how social media optimizes
link |
00:53:05.020
to raise your cortisol levels and make you angry and crazy
link |
00:53:08.540
and stressed.
link |
00:53:09.140
And you should learn how to have hygiene about how
link |
00:53:13.020
you use social media.
link |
00:53:16.660
But so you can choose not to focus on the negative stuff.
link |
00:53:19.900
But I'm not sure social media should exist.
link |
00:53:24.620
I guess it should exist.
link |
00:53:25.740
I'm not sure.
link |
00:53:27.180
I mean, we're in the messy, it's the experimental phase.
link |
00:53:29.420
We're working it out.
link |
00:53:29.980
Yeah, it's the early days.
link |
00:53:31.100
I don't even know when you say social media.
link |
00:53:32.700
I don't know what that even means.
link |
00:53:33.820
We're in the very early days.
link |
00:53:35.060
I think social media is just basic human connection
link |
00:53:37.580
in the digital realm.
link |
00:53:39.300
And that, I think it should exist.
link |
00:53:41.900
But there's so many ways to do it in a bad way.
link |
00:53:43.940
There's so many ways to do it in a good way.
link |
00:53:45.940
There's all discussions of all the same human rights
link |
00:53:48.100
to talk about freedom of speech, to talk about violence
link |
00:53:52.300
in the space of digital media.
link |
00:53:54.700
We talk about hate speech.
link |
00:53:56.180
We talk about all these things that we
link |
00:53:57.860
had to figure out back in the day in the physical space.
link |
00:54:01.300
We're now figuring out in the digital space.
link |
00:54:03.660
And it's like baby stages.
link |
00:54:06.540
When the printing press came out,
link |
00:54:07.860
it was like pure chaos for a minute.
link |
00:54:10.180
It's like when you inject, when there's
link |
00:54:12.660
a massive information injection into the general population,
link |
00:54:18.460
there's just going to be like, I feel like the printing
link |
00:54:21.260
press, I don't have the years, but it was like printing
link |
00:54:24.060
press came out.
link |
00:54:25.020
Shit got really fucking bad for a minute,
link |
00:54:27.180
but then we got the enlightenment.
link |
00:54:29.220
And so it's like, I think we're in,
link |
00:54:31.020
this is like the second coming of the printing press.
link |
00:54:34.820
We're probably going to have some shitty times for a minute.
link |
00:54:38.060
And then we're going to have to recalibrate
link |
00:54:40.660
to have a better understanding of how we consume media
link |
00:54:44.660
and how we deliver media.
link |
00:54:47.940
Speaking of programming the human computer,
link |
00:54:50.940
you mentioned baby X.
link |
00:54:52.940
So there's this young consciousness coming to be.
link |
00:54:56.980
It came from a cell.
link |
00:54:59.460
Like that whole thing doesn't even make sense.
link |
00:55:01.140
It came from DNA.
link |
00:55:03.100
And then there's this baby computer
link |
00:55:04.620
that just like grows and grows and grows and grows.
link |
00:55:06.740
And now there's a conscious being with extremely impressive
link |
00:55:10.940
cognitive capabilities with, I don't know.
link |
00:55:13.740
Have you met him?
link |
00:55:14.620
Yes. Yeah. Yeah.
link |
00:55:15.700
He's actually really smart.
link |
00:55:16.980
He's really smart.
link |
00:55:17.820
Yeah.
link |
00:55:18.660
It's weird.
link |
00:55:19.900
Yeah. Baby, I don't, I haven't.
link |
00:55:22.300
I don't know a lot of other babies,
link |
00:55:23.460
but he seems to be smart.
link |
00:55:24.300
I don't know all the babies often,
link |
00:55:25.260
but this baby was very impressive.
link |
00:55:26.820
He does a lot of pranks and stuff.
link |
00:55:28.900
Oh, so he's like,
link |
00:55:29.740
like he'll like give you a treat and take it away
link |
00:55:31.700
and laugh and like stuff like that.
link |
00:55:33.580
So he's like a chess player.
link |
00:55:35.300
So here's a cognitive sort of,
link |
00:55:39.820
like there's a computer being programmed.
link |
00:55:41.620
So he's taking in the environment,
link |
00:55:43.140
interacting with a specific set of humans.
link |
00:55:45.900
How would you, first of all, what is it?
link |
00:55:48.900
What, let me ask,
link |
00:55:50.420
I want to ask how do you program this computer?
link |
00:55:53.220
And also, how do you make sense
link |
00:55:55.300
of that there's a conscious being right there
link |
00:55:58.100
that wasn't there before?
link |
00:55:59.260
It's given me a lot of crisis thoughts.
link |
00:56:01.460
I'm thinking really, I think that's part of the reason.
link |
00:56:03.700
It's like I'm struggling to focus on art and stuff right now
link |
00:56:07.180
cause baby X is becoming conscious.
link |
00:56:09.660
And like my, it's just reorienting my brain.
link |
00:56:12.620
Like my brain is suddenly totally shifting of like,
link |
00:56:14.740
oh shit, like the way we raise children.
link |
00:56:18.260
Like, I hate all the baby books and everything.
link |
00:56:22.020
I hate them.
link |
00:56:22.860
Like they're, oh, the art is so bad.
link |
00:56:24.340
And like all the stuff, everything about all the aesthetics.
link |
00:56:29.100
And like, I'm just like, ah, like this is so.
link |
00:56:32.340
The programming languages we're using
link |
00:56:35.060
to program these baby computers aren't good.
link |
00:56:37.260
Yeah, like I'm thinking,
link |
00:56:39.420
and not that I have like good answers or know what to do,
link |
00:56:42.700
but I'm just thinking really, really hard about it.
link |
00:56:46.900
We recently watched Totoro with him, Studio Ghibli.
link |
00:56:52.900
And it's just like a fantastic film.
link |
00:56:56.140
And he like responded to,
link |
00:56:57.540
I know you're not supposed to show baby screens too much,
link |
00:56:59.500
but like I think it's the most sort of like,
link |
00:57:04.300
I feel like it's the highest art baby content.
link |
00:57:06.980
Like it really speaks, there's almost no talking in it.
link |
00:57:12.060
It's really simple.
link |
00:57:13.100
Although all the dialogue is super, super, super simple.
link |
00:57:16.060
You know, and it's like a one to three year old
link |
00:57:19.660
can like really connect with it.
link |
00:57:21.060
Like it feels like it's almost aimed
link |
00:57:22.500
at like a one to three year old,
link |
00:57:24.620
but it's like great art and it's so imaginative
link |
00:57:27.660
and it's so beautiful.
link |
00:57:28.740
And like the first time I showed it to him,
link |
00:57:31.700
he was just like so invested in it.
link |
00:57:33.580
Unlike anything else I'd ever shown him.
link |
00:57:36.740
Like he was just like crying when they cried,
link |
00:57:38.580
laughing when they laughed,
link |
00:57:39.620
like just like having this roller coaster of like emotions.
link |
00:57:42.540
Like, and he learned a bunch of words.
link |
00:57:44.020
Like he was, and he started saying Totoro
link |
00:57:46.020
and started just saying all the stuff after watching Totoro
link |
00:57:49.860
and he wants to watch it all the time.
link |
00:57:52.060
And I was like, man, why isn't there an industry of this?
link |
00:57:55.460
Like why aren't our best artists focusing on making art
link |
00:58:00.020
like for the birth of consciousness?
link |
00:58:04.220
Like, and that's one of the things I've been thinking
link |
00:58:07.100
I really want to start doing, you know,
link |
00:58:08.580
I don't want to speak before I do things too much,
link |
00:58:10.420
but like I'm just like ages one to three.
link |
00:58:15.260
Like we should be putting so much effort into that.
link |
00:58:18.700
And the other thing about Totoro is it's like,
link |
00:58:21.060
it's like better for the environment
link |
00:58:22.420
because adults love Totoro.
link |
00:58:23.860
It's such good art that everyone loves it.
link |
00:58:25.580
Like I still have all my old Totoro merch
link |
00:58:27.380
from when I was a kid.
link |
00:58:28.660
Like I literally have the most ragged old Totoro merch.
link |
00:58:33.940
Like everybody loves it, everybody keeps it.
link |
00:58:35.780
It's like, why does the art we have for babies need to suck
link |
00:58:42.420
and then be not accessible to adults
link |
00:58:45.300
and then just be thrown out when, you know,
link |
00:58:49.100
they age out of it.
link |
00:58:50.300
Like it's like, I don't know, I don't have like a fully formed
link |
00:58:54.300
thought here, but this is just something I've been thinking
link |
00:58:55.900
about a lot is like, how do we, like,
link |
00:58:58.700
how do we have more Totoro esque content?
link |
00:59:01.220
Like how do we have more content like this
link |
00:59:02.540
that like is universal and everybody loves
link |
00:59:05.140
but is like really geared to an emerging consciousness?
link |
00:59:10.180
Emerging consciousness in the first like three years of life
link |
00:59:13.140
when so much turmoil, so much evolution of mind is happening.
link |
00:59:16.580
It seems like a crucial time.
link |
00:59:18.060
Would you say, to make it not suck,
link |
00:59:21.820
do you think of basically treating a child
link |
00:59:26.620
like they have the capacity to have the brilliance
link |
00:59:29.020
of an adult or even beyond that?
link |
00:59:30.780
Is that how you think of that mind?
link |
00:59:33.460
No, cause they still, they like it
link |
00:59:35.780
when you talk weird and stuff.
link |
00:59:37.940
Like they respond better to,
link |
00:59:39.700
cause even they can imitate better when your voice is higher.
link |
00:59:42.140
Like people say like, don't do baby talk,
link |
00:59:44.060
but it's like when your voice is higher,
link |
00:59:45.420
it's closer to something they can imitate.
link |
00:59:47.380
So they like, like the baby talk actually kind of works.
link |
00:59:50.540
Like it helps them learn to communicate.
link |
00:59:52.140
I've found it to be more effective
link |
00:59:53.380
with learning words and stuff.
link |
00:59:55.340
But like you're not speaking,
link |
00:59:58.380
I'm not like speaking down to them.
link |
01:00:00.020
Like do you, do, do they have the capacity
link |
01:00:03.180
to understand really difficult concepts
link |
01:00:05.540
in just a very different way,
link |
01:00:07.740
like an emotional intelligence about something deep within.
link |
01:00:11.540
Oh yeah.
link |
01:00:12.380
No, like if X hurts, like if X bites me really hard
link |
01:00:14.180
and I'm like, ow, he gets, he's sad.
link |
01:00:17.340
He's like sad if he hurts me by accident.
link |
01:00:19.780
Yeah.
link |
01:00:20.620
Which he's huge, so he hurts me a lot by accident.
link |
01:00:23.860
Yeah, that's so interesting
link |
01:00:25.020
that that mind emerges and he and children
link |
01:00:29.140
don't really have a memory of that time.
link |
01:00:31.140
So we can't even have a conversation with them about it.
link |
01:00:32.980
Yeah, thank God they don't have a memory of this time
link |
01:00:34.940
because like think about like,
link |
01:00:37.380
I mean with our youngest baby, like it's like,
link |
01:00:40.700
I'm like, have you read the sci fi short story?
link |
01:00:44.220
I Have No Mouth, and I Must Scream?
link |
01:00:46.700
The title, no.
link |
01:00:47.980
Oh man.
link |
01:00:48.860
I mean, you should read that.
link |
01:00:50.380
I Have No Mouth, and I Must Scream.
link |
01:00:53.180
I hate getting into this Roko's Basilisk shit.
link |
01:00:55.460
It's kind of a story about like
link |
01:01:00.580
an AI that's like torturing someone for eternity
link |
01:01:03.820
and they have like no body.
link |
01:01:05.540
The way they describe it, it sort of sounds
link |
01:01:08.340
like what it feels like, like being a baby,
link |
01:01:10.140
like you're conscious and you're just getting inputs
link |
01:01:12.740
from everywhere and you have no muscles
link |
01:01:14.500
and you're like jelly and you like can't move
link |
01:01:16.260
and you try to like communicate,
link |
01:01:17.580
but you can't communicate and like you're just like
link |
01:01:20.580
in this like hell state.
link |
01:01:22.620
I think it's good, we can't remember that.
link |
01:01:25.220
Like my little baby is just exiting that,
link |
01:01:27.660
like she's starting to like get muscles
link |
01:01:29.140
and have more like autonomy,
link |
01:01:30.740
but like watching her go through the opening phase,
link |
01:01:34.180
I was like, I was like, this does not seem good.
link |
01:01:37.740
Oh, you think it's kind of like.
link |
01:01:39.220
Like I think it sucks.
link |
01:01:40.340
I think it might be really violent.
link |
01:01:41.780
Like violent, mentally violent, psychologically violent.
link |
01:01:44.740
Consciousness emerging, I think is a very violent thing.
link |
01:01:47.460
Never thought about that.
link |
01:01:48.300
I think it's possible
link |
01:01:49.140
that we all carry quite a bit of trauma from it
link |
01:01:51.340
that we don't, I think that would be a good thing to study
link |
01:01:54.420
because I think if, I think addressing that trauma,
link |
01:01:58.580
like I think that might be.
link |
01:01:59.780
Oh, you mean like echoes of it
link |
01:02:00.900
are still there in the shadow of someone?
link |
01:02:02.220
I think it's gotta be, I feel this help,
link |
01:02:04.260
the helplessness, the like existential dread,
link |
01:02:06.980
and that like fear of being in like an unknown place,
link |
01:02:10.580
bombarded with inputs and being completely helpless.
link |
01:02:13.500
Like that's gotta be somewhere deep in your brain
link |
01:02:15.700
and that can't be good for you.
link |
01:02:17.420
What do you think consciousness is?
link |
01:02:19.500
This whole conversation has impossibly difficult questions.
link |
01:02:22.500
What, what do you think it is?
link |
01:02:23.700
It's so hard.
link |
01:02:28.500
Yeah, we talked about music for like two minutes.
link |
01:02:30.700
All right.
link |
01:02:31.540
No, I'm just over music.
link |
01:02:32.700
I'm over music.
link |
01:02:33.540
Yeah, I still like it.
link |
01:02:35.420
It has its purpose.
link |
01:02:36.260
No, I love music.
link |
01:02:37.100
I mean, music is the greatest thing ever.
link |
01:02:38.100
It's my favorite thing.
link |
01:02:38.940
But I just like, every interview is like,
link |
01:02:42.340
what is your process?
link |
01:02:43.620
Like, I don't know, I'm just done.
link |
01:02:44.740
I can't do anything.
link |
01:02:45.580
I do want to ask you about Ableton Live.
link |
01:02:46.940
I'll tell you about Ableton, because Ableton's sick.
link |
01:02:49.420
No one has ever asked about Ableton though.
link |
01:02:51.700
Yeah, well, because I just need tech support, maybe.
link |
01:02:53.940
I can help you, I can help you with your Ableton tech.
link |
01:02:56.620
Anyway, from Ableton back to consciousness,
link |
01:02:58.940
what do you, do you think this is a thing
link |
01:03:00.620
that only humans are capable of?
link |
01:03:03.300
Can robots be conscious?
link |
01:03:05.380
Can, like when you think about entities,
link |
01:03:08.220
you think there's aliens out there that are conscious,
link |
01:03:10.220
like is conscious, what is conscious?
link |
01:03:11.540
There's this Terence McKenna quote
link |
01:03:13.340
that I've found that I fucking love.
link |
01:03:15.900
Am I allowed to swear on here?
link |
01:03:17.540
Yes.
link |
01:03:18.380
Nature loves courage.
link |
01:03:21.140
You make the commitment and nature will respond
link |
01:03:23.460
to that commitment by removing impossible obstacles.
link |
01:03:26.580
Dream the impossible dream
link |
01:03:28.060
and the world will not grind you under.
link |
01:03:29.940
It will lift you up.
link |
01:03:31.140
This is the trick.
link |
01:03:32.380
This is what all these teachers and philosophers
link |
01:03:35.180
who really counted, who really touched the alchemical gold,
link |
01:03:38.420
this is what they understood.
link |
01:03:40.100
This is the shamanic dance in the waterfall.
link |
01:03:42.860
This is how magic is done,
link |
01:03:44.700
by hurling yourself into the abyss
link |
01:03:46.420
and discovering it's a featherbed.
link |
01:03:48.660
Yeah.
link |
01:03:49.940
And for this reason, I do think
link |
01:03:52.020
there are no technological limits.
link |
01:03:55.420
I think like what is already happening here,
link |
01:03:58.700
this is like impossible, this is insane.
link |
01:04:01.060
And we've done this in a very limited amount of time
link |
01:04:03.340
and we're accelerating the rate at which we're doing this.
link |
01:04:05.900
So I think digital consciousness, it's inevitable.
link |
01:04:10.220
And we may not be able to even understand what that means,
link |
01:04:13.300
but I like hurling yourself into the abyss.
link |
01:04:15.820
So we're surrounded by all this mystery
link |
01:04:17.500
and we just keep hurling ourselves into it,
link |
01:04:19.820
like fearlessly and keep discovering cool shit.
link |
01:04:22.980
Yeah, like I just think it's like,
link |
01:04:31.380
like who even knows if the laws of physics,
link |
01:04:33.020
the laws of physics are probably just the current,
link |
01:04:35.620
like as I was saying, speed of light
link |
01:04:36.620
is the current render rate.
link |
01:04:37.940
It's like, if we're in a simulation,
link |
01:04:40.220
they'll be able to upgrade that.
link |
01:04:41.260
Like I sort of suspect when we made the James Webb telescope,
link |
01:04:45.700
like part of the reason we made that
link |
01:04:46.820
is because we had an upgrade, you know?
link |
01:04:50.220
And so now more of space has been rendered,
link |
01:04:53.660
so we can see more of it now.
link |
01:04:56.780
Yeah, but I think humans are super, super, super limited
link |
01:04:59.620
cognitively, so I wonder if we'll be allowed to create
link |
01:05:04.620
more intelligent beings that can see more of the universe
link |
01:05:07.980
as the render rate is upgraded.
link |
01:05:11.020
Maybe we're cognitively limited.
link |
01:05:12.460
Everyone keeps talking about how we're cognitively limited
link |
01:05:15.100
and AI is gonna render us obsolete, but it's like,
link |
01:05:20.220
like this is not the same thing as like an amoeba
link |
01:05:23.940
becoming an alligator.
link |
01:05:25.820
Like it's like, if we create AI, again,
link |
01:05:28.460
that's intelligent design.
link |
01:05:29.620
That's literally all religions are based on gods
link |
01:05:32.980
that create consciousness.
link |
01:05:34.300
Like we are god making.
link |
01:05:35.540
Like what we are doing is incredibly profound
link |
01:05:37.700
and like even if we can't compute,
link |
01:05:41.380
even if we're so much worse than them,
link |
01:05:44.540
like just like unfathomably worse than like,
link |
01:05:49.260
you know, an omnipotent kind of AI,
link |
01:05:51.780
it's like we, I do not think
link |
01:05:54.380
that they would just think that we are stupid.
link |
01:05:56.220
I think that they would recognize the profundity
link |
01:05:58.220
of what we have accomplished.
link |
01:05:59.540
Are we the gods or are they the gods in our perspective?
link |
01:06:02.500
I mean, but I mean, we're kind of a god, I guess.
link |
01:06:05.260
It's complicated.
link |
01:06:06.100
It's complicated.
link |
01:06:07.420
Like we're good.
link |
01:06:08.260
But they would acknowledge the value.
link |
01:06:11.420
Well, I hope they acknowledge the value
link |
01:06:13.420
of paying respect to the creative ancestors.
link |
01:06:16.020
I think they would think it's cool.
link |
01:06:17.820
I think if curiosity is a trait
link |
01:06:23.820
that we can quantify and put into AI,
link |
01:06:29.220
then I think if AI are curious,
link |
01:06:31.620
then they will be curious about us
link |
01:06:33.580
and they will not be hateful or dismissive of us.
link |
01:06:37.620
They might, you know, see us as, I don't know,
link |
01:06:41.020
it's like I'm not like, oh, fuck these dogs.
link |
01:06:43.540
Let's kill all the dogs.
link |
01:06:45.180
I love dogs.
link |
01:06:46.260
Dogs have great utility.
link |
01:06:47.660
Dogs like provide a lot.
link |
01:06:49.020
We make friends with them.
link |
01:06:50.020
Yeah.
link |
01:06:50.860
We have a deep connection with them.
link |
01:06:53.460
We anthropomorphize them.
link |
01:06:55.340
Like we have a real love for dogs, for cats and so on.
link |
01:06:58.820
For some reason, even though they're intellectually
link |
01:07:00.620
much less than us.
link |
01:07:01.620
And I think there is something sacred about us
link |
01:07:04.020
because it's like, if you look at the universe,
link |
01:07:05.700
like the whole universe is like cold and dead
link |
01:07:09.020
and sort of robotic and it's like, you know, AI intelligence.
link |
01:07:15.660
You know, it's kind of more like the universe.
link |
01:07:19.020
It's like cold and, you know, logical
link |
01:07:24.660
and, you know, abiding by the laws of physics and whatever.
link |
01:07:28.780
But like, we're this like loosey goosey weird art thing
link |
01:07:32.940
that happened and I think it's beautiful.
link |
01:07:34.940
And like, I think even if we, I think one of the values,
link |
01:07:40.300
if consciousness is a thing that is most worth preserving,
link |
01:07:47.180
which I think is the case, I think consciousness,
link |
01:07:49.340
I think if there's any kind of like religious
link |
01:07:50.700
or spiritual thing, it should be that consciousness is sacred.
link |
01:07:55.580
Like then, you know, I still think even if AI render us
link |
01:08:02.020
obsolete and we climate change, it's too bad
link |
01:08:05.860
and we get hit by a comet and we don't become
link |
01:08:07.460
a multi planetary species fast enough.
link |
01:08:09.540
But like AI is able to populate the universe.
link |
01:08:12.300
Like I imagine, like if I was an AI,
link |
01:08:14.300
I would find more planets that are capable
link |
01:08:17.900
of hosting biological life forms and like recreate them.
link |
01:08:20.700
Cause we're fun to watch.
link |
01:08:21.820
Yeah, we're fun to watch.
link |
01:08:23.340
Yeah, but I do believe that AI can have
link |
01:08:25.660
some of the same magic of consciousness within it.
link |
01:08:29.940
Cause consciousness, we don't know what it is.
link |
01:08:31.460
So, you know, there's some kind of.
link |
01:08:33.060
Or it might be a different magic.
link |
01:08:34.180
It might be like a strange, a strange, a strange different.
link |
01:08:37.580
Right.
link |
01:08:38.420
Cause they're not going to have hormones.
link |
01:08:39.380
Like I feel like a lot of our magic is hormonal kind of.
link |
01:08:42.700
I don't know.
link |
01:08:43.540
I think some of our magic is the limitations,
link |
01:08:45.300
the constraints and within that, the hormones
link |
01:08:47.900
and all that kind of stuff, the finiteness of life.
link |
01:08:50.220
And then we get, given our limitations,
link |
01:08:52.820
we get to come up with creative solutions
link |
01:08:54.740
of how to dance around those limitations.
link |
01:08:56.780
We partner up like penguins against the cold.
link |
01:08:59.420
We've fallen in love and then love is ultimately
link |
01:09:03.340
some kind of allows us to delude ourselves
link |
01:09:06.060
that we're not mortal and finite
link |
01:09:08.540
and that life is not ultimately you live alone.
link |
01:09:11.620
You're born alone, you die alone.
link |
01:09:13.780
And then love is like a firm moment
link |
01:09:15.580
or for a long time, forgetting that.
link |
01:09:17.580
And so like we come up with all these creative hacks
link |
01:09:20.340
that make life like fascinatingly fun.
link |
01:09:25.980
Yeah, yeah, yeah, fun, yeah.
link |
01:09:27.740
And then AI might have different kinds of fun.
link |
01:09:30.260
Yes.
link |
01:09:31.260
And hopefully our fun intersects every once in a while.
link |
01:09:34.900
I think there would be a little intersection,
link |
01:09:37.580
there'd be a little intersection of the fun.
link |
01:09:39.300
Yeah, yeah.
link |
01:09:40.460
What do you think is the role of love
link |
01:09:42.860
in the human condition?
link |
01:09:45.500
Ice.
link |
01:09:46.500
Why, is it useful?
link |
01:09:47.620
Is it useful like a hack or is this a fundamental
link |
01:09:51.540
to what it means to be human, the capacity to love?
link |
01:09:54.940
I mean, I think love is the evolutionary mechanism
link |
01:09:58.100
that is like beginning the intelligent design.
link |
01:10:00.580
Like I was just reading about,
link |
01:10:04.140
do you know about Kropotkin?
link |
01:10:06.220
He's like an anarchist, like old Russian anarchist.
link |
01:10:08.940
I live next door to Michael Malice.
link |
01:10:11.540
I don't know if you know who that is, he's an anarchist.
link |
01:10:13.300
He's a modern day anarchist.
link |
01:10:14.540
Okay.
link |
01:10:15.380
Anarchists are fun.
link |
01:10:16.220
Getting into anarchism a little bit.
link |
01:10:17.940
This is probably, yeah, not a good route to be taking, but.
link |
01:10:22.380
Oh no, I think if you're,
link |
01:10:23.900
listen, you should expose yourself to ideas
link |
01:10:26.140
that there's no harm to thinking about ideas.
link |
01:10:28.500
I think anarchists challenge systems in interesting ways
link |
01:10:32.460
and they think in interesting ways
link |
01:10:34.180
that just is good for the soul.
link |
01:10:35.340
It's like refreshes your mental palate.
link |
01:10:37.300
I don't think we should actually,
link |
01:10:38.940
I wouldn't actually ascribe to it,
link |
01:10:40.620
but I've never actually gone deep on,
link |
01:10:42.260
on anarchy as a philosophy.
link |
01:10:43.500
So I'm doing.
link |
01:10:44.340
You should still think about it though.
link |
01:10:45.500
When you, when you listen,
link |
01:10:46.620
because I'm like reading about the Russian revolution a lot
link |
01:10:48.420
and it's like, there was like the Soviets and Lenin,
link |
01:10:51.060
all that, but then there was like Kropotkin
link |
01:10:52.380
and his like anarchist sect.
link |
01:10:53.820
And they were sort of interesting
link |
01:10:54.980
cause he was kind of a technocrat actually.
link |
01:10:57.100
Like he was like, you know, like women can be more equal
link |
01:10:59.940
if we have appliances.
link |
01:11:01.180
Like he was like really into like, you know,
link |
01:11:04.140
using technology to like reduce
link |
01:11:05.940
the amount of work people had to do.
link |
01:11:07.900
But so Kropotkin was like a biologist or something.
link |
01:11:11.700
Like he studied animals.
link |
01:11:13.140
And he was really at the time, like,
link |
01:11:17.380
I think it's Nature magazine.
link |
01:11:20.820
I think it might have even started as like a Russian magazine,
link |
01:11:22.860
but he was like publishing studies.
link |
01:11:23.900
Like everyone was really into like Darwinism at the time
link |
01:11:26.220
and like survival of the fittest
link |
01:11:27.260
and like war is like the mechanism
link |
01:11:29.380
by which we become better.
link |
01:11:30.540
And it was like this real kind of like,
link |
01:11:33.380
like cementing this idea in society
link |
01:11:36.700
that like violence, you know, kill the weak.
link |
01:11:40.100
And like that's how we become better.
link |
01:11:41.540
And then Kropotkin was kind of interesting
link |
01:11:43.220
cause he was looking at instances.
link |
01:11:45.740
He was finding all these instances in nature
link |
01:11:47.580
where animals were like helping each other and stuff.
link |
01:11:49.700
And he was like, you know, actually love
link |
01:11:51.940
is a survival mechanism.
link |
01:11:53.700
Like there's so many instances in the animal kingdom
link |
01:11:58.780
where like cooperation and, you know,
link |
01:12:01.500
like helping weaker creatures and all this stuff
link |
01:12:03.940
is actually an evolutionary mechanism.
link |
01:12:06.260
I mean, you even look at child rearing.
link |
01:12:08.180
Like child rearing is like immense amounts
link |
01:12:12.540
of just love and goodwill.
link |
01:12:14.580
And just like there's no immediate,
link |
01:12:19.260
you know, you're not getting any immediate feedback
link |
01:12:22.180
of like winning, it's not competitive.
link |
01:12:26.380
It's literally, you know, it's like we actually use love
link |
01:12:28.700
as an evolutionary mechanism just as much as we use war.
link |
01:12:31.180
And I think we've been like missing the other part
link |
01:12:34.220
and we've reoriented, we've culturally reoriented
link |
01:12:37.460
like science and philosophy has reoriented itself
link |
01:12:41.980
around Darwinism a little bit too much
link |
01:12:44.140
and the Kropotkin model, I think is equally valid.
link |
01:12:48.700
Like it's like cooperation and love and stuff
link |
01:12:54.540
is just as essential for a species survival and evolution.
link |
01:12:59.060
It'd be a more powerful survival mechanism
link |
01:13:01.580
in the context of evolution.
link |
01:13:02.900
And it comes back to like, you know,
link |
01:13:04.700
we think engineering is so much more important
link |
01:13:06.860
than motherhood, but it's like,
link |
01:13:09.060
if you lose the motherhood, the engineering means nothing.
link |
01:13:10.940
We have no more humans.
link |
01:13:11.860
Like it's like, you know, it's like we,
link |
01:13:15.700
I think our society should, the survival of the fittest,
link |
01:13:19.900
the way we see, we conceptualize evolution
link |
01:13:23.740
should really change to also include this idea, I guess.
link |
01:13:27.020
Yeah, there's some weird thing that seems irrational
link |
01:13:31.620
that is also core to what it means to be human.
link |
01:13:37.260
So love is one such thing.
link |
01:13:40.220
It could make you do a lot of irrational things,
link |
01:13:43.020
but that depth of connection and that loyalty
link |
01:13:46.140
is a powerful thing.
link |
01:13:47.300
Are they irrational or are they rational?
link |
01:13:49.300
Like, it's like, it's like, it's, you know, maybe
link |
01:13:56.380
losing out on some things in order to like
link |
01:13:59.500
keep your family together or in order,
link |
01:14:01.860
like it's like, what are our actual values?
link |
01:14:06.260
Well, right.
link |
01:14:07.100
I mean, the irrational thing is,
link |
01:14:08.820
if you have a cold economist perspective,
link |
01:14:11.340
you know, motherhood or sacrificing your career for love,
link |
01:14:16.100
you know, in terms of salary,
link |
01:14:18.300
in terms of economic wellbeing,
link |
01:14:20.620
in terms of flourishing of you as a human being,
link |
01:14:22.740
that could be seen on some kind of metrics
link |
01:14:25.900
as a irrational decision, a suboptimal decision,
link |
01:14:28.580
but there's the manifestation of love
link |
01:14:34.180
could be the optimal thing to do.
link |
01:14:36.780
There's a kind of saying, save one life, save the world.
link |
01:14:41.140
There's a thing that doctors often face, which is like...
link |
01:14:44.100
Well, it's considered irrational
link |
01:14:45.140
because the profit model doesn't include social good.
link |
01:14:47.460
Yes, yeah.
link |
01:14:48.620
So if a profit model includes social good,
link |
01:14:50.420
then suddenly these would be rational decisions.
link |
01:14:52.180
It might be difficult to, you know,
link |
01:14:54.420
it requires a shift in our thinking about profit
link |
01:14:57.620
and might be difficult to measure social good.
link |
01:15:01.020
Yes.
link |
01:15:02.340
But we're learning to measure a lot of things.
link |
01:15:04.500
Yeah, digitizing a lot of things.
link |
01:15:05.620
Where we're actually, you know, quantifying vision
link |
01:15:10.060
and stuff, like we're like, you know,
link |
01:15:13.420
like you go on Facebook and they can,
link |
01:15:15.380
like Facebook can pretty much predict our behaviors.
link |
01:15:17.700
Like we're a surprising amount of things
link |
01:15:20.660
that seem like mysterious consciousness soul things
link |
01:15:25.660
have been quantified at this point.
link |
01:15:27.380
So surely we can quantify these other things.
link |
01:15:29.500
Yeah.
link |
01:15:31.220
But as more and more of us are moving into the digital space,
link |
01:15:33.980
I wanted to ask you about something from a fan perspective.
link |
01:15:36.860
I kind of, you know, you as a musician,
link |
01:15:41.100
you as an online personality.
link |
01:15:43.300
It seems like you have all these identities
link |
01:15:45.220
and you play with them.
link |
01:15:48.700
One of the cool things about the internet,
link |
01:15:50.900
it seems like you can play with identities.
link |
01:15:53.100
So as we move into the digital world more and more,
link |
01:15:55.900
maybe even in the, in the so called metaverse.
link |
01:15:58.940
I mean, I love the metaverse and I love the idea,
link |
01:16:01.020
but like the way this has all played out didn't,
link |
01:16:09.060
didn't go well and people are mad about it.
link |
01:16:10.860
And I think, I think we need to like.
link |
01:16:12.220
I think that's temporary.
link |
01:16:13.300
I think it's temporary.
link |
01:16:14.300
Just like, you know how all the celebrities got together
link |
01:16:16.580
and sang the song, Imagine by John Lennon
link |
01:16:18.900
and everyone started hating the song, Imagine.
link |
01:16:20.780
I'm hoping that's temporary
link |
01:16:21.900
because it's a damn good song.
link |
01:16:24.220
So I think it's just temporary.
link |
01:16:25.340
Like once you actually have virtual worlds,
link |
01:16:27.660
whatever they're called, metaverse or otherwise,
link |
01:16:29.820
it becomes, I don't know.
link |
01:16:31.260
Well, we do have virtual worlds.
link |
01:16:32.300
Like video games, Elden Ring, have you played Elden Ring?
link |
01:16:35.180
You have played Elden Ring?
link |
01:16:36.020
I'm really afraid of playing that game.
link |
01:16:38.420
Literally, I mean.
link |
01:16:39.260
It looks way too fun.
link |
01:16:40.580
It looks, it looks I would want to go there
link |
01:16:43.500
and stay there forever.
link |
01:16:44.860
It's, yeah, so fun.
link |
01:16:46.860
It's so, it's so nice.
link |
01:16:48.860
Oh man.
link |
01:16:51.700
Yeah, so that, that's, yeah, that's a metaverse.
link |
01:16:54.500
That's a metaverse, but you're not really,
link |
01:16:57.220
is how immersive is it?
link |
01:16:59.260
In the sense that, is the three dimensional,
link |
01:17:03.980
like, virtual reality integration necessary?
link |
01:17:06.100
Can we really just take our, close our eyes
link |
01:17:08.820
and kind of plug in in the 2D screen
link |
01:17:13.100
and become that other being for a time
link |
01:17:15.820
and really enjoy that journey that we take.
link |
01:17:17.980
And we almost become that.
link |
01:17:19.700
You're no longer, see, I'm no longer Lex,
link |
01:17:22.100
you're that creature, whatever,
link |
01:17:23.700
whatever the hell it is in that game.
link |
01:17:25.380
Yeah, that is that.
link |
01:17:26.220
I mean, that's why I love those video games.
link |
01:17:28.460
It's, I really do become those people for a time.
link |
01:17:33.140
But like, it seems like with the idea of the metaverse
link |
01:17:36.180
or the idea of the digital space,
link |
01:17:37.940
well, even on Twitter, you get a chance
link |
01:17:40.260
to be somebody for prolonged periods of time,
link |
01:17:42.620
like across a lifespan.
link |
01:17:44.540
You know, you have a Twitter account for years, for decades
link |
01:17:47.660
and you're that person.
link |
01:17:48.500
I don't know if that's a good thing.
link |
01:17:49.700
I feel very tormented by it.
link |
01:17:52.700
By Twitter specifically,
link |
01:17:54.340
by social media representation of you.
link |
01:17:56.300
I feel like the public perception of me
link |
01:17:59.180
has gotten so distorted that I find it kind of disturbing.
link |
01:18:04.500
It's one of the things that's disincentivizing me
link |
01:18:06.380
from like wanting to keep making art
link |
01:18:07.980
because I'm just like, I've completely lost control
link |
01:18:13.100
of the narrative and the narrative is,
link |
01:18:15.540
some of it is my own stupidity,
link |
01:18:16.900
but a lot, like some of it has just been like hijacked
link |
01:18:19.300
by forces far beyond my control.
link |
01:18:22.940
You know, I kind of got in over my head in things.
link |
01:18:25.580
Like I'm just a random indie musician,
link |
01:18:27.300
but I just got like dragged into like geopolitical matters
link |
01:18:31.940
and like financial, like the stock market and shit.
link |
01:18:35.100
And so it's just like, it's just,
link |
01:18:36.340
there are very powerful people who have,
link |
01:18:38.220
who have at various points in time
link |
01:18:39.700
had very vested interest in making me seem insane.
link |
01:18:43.740
And I can't fucking fight that.
link |
01:18:45.660
And I just like, you know,
link |
01:18:48.860
people really want their celebrity figures
link |
01:18:50.820
to like be consistent and stay the same.
link |
01:18:53.860
And like people have a lot of like emotional investment
link |
01:18:55.860
in certain things.
link |
01:18:56.700
And like, first of all, like I'm like artificially
link |
01:19:01.900
more famous than I should be.
link |
01:19:03.700
Isn't everybody who's famous artificially famous?
link |
01:19:06.620
No, but like, I should be like a weird niche indie thing.
link |
01:19:11.340
And I make pretty challenging,
link |
01:19:13.420
I do challenging weird fucking shit a lot.
link |
01:19:16.300
And I accidentally by proxy got like,
link |
01:19:21.860
foisted into sort of like weird celebrity culture.
link |
01:19:24.460
But like, I cannot be media trained.
link |
01:19:27.340
They have put me through so many hours of media training.
link |
01:19:29.780
I would love to see, I'd love to be a fly on that wall.
link |
01:19:32.540
I can't do, like, well, and I do,
link |
01:19:34.420
I try so hard and I like learn the same.
link |
01:19:36.540
And I like got it.
link |
01:19:37.380
And I'm like, I got it.
link |
01:19:38.220
I got it, I got it.
link |
01:19:39.040
But I just can't stop saying,
link |
01:19:40.540
like my mouth just says things.
link |
01:19:42.220
I like, and it's just like, and I just do,
link |
01:19:44.500
I just do things, I just do crazy things.
link |
01:19:46.020
I mean, it's like, I'm, I just,
link |
01:19:48.900
I need to do crazy things.
link |
01:19:50.740
And it's just, I should not be,
link |
01:19:53.100
it's too jarring for people and the contradictory stuff.
link |
01:19:59.700
And then all the, by association, like, you know,
link |
01:20:05.220
it's like, I'm in a very weird position
link |
01:20:07.660
and my public image that the avatar of me is now
link |
01:20:13.540
this totally crazy thing that is so lost from my control.
link |
01:20:16.540
So you feel the burden of the avatar having to be static.
link |
01:20:19.140
So the avatar on Twitter, the avatar on Instagram,
link |
01:20:22.300
on these social platforms is as a burden.
link |
01:20:26.740
It becomes like, cause it, like people don't want to accept
link |
01:20:30.300
a changing avatar, a chaotic avatar.
link |
01:20:32.580
An avatar that does stupid shit sometimes.
link |
01:20:34.820
They think the avatar is morally wrong
link |
01:20:36.480
or they think the avatar, and maybe it has been,
link |
01:20:39.580
and like, I questioned it all the time.
link |
01:20:41.140
Like, I'm like, I don't know if everyone's right
link |
01:20:46.340
and I'm wrong, I don't, like, but you know,
link |
01:20:49.740
a lot of times people ascribe intentions to things,
link |
01:20:51.780
the worst possible intentions.
link |
01:20:53.700
At this point, people think I'm, you know.
link |
01:20:57.340
But we're just fine.
link |
01:20:58.180
All kinds of words, yes.
link |
01:20:59.180
Yes, and it's fine.
link |
01:21:00.500
I'm not complaining about it, but I'm just,
link |
01:21:02.860
it's a curiosity to me that we live
link |
01:21:07.020
these double, triple, quadruple lives
link |
01:21:08.780
and I have this other life that is like,
link |
01:21:12.480
more people know my other life than my real life,
link |
01:21:15.140
which is interesting.
link |
01:21:16.380
Probably, I mean, you too, I guess.
link |
01:21:18.220
Yeah, but I have the luxury.
link |
01:21:20.140
So we have all different, we don't,
link |
01:21:23.140
like, I don't know what I'm doing.
link |
01:21:25.180
There is an avatar and you're mediating
link |
01:21:27.860
who you are through that avatar.
link |
01:21:29.980
I have the nice luxury, not the luxury,
link |
01:21:34.380
maybe by intention of not trying really hard
link |
01:21:38.100
to make sure there's no difference
link |
01:21:40.100
between the avatar and the private person.
link |
01:21:44.060
Do you wear a suit all the time?
link |
01:21:45.660
Yeah, but not all the time.
link |
01:21:48.660
Like recently because I get recognized a lot,
link |
01:21:51.580
I have to not wear the suit to hide.
link |
01:21:53.460
I'm such an introvert, I'm such a social anxiety
link |
01:21:55.860
and all that kind of stuff to hide away.
link |
01:21:57.700
I love wearing a suit because it makes me feel
link |
01:22:00.620
like I'm taking the moment seriously.
link |
01:22:02.420
Like I'm, I don't know.
link |
01:22:04.380
It makes me feel like a weirdo in the best possible way.
link |
01:22:06.980
Your suits feel great.
link |
01:22:07.820
Every time I wear a suit,
link |
01:22:08.660
I'm like, I don't know why I'm not doing this more.
link |
01:22:10.700
In fashion in general, if you're doing it for yourself,
link |
01:22:15.380
I don't know, it's a really awesome thing.
link |
01:22:18.740
But yeah, I think there is definitely a painful way
link |
01:22:24.720
to use social media and an empowering way
link |
01:22:27.460
and I don't know if any of us know which is which.
link |
01:22:32.700
So we're trying to figure that out.
link |
01:22:34.020
Some people, I think Doja Cat is incredible at it.
link |
01:22:36.860
Incredible, like just masterful.
link |
01:22:39.620
I don't know if you like follow that.
link |
01:22:41.980
So okay, so not taking anything seriously,
link |
01:22:44.860
joking, absurd, humor, that kind of thing.
link |
01:22:47.540
I think Doja Cat might be like the greatest
link |
01:22:49.940
living comedian right now.
link |
01:22:52.180
Like I'm more entertained by Doja Cat
link |
01:22:53.660
than actual comedians.
link |
01:22:56.300
Like she's really fucking funny on the internet.
link |
01:22:58.980
She's just great at social media.
link |
01:23:00.300
It's just, you know, her media.
link |
01:23:02.300
Yeah, the nature of humor,
link |
01:23:03.300
like human on social media is also a beautiful thing,
link |
01:23:08.020
the absurdity.
link |
01:23:08.980
The absurdity and memes,
link |
01:23:10.300
like I just want to like take a moment.
link |
01:23:12.740
I love, like when we're talking about art
link |
01:23:14.620
and credit and authenticity,
link |
01:23:16.660
I love that there's this,
link |
01:23:18.220
I mean, now memes are like, they're no longer,
link |
01:23:21.860
like memes aren't like new,
link |
01:23:23.660
but it's still this emergent art form
link |
01:23:25.620
that is completely egoless and anonymous.
link |
01:23:27.740
And we just don't know who made any of it.
link |
01:23:29.500
And it's like the forefront of comedy
link |
01:23:32.580
and it's just totally anonymous.
link |
01:23:35.420
And it just feels really beautiful.
link |
01:23:36.740
It just feels like this beautiful collect,
link |
01:23:39.060
like collective human art project
link |
01:23:43.300
that's like this like decentralized comedy thing
link |
01:23:46.460
that just, like, memes add so much to my day
link |
01:23:48.900
and many people's days.
link |
01:23:49.900
And it's just like, I don't know.
link |
01:23:52.180
I don't think people ever,
link |
01:23:54.900
I don't think we stop enough
link |
01:23:56.060
and just appreciate how sick it is that memes exist.
link |
01:23:59.500
And cause also making a whole brand new art form
link |
01:24:02.420
in like the modern era that like didn't exist before.
link |
01:24:07.060
Like, I mean, they sort of existed,
link |
01:24:08.500
but the way that they exist now as like this like,
link |
01:24:11.860
you know, like me and my friends,
link |
01:24:13.260
like we joke that we go like mining for memes
link |
01:24:16.340
or farming for memes,
link |
01:24:17.980
like a video game and like meme dealers and like whatever.
link |
01:24:21.100
Like it's, you know, it's this whole,
link |
01:24:22.740
memes are this whole like new comedic language.
link |
01:24:27.940
Well, it's this art form.
link |
01:24:29.300
The interesting thing about it is that lame people
link |
01:24:33.540
seem to not be good at memes.
link |
01:24:35.380
Like corporate can't infiltrate memes.
link |
01:24:38.220
Yeah, they really can't.
link |
01:24:39.860
They try, they could try, but it's like, it's weird.
link |
01:24:42.540
Cause like
link |
01:24:43.380
They try so hard and every once in a while I'm like fine.
link |
01:24:46.420
Like you got a good one.
link |
01:24:48.700
I think I've seen like one or two good ones,
link |
01:24:51.420
but like, yeah, they really can't.
link |
01:24:53.340
Cause like even corporate is infiltrating web three.
link |
01:24:55.500
It's making me really sad,
link |
01:24:57.140
but they can't infiltrate the memes.
link |
01:24:58.660
And I think there's something really beautiful about that.
link |
01:25:00.060
And that gives power.
link |
01:25:00.980
That's why Dogecoin is powerful.
link |
01:25:03.580
It's like, all right.
link |
01:25:05.340
FU to sort of anybody who's trying to centralize
link |
01:25:08.820
trying to control it, the rich people
link |
01:25:10.420
that are trying to roll in and control this,
link |
01:25:12.700
control the narrative.
link |
01:25:14.220
Wow.
link |
01:25:15.060
I hadn't thought about that, but.
link |
01:25:17.220
How would you fix Twitter?
link |
01:25:18.500
How would you fix social media?
link |
01:25:19.980
For your own, like you're an optimist,
link |
01:25:23.820
you're a positive person.
link |
01:25:25.220
There's a bit of a cynicism that you have currently
link |
01:25:27.500
about this particular little slice of humanity.
link |
01:25:30.700
I tend to think Twitter could be beautiful.
link |
01:25:32.700
I'm not that cynical about it.
link |
01:25:34.100
I'm not that cynical about it.
link |
01:25:35.140
I actually refuse to be a cynic on principle.
link |
01:25:37.700
Yes.
link |
01:25:38.540
I was just briefly expressing some personal pathos.
link |
01:25:40.940
Personal stuff.
link |
01:25:41.780
It was just some personal pathos, but like, like.
link |
01:25:45.900
Just to vent a little bit.
link |
01:25:46.980
Just to speak.
link |
01:25:47.820
I don't have cancer.
link |
01:25:48.940
I've, I love my family.
link |
01:25:50.860
I have a good life.
link |
01:25:51.620
I'm that, that, that is, if that is my biggest,
link |
01:25:55.380
one of my biggest problems.
link |
01:25:56.220
That's a good life.
link |
01:25:57.180
Yeah.
link |
01:25:58.020
I, you know, that was a brief,
link |
01:25:59.860
although I do think there are a lot of issues with Twitter
link |
01:26:01.380
just in terms of like the public mental health,
link |
01:26:03.180
but due to my proximity to the current dramas,
link |
01:26:10.500
I honestly feel that I should not have opinions about this
link |
01:26:13.820
because I think if Elon ends up getting Twitter,
link |
01:26:28.380
that is, being the arbiter of truth or public discussion.
link |
01:26:33.140
That is a responsibility.
link |
01:26:36.260
I do not, I am not qualified to be responsible for that
link |
01:26:41.260
and I do not want to say something
link |
01:26:45.260
that might like dismantle democracy.
link |
01:26:48.340
And so I just like actually,
link |
01:26:49.780
I actually think I should not have opinions about this
link |
01:26:52.140
because I truly am not,
link |
01:26:55.100
I don't want to have the wrong opinion about this.
link |
01:26:56.740
And I think I'm too close to the actual situation
link |
01:27:00.180
wherein I should not have.
link |
01:27:02.860
I have thoughts in my brain,
link |
01:27:04.300
but I think I am scared by my proximity to this situation.
link |
01:27:09.300
Isn't that crazy that a few words that you could say
link |
01:27:14.660
could change world affairs and hurt people?
link |
01:27:18.780
I mean, that's the nature of celebrity at a certain point
link |
01:27:22.980
that you have to be, you have to a little bit, a little bit,
link |
01:27:27.140
not so much that it destroys you,
link |
01:27:29.540
puts too much constraints,
link |
01:27:30.540
but you have to a little bit think about
link |
01:27:32.620
the impact of your words.
link |
01:27:33.940
I mean, we as humans, you talk to somebody at a bar,
link |
01:27:36.700
you have to think about the impact of your words.
link |
01:27:39.140
Like you can say positive things,
link |
01:27:40.380
you can think of negative things,
link |
01:27:41.540
you can affect the direction of one life.
link |
01:27:43.380
But on social media,
link |
01:27:44.380
your words can affect the direction of many lives.
link |
01:27:48.100
That's crazy.
link |
01:27:48.940
It's a crazy world to live in.
link |
01:27:50.460
It's worthwhile to consider that responsibility,
link |
01:27:53.020
take it seriously.
link |
01:27:54.140
Sometimes just like you did choose kind of silence,
link |
01:28:00.780
choose sort of respectful.
link |
01:28:03.620
Like I do have a lot of thoughts on the matter.
link |
01:28:05.300
I'm just,
link |
01:28:06.140
I don't, if my thoughts are wrong,
link |
01:28:10.140
this is one situation where the stakes are high.
link |
01:28:12.900
You mentioned a while back that you were in a cult
link |
01:28:15.820
that's centered around bureaucracy,
link |
01:28:17.340
so you can't really do anything
link |
01:28:18.540
because it involves a lot of paperwork.
link |
01:28:20.620
And I really love a cult that's just like Kafkaesque,
link |
01:28:26.700
I mean, it was like a joke,
link |
01:28:27.540
but it was,
link |
01:28:28.380
I know, but I love this idea.
link |
01:28:29.220
The Holy Rain Empire.
link |
01:28:30.500
Yeah, it was just like a Kafkaesque pro bureaucracy cult.
link |
01:28:34.940
But I feel like that's what human civilization is.
link |
01:28:36.860
Is that, because when you said that,
link |
01:28:38.180
I was like, oh, that is kind of what humanity is.
link |
01:28:40.660
Is this bureaucracy?
link |
01:28:41.700
I do, yeah, I have this theory.
link |
01:28:45.620
I really think that we really,
link |
01:28:50.660
bureaucracy is starting to kill us.
link |
01:28:53.540
And I think like we need to reorient laws and stuff.
link |
01:28:59.500
Like I think we just need sunset clauses on everything.
link |
01:29:01.900
Like I think the rate of change in culture
link |
01:29:04.540
is happening so fast and the rate of change in technology
link |
01:29:06.660
and everything is happening so fast.
link |
01:29:07.900
It's like, you know, when you see these hearings
link |
01:29:10.780
about like social media and Cambridge Analytica
link |
01:29:15.780
and everyone talking,
link |
01:29:16.620
it's like even from that point,
link |
01:29:19.180
so much technological change has happened
link |
01:29:21.380
from like those hearings.
link |
01:29:22.740
And it's just like,
link |
01:29:23.580
we're trying to make all these laws now about AI and stuff.
link |
01:29:25.940
I feel like we should be updating things
link |
01:29:27.380
like every five years.
link |
01:29:28.380
And like one of the big issues in our society right now
link |
01:29:30.140
is we're just getting bogged down by laws
link |
01:29:32.260
and it's making it very hard to change things
link |
01:29:36.980
and develop things.
link |
01:29:37.820
Like in Austin,
link |
01:29:38.660
like I don't want to speak on this too much,
link |
01:29:41.460
but like one of my friends is working on a housing bill
link |
01:29:43.180
in Austin to try to like prevent
link |
01:29:44.980
like a San Francisco situation from happening here
link |
01:29:47.140
because obviously we're getting a little mini San Francisco
link |
01:29:49.900
here like housing prices are skyrocketing.
link |
01:29:52.100
It's causing massive gentrification.
link |
01:29:54.700
This is gonna be, this is really bad
link |
01:29:56.100
for anyone who's not super rich.
link |
01:29:59.140
Like there's so much bureaucracy.
link |
01:30:00.820
Part of the reason this is happening
link |
01:30:01.860
is because you need all these permits to build.
link |
01:30:03.980
It takes like years to get permits to like build anything.
link |
01:30:06.420
It's so hard to build.
link |
01:30:07.420
And so there's very limited housing
link |
01:30:09.100
and there's a massive influx of people.
link |
01:30:10.900
And it's just like, you know,
link |
01:30:12.180
this is a microcosm of like problems
link |
01:30:14.300
that are happening all over the world
link |
01:30:15.540
where it's just like we're dealing with laws
link |
01:30:18.780
that are like 10, 20, 30, 40, 100, 200 years old
link |
01:30:22.340
and they are no longer relevant.
link |
01:30:24.100
And it's just slowing everything down
link |
01:30:25.660
and causing massive social pain.
link |
01:30:27.980
Yeah, and it's like, it's also makes me sad
link |
01:30:32.820
when I see politicians talk about technology
link |
01:30:35.780
and when they don't really get it.
link |
01:30:38.020
And, but most importantly, they lack curiosity
link |
01:30:41.140
and like that like inspired excitement
link |
01:30:44.580
about like how stuff works and all that stuff.
link |
01:30:46.620
They're just like, they see,
link |
01:30:47.820
they have a very cynical view of technology.
link |
01:30:50.140
It's like tech companies are just trying to do evil
link |
01:30:52.180
on the world from their perspective.
link |
01:30:53.580
And they have no curiosity about like
link |
01:30:55.900
how recommender systems work or how AI systems work,
link |
01:30:59.820
natural language processing, how robotics works,
link |
01:31:02.740
how computer vision works, you know,
link |
01:31:05.860
they always take the most cynical possible interpretation
link |
01:31:08.460
of what technology will be used.
link |
01:31:09.900
And we should definitely be concerned about that.
link |
01:31:11.700
But if you're constantly worried about that
link |
01:31:13.820
and you're regulating based on that,
link |
01:31:15.020
you're just going to slow down all the innovation.
link |
01:31:17.020
I do think a huge priority right now is undoing
link |
01:31:21.180
the bad energy surrounding the emergence of Silicon Valley.
link |
01:31:28.060
Like I think that like a lot of things
link |
01:31:29.980
were very irresponsible during that time.
link |
01:31:31.820
And, you know, like even just this current whole thing
link |
01:31:36.140
with Twitter and everything, it's like,
link |
01:31:37.900
like there has been a lot of negative outcomes
link |
01:31:39.980
from the sort of technocracy boom.
link |
01:31:44.260
But one of the things that's happening is that like,
link |
01:31:47.140
it's alienating people from wanting to care about technology.
link |
01:31:52.340
And I actually think technology is probably some
link |
01:31:56.260
of the better, probably the best.
link |
01:32:00.140
I think we can fix a lot of our problems more easily
link |
01:32:02.180
with technology than with, you know,
link |
01:32:05.980
fighting the powers that be as a, you know,
link |
01:32:07.980
not to go back to the Star Wars quote
link |
01:32:09.700
or the Buckminster Fuller quote.
link |
01:32:11.340
Let's go to some dark questions.
link |
01:32:12.980
If we may, for time, what is the darkest place
link |
01:32:17.900
you have ever gone in your mind?
link |
01:32:20.020
Is there a time, a period of time, a moment
link |
01:32:23.900
that you remember that was difficult for you?
link |
01:32:29.620
I mean, when I was 18, my best friend died
link |
01:32:31.260
of a heroin overdose and it was like my,
link |
01:32:37.660
it was, and then shortly after that,
link |
01:32:39.820
one of my other best friends committed suicide and that,
link |
01:32:46.300
sort of like coming into adulthood,
link |
01:32:48.740
dealing with two of the most important people in my life,
link |
01:32:51.220
dying in extremely disturbing, violent ways was a lot.
link |
01:32:55.940
That was a lot.
link |
01:32:56.780
You miss them?
link |
01:32:58.500
Yeah, definitely miss them.
link |
01:32:59.940
Did that make you think about your own life,
link |
01:33:02.780
about the finiteness of your own life,
link |
01:33:04.980
the places your mind can go?
link |
01:33:08.180
Did you ever, in a distance, far away,
link |
01:33:10.940
contemplate just your own death
link |
01:33:15.460
or maybe even taking your own life?
link |
01:33:17.260
Oh, never, oh no.
link |
01:33:18.740
I'm so, I love my life, I cannot fathom suicide.
link |
01:33:23.140
I'm so scared of death.
link |
01:33:24.180
I haven't, I'm too scared of death.
link |
01:33:26.060
My manager, my manager's like the most zen guy.
link |
01:33:28.820
My manager's always like, you need to accept death.
link |
01:33:31.100
You need to accept death.
link |
01:33:32.140
And I'm like, look, I can do your meditation.
link |
01:33:34.260
I can do the meditation, but I cannot accept death.
link |
01:33:37.340
Ah, so you're terrified of death.
link |
01:33:39.060
I'm terrified of death.
link |
01:33:40.380
I will like fight.
link |
01:33:42.780
Although I actually think death is important,
link |
01:33:45.060
I recently went to this meeting about immortality
link |
01:33:50.060
and in the process of...
link |
01:33:51.500
That's the actual topic of the meeting.
link |
01:33:53.060
All right, I'm sorry.
link |
01:33:53.900
No, no, it was this girl.
link |
01:33:54.740
It was a bunch of people working on anti-aging stuff.
link |
01:33:58.940
It was some seminary thing about it
link |
01:34:01.980
and I went in really excited.
link |
01:34:03.300
I was like, yeah, okay, what do you got?
link |
01:34:05.340
How can I live for 500 years or 1,000 years?
link |
01:34:07.860
And then over the course of the meeting,
link |
01:34:10.660
like it was sort of like right,
link |
01:34:12.020
it was like two or three days
link |
01:34:13.180
after the Russian invasion started.
link |
01:34:14.620
And I was like, man, like what if Putin was immortal?
link |
01:34:17.340
Like what if I'm like, man, maybe immortality is not good.
link |
01:34:23.660
I mean, like if you get into the later Dune stuff,
link |
01:34:25.780
the immortals cause a lot of problems.
link |
01:34:29.060
Cause as we were talking about earlier with the music
link |
01:34:31.020
and like brains calcify,
link |
01:34:32.540
like good people could become immortal,
link |
01:34:35.500
but bad people could become immortal.
link |
01:34:36.940
But I also think even with the best people, power corrupts
link |
01:34:43.380
and power alienates you from like the common human experience
link |
01:34:46.780
and...
link |
01:34:47.620
Right, so the people that get more and more powerful.
link |
01:34:49.140
Even the best people whose brains are amazing,
link |
01:34:52.220
like I think death might be important.
link |
01:34:54.780
I think death is part of, you know, like I think with AI,
link |
01:34:59.140
one thing we might wanna consider,
link |
01:35:01.020
I don't know, I wanna talk about AI.
link |
01:35:02.420
I'm such not an expert
link |
01:35:03.420
and probably everyone has all these ideas
link |
01:35:05.220
and they're already figured out.
link |
01:35:06.220
But when I talk...
link |
01:35:07.060
Nobody is an expert in anything, see?
link |
01:35:09.060
Okay, go ahead.
link |
01:35:09.900
But when I...
link |
01:35:10.740
You were talking about?
link |
01:35:11.580
Yeah, but I like, it's just like,
link |
01:35:13.180
I think some kind of pruning,
link |
01:35:17.300
but it's a tricky thing because,
link |
01:35:18.740
because if there's too much of a focus on youth culture,
link |
01:35:22.980
then you don't have the wisdom.
link |
01:35:25.500
So I feel like we're in a tricky moment right now
link |
01:35:29.700
in society where it's like,
link |
01:35:31.340
we've really perfected living for a long time.
link |
01:35:33.140
So there's all these really like old people
link |
01:35:35.820
who are like really voting against the wellbeing
link |
01:35:39.580
of the young people, you know?
link |
01:35:41.580
And like, it's like,
link |
01:35:43.740
there shouldn't be all this student debt
link |
01:35:45.220
and we need like healthcare, like universal healthcare
link |
01:35:48.620
and like just voting against like best interests.
link |
01:35:52.540
But then you have all these young people
link |
01:35:53.740
that don't have the wisdom that are like,
link |
01:35:55.780
like, yeah, we need communism and stuff.
link |
01:35:57.700
And it's just like, like literally,
link |
01:35:59.820
I got canceled at one point for,
link |
01:36:03.300
I ironically used a Stalin quote
link |
01:36:05.060
in my high school yearbook,
link |
01:36:06.220
but it was actually like a diss against my high school.
link |
01:36:09.700
I saw it.
link |
01:36:10.540
Yeah.
link |
01:36:11.380
And people were like, you used to be a Stalinist
link |
01:36:13.260
and now you're a class traitor.
link |
01:36:14.300
And it's like, it's like, oh man, just like,
link |
01:36:17.540
please Google Stalin, please Google Stalin.
link |
01:36:20.580
Like, you know what I mean?
link |
01:36:21.740
Ignoring his, the lessons of history, yes.
link |
01:36:23.900
And it's like, we're in this really weird middle ground
link |
01:36:26.180
where it's like, we are not finding the happy medium
link |
01:36:31.300
between wisdom and fresh ideas
link |
01:36:34.780
and they're fighting each other.
link |
01:36:35.980
And it's like, like really, like what we need is like,
link |
01:36:41.060
like the fresh ideas and the wisdom to be like collaborating.
link |
01:36:43.940
And it's like...
link |
01:36:45.180
Well, the fighting in a way is the searching
link |
01:36:47.380
for the happy medium.
link |
01:36:48.540
And in a way, maybe we are finding the happy medium.
link |
01:36:51.100
Maybe that's what the happy medium looks like.
link |
01:36:52.980
And for AI systems, there has to be,
link |
01:36:55.020
it's, you know, you have reinforcement learning.
link |
01:36:57.180
You have the dance between exploration and exploitation,
link |
01:37:00.380
sort of doing crazy stuff to see if there's something better
link |
01:37:03.420
than what you think is the optimal
link |
01:37:05.460
and then doing the optimal thing
link |
01:37:06.620
and dancing back and forth from that.
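The exploration-exploitation dance described here is often illustrated with an epsilon-greedy multi-armed bandit; a minimal sketch, where all names, parameters, and reward values are illustrative rather than anything from the conversation:

```python
import random

def epsilon_greedy_bandit(true_means, epsilon=0.1, steps=5000, seed=0):
    """Toy multi-armed bandit: with probability epsilon, explore a
    random arm ("crazy stuff"); otherwise exploit the arm currently
    believed to be optimal."""
    rng = random.Random(seed)
    n = len(true_means)
    counts = [0] * n          # how many times each arm was pulled
    estimates = [0.0] * n     # running mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            arm = rng.randrange(n)  # explore: try something random
        else:
            # exploit: pick the arm with the best current estimate
            arm = max(range(n), key=lambda a: estimates[a])
        reward = true_means[arm] + rng.gauss(0, 0.1)  # noisy payoff
        counts[arm] += 1
        # incremental running-mean update of the estimate
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates, counts

estimates, counts = epsilon_greedy_bandit([0.2, 0.5, 0.8])
# over many steps, the agent settles on the arm with the highest true mean
```

Over time the agent mostly exploits the best-looking arm while still occasionally checking whether something better exists, which is the back-and-forth dance being described.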
link |
01:37:08.660
There's Stuart Russell, I don't know if you know him,
link |
01:37:10.700
he's an AI guy who thinks about sort of
link |
01:37:15.900
how to control super intelligent AI systems.
link |
01:37:18.620
And his idea is that we should inject uncertainty
link |
01:37:21.540
and sort of humility into AI systems
link |
01:37:24.180
that they never, as they get wiser and wiser and wiser
link |
01:37:26.820
and more intelligent, they're never really sure.
link |
01:37:30.060
They always doubt themselves.
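Russell's idea, an agent that stays uncertain about the true objective and defers when it isn't sure, can be caricatured in a few lines; this is a toy sketch under made-up assumptions (the threshold, goal names, and payoff table are all invented here), not Russell's actual formulation:

```python
def choose(posterior, payoffs, threshold=0.9):
    """posterior: dict mapping each candidate goal to the probability
    that it's what the human actually wants.
    payoffs: dict mapping each action to a dict of goal -> utility.
    If no goal is believed strongly enough, defer to the human
    instead of acting on a possibly wrong objective."""
    if max(posterior.values()) < threshold:
        return "ask_human"  # self-doubt: not confident enough to act
    best_goal = max(posterior, key=posterior.get)
    return max(payoffs, key=lambda a: payoffs[a][best_goal])

payoffs = {"act1": {"A": 1, "B": 0}, "act2": {"A": 0, "B": 1}}
choose({"A": 0.6, "B": 0.4}, payoffs)   # uncertain -> "ask_human"
choose({"A": 0.95, "B": 0.05}, payoffs) # confident -> "act1"
```

The point of the sketch is just the structural doubt: the agent's competence never erases its uncertainty about what it should be optimizing.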
link |
01:37:31.660
And in some sense, when you think of young people,
link |
01:37:34.380
that's a mechanism for doubt.
link |
01:37:36.340
It's like, it's how society doubts
link |
01:37:38.900
whether the thing it has converged towards
link |
01:37:40.900
is the right answer.
link |
01:37:41.980
So the voices of the young people
link |
01:37:44.900
is a society asking itself a question.
link |
01:37:48.180
The way I've been doing stuff for the past 50 years,
link |
01:37:51.140
maybe it's the wrong way.
link |
01:37:52.500
And so you can have all of that within one AI system.
link |
01:37:55.380
I also think though that we need to,
link |
01:37:57.500
I mean, actually that's actually really interesting
link |
01:37:59.940
and really cool.
link |
01:38:01.860
But I also think there's a fine balance of,
link |
01:38:05.420
I think we maybe also overvalue the idea
link |
01:38:10.180
that the old systems are always bad.
link |
01:38:11.980
And I think there are things that we are perfecting
link |
01:38:14.860
and we might be accidentally overthrowing things
link |
01:38:17.460
that we actually have gotten to a good point.
link |
01:38:19.780
Just because we are valuing, we value disruption so much
link |
01:38:22.820
and we value fighting against the generations
link |
01:38:25.460
before us so much, that like,
link |
01:38:29.100
there's also an aspect of like,
link |
01:38:30.660
sometimes we're taking two steps forward, one step back
link |
01:38:32.780
because, okay, maybe we kind of did solve this thing.
link |
01:38:36.980
And now we're like fucking it up.
link |
01:38:39.220
And so I think there's like a middle ground there too.
link |
01:38:44.780
Yeah, we're in search of that happy medium.
link |
01:38:47.300
Let me ask you a bunch of crazy questions, okay?
link |
01:38:51.500
You can answer in a short way or in a long way.
link |
01:38:53.980
What's the scariest thing you've ever done?
link |
01:38:56.340
These questions are gonna be ridiculous.
link |
01:38:58.380
Something tiny or something big,
link |
01:39:02.420
skydiving or touring your first record
link |
01:39:09.860
going on this podcast.
link |
01:39:12.100
I've had two crazy brushes,
link |
01:39:13.620
like really scary brushes with death
link |
01:39:15.180
where I randomly got away unscathed.
link |
01:39:17.020
I don't know if I should talk about those on here.
link |
01:39:19.220
But like, I think I might be the luckiest person alive though.
link |
01:39:22.860
Like, this might be too dark for a podcast though.
link |
01:39:26.020
I feel like, I don't know if this is like
link |
01:39:27.100
good content for a podcast.
link |
01:39:28.740
I don't know what content.
link |
01:39:30.300
It might hijack.
link |
01:39:31.620
Here's a safer one.
link |
01:39:32.460
I mean, having a baby really scared me.
link |
01:39:36.780
Before, after.
link |
01:39:37.780
Just a birth process.
link |
01:39:38.940
Surgery, like, just having a baby is really scary.
link |
01:39:43.940
It's just like the medical aspect of it,
link |
01:39:47.540
not the responsibility.
link |
01:39:49.300
Were you ready for the responsibility?
link |
01:39:51.380
Did you, were you ready to be a mother?
link |
01:39:53.980
All the beautiful things that comes with motherhood
link |
01:39:56.300
that you were talking about,
link |
01:39:57.580
all the changes and all that, were you ready for that?
link |
01:40:01.060
Were you, did you feel ready for that?
link |
01:40:03.020
No, I think it took about nine months
link |
01:40:05.340
to start getting ready for it.
link |
01:40:06.620
And I'm still getting more ready for it
link |
01:40:08.380
because now you keep realizing more things
link |
01:40:12.980
as they start getting.
link |
01:40:14.220
As the consciousness grows.
link |
01:40:16.420
And stuff you didn't notice with the first one.
link |
01:40:18.380
Now that you've seen the first one older,
link |
01:40:19.700
you're noticing it more,
link |
01:40:21.700
like the sort of like existential horror
link |
01:40:24.420
of coming into consciousness with baby Y
link |
01:40:28.300
or baby Sailor Mars or whatever.
link |
01:40:30.140
She has like so many names at this point that it's,
link |
01:40:33.540
we really need to probably settle on one.
link |
01:40:36.100
If you could be someone else for a day,
link |
01:40:38.300
someone alive today, but somebody you haven't met yet,
link |
01:40:41.780
who would you be?
link |
01:40:42.620
Would I be modeling their brain state
link |
01:40:44.220
or would I just be in their body?
link |
01:40:46.420
You can choose the degree
link |
01:40:48.380
to which you're modeling their brain state.
link |
01:40:50.580
Cause so you can still take a third person perspective
link |
01:40:54.300
and realize, you have to realize that you're.
link |
01:40:56.660
Can they be alive or can it be dead?
link |
01:41:00.580
No, oh, could it be anyone?
link |
01:41:02.780
They would be brought back to life, right?
link |
01:41:04.540
If they're dead.
link |
01:41:05.380
Yeah, you can bring people back.
link |
01:41:07.140
Definitely Hitler or Stalin.
link |
01:41:09.300
I want to understand evil.
link |
01:41:11.260
Who would you, you would need to,
link |
01:41:12.900
oh, to experience what it feels like.
link |
01:41:15.020
I want to be in their brain, feeling what they feel.
link |
01:41:18.580
That might change you forever, returning from that.
link |
01:41:20.860
Yes, but I think it would also help me understand
link |
01:41:22.940
how to prevent it and fix it.
link |
01:41:25.380
That might be one of those things.
link |
01:41:26.580
Once you experience it, it'll be a burden to know it.
link |
01:41:29.940
Cause you won't be able to trust that.
link |
01:41:30.780
Yeah, but a lot of things are burdens, like.
link |
01:41:33.820
But it's a useful burden.
link |
01:41:34.780
But it's a useful burden.
link |
01:41:36.020
Yeah.
link |
01:41:36.860
That for sure.
link |
01:41:37.700
I want to understand evil and like psychopathy
link |
01:41:40.300
and that I have all these fake Twitter accounts
link |
01:41:43.300
where I like go into different algorithmic bubbles
link |
01:41:45.540
to try to like understand.
link |
01:41:47.340
I'll keep getting in fights with people
link |
01:41:48.700
and realize we're not actually fighting.
link |
01:41:50.620
I think we're, we used to exist in a monoculture
link |
01:41:53.020
like before social media and stuff.
link |
01:41:54.420
Like we kind of all got fed the same thing.
link |
01:41:56.460
So we were all speaking the same cultural language.
link |
01:41:58.700
But I think recently one of the things
link |
01:42:00.140
that like we aren't diagnosing properly enough
link |
01:42:02.100
with social media is that there's different dialects.
link |
01:42:05.500
There's so many different dialects of Chinese.
link |
01:42:06.900
There are now becoming different dialects of English.
link |
01:42:09.380
Like I am realizing like there are people
link |
01:42:11.780
who are saying the exact same things,
link |
01:42:13.540
but they're using completely different verbiage.
link |
01:42:15.980
And we're like punishing each other
link |
01:42:17.340
for not using the correct verbiage.
link |
01:42:18.900
And we're completely misunderstanding.
link |
01:42:20.500
Like people are just like misunderstanding
link |
01:42:22.020
what the other people are saying.
link |
01:42:23.580
And like, like I just got in a fight with a friend
link |
01:42:27.460
about like anarchism and communism and shit for like two hours.
link |
01:42:33.020
And then by the end of the conversation,
link |
01:42:34.540
like, and then she'd say something and I'm like,
link |
01:42:35.940
but that's literally what I'm saying.
link |
01:42:37.580
And she was like, what?
link |
01:42:38.900
And then I was like, fuck, we've diverged, right?
link |
01:42:40.860
And I'm like, we're, our English,
link |
01:42:42.900
like the way we are understanding terminology
link |
01:42:45.940
is like drastically like our algorithm bubbles
link |
01:42:50.140
are creating mini dialects.
link |
01:42:52.820
And how language is interpreted, how language is used.
link |
01:42:55.820
That's so fascinating.
link |
01:42:56.820
And so we're like having these arguments
link |
01:42:59.140
that we do not need to be having
link |
01:43:00.900
and there's polarization that's happening
link |
01:43:02.300
that doesn't need to be happening
link |
01:43:03.340
because we've got these like algorithmically created
link |
01:43:06.340
dialects occurring.
link |
01:43:09.860
Plus on top of that,
link |
01:43:10.700
there's also different parts of the world
link |
01:43:12.340
that speak different languages.
link |
01:43:13.460
So there's literally lost in translation kind of communication.
link |
01:43:17.820
I happen to know the Russian language
link |
01:43:19.780
and I just know how different it is.
link |
01:43:21.420
Yeah.
link |
01:43:22.380
Then the English language.
link |
01:43:23.860
And I just wonder how much is lost in a little bit of.
link |
01:43:27.660
Man, I actually, cause I have a question for you.
link |
01:43:28.940
I have a song coming out tomorrow
link |
01:43:30.260
with IC3PEAK, a Russian band.
link |
01:43:31.900
And I speak a little bit of Russian
link |
01:43:33.700
and I was looking at the title
link |
01:43:35.380
and the title in English doesn't match the title in Russian.
link |
01:43:38.260
I'm curious about this.
link |
01:43:39.140
Cause look, it says
link |
01:43:40.540
What's the English?
link |
01:43:41.380
The title in English is last day.
link |
01:43:42.940
And then the title in Russian is
link |
01:43:44.500
my pronunciation sucks.
link |
01:43:47.540
Novy den?
link |
01:43:48.780
Like what?
link |
01:43:49.620
A new day.
link |
01:43:50.460
Yeah, new day.
link |
01:43:51.300
New day.
link |
01:43:52.140
Like it's two different.
link |
01:43:53.460
Yeah, new day.
link |
01:43:54.420
Yeah.
link |
01:43:57.260
Yeah, yeah, new day.
link |
01:43:58.460
New day, but last day.
link |
01:44:01.340
Novy den.
link |
01:44:02.260
So last day would be the last day.
link |
01:44:04.460
Maybe they, or maybe the title includes
link |
01:44:06.980
both the Russian and the English, and it's for...
link |
01:44:09.060
Maybe, maybe.
link |
01:44:09.900
It's for, maybe it's for bilingual.
link |
01:44:10.740
To be honest, Novy den sounds better than
link |
01:44:13.220
just musically, like a
link |
01:44:16.300
Novy den is new day.
link |
01:44:17.780
That's the current one.
link |
01:44:18.780
And the last day is the last day.
link |
01:44:23.460
I think Novy den.
link |
01:44:25.460
I don't like Novy den.
link |
01:44:26.660
But the meaning is so different.
link |
01:44:30.180
That's kind of awesome actually though.
link |
01:44:31.660
There's an explicit sort of contrast like that.
link |
01:44:35.820
If everyone on earth disappeared
link |
01:44:38.340
and it was just you left,
link |
01:44:42.540
what would your day look like?
link |
01:44:44.060
Like what would you do?
link |
01:44:45.260
Everybody's dead.
link |
01:44:46.780
As far as you.
link |
01:44:47.620
Are there corpses there?
link |
01:44:52.500
Well seriously, it's a big day.
link |
01:44:53.820
Let me think through this.
link |
01:44:54.780
It's a big difference if there's just like birds singing
link |
01:44:56.940
versus if there's like corpses littering the street.
link |
01:44:58.940
Yeah, there's corpses everywhere.
link |
01:45:00.300
I'm sorry.
link |
01:45:01.900
It's, and you don't actually know what happened.
link |
01:45:05.100
And you don't know why you survived.
link |
01:45:07.580
And you don't even know if there's others out there.
link |
01:45:10.460
But it seems clear that it's all gone.
link |
01:45:13.580
What would you do?
link |
01:45:15.260
What would I do?
link |
01:45:17.300
Listen, I'm somebody who really enjoys the moment,
link |
01:45:19.580
enjoys life.
link |
01:45:20.460
I would just go on like enjoying the inanimate objects.
link |
01:45:25.460
I would just look for food, basic survival,
link |
01:45:29.740
but mostly it's just, listen, when I just,
link |
01:45:32.900
I take walks and I look outside
link |
01:45:35.100
and I'm just happy that we get to exist on this planet
link |
01:45:38.940
to be able to breathe air.
link |
01:45:41.140
It's just all beautiful.
link |
01:45:42.380
It's full of colors, all of this kind of stuff.
link |
01:45:44.180
Just there's so many things about life, your own life,
link |
01:45:48.260
conscious life that's fucking awesome.
link |
01:45:50.100
So I would just enjoy that.
link |
01:45:51.260
But also maybe after a few weeks,
link |
01:45:54.260
the engineer would start coming out,
link |
01:45:55.940
like want to build some things.
link |
01:45:58.940
Maybe there's always hope searching for another human.
link |
01:46:02.940
Maybe like...
link |
01:46:03.780
Probably searching for another human,
link |
01:46:06.820
probably trying to get to a TV or radio station
link |
01:46:10.820
and broadcast something.
link |
01:46:14.260
I...
link |
01:46:15.580
That's interesting.
link |
01:46:16.420
I didn't think about that.
link |
01:46:17.260
So like really maximize your ability to connect with others.
link |
01:46:21.260
Yeah, like probably try to find another person.
link |
01:46:26.260
Would you be excited to meet another person or terrified?
link |
01:46:30.260
Because, you know...
link |
01:46:31.260
I'd be excited, even if they...
link |
01:46:33.260
No matter what.
link |
01:46:34.260
Yeah, yeah, yeah, yeah.
link |
01:46:35.260
Being alone for the last however long of my life
link |
01:46:39.260
would be really bad.
link |
01:46:40.260
That's the one instance I might,
link |
01:46:43.260
I don't think I'd kill myself for it.
link |
01:46:45.260
I might, I don't think I'd kill myself
link |
01:46:47.260
but I might kill myself if I had to understand that.
link |
01:46:49.260
Do you love people?
link |
01:46:50.260
Do you love connection to other humans?
link |
01:46:52.260
Yeah.
link |
01:46:53.260
I kind of hate people too, but yeah.
link |
01:46:54.260
No, it's a love hate relationship.
link |
01:46:56.260
Yeah.
link |
01:46:57.260
I feel like this is,
link |
01:46:58.260
I feel like we had a bunch of like weird niche questions and stuff.
link |
01:47:00.260
Oh yeah.
link |
01:47:01.260
Like I wonder, because I'm like,
link |
01:47:02.260
when podcast, like I'm like,
link |
01:47:03.260
is this interesting for people to just have like...
link |
01:47:05.260
Or, I don't know, maybe people do like this.
link |
01:47:08.260
When I listen to podcasts,
link |
01:47:09.260
I'm into like the lore, like the hard lore.
link |
01:47:12.260
Like I just love like Dan Carlin.
link |
01:47:13.260
I'm like, give me the facts, just like...
link |
01:47:15.260
Yeah.
link |
01:47:16.260
Just like the facts into my bloodstream.
link |
01:47:18.260
But you also don't know,
link |
01:47:20.260
like you're a fascinating mind to explore.
link |
01:47:23.260
So when you don't realize as you're talking about stuff,
link |
01:47:26.260
the stuff you've taken for granted
link |
01:47:28.260
is actually unique and fascinating.
link |
01:47:30.260
The way you think, not always what,
link |
01:47:33.260
like the way you reason through things
link |
01:47:35.260
is the fascinating thing.
link |
01:47:36.260
Okay.
link |
01:47:37.260
To listen, to listen to,
link |
01:47:39.260
because people kind of see,
link |
01:47:40.260
oh, there's other humans that think differently,
link |
01:47:43.260
that explore thoughts differently.
link |
01:47:45.260
That's the cool.
link |
01:47:46.260
That's also cool.
link |
01:47:47.260
So yeah, Dan Carlin retelling of history.
link |
01:47:50.260
By the way, his retelling of history is very,
link |
01:47:54.260
I think what's exciting is not the history,
link |
01:47:57.260
is his way of thinking about history.
link |
01:48:00.260
No, I think Dan Carlin is one of the people,
link |
01:48:02.260
like when Dan Carlin is one of the people
link |
01:48:04.260
that really started getting me excited
link |
01:48:06.260
about like revolutionizing education,
link |
01:48:08.260
because like Dan Carlin instilled,
link |
01:48:11.260
I already really liked history,
link |
01:48:14.260
but he instilled like an obsessive love of history in me,
link |
01:48:18.260
to the point where like now I'm fucking reading,
link |
01:48:21.260
like going to bed,
link |
01:48:23.260
reading like part four of The Rise and Fall of the
link |
01:48:25.260
Third Reich or whatever.
link |
01:48:26.260
Like I got like dense ass history,
link |
01:48:28.260
but like he like opened that door
link |
01:48:31.260
that like made me want to be a scholar of that topic.
link |
01:48:34.260
Like it's like, I feel like he's such a good teacher.
link |
01:48:37.260
He just like, you know,
link |
01:48:39.260
and it sort of made me feel like
link |
01:48:41.260
one of the things we could do with education
link |
01:48:43.260
is like find like the world's great,
link |
01:48:46.260
the teachers that like create passion for the topic,
link |
01:48:49.260
because auto didacticism,
link |
01:48:53.260
I don't know how to say that properly,
link |
01:48:55.260
but like self teaching is like much faster
link |
01:48:57.260
than being lectured to,
link |
01:48:59.260
like it's much more efficient to sort of like be able
link |
01:49:01.260
to teach yourself and then ask a teacher questions
link |
01:49:03.260
when you don't know what's up,
link |
01:49:04.260
but like, you know, that's why it's like in university and stuff,
link |
01:49:08.260
like you can learn so much more material,
link |
01:49:10.260
so much faster because you're doing a lot of the learning
link |
01:49:12.260
on your own and you're going to the teachers
link |
01:49:14.260
for when you get stuck,
link |
01:49:15.260
but like these teachers that can inspire passion
link |
01:49:18.260
for a topic,
link |
01:49:20.260
I think that is one of the most invaluable skills
link |
01:49:22.260
in our whole species,
link |
01:49:23.260
like because if you can do that,
link |
01:49:25.260
then you, it's like AI,
link |
01:49:27.260
like AI is going to teach itself
link |
01:49:30.260
so much more efficiently than we can teach it.
link |
01:49:32.260
To get it to the point where it can teach itself
link |
01:49:34.260
and then
link |
01:49:35.260
It finds the motivation to do so, right?
link |
01:49:37.260
Yeah.
link |
01:49:38.260
It's like you inspire it to do so.
link |
01:49:39.260
Yeah.
link |
01:49:40.260
And then it could, it could teach itself.
link |
01:49:42.260
What do you make of the fact,
link |
01:49:44.260
you mentioned Rise and Fall of the Third Reich,
link |
01:49:46.260
I just read it twice.
link |
01:49:48.260
You read it twice?
link |
01:49:49.260
Yes.
link |
01:49:50.260
Okay.
link |
01:49:51.260
No one even knows what it is.
link |
01:49:52.260
Yeah.
link |
01:49:53.260
And I'm like, I'm like, wait,
link |
01:49:54.260
I thought this was like a super poppin' book.
link |
01:49:55.260
Super pop.
link |
01:49:56.260
I'm not like that.
link |
01:49:57.260
It's a classic.
link |
01:49:58.260
I'm not that far in it,
link |
01:49:59.260
but it is, it's so interesting.
link |
01:50:00.260
Yeah.
link |
01:50:02.260
It's written by a person that was there,
link |
01:50:04.260
which is very important to kind of,
link |
01:50:06.260
you know, you can start being like,
link |
01:50:08.260
how could this possibly happen?
link |
01:50:09.260
And then when you read Rise and Fall of the Third Reich,
link |
01:50:11.260
it's like people tried really hard for this to not happen.
link |
01:50:14.260
People tried,
link |
01:50:15.260
they almost reinstated a monarchy at one point
link |
01:50:17.260
to try to stop this from happening.
link |
01:50:18.260
Like they almost like,
link |
01:50:20.260
like abandoned democracy to try to get this to not happen.
link |
01:50:23.260
At least the way it makes me feel
link |
01:50:25.260
is that there's a bunch of small moments
link |
01:50:28.260
on which history can turn.
link |
01:50:30.260
Yes.
link |
01:50:31.260
It's like small meetings.
link |
01:50:32.260
Yes.
link |
01:50:33.260
Human interactions.
link |
01:50:34.260
And it's both terrifying and inspiring
link |
01:50:37.260
because it's like,
link |
01:50:40.260
even just attempts at assassinating Hitler
link |
01:50:44.260
like time and time again failed.
link |
01:50:47.260
And they were so close.
link |
01:50:48.260
Was it like,
link |
01:50:49.260
the one from Valkyrie?
link |
01:50:50.260
Such a good film.
link |
01:50:52.260
And then there is also,
link |
01:50:53.260
also the role of,
link |
01:50:55.260
that's a really heavy burden,
link |
01:50:57.260
that from a geopolitical perspective,
link |
01:50:59.260
the role of leaders to see evil
link |
01:51:01.260
before it truly becomes evil,
link |
01:51:03.260
to anticipate it,
link |
01:51:04.260
to stand up to evil.
link |
01:51:05.260
Because evil is actually pretty rare in this world.
link |
01:51:08.260
At a scale that Hitler was.
link |
01:51:09.260
We tend to, you know,
link |
01:51:11.260
in modern discourse kind of call people evil too quickly.
link |
01:51:14.260
If you look at ancient history,
link |
01:51:17.260
like there was a ton of Hitlers.
link |
01:51:19.260
I actually think it's more the norm
link |
01:51:21.260
than like,
link |
01:51:23.260
again, going back to like my sort of intelligent design theory.
link |
01:51:26.260
One of the things we've been successfully doing
link |
01:51:28.260
in our slow move from survival
link |
01:51:30.260
of the fittest to intelligent design
link |
01:51:32.260
is we've kind of been eradicating,
link |
01:51:37.260
like if you look at like ancient Assyria and stuff,
link |
01:51:40.260
like that shit was like brutal
link |
01:51:42.260
and just like the heads on the,
link |
01:51:44.260
like brutal, like Genghis Khan,
link |
01:51:46.260
just like genocide after genocide after genocide.
link |
01:51:49.260
There's like throwing plague bodies over the walls
link |
01:51:51.260
and decimating whole cities
link |
01:51:53.260
or like the Muslim conquests of like Damascus and shit.
link |
01:51:56.260
Just like people, cities used to get leveled
link |
01:51:59.260
all the fucking time.
link |
01:52:00.260
Okay.
link |
01:52:01.260
Get into the Bronze Age collapse.
link |
01:52:03.260
It's basically there was like almost like Roman level,
link |
01:52:06.260
like society,
link |
01:52:08.260
like there was like all over the world,
link |
01:52:10.260
like global trade,
link |
01:52:11.260
like everything was awesome
link |
01:52:12.260
through a mix of I think a bit of climate change
link |
01:52:14.260
and then the development of iron,
link |
01:52:16.260
because basically bronze could only come from this,
link |
01:52:18.260
the way to make bronze,
link |
01:52:20.260
like everything had to be funneled through this one,
link |
01:52:22.260
Canadian mine.
link |
01:52:24.260
And so it's like,
link |
01:52:25.260
there was just this one supply chain.
link |
01:52:27.260
And this is one of the things
link |
01:52:28.260
that makes me worried about supply chains
link |
01:52:30.260
and why I think we need to be so thoughtful about,
link |
01:52:32.260
I think our biggest issue with society right now,
link |
01:52:35.260
like the thing that is most likely to go wrong
link |
01:52:37.260
is probably supply chain collapse.
link |
01:52:39.260
You know, cause war, climate change, whatever,
link |
01:52:41.260
like anything that causes supply chain collapse,
link |
01:52:43.260
our population is too big to handle that.
link |
01:52:45.260
And like the thing that seems to cause dark ages
link |
01:52:47.260
is mass supply chain collapse.
link |
01:52:49.260
But the bronze age collapse happened,
link |
01:52:51.260
like it was sort of like this ancient collapse
link |
01:52:55.260
that happened where like literally like ancient Egypt,
link |
01:52:59.260
all these cities,
link |
01:53:00.260
everything just got like decimated, destroyed,
link |
01:53:02.260
abandoned cities, like hundreds of them.
link |
01:53:04.260
There was like a flourishing society,
link |
01:53:06.260
like we were almost coming to modernity
link |
01:53:07.260
and everything got leveled.
link |
01:53:08.260
And then they had this mini dark age,
link |
01:53:10.260
but it was just like,
link |
01:53:11.260
there's so little writing or recording from that time
link |
01:53:13.260
that like there isn't a lot of information
link |
01:53:15.260
about the bronze age collapse,
link |
01:53:16.260
but it was basically equivalent to like medieval,
link |
01:53:19.260
the medieval dark ages,
link |
01:53:21.260
but it just happened not,
link |
01:53:23.260
I don't know the years,
link |
01:53:24.260
but like thousands of years earlier.
link |
01:53:27.260
And then we sort of like recovered
link |
01:53:29.260
from the bronze age collapse,
link |
01:53:31.260
empire reemerged, writing and trade
link |
01:53:34.260
and everything reemerged, you know,
link |
01:53:37.260
and then we of course had the more contemporary dark ages.
link |
01:53:41.260
And then over time,
link |
01:53:42.260
we've designed mechanisms to lessen
link |
01:53:44.260
and lessen the capability for destructive
link |
01:53:48.260
power centers to emerge.
link |
01:53:50.260
There's more recording
link |
01:53:51.260
about the more contemporary dark ages.
link |
01:53:54.260
So I think we have like a better understanding
link |
01:53:55.260
of how to avoid it,
link |
01:53:56.260
but I still think we're at high risk for it.
link |
01:53:58.260
I think that's one of the big risks right now.
link |
01:54:00.260
So the natural state of being for humans
link |
01:54:03.260
is for there to be a lot of Hitlers,
link |
01:54:04.260
and we've gotten really good
link |
01:54:06.260
at making it hard for them to emerge.
link |
01:54:09.260
We've gotten better collaboration.
link |
01:54:11.260
Yes.
link |
01:54:12.260
And resisting the power,
link |
01:54:14.260
like authoritarians to come to power.
link |
01:54:16.260
We're trying to go country by country.
link |
01:54:18.260
Like we're moving past this.
link |
01:54:19.260
We're kind of like slowly, incrementally,
link |
01:54:21.260
like moving towards like not scary old school war stuff.
link |
01:54:28.260
And I think seeing it happen in some of the countries
link |
01:54:31.260
that at least nominally are like,
link |
01:54:34.260
supposed to have moved past that,
link |
01:54:36.260
that's scary because it reminds us that it can happen.
link |
01:54:39.260
Like in the places that have made,
link |
01:54:42.260
like supposedly hopefully moved past that.
link |
01:54:46.260
And possibly at a civilization level,
link |
01:54:49.260
like you said, supply chain collapse
link |
01:54:51.260
might make people resource constrained,
link |
01:54:53.260
might make people desperate, angry,
link |
01:54:57.260
hateful, violent, and drag us right back in.
link |
01:55:01.260
I mean, supply chain collapse is how,
link |
01:55:03.260
like the ultimate thing that caused the Middle Ages
link |
01:55:06.260
was supply chain collapse.
link |
01:55:08.260
It's like people,
link |
01:55:09.260
because people were reliant on a certain level of technology,
link |
01:55:12.260
like people, like you look at like Britain,
link |
01:55:14.260
like they had glass, like people had aqueducts,
link |
01:55:17.260
people had like indoor heating and cooling
link |
01:55:20.260
and like running water and like buy food
link |
01:55:23.260
from all over the world and trade and markets.
link |
01:55:26.260
Like people didn't know how to hunt and forage and gather.
link |
01:55:28.260
And so we're in a similar situation.
link |
01:55:30.260
We are not educated enough to survive without technology.
link |
01:55:33.260
So if we have a supply chain collapse
link |
01:55:35.260
that like limits our access to technology,
link |
01:55:38.260
there will be like mass starvation and violence
link |
01:55:41.260
and displacement and war.
link |
01:55:43.260
Like, you know, it's also, like, yeah.
link |
01:55:47.260
In my opinion, it's like the primary marker of dark,
link |
01:55:51.260
like what a dark age is.
link |
01:55:52.260
What technology is kind of enabling us
link |
01:55:54.260
to be more resilient in terms of supply chain,
link |
01:55:57.260
in terms of all the different catastrophic events
link |
01:56:00.260
that happen to us,
link |
01:56:02.260
although the pandemic has kind of challenged
link |
01:56:04.260
our preparedness for the catastrophic.
link |
01:56:07.260
What do you think is the coolest invention humans come up with?
link |
01:56:10.260
The wheel, fire, cooking meat.
link |
01:56:14.260
Computers.
link |
01:56:15.260
Computers.
link |
01:56:16.260
Freaking computers.
link |
01:56:17.260
Internet or computers, which one?
link |
01:56:19.260
What do you think the...
link |
01:56:20.260
Previous technologies, I mean, may have even been more profound
link |
01:56:23.260
and moved us to a certain degree,
link |
01:56:24.260
but I think the computers are what make us Homo Techno.
link |
01:56:27.260
I think this is what, it's a brain augmentation.
link |
01:56:30.260
And so it like allows for actual evolution,
link |
01:56:33.260
like the computers accelerate the degree
link |
01:56:35.260
to which all the other technologies can also be accelerated.
link |
01:56:38.260
Would you classify yourself as a Homo sapien
link |
01:56:40.260
or a Homo Techno?
link |
01:56:41.260
Definitely a Homo Techno.
link |
01:56:42.260
So you're one of the earliest of the species.
link |
01:56:46.260
I think most of us are.
link |
01:56:48.260
Like, as I said, like, I think if you, like,
link |
01:56:54.260
looked at brain scans of us versus humans 100 years ago,
link |
01:56:59.260
it would look very different.
link |
01:57:00.260
I think we are physiologically different.
link |
01:57:03.260
Just even the interaction with the devices has changed our brains.
link |
01:57:06.260
Well, and if you look at them,
link |
01:57:08.260
a lot of studies are coming out to show that, like,
link |
01:57:10.260
there's a degree of inherited memory.
link |
01:57:12.260
So some of these physiological changes in theory should be,
link |
01:57:15.260
we should be passing them on.
link |
01:57:17.260
So, like, that's, you know, that's not like a,
link |
01:57:21.260
an instance of physiological change that's going to fizzle out.
link |
01:57:23.260
In theory, that should progress, like, to our offspring.
link |
01:57:28.260
Speaking of offspring, what advice would you give to a young person?
link |
01:57:32.260
Like, in high school, whether there be an artist,
link |
01:57:38.260
a creative, an engineer, any kind of career path,
link |
01:57:44.260
or maybe just life in general,
link |
01:57:46.260
how they can live a life they can be proud of?
link |
01:57:48.260
I think one of my big thoughts,
link |
01:57:50.260
and like, especially now having kids,
link |
01:57:53.260
is that I don't think we spend enough time teaching creativity.
link |
01:57:56.260
And I think creativity is a muscle like other things.
link |
01:57:59.260
And there's a lot of emphasis on, you know, learn how to play the piano,
link |
01:58:02.260
and then you can write a song, or like, learn the technical stuff,
link |
01:58:05.260
and then you can do a thing.
link |
01:58:07.260
But I think it's, like, I have a friend who's like,
link |
01:58:10.260
world's greatest guitar player, like, you know,
link |
01:58:14.260
amazing sort of like producer, works with other people,
link |
01:58:16.260
but he's really sort of like, you know,
link |
01:58:19.260
he like engineers and records things and like does solos,
link |
01:58:21.260
but he doesn't really like make his own music.
link |
01:58:23.260
And I was talking to him, and I was like,
link |
01:58:25.260
dude, you're so talented at music.
link |
01:58:27.260
Like, why don't you make music or whatever?
link |
01:58:29.260
And he was like, because I got, I'm too old.
link |
01:58:31.260
I never learned the creative muscle.
link |
01:58:33.260
And it's like, you know, it's embarrassing.
link |
01:58:36.260
It's like learning the creative muscle takes a lot of failure.
link |
01:58:40.260
And it also sort of, when you're being creative,
link |
01:58:46.260
you know, you're throwing paint at a wall,
link |
01:58:48.260
and a lot of stuff will fail.
link |
01:58:50.260
So like part of it is like a tolerance for failure and humiliation.
link |
01:58:53.260
And somehow that's easier to develop when you're young?
link |
01:58:55.260
Or to push through it when you're young?
link |
01:58:57.260
Everything is easier to develop when you're young.
link |
01:59:00.260
Yes.
link |
01:59:02.260
The younger, the better.
link |
01:59:04.260
It could destroy you.
link |
01:59:05.260
I mean, that's the shitty thing about creativity.
link |
01:59:07.260
You know, failure could destroy you if you're not careful,
link |
01:59:11.260
but that's the risk worth taking.
link |
01:59:13.260
But also, at a young age, developing a tolerance to failure is good.
link |
01:59:17.260
I fail all the time.
link |
01:59:19.260
Like I do stupid shit all the time.
link |
01:59:21.260
Like in public, in private. I've gotten canceled before.
link |
01:59:24.260
I make all kinds of mistakes,
link |
01:59:26.260
but I just like am very resilient about making mistakes.
link |
01:59:30.260
And so then like I do a lot of things that like other people wouldn't do.
link |
01:59:34.260
And like I think my greatest asset is my creativity.
link |
01:59:37.260
And like, I think tolerance to failure is just a super essential thing
link |
01:59:43.260
that should be taught before other things.
link |
01:59:45.260
Brilliant advice.
link |
01:59:46.260
Yeah, yeah.
link |
01:59:47.260
I wish everybody encouraged sort of failure more as opposed to kind of.
link |
01:59:52.260
Cause we like punish failure.
link |
01:59:54.260
Like when we were teaching kids, we're like, no, that's wrong.
link |
01:59:56.260
Like that's, you know, like with X, we'll be like, wrong.
link |
02:00:04.260
Like he'll say like crazy things.
link |
02:00:05.260
Like X keeps being like, like bubble car, bubble car.
link |
02:00:09.260
And I'm like, and you know, I'm like, what's a bubble car?
link |
02:00:14.260
Like, but like it doesn't, like, but I don't want to be like, no, you're wrong.
link |
02:00:17.260
I'm like, you're thinking of weird crazy shit.
link |
02:00:20.260
Like I don't know what a bubble car is, but like.
link |
02:00:22.260
He's creating worlds and they might be internally consistent.
link |
02:00:25.260
And through that, he might discover something fundamental about this.
link |
02:00:28.260
Yeah.
link |
02:00:29.260
Or he'll like rewrite songs like with words that he prefers.
link |
02:00:32.260
So like instead of baby shark, he says baby car.
link |
02:00:35.260
It's like.
link |
02:00:39.260
Maybe he's onto something.
link |
02:00:41.260
Let me ask the big ridiculous question.
link |
02:00:43.260
We were kind of dancing around it, but what do you think is the meaning of this whole thing we have here?
link |
02:00:49.260
Of human civilization, of life on earth, but in general, just life.
link |
02:00:54.260
What's the meaning of life?
link |
02:00:57.260
See.
link |
02:00:58.260
Have you, did you read Novacene yet by James Lovelock?
link |
02:01:03.260
You're doing a lot of really good book recommendations here.
link |
02:01:06.260
I haven't even finished this.
link |
02:01:07.260
So I'm a huge fraud yet again.
link |
02:01:09.260
But like really early in the book, he says this amazing thing.
link |
02:01:14.260
Like I feel like everyone's so sad and cynical.
link |
02:01:16.260
Like everyone's like the Fermi paradox and everyone, I just keep hearing people being like, fuck, what if we're alone?
link |
02:01:21.260
Like, oh no.
link |
02:01:22.260
Like, and I'm like, okay, but like, wait, what if this is the beginning?
link |
02:01:26.260
Like in Novacene, he says, this is not going to be a correct quote.
link |
02:01:31.260
I can't like memorize quotes, but he says something like, what if our consciousness.
link |
02:01:38.260
Like right now, like this is the universe waking up.
link |
02:01:43.260
Like what if instead of discovering the universe.
link |
02:01:45.260
Like this is the universe.
link |
02:01:47.260
Like this is the evolution of the literal universe herself.
link |
02:01:51.260
Like we are not separate from the universe.
link |
02:01:53.260
Like this is the universe waking up.
link |
02:01:54.260
This is the universe seeing herself for the first time.
link |
02:01:57.260
Like this is
link |
02:01:58.260
The universe becoming conscious.
link |
02:02:00.260
For the first time, and we're part of that.
link |
02:02:02.260
Yeah.
link |
02:02:03.260
Cause it's like we aren't separate from the universe.
link |
02:02:05.260
Like this could be like an incredibly sacred moment and maybe like social media and all this things.
link |
02:02:10.260
The stuff where we're all getting connected together.
link |
02:02:13.260
Like maybe this, these are the neurons connecting of the like collective superintelligence that is, you know.
link |
02:02:21.260
Waking up.
link |
02:02:22.260
Yeah.
link |
02:02:23.260
Like, you know, it's like maybe instead of something cynical, or maybe if there's something to discover, like maybe this is just, you know, we're a blastocyst of like some incredible kind of consciousness or being.
link |
02:02:39.260
And just like in the first three years of life for human children, we'll forget about all the suffering that we're going through now.
link |
02:02:45.260
I think we'll probably forget about this.
link |
02:02:46.260
I mean, probably, you know, artificial intelligence will eventually render us obsolete.
link |
02:02:52.260
I don't think they'll do it in a malicious way, but I think probably we are very weak.
link |
02:02:57.260
The sun is expanding.
link |
02:02:58.260
Like, I don't know, like hopefully we can get to Mars, but like we're pretty vulnerable.
link |
02:03:03.260
And I, you know, like, I think we can coexist for a long time with AI and we can also probably make ourselves less vulnerable.
link |
02:03:11.260
But, you know, I just think consciousness, sentience, self awareness.
link |
02:03:17.260
Like, I think this might be the single greatest like moment in evolution ever.
link |
02:03:24.260
And like maybe this is, you know, the big, like the true beginning of life and we're just, we're the blue green algae or we're like, we're like the single celled organisms of something amazing.
link |
02:03:38.260
The universe awakens.
link |
02:03:39.260
And this is, this is it.
link |
02:03:40.260
Yeah.
link |
02:03:42.260
Well, see, you're an incredible person.
link |
02:03:45.260
You're a fascinating mind.
link |
02:03:47.260
You should definitely do it. Your friend Liv mentioned that you guys were thinking of maybe talking.
link |
02:03:52.260
I would love it if you explored your mind in this kind of media more and more by doing a podcast with her or just in any kind of way.
link |
02:03:59.260
So you're, you're an awesome person.
link |
02:04:01.260
And it's an honor to know you.
link |
02:04:03.260
It's an honor to get to sit down with you late at night, which is like surreal.
link |
02:04:08.260
And I really enjoyed it.
link |
02:04:09.260
Thank you for talking to me.
link |
02:04:10.260
Yeah, no, I mean, huge honor.
link |
02:04:11.260
I feel very underqualified to be here, but I'm a big fan.
link |
02:04:13.260
I've been listening to podcasts a lot.
link |
02:04:15.260
And yeah, me and Liv would appreciate any advice and help.
link |
02:04:18.260
And we're definitely going to do that.
link |
02:04:19.260
So anytime.
link |
02:04:22.260
Thank you.
link |
02:04:23.260
Cool.
link |
02:04:24.260
Thank you.
link |
02:04:25.260
Thanks for listening to this conversation with Grimes to support this podcast.
link |
02:04:28.260
Please check out our sponsors in the description.
link |
02:04:31.260
And now let me leave you with some words from Oscar Wilde.
link |
02:04:34.260
Yes, I'm a dreamer. For a dreamer is one who can only find her way by moonlight.
link |
02:04:41.260
And her punishment is that she sees the dawn before the rest of the world.
link |
02:04:46.260
Thank you for listening and hope to see you next time.