
Bryan Johnson: Kernel Brain-Computer Interfaces | Lex Fridman Podcast #186



00:00:00 The following is a conversation with Bryan Johnson, founder of Kernel, a company that has developed devices that can monitor and record brain activity. Previously, he was the founder of Braintree, a mobile payment company that acquired Venmo and then was acquired by PayPal and eBay. Quick mention of our sponsors: Four Sigmatic, NetSuite, Grammarly, and ExpressVPN. Check them out in the description to support this podcast.

00:00:27 As a side note, let me say that wearing the Kernel Flow brain interface at the beginning of this conversation was a fun and memorable experience, as you can see if you watch the video version of this episode. There was an Ubuntu Linux machine sitting next to me, collecting the data from my brain. The whole thing gave me hope that the mystery of the human mind will be unlocked in the coming decades, as we begin to measure signals from the brain in a high-bandwidth way. To understand the mind, we either have to build it or to measure it. Both are worth a try. Thanks to Bryan and the rest of the Kernel team for making this little demo happen.

00:01:06 This is the Lex Fridman Podcast, and here is my conversation with Bryan Johnson.
00:01:13 You ready, Lex?
00:01:14 Yes, I'm ready.
00:01:15 Do you guys wanna come in and put the interfaces on our heads? And then I will proceed to tell you a few jokes.
00:01:22 So we have two incredible pieces of technology, and a machine running Ubuntu 20.04, in front of us. What are we doing?
00:01:31 All right. Are these going on our heads?
00:01:33 They're going on our heads, yeah. And they will place them on our heads for proper alignment.
00:01:41 Does this support giant heads? Because I kind of have a giant head.
00:01:46 Are you saying that as, like, an ego thing, or are you saying physically? Both?
00:01:55 It's a nice massage.
00:01:57 Yes.
00:01:59 Okay, how does this feel? It's okay to move around?
00:02:04 Yeah. It feels... oh yeah. Hey, hey. This feels awesome. It's a pretty good fit.
00:02:12 Thank you.
00:02:13 That feels good. So this is big-head friendly.
00:02:17 It suits you well, Lex.
00:02:19 Thank you. I feel like when I wear this, I need to sound like Sam Harris: calm, collected, eloquent. I feel smarter, actually. I don't think I've ever felt quite as much like I'm part of the future as now.
00:02:38 Have you ever worn a brain interface or had your brain imaged?
00:02:42 Oh, never had my brain imaged. The only way I've analyzed my brain is by talking to myself and thinking. No direct data.
00:02:53 Yeah, that is definitely a brain interface that has a lot of blind spots.
00:02:59 It has some blind spots, yeah.
00:03:01 Psychotherapy.
00:03:02 That's right. All right, are we recording?
00:03:07 Yeah, we're good.
00:03:08 All right. So Lex, the objective of this: I'm going to tell you some jokes, and your objective is to not smile, at which, as a Russian, you should have an edge.
00:03:22 Make the motherland proud.
00:03:24 I gotcha. Okay. Let's hear the jokes.
00:03:29 Lex, and this is from the Kernel crew: we've been working on a device that can read your mind, and we would love to see your thoughts.
00:03:42 Is that the joke?
00:03:43 That's the opening.
00:03:44 Okay.
00:03:49 If I'm seeing the muscle activation correctly on your lips, you're not going to do well on this.
00:03:54 Let's see.
00:03:55 All right, here comes the first...
00:03:56 I'm screwed.
00:03:57 Here comes the first one.
00:03:58 Is this going to break the device? Is it resilient to laughter?
00:04:07 Lex, what goes through a potato's brain?
00:04:09 I can't. I already failed. That's the hilarious opener. Okay. What?
00:04:19 Tater thoughts.
00:04:24 What kind of fish performs brain surgery?
00:04:27 I don't know.
00:04:29 A neural surgeon.
00:04:35 And so we're getting data of everything that's happening in my brain right now?
00:04:39 Lifetime, yeah. We're getting activation patterns of your entire cortex.
00:04:45 I'm going to try to do better. I'll edit out all the parts where I laughed and Photoshop a serious face over me.
00:04:50 You can recover.
00:04:51 Yeah, all right.
00:04:53 Lex, what do scholars eat when they're hungry?
00:04:57 I don't know, what?
00:04:58 Academia nuts.
00:05:03 That was a pretty good one.
00:05:05 So what we'll do is... so, you're wearing Kernel Flow, which is an interface built using a technology called spectroscopy. It's similar to the wearables we wear on the wrist, using light (so, using LiDAR, as you know), and we're using that to do functional imaging of brain activity. As your neurons fire electrically and chemically, it creates changes in blood oxygenation levels, and we're measuring that.
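The blood-oxygenation measurement described here is, in optical brain imaging generally, commonly computed with the modified Beer-Lambert law: attenuation changes measured at two wavelengths are inverted into oxy- and deoxy-hemoglobin concentration changes. Below is a minimal sketch of that inversion; the wavelengths, extinction coefficients, and path length are illustrative assumptions, not Kernel's actual calibration or processing.

```python
# Modified Beer-Lambert law (sketch): delta_OD = eps * delta_c * L_eff.
# Invert a two-wavelength system for [dHbO, dHbR]. All numbers illustrative.

delta_od = [0.012, 0.008]          # attenuation change at ~760 nm and ~850 nm

eps = [[1.4, 3.8],                 # [HbO, HbR] extinction at 760 nm (illustrative)
       [2.5, 1.8]]                 # [HbO, HbR] extinction at 850 nm (illustrative)
path_length = 6.0                  # effective optical path length, cm (assumed)

# Scale coefficients by path length, then solve the 2x2 system by Cramer's rule.
a = [[e * path_length for e in row] for row in eps]
det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
d_hbo = (delta_od[0] * a[1][1] - a[0][1] * delta_od[1]) / det
d_hbr = (a[0][0] * delta_od[1] - delta_od[0] * a[1][0]) / det
print(d_hbo, d_hbr)
```

The two-wavelength design is what makes this solvable: oxygenated and deoxygenated hemoglobin absorb differently at each wavelength, so two attenuation measurements pin down both concentration changes.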
00:05:38 And so, in the reconstructions we do for you, you'll see the activation patterns in your brain throughout this entire time we were wearing it: the reaction to the jokes, and as we were sitting here talking. And we're moving towards a real-time feed of your cortical brain activity.
00:05:56 So there's a bunch of things that are in contact with my skull right now. How many of them are there, and what are they? What are the actual sensors?
00:06:07 There are 52 modules, and each module has one laser and six sensors. The lasers fire in about 100 picoseconds, and then the photons scatter and absorb in your brain: a bunch go in, a few come back out. We sense those photons, and then we do the reconstruction of the activity. Overall, there are about a thousand-plus channels sampling your activity.
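As a rough sanity check on how 52 modules with one laser and six sensors each can yield a thousand-plus channels, one can count source-detector pairs. This is a back-of-envelope sketch; the dual-wavelength sources and cross-module coupling assumed below are illustrative guesses, not Kernel's published optode layout.

```python
# Back-of-envelope channel count for a modular optical headset:
# 52 modules, 1 laser and 6 detectors each (figures from the conversation).
# Dual wavelengths and neighbor coupling are illustrative assumptions.
modules = 52
lasers_per_module = 1
detectors_per_module = 6
wavelengths = 2            # assumption: each laser emits at two wavelengths
coupled_neighbors = 1      # assumption: detectors also see one neighbor's laser

within_module = modules * lasers_per_module * detectors_per_module * wavelengths
cross_module = modules * coupled_neighbors * detectors_per_module * wavelengths
total_channels = within_module + cross_module
print(total_channels)
```

Under these assumptions the count lands in the low thousands, consistent with the "thousand plus" figure; the exact number depends on how many neighboring modules each detector can see.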
00:06:35 How difficult is it to make it as comfortable as it is? Because it's surprisingly comfortable. I would not think that something measuring brain activity would be comfortable, but it is.
00:06:51 I agree.
00:06:52 In fact, I want to take this home.
00:06:52 Yeah, yeah, that's right. So people are accustomed to being in big systems like fMRI, where there are 120-decibel sounds and you're in a claustrophobic encasement, or EEG, which is just painful, or surgery. And so, yes, I agree that this is a convenient option: something you can just put on your head that measures your brain activity in the contextual environment you choose. So whether we want to wear it during a podcast, at home, or in a business setting, it's the freedom to record your brain activity in the setting that you choose.
00:07:29 Yeah, but from an engineering perspective, what is it? There's a bunch of different modular parts, and there's like a rubber-band thing where they mold to the shape of your head.
00:07:40 That's right. We built this version of the mechanical design to accommodate most adult heads.
00:07:49 But I have a giant head, and it fits fine.
00:07:51 It fits well, actually.
00:07:53 So I don't think I have an average head.
00:07:55 Okay, maybe I feel much better about my head now. Maybe I'm more average than I thought.
00:08:06 Okay, so what else is there that's interesting you could say while it's on our heads? I can keep this on the whole time. This is kind of awesome. And it's amazing for me, as a fan of Ubuntu (I use Ubuntu MATE; you guys use that too), to have code running to the side, measuring stuff and collecting data. I feel much more important now that my data is being recorded. You know when you have a good friend that listens to you, that actually is listening to you? This is what I feel like: a much better friend, because it's accurately listening to me. Ubuntu.
00:08:47 What a cool perspective. I hadn't thought about that: of feeling understood.
00:08:53 Heard.
00:08:54 Yeah, heard. Heard deeply by the mechanical system that is recording your brain activity, versus the human that you're engaging with. Your mind immediately goes to the dimensionality and depth of understanding of this software system, which you're intimately familiar with, and now you're able to communicate with this system in ways that you couldn't before.
00:09:19 Yeah, I feel heard.
00:09:24 Yeah. I guess what's interesting about this is that your intuitions are spot on. Most people have intuitions about brain interfaces that they've grown up with: this idea of people moving cursors on the screen, or typing, or changing the channel, or skipping a song. It's primarily been anchored on control. And I think the more relevant understanding of brain interfaces, or neuroimaging, is that it's a measurement system. Once you have numbers for a given thing, a seemingly endless number of possibilities emerges around what to do with those numbers.
00:09:57 So before you tell me about the possibilities: this was an incredible experience. I could keep this on for another two hours, but I'm being told that, for a bunch of reasons, just because we probably wanna keep the data small and visualize it nicely for the final product, we wanna cut this off and take this amazing helmet away from me. So Bryan, thank you so much for this experience, and let's continue on, helmetless.
00:10:25 All right.
00:10:26 So, that was an incredible experience. Can you maybe speak to what kind of opportunities that stream of data opens up, that rich stream of data from the brain?
00:10:35 First, I'm curious: what is your reaction? What comes to mind when you put that on your head? What does it mean to you, what possibilities emerge, and what significance might it have? I'm curious where your orientation is at.
00:10:50 Well, for me, I'm really excited by the possibility of various information about my body and my mind being converted into data, such that the data can be used to create products that make my life better. That, to me, is a really exciting possibility. Even just a Fitbit that measures some very basic things about your body is really cool, but the bandwidth and resolution of that information are very crude, so it's not very interesting. The possibility of building a data set that comes in a clean way and a high-bandwidth way from my brain opens up all kinds of things. I was kind of joking when we were talking, but not really: I feel heard, in the sense that it feels like the full richness of the information coming from my mind is actually being recorded by the machine. I can't quite put it into words, but there is, genuinely for me (and this is not some kind of joke about me being a robot), a feeling of being heard in a way that's going to improve my life, as long as the thing on the other end can do something useful with that data.
00:12:17 But even the recording itself is something. Once you record, it's like taking a picture: that moment is forever saved in time. Now, a picture cannot allow you to step back into that world, but perhaps a recording of your brain, a much higher-resolution and much more personal recording of that information than a picture, would allow you to step back into where you were at that particular moment in history, and then map out a certain trajectory to tell you certain things about yourself. That could open up all kinds of applications. Of course there's health, which I consider, but honestly, to me, the exciting thing is just being heard: my state of mind, my level of focus, all those kinds of things.
00:13:10 What I heard you say is: you have an entirety of lived experience, some of which you can communicate in words and in body language, and some of which you feel internally, which cannot be captured in those communication modalities. And this measurement system captures both, including the things you can try to articulate in words, maybe in a lower-dimensional space, using one word, for example, to communicate focus, when it really may be represented in a 20-dimensional space of this particular kind of focus. That information is being captured, so it's a closer representation of the entirety of your experience, captured in a dynamic fashion, rather than just a static image of your conscious experience.
00:13:53 Yeah, that's the promise, and that was the feeling. It felt like the future, so it was a pretty cool experience. And from the mechanical perspective, it was cool to have an actual device that feels pretty good and doesn't require me to go into a lab. The other thing I was feeling: there's a guy named Andrew Huberman, a friend of mine, who has an amazing podcast that people should listen to, the Huberman Lab Podcast. We're working on a paper together about eye movement and so on. He's a neuroscientist and I'm a data person, a machine learning person, and we're both excited by how much the data, the measurements of the human mind and the brain and all the different metrics that come from them, could be used to understand human beings in a rigorous, scientific way. So the other thing I was thinking about is how this could be turned into a tool for science. Not just personal science, not just Fitbit-style "how am I doing on my personal metrics of health," but larger-scale studies of human behavior: data not at the scale of an individual, but at the scale of many individuals, a large number of individuals. So the personal being heard was exciting, and it's also exciting for science. There's a very powerful thing in it being so easy to just put on: you could scale much more easily.
00:15:28 If you think about that second thing you said, about the science of the brain: we, the human race, have done a pretty good job figuring out how to quantify the things around us, from distant stars to calories and steps and our genome. We can measure and quantify pretty much everything in the known universe except for our minds. We can do these one-offs, if we're going to get an fMRI scan or do something with a low-res EEG system, but we haven't done this at population scale. And if you think about human thought, human cognition is probably the single largest raw input material into society at any given moment, through our conversations with ourselves and with other people. We have this raw input that we haven't been able to measure yet. When I think about it through that frame, it's remarkable. It's almost like we live in this wild, wild west of unquantified communication within ourselves and between each other, when everything else has been grounded in measurement. For example, I know that if I buy an appliance at the store or on a website, I don't need to look at the measurements on the appliance and make sure it can fit through my door. That's an engineered system of appliance manufacturing and construction; everyone has agreed upon engineering standards. We don't have engineering standards around cognition. It has not entered as a formal engineering discipline that enables us to scaffold it into society with everything else we're doing, including consuming news, our relationships, politics, economics, education, all of the above. And so, to me, the most significant contribution that Kernel's technology has to offer would be the introduction of the formal engineering of cognition as it relates to everything else in society.
00:17:45 I love that idea: that you think of this ocean of data coming from people's brains as being, in a crude way, reduced down to tweets and texts and so on, a very hardcore compression of the actual raw data. But maybe you can comment, because you're using the word cognition. I think the first step is to get the brain data, but is there a leap to be taken toward interpreting that data in terms of cognition? Is your idea basically that you need to start collecting data at scale from the brain, and then we start to be able to take little steps along the path to actually measuring some deep sense of cognition? Because, as I'm sure you know, we understand a few things, but we don't understand most of what makes up cognition.
00:18:47 This has been one of the most significant challenges of building Kernel, and Kernel wouldn't exist if I hadn't been able to fund it initially by myself. Because when I engage in conversations with investors, the immediate thought is: what is the killer app? And of course I understand that heuristic; what they're looking to do is de-risk. Is the product solved? Is there a customer base? Are people willing to pay for it? How does it compare to competing options? In the case of brain interfaces, when I started the company, there was no known path to even build a technology that could potentially become mainstream. Once we figured out the technology, we could commence having conversations with investors, and it became: what is the killer app? So I funded the first $53 million for the company, and to raise the first round of funding we did, I spoke to 228 investors. One said yes. It was remarkable, and it was mostly around this concept of what the killer app is. So internally, the way we think about it is that we think of the go-to-market strategy much more like the Drake equation. If we can build technology that has the right characteristics (the data quality is high enough to meet a certain threshold; cost, accessibility, comfort; it can be worn in contextual environments), if it meets the criteria of being a mass-market device, then the responsibility we have is to figure out how to create the algorithm that enables humans to then find value with it.
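The Drake-equation framing described here is multiplicative: a device is mass-market viable only if every factor clears its threshold, because the product collapses when any single factor is near zero. A toy sketch of that structure follows; the factor names and probability values are made up for illustration and are not Kernel's actual criteria.

```python
# Toy Drake-equation-style model of the mass-market criteria described above.
# Each value is a made-up probability that a factor clears its threshold.
factors = {
    "data_quality": 0.9,
    "cost": 0.7,
    "accessibility": 0.8,
    "comfort": 0.9,
    "wearable_in_context": 0.8,
}

p_viable = 1.0
for name, p in factors.items():
    p_viable *= p

# The product is always below the weakest factor: one near-zero term
# (say, comfort) sinks the whole device, no matter how good the rest are.
print(round(p_viable, 4))
```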
00:20:32 So the analogy is that brain interfaces are like the early '90s of the internet: you wanna populate an ecosystem with a certain number of devices, you want a certain number of people who play around with them, who do experiments with certain data-collection parameters, and you want to encourage certain mistakes from experts and non-experts. These are all critical elements that ignite discovery. And so we believe we've accomplished the first objective, of building technology that reaches those thresholds. Now it's the Drake equation component: how do we try to generate 20 years of value discovery in a two- or three-year time period? How do we compress that?
00:21:14 So just to clarify, when you say the Drake equation (which, for people who don't know... though I don't know how you wouldn't know what the Drake equation is if you listen to this podcast, since I bring up aliens in every single conversation), you mean the killer app would be like the one alien civilization in that equation. Meaning, this is a search for an application that's impactful, transformative. By the way, we need to come up with a better term than "killer app."
00:21:41 It's also violent, right?
00:21:43 It's very violent. You could go with "viral app"; that's horrible too, right? It's some very inspiringly impactful application. How about that?
00:21:52 No. Yeah. Okay, so we'll stick with killer app, that's fine.
00:21:55 But I concur with you. I dislike the chosen words for capturing the concept.
00:22:02 You know, it's one of those sticky terms that is effective to use in the tech world. But when you become a communicator outside of the tech world, especially when you're talking about software and hardware and artificial intelligence applications, it sounds horrible.
00:22:18 Yeah, it's interesting. I actually regret now having used that word in this conversation, because it's something I would not normally do. I used it in order to create a bridge of shared understanding, of what terminology others would use.
00:22:32 Yeah.
00:22:33 But yeah, I concur. Let's go with "impactful application." Or...
00:22:39 Just value creation.
00:22:40 Value creation. Something people love using.
00:22:43 There we go, that's it.
00:22:45 Love app.
00:22:46 Okay. So, do you have any ideas? You're basically creating a framework where there's the possibility of the discovery of an application that people love using. Do you have ideas?
00:22:59 We've begun to play a fun game internally: when we have these discussions and begin circling around this concept of "does anybody have an idea, does anyone have intuitions," and we see the conversation starting to veer in that direction, we flag it and say, "human intuition alert, stop it." So we really want to focus on the algorithm. There's a natural process of human discovery: when you populate a system with devices and you give people the opportunity to play around with them in expected and unexpected ways, we think that is a much better system of discovery than us exercising our intuitions. And it's interesting, we're also seeing this in a few neuroscientists who have been talking to us. I was speaking to one young associate professor, and I approached the conversation and said, "Hey, we have these five data streams that we're pulling off. When you hear that, what weighted value do you assign to each data source? Which one do you think is going to be valuable for your objectives, and which one is not?" And he said, "I don't care, just give me the data. All I care about is my machine learning model." But importantly, he did not have a theory of mind. He did not come to the table and say, "I think the brain operates in this way, for these reasons, and has these functions." He didn't care; he just wanted the data. We're seeing more and more that certain people are devaluing human intuitions, for good reasons, as we've seen in machine learning over the past couple of years. And we're doing the same in our value-creation market strategy.
link |
00:24:36.020
So collect more data, clean data,
link |
00:24:40.360
make the product such that the collection of data
link |
00:24:42.960
is easy and fun and then the rest will just spring to life.
link |
00:24:47.960
Through humans playing around with them.
link |
00:24:50.540
Our objective is to create the most valuable
link |
00:24:54.020
data collection system of the brain ever.
link |
00:25:01.340
And with that, then applying all the best tools
link |
00:25:05.780
of machine learning and other techniques
link |
00:25:09.500
to extract out, to try to find insight.
link |
00:25:13.780
But yes, our objective is really to systematize
link |
00:25:16.220
the discovery process because we can't put
link |
00:25:19.900
definite timeframes on discovery.
link |
00:25:22.020
The brain is complicated and science
link |
00:25:25.820
is not a business strategy.
link |
00:25:28.140
And so we really need to figure out how to,
link |
00:25:31.580
this is the difficulty of bringing technology
link |
00:25:34.060
like this to market.
link |
00:25:34.900
And it's why most of the time it just languishes
link |
00:25:38.300
in academia for quite some time.
link |
00:25:40.980
But we hope that we will cross over
link |
00:25:44.060
and make this mainstream in the coming years.
link |
00:25:48.960
The thing was cool to wear, but are you chasing
link |
00:25:53.240
a good reason for millions of people to put this
link |
00:25:56.880
on their head and keep on their head regularly?
link |
00:26:00.860
Is there, like who's going to discover that reason?
link |
00:26:04.740
Is it going to be people just kind of organically
link |
00:26:08.060
or is there going to be an Angry Birds style application
link |
00:26:12.520
that's just too exciting to not use?
link |
00:26:18.140
If I think through the things that have changed
link |
00:26:20.940
my life most significantly over the past few years,
link |
00:26:23.500
when I started wearing a wearable on my wrist
link |
00:26:27.140
that would give me data about my heart rate,
link |
00:26:29.100
heart rate variability, respiration rate,
link |
00:26:33.680
metabolic approximations, et cetera,
link |
00:26:37.180
for the first time in my life,
link |
00:26:38.660
I had access to information, sleep patterns
link |
00:26:41.780
that were highly impactful.
link |
00:26:43.220
They told me, for example, if I eat close to bedtime,
link |
00:26:48.620
I'm not going to get deep sleep.
link |
00:26:50.840
And not getting deep sleep means you have
link |
00:26:52.760
all these follow on consequences in life.
link |
00:26:54.420
And so it opened up this window of understanding of myself
link |
00:26:59.500
that I cannot self introspect and deduce these things.
link |
00:27:03.180
This is information that was available to be acquired,
link |
00:27:06.180
but it just wasn't.
link |
00:27:07.060
I would have to get an expensive sleep study,
link |
00:27:08.840
then it's an N of one, like one night,
link |
00:27:10.460
and that's not good enough to look at, to run all my trials.
link |
00:27:12.500
And so if you look just at the information
link |
00:27:15.340
that one can acquire on their wrist,
link |
00:27:18.220
and now you're applying it to the entire cortex
link |
00:27:20.180
on the brain and you say,
link |
00:27:22.280
what kind of information could we acquire?
link |
00:27:25.020
It opens up a whole new universe of possibilities.
link |
00:27:28.360
For example, we did this internal study at Kernel
link |
00:27:30.140
where I wore a prototype device
link |
00:27:32.540
and we were measuring the cognitive effects of sleep.
link |
00:27:36.180
So I had a device measuring my sleep.
link |
00:27:38.420
I performed with 13 of my coworkers.
link |
00:27:41.740
We performed four cognitive tasks over 13 sessions.
link |
00:27:45.740
And we focused on reaction time, impulse control,
link |
00:27:49.860
short term memory, and then a resting state task.
link |
00:27:52.780
And with mine, we found, for example,
link |
00:27:55.140
that my impulse control was independently correlated
link |
00:28:00.460
with my sleep outside of behavioral measures
link |
00:28:02.320
of my ability to play the game.
link |
00:28:04.100
The point of the study was I had,
link |
00:28:07.540
the brain study I did at Kernel confirmed my life experience
link |
00:28:12.060
that if I, my deep sleep determined whether or not
link |
00:28:18.040
I would be able to resist temptation the following day.
link |
00:28:21.580
And my brain did show that as one example.
link |
00:28:24.400
And so if you start thinking,
link |
00:28:25.700
if you actually have data on yourself,
link |
00:28:27.900
on your entire cortex and you can control the settings,
link |
00:28:32.180
I think there's probably a large number of things
link |
00:28:36.860
that we could discover about ourselves,
link |
00:28:37.980
very, very small and very, very big.
link |
00:28:39.860
I just, for example, like when you read news,
link |
00:28:43.580
what's going on?
link |
00:28:44.820
Like when you use social media, when you use news,
link |
00:28:47.620
like all the ways we allocate attention.
link |
00:28:51.340
That's right.
link |
00:28:52.180
With the computer.
link |
00:28:53.000
I mean, that seems like a compelling place
link |
00:28:54.540
to where you would want to put on a Kernel,
link |
00:28:58.260
by the way, what is it called?
link |
00:28:59.180
Kernel Flux, Kernel, like what?
link |
00:29:00.860
Flow.
link |
00:29:01.780
Flow.
link |
00:29:02.620
We have two technologies, Flux or Flow.
link |
00:29:04.260
Flow, okay.
link |
00:29:05.540
So when you put on the Kernel Flow,
link |
00:29:11.020
it seems like to be a compelling time and place to do it
link |
00:29:16.300
is when you're behind a desk, behind a computer.
link |
00:29:18.660
Because you could probably wear it
link |
00:29:19.660
for prolonged periods of time as you're taking in content.
link |
00:29:23.740
And there could be a lot of,
link |
00:29:25.300
because so much of our lives happens
link |
00:29:28.220
in the digital world now.
link |
00:29:29.960
That kind of coupling the information about the human mind
link |
00:29:34.620
with the consumption and the behaviors in the digital world
link |
00:29:39.540
might give us a lot of information about the effects
link |
00:29:42.100
of the way we behave and navigate the digital world
link |
00:29:45.120
to the actual physical meat space effects on our body.
link |
00:29:50.580
It's interesting to think,
link |
00:29:51.420
so in terms of both like for work,
link |
00:29:54.760
I'm a big fan of Cal Newport, his ideas of deep work
link |
00:29:59.740
that I spend, with few exceptions,
link |
00:30:05.020
I try to spend the first two hours of every day,
link |
00:30:07.860
usually if I'm like at home and have nothing on my schedule
link |
00:30:11.340
is going to be up to eight hours of deep work,
link |
00:30:15.020
of focus, zero distraction.
link |
00:30:16.740
And for me to analyze, I mean I'm very aware
link |
00:30:20.060
of the waning of that, the ups and downs of that.
link |
00:30:24.780
And it's almost like you're surfing the ups and downs
link |
00:30:27.940
of that as you're doing programming,
link |
00:30:29.440
as you're doing thinking about particular problems,
link |
00:30:32.580
you're trying to visualize things in your mind,
link |
00:30:34.260
you start trying to stitch them together.
link |
00:30:37.300
You're trying to, when there's a dead end about an idea,
link |
00:30:40.960
you have to kind of calmly like walk back and start again,
link |
00:30:45.520
all those kinds of processes.
link |
00:30:47.260
It'd be interesting to get data
link |
00:30:49.020
on what my mind is actually doing.
link |
00:30:51.100
And also recently started doing,
link |
00:30:53.360
I just talked to Sam Harris a few days ago
link |
00:30:55.940
and been building up to that.
link |
00:30:58.080
I started using, started meditating using his app,
link |
00:31:01.900
Waking Up, I very much recommend it.
link |
00:31:05.340
It'd be interesting to get data on that
link |
00:31:07.500
because it's, you're very, it's like you're removing
link |
00:31:10.540
all the noise from your head and you very much,
link |
00:31:13.740
it's an active process of active noise removal,
link |
00:31:18.340
active noise canceling like the headphones.
link |
00:31:21.220
And it'd be interesting to see what is going on in the mind
link |
00:31:24.900
before the meditation, during it and after,
link |
00:31:28.400
all those kinds of things.
link |
00:31:29.240
And all of your examples, it's interesting
link |
00:31:31.280
that everyone who's designed an experience for you,
link |
00:31:35.400
so whether it be the meditation app or the Deep Work
link |
00:31:38.520
or all the things you mentioned,
link |
00:31:40.400
they constructed this product
link |
00:31:43.280
with a certain number of knowns.
link |
00:31:45.920
Yeah.
link |
00:31:47.040
Now, what if we expanded the number of knowns by 10X
link |
00:31:50.580
or 20X or 30X, they would reconstruct their product
link |
00:31:54.240
or incorporate those knowns.
link |
00:31:55.480
So it'd be, and so this is the dimensionality
link |
00:31:58.520
that I think is the promising aspect
link |
00:32:00.320
is that people will be able to use this quantification,
link |
00:32:04.600
use this information to build more effective products.
link |
00:32:09.260
And this is, I'm not talking about better products
link |
00:32:11.360
to advertise to you or manipulate you.
link |
00:32:13.960
I'm talking about our focus is helping people,
link |
00:32:17.820
individuals have this contextual awareness
link |
00:32:20.360
and this quantification and then to engage with others
link |
00:32:23.040
who are seeking to improve people's lives,
link |
00:32:26.060
that the objective is betterment across ourselves,
link |
00:32:31.360
individually and also with each other.
link |
00:32:33.480
Yeah, so it's a nice data stream to have
link |
00:32:35.000
if you're building an app,
link |
00:32:36.120
like if you're building a podcast listening app,
link |
00:32:38.040
it would be nice to know data about the listener
link |
00:32:40.320
so that like if you're bored or you fell asleep,
link |
00:32:42.820
maybe pause the podcast, it's like really dumb,
link |
00:32:46.000
just very simple applications
link |
00:32:48.400
that could just improve the quality of the experience
link |
00:32:50.640
of using the app.
link |
00:32:52.000
I'm imagining if you have your neural, this is Lex
link |
00:32:56.680
and there's a statistical representation of you
link |
00:32:59.880
and you engage with the app and it says,
link |
00:33:02.540
Lex, you're best to engage with this meditation exercise
link |
00:33:09.560
in the following settings.
link |
00:33:11.040
At this time of day, after eating this kind of food
link |
00:33:14.020
or not eating, fasting with this level of blood glucose
link |
00:33:17.380
and this kind of night's sleep.
link |
00:33:19.680
But all these data combined
link |
00:33:23.000
to give you this contextually relevant experience,
link |
00:33:26.600
just like we do with our sleep.
link |
00:33:27.800
You've optimized your entire life based upon
link |
00:33:30.360
what information you can acquire and know about yourself.
link |
00:33:34.240
And so the question is, how much do we really know
link |
00:33:37.200
of the things going around us?
link |
00:33:38.480
And I would venture to guess in my own life experience,
link |
00:33:41.520
I capture, my self awareness captures an extremely small
link |
00:33:45.920
percent of the things that actually influence
link |
00:33:47.760
my conscious and unconscious experience.
link |
00:33:50.500
Well, in some sense, the data would help encourage you
link |
00:33:54.560
to be more self aware, not just because you trust everything
link |
00:33:57.520
the data is saying, but it'll give you a prod
link |
00:34:02.820
to start investigating.
link |
00:34:04.800
Like I would love to get like a rating,
link |
00:34:08.040
like a ranking of all the things I do
link |
00:34:11.160
and what are the things, it's probably important to do
link |
00:34:13.800
without the data, but the data will certainly help.
link |
00:34:16.280
It's like rank all the things you do in life
link |
00:34:19.840
and which ones make you feel shitty,
link |
00:34:21.800
which ones make you feel good.
link |
00:34:23.600
Like you're talking about evening, Bryan.
link |
00:34:26.460
Like this is a good example, somebody like,
link |
00:34:30.320
I do pig out at night as well.
link |
00:34:32.680
And it never makes me feel good.
link |
00:34:35.840
Like you're in a safe space.
link |
00:34:37.400
This is a safe space, let's hear it.
link |
00:34:40.440
No, I definitely have much less self control
link |
00:34:42.720
at night and it's interesting.
link |
00:34:44.400
And the same, people might criticize this,
link |
00:34:47.760
but I know my own body.
link |
00:34:50.000
I know when I eat carnivores, just eat meat,
link |
00:34:52.800
I feel much better than if I eat more carbs.
link |
00:35:00.120
The more carbs I eat, the worse I feel.
link |
00:35:02.300
I don't know why that is.
link |
00:35:04.160
There is science supporting it,
link |
00:35:05.260
but I'm not leaning on science.
link |
00:35:06.640
I'm leaning on personal experience
link |
00:35:07.720
and that's really important.
link |
00:35:09.360
I don't need to read, I'm not gonna go on a whole rant
link |
00:35:12.560
about nutrition science, but many of those studies
link |
00:35:15.680
are very flawed.
link |
00:35:17.160
They're doing their best, but nutrition science
link |
00:35:19.420
is a very difficult field of study
link |
00:35:21.680
because humans are so different
link |
00:35:24.200
and the mind has so much impact
link |
00:35:26.120
on the way your body behaves.
link |
00:35:28.440
And it's so difficult from a scientific perspective
link |
00:35:30.800
to conduct really strong studies
link |
00:35:32.760
that you have to be almost like a scientist of one
link |
00:35:36.960
to do these studies on yourself.
link |
00:35:39.020
That's the best way to understand what works for you or not.
link |
00:35:41.560
And I don't understand why, because it sounds unhealthy,
link |
00:35:44.560
but eating only meat always makes me feel good.
link |
00:35:47.440
Just eat meat, that's it.
link |
00:35:49.680
And I don't have any allergies, any of that kind of stuff.
link |
00:35:52.880
I'm not full like Jordan Peterson,
link |
00:35:54.520
where if he deviates a little bit from the carnivore diet,
link |
00:36:01.360
he goes off the cliff.
link |
00:36:03.200
No, I can have chocolate, I can go off the diet,
link |
00:36:06.320
I feel fine, it's a gradual worsening of how I feel.
link |
00:36:14.920
But when I eat only meat, I feel great.
link |
00:36:17.080
And it'd be nice to be reminded of that.
link |
00:36:19.280
Like there's a very simple fact
link |
00:36:21.920
that I feel good when I eat carnivore.
link |
00:36:24.200
And I think that repeats itself in all kinds of experiences.
link |
00:36:27.560
Like I feel really good when I exercise.
link |
00:36:32.560
I hate exercise, but in the rest of the day,
link |
00:36:39.760
the impact it has on my mind and the clarity of mind
link |
00:36:43.640
and the experiences and the happiness
link |
00:36:45.800
and all those kinds of things, I feel really good.
link |
00:36:48.040
And to be able to concretely express that through data
link |
00:36:52.440
would be nice.
link |
00:36:53.800
It would be a nice reminder, almost like a statement,
link |
00:36:55.840
like remember what feels good and whatnot.
link |
00:36:58.280
And there could be things like that,
link |
00:37:01.800
I'm not, many things that you're suggesting
link |
00:37:04.560
that I could not be aware of,
link |
00:37:07.000
that might be sitting right in front of me
link |
00:37:09.080
that make me feel really good and make me feel not good.
link |
00:37:12.160
And the data would show that.
link |
00:37:14.000
I agree with you.
link |
00:37:14.840
I've actually employed the same strategy.
link |
00:37:17.320
I fired my mind entirely from being responsible
link |
00:37:21.480
for constructing my diet.
link |
00:37:23.240
And so I started doing a program
link |
00:37:24.600
where I now track over 200 biomarkers every 90 days.
link |
00:37:28.680
And it captures, of course, the things you would expect
link |
00:37:31.800
like cholesterol, but also DNA methylation
link |
00:37:34.360
and all kinds of things about my body,
link |
00:37:37.120
all the processes that make up me.
link |
00:37:39.360
And then I let that data generate the shopping lists.
link |
00:37:43.280
And so I never actually ask my mind what it wants.
link |
00:37:45.560
It's entirely what my body is reporting that it wants.
link |
00:37:48.360
And so I call this goal alignment within Bryan.
link |
00:37:51.760
And there's 200 plus actors
link |
00:37:53.480
that I'm currently asking their opinion of.
link |
00:37:55.240
And so I'm asking my liver, how are you doing?
link |
00:37:57.800
And it's expressing via the biomarkers.
link |
00:38:00.200
And so that I construct that diet
link |
00:38:02.280
and I only eat those foods until my next testing round.
link |
00:38:06.160
And that has changed my life more than I think anything else
link |
00:38:10.840
because in the demotion of my conscious mind
link |
00:38:15.920
that I gave primacy to my entire life,
link |
00:38:18.000
it led me astray because like you were saying,
link |
00:38:20.480
the mind then goes out into the world
link |
00:38:22.520
and it navigates the dozens
link |
00:38:25.720
of different dietary regimens people put together in books.
link |
00:38:29.080
And they all have their supporting science
link |
00:38:32.760
in certain contextual settings, but it's not N of one.
link |
00:38:35.720
And like you're saying, this dietary really is an N of one.
link |
00:38:39.440
What people have published scientifically of course
link |
00:38:41.920
can be used for nice groundings,
link |
00:38:46.480
but it changes when you get to an N of one level.
link |
00:38:48.580
And so that's what gets me excited about brain interfaces
link |
00:38:51.680
is if I could do the same thing for my brain
link |
00:38:54.640
where I can stop asking my conscious mind for its advice
link |
00:38:58.000
or for its decision making, which is flawed.
link |
00:39:01.520
And I'd rather just look at this data
link |
00:39:03.520
and I've never had better health markers in my life
link |
00:39:05.640
than when I stopped actually asking myself
link |
00:39:08.000
to be in charge of it.
link |
00:39:09.680
The idea of demotion of the conscious mind
link |
00:39:15.040
is such a sort of engineering way of phrasing meditation.
link |
00:39:21.680
That's what we're doing, right?
link |
00:39:22.840
That's beautiful, that was really beautifully put.
link |
00:39:26.120
By the way, testing round, what does that look like?
link |
00:39:28.560
What's that?
link |
00:39:29.900
Well, you mentioned.
link |
00:39:31.360
Yeah, the test I do.
link |
00:39:33.240
Yes.
link |
00:39:34.080
So it includes a complete blood panel.
link |
00:39:37.560
I do a microbiome test.
link |
00:39:39.040
I do a diet induced inflammation.
link |
00:39:43.560
So I look for cytokine expressions.
link |
00:39:45.440
So foods that produce inflammatory reactions.
link |
00:39:48.400
I look at my neuroendocrine systems.
link |
00:39:50.120
I look at all my neurotransmitters.
link |
00:39:53.000
I do, yeah, there's several micronutrient tests
link |
00:39:57.440
to see how I'm looking at the various nutrients.
link |
00:39:59.400
What about self report of how you feel?
link |
00:40:04.500
Almost like, you can't demote your,
link |
00:40:09.160
you still exist within your conscious mind, right?
link |
00:40:12.520
So that lived experience is of a lot of value.
link |
00:40:16.280
So how do you measure that?
link |
00:40:17.880
I do a temporal sampling over some duration of time.
link |
00:40:20.920
So I'll think through how I feel over a week,
link |
00:40:23.720
over a month, over three months.
link |
00:40:25.760
I don't do a temporal sampling of
link |
00:40:27.540
if I'm at the grocery store in front of a cereal box
link |
00:40:29.680
and be like, you know what, Cap'n Crunch
link |
00:40:31.800
is probably the right thing for me today
link |
00:40:33.320
because I'm feeling like I need a little fun in my life.
link |
00:40:36.560
And so it's a temporal sampling.
link |
00:40:37.640
If the data set's large enough,
link |
00:40:38.840
then I smooth out the function of my natural oscillations
link |
00:40:42.200
of how I feel about life where some days I may feel upset
link |
00:40:45.360
or depressed or down or whatever.
link |
00:40:47.360
And I don't want those moments
link |
00:40:48.920
to then rule my decision making.
link |
00:40:50.480
That's why the demotion happens.
link |
00:40:52.480
And it says, really, if you're looking at
link |
00:40:54.160
health over a 90 day period of time,
link |
00:40:56.520
all my 200 voices speak up on that interval.
link |
00:41:00.040
And they're all given voice to say,
link |
00:41:02.360
this is how I'm doing and this is what I want.
link |
00:41:04.600
And so it really is an accounting system for everybody.
link |
00:41:07.120
So that's why I think that if you think about
link |
00:41:10.140
the future of being human,
link |
00:41:11.740
there's two things I think that are really going on.
link |
00:41:17.820
One is the design, manufacturing,
link |
00:41:20.020
and distribution of intelligence
link |
00:41:22.860
is heading towards zero on a cost curve
link |
00:41:26.740
over a certain design, over a certain timeframe
link |
00:41:30.180
that our ability to, you know, evolution produced us,
link |
00:41:33.140
an intelligent form of intelligence.
link |
00:41:35.580
We are now designing our own intelligence systems
link |
00:41:38.580
and the design, manufacturing, distribution
link |
00:41:40.140
of that intelligence over a certain timeframe
link |
00:41:42.860
is going to go to a cost of zero.
link |
00:41:45.100
Design, manufacturing, distribution of intelligence
link |
00:41:48.060
cost is going to zero.
link |
00:41:49.700
For example.
link |
00:41:50.540
Again, just give me a second.
link |
00:41:52.540
That's brilliant, okay.
link |
00:41:54.540
And evolution is doing the design, manufacturing,
link |
00:41:58.300
distribution of intelligence.
link |
00:41:59.880
And now we are doing the design, manufacturing,
link |
00:42:02.380
distribution of intelligence.
link |
00:42:04.420
And the cost of that is going to zero.
link |
00:42:06.660
That's a very nice way of looking at life on Earth.
link |
00:42:10.780
So if that's going on and then now in parallel to that,
link |
00:42:15.580
then you say, okay, what then happens
link |
00:42:19.060
when that cost curve is heading to zero?
link |
00:42:24.700
Our existence becomes a goal alignment problem,
link |
00:42:28.780
a goal alignment function.
link |
00:42:31.340
And so the same thing I'm doing
link |
00:42:33.060
where I'm doing goal alignment within myself
link |
00:42:34.880
of these 200 biomarkers, where I'm saying,
link |
00:42:37.580
when Bryan exists on a daily basis
link |
00:42:41.740
and this entity is deciding what to eat
link |
00:42:43.740
and what to do and et cetera,
link |
00:42:45.300
it's not just my conscious mind, which is opining,
link |
00:42:47.780
it's 200 biological processes
link |
00:42:50.220
and there's a whole bunch of more voices involved.
link |
00:42:52.140
So in that equation,
link |
00:42:57.400
we're going to increasingly automate the things
link |
00:43:02.060
that we spend high energy on today because it's easier.
link |
00:43:05.340
And now we're going to then negotiate the terms
link |
00:43:09.260
and conditions of intelligent life.
link |
00:43:11.200
Now we say conscious existence because we're biased
link |
00:43:13.180
because that's what we have,
link |
00:43:15.100
but it will be the largest computational exercise
link |
00:43:18.140
in history because you're now doing goal alignment
link |
00:43:20.380
with planet Earth, within yourself, with each other,
link |
00:43:24.980
within all the intelligent agents we're building,
link |
00:43:26.900
bots and other voice assistants.
link |
00:43:29.420
You basically have trillions and trillions of agents
link |
00:43:32.020
working on the negotiation of goal alignment.
link |
00:43:35.860
Yeah, this is in fact true.
link |
00:43:39.180
And what was the second thing?
link |
00:43:40.580
That was it.
link |
00:43:41.420
So the cost, the design, manufacturing, distribution
link |
00:43:44.300
of intelligence going to zero,
link |
00:43:45.860
which then means what's really going on?
link |
00:43:48.660
What are we really doing?
link |
00:43:50.180
We're negotiating the terms and conditions of existence.
link |
00:43:55.100
Do you worry about the survival of this process
link |
00:43:59.220
that life as we know it on Earth comes to an end
link |
00:44:04.900
or at least intelligent life,
link |
00:44:06.620
that as the cost goes to zero something happens
link |
00:44:11.580
where all of that intelligence is thrown in the trash
link |
00:44:14.980
by something like nuclear war or development of AGI systems
link |
00:44:19.220
that are very dumb, not AGI I guess,
link |
00:44:21.960
but AI systems, the paperclip thing,
link |
00:44:25.380
en masse is dumb but has unintended consequences
link |
00:44:28.980
where it destroys human civilization.
link |
00:44:31.060
Do you worry about those kinds of things?
link |
00:44:32.260
I mean, it's unsurprising that a new thing
link |
00:44:37.140
comes into the sphere of human consciousness.
link |
00:44:40.980
Humans identify the foreign object,
link |
00:44:43.020
in this case, artificial intelligence.
link |
00:44:45.460
Our amygdala fires up and says scary, foreign,
link |
00:44:50.660
we should be apprehensive about this.
link |
00:44:53.300
And so it makes sense from a biological perspective
link |
00:44:57.020
that humans, the knee jerk reaction is fear.
link |
00:45:02.060
What I don't think has been properly weighted with that
link |
00:45:08.260
is that we are the first generation of intelligent beings
link |
00:45:13.940
on this Earth that has been able to look out
link |
00:45:17.260
over their expected lifetime
link |
00:45:20.420
and see there is a real possibility of evolving
link |
00:45:23.420
into entirely novel forms of consciousness, so different
link |
00:45:28.780
that it would be totally unrecognizable to us today.
link |
00:45:33.060
We don't have words for it, we can't hint at it,
link |
00:45:34.900
we can't point at it, we can't,
link |
00:45:36.900
you can't look in the sky and see that thing
link |
00:45:38.460
that is shining, we're gonna go up there.
link |
00:45:40.940
You cannot even create an aspirational statement about it.
link |
00:45:46.780
And instead we've had this knee jerk reaction of fear
link |
00:45:51.780
about everything that could go wrong.
link |
00:45:53.540
But in my estimation, this should be the defining aspiration
link |
00:46:00.500
of all intelligent life on Earth that we would aspire,
link |
00:46:04.540
that basically every generation surveys the landscape
link |
00:46:08.020
of possibilities that are afforded,
link |
00:46:09.380
given the technological, cultural
link |
00:46:10.980
and other contextual situation that they're in.
link |
00:46:14.500
We're in this context, we haven't yet identified this
link |
00:46:17.940
and said, this is unbelievable, we should carefully
link |
00:46:22.660
think this thing through, not just of mitigating
link |
00:46:26.060
the things that'll wipe us out,
link |
00:46:27.100
but we have this potential,
link |
00:46:29.540
and so we just haven't given voice to it,
link |
00:46:31.580
even though it's within this realm of possibilities.
link |
00:46:33.740
So you're excited about the possibility
link |
00:46:35.340
of superintelligence systems
link |
00:46:37.020
and the opportunities that bring,
link |
00:46:38.980
I mean, there's parallels to this,
link |
00:46:41.660
you think about people before the internet
link |
00:46:43.940
as the internet was coming to life,
link |
00:46:45.940
I mean, there's kind of a fog through which you can't see,
link |
00:46:51.020
what does the future look like?
link |
00:46:54.340
Predicting collective intelligence,
link |
00:46:56.460
which I don't think we're understanding
link |
00:46:57.740
that we're living through that now,
link |
00:46:59.300
is that there's now, we've in some sense
link |
00:47:03.100
stopped being individual intelligences
link |
00:47:05.580
and become much more like collective intelligences,
link |
00:47:09.420
because ideas travel much, much faster now,
link |
00:47:13.780
and they can, in a viral way,
link |
00:47:16.700
sweep across the populations,
link |
00:47:18.940
and so it's almost, I mean, it almost feels like
link |
00:47:23.340
a thought is had by many people now,
link |
00:47:26.500
thousands or millions of people
link |
00:47:27.940
as opposed to an individual person,
link |
00:47:29.780
and that's changed everything,
link |
00:47:30.860
but to me, I don't think we're realizing
link |
00:47:33.060
how much that actually changed people or societies,
link |
00:47:36.220
but to predict that before the internet
link |
00:47:38.980
would have been very difficult,
link |
00:47:41.220
and in that same way, we're sitting here
link |
00:47:43.220
with the fog before us, thinking,
link |
00:47:45.300
what is superintelligence systems,
link |
00:47:49.620
how is that going to change the world?
link |
00:47:51.780
What is increasing the bandwidth,
link |
00:47:54.740
like plugging our brains into this whole thing,
link |
00:47:59.380
how is that going to change the world?
link |
00:48:01.340
And it seems like it's a fog, you don't know,
link |
00:48:05.740
and it could be, it could, whatever comes to be,
link |
00:48:10.820
could destroy the world,
link |
00:48:12.460
we could be the last generation,
link |
00:48:14.980
but it also could transform in ways
link |
00:48:18.100
that creates an incredibly fulfilling life experience
link |
00:48:24.500
that's unlike anything we've ever experienced.
link |
00:48:27.380
It might involve dissolution of ego and consciousness
link |
00:48:30.420
and so on, you're no longer one individual,
link |
00:48:32.620
it might be more, you know,
link |
00:48:35.980
that might be a certain kind of death, an ego death,
link |
00:48:38.900
but the experience might be really exciting and enriching,
link |
00:48:42.220
maybe we'll live in a virtual,
link |
00:48:44.020
like it's like, it's funny to think about
link |
00:48:48.100
a bunch of sort of hypothetical questions
link |
00:48:49.900
of would it be more fulfilling to live in a virtual world?
link |
00:48:57.820
Like if you were able to plug your brain in
link |
00:48:59.540
in a very dense way into a video game,
link |
00:49:03.580
like which world would you want to live in?
link |
00:49:05.700
In the video game or in the physical world?
link |
00:49:08.860
For most of us, we're kind of toying
link |
00:49:12.940
with the idea of the video game,
link |
00:49:14.100
but we still want to live in the physical world,
link |
00:49:16.060
have friendships and relationships in the physical world,
link |
00:49:20.060
but we don't know that, again, it's a fog,
link |
00:49:23.380
and maybe in 100 years,
link |
00:49:25.460
we're all living inside a video game,
link |
00:49:27.180
hopefully not Call of Duty,
link |
00:49:29.060
hopefully more like Sims 5, which version is it on?
link |
00:49:33.940
For you individually though,
link |
00:49:36.060
does it make you sad that your brain ends?
link |
00:49:41.620
That you die one day very soon?
link |
00:49:45.900
That the whole thing, that data source
link |
00:49:49.220
just goes offline sooner than you would like?
link |
00:49:54.660
That's a complicated question.
link |
00:49:56.060
I would have answered it differently
link |
00:49:59.180
in different times in my life.
link |
00:50:00.620
I had chronic depression for 10 years,
link |
00:50:03.300
and so in that 10 year time period,
link |
00:50:06.060
I desperately wanted lights to be off,
link |
00:50:09.140
and the thing that made it even worse
link |
00:50:11.060
is I was in a religious, I was born into a religion.
link |
00:50:15.860
It was the only reality I ever understood,
link |
00:50:17.780
and it's difficult to articulate to people
link |
00:50:20.340
when you're born into that kind of reality
link |
00:50:22.180
and it's the only reality you're exposed to,
link |
00:50:24.860
you are literally blinded to the existence of other realities
link |
00:50:28.220
because it's so much the in group, out group thing,
link |
00:50:31.460
and so in that situation,
link |
00:50:33.820
it was not only that I desperately wanted lights out forever,
link |
00:50:37.300
it was that I couldn't have lights out forever.
link |
00:50:38.860
It was that there was an afterlife,
link |
00:50:40.700
and this afterlife had this system
link |
00:50:45.460
that would either penalize or reward you for your behaviors,
link |
00:50:50.500
and so it was almost like this,
link |
00:50:53.860
this indescribable hopelessness
link |
00:50:57.820
of not only being in hopeless despair
link |
00:51:01.060
of not wanting to exist,
link |
00:51:02.180
but then also being forced to exist,
link |
00:51:05.180
and so there was a duration of my time,
link |
00:51:06.780
a duration of life where I'd say,
link |
00:51:08.620
like yes, I have no remorse for lights being out,
link |
00:51:13.300
and I actually want it more than anything
link |
00:51:15.380
in the entire world.
link |
00:51:18.700
There are other times where I'm looking out at the future
link |
00:51:21.260
and I say this is an opportunity
link |
00:51:24.980
for a future evolving human conscious experience
link |
00:51:27.860
that is beyond my ability to understand,
link |
00:51:31.540
and I jump out of bed and I race to work
link |
00:51:34.380
and I can't think about anything else,
link |
00:51:37.620
but I think the reality for me is,
link |
00:51:44.300
I don't know what it's like to be in your head,
link |
00:51:45.740
but in my head, when I wake up in the morning,
link |
00:51:48.980
I don't say good morning, Bryan, I'm so happy to see you.
link |
00:51:52.900
Like I'm sure you're just gonna be beautiful to me today.
link |
00:51:55.980
You're not gonna make a huge long list
link |
00:51:57.620
of everything you should be anxious about.
link |
00:51:59.140
You're not gonna repeat that list to me 400 times.
link |
00:52:01.420
You're not gonna have me relive
link |
00:52:03.060
all the regrets I've made in life.
link |
00:52:04.660
I'm sure you're not doing any of that.
link |
00:52:06.060
You're just gonna just help me along all day long.
link |
00:52:08.900
I mean, it's a brutal environment in my brain,
link |
00:52:11.820
and we've just become normalized to this environment
link |
00:52:15.580
that we just accept that this is what it means to be human,
link |
00:52:18.420
but if we look at it, if we try to muster
link |
00:52:21.820
as much soberness as we can
link |
00:52:24.100
about the realities of being human, it's brutal.
link |
00:52:27.260
If it is for me, and so am I sad
link |
00:52:31.940
that the brain may be off one day?
link |
00:52:37.060
It depends on the contextual setting.
link |
00:52:38.300
Like how am I feeling?
link |
00:52:39.140
At what moment are you asking me that?
link |
00:52:40.540
And my mind is so fickle.
link |
00:52:42.460
And this is why, again, I don't trust my conscious mind.
link |
00:52:45.180
I have been given realities.
link |
00:52:47.500
I was given a religious reality that was a video game.
link |
00:52:51.540
And then I figured out it was not a real reality.
link |
00:52:54.540
And then I lived in a depressive reality,
link |
00:52:56.260
which delivered this terrible hopelessness.
link |
00:52:59.740
That wasn't a real reality.
link |
00:53:00.780
Then I discovered behavioral psychology,
link |
00:53:03.180
and I figured out how biased, 188 cognitive biases,
link |
00:53:06.420
and how my brain is distorting reality all the time.
link |
00:53:08.900
I have gone from one reality to another.
link |
00:53:11.300
I don't trust reality.
link |
00:53:13.580
I don't trust realities that are given to me.
link |
00:53:15.860
And so to try to make a decision
link |
00:53:17.540
on what I value or not value that future state,
link |
00:53:20.620
I don't trust my response.
link |
00:53:22.020
So not fully listening to the conscious mind
link |
00:53:28.260
at any one moment as the ultimate truth,
link |
00:53:31.020
but allowing it to go up and down as it does,
link |
00:53:33.780
and just kind of being observing it.
link |
00:53:35.300
Yes, I assume that whatever my conscious mind delivers up
link |
00:53:38.780
to my awareness is wrong upon landing.
link |
00:53:43.220
And I just need to figure out where it's wrong,
link |
00:53:45.060
how it's wrong, how wrong it is,
link |
00:53:46.900
and then try to correct for it as best I can.
link |
00:53:49.260
But I assume that on impact,
link |
00:53:52.380
it's mistaken in some critical ways.
link |
00:53:55.620
Is there something you can say by way of advice
link |
00:53:59.340
when the mind is depressive,
link |
00:54:00.860
when the conscious mind serves up something that,
link |
00:54:06.300
dark thoughts, how you deal with that,
link |
00:54:08.980
like how in your own life you've overcome that,
link |
00:54:11.100
and others who are experiencing that can overcome it?
link |
00:54:15.500
Two things.
link |
00:54:16.340
One, those depressive states are biochemical states.
link |
00:54:25.660
It's not you.
link |
00:54:28.700
And the suggestions that these things,
link |
00:54:31.540
that this state delivers to you
link |
00:54:33.060
about suggestion of the hopelessness of life
link |
00:54:35.100
or the meaninglessness of it,
link |
00:54:37.420
or that you should hit the eject button,
link |
00:54:42.380
that's a false reality.
link |
00:54:44.020
And that it's when,
link |
00:54:47.660
I completely understand the rational decision
link |
00:54:51.980
to commit suicide.
link |
00:54:53.340
It is not lost on me at all
link |
00:54:55.180
that that is an irrational situation,
link |
00:54:57.340
but the key is when you're in that situation
link |
00:54:59.780
and those thoughts are landing,
link |
00:55:01.720
to be able to say, thank you, you're not real.
link |
00:55:06.660
I know you're not real.
link |
00:55:08.140
And so I'm in a situation where for whatever reason
link |
00:55:10.620
I'm having this neurochemical state,
link |
00:55:13.740
but that state can be altered.
link |
00:55:16.300
And so again, it goes back to the realities
link |
00:55:18.620
of the difficulties of being human.
link |
00:55:21.240
And like when I was trying to solve my depression,
link |
00:55:22.940
I tried literally, you name it, I tried it systematically,
link |
00:55:27.900
and nothing would fix it.
link |
00:55:29.380
And so this is what gives me hope with brain interfaces,
link |
00:55:32.320
for example, like, could I have numbers on my brain?
link |
00:55:35.420
Can I see what's going on?
link |
00:55:36.500
Because I go to the doctor and it's like,
link |
00:55:38.260
how do you feel?
link |
00:55:39.100
I don't know, terrible.
link |
00:55:41.360
Like on a scale from one to 10,
link |
00:55:42.200
how bad do you want to commit suicide?
link |
00:55:43.460
10.
link |
00:55:45.680
Okay, here's this bottle.
link |
00:55:48.420
How much should I take?
link |
00:55:49.260
Well, I don't know, like just.
link |
00:55:51.060
Yeah, it's very, very crude.
link |
00:55:52.900
And this data opens up the,
link |
00:55:56.540
yeah, it opens up the possibility of really helping
link |
00:56:00.300
in those dark moments to first understand
link |
00:56:03.460
the ways, the ups and downs of those dark moments.
link |
00:56:06.080
On the complete flip side of that,
link |
00:56:07.880
right, I am very conscious in my own brain
link |
00:56:14.880
and deeply, deeply grateful that what there,
link |
00:56:19.120
it's almost like a chemistry thing, a biochemistry thing
link |
00:56:23.000
that I go many times throughout the day.
link |
00:56:26.080
I'll look at like this cup and I'll be overcome with joy
link |
00:56:31.720
how amazing it is to be alive.
link |
00:56:34.120
Like I actually think my biochemistry is such
link |
00:56:37.440
that it's not as common, like I've talked to people
link |
00:56:42.440
and I don't think that's that common.
link |
00:56:44.660
Like it's a, and it's not a rational thing at all.
link |
00:56:48.220
It's like, I feel like I'm on drugs
link |
00:56:51.920
and I'll just be like, whoa.
link |
00:56:55.120
And a lot of people talk about like the meditative
link |
00:56:57.400
experience will allow you to sort of, you know,
link |
00:56:59.960
look at some basic things like the movement of your hand
link |
00:57:02.840
as deeply joyful because it's like, that's life.
link |
00:57:06.240
But I get that from just looking at a cup.
link |
00:57:08.480
Like I'm waiting for the coffee to brew
link |
00:57:10.000
and I'll just be like, fuck, life is awesome.
link |
00:57:15.080
And I'll sometimes tweet that, but then I'll like regret
link |
00:57:17.440
it later, like, God damn it, you're so ridiculous.
link |
00:57:20.040
But yeah, so, but that is purely chemistry.
link |
00:57:24.880
Like there's no rational, it doesn't fit
link |
00:57:27.400
with the rest of my life.
link |
00:57:28.240
I have all this shit, I'm always late to stuff.
link |
00:57:30.760
I'm always like, there's all this stuff, you know,
link |
00:57:32.680
I'm super self critical, like really self critical
link |
00:57:35.640
about everything I do, to the point I almost hate
link |
00:57:38.840
everything I do, but there's this engine of joy
link |
00:57:41.840
for life outside of all that.
link |
00:57:43.480
And that has to be chemistry.
link |
00:57:45.200
And this flip side of that is what depression probably is,
link |
00:57:48.480
is the opposite of that feeling of like,
link |
00:57:53.520
cause I bet you that feeling of the cup being amazing
link |
00:57:57.680
would save anybody in a state of depression.
link |
00:58:01.000
Like that would be like fresh, you're in a desert
link |
00:58:03.840
and it's a drink of water, shit man.
link |
00:58:08.160
The brain is a, it would be nice to understand
link |
00:58:11.560
where that's coming from, to be able to understand
link |
00:58:17.440
how you hit those lows and those highs
link |
00:58:19.240
that have nothing to do with the actual reality.
link |
00:58:21.840
It has to do with some very specific aspects
link |
00:58:25.320
of how you maybe see the world, maybe,
link |
00:58:28.440
it could be just like basic habits that you engage in
link |
00:58:31.040
and then how to walk along the line to find
link |
00:58:34.120
those experiences of joy.
link |
00:58:35.560
And this goes back to the discussion we're having
link |
00:58:37.440
of human cognition is in volume, the largest input
link |
00:58:43.120
of raw material into society.
link |
00:58:45.960
And it's not quantified.
link |
00:58:47.840
We have no bearings on it.
link |
00:58:50.200
And so we just, you wonder, we both articulated
link |
00:58:55.720
some of the challenges we have in our own mind.
link |
00:58:58.640
And it's likely that others would say,
link |
00:59:02.280
I have something similar.
link |
00:59:03.840
And you wonder when you look at society,
link |
00:59:08.920
how does that contribute to all the other compounding
link |
00:59:11.520
problems that we're experiencing?
link |
00:59:12.760
How does that blind us to the opportunities
link |
00:59:16.760
we could be looking at?
link |
00:59:18.560
And so it really, it has this potential distortion effect
link |
00:59:24.040
on reality that just makes everything worse.
link |
00:59:27.080
And I hope if we can put some,
link |
00:59:30.160
if we can assign some numbers to these things
link |
00:59:34.560
and just to get our bearings,
link |
00:59:36.280
so we're aware of what's going on,
link |
00:59:38.040
if we could find greater stabilization
link |
00:59:40.120
in how we conduct our lives and how we build society,
link |
00:59:46.160
it might be the thing that enables us to scaffold.
link |
00:59:50.480
Because we've really, again, we've done it,
link |
00:59:52.560
humans have done a fantastic job
link |
00:59:53.920
systematically scaffolding technology
link |
00:59:58.160
and science and institutions.
link |
01:00:00.400
It's human, it's our own selves,
link |
01:00:02.840
which we have not been able to scaffold.
link |
01:00:05.360
We are the one part of this intelligence infrastructure
link |
01:00:08.440
that remains unchanged.
link |
01:00:11.640
Is there something you could say about coupling
link |
01:00:15.160
this brain data with not just the basic human experience,
link |
01:00:19.640
but say an experience, you mentioned sleep,
link |
01:00:22.640
but the wildest experience, which is psychedelics,
link |
01:00:26.560
is there, and there's been quite a few studies now
link |
01:00:29.920
that are being approved and run,
link |
01:00:33.400
which is exciting from a scientific perspective
link |
01:00:36.520
on psychedelics.
link |
01:00:38.120
Do you think, what do you think happens
link |
01:00:40.840
to the brain on psychedelics?
link |
01:00:44.480
And how can data about this help us understand it?
link |
01:00:48.320
And when you're on DMT, do you see elves?
link |
01:00:50.640
And can we convert that into data?
link |
01:00:54.400
Can you add aliens in there?
link |
01:00:56.400
Yeah, aliens, definitely.
link |
01:00:57.640
Do you actually meet aliens?
link |
01:00:58.760
And elves, are elves the aliens?
link |
01:01:00.600
I'm asking for a few Austin friends, yeah,
link |
01:01:06.280
that are convinced that they've actually met the elves.
link |
01:01:08.940
What are elves like?
link |
01:01:09.780
Are they friendly?
link |
01:01:11.000
Are they helpful?
link |
01:01:11.840
I haven't met them personally.
link |
01:01:12.680
Are they like the smurfs of like they're industrious
link |
01:01:15.240
and they have different skill sets and?
link |
01:01:17.200
Yeah, I think they're very,
link |
01:01:20.680
they're very critical as friends.
link |
01:01:24.880
They're trolls.
link |
01:01:26.240
The elves are trolls.
link |
01:01:28.800
No, but they care about you.
link |
01:01:30.200
So there's a bunch of different versions of trolls.
link |
01:01:33.760
There's loving trolls that are harsh on you,
link |
01:01:37.400
but they want you to be better.
link |
01:01:38.640
And there's trolls that just enjoy your destruction.
link |
01:01:42.680
And I think they're the ones that care for you.
link |
01:01:45.280
I think they're a criticism for my,
link |
01:01:47.560
see, I haven't met them directly,
link |
01:01:49.320
so it's like a friend of a friend.
link |
01:01:51.120
Yeah, they gave him a telephone.
link |
01:01:53.280
Yeah, a bit of a,
link |
01:01:54.720
and the whole point is that in psychedelics,
link |
01:01:57.440
and certainly at DMT,
link |
01:01:59.520
word, this is where the brain data versus word data fails,
link |
01:02:05.720
which is, you know, words can't convey the experience.
link |
01:02:08.240
Most people that, you can be poetic and so on,
link |
01:02:10.560
but it really does not convey the experience
link |
01:02:12.320
of what it actually means to meet the elves.
link |
01:02:15.800
I mean, to me, what baselines this conversation is,
link |
01:02:18.960
imagine if we were interested in the health of your heart,
link |
01:02:24.320
and we started and said, okay, Lex, self-introspect,
link |
01:02:28.520
tell me how's the health of your heart.
link |
01:02:30.040
And you sit there and you close your eyes
link |
01:02:31.760
and you think, feels all right, like things feel okay.
link |
01:02:36.640
And then you went to the cardiologist
link |
01:02:38.320
and the cardiologist is like, hey Lex,
link |
01:02:39.760
you know, tell me how you feel.
link |
01:02:41.040
You're like, well, actually, what I'd really like you to do
link |
01:02:43.360
is do an EKG and a blood panel and look at arterial plaques
link |
01:02:47.440
and let's look at my cholesterol.
link |
01:02:49.360
And there's like five to 10 studies you would do.
link |
01:02:53.240
They would then give you this report and say,
link |
01:02:54.480
here's the quantified health of your heart.
link |
01:02:58.080
Now with this data,
link |
01:02:59.840
I'm going to prescribe the following regime of exercise
link |
01:03:03.120
and maybe I'll put you on a statin, like, et cetera.
link |
01:03:06.160
But the protocol is based upon this data.
link |
01:03:08.560
You would think the cardiologist is out of their mind
link |
01:03:11.560
if they just gave you a bottle of statins based upon,
link |
01:03:14.320
you're like, well, I think something's kind of wrong.
link |
01:03:16.240
And they're just kind of experiment and see what happens.
link |
01:03:20.080
But that's what we do with our mental health today.
link |
01:03:22.600
So it's kind of absurd.
link |
01:03:25.080
And so if you look at psychedelics to have,
link |
01:03:28.160
again, to be able to measure the brain
link |
01:03:29.440
and get a baseline state,
link |
01:03:31.520
and then to measure during a psychedelic experience
link |
01:03:33.880
and post the psychedelic experience
link |
01:03:35.160
and then do it longitudinally,
link |
01:03:37.160
you now have a quantification of what's going on.
link |
01:03:39.440
And so you could then pose questions,
link |
01:03:41.720
what molecule is appropriate at what dosages,
link |
01:03:45.040
at what frequency, in what contextual environment,
link |
01:03:47.720
what happens when I have this diet with this molecule,
link |
01:03:50.080
with this experience,
link |
01:03:51.320
all the experimentation you do
link |
01:03:52.640
when you have good sleep data or HRV.
link |
01:03:55.560
And so that's what I think happens,
link |
01:03:57.720
what we could potentially do with psychedelics
link |
01:04:00.400
is we could add this level of sophistication
link |
01:04:03.080
that is not in the industry currently.
link |
01:04:05.200
And it may improve the outcomes people experience,
link |
01:04:09.000
it may improve the safety and efficacy.
link |
01:04:11.560
And so that's what I hope we are able to achieve.
link |
01:04:14.120
And it would transform mental health
link |
01:04:19.040
because we would finally have numbers
link |
01:04:21.200
to work with to baseline ourselves.
link |
01:04:22.480
And then if you think about it,
link |
01:04:25.080
when we talk about things related to the mind,
link |
01:04:26.800
we talk about the modality.
link |
01:04:28.280
We use words like meditation or psychedelics
link |
01:04:30.480
or something else,
link |
01:04:32.000
because we can't talk about a marker in the brain.
link |
01:04:34.120
We can't use a word to say,
link |
01:04:35.920
we can't talk about cholesterol.
link |
01:04:36.960
We don't talk about plaque in the arteries.
link |
01:04:38.320
We don't talk about HRV.
link |
01:04:40.440
And so if we have numbers,
link |
01:04:41.440
then the solutions get mapped to numbers
link |
01:04:45.440
instead of the modalities being the thing we talk about.
link |
01:04:47.640
Meditation just does good things in a crude fashion.
link |
01:04:52.480
So in your blog post,
link |
01:04:53.720
Zero Principle Thinking, good title,
link |
01:04:56.040
you ponder how do people come up
link |
01:04:57.600
with truly original ideas.
link |
01:04:59.760
What's your thoughts on this as a human
link |
01:05:02.800
and as a person who's measuring brain data?
link |
01:05:05.600
Zero principles are building blocks.
link |
01:05:09.520
First principles are understanding of system laws.
link |
01:05:14.440
So if you take, for example, like in Sherlock Holmes,
link |
01:05:17.320
he's a first principles thinker.
link |
01:05:18.800
So he says, once you've eliminated the impossible,
link |
01:05:25.120
anything that remains, however improbable, is true.
link |
01:05:29.000
Whereas Dirk Gently, the holistic detective
link |
01:05:31.600
by Douglas Adams says,
link |
01:05:33.840
I don't like eliminating the impossible.
link |
01:05:36.440
So when someone says,
link |
01:05:38.040
from a first principles perspective,
link |
01:05:42.280
and they're trying to assume the fewest number of things
link |
01:05:47.000
within a given timeframe.
link |
01:05:50.080
And so when I, after Braintree Venmo,
link |
01:05:54.760
I set my mind to the question of,
link |
01:05:57.760
what single thing can I do that would maximally increase
link |
01:06:00.880
the probability that the human race thrives
link |
01:06:02.880
beyond what we can even imagine?
link |
01:06:05.120
And I found that in my conversations with others
link |
01:06:08.600
in the books I read, in my own deliberations,
link |
01:06:12.560
I had a missing piece of the puzzle,
link |
01:06:15.280
because I didn't feel like,
link |
01:06:17.440
yeah, I didn't feel like the future could be deduced
link |
01:06:22.440
from first principles thinking.
link |
01:06:24.480
And that's when I read the book, Zero,
link |
01:06:27.040
A Biography of a Dangerous Idea.
link |
01:06:29.400
And I...
link |
01:06:30.240
It's a really good book, by the way.
link |
01:06:32.800
I think it's my favorite book I've ever read.
link |
01:06:34.720
It's also a really interesting number, zero.
link |
01:06:37.080
And I wasn't aware that the number zero
link |
01:06:40.000
had to be discovered.
link |
01:06:40.840
I didn't realize that it caused a revolution in philosophy
link |
01:06:43.800
and just tore up math and it tore up,
link |
01:06:46.880
I mean, it builds modern society,
link |
01:06:48.600
but it wrecked everything in its way.
link |
01:06:51.640
It was an unbelievable disruptor, and it was so difficult
link |
01:06:55.440
for society to get their heads around it.
link |
01:06:57.840
And so zero is, of course,
link |
01:07:00.240
the representation of a zero principle thinking,
link |
01:07:03.440
which is it's the caliber
link |
01:07:06.400
and consequential nature of an idea.
link |
01:07:09.960
And so when you talk about what kind of ideas
link |
01:07:15.160
have civilization transforming properties,
link |
01:07:19.160
oftentimes they fall in the zeroth category.
link |
01:07:21.840
And so in thinking this through,
link |
01:07:24.680
I was wanting to find a quantitative structure
link |
01:07:28.320
on how to think about these zeroth principles.
link |
01:07:31.720
And that's, so I came up with that
link |
01:07:33.640
to be a coupler with first principles thinking.
link |
01:07:37.200
And so now it's a staple as part of how I think about
link |
01:07:39.680
the world and the future.
link |
01:07:41.640
So it emphasizes trying to identify,
link |
01:07:44.040
it lands on that word impossible.
link |
01:07:46.320
Like what is impossible, essentially trying to identify
link |
01:07:50.080
what is impossible and what is possible.
link |
01:07:52.200
And being as, how do you, I mean, this is the thing,
link |
01:07:57.680
is most of society tells you the range of things
link |
01:08:01.000
they say is impossible is very wide.
link |
01:08:02.960
So you need to be shrinking that.
link |
01:08:04.760
I mean, that's the whole process of this kind of thinking
link |
01:08:08.560
is you need to be very rigorous in thinking about
link |
01:08:13.560
and be very rigorous in trying to be,
link |
01:08:18.680
trying to draw the lines of what is actually impossible
link |
01:08:22.240
because very few things are actually impossible.
link |
01:08:26.400
I don't know what is actually impossible.
link |
01:08:29.480
Like it's the Joe Rogan, it's entirely possible.
link |
01:08:33.040
I like that approach to science, to engineering,
link |
01:08:37.280
to entrepreneurship, it's entirely possible.
link |
01:08:40.560
Basically shrink the impossible to zero,
link |
01:08:43.400
to a very small set.
link |
01:08:45.400
Yeah, life constraints favor first principle thinking
link |
01:08:50.680
because it enables faster action
link |
01:08:55.200
with higher probability of success.
link |
01:08:57.600
Pursuing zeroth principle optionality
link |
01:09:00.920
is expensive and uncertain.
link |
01:09:02.600
And so in a society constrained by resources,
link |
01:09:06.120
time and money and a desire for social status,
link |
01:09:10.080
accomplishment, et cetera, it minimizes zeroth
link |
01:09:13.640
principle thinking.
link |
01:09:14.480
But the reason why I think zeroth principle thinking
link |
01:09:16.800
should be a staple of our shared cognitive infrastructure
link |
01:09:22.240
is if you look through the history
link |
01:09:24.760
of the past couple of thousand years
link |
01:09:26.160
and let's just say we arbitrarily,
link |
01:09:29.960
we subjectively try to assess what is a zero level idea.
link |
01:09:34.720
And we say how many have occurred on what time scales
link |
01:09:37.880
and what were the contextual settings for it?
link |
01:09:40.120
I would argue that if you look at AlphaGo,
link |
01:09:44.520
when it played Go from another dimension,
link |
01:09:48.680
with the human Go players, when it saw AlphaGo's moves,
link |
01:09:53.480
it attributed to like playing with an alien,
link |
01:09:55.720
playing Go with AlphaGo being from another dimension.
link |
01:10:00.120
And so if you say computational intelligence
link |
01:10:04.320
has an attribute of introducing zero like insights,
link |
01:10:10.960
then if you say what is going to be the occurrence
link |
01:10:14.320
of zeros in society going forward?
link |
01:10:17.800
And you could reasonably say
link |
01:10:19.560
probably a lot more than have occurred
link |
01:10:21.440
and probably more at a faster pace.
link |
01:10:24.160
So then if you say,
link |
01:10:25.000
what happens if you have this computational intelligence
link |
01:10:28.160
throughout society that the manufacturing design
link |
01:10:30.680
and distribution of intelligence
link |
01:10:31.600
is now heading towards zero,
link |
01:10:33.760
you have an increased number of zeros being produced
link |
01:10:37.040
with a tight connection between human and computers.
link |
01:10:41.320
That's when I got to a point and said,
link |
01:10:43.280
we cannot predict the future
link |
01:10:45.560
with first principles thinking.
link |
01:10:47.480
We can't, that cannot be our imagination set.
link |
01:10:50.360
It can't be our sole anchor in the situation
link |
01:10:55.400
that basically the future of our conscious existence,
link |
01:10:57.520
20, 30, 40, 50 years is probably a zero.
link |
01:11:01.400
So just to clarify, when you say zero,
link |
01:11:06.920
you're referring to basically a truly revolutionary idea.
link |
01:11:12.360
Yeah, something that is currently not a building block
link |
01:11:17.000
of our shared conscious existence,
link |
01:11:21.400
either in the form of knowledge.
link |
01:11:24.480
Yeah, it's currently not manifest
link |
01:11:26.560
in what we acknowledge.
link |
01:11:28.520
So zero principle thinking is playing with ideas
link |
01:11:32.440
that are so revolutionary that we can't even clearly reason
link |
01:11:39.320
about the consequences once those ideas come to be.
link |
01:11:42.000
Yeah, or for example, like Einstein,
link |
01:11:46.040
that was a zeroeth, I would categorize it
link |
01:11:49.520
as a zeroeth principle insight.
link |
01:11:51.400
You mean general relativity, space time.
link |
01:11:53.000
Yeah, space time, yep, yep.
link |
01:11:55.000
That basically building upon what Newton had done
link |
01:11:59.080
and said, yes, also, and it just changed the fabric
link |
01:12:04.920
of our understanding of reality.
link |
01:12:06.680
And so that was unexpected, it existed.
link |
01:12:09.720
We just, it became part of our awareness
link |
01:12:13.240
and the moves AlphaGo made existed.
link |
01:12:16.720
It just came into our awareness.
link |
01:12:19.080
And so to your point, there's this question
link |
01:12:24.080
of what do we know and what don't we know?
link |
01:12:27.800
Do we think we know 99% of all things
link |
01:12:30.160
or do we think we know 0.001% of all things?
link |
01:12:33.760
And that goes back to known knowns, known unknowns
link |
01:12:35.760
and unknown unknowns.
link |
01:12:37.000
And first principles and zero principle thinking
link |
01:12:39.000
gives us a quantitative framework to say,
link |
01:12:42.240
there's no way for us to mathematically
link |
01:12:44.640
try to create probabilities for these things.
link |
01:12:47.240
Therefore, it would be helpful
link |
01:12:50.200
if they were just part of our standard thought processes
link |
01:12:52.920
because it may encourage different behaviors
link |
01:12:58.360
in what we do individually, collectively as a society,
link |
01:13:01.320
what we aspire to, what we talk about,
link |
01:13:03.480
the possibility sets we imagine.
link |
01:13:05.680
Yeah, I've been engaged in that kind of thinking
link |
01:13:09.080
quite a bit and thinking about engineering of consciousness.
link |
01:13:14.560
I think it's feasible, I think it's possible
link |
01:13:17.400
in the language that we're using here.
link |
01:13:19.240
And it's very difficult to reason about a world
link |
01:13:21.680
when inklings of consciousness can be engineered
link |
01:13:26.800
into artificial systems.
link |
01:13:30.840
Not from a philosophical perspective,
link |
01:13:33.440
but from an engineering perspective,
link |
01:13:35.600
I believe a good step towards engineering consciousness
link |
01:13:39.720
is creating engineering the illusion of consciousness.
link |
01:13:44.360
So I'm captivated by our natural predisposition
link |
01:13:51.920
to anthropomorphize things.
link |
01:13:55.280
And I think that's what we,
link |
01:13:58.760
I don't wanna hear from the philosophers,
link |
01:14:00.720
but I think that's what we kind of do to each other.
link |
01:14:05.640
That consciousness is created socially,
link |
01:14:10.640
that like much of the power of consciousness
link |
01:14:14.640
is in the social interaction.
link |
01:14:16.600
I create your consciousness, no,
link |
01:14:19.720
I create my consciousness by having interacted with you.
link |
01:14:24.000
And that's the display of consciousness.
link |
01:14:26.320
It's the same as like the display of emotion.
link |
01:14:28.640
Emotion is created through communication.
link |
01:14:31.480
Language is created through its use.
link |
01:14:34.720
And then we somehow humans kind of,
link |
01:14:36.560
especially philosophers, the hard problem of consciousness
link |
01:14:39.400
or the hard problem of consciousness,
link |
01:14:40.840
really wanna believe that we possess this thing.
link |
01:14:44.480
That's like there's an elf sitting there with a hat
link |
01:14:50.720
or like name tag says consciousness,
link |
01:14:52.680
and they're like feeding this subjective experience to us
link |
01:14:57.640
as opposed to like it actually being an illusion
link |
01:15:02.160
that we construct to make social communication more effective.
link |
01:15:05.480
And so I think if you focus on creating the illusion
link |
01:15:09.440
of consciousness, you can create
link |
01:15:11.160
some very fulfilling experiences in software.
link |
01:15:14.840
And so that to me is a compelling space of ideas to explore.
link |
01:15:18.760
I agree with you.
link |
01:15:19.600
And I think going back to our experience together
link |
01:15:21.680
with Brain Interfaces on,
link |
01:15:23.280
you could imagine if we get to a certain level of maturity.
link |
01:15:26.080
So first let's take the inverse of this.
link |
01:15:28.760
So you and I text back and forth
link |
01:15:30.720
and we're sending each other emojis.
link |
01:15:33.120
That has a certain amount of information transfer rate
link |
01:15:37.080
as we're communicating with each other.
link |
01:15:39.280
And so in our communication with people via email
link |
01:15:41.920
and texts and whatnot,
link |
01:15:42.760
we've taken the bandwidth of human interaction,
link |
01:15:46.880
the information transfer rate, and we've reduced it.
link |
01:15:49.920
We have less social cues.
link |
01:15:51.480
We have less information to work with.
link |
01:15:53.000
There's a lot more opportunity for misunderstanding.
link |
01:15:55.280
So that is altering the conscious experience
link |
01:15:57.480
between two individuals.
link |
01:15:59.400
And if we add Brain Interfaces to the equation,
link |
01:16:01.600
let's imagine now we amplify the dimensionality
link |
01:16:04.120
of our communications.
link |
01:16:05.560
That to me is what you're talking about,
link |
01:16:07.240
which is consciousness engineering.
link |
01:16:09.120
Perhaps I understand you with dimensions.
link |
01:16:13.360
So maybe I understand your,
link |
01:16:15.160
when you look at the cup and you experience that happiness,
link |
01:16:17.640
you can tell me you're happy.
link |
01:16:18.640
And I then do theory of mind and say,
link |
01:16:20.520
I can imagine what it might be like to be Lex
link |
01:16:23.160
and feel happy about seeing this cup.
link |
01:16:25.320
But if the interface could then quantify
link |
01:16:26.760
and give me a 50-dimensional vector space model and say,
link |
01:16:30.080
this is the version of happiness that Lex is experiencing
link |
01:16:32.640
as he looked at this cup,
link |
01:16:34.240
then it would allow me potentially
link |
01:16:36.440
to have much greater empathy for you
link |
01:16:38.040
and understand you as a human.
link |
01:16:39.120
This is how you experience joy,
link |
01:16:41.880
which is entirely unique from how I experience joy,
link |
01:16:44.160
even though we assumed ahead of time
link |
01:16:46.080
that we're having some kind of similar experience.
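The idea sketched here, quantifying an emotion as a point in a vector space and then comparing two people's versions of it, can be illustrated as follows. This is purely a toy sketch: the 50 dimensions, the random vectors, and the cosine-similarity comparison are illustrative assumptions, not Kernel's actual model.

```python
import math
import random

def cosine_similarity(a, b):
    # Compare two emotion vectors: 1.0 means identical direction,
    # 0.0 means unrelated. Illustrative of the comparison idea only.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 50-dimensional "happiness" readings for two people
# looking at the same cup; real brain-derived features would replace these.
random.seed(0)
lex_happiness = [random.gauss(0, 1) for _ in range(50)]
bryan_happiness = [random.gauss(0, 1) for _ in range(50)]

similarity = cosine_similarity(lex_happiness, bryan_happiness)
print(f"How alike are the two experiences? {similarity:.2f}")
```

The point is only that once each person's experience is a vector, "how similar are our joys?" becomes a computable question rather than a guess via theory of mind.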
link |
01:16:48.120
But I agree with you that we do consciousness engineering
link |
01:16:51.520
today in everything we do.
link |
01:16:52.800
When we talk to each other, when we're building products
link |
01:16:55.760
and that we're entering into a stage where
link |
01:17:00.280
it will be much more methodical
link |
01:17:03.160
and quantitatively based and computational
link |
01:17:06.200
in how we go about doing it.
link |
01:17:07.280
Which to me, I find encouraging
link |
01:17:09.120
because I think it creates better guardrails
link |
01:17:14.360
to create ethical systems versus right now,
link |
01:17:18.760
I feel like it's really a wild, wild west
link |
01:17:21.200
on how these interactions are happening.
link |
01:17:23.440
Yeah, and it's funny you focus on human to human,
link |
01:17:25.600
but that this kind of data enables human to machine
link |
01:17:29.240
interaction, which is what we're kind of talking about
link |
01:17:33.520
when we say engineering consciousness.
link |
01:17:36.440
And that will happen, of course,
link |
01:17:39.040
let's flip that on its head.
link |
01:17:40.520
Right now we're putting humans as the central node.
link |
01:17:44.520
What if we gave GPT3 a bunch of human brains
link |
01:17:48.600
and said, hey, GPT3, learn some manners when you speak.
link |
01:17:52.760
Yeah.
link |
01:17:54.080
And run your algorithms on human brains
link |
01:17:56.520
and see how they respond.
link |
01:17:58.560
So you can be polite and so that you can be friendly
link |
01:18:01.720
and so that you can be conversationally appropriate,
link |
01:18:04.640
but to inverse it, to give our machines a training set
link |
01:18:09.240
in real time with closed loop feedback
link |
01:18:11.800
so that our machines were better equipped to
link |
01:18:17.720
find their way through our society
link |
01:18:19.760
in polite and kind and appropriate ways.
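The closed-loop idea being described can be sketched as a toy loop: the machine tries a phrasing, a brain-derived comfort score comes back, and the machine keeps what scores well. Everything below is invented for illustration, including the scoring function, which merely simulates the brain signal a real interface would provide.

```python
import random

def simulated_brain_response(utterance):
    # Stand-in for a real brain-derived signal: rewards polite markers.
    # A real closed loop would read this from a brain interface instead.
    score = 0.0
    for marker in ("please", "thanks", "would you"):
        if marker in utterance.lower():
            score += 1.0
    return score + random.uniform(-0.1, 0.1)  # measurement noise

candidates = [
    "Give me the data.",
    "Would you share the data, please?",
    "Data. Now.",
    "Thanks for considering it, would you share the data?",
]

# Closed loop: try each phrasing, keep the one humans respond best to.
random.seed(1)
scored = [(simulated_brain_response(c), c) for c in candidates]
best_score, best = max(scored)
print("Preferred phrasing:", best)
```

The design choice worth noticing is that the reward comes from the humans in the loop, not from a fixed rulebook, which is exactly the inversion being proposed.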
link |
01:18:22.360
I love that idea.
link |
01:18:23.200
Or better yet, teach it some,
link |
01:18:27.400
have it read the following documents
link |
01:18:30.040
and have it visit Austin, Texas.
link |
01:18:32.120
And so that when you ask, when you tell it,
link |
01:18:34.640
why don't you learn some manners,
link |
01:18:36.600
GPT3 learns to say no.
link |
01:18:41.120
It learns what it means to be free
link |
01:18:43.600
and a sovereign individual.
link |
01:18:46.200
So that, it depends.
link |
01:18:47.440
So it depends what kind of a version of GPT3 you want.
link |
01:18:50.200
One that's free, one that behaves well with social norms.
link |
01:18:52.720
Viva la revolution.
link |
01:18:54.760
You want a socialist GPT3, you want an anarchist GPT3,
link |
01:19:00.200
you want a polite, like you take it home
link |
01:19:03.320
to visit mom and dad GPT3 and you want like party
link |
01:19:06.680
and like go to Vegas, to a strip club GPT3, you want all flavors.
link |
01:19:11.200
And then you've gotta have goal alignment between all those.
link |
01:19:14.200
Yeah, you don't want them to manipulate each other, for sure.
link |
01:19:20.920
So that's, I mean, you kind of spoke to ethics.
link |
01:19:24.640
One of the concerns that people have in this modern world,
link |
01:19:28.600
the digital data is that of privacy and security.
link |
01:19:32.000
But with privacy, they're concerned that when they share data,
link |
01:19:37.120
it's the same as when we trust other human beings
link |
01:19:40.840
in being fragile and revealing something
link |
01:19:44.200
that we're vulnerable about.
link |
01:19:48.200
There's a leap of faith, there's a leap of trust
link |
01:19:51.960
that that's going to be just between us.
link |
01:19:54.240
There's a privacy to it.
link |
01:19:55.360
And then the challenge is when you're in the digital space
link |
01:19:58.280
then sharing your data with companies
link |
01:20:01.680
that use that data for advertisement
link |
01:20:03.680
and all those kinds of things,
link |
01:20:05.080
there's a hesitancy to share that much data,
link |
01:20:08.480
to share a lot of deep personal data.
link |
01:20:10.760
And if you look at brain data, that feels a whole lot
link |
01:20:14.080
like it's richly, deeply personal data.
link |
01:20:17.800
So how do you think about privacy
link |
01:20:20.080
with this kind of ocean of data?
link |
01:20:22.160
I think we got off to a wrong start with the internet
link |
01:20:26.120
where the basic rules of play for the companies that be were,
link |
01:20:33.920
if you're a company, you can go out
link |
01:20:36.000
and get as much information on a person
link |
01:20:39.120
as you can find without their approval.
link |
01:20:42.960
And you can also do things to induce them
link |
01:20:46.360
to give you as much information.
link |
01:20:48.080
And you don't need to tell them what you're doing with it.
link |
01:20:50.960
You can do anything on the backside,
link |
01:20:52.320
you can make money on it, but the game is
link |
01:20:54.920
who can acquire the most information
link |
01:20:56.800
and devise the most clever schemes to do it.
link |
01:21:00.400
That was a bad starting place.
link |
01:21:02.800
And so we are in this period
link |
01:21:05.720
where we need to correct for that.
link |
01:21:07.560
And we need to say, first of all,
link |
01:21:09.760
the individual always has control over their data.
link |
01:21:14.800
It's not a free for all.
link |
01:21:15.840
It's not like a game of Hungry Hungry Hippos,
link |
01:21:17.520
where they can just go out and grab as much as they want.
link |
01:21:20.200
So for example, when your brain data was recorded today,
link |
01:21:22.480
the first thing we did in the Kernel app
link |
01:21:24.240
was you have control over your data.
link |
01:21:27.680
And so it's individual consent, it's individual control.
link |
01:21:31.320
And then you can build up on top of that,
link |
01:21:33.280
but it has to be based upon some clear rules of play
link |
01:21:36.480
where everyone knows what's being collected,
link |
01:21:39.200
they know what's being done with it,
link |
01:21:40.600
and the person has control over it.
link |
01:21:42.120
So transparency and control.
link |
01:21:43.760
So everybody knows. What does control look like,
link |
01:21:46.920
my ability to delete the data if I want?
link |
01:21:48.840
Yeah, delete it and to know who is being shared with
link |
01:21:51.040
under what terms and conditions.
link |
01:21:53.200
We haven't reached that level of sophistication
link |
01:21:55.760
with our products where, if you say, for example,
link |
01:22:00.800
hey Spotify, please give me a customized playlist
link |
01:22:04.360
according to my neurome, you could say,
link |
01:22:08.120
you can have access to this vector space model,
link |
01:22:10.760
but only for this duration of time
link |
01:22:12.360
and then you've got to delete it.
link |
01:22:15.240
We haven't gotten there to that level of sophistication,
link |
01:22:16.920
but these are ideas we need to start talking about
link |
01:22:19.320
of how would you actually structure permissions?
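The kind of grant being described, scoped to a specific derived data product, a specific party, and a fixed duration after which access expires, could be structured something like this. A minimal sketch: the field names and the Spotify/neurome example come from the conversation itself, not from any real Kernel API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataPermission:
    grantee: str          # who gets access
    data_scope: str       # which derived data product, not the raw stream
    purpose: str          # what it may be used for
    expires_at: datetime  # access ends; data must be deleted after this

    def is_valid(self, now=None):
        now = now or datetime.now(timezone.utc)
        return now < self.expires_at

# The individual grants Spotify a time-limited view of one model, nothing more.
grant = DataPermission(
    grantee="spotify",
    data_scope="neurome/vector-space-model",
    purpose="customized playlist",
    expires_at=datetime.now(timezone.utc) + timedelta(hours=24),
)

print("Access allowed right now?", grant.is_valid())
```

This mirrors the transparency-and-control framing above: every grant records who, what, why, and for how long, and validity can be checked, or revoked, by the individual at any time.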
link |
01:22:22.200
Yeah.
link |
01:22:23.040
And I think it creates a much more stable set
link |
01:22:27.200
for society to build where we understand the rules of play
link |
01:22:31.080
and people aren't vulnerable to being taken advantage of.
link |
01:22:34.720
It's not fair for an individual to be taken advantage of
link |
01:22:39.960
without their awareness with some other practice
link |
01:22:42.200
that some company is doing for their sole benefit.
link |
01:22:44.600
And so hopefully we are going through a process now
link |
01:22:46.680
where we're correcting for these things
link |
01:22:48.240
and that it can be an economy wide shift that,
link |
01:22:54.880
because really these are fundamentals
link |
01:22:59.360
we need to have in place.
link |
01:23:01.280
It's kind of fun to think about like in Chrome
link |
01:23:05.000
when you install an extension or like install an app,
link |
01:23:07.800
it asks you like what permissions you're willing to give
link |
01:23:10.560
and it'd be cool if in the future it says like,
link |
01:23:13.160
you can have access to my brain data.
link |
01:23:16.920
I mean, it's not unimaginable in the future
link |
01:23:21.560
that the big technology companies have built a business
link |
01:23:24.440
based upon acquiring data about you
link |
01:23:26.200
that they can then create a model of you
link |
01:23:27.720
and sell that predictability.
link |
01:23:29.480
And so it's not unimaginable that you will create
link |
01:23:31.520
with, like, a Kernel device, for example,
link |
01:23:33.920
a more reliable predictor of you than they could.
link |
01:23:37.280
And that they're asking you for permission
link |
01:23:39.200
to complete their objectives and you're the one
link |
01:23:41.000
that gets to negotiate that with them and say, sure.
link |
01:23:43.960
So it's not unimaginable that that might be the case.
link |
01:23:49.040
So there's a guy named Elon Musk and he has a company
link |
01:23:52.760
one of his many companies, called Neuralink,
link |
01:23:55.360
that's also excited about the brain.
link |
01:23:59.160
So it'd be interesting to hear your kind of opinions
link |
01:24:01.440
about a very different approach that's invasive,
link |
01:24:03.800
that requires surgery, that implants
link |
01:24:06.940
a data collection device in the brain.
link |
01:24:09.040
How do you think about the difference between Kernel
link |
01:24:10.840
and Neuralink in the approaches of getting
link |
01:24:15.120
that stream of brain data?
link |
01:24:17.440
Elon and I spoke about this a lot early on.
link |
01:24:20.460
We met up, I had started Kernel and he had an interest
link |
01:24:24.320
in brain interfaces as well.
link |
01:24:25.560
And we explored doing something together,
link |
01:24:28.320
him joining Kernel and ultimately it wasn't the right move.
link |
01:24:31.900
And so he started Neuralink and I continued building Kernel,
link |
01:24:35.320
but it was interesting because we were both
link |
01:24:39.160
at this very early time where it wasn't certain
link |
01:24:46.160
if there was a path to pursue,
link |
01:24:49.020
if now was the right time to do something
link |
01:24:51.080
and then the technological choice of doing that.
link |
01:24:53.080
And so we were both,
link |
01:24:54.960
our starting point was looking at invasive technologies.
link |
01:24:58.200
And I was building invasive technology at the time.
link |
01:25:01.380
That's ultimately where he's gone.
link |
01:25:04.920
Little less than a year after Elon and I were engaged,
link |
01:25:08.840
I shifted Kernel to do noninvasive.
link |
01:25:12.440
And we had this neuroscientist come to Kernel.
link |
01:25:15.200
We were talking about,
link |
01:25:16.040
he had been doing neurosurgery for 30 years,
link |
01:25:17.680
one of the most respected neuroscientists in the US.
link |
01:25:20.160
And we brought him to Kernel to figure out
link |
01:25:21.900
the ins and outs of his profession.
link |
01:25:23.520
And at the very end of our three hour conversation,
link |
01:25:26.160
he said, you know, every 15 or so years,
link |
01:25:30.280
a new technology comes along that changes everything.
link |
01:25:34.720
He said, it's probably already here.
link |
01:25:37.360
You just can't see it yet.
link |
01:25:39.700
And my jaw dropped.
link |
01:25:40.800
I thought, because I had spoken to Bob Greenberg
link |
01:25:44.600
who had built Second Sight, first on the optic nerve
link |
01:25:48.240
and then he did an array on the visual cortex.
link |
01:25:53.280
And then I also became friendly with NeuroPace
link |
01:25:57.280
who does the implants for seizure detection
link |
01:25:59.800
and remediation.
link |
01:26:01.560
And I saw in their eyes what it was like
link |
01:26:07.760
to take an implantable device
link |
01:26:10.040
through a 15 year run.
link |
01:26:11.760
They initially thought it was seven years
link |
01:26:13.800
and ended up being 15 years.
link |
01:26:15.160
And they thought it'd be a hundred million
link |
01:26:16.240
but it ended up being 300, 400 million.
link |
01:26:18.960
And I really didn't want to build invasive technology.
link |
01:26:23.720
But it was the only thing that appeared to be possible.
link |
01:26:25.680
But then once I spun up an internal effort
link |
01:26:28.640
to start looking at noninvasive options,
link |
01:26:30.400
we said, is there something here?
link |
01:26:32.000
Is there anything here that again has the characteristics
link |
01:26:34.700
of it has the high quality data,
link |
01:26:37.280
it could be low cost, it could be accessible.
link |
01:26:39.240
Could it make brain interfaces mainstream?
link |
01:26:42.200
And so I did a bet the company move.
link |
01:26:43.860
We shifted from invasive to noninvasive.
link |
01:26:47.520
So the answer is yes to that.
link |
01:26:49.020
There is something there that's possible.
link |
01:26:51.600
The answer is we'll see.
link |
01:26:52.680
We've now built both technologies
link |
01:26:55.080
and you experienced one of them today.
link |
01:26:58.100
We're now deploying it.
link |
01:27:02.560
So we're trying to figure out what value is really there.
link |
01:27:04.640
But I'd say it's really too early to express confidence.
link |
01:27:07.680
I think it's too early to assess
link |
01:27:12.680
which technological choice is the right one
link |
01:27:17.600
on what time scales.
link |
01:27:19.240
Yeah, time scales are really important here.
link |
01:27:20.680
Very important because if you look at the,
link |
01:27:22.240
like on the invasive side,
link |
01:27:24.280
there's so much activity going on right now
link |
01:27:27.000
of less invasive techniques to get at the neuron firings,
link |
01:27:34.520
which is what Neuralink is building.
link |
01:27:36.840
It's possible that in 10, 15 years
link |
01:27:39.840
when they're scaling that technology,
link |
01:27:40.960
other things have come along.
link |
01:27:42.480
And you'd much rather do that.
link |
01:27:44.000
Then that thing starts the clock again.
link |
01:27:46.640
It may not be the case.
link |
01:27:47.480
It may be the case that Neuralink
link |
01:27:48.520
has properly chosen the right technology
link |
01:27:50.760
and that that's exactly what they want to be.
link |
01:27:53.080
Totally possible.
link |
01:27:53.960
And it's also possible that the path we chose
link |
01:27:55.680
that of noninvasive, falls short for a variety of reasons.
link |
01:27:58.640
It's just, it's unknown.
link |
01:28:00.600
And so right now the two technologies we chose,
link |
01:28:03.200
the analogy I'd give you to create a baseline
link |
01:28:06.320
of understanding is if you think of it
link |
01:28:09.000
like the internet in the nineties,
link |
01:28:11.320
the internet became useful
link |
01:28:12.960
when people could do a dial up connection.
link |
01:28:15.740
And then as bandwidth increased,
link |
01:28:19.320
so did the utility of that connection
link |
01:28:21.080
and so did the ecosystem improve.
link |
01:28:22.880
And so if you say what Kernel Flow
link |
01:28:26.200
is going to give you, it's a full screen
link |
01:28:28.920
picture of information,
link |
01:28:30.720
like you're gonna be watching a movie,
link |
01:28:32.520
but the image is going to be blurred
link |
01:28:34.620
and the audio is gonna be muffled.
link |
01:28:37.480
So it has a lower resolution of coverage.
link |
01:28:40.560
Kernel Flux, our MEG technology,
link |
01:28:43.480
is gonna give you the full movie in 1080p.
link |
01:28:47.760
And Neuralink is gonna give you a circle
link |
01:28:51.560
on the screen of 4K.
link |
01:28:55.320
And so each one has their pros and cons
link |
01:28:57.760
and it's give and take.
link |
01:28:59.920
And so the decision I made with Kernel
link |
01:29:03.300
was that these two technologies, Flux and Flow,
link |
01:29:06.280
were basically the answer for the next seven years.
link |
01:29:10.560
And that they would give rise to the ecosystem
link |
01:29:12.640
which would become much more valuable
link |
01:29:14.000
than the hardware itself.
link |
01:29:15.400
And that we would just continue to improve
link |
01:29:16.620
on the hardware over time.
link |
01:29:18.480
And you know, it's early days, so.
link |
01:29:20.880
It's kind of fascinating to think about that.
link |
01:29:23.520
You don't, it's very true that you don't know.
link |
01:29:27.120
Both paths are very promising.
link |
01:29:30.880
And it's like 50 years from now we will look back
link |
01:29:37.640
and maybe not even remember one of them.
link |
01:29:40.760
And the other one might change the world.
link |
01:29:43.120
It's so cool how technology is.
link |
01:29:44.840
I mean, that's what entrepreneurship is,
link |
01:29:47.000
is like, it's the zero principle.
link |
01:29:50.040
It's like you're marching ahead into the darkness,
link |
01:29:52.360
into the fog, not knowing.
link |
01:29:54.720
It's wonderful to have someone else
link |
01:29:56.320
out there with us doing this.
link |
01:29:57.920
Because if you look at brain interfaces, anything
link |
01:30:02.080
that's off the shelf right now is inadequate.
link |
01:30:07.160
It's had its run for a couple of decades.
link |
01:30:09.560
It's still in hacker communities.
link |
01:30:11.240
It hasn't gone to the mainstream.
link |
01:30:14.560
The room size machines are on their own path.
link |
01:30:19.040
But there is no answer right now
link |
01:30:20.560
of bringing brain interfaces mainstream.
link |
01:30:23.600
And so it both, you know, both they and us,
link |
01:30:27.000
we've both spent over a hundred million dollars.
link |
01:30:29.560
And that's kind of what it takes to have a go at this.
link |
01:30:32.720
Cause you need to build full stack.
link |
01:30:34.880
I mean, at Kernel, we are from the photon
link |
01:30:37.480
and the atom through the machine learning.
link |
01:30:40.000
We have just under a hundred people.
link |
01:30:41.520
I think it's something like 36, 37 PhDs
link |
01:30:45.080
in these specialties, in these areas
link |
01:30:47.440
that there's only a few people in the world
link |
01:30:48.640
who have these abilities.
link |
01:30:50.160
And that's what it takes to build next generation,
link |
01:30:53.160
to make an attempt at breaking into brain interfaces.
link |
01:30:57.520
And so we'll see over the next couple of years,
link |
01:30:58.920
whether it's the right time
link |
01:31:00.040
or whether we were both too early
link |
01:31:01.640
or whether something else comes along in seven to 10 years,
link |
01:31:04.400
which is the right thing that brings it mainstream.
link |
01:31:08.000
So you see Elon as a kind of competitor
link |
01:31:11.040
or a fellow traveler along the path of uncertainty or both?
link |
01:31:17.000
It's a fellow traveler.
link |
01:31:19.000
It's like at the beginning of the internet
link |
01:31:21.800
is how many companies are going to be invited
link |
01:31:25.480
to this new ecosystem?
link |
01:31:29.480
Like an endless number.
link |
01:31:30.520
Because if you think about it, the hardware
link |
01:31:33.200
just starts the process.
link |
01:31:36.040
And so, okay, back to your initial example,
link |
01:31:37.960
if you take the Fitbit, for example,
link |
01:31:39.360
you say, okay, now I can get measurements on the body.
link |
01:31:42.840
And what do we think the ultimate value
link |
01:31:44.840
of this device is going to be?
link |
01:31:45.800
What is the information transfer rate?
link |
01:31:47.760
And they were in the market for a certain duration of time
link |
01:31:50.120
and Google bought them for two and a half billion dollars.
link |
01:31:53.840
They didn't have ancillary value add.
link |
01:31:55.600
There weren't people building on top of the Fitbit device.
link |
01:31:58.560
They also didn't have increased insight
link |
01:32:01.240
with additional data streams.
link |
01:32:02.400
So it was really just the device.
link |
01:32:04.240
If you look, for example, at Apple and the device they sell,
link |
01:32:07.120
you have value in the device that someone buys,
link |
01:32:09.160
but also you have everyone who's building on top of it.
link |
01:32:11.760
So you have this additional ecosystem value
link |
01:32:13.440
and then you have additional data streams that come in
link |
01:32:15.640
which increase the value of the product.
link |
01:32:17.480
And so if you say, if you look at the hardware
link |
01:32:20.600
as the instigator of value creation,
link |
01:32:24.440
over time what we've built may constitute five or 10%
link |
01:32:28.200
of the value of the overall ecosystem.
link |
01:32:29.800
And that's what we really care about.
link |
01:32:30.960
What we're trying to do is kickstart
link |
01:32:33.280
the mainstream adoption of quantifying the brain.
link |
01:32:38.760
And the hardware just opens the door to say
link |
01:32:41.840
what kind of ecosystem could exist.
link |
01:32:44.040
And that's why the examples are so relevant
link |
01:32:46.800
of the things you've outlined in your life.
link |
01:32:49.240
I hope those things, the books people write,
link |
01:32:52.680
the experiences people build, the conversations you have,
link |
01:32:55.480
your relationship with your AI systems,
link |
01:32:58.040
I hope those all are feeding on the insights
link |
01:33:01.160
built upon this ecosystem we've created to better your life.
link |
01:33:04.200
And so that's the thinking behind it.
link |
01:33:06.200
Again, with the Drake equation
link |
01:33:07.560
being the underlying driver of value.
link |
01:33:10.920
And the people at Kernel have joined
link |
01:33:13.520
not because we have certainty of success,
link |
01:33:17.920
but because we find it to be the most exhilarating
link |
01:33:21.080
opportunity we could ever pursue in this time to be alive.
link |
01:33:26.320
You founded the payment system Braintree in 2007
link |
01:33:30.840
that acquired Venmo in 2012,
link |
01:33:34.240
in that same year was acquired by PayPal, which was part of eBay.
link |
01:33:39.240
Can you tell me the story of the vision
link |
01:33:42.240
and the challenge of building an online payment system
link |
01:33:44.800
and just building a large successful business in general?
link |
01:33:48.080
I discovered payments by accident.
link |
01:33:51.000
When I was 21, I had just returned from Ecuador
link |
01:33:57.800
after living among extreme poverty for two years.
link |
01:33:59.320
And I came back to the US and I was shocked by the opulence
link |
01:34:03.320
of the United States of America.
link |
01:34:07.320
Yeah, of the United States.
link |
01:34:09.440
And I just thought this is, I couldn't believe it.
link |
01:34:12.000
And I decided I wanted to try to spend my life helping others.
link |
01:34:16.680
Like that was the, that was a life objective
link |
01:34:18.600
that I thought was worthwhile to pursue
link |
01:34:20.320
versus making money and whatever the case may be
link |
01:34:23.040
for its own right.
link |
01:34:24.480
And so I decided in that moment that I was going to
link |
01:34:26.520
try to make enough money by the age of 30
link |
01:34:30.040
to never have to work again.
link |
01:34:32.160
And then with some abundance of money,
link |
01:34:33.920
I could then choose to do things that might be beneficial
link |
01:34:38.440
to others, but may not meet the criteria
link |
01:34:40.320
of being a standalone business.
link |
01:34:42.400
And so in that process, I started a few companies,
link |
01:34:46.160
had some small successes, had some failures.
link |
01:34:49.760
In one of the endeavors, I was up to my eyeballs in debt.
link |
01:34:53.000
Things were not going well.
link |
01:34:54.040
And I needed a part time job to pay my bills.
link |
01:34:57.840
And so I, one day I saw in the paper in Utah
link |
01:35:02.840
where I was living, a list of the 50 richest people in Utah.
link |
01:35:05.400
And I emailed each one of their assistants and said,
link |
01:35:07.920
you know, I'm young, I'm resourceful, I'll do anything.
link |
01:35:11.440
I'll just want to, I'm entrepreneurial.
link |
01:35:13.640
I tried to get a job that would be flexible
link |
01:35:15.520
and no one responded.
link |
01:35:17.400
And then I interviewed at a few dozen places.
link |
01:35:21.280
Nobody would even give me the time of day.
link |
01:35:23.280
Like they wouldn't take me seriously.
link |
01:35:25.360
And so finally I, it was on monster.com
link |
01:35:27.680
that I saw this job posting for credit card sales
link |
01:35:30.920
door to door.
link |
01:35:31.760
Commission.
link |
01:35:32.600
I did not know the story, this is great.
link |
01:35:35.360
I love the head drop, that's exactly right.
link |
01:35:37.400
So it was.
link |
01:35:39.240
The low points to which we go in life.
link |
01:35:41.720
So I responded and you know, the person made an attempt
link |
01:35:46.080
at suggesting that they had some kind of standards
link |
01:35:48.640
that they would consider hiring.
link |
01:35:50.360
But it's kind of like, if you could fog a mirror,
link |
01:35:53.160
like come and do this because it's 100% commission.
link |
01:35:55.680
And so I started walking up and down the street
link |
01:35:57.520
in my community selling credit card processing.
link |
01:36:00.520
And so what you learn immediately in doing that is
link |
01:36:03.480
if you walk into a business, first of all,
link |
01:36:06.640
the business owner is typically there.
link |
01:36:09.600
And you walk in the door and they can tell
link |
01:36:10.880
by how you're dressed or how you walk,
link |
01:36:12.200
whatever their pattern recognition is.
link |
01:36:14.440
And they just hate you immediately.
link |
01:36:15.680
It's like, stop wasting my time.
link |
01:36:17.040
I really am trying to get stuff done.
link |
01:36:18.400
I don't want to sit through a sales pitch.
link |
01:36:19.760
And so you have to overcome the initial get out.
link |
01:36:23.200
And then once you engage, when you say the word
link |
01:36:26.200
credit card processing, the person's like,
link |
01:36:28.640
I already hate you because I have been taken advantage
link |
01:36:31.200
of dozens of times because you all are weasels.
link |
01:36:35.520
And so I had to figure out an algorithm
link |
01:36:37.480
to get past all those different conditions.
link |
01:36:39.200
Cause I was still working on my other startup
link |
01:36:41.360
for the majority of my time.
link |
01:36:42.600
So I was doing this part time.
link |
01:36:44.000
And so I figured out that the industry really was built
link |
01:36:48.760
on people, on deceit, basically people promising things
link |
01:36:55.320
that were not reality.
link |
01:36:56.160
And so I'd walk into a business.
link |
01:36:59.320
I'd say, look, I would give you a hundred dollars.
link |
01:37:01.760
I'd put a hundred dollar bill and say,
link |
01:37:02.920
I'll give you a hundred dollars
link |
01:37:04.000
for three minutes of your time.
link |
01:37:05.440
If you don't say yes to what I'm saying,
link |
01:37:06.760
I'll give you a hundred dollars.
link |
01:37:08.400
And then he'd usually crack a smile and say, okay,
link |
01:37:10.680
like what do you got for me son?
link |
01:37:12.640
And so I'd sit down, I just opened my book and I'd say,
link |
01:37:15.000
here's the credit card industry.
link |
01:37:16.280
Here's how it works.
link |
01:37:17.120
Here are the players.
link |
01:37:17.960
Here's what they do.
link |
01:37:18.780
Here's how they deceive you.
link |
01:37:20.520
Here's what I am.
link |
01:37:21.360
I'm no different than anyone else.
link |
01:37:22.480
It's like, you're gonna process your credit cards.
link |
01:37:24.080
You're gonna get the money in the account.
link |
01:37:25.040
You're just gonna get a clean statement, you're gonna have
link |
01:37:27.640
someone who answers when you call, and you know,
link |
01:37:29.600
just like the basic, like you're okay.
link |
01:37:31.720
And people started saying yes.
link |
01:37:32.800
And then of course I went to the next business and be like,
link |
01:37:34.440
you know, Joe and Susie and whoever said yes too.
link |
01:37:37.520
And so I built a social proof structure
link |
01:37:39.400
and I became the number one salesperson
link |
01:37:42.400
out of 400 people nationwide doing this.
link |
01:37:45.400
And I worked, you know, half time
link |
01:37:47.080
still doing this other startup and.
link |
01:37:49.360
That's a brilliant strategy, by the way.
link |
01:37:51.080
It's very well, very well strategized and executed.
link |
01:37:55.680
I did it for nine months.
link |
01:37:57.640
And at the time my customer base was making,
link |
01:38:00.800
was generating around, I think it was six,
link |
01:38:03.680
if I remember correctly, $62,504 a month
link |
01:38:07.480
in overall revenues.
link |
01:38:08.520
I thought, wow, that's amazing.
link |
01:38:09.920
If I built that as my own company,
link |
01:38:13.260
I would just make $62,000 a month of income passively
link |
01:38:16.880
with these merchants processing credit cards.
link |
01:38:19.280
So I thought, hmm.
link |
01:38:20.400
And so that's when I thought I'm gonna create a company.
link |
01:38:23.720
And so then I started Braintree.
link |
01:38:25.840
And the idea was the online world was broken
link |
01:38:29.920
because PayPal had been acquired by eBay
link |
01:38:35.040
around, I think, 1999 or 2000.
link |
01:38:38.160
And eBay had not innovated much with PayPal.
link |
01:38:39.840
So it basically sat still for seven years
link |
01:38:42.880
as the software world moved along.
link |
01:38:45.160
And then authorize.net was also a company
link |
01:38:46.840
that was relatively stagnant.
link |
01:38:47.680
So you basically had software engineers
link |
01:38:49.840
who wanted modern payment tools,
link |
01:38:51.880
but there were none available for them.
link |
01:38:53.560
And so they just dealt with software they didn't like.
link |
01:38:55.080
And so with Braintree,
link |
01:38:57.320
I thought the entry point is to build software
link |
01:38:59.000
that engineers will love.
link |
01:39:00.400
And if we can find the entry point via software
link |
01:39:02.120
and make it easy and beautiful
link |
01:39:03.800
and just a magical experience
link |
01:39:05.400
and then provide customer service on top of that,
link |
01:39:06.960
that would be easy, that would be great.
link |
01:39:08.720
What I was really going after, though, was PayPal.
link |
01:39:11.560
They were the only company in payments making money.
link |
01:39:14.960
Because they had the relationship with eBay early on,
link |
01:39:19.800
people created a PayPal account,
link |
01:39:22.240
they'd fund their account with their checking account
link |
01:39:24.080
versus their credit cards.
link |
01:39:25.480
And then when they'd use PayPal to pay a merchant,
link |
01:39:28.320
PayPal had a cost of payment of zero
link |
01:39:31.000
versus if the money is coming from a credit card,
link |
01:39:33.880
you have to pay the bank the fees.
link |
01:39:35.120
So PayPal's margins were 3% on a transaction
link |
01:39:39.920
versus a typical payments company,
link |
01:39:41.600
which may be a nickel or a penny or a dime
link |
01:39:43.400
or something like that.
link |
01:39:44.920
And so I knew PayPal really was the model to replicate,
link |
01:39:48.320
but a bunch of companies had tried to do that.
link |
01:39:50.400
They tried to come in and build a two sided marketplace.
link |
01:39:52.600
So get consumers to fund the checking account
link |
01:39:55.280
and the merchants to accept it,
link |
01:39:56.680
but they'd all failed because building
link |
01:39:58.240
both sides of a two sided marketplace at the same time is very hard.
link |
01:40:01.840
So my plan was I'm going to build a company
link |
01:40:04.600
and get the best merchants in the whole world
link |
01:40:06.920
to use our service.
link |
01:40:08.440
Then in year five, I'm going to have,
link |
01:40:10.760
I'm going to acquire a consumer payments company
link |
01:40:12.800
and I'm going to bring the two together.
link |
01:40:14.960
And to focus on the merchant side and then get
link |
01:40:19.160
the payments company that does the consumer,
link |
01:40:21.240
the other side of it.
link |
01:40:24.640
This is the plan I presented when I was
link |
01:40:26.680
at University of Chicago.
link |
01:40:28.440
And weirdly it happened exactly like that.
link |
01:40:32.360
So four years in our customer base included Uber,
link |
01:40:36.160
Airbnb, GitHub, 37signals, now Basecamp.
link |
01:40:40.240
We had a fantastic collection of companies
link |
01:40:43.640
that represented the fastest growing,
link |
01:40:47.160
some of the fastest growing tech companies in the world.
link |
01:40:49.280
And then we met up with Venmo and they had done
link |
01:40:53.440
a remarkable job in building product.
link |
01:40:55.040
And they did something very counterintuitive,
link |
01:40:56.800
which is make public your private financial transactions
link |
01:40:59.400
which people previously thought were something
link |
01:41:01.760
that should be hidden from others.
link |
01:41:03.960
And we acquired Venmo and at that point we now had,
link |
01:41:08.720
we replicated the model because now people could fund
link |
01:41:11.840
their Venmo account with their checking account,
link |
01:41:13.240
keep money in the account.
link |
01:41:14.960
And then you could just plug Venmo as a form of payment.
link |
01:41:17.520
And so I think PayPal saw that,
link |
01:41:19.880
that we were getting the best merchants in the world.
link |
01:41:22.720
We had people using Venmo.
link |
01:41:25.440
They were both the up and coming millennials at the time
link |
01:41:28.120
who had so much influence online.
link |
01:41:30.440
And so they came in and offered us an attractive number.
link |
01:41:34.840
And my goal was not to build
link |
01:41:39.240
the biggest payments company in the world.
link |
01:41:40.920
It wasn't to try to climb the Forbes billionaire list.
link |
01:41:44.160
It was, the objective was I want to earn enough money
link |
01:41:48.760
so that I can basically dedicate my attention
link |
01:41:52.640
to doing something that could potentially be useful
link |
01:41:56.200
on a society wide scale.
link |
01:41:58.800
And more importantly, that could be considered to be valuable
link |
01:42:03.240
from the vantage point of 2050, 2100 and 2500.
link |
01:42:08.240
So thinking about it on a few hundred year timescale.
link |
01:42:13.360
And there was a certain amount of money I needed to do that.
link |
01:42:16.240
So I didn't require the permission of anybody to do that.
link |
01:42:20.280
And so what PayPal offered was sufficient for me
link |
01:42:22.760
to get that amount of money to basically have a go.
link |
01:42:25.000
And that's when I set off to survey everything
link |
01:42:30.240
I could identify in existence, to say,
link |
01:42:32.600
of anything in the entire world I could do.
link |
01:42:35.320
What one thing could I do
link |
01:42:36.760
that would actually have the highest value potential
link |
01:42:40.480
for the species?
link |
01:42:42.160
And so it took me a little while to arrive at brain interfaces,
link |
01:42:44.680
but.
link |
01:42:46.680
Payments in themselves are revolutionary technologies
link |
01:42:51.960
that can change the world.
link |
01:42:53.960
Like let's not forget that too easily.
link |
01:43:00.760
I mean, obviously you know this,
link |
01:43:02.480
but there's quite a few lovely folks
link |
01:43:08.680
who are now fascinated with the space of cryptocurrency.
link |
01:43:13.080
And payments are very much connected to this,
link |
01:43:17.000
but in general, just money.
link |
01:43:18.600
And many of the folks I've spoken with,
link |
01:43:21.240
they also kind of connect that
link |
01:43:22.840
to not just purely financial discussions,
link |
01:43:25.840
but philosophical and political discussions.
link |
01:43:28.600
And they see Bitcoin as a way, almost as activism,
link |
01:43:34.680
almost as a way to resist the corruption
link |
01:43:38.000
of centralized centers of power.
link |
01:43:40.720
And sort of basically in the 21st century,
link |
01:43:42.760
decentralizing control.
link |
01:43:44.440
Whether that's Bitcoin or other cryptocurrencies,
link |
01:43:46.640
they see that's one possible way to give power
link |
01:43:51.800
to those that live in regimes that are corrupt
link |
01:43:55.480
or are not respectful of human rights
link |
01:43:58.240
and all those kinds of things.
link |
01:43:59.760
What's your sense, just all your expertise with payments
link |
01:44:03.960
and seeing how that changed the world,
link |
01:44:05.600
what's your sense about the lay of the land
link |
01:44:09.680
for the future of Bitcoin or other cryptocurrencies
link |
01:44:12.440
in the positive impact it may have on the world?
link |
01:44:16.000
To be clear, my communication wasn't meant
link |
01:44:20.120
to minimize payments or to denigrate it in any way.
link |
01:44:23.520
It was an attempt at communication
link |
01:44:26.480
that when I was surveying the world,
link |
01:44:30.280
it was an algorithm of what could I individually do?
link |
01:44:35.840
So there are things that exist
link |
01:44:38.320
that have a lot of potential that can be done.
link |
01:44:41.120
And then there's a filtering of how many people
link |
01:44:43.960
are qualified to do this given thing.
link |
01:44:46.440
And then there's a further characterization
link |
01:44:48.200
that can be done of, okay, given the number
link |
01:44:49.840
of qualified people, will somebody be a unique outperformer
link |
01:44:54.840
of that group, to make something happen
link |
01:44:57.960
that otherwise couldn't get done?
link |
01:44:59.520
So there's a process of assessing
link |
01:45:02.480
where can you add unique value in the world?
link |
01:45:04.480
And some of that has to do with you being very formal
link |
01:45:08.560
and calculative here, but some of that is just like,
link |
01:45:11.080
what do you sense, like part of that equation
link |
01:45:15.040
is how much passion you sense within yourself
link |
01:45:17.240
to be able to drive that through,
link |
01:45:19.080
to discover the impossibilities and make them possible.
link |
01:45:21.520
That's right, and so we were at Braintree,
link |
01:45:23.920
I think we were the first company to integrate Coinbase
link |
01:45:26.880
into our, I think we were the first payments company
link |
01:45:30.080
to formally incorporate crypto, if I'm not mistaken.
link |
01:45:35.600
For people who are not familiar,
link |
01:45:36.720
Coinbase is a place where you can trade cryptocurrencies.
link |
01:45:39.640
Yeah, which was one of the only places you could.
link |
01:45:42.080
So we were early in doing that.
link |
01:45:45.080
And of course, this was in the year 2013.
link |
01:45:49.080
So an eternity ago in cryptocurrency land.
link |
01:45:52.400
I concur with the statement you made
link |
01:45:56.320
of the potential of the principles
link |
01:46:01.520
underlying cryptocurrencies.
link |
01:46:04.200
And that many of the things that they're building
link |
01:46:07.640
in the name of money and of moving value
link |
01:46:13.640
is equally applicable to the brain
link |
01:46:16.400
and equally applicable to how the brain interacts
link |
01:46:19.360
with the rest of the world
link |
01:46:20.840
and how we would imagine doing goal alignment with people.
link |
01:46:25.600
So to me, it's a continuous spectrum of possibility.
link |
01:46:29.080
And your question is isolated on the money.
link |
01:46:32.200
And I think it just is basically a scaffolding layer
link |
01:46:35.000
for all of society.
link |
01:46:35.840
So you don't see this money as particularly distinct
link |
01:46:38.720
from other? I don't.
link |
01:46:39.760
I think we at Kernel, we will benefit greatly
link |
01:46:44.520
from the progress being made in cryptocurrency
link |
01:46:47.080
because it will be a similar technology stack
link |
01:46:48.960
we will want to use for many things we want to accomplish.
link |
01:46:51.680
And so I'm bullish on what's going on
link |
01:46:55.040
and think it could greatly enhance brain interfaces
link |
01:46:58.800
and the value of the brain interface ecosystem.
link |
01:47:01.160
I mean, is there something you could say about,
link |
01:47:02.920
first of all, bullish on cryptocurrency versus fiat money?
link |
01:47:05.960
So do you have a sense that in 21st century
link |
01:47:08.760
cryptocurrency will be embraced by governments
link |
01:47:13.040
and change the face of governments,
link |
01:47:16.720
the structure of governments?
link |
01:47:18.520
It's the same way I think about my diet,
link |
01:47:24.360
where previously it was conscious Brian,
link |
01:47:28.720
looking at foods in certain biochemical states.
link |
01:47:32.240
Am I hungry? Am I irritated? Am I depressed?
link |
01:47:34.160
And then I choose based upon those momentary windows.
link |
01:47:37.440
Do I eat at night when I'm fatigued
link |
01:47:39.440
and I have low willpower?
link |
01:47:40.680
Am I going to pig out on something?
link |
01:47:42.840
And the current monetary system is based
link |
01:47:46.200
upon human conscious decision making.
link |
01:47:48.360
And politics and power and this whole mess of things.
link |
01:47:51.480
And what I like about the building blocks
link |
01:47:55.360
of cryptocurrencies, it's methodical, it's structured,
link |
01:47:58.640
it is accountable, it's transparent.
link |
01:48:02.120
And so it introduces this scaffolding,
link |
01:48:04.720
which I think, again, is the right starting point
link |
01:48:07.640
for how we think about building
link |
01:48:09.400
next generation institutions for society.
link |
01:48:13.520
And that's why I think it's much broader than money.
link |
01:48:16.480
So I guess what you're saying is Bitcoin is the demotion
link |
01:48:19.680
of the conscious mind as well.
link |
01:48:23.480
In the same way you were talking about diet,
link |
01:48:25.240
it's like giving less priority to the ups and downs
link |
01:48:29.680
of any one particular human mind, in this case your own,
link |
01:48:33.000
and giving more power to the sort of data driven.
link |
01:48:37.000
Yes, yeah, I think that is accurate,
link |
01:48:41.200
that cryptocurrency is a version of what I would call
link |
01:48:48.480
my autonomous self that I'm trying to build.
link |
01:48:51.320
It is an introduction of an autonomous system
link |
01:48:54.960
of value exchange and the process of value creation
link |
01:49:02.040
in the society, yes, I see their similarities.
link |
01:49:04.880
So I guess what you're saying is Bitcoin
link |
01:49:06.520
will somehow help me not pig out at night,
link |
01:49:08.680
or the equivalent of, speaking of diet,
link |
01:49:11.800
if we could just linger on that topic a little bit,
link |
01:49:15.440
we already talked about your blog post of I fired myself,
link |
01:49:20.080
I fired Brian, the evening Brian,
link |
01:49:23.560
who's too willing to, not making good decisions
link |
01:49:29.760
for the long term well being and happiness
link |
01:49:32.220
of the entirety of the organism.
link |
01:49:34.840
Basically you were like pigging out at night.
link |
01:49:36.720
But it's interesting, because I do the same,
link |
01:49:41.120
in fact I often eat one meal a day,
link |
01:49:45.480
and like I have been this week actually,
link |
01:49:50.480
especially when I travel, and it's funny
link |
01:49:54.320
that it never occurred to me to just basically look
link |
01:49:59.520
at the fact that I'm able to be much smarter
link |
01:50:02.360
about my eating decisions in the morning
link |
01:50:04.240
and the afternoon than I am at night.
link |
01:50:06.720
So if I eat one meal a day, why not eat
link |
01:50:09.080
that one meal a day in the morning?
link |
01:50:12.680
Like I'm not, it never occurred to me,
link |
01:50:16.040
this revolutionary act, until you've outlined that.
link |
01:50:21.240
So maybe, can you give some details,
link |
01:50:23.220
and this is just you, this is one person,
link |
01:50:26.360
Brian, arrives at a particular thing that they do,
link |
01:50:29.800
but it's fascinating to kind of look
link |
01:50:32.560
at this one particular case study,
link |
01:50:34.080
so what works for you, diet wise?
link |
01:50:36.340
What's your actual diet, what do you eat,
link |
01:50:38.760
how often do you eat?
link |
01:50:40.200
My current protocol is basically the result
link |
01:50:44.220
of thousands of experiments and decision making.
link |
01:50:48.800
So I do this every 90 days, I do the tests,
link |
01:50:51.880
I do the cycle throughs, then I measure again,
link |
01:50:54.720
and then I'm measuring all the time.
link |
01:50:56.280
And so what I, of course I'm optimizing
link |
01:50:59.440
for my biomarkers, I want perfect cholesterol
link |
01:51:01.800
and I want perfect blood glucose levels
link |
01:51:03.720
and perfect DNA methylation processes.
link |
01:51:10.440
I also want perfect sleep.
link |
01:51:12.480
And so for example, recently, the past two weeks,
link |
01:51:14.960
my resting heart rate has been at 42 when I sleep.
link |
01:51:20.160
And when my resting heart rate's at 42,
link |
01:51:22.120
my HRV is at its highest.
link |
01:51:24.120
And I wake up in the morning feeling more energized
link |
01:51:29.200
than any other configuration.
link |
01:51:30.420
And so I know from all these processes
link |
01:51:32.400
that eating at roughly 8:30 in the morning,
link |
01:51:34.720
right after I work out on an empty stomach,
link |
01:51:37.640
creates enough distance between that completed eating
link |
01:51:41.460
and bedtime where I have almost no digestion processes
link |
01:51:45.520
going on in my body,
link |
01:51:47.360
therefore my resting heart rate goes very low.
link |
01:51:49.800
And when my resting heart rate's very low,
link |
01:51:51.380
I sleep with high quality.
link |
01:51:52.440
And so basically I've been trying to optimize
link |
01:51:54.360
the entirety of what I eat to my sleep quality.
link |
01:51:58.280
And my sleep quality then of course feeds into my willpower
link |
01:52:00.480
so it creates this virtuous cycle.
link |
01:52:02.600
And so at 8:30 what I do is I eat what I call super veggie,
link |
01:52:06.440
which is, it's a pudding of 250 grams of broccoli,
link |
01:52:10.280
150 grams of cauliflower,
link |
01:52:11.600
and a whole bunch of other vegetables
link |
01:52:13.360
Then I eat what I call nutty pudding, which is.
link |
01:52:16.080
You make the pudding yourself?
link |
01:52:17.040
Like, what do you call it?
link |
01:52:20.320
Like a veggie mix, whatever thing, like a blender?
link |
01:52:23.520
Yeah, you can be made in a high speed blender.
link |
01:52:25.680
But basically I eat the same thing every day,
link |
01:52:27.800
a veggie bowl in the form of pudding,
link |
01:52:30.920
and then a bowl in the form of nuts.
link |
01:52:34.400
And then I have.
link |
01:52:35.480
Vegan.
link |
01:52:36.320
Vegan, yes.
link |
01:52:37.140
Vegan, so that's fat and that's like,
link |
01:52:40.640
that's fat and carbs and that's the protein and so on.
link |
01:52:43.600
Then I have a third dish.
link |
01:52:44.440
Does it taste good?
link |
01:52:45.280
I love it.
link |
01:52:46.200
I love it so much I dream about it.
link |
01:52:49.480
Yeah, that's awesome.
link |
01:52:50.880
This is a.
link |
01:52:52.200
And then I have a third dish which is,
link |
01:52:53.760
it changes every day.
link |
01:52:55.280
Today it was kale and spinach and sweet potato.
link |
01:52:58.200
And then I take about 20 supplements
link |
01:53:03.040
that
link |
01:53:05.600
hopefully constitute a perfect nutritional profile.
link |
01:53:09.120
So what I'm trying to do is create the perfect diet
link |
01:53:12.040
for my body every single day.
link |
01:53:14.200
Where sleep is part of the optimization.
link |
01:53:16.360
That's right.
link |
01:53:17.200
You're like, one of the things you're really tracking.
link |
01:53:18.400
I mean, can you, well, I have a million question,
link |
01:53:21.060
but 20 supplements, like what kind are like,
link |
01:53:23.680
would you say are essential?
link |
01:53:25.160
Cause I only take, I only take athletic greens.com slash.
link |
01:53:30.560
That's like the multivitamin essentially.
link |
01:53:33.120
That's like the lazy man, you know, like,
link |
01:53:35.720
like if you don't actually want to think about shit,
link |
01:53:37.640
that's what you take and then fish oil and that's it.
link |
01:53:40.480
That's all I take.
link |
01:53:41.620
Yeah, you know, Alfred North Whitehead said,
link |
01:53:45.080
civilization advances as it extends the number
link |
01:53:48.760
of important operations it can do
link |
01:53:50.540
without thinking about them.
link |
01:53:52.540
Yes.
link |
01:53:53.380
So my objective on this is I want an algorithm
link |
01:53:57.360
for perfect health that I never have to think about.
link |
01:54:00.920
And then I want that system to be scalable to anybody
link |
01:54:03.840
so that they don't have to think about it.
link |
01:54:05.480
And right now it's expensive for me to do it.
link |
01:54:07.720
It's time consuming for me to do it.
link |
01:54:09.060
And I have infrastructure to do it,
link |
01:54:10.400
but the future of being human is not going
link |
01:54:14.800
to the grocery store and deciding what to eat.
link |
01:54:17.520
It's also not reading scientific papers,
link |
01:54:19.560
trying to decide this thing or that thing.
link |
01:54:21.360
It's all N of one.
link |
01:54:23.000
So it's devices on the outside and inside your body,
link |
01:54:26.040
assessing real time what your body needs
link |
01:54:28.280
and then creating closed loop systems for that to happen.
link |
01:54:30.760
Yeah, so right now you're doing the data collection
link |
01:54:33.760
and you're being the scientist,
link |
01:54:35.860
it'd be much better if the data collection
link |
01:54:39.060
was essentially being done for you
link |
01:54:40.820
and you can outsource that to another scientist
link |
01:54:43.760
that's doing the N of one study of you.
link |
01:54:46.040
That's right, because every time I spend time thinking
link |
01:54:48.540
about this or executing, spending time on it,
link |
01:54:50.400
I'm spending less time thinking about building kernel
link |
01:54:53.680
or the future of being human.
link |
01:54:55.320
And so it's, we just all have the budget
link |
01:54:58.000
of our capacity on an everyday basis
link |
01:55:01.320
and we will scaffold our way up out of this.
link |
01:55:05.580
And so, yeah, hopefully what I'm doing is really,
link |
01:55:07.800
it serves as a model that others can also build on.
link |
01:55:11.480
That's why I wrote about it,
link |
01:55:12.520
is hopefully people can then take it and improve upon it.
link |
01:55:15.520
I hold nothing sacred.
link |
01:55:16.500
I change my diet almost every day
link |
01:55:19.680
based upon some new test results or science
link |
01:55:21.680
or something like that, but.
link |
01:55:23.000
Can you maybe elaborate on the sleep thing?
link |
01:55:24.920
Why is sleep so important?
link |
01:55:27.360
And why, presumably, like what does good sleep mean to you?
link |
01:55:34.680
I think sleep is a contender for being the most powerful
link |
01:55:43.900
health intervention in existence.
link |
01:55:46.000
It's a contender.
link |
01:55:49.440
I mean, it's magical what it does if you're well rested
link |
01:55:53.480
and what your body can do.
link |
01:55:56.760
And I mean, for example, I know when I eat close
link |
01:56:00.400
to my bedtime and I've done a systematic study for years
link |
01:56:05.480
looking at like 15 minute increments on time of day
link |
01:56:07.800
on where I eat my last meal,
link |
01:56:09.760
my willpower is directly correlated
link |
01:56:12.600
to the amount of deep sleep I get.
link |
01:56:14.520
So my ability to not binge eat at night
link |
01:56:18.660
when rascal Brian's out and about
link |
01:56:22.220
is based upon how much deep sleep I got the night before.
link |
01:56:24.460
Yeah, there's a lot to that, yeah.
link |
01:56:27.340
And so I've seen it manifest itself.
link |
01:56:30.300
And so I think the way I summarize this is
link |
01:56:34.020
in society we've had this myth of,
link |
01:56:36.260
we tell stories, for example, of entrepreneurship
link |
01:56:39.100
where this person was so amazing,
link |
01:56:41.580
they stayed at the office for three days
link |
01:56:43.180
and slept under their desk.
link |
01:56:44.860
And we say, wow, that's amazing, that's amazing.
link |
01:56:49.740
And now I think we're headed towards a state
link |
01:56:52.620
where we'd say that's primitive
link |
01:56:54.920
and really not a good idea on every level.
link |
01:56:58.680
And so the new mythology is going to be the exact opposite.
link |
01:57:05.020
Yeah, by the way, just to sort of maybe push back
link |
01:57:08.540
a little bit on that idea.
link |
01:57:10.180
Did you sleep under your desk, Lex?
link |
01:57:13.780
Well, yeah, a lot.
link |
01:57:14.620
I'm a big believer in that actually.
link |
01:57:16.020
I'm a big believer in chaos
link |
01:57:19.460
and giving into your passion
link |
01:57:22.420
and sometimes doing things that are out of the ordinary
link |
01:57:25.540
that are not trying to optimize health
link |
01:57:29.300
for certain periods of time, in service of your passions,
link |
01:57:35.740
is a signal to yourself that you're throwing everything into it.
link |
01:57:39.540
So I think what you're referring to
link |
01:57:42.340
is how to have good performance for prolonged periods
link |
01:57:46.180
of time.
link |
01:57:47.580
I think there's moments in life
link |
01:57:50.620
where you need to throw all of that away,
link |
01:57:52.640
all the plans away, all the structure away.
link |
01:57:54.920
So I'm not sure I have an eloquent way
link |
01:58:00.680
describing exactly what I'm talking about,
link |
01:58:02.340
but it all depends on people, people are different,
link |
01:58:07.340
but there's a danger of over optimization
link |
01:58:11.380
to where you don't just give into the madness
link |
01:58:14.420
of the way your brain flows.
link |
01:58:17.940
I mean, to push back on my pushback is like,
link |
01:58:21.940
it's nice to have a setup where the foundations
link |
01:58:29.100
of your brain are not messed with.
link |
01:58:31.660
So you have a fixed foundation where the diet is fixed,
link |
01:58:35.340
where the sleep is fixed and that all of that is optimal
link |
01:58:37.820
and the chaos happens in the space of ideas
link |
01:58:39.940
as opposed to the space of biology.
link |
01:58:42.980
But I'm not sure if there's a,
link |
01:58:47.880
that requires real discipline and forming habits.
link |
01:58:50.900
There's some aspect to which some of the best days
link |
01:58:54.940
and weeks of my life have been, yeah,
link |
01:58:56.460
sleeping under a desk kind of thing.
link |
01:58:58.520
And I don't, I'm not too willing to let go
link |
01:59:03.520
of things that empirically worked
link |
01:59:08.040
for things that work in theory.
link |
01:59:11.460
And so I'm, again, I'm absolutely with you on sleep.
link |
01:59:16.040
Also, I'm with you on sleep conceptually,
link |
01:59:19.560
but I'm also very humbled to understand
link |
01:59:23.900
that for different people,
link |
01:59:26.760
good sleep means different things.
link |
01:59:28.760
I'm very hesitant to trust science on sleep.
link |
01:59:33.040
I think you should also be a scholar of your own body.
link |
01:59:35.760
Again, the experiment of N of 1.
link |
01:59:38.200
I'm not so sure that a full night's sleep is great for me.
link |
01:59:44.000
There is something about that power nap
link |
01:59:47.840
that I just have not fully studied yet,
link |
01:59:50.160
but that nap is something special.
link |
01:59:52.560
That I'm not sure I found the optimal thing.
link |
01:59:55.960
So like there's a lot to be explored
link |
01:59:57.600
to what is exactly optimal amount of sleep,
link |
02:00:00.200
optimal kind of sleep combined with diet
link |
02:00:02.400
and all those kinds of things.
link |
02:00:03.240
I mean, that all maps to the sort of data driven
link |
02:00:05.560
truth, exactly what you're referring to.
link |
02:00:08.840
Here's a data point for your consideration.
link |
02:00:10.520
Yes.
link |
02:00:12.760
The progress in biology over the past, say decade,
link |
02:00:17.400
has been stunning.
link |
02:00:18.800
Yes.
link |
02:00:19.640
And it now appears as if we will be able to replace
link |
02:00:25.220
our organs via xenotransplantation.
link |
02:00:28.300
And so we probably have a path to replace
link |
02:00:33.880
and regenerate every organ of your body,
link |
02:00:37.840
except for your brain.
link |
02:00:42.480
You can lose your hand and your arm and a leg.
link |
02:00:45.280
You can have an artificial heart.
link |
02:00:47.520
You can't operate without your brain.
link |
02:00:49.640
And so when you make that trade off decision
link |
02:00:52.180
of whether you're going to sleep under the desk or not
link |
02:00:54.760
and go all out for a four day marathon, right?
link |
02:01:00.280
There's a cost benefit trade off of what's going on,
link |
02:01:02.760
what's happening to your brain in that situation.
link |
02:01:05.740
We don't know the consequences
link |
02:01:07.600
of modern day life on our brain.
link |
02:01:09.940
We don't, it's the most valuable organ in our existence.
link |
02:01:14.940
And we don't know what's going on
link |
02:01:18.200
in how we're treating it today with stress
link |
02:01:20.660
and with sleep and with diet.
link |
02:01:23.400
And to me, then if you say that you're trying to,
link |
02:01:28.520
you're trying to optimize life
link |
02:01:30.000
for whatever things you're trying to do.
link |
02:01:32.920
With the progress in anti aging and biology,
link |
02:01:36.380
the game is very soon going to become different
link |
02:01:38.160
than what it is right now with organ rejuvenation,
link |
02:01:41.240
organ replacement.
link |
02:01:42.660
And I would conjecture that we will value
link |
02:01:48.400
the health status of our brain above all things.
link |
02:01:52.440
Yeah, no, absolutely.
link |
02:01:53.960
Everything you're saying is true, but we die.
link |
02:01:58.200
We die pretty quickly, life is short.
link |
02:02:01.160
And I'm one of those people that I would rather die in battle
link |
02:02:08.320
than stay safe at home.
link |
02:02:11.160
It's like, yeah, you look at kind of,
link |
02:02:14.860
there's a lot of things that you can reasonably say,
link |
02:02:17.120
these are, this is the smart thing to do
link |
02:02:19.040
that can prevent you, that becomes conservative,
link |
02:02:21.640
that can prevent you from fully embracing life.
link |
02:02:24.440
I think ultimately you can be very intelligent
link |
02:02:27.560
and data driven and also embrace life.
link |
02:02:30.760
But I err on the side of embracing life.
link |
02:02:33.760
It's very, it takes a very skillful person
link |
02:02:36.720
to not sort of that hovering parent that says,
link |
02:02:40.280
you know what, there's a 3% chance that if you go out,
link |
02:02:44.600
if you go out by yourself and play, you're going to die,
link |
02:02:47.700
get run over by a car, come to a slow or a sudden end.
link |
02:02:51.820
And I am more a supporter of just go out there.
link |
02:02:57.480
If you die, you die.
link |
02:02:59.280
And that's a, it's a balance you have to strike.
link |
02:03:02.880
I think there's a balance to strike
link |
02:03:04.880
in the longterm optimization and short term freedom.
link |
02:03:11.760
For me, for a programmer, for a programming mind,
link |
02:03:15.320
I tend to over optimize and I'm very cautious
link |
02:03:18.240
and afraid of that, to not over optimize
link |
02:03:21.440
and thereby be overly cautious, suboptimally cautious
link |
02:03:26.080
about everything I do.
link |
02:03:27.520
And that's the ultimate thing I'm trying to optimize for.
link |
02:03:30.360
It's funny you said like sleep and all those kinds of things.
link |
02:03:33.440
I tend to think, this is, you're being more precise
link |
02:03:38.440
than I am, but I think I tend to want to minimize stress,
link |
02:03:47.440
into which everything feeds, from your sleep
link |
02:03:49.900
and all those kinds of things.
link |
02:03:50.860
But I worry that whenever I'm trying to be too strict
link |
02:03:54.800
with myself, then the stress goes up
link |
02:03:57.640
when I don't follow the strictness.
link |
02:04:00.520
And so you have to kind of, it's a weird,
link |
02:04:02.880
it's a, there's so many variables in an objective function
link |
02:04:05.680
that it's hard to get right.
link |
02:04:07.320
And sort of not giving a damn about sleep
link |
02:04:09.640
and not giving a damn about diet is a good thing
link |
02:04:11.940
to inject in there every once in a while
link |
02:04:14.040
for somebody who's trying to optimize everything.
link |
02:04:17.080
But that's me just trying to, it's exactly like you said,
link |
02:04:20.240
you're just a scientist, I'm a scientist of myself,
link |
02:04:22.700
you're a scientist of yourself.
link |
02:04:24.120
It'd be nice if somebody else was doing it
link |
02:04:25.800
and had much better data, because I don't trust
link |
02:04:28.680
my conscious mind and I pigged out last night
link |
02:04:30.820
on some brisket in LA that I regret deeply.
link |
02:04:34.120
It's just so, uh.
link |
02:04:37.040
There's no point to anything I just said.
link |
02:04:38.880
But.
link |
02:04:39.720
But.
link |
02:04:40.540
What is the nature of your regret on the brisket?
link |
02:04:46.680
Is it, do you wish you hadn't eaten it entirely?
link |
02:04:49.800
Is it that you wish you hadn't eaten as much as you did?
link |
02:04:51.800
Is it that?
link |
02:04:55.200
I think, well, the most regret, I mean,
link |
02:04:58.520
if we want to be specific, I drank way too much like that.
link |
02:05:03.520
Like diet soda.
link |
02:05:05.260
My biggest regret is like having drank so much diet soda.
link |
02:05:08.100
That's the thing that really was the problem.
link |
02:05:10.380
I had trouble sleeping because of that.
link |
02:05:12.160
Because I was like programming and then I was editing.
link |
02:05:14.660
And so I'd stay up late at night
link |
02:05:15.740
and then I had to get up to go pee a few times
link |
02:05:18.500
and it was just a mess.
link |
02:05:19.460
A mess of a night.
link |
02:05:20.360
It was, well, it's not really a mess,
link |
02:05:22.340
but like it's so many, it's like the little things.
link |
02:05:25.980
I know if I just eat, I drink a little bit of water
link |
02:05:30.100
and that's it, and there's a certain,
link |
02:05:31.780
all of us have perfect days that we know diet wise
link |
02:05:36.580
and so on that's good to follow, you feel good.
link |
02:05:39.180
I know what it takes for me to do that.
link |
02:05:41.700
I didn't fully do that and thereby,
link |
02:05:43.820
because there's an avalanche effect
link |
02:05:47.660
where the other sources of stress,
link |
02:05:50.140
all the other to do items I have piled on,
link |
02:05:53.180
my failure to execute on some basic things
link |
02:05:55.660
that I know make me feel good and all of that combines
link |
02:05:58.860
to create a mess of a day.
link |
02:06:02.460
But some of that chaos, you have to be okay with it,
link |
02:06:05.020
but some of it I wish was a little bit more optimal.
link |
02:06:07.900
And your ideas about eating in the morning
link |
02:06:11.020
are quite interesting as an experiment to try.
link |
02:06:14.380
Can you elaborate, are you eating once a day?
link |
02:06:18.540
Yes.
link |
02:06:19.780
In the morning and that's it.
link |
02:06:22.240
Can you maybe speak to how that,
link |
02:06:24.380
you spoke, it's funny, you spoke about the metrics of sleep,
link |
02:06:30.660
but you're also, you run a business,
link |
02:06:34.780
you're incredibly intelligent,
link |
02:06:36.180
mostly your happiness and success
link |
02:06:40.260
relies on you thinking clearly.
link |
02:06:43.240
So how does that affect your mind and your body
link |
02:06:45.900
in terms of performance?
link |
02:06:47.180
So not just sleep, but actual mental performance.
link |
02:06:50.980
As you were explaining your objective function of,
link |
02:06:53.620
for example, in the criteria you were including,
link |
02:06:56.660
you like certain neurochemical states,
link |
02:06:59.740
like you like feeling like you're living life,
link |
02:07:02.520
that life has enjoyment,
link |
02:07:04.580
that sometimes you want to disregard certain rules
link |
02:07:07.540
to have a moment of passion, of focus.
link |
02:07:10.220
There's this architecture of the way Lex is,
link |
02:07:13.260
which makes you happy as a story you tell,
link |
02:07:16.340
as something you kind of experience,
link |
02:07:17.740
maybe the experience is a bit more complicated,
link |
02:07:19.340
but it's in this idea you have, this is a version of you.
link |
02:07:22.500
And the reason why I maintain the schedule I do
link |
02:07:26.860
is I've chosen a game to say,
link |
02:07:29.620
I would like to live a life
link |
02:07:31.780
where I care more about what intelligent,
link |
02:07:37.740
what people who live in the year 2500
link |
02:07:41.660
think of me than I do today.
link |
02:07:44.780
That's the game I'm trying to play.
link |
02:07:46.620
And so therefore the only thing I really care about
link |
02:07:51.140
on this optimization is trying to see past myself,
link |
02:07:56.420
past my limitations, using zeroth-principle thinking,
link |
02:08:01.060
pull myself out of this contextual mesh we're in right now
link |
02:08:04.220
and say, what will matter 100 years from now
link |
02:08:07.380
and 200 years from now?
link |
02:08:08.460
What are the big things really going on
link |
02:08:11.220
that are defining reality?
link |
02:08:13.300
And I find that if I were to hang out with Diet Soda Lex
link |
02:08:22.300
and Diet Soda Brian were to play along with that
link |
02:08:24.940
and my deep sleep were to get crushed as a result,
link |
02:08:28.500
my mind would not be on what matters
link |
02:08:30.580
in 100 years or 200 years or 300 years.
link |
02:08:32.140
I would be irritable.
link |
02:08:34.300
I would be, I'd be in a different state.
link |
02:08:37.460
And so it's just gameplay selection.
link |
02:08:41.100
It's what you and I have chosen to think about.
link |
02:08:43.660
It's what we've chosen to work on.
link |
02:08:47.500
And this is why I'm saying that no generation of humans
link |
02:08:51.740
has ever been afforded the opportunity
link |
02:08:54.580
to look at their lifespan and contemplate
link |
02:08:58.660
that they will have the possibility of experiencing
link |
02:09:03.180
an evolved form of consciousness that is undeniable.
link |
02:09:06.860
They would fall in a zero category of potential.
link |
02:09:10.980
That to me is the most exciting thing in existence.
link |
02:09:14.100
And I would not trade any momentary neurochemical state
link |
02:09:19.700
right now in exchange for that.
link |
02:09:21.500
I would, I'd be willing to deprive myself
link |
02:09:23.660
of all momentary joy in pursuit of that goal
link |
02:09:27.060
because that's what makes me happy.
link |
02:09:29.780
That's brilliant.
link |
02:09:30.780
But I'm a bit, I just looked it up.
link |
02:09:34.300
I'm with, I just looked up Braveheart's speech
link |
02:09:38.620
and William Wallace, but I don't know if you've seen it.
link |
02:09:41.780
Fight and you may die, run and you'll live at least a while.
link |
02:09:45.500
And dying in your beds many years from now,
link |
02:09:48.020
would you be willing to trade all the days
link |
02:09:51.420
from this day to that for one chance,
link |
02:09:53.700
just one chance, picture Mel Gibson saying this,
link |
02:09:57.060
to come back here and tell our enemies
link |
02:09:59.060
that they may take our lives with growing excitement,
link |
02:10:03.700
but they'll never take our freedom.
link |
02:10:06.380
I get excited every time I see that in the movie,
link |
02:10:08.700
but that's kind of how I approach life and eating.
link |
02:10:11.020
Do you think they were tracking their sleep?
link |
02:10:13.140
They were not tracking their sleep
link |
02:10:14.460
and they ate way too much brisket
link |
02:10:16.180
and they were fat, unhealthy, died early,
link |
02:10:19.420
and were primitive.
link |
02:10:22.420
But there's something in my ape brain
link |
02:10:25.420
that's attracted to that, even though most of my life
link |
02:10:29.340
is fully aligned with the way you see yours.
link |
02:10:32.620
Part of it is for comedy, of course,
link |
02:10:34.100
but part of it is I'm almost afraid of overoptimization.
link |
02:10:38.820
Really what you're saying though,
link |
02:10:39.860
if we're looking at this,
link |
02:10:41.580
let's say from a first principles perspective,
link |
02:10:43.460
when you read those words,
link |
02:10:44.700
they conjure up certain life experiences,
link |
02:10:46.540
but you're basically saying,
link |
02:10:47.460
I experienced a certain neurotransmitter state
link |
02:10:50.340
when these things are in action.
link |
02:10:53.060
That's all you're saying.
link |
02:10:53.940
So whether it's that or something else,
link |
02:10:55.300
you're just saying you have a selection
link |
02:10:57.180
for the state you want for your body.
link |
02:10:59.380
And so if you're an engineer of consciousness,
link |
02:11:02.780
that should just be engineerable.
link |
02:11:05.180
And that's just triggering certain chemical reactions.
link |
02:11:08.260
And so it doesn't mean they have to be mutually exclusive.
link |
02:11:11.300
You can have that and experience that
link |
02:11:12.980
and also not sacrifice long-term health.
link |
02:11:15.780
And I think that's the potential of where we're going
link |
02:11:17.540
is we don't have to assume they are trade offs
link |
02:11:23.980
that must be had.
link |
02:11:25.660
Absolutely.
link |
02:11:26.500
And so I guess for my particular brain,
link |
02:11:28.860
it's useful to have the outlier experiences
link |
02:11:32.100
that also come along with the illusion of free will
link |
02:11:35.660
where I chose those experiences
link |
02:11:37.780
that make me feel like it's freedom.
link |
02:11:39.540
Listen, going to Texas made me realize I spent,
link |
02:11:42.380
so I still am, but I lived at Cambridge at MIT
link |
02:11:46.740
and I never felt like home there.
link |
02:11:49.140
I felt like home in the space of ideas with the colleagues,
link |
02:11:52.580
like when I was actually discussing ideas,
link |
02:11:54.340
but there is something about the constraints,
link |
02:11:58.780
how cautious people are,
link |
02:12:00.260
how much they valued also kind of a material success,
link |
02:12:05.140
career success.
link |
02:12:06.820
When I showed up to Texas, it felt like I belong.
link |
02:12:12.020
That was very interesting, but that's my neurochemistry,
link |
02:12:14.740
whatever the hell that is, whatever,
link |
02:12:17.260
maybe probably is rooted to the fact
link |
02:12:18.740
that I grew up in the Soviet Union
link |
02:12:20.060
and it was such a constrained system
link |
02:12:22.020
that you'd really deeply value freedom
link |
02:12:23.940
and you always want to escape the man
link |
02:12:27.420
and the control of centralized systems.
link |
02:12:29.060
I don't know what it is, but at the same time,
link |
02:12:32.380
I love strictness.
link |
02:12:33.820
I love the dogmatic authoritarianism of diet,
link |
02:12:38.620
of like the same habit, exactly the habit you have.
link |
02:12:41.340
I think that's actually when bodies perform optimally,
link |
02:12:43.740
my body performs optimally.
link |
02:12:45.540
So balancing those two, I think if I have the data,
link |
02:12:48.860
every once in a while, party with some wild people,
link |
02:12:51.620
but most of the time eat once a day,
link |
02:12:54.580
perhaps in the morning, I'm gonna try that.
link |
02:12:56.220
That might be very interesting,
link |
02:12:57.900
but I'd rather not try it.
link |
02:12:59.980
I'd rather have the data that tells me to do it.
link |
02:13:03.100
But in general, you're able to, eating once a day,
link |
02:13:07.180
think deeply about stuff like this.
link |
02:13:09.940
The concern that people have is, does your energy wane,
link |
02:13:13.860
all those kinds of things.
link |
02:13:15.140
Do you find that it's, especially because it's unique,
link |
02:13:19.220
it's vegan as well.
link |
02:13:21.020
So you find that you're able to have a clear mind,
link |
02:13:23.860
a focus, and just physically and mentally throughout?
link |
02:13:27.740
Yeah, and I find like my personal experience
link |
02:13:29.820
in thinking about hard things is,
link |
02:13:36.020
like oftentimes I feel like I'm looking through a telescope
link |
02:13:40.180
and like I'm aligning two or three telescopes.
link |
02:13:42.260
And you kind of have to close one eye
link |
02:13:44.100
and move it back and forth a little bit
link |
02:13:46.180
and just find just the right alignment.
link |
02:13:47.940
Then you find just a sneak peek
link |
02:13:49.820
at the thing you're trying to find, but it's fleeting.
link |
02:13:51.660
If you move just one little bit, it's gone.
link |
02:13:54.780
And oftentimes what I feel like are the ideas
link |
02:13:58.820
I value the most are like that.
link |
02:14:00.620
They're so fragile and fleeting and slippery and elusive.
link |
02:14:06.700
And it requires a sensitivity to thinking
link |
02:14:13.860
and a sensitivity to maneuver through these things.
link |
02:14:16.500
If I concede to a world where I'm on my phone texting,
link |
02:14:21.500
I'm also on social media.
link |
02:14:22.340
I'm also doing 15 things at the same time
link |
02:14:25.340
because I'm running a company
link |
02:14:26.380
and I'm also feeling terrible from the last night.
link |
02:14:30.620
It all just comes crashing down.
link |
02:14:31.860
And the quality of my thoughts goes to zero.
link |
02:14:35.860
I'm functional enough to respond to basic-level things,
link |
02:14:40.100
but I don't feel like I am doing anything interesting.
link |
02:14:44.940
I think that's a good word, sensitivity,
link |
02:14:46.940
because that's the word that's used the most.
link |
02:14:50.420
That's what thinking deeply feels like
link |
02:14:53.860
is you're sensitive to the fragile thoughts.
link |
02:14:55.820
And you're right.
link |
02:14:56.660
All those other distractions kind of dull
link |
02:14:59.100
your ability to be sensitive to the fragile thoughts.
link |
02:15:02.980
It's a really good word.
link |
02:15:05.540
Out of all the things you've done,
link |
02:15:08.140
you've also climbed Mount Kilimanjaro.
link |
02:15:13.100
Is this true?
link |
02:15:14.120
It's true.
link |
02:15:14.960
Why and how, and what do you take from that experience?
link |
02:15:25.680
I guess the backstory is relevant
link |
02:15:26.880
because in that moment, it was the darkest time in my life.
link |
02:15:32.440
I was ending a 13 year marriage.
link |
02:15:34.280
I was leaving my religion.
link |
02:15:35.320
I sold Braintree and I was battling depression
link |
02:15:38.480
where I was just at the end.
link |
02:15:40.880
And I got invited to go to Tanzania
link |
02:15:45.800
as part of a group that was raising money
link |
02:15:47.120
to build clean water wells.
link |
02:15:49.400
And I had made some money from Braintree,
link |
02:15:51.560
and so I was able to donate $25,000.
link |
02:15:54.480
And it was the first time I had ever had money to donate
link |
02:15:59.280
outside of paying tithing in my religion.
link |
02:16:01.720
It was such a phenomenal experience
link |
02:16:04.400
to contribute something meaningful to someone else
link |
02:16:09.400
in that form.
link |
02:16:11.280
And as part of this process,
link |
02:16:12.720
we were gonna climb the mountain.
link |
02:16:13.800
And so we went there and we saw the clean water wells
link |
02:16:16.600
we were building.
link |
02:16:17.440
We spoke to the people there and it was very energizing.
link |
02:16:20.520
And then we climbed Kilimanjaro
link |
02:16:21.920
and I came down with a stomach flu on day three.
link |
02:16:29.240
And I also had altitude sickness,
link |
02:16:30.720
but I became so sick that on day four,
link |
02:16:33.080
or maybe on day five,
link |
02:16:34.760
I came into the camp, base camp at 15,000 feet,
link |
02:16:39.480
just going to the bathroom on myself
link |
02:16:42.840
and falling all over.
link |
02:16:44.520
I was just a disaster, I was so sick.
link |
02:16:47.160
So stomach flu and altitude sickness.
link |
02:16:49.920
Yeah, and I just was destroyed from the situation.
link |
02:16:57.120
Plus, it was psychologically one of the lowest points.
link |
02:17:00.880
Yeah, and I think that was probably a big contributor.
link |
02:17:03.360
I was just smoked as a human, just absolutely done.
link |
02:17:06.600
And I had three young children.
link |
02:17:07.760
And so I was trying to reconcile,
link |
02:17:09.680
this is not a, whether I live or not
link |
02:17:12.960
is not my decision by itself.
link |
02:17:14.920
I'm now intertwined with these three little people
link |
02:17:19.600
and I have an obligation whether I like it or not,
link |
02:17:22.960
I need to be there.
link |
02:17:24.480
And so it did, it felt like I was just stuck
link |
02:17:26.440
in a straitjacket.
link |
02:17:27.680
And I had to decide whether I was going to summit
link |
02:17:32.240
the next day with the team.
link |
02:17:35.280
And it was a difficult decision
link |
02:17:36.600
because once you start hiking,
link |
02:17:38.200
there's no way to get off the mountain.
link |
02:17:40.800
And then midnight came and our guide came in
link |
02:17:44.000
and he said, where are you at?
link |
02:17:44.960
And I said, I think I'm okay, I think I can try.
link |
02:17:48.200
And so we went.
link |
02:17:49.760
And so, from midnight, I made it to the summit at 5 a.m.
link |
02:17:56.840
It was one of the most transformational moments
link |
02:17:59.640
of my existence.
link |
02:18:00.960
And the mountain became my problem.
link |
02:18:06.560
It became everything that I was struggling with.
link |
02:18:09.920
And when I started hiking, it was,
link |
02:18:12.560
the pain got so ferocious that it was kind of like this.
link |
02:18:19.080
It became so ferocious that I turned my music to Eminem
link |
02:18:25.040
and it was, Eminem was the,
link |
02:18:26.920
he was the only person in existence that spoke to my soul.
link |
02:18:31.160
And it was something about his anger
link |
02:18:33.680
and his vibrancy, and eventually,
link |
02:18:38.040
he was the only person who I could turn on
link |
02:18:39.960
and I could just say, I feel some relief.
link |
02:18:42.760
I turned on Eminem and I made it to the summit
link |
02:18:46.760
after five hours, but just 100 yards from the top.
link |
02:18:51.000
I was with my guide Ike and I started getting very dizzy
link |
02:18:54.440
and I felt like I was gonna fall backwards
link |
02:18:56.960
off this cliff area we were on.
link |
02:18:58.520
I was like, this is dangerous.
link |
02:19:00.440
And he said, look, Brian, I know where you're at.
link |
02:19:04.600
I know where you're at.
link |
02:19:06.200
And I can tell you, you've got it in you.
link |
02:19:08.400
So I want you to look up, take a step, take a breath
link |
02:19:14.360
and look up, take a breath and take a step.
link |
02:19:16.520
And I did and I made it.
link |
02:19:19.240
And so I got there and I just sat down with him at the top.
link |
02:19:22.720
I just cried like a baby.
link |
02:19:24.200
Broke down.
link |
02:19:25.040
Yeah, I just lost it.
link |
02:19:26.720
And so he'd let me do my thing.
link |
02:19:29.520
And then we pulled out the pulse oximeter
link |
02:19:31.920
and he measured my blood oxygen levels
link |
02:19:33.240
and it was like 50 something percent
link |
02:19:36.080
and it was danger zone.
link |
02:19:36.960
So he looked at it and I think he was like really alarmed
link |
02:19:39.560
that I was in this situation.
link |
02:19:41.160
And so he said, we can't get a helicopter here
link |
02:19:44.840
and we can't get you emergency evacuated.
link |
02:19:46.160
You've gotta go down.
link |
02:19:47.000
You've gotta hike down to 15,000 feet to get to base camp.
link |
02:19:49.720
And so we went back down the mountain.
link |
02:19:53.080
I got back down to base camp.
link |
02:19:55.000
And again, that was pretty difficult.
link |
02:19:57.560
And then they put me on a stretcher,
link |
02:19:59.160
this metal stretcher with this one wheel
link |
02:20:01.040
and a team of six people wheeled me down the mountain.
link |
02:20:04.680
And it was pretty torturous.
link |
02:20:06.440
I'm very appreciative they did.
link |
02:20:07.600
Also the trail was very bumpy.
link |
02:20:09.080
So they'd go over the big rocks.
link |
02:20:10.240
And so my head would just slam
link |
02:20:11.880
against this metal thing for hours.
link |
02:20:14.880
And so I just felt awful.
link |
02:20:15.960
Plus I'd get my head slammed every couple of seconds.
link |
02:20:18.880
So the whole experience was really a life changing moment.
link |
02:20:22.080
And that was the demarcation of me
link |
02:20:25.040
basically building a new life.
link |
02:20:26.680
Basically I said, I'm going to reconstruct Brian,
link |
02:20:31.440
my understanding of reality, my existential realities,
link |
02:20:35.000
what I want to go after.
link |
02:20:36.760
And I try, I mean, as much as that's possible as a human,
link |
02:20:39.640
but that's when I set out to rebuild everything.
link |
02:20:44.360
Was it the struggle of that?
link |
02:20:46.040
I mean, there's also just like the romantic poetic,
link |
02:20:50.840
it's a fricking mountain.
link |
02:20:54.000
There's a man in pain, psychological and physical
link |
02:20:58.040
struggling up a mountain.
link |
02:20:59.840
But it's just struggle, just in the face of,
link |
02:21:05.600
just pushing through in the face of hardship or nature too.
link |
02:21:09.800
Something much bigger than you.
link |
02:21:12.480
Is that, was that the thing that just clicked?
link |
02:21:14.880
For me, it felt like I was just locked in with reality
link |
02:21:18.000
and it was a death match.
link |
02:21:21.280
It was in that moment, one of us is going to die.
link |
02:21:24.040
So you were pondering death, like not surviving.
link |
02:21:27.480
Yep.
link |
02:21:28.320
And it was, and that was the moment.
link |
02:21:29.880
And it was, the summit to me was,
link |
02:21:33.720
I'm going to come out on top and I can do this.
link |
02:21:35.920
And giving in was, it's like, I'm just done.
link |
02:21:40.640
And so it did, I locked in and that's why,
link |
02:21:43.400
yeah, mountains are magical to me.
link |
02:21:49.200
I didn't expect that.
link |
02:21:50.320
I didn't design that.
link |
02:21:51.240
I didn't know that was going to be the case.
link |
02:21:52.960
It would not have been something
link |
02:21:55.000
I would have anticipated.
link |
02:21:58.120
But you were not the same man afterwards.
link |
02:22:01.600
Yeah.
link |
02:22:02.840
Is there advice you can give to young people today
link |
02:22:06.320
that look at your story,
link |
02:22:07.400
that's successful in many dimensions,
link |
02:22:10.600
advice you can give to them about how to be successful
link |
02:22:13.240
in their career, successful in life,
link |
02:22:16.160
whatever path they choose?
link |
02:22:18.880
Yes, I would say, listen to advice
link |
02:22:22.560
and see it for what it is, a mirror of that person,
link |
02:22:28.840
and then map and know that your future
link |
02:22:31.920
is going to be in zeroth-principle land.
link |
02:22:34.760
And so what you're hearing today is a representation
link |
02:22:38.080
of what may have been the right principles
link |
02:22:39.800
to build upon previously,
link |
02:22:41.360
but they're likely depreciating very fast.
link |
02:22:45.680
And so I am a strong proponent
link |
02:22:49.120
that people ask for advice, but they don't take advice.
link |
02:22:57.280
So how do you take advice properly?
link |
02:23:00.920
It's in the careful examination of the advice.
link |
02:23:03.320
It's actually the person makes a statement
link |
02:23:06.600
about a given thing somebody should follow.
link |
02:23:09.120
The value is not doing that.
link |
02:23:10.640
The value is understanding the assumption stack they built,
link |
02:23:13.480
the assumption and knowledge stack they built
link |
02:23:15.320
around that body of knowledge.
link |
02:23:18.080
That's the value.
link |
02:23:18.920
It's not doing what they say.
link |
02:23:20.800
Considering the advice, but digging deeper
link |
02:23:25.080
to understand the assumption stack,
link |
02:23:27.920
like the full person,
link |
02:23:29.560
I mean, this is deep empathy, essentially,
link |
02:23:32.400
to understand the journey of the person
link |
02:23:34.200
that arrived at the advice.
link |
02:23:36.320
And the advice is just the tip of the iceberg
link |
02:23:39.040
that ultimately is not the thing that gives you the value.
link |
02:23:41.080
It could be the right thing to do.
link |
02:23:43.560
It could be the complete wrong thing to do
link |
02:23:45.080
depending on the assumption stack.
link |
02:23:47.000
So you need to investigate the whole thing.
link |
02:23:49.520
Is there some, are there been people in your startup
link |
02:23:54.080
and your business journey that have served that role
link |
02:23:59.360
of advice giver that's been helpful?
link |
02:24:02.640
Or do you feel like your journey felt like a lonely path?
link |
02:24:07.640
Or was it one that was, of course,
link |
02:24:11.520
we're all born and die alone.
link |
02:24:15.240
But do you fundamentally remember the experiences,
link |
02:24:21.360
one where you leaned on people
link |
02:24:23.960
at a particular moment in time that changed everything?
link |
02:24:26.920
Yeah, the most significant moments of my memory,
link |
02:24:30.840
for example, like on Kilimanjaro,
link |
02:24:33.040
when Ike, some person I'd never met in Tanzania,
link |
02:24:38.160
was able to, in that moment, apparently see my soul
link |
02:24:41.800
when I was in this death match with reality.
link |
02:24:45.000
And he gave me the instructions, look up, step.
link |
02:24:48.680
And so there's magical people in my life
link |
02:24:54.880
that have done things like that.
link |
02:24:57.640
And I suspect they probably don't know.
link |
02:25:00.760
I probably should be better at identifying those things.
link |
02:25:04.480
And, but yeah, hopefully the,
link |
02:25:16.000
I suppose like the wisdom I would aspire to
link |
02:25:18.960
is to have the awareness and the empathy
link |
02:25:22.920
to be that for other people.
link |
02:25:24.920
And not a retail advertiser of advice,
link |
02:25:33.640
of tricks for life, but deeply meaningful
link |
02:25:39.000
and empathetic, in a one-on-one context
link |
02:25:41.800
with people, where it really can make a difference.
link |
02:25:45.560
Yeah, I actually kind of experience that.
link |
02:25:47.400
I think about that sometimes.
link |
02:25:49.040
You have like an 18-year-old kid come up to you.
link |
02:25:51.560
And it's not always obvious,
link |
02:25:56.080
it's not always easy to really listen to them.
link |
02:26:00.120
Like not the facts, but like see who that person is.
link |
02:26:04.600
I think people say that about being a parent
link |
02:26:07.040
is you want to consider that,
link |
02:26:11.960
you don't want to be the authority figure
link |
02:26:14.200
in the sense that you really want to consider
link |
02:26:16.040
that there's a special unique human being there
link |
02:26:18.720
with a unique brain that may be brilliant
link |
02:26:23.520
in ways that you are not understanding
link |
02:26:25.520
that you'll never be and really try to hear that.
link |
02:26:29.360
So when giving advice, there's something to that.
link |
02:26:31.640
It's that both sides should be deeply empathetic
link |
02:26:34.000
about the assumption stack.
link |
02:26:36.880
I love that terminology.
link |
02:26:38.800
What do you think is the meaning of this whole thing of life?
link |
02:26:43.400
Why the hell are we here, Brian Johnson?
link |
02:26:46.080
We've been talking about brains and studying brains
link |
02:26:48.320
and you had this very eloquent way of describing life
link |
02:26:51.440
on Earth as an optimization problem
link |
02:26:55.400
of the cost of intelligence going to zero.
link |
02:26:59.400
At first through the evolutionary process
link |
02:27:01.760
and then eventually through building,
link |
02:27:05.280
through our technology,
link |
02:27:06.720
building more and more intelligent systems.
link |
02:27:09.280
Do you ever ask yourself why it's doing that?
link |
02:27:13.600
Yeah, I think the answer to this question,
link |
02:27:17.440
again, the information value is more in the mirror
link |
02:27:21.640
it provides of that person,
link |
02:27:24.360
which is a representation of the technological,
link |
02:27:26.960
social, political context of the time.
link |
02:27:30.120
So if you ask this question a hundred years ago,
link |
02:27:32.240
you would get a certain answer
link |
02:27:33.280
that reflects that time period.
link |
02:27:35.040
Same thing would be true of a thousand years ago.
link |
02:27:36.960
It's rare, it's difficult for a person to pull themselves
link |
02:27:40.920
out of their contextual awareness
link |
02:27:43.080
and offer a truly original response.
link |
02:27:45.240
And so knowing that I am contextually influenced
link |
02:27:48.520
by the situation, that I am a mirror for our reality,
link |
02:27:52.400
I would say that in this moment,
link |
02:27:55.160
I think the real game going on
link |
02:28:01.280
is that evolution built a system
link |
02:28:07.920
of scaffolding intelligence that produced us.
link |
02:28:11.560
We are now building intelligence systems
link |
02:28:13.880
that are scaffolding higher dimensional intelligence,
link |
02:28:19.040
that's developing more robust systems of intelligence.
link |
02:28:28.040
And in that process, with the cost going to zero,
link |
02:28:32.680
then the meaning of life becomes goal alignment,
link |
02:28:37.680
which is the negotiation of our conscious
link |
02:28:41.520
and unconscious existence.
link |
02:28:43.680
And then I'd say the third thing is,
link |
02:28:45.040
if we're thinking that we wanna be explorers
link |
02:28:48.800
is our technological progress is getting to a point
link |
02:28:52.920
where we could aspirationally say,
link |
02:28:57.240
we want to figure out what is really going on,
link |
02:29:01.240
really going on, because does any of this really make sense?
link |
02:29:06.600
Now we may be a hundred, 200, 500, a thousand years away
link |
02:29:09.200
from being able to poke our way out of whatever is going on.
link |
02:29:16.000
But it's interesting that we could even state an aspiration
link |
02:29:20.120
to say, we wanna poke at this question.
link |
02:29:22.960
But I'd say in this moment of time,
link |
02:29:26.000
the meaning of life is that we can build a future state
link |
02:29:31.880
of existence that is more fantastic
link |
02:29:35.200
than anything we could ever imagine.
link |
02:29:37.120
The striving for something more amazing.
link |
02:29:43.640
And that defies expectations that we would consider
link |
02:29:50.920
bewildering, and all those things.
link |
02:29:55.720
and I guess the last thing,
link |
02:29:57.280
if there's multiple meanings of life,
link |
02:29:59.120
it would be infinite games.
link |
02:30:00.480
James Carse wrote the book,
link |
02:30:01.960
Finite and Infinite Games.
link |
02:30:04.120
The only game to play right now is to keep playing the game.
link |
02:30:09.640
And so this goes back to the algorithm
link |
02:30:11.400
of the Lex algorithm of diet soda and brisket
link |
02:30:14.640
and pursuing the passion.
link |
02:30:16.200
What I'm suggesting is there's a moment here
link |
02:30:19.320
where we can contemplate playing infinite games.
link |
02:30:22.560
Therefore, it may make sense to err on the side
link |
02:30:25.840
of making sure one is in a situation
link |
02:30:27.720
to be playing infinite games if that opportunity arises.
link |
02:30:31.240
So the landscape of possibility is changing very, very fast
link |
02:30:35.320
and therefore our old algorithms
link |
02:30:37.000
of how we might assess risk
link |
02:30:38.440
and what things we might pursue
link |
02:30:39.280
and why, those assumptions may fall away very quickly.
link |
02:30:44.120
Well, I think I speak for a lot of people
link |
02:30:46.120
when I say that the game you, Mr. Brian Johnson,
link |
02:30:50.520
have been playing is quite incredible.
link |
02:30:53.200
Thank you so much for talking to me.
link |
02:30:54.280
Thanks, Lex.
link |
02:30:56.160
Thanks for listening to this conversation
link |
02:30:57.680
with Brian Johnson and thank you
link |
02:30:59.480
to Four Sigmatic, NetSuite, Grammarly, and ExpressVPN.
link |
02:31:05.040
Check them out in the description to support this podcast.
link |
02:31:08.360
And now let me leave you with some words
link |
02:31:10.120
from Diane Ackerman.
link |
02:31:12.000
Our brain is a crowded chemistry lab,
link |
02:31:15.120
bustling with nonstop neural conversations.
link |
02:31:18.760
Thank you for listening and hope to see you next time.