
Bryan Johnson: Kernel Brain-Computer Interfaces | Lex Fridman Podcast #186



link |
00:00:00.000
The following is a conversation with Bryan Johnson, founder of Kernel, a company that
link |
00:00:04.920
has developed devices that can monitor and record brain activity.
link |
00:00:09.160
And previously, he was the founder of Braintree, a mobile payment company that acquired Venmo
link |
00:00:14.760
and then was acquired by PayPal and eBay.
link |
00:00:18.040
Quick mention of our sponsors, Four Sigmatic, NetSuite, Grammarly, and ExpressVPN.
link |
00:00:24.440
Check them out in the description to support this podcast.
link |
00:00:27.200
As a side note, let me say that this was a fun and memorable experience, wearing the
link |
00:00:32.520
Kernel Flow brain interface in the beginning of this conversation, as you can see if you
link |
00:00:36.800
watched the video version of this episode.
link |
00:00:39.840
And there's a Ubuntu Linux machine sitting next to me collecting the data from my brain.
link |
00:00:44.920
The whole thing gave me hope that the mystery of the human mind will be unlocked in the
link |
00:00:50.280
coming decades, as we begin to measure signals from the brain in a high bandwidth way.
link |
00:00:55.480
To understand the mind, we either have to build it or to measure it.
link |
00:01:00.360
Both are worth a try.
link |
00:01:02.360
Thanks to Bryan and the rest of the Kernel team for making this little demo happen.
link |
00:01:06.440
This is the Lex Fridman Podcast, and here is my conversation with Bryan Johnson.
link |
00:01:13.320
You ready, Lex?
link |
00:01:14.320
Yes, I'm ready.
link |
00:01:15.320
Do you guys want to come in and put the interfaces on our heads?
link |
00:01:18.960
And then I will proceed to tell you a few jokes.
link |
00:01:22.120
So we have two incredible pieces of technology and a machine running Ubuntu 20.04 in front
link |
00:01:29.760
of us.
link |
00:01:30.760
What are we doing?
link |
00:01:31.760
All right.
link |
00:01:32.760
Are these going on our heads?
link |
00:01:33.760
They're going on our heads, yeah.
link |
00:01:34.760
And they will place it on our heads for proper alignment.
link |
00:01:42.040
Does this support giant heads?
link |
00:01:43.360
Because I kind of have a giant head.
link |
00:01:45.360
Is this just giant heads?
link |
00:01:46.920
As like an ego or are you saying physically both?
link |
00:01:52.680
It's a nice massage.
link |
00:01:58.400
Yes.
link |
00:02:00.400
Okay, how does this feel?
link |
00:02:03.720
It's okay to move around?
link |
00:02:05.800
It feels, oh yeah.
link |
00:02:07.200
This feels awesome.
link |
00:02:08.440
It's a pretty good fit.
link |
00:02:13.000
Thank you.
link |
00:02:14.000
That feels good.
link |
00:02:15.760
So this is big head friendly.
link |
00:02:17.200
It suits you well, Lex.
link |
00:02:19.280
Thank you.
link |
00:02:20.280
I feel like I need to, I feel like when I wear this, I need to sound like Sam Harris.
link |
00:02:27.320
Calm, collected, eloquent.
link |
00:02:31.800
I feel smarter actually.
link |
00:02:34.120
I don't think I've ever felt quite as much like I'm part of the future as now.
link |
00:02:38.880
Have you ever worn a brain interface or had your brain imaged?
link |
00:02:42.440
Oh, never had my brain imaged.
link |
00:02:45.600
The only way I've analyzed my brain is by talking to myself and thinking.
link |
00:02:52.400
No direct data.
link |
00:02:53.400
Yeah.
link |
00:02:54.400
That is definitely a brain interface that has a lot of blind spots.
link |
00:02:58.960
It has some blind spots, yeah.
link |
00:03:01.520
Psychotherapy.
link |
00:03:02.520
That's right.
link |
00:03:03.520
All right.
link |
00:03:04.520
Are we recording?
link |
00:03:05.520
Yeah.
link |
00:03:06.520
We're good.
link |
00:03:07.520
All right.
link |
00:03:08.520
So, Lex, the objective of this, I'm going to tell you some jokes and your objective
link |
00:03:17.760
is to not smile, which as a Russian, you should have an edge.
link |
00:03:23.200
Make the motherland proud.
link |
00:03:24.480
I got you.
link |
00:03:25.480
Okay.
link |
00:03:26.480
Let's hear the jokes.
link |
00:03:29.440
Lex, and this is from the Kernel crew.
link |
00:03:34.080
We've been working on a device that can read your mind and we would love to see your thoughts.
link |
00:03:38.480
Is that the joke?
link |
00:03:44.120
That's the opening.
link |
00:03:45.120
Okay.
link |
00:03:50.120
If I'm seeing the muscle activation correctly on your lips, you're not going to do well
link |
00:03:54.440
on this.
link |
00:03:55.440
Let's see.
link |
00:03:56.440
All right.
link |
00:03:57.440
Here comes the first one.
link |
00:03:58.440
I'm screwed.
link |
00:03:59.440
Here comes the first one.
link |
00:04:00.440
Is this going to break the device?
link |
00:04:01.440
Is it resilient to laughter?
link |
00:04:05.480
Lex, what goes through a potato's brain?
link |
00:04:14.280
I already failed.
link |
00:04:15.960
That's the hilarious opener.
link |
00:04:17.640
Okay.
link |
00:04:18.640
What?
link |
00:04:19.640
Tater thoughts.
link |
00:04:24.200
What kind of fish performs brain surgery?
link |
00:04:27.720
I don't know.
link |
00:04:29.000
A neurosturgeon.
link |
00:04:33.520
Okay.
link |
00:04:35.720
And so we're getting data of everything that's happening in my brain right now?
link |
00:04:39.280
Lifetime.
link |
00:04:40.280
Yeah.
link |
00:04:41.280
We're getting activation patterns of your entire cortex.
link |
00:04:44.120
I'm going to try to do better.
link |
00:04:46.240
I'll edit out all the parts where I laughed and, in Photoshop,
link |
00:04:49.160
put a serious face over me.
link |
00:04:50.720
You can recover.
link |
00:04:51.720
Yeah.
link |
00:04:52.720
All right.
link |
00:04:53.720
Lex, what do scholars eat when they're hungry?
link |
00:04:56.960
I don't know what.
link |
00:04:58.960
Academia nuts.
link |
00:05:03.720
That's pretty good.
link |
00:05:05.960
So what we'll do is, so you're wearing Kernel Flow, which is an interface built using technology
link |
00:05:16.440
called spectroscopy.
link |
00:05:17.800
So it's similar to the wearables we wear on the wrist that use light.
link |
00:05:22.400
So using light as you know.
link |
00:05:25.040
And we're using that light to do functional imaging of brain activity.
link |
00:05:30.440
And so as your neurons fire, electrically and chemically, it changes blood oxygenation
link |
00:05:37.080
levels.
link |
00:05:38.080
We're measuring that.
link |
00:05:39.080
And so in the reconstructions we do for you, you'll see your activation
link |
00:05:42.040
patterns on your brain throughout this entire time we are wearing it.
link |
00:05:45.920
So your reaction to the jokes, and as we were sitting here talking. And so we're
link |
00:05:52.240
moving towards a real time feed of your cortical brain activity.
link |
00:05:57.040
So there's a bunch of things that are in contact with my skull right now.
link |
00:06:02.400
How many of them are there?
link |
00:06:03.840
And what are they?
link |
00:06:06.040
What are the actual sensors?
link |
00:06:07.280
There's 52 modules and each module has one laser and six sensors.
link |
00:06:14.080
And the lasers fire in about 100 picoseconds.
link |
00:06:18.720
And then the photons scatter and absorb in your brain and then a few go in, a few come
link |
00:06:23.440
back out, a bunch go in, then a few come back out and we sense those photons and then we
link |
00:06:28.320
do the reconstruction for the activity.
link |
00:06:30.720
Overall there's about a thousand plus channels that are sampling your activity.
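[Illustrative aside, not part of the conversation: a minimal back-of-the-envelope sketch in Python of how a channel count on the order of a thousand can arise from the module layout Bryan describes (52 modules, one laser source and six detectors each). The cross-module pairings and two-wavelength assumption are placeholders for illustration, not Kernel specifications.]

```python
# Back-of-the-envelope channel count for a modular optical brain interface.
# Assumptions (illustrative only): every source-detector pair within a module
# counts as a channel, each module also pairs with a few neighboring modules,
# and each source emits at two wavelengths.

MODULES = 52            # modules on the headset (from the conversation)
SOURCES_PER_MODULE = 1
DETECTORS_PER_MODULE = 6
NEIGHBOR_MODULES = 2    # assumed cross-module pairings per module
WAVELENGTHS = 2         # assumed wavelengths per laser source

within_module = MODULES * SOURCES_PER_MODULE * DETECTORS_PER_MODULE
cross_module = MODULES * SOURCES_PER_MODULE * NEIGHBOR_MODULES * DETECTORS_PER_MODULE

total_channels = (within_module + cross_module) * WAVELENGTHS
print(f"within-module pairs: {within_module}")
print(f"cross-module pairs (assumed): {cross_module}")
print(f"total channels under these assumptions: {total_channels}")
```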
link |
00:06:35.800
How difficult is it to make it as comfortable as it is?
link |
00:06:38.840
Because it's surprisingly comfortable.
link |
00:06:40.680
I would not think it would be comfortable.
link |
00:06:44.600
Yeah, it's measuring brain activity, I would not think it would be comfortable, but it
link |
00:06:50.200
is.
link |
00:06:51.200
I agree.
link |
00:06:52.200
In fact, I want to take this home.
link |
00:06:53.200
Yeah.
link |
00:06:54.200
Yeah, that's right.
link |
00:06:55.200
So people are accustomed to being in big systems like fMRI, where there's 120-decibel
link |
00:07:01.720
sound and you're in a claustrophobic encasement, or EEG, which is just painful, or surgery.
link |
00:07:09.360
And so yes, I agree that this is a convenient option to be able to just put on your head
link |
00:07:14.360
and measure your brain activity in the contextual environment you choose.
link |
00:07:18.040
So if we want to have it during a podcast, or if we want to be at home or in a business
link |
00:07:22.000
setting, it's the freedom to be aware of and record your brain activity in the setting that you
link |
00:07:28.160
choose.
link |
00:07:29.160
Yeah, but sort of from an engineering perspective, are these, what is it, there's a bunch of
link |
00:07:33.520
different modular parts and there's like a rubber band thing where they mold to the
link |
00:07:39.680
shape of your head.
link |
00:07:40.960
That's right.
link |
00:07:41.960
So we built this, this version of the mechanical design to accommodate most adult heads.
link |
00:07:48.520
But I have a giant head and it fits fine.
link |
00:07:51.800
It fits well actually.
link |
00:07:53.920
So I don't think I have an average head.
link |
00:07:57.240
Okay, maybe I feel much better about my head now.
link |
00:08:01.840
Maybe I'm more average than I thought.
link |
00:08:05.680
Okay, so what else is there, interesting you could say while it's on our heads.
link |
00:08:10.800
I can keep this on the whole time.
link |
00:08:12.120
This is kind of awesome.
link |
00:08:13.600
And it's amazing for me as a fan of Ubuntu, I use Ubuntu Mate, you guys should use that
link |
00:08:18.240
too.
link |
00:08:19.240
But it's amazing to have code running to the side measuring stuff and collecting data.
link |
00:08:26.160
I mean, I just feel much more important now that my data is being recorded.
link |
00:08:32.600
Like somebody cares. Like, you know, when you have a good friend that listens to you, that
link |
00:08:37.000
actually like listens, like actually is listening to you, this is what I feel like, like a much
link |
00:08:42.360
better friend because it's like accurately listening to me, Ubuntu.
link |
00:08:47.320
What a cool perspective.
link |
00:08:49.080
I hadn't thought about that of feeling understood, heard deeply by the mechanical system that
link |
00:08:59.360
is recording your brain activity versus the human that you're engaging with, that your
link |
00:09:04.640
mind immediately goes to that there's this dimensionality in depth of understanding of
link |
00:09:11.640
this software system, which you're intimately familiar with.
link |
00:09:14.720
And now you're able to communicate with this system in ways that you couldn't before.
link |
00:09:19.160
Yeah, I feel heard.
link |
00:09:22.320
Yeah, I mean, I guess what's interesting about this is your intuitions are spot on.
link |
00:09:28.920
Most people have intuitions about brain interfaces. They've grown up with this idea
link |
00:09:32.800
of people moving cursors on the screen or typing or changing the channel or skipping
link |
00:09:38.120
a song.
link |
00:09:39.120
It's primarily been anchored on control.
link |
00:09:42.120
And I think the more relevant understanding of brain interfaces or neural imaging is that
link |
00:09:47.720
it's a measurement system.
link |
00:09:50.080
And once you have numbers for a given thing, a seemingly endless number of possibilities
link |
00:09:54.840
emerge around that of what to do with those numbers.
link |
00:09:57.880
So before you tell me about the possibilities, this was an incredible experience.
link |
00:10:01.480
I can keep this on for another two hours, but I'm being told that for a bunch of reasons,
link |
00:10:09.120
just because we probably want to keep the data small and visualize it nicely for the
link |
00:10:12.960
final product, we want to cut this off and take this amazing helmet away from
link |
00:10:17.840
me.
link |
00:10:18.840
So, Bryan, thank you so much for this experience.
link |
00:10:21.320
And let's continue helmetless.
link |
00:10:25.200
All right.
link |
00:10:26.200
So that was an incredible experience.
link |
00:10:29.000
Can you maybe speak to what kind of opportunities that opens up that stream of data, that rich
link |
00:10:33.320
stream of data from the brain?
link |
00:10:35.160
First, I'm curious, what is your reaction?
link |
00:10:39.120
What comes to mind when you put that on your head?
link |
00:10:41.640
What does it mean to you?
link |
00:10:42.920
And what possibilities emerge?
link |
00:10:44.320
And what significance might it have?
link |
00:10:46.720
I'm curious where your orientation is at.
link |
00:10:50.160
Well, for me, I'm really excited by the possibility of various information about my body, about
link |
00:10:59.800
my mind being converted into data such that data can be used to create products that make
link |
00:11:06.560
my life better.
link |
00:11:07.560
So that to me is really exciting possibility.
link |
00:11:10.000
Even just like a Fitbit that measures, I don't know, some very basic measurements about
link |
00:11:14.920
your body is really cool.
link |
00:11:17.240
But it's the bandwidth of information, the resolution of that information is very crude.
link |
00:11:22.640
So it's not very interesting.
link |
00:11:24.000
The possibility of recording, of just building a data set coming in a clean way and a high
link |
00:11:31.560
bandwidth way from my brain opens up all kinds of possibilities. You know, I was kind of joking
link |
00:11:40.000
when we were talking, but it's not really a joke. Like, I feel heard in the sense that it feels
link |
00:11:47.040
like the full richness of the information coming from my mind is actually being recorded
link |
00:11:54.120
by the machine.
link |
00:11:56.000
I mean, I can't quite put it into words, but there is genuinely,
link |
00:12:01.760
for me, not some kind of joke about me being a robot, it just genuinely feels like
link |
00:12:06.360
I'm being heard in a way that's going to improve my life, as long as the thing that's
link |
00:12:15.160
on the other end can do something useful with that data.
link |
00:12:17.800
But even the recording itself is like, once you record, it's like taking a picture.
link |
00:12:25.560
That moment is forever saved in time.
link |
00:12:28.520
Now, a picture cannot allow you to step back into that world, but perhaps recording your
link |
00:12:37.800
brain is a much higher resolution thing, much more personal recording of that information
link |
00:12:44.640
than a picture that would allow you to step back into that, where you were in that particular
link |
00:12:51.360
moment in history, and then map out a certain trajectory to tell you certain things
link |
00:12:57.760
about yourself that could open up all kinds of applications.
link |
00:13:00.400
Of course, there's health that I consider, but honestly, to me, the exciting thing is
link |
00:13:05.040
just being heard.
link |
00:13:06.640
My state of mind, the level of focus, all those kinds of things being heard.
link |
00:13:11.000
What I heard you say is you have an entirety of lived experience, some of which you can
link |
00:13:16.400
communicate in words and in body language, some of which you feel internally, which cannot
link |
00:13:21.880
be captured in those communication modalities and that this measurement system captures
link |
00:13:27.560
both the things you can try to articulate in words, maybe in a lower dimensional space
link |
00:13:31.440
using one word, for example, to communicate focus when it really may be represented in
link |
00:13:35.640
a 20 dimensional space of this particular kind of focus and that this information is
link |
00:13:41.320
being captured.
link |
00:13:42.320
It's a closer representation to the entirety of your experience captured in a dynamic fashion
link |
00:13:48.400
that is not just a static image of your conscious experience.
link |
00:13:53.280
Yeah, that's the promise, that's the hope, that was the feeling and it felt like the
link |
00:13:58.920
future.
link |
00:13:59.920
So it was a pretty cool experience.
link |
00:14:01.120
And from the sort of mechanical perspective, it was cool to have an actual device that
link |
00:14:07.080
feels pretty good, that doesn't require me to go into the lab.
link |
00:14:11.840
And also the other thing I was feeling, there's a guy named Andrew Huberman, he's a friend
link |
00:14:16.600
of mine, amazing podcast, people should listen to the Huberman Lab podcast.
link |
00:14:21.800
We're working on a paper together about eye movement and so on.
link |
00:14:26.720
And we're kind of, he's a neuroscientist and I'm a data person, I'm a machine learning
link |
00:14:29.800
person and we're both excited by how much the data measurements of the human mind, the
link |
00:14:43.000
brain and all the different metrics that come from that can be used to understand human
link |
00:14:47.960
beings in a rigorous scientific way.
link |
00:14:50.600
So the other thing I was thinking about is how this could be turned into a tool for science.
link |
00:14:56.120
Sort of not just personal science, not just like Fitbit style, like how am I doing my
link |
00:15:02.560
personal metrics of health, but doing larger scale studies of human behavior and so on.
link |
00:15:07.840
So like data, not at the scale of an individual, but data at a scale of many individuals, large
link |
00:15:12.800
number of individuals.
link |
00:15:14.800
So it's personal being heard was exciting and also just for science is exciting.
link |
00:15:19.280
Cause it's very easy, like there's a very powerful thing to it being so easy to just
link |
00:15:24.400
put on that you can scale much easier.
link |
00:15:28.440
If you think about that second thing you said about the science of the brain: we've
link |
00:15:38.960
done a pretty good job, like we, the human race has done a pretty good job, figuring
link |
00:15:43.560
out how to quantify the things around us from distant stars to calories and steps and our
link |
00:15:51.440
genome.
link |
00:15:52.440
So we can measure and quantify pretty much everything in the known universe except for
link |
00:15:58.360
our minds.
link |
00:16:00.800
And we can do these one offs if we're going to get an fMRI scan or do something with the
link |
00:16:06.560
low res EEG system, but we haven't done this at population scale.
link |
00:16:11.880
And so if you think about it, human thought or human cognition is probably the single
link |
00:16:19.560
largest raw input material into society at any given moment.
link |
00:16:24.880
It's our conversations with our, with ourselves and with other people.
link |
00:16:28.160
And we have this raw input that we haven't been able to measure yet.
link |
00:16:35.400
And when I think about it through that frame, it's remarkable. It's almost
link |
00:16:42.680
like we live in this wild, wild West of unquantified communications within ourselves and between
link |
00:16:50.440
each other, when everything else has been grounded.
link |
00:16:53.480
For example, I know if I buy an appliance at the store or on a website, I don't
link |
00:16:58.760
need to look at the measurements on the appliance to make sure it can fit through my door.
link |
00:17:03.200
It's an engineered system of appliance manufacturing and construction.
link |
00:17:07.400
Everyone's agreed upon engineering standards.
link |
00:17:10.760
And we don't have engineering standards around cognition.
link |
00:17:15.440
It has not entered as a formal engineering discipline that enables us to
link |
00:17:20.480
scaffold in society with everything else we're doing, including consuming news, our relationships,
link |
00:17:26.440
politics, economics, education, all the above.
link |
00:17:29.640
And so to me, the most significant contribution that Kernel technology has to offer would
link |
00:17:36.600
be the introduction of the formal engineering of cognition as it relates to
link |
00:17:43.280
everything else in society.
link |
00:17:45.240
I love that idea that you kind of think that there is just this ocean of data that's coming
link |
00:17:52.080
from people's brains and being, in a crude way, reduced down to like tweets and texts and
link |
00:17:58.000
so on.
link |
00:17:59.000
So it's a very hardcore, many-scale compression of the actual raw data.
link |
00:18:06.240
But maybe you can comment because you're using the word cognition.
link |
00:18:11.720
I think the first step is to get the brain data.
link |
00:18:15.400
But is there a leap to be taken to sort of interpret that data in terms of cognition?
link |
00:18:22.200
So is your idea basically that you need to start collecting data at scale from
link |
00:18:26.960
the brain and then we start to really be able to take little steps along the path to actually
link |
00:18:34.760
measuring some deep sense of cognition? Because, as I'm sure you know,
link |
00:18:41.600
we understand a few things, but we don't understand most of what makes up cognition.
link |
00:18:47.360
This has been one of the most significant challenges of building Kernel, and Kernel wouldn't exist
link |
00:18:52.320
if I wasn't able to fund it initially myself, because when I engage in conversations
link |
00:18:57.720
with investors, the immediate thought is what is the killer app?
link |
00:19:02.940
And of course, I understand that heuristic, that's what they're looking at is they're
link |
00:19:06.000
looking to de risk.
link |
00:19:08.160
Is the product solved?
link |
00:19:09.720
Is there a customer base?
link |
00:19:10.800
Are people willing to pay for it?
link |
00:19:12.000
How does it compare to competing options?
link |
00:19:14.440
And in the case with brain interfaces, when I started the company, there was no known
link |
00:19:19.080
path to even build a technology that could potentially become mainstream.
link |
00:19:24.640
And then once we figured out the technology, we could commence having conversations
link |
00:19:28.800
with investors and it became what is the killer app?
link |
00:19:31.560
And so I funded the first $53 million for the company.
link |
00:19:36.680
And to raise the round of funding, the first one we did, I spoke to 228 investors.
link |
00:19:42.920
One said yes.
link |
00:19:44.920
It was remarkable and it was mostly around this concept around what is a killer app?
link |
00:19:49.720
And so internally, the way we think about it is we think of the go-to-market strategy
link |
00:19:55.200
much more like the Drake equation, where if we can build technology that has the characteristics
link |
00:20:01.960
of: the data quality is high enough, it meets certain thresholds of cost, accessibility,
link |
00:20:09.760
comfort, it can be worn in contextual environments, it meets the criteria of being a mass market
link |
00:20:15.880
device, then the responsibility that we have is to figure out how to create the algorithm
link |
00:20:25.320
that enables humans to then find value with it.
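[Illustrative aside, not part of the conversation: a toy Python sketch of the Drake-equation-style framing Bryan describes, where expected value discovery is modeled as a product of factors. The factor names and numbers are invented placeholders, not Kernel's actual model.]

```python
# Toy Drake-style estimate of expected value discoveries from a device ecosystem.
# All numbers below are invented placeholders, purely to illustrate the framing:
# expected discoveries = product of ecosystem and data-quality factors.

factors = {
    "devices_deployed": 10_000,          # devices in the field
    "fraction_actively_used": 0.5,       # worn regularly in real contexts
    "experiments_per_device_per_year": 4,
    "fraction_with_usable_data": 0.8,    # meets data-quality thresholds
    "fraction_yielding_insight": 0.01,   # experiments that surface something valuable
}

expected_discoveries = 1.0
for name, value in factors.items():
    expected_discoveries *= value

print(f"Expected valuable discoveries per year (toy numbers): {expected_discoveries:,.0f}")
```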
link |
00:20:32.520
So the analogy is brain interfaces are like the early 90s of the Internet. You
link |
00:20:37.760
want to populate an ecosystem with a certain number of devices, you want a certain number
link |
00:20:41.160
of people who play around with them who do experiments of certain data collection parameters,
link |
00:20:44.520
you want to encourage certain mistakes from experts and non experts.
link |
00:20:48.120
These are all critical elements that ignite discovery.
link |
00:20:52.000
And so we believe we've accomplished the first objective of building technology that reaches
link |
00:20:59.800
those thresholds.
link |
00:21:01.960
And now it's the Drake equation component of how do we try to generate 20 years of value
link |
00:21:10.840
discovery in a two or three year time period, how do we compress that?
link |
00:21:14.400
So just to clarify, so when you mean the Drake equation, which for people who don't
link |
00:21:18.880
know, and I don't know why you wouldn't if you listen to this, since I bring up aliens every single conversation.
link |
00:21:22.640
So I don't know how you wouldn't know what the Drake equation is, but you mean like the
link |
00:21:26.960
killer app, it would be like one alien civilization in that equation, meaning like, this is in search
link |
00:21:33.800
of an application that's impactful, transformative. By the way, we need to come
link |
00:21:37.920
up with a better term than killer app, as it's also violent, right?
link |
00:21:43.880
You could go with like viral app, but that's horrible too, right? Like some very inspiringly impactful application.
link |
00:21:51.600
How about that?
link |
00:21:52.600
No.
link |
00:21:53.600
Yeah.
link |
00:21:54.600
Okay, so let's stick with killer app.
link |
00:21:55.600
That's fine.
link |
00:21:56.600
Okay.
link |
00:21:57.600
So what do you do?
link |
00:21:58.600
I dislike the chosen words in capturing the concept.
link |
00:22:02.240
You know, it's one of those sticky things that is as effective to use in the tech world,
link |
00:22:08.080
but when you now become a communicator outside of the tech world, especially when
link |
00:22:13.200
you're talking about software and hardware and artificial intelligence applications,
link |
00:22:17.400
it sounds horrible.
link |
00:22:18.400
Yeah, no, it's interesting.
link |
00:22:19.400
I actually regret now having called attention to, I regret having used that word in this
link |
00:22:23.280
conversation because it's something I would not normally do, I used it in order to create
link |
00:22:28.480
a bridge of shared understanding of how others would, what terminology others would use.
link |
00:22:32.960
Yeah.
link |
00:22:33.960
But yeah, I concur.
link |
00:22:35.120
Let's go with impactful application or value creation, value creation, something people
link |
00:22:42.000
love using.
link |
00:22:43.000
There we go.
link |
00:22:44.000
That's it.
link |
00:22:45.000
Love app.
link |
00:22:46.000
Okay.
link |
00:22:47.000
So, do you have any ideas?
link |
00:22:49.440
So basically creating a framework where there's the possibility of a discovery of an application
link |
00:22:56.080
that people love using. Do you have ideas?
link |
00:22:59.760
We've begun to play a fun game internally where when we have these discussions, we begin circling
link |
00:23:05.080
around this concept of does anybody have an idea?
link |
00:23:10.240
Does anyone have intuitions?
link |
00:23:11.640
And if we see the conversation starting to veer in that direction, we flag it and say
link |
00:23:17.120
human intuition alert, stop it.
link |
00:23:20.560
And so we really want to focus on the algorithm of there's a natural process of human discovery
link |
00:23:27.720
that when you populate a system with devices and you give people the opportunity to play
link |
00:23:32.400
around with it in expected and unexpected ways, we are thinking that is a much better
link |
00:23:38.280
system of discovery than us exercising intuitions.
link |
00:23:41.320
And it's interesting, we're also seeing a few neuroscientists who have been talking
link |
00:23:44.960
to us where I was speaking to one young associate professor and I approached the conversation
link |
00:23:50.640
and said, hey, we have these five data streams that we're pulling off.
link |
00:23:56.360
When you hear that, what weighted value do you add to each data source?
link |
00:23:59.600
Which one do you think is going to be valuable for your objectives and which one's not?
link |
00:24:03.240
And he said, I don't care, just give me the data.
link |
00:24:05.840
All I care about is my machine learning model.
link |
00:24:08.160
But importantly, he did not have a theory of mind.
link |
00:24:10.880
He did not come to the table and say, I think the brain operates in this way and these
link |
00:24:15.280
regions have these functions.
link |
00:24:17.120
He didn't care.
link |
00:24:18.120
He just wanted the data.
link |
00:24:19.120
And we're seeing that more and more that certain people are devaluing human intuitions for
link |
00:24:26.600
good reasons as we've seen in machine learning over the past couple of years.
link |
00:24:30.720
And we're doing the same in our value creation market strategy.
link |
00:24:36.080
So collect more data, clean data, make the products such that the collection of data
link |
00:24:42.800
is easy and fun, and then the rest will just spring to life through humans playing around
link |
00:24:52.440
with it.
link |
00:24:53.440
Our objective is to create the most valuable data collection system of the brain ever.
link |
00:25:03.520
And with that, then apply all the best tools of machine learning and other techniques to
link |
00:25:12.720
extract out, you know, to try to find insight.
link |
00:25:15.520
But yes, our objective is really to systematize the discovery process because we can't put
link |
00:25:21.960
definite timeframes on discovery.
link |
00:25:24.080
The brain is complicated and science is not a business strategy.
link |
00:25:30.200
And so we really need to figure out how to, this is the difficulty of bringing technology
link |
00:25:36.160
like this to market.
link |
00:25:37.160
And it's why most of the time it just languishes in academia for quite some time.
link |
00:25:43.040
But we hope that we will cross over and make this mainstream in the coming years.
link |
00:25:49.040
The thing was cool to wear.
link |
00:25:50.920
But are you chasing a good reason for millions of people to put this on their head and keep
link |
00:25:58.600
it on their head regularly?
link |
00:26:00.960
Is there like who's going to discover that reason?
link |
00:26:04.800
Is it going to be people just kind of organically, or is there going to be an Angry Birds style application
link |
00:26:12.480
that's just too exciting to not use?
link |
00:26:18.200
If I think through the things that have changed my life most significantly over the past few
link |
00:26:23.000
years when I started wearing a wearable on my wrist that would give me data about my
link |
00:26:28.520
heart rate, heart rate variability, respiration rate, metabolic approximations, et cetera.
link |
00:26:37.360
For the first time in my life, I had access to information, sleep patterns, that were highly
link |
00:26:42.800
impactful.
link |
00:26:43.800
They told me, for example, if I eat close to bedtime, I'm not going to get deep sleep.
link |
00:26:51.120
And not getting deep sleep means you have all these follow on consequences in life.
link |
00:26:54.520
And so it opened up this window of understanding of myself that I cannot self introspect and
link |
00:27:02.320
deduce these things.
link |
00:27:03.320
This is information that was available to be acquired, but it just wasn't.
link |
00:27:07.000
I would have to get an expensive sleep study, and then it's only one night, and that's not good
link |
00:27:10.960
enough to run all my trials.
link |
00:27:12.560
And so if you look just at the information that one can acquire on their wrist, and now
link |
00:27:18.440
you apply that to the entire cortex of the brain, and you say, what kind of information
link |
00:27:23.480
could we acquire?
link |
00:27:25.160
It opens up a whole new universe of possibilities.
link |
00:27:28.440
For example, we did this internal study at Kernel where I wore a prototype device and
link |
00:27:32.760
we were measuring the cognitive effects of sleep.
link |
00:27:36.320
So I had a device measuring my sleep.
link |
00:27:38.400
I performed with 13 of my coworkers.
link |
00:27:41.840
We performed four cognitive tasks over 13 sessions.
link |
00:27:45.880
And we focused on reaction time, impulse control, short-term memory, and then resting state
link |
00:27:52.160
tasks.
link |
00:27:53.160
And with mine, we found, for example, that my impulse control was independently correlated
link |
00:28:00.400
with my sleep outside of behavioral measures of my ability to play the game.
link |
00:28:04.160
The point of the study was that the brain study I did at Kernel confirmed my life experience,
link |
00:28:12.080
that my deep sleep determined whether or not I would be able to resist temptation the
link |
00:28:19.960
following day, and my brain data did show that, as one example.
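[Illustrative aside, not part of the conversation: a minimal Python sketch of the kind of analysis described here, correlating a nightly deep-sleep measure with a next-day impulse-control score across 13 sessions. The data, variable names, and the use of a simple Pearson correlation are assumptions for illustration, not the actual Kernel study code or data.]

```python
# Toy example: correlate nightly deep sleep with next-day impulse-control scores.
# All numbers are fabricated placeholders to illustrate the analysis, not study data.
import numpy as np

deep_sleep_minutes = np.array([95, 40, 110, 60, 85, 30, 120, 70, 100, 55, 90, 45, 105])
impulse_control = np.array([0.82, 0.55, 0.90, 0.63, 0.78, 0.48, 0.93, 0.66, 0.85, 0.60, 0.80, 0.52, 0.88])

# Pearson correlation between the sleep metric and the behavioral score.
r = np.corrcoef(deep_sleep_minutes, impulse_control)[0, 1]
print(f"Correlation between deep sleep and impulse control (toy data): r = {r:.2f}")
```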
link |
00:28:24.440
And so if you start thinking, if you actually have data on yourself on your entire cortex
link |
00:28:28.960
and you can control the settings, I think there's probably a large number of things
link |
00:28:36.760
that we could discover about ourselves, very, very small and very, very big.
link |
00:28:39.520
And just for example, like when you read news, what's going on?
link |
00:28:44.440
Like when you use social media, when you use news, like all the ways we allocate attention
link |
00:28:50.680
with the computer.
link |
00:28:51.680
That's right.
link |
00:28:52.680
I mean, that seems like a compelling place where you would want to put on Kernel.
link |
00:28:57.600
By the way, what is it called?
link |
00:28:59.320
Kernel flux?
link |
00:29:00.320
Kernel?
link |
00:29:01.320
Like what?
link |
00:29:02.320
Flow.
link |
00:29:03.320
We have two technologies.
link |
00:29:04.320
You wore flow.
link |
00:29:05.320
Flow.
link |
00:29:06.320
Okay.
link |
00:29:07.320
If you look at Kernel Flow, it seems like a compelling time and place to do it
link |
00:29:16.240
is when you're behind a desk, behind a computer, because you could probably wear it for prolonged
link |
00:29:20.120
periods of time as you're taking in content.
link |
00:29:23.800
And there could be a lot of, because so much of our lives happens in the digital world
link |
00:29:29.200
now, that kind of coupling the information about the human mind with the consumption
link |
00:29:36.520
and the behaviors in the digital world might give us a lot of information about the effects
link |
00:29:42.040
of the way we behave and navigate the digital world to the actual physical meat space effects
link |
00:29:49.360
on our body.
link |
00:29:50.360
It's interesting to think of this in terms of both, like, for work, I'm a big fan of Cal
link |
00:29:57.000
Newport, his ideas of deep work. With few exceptions, I try to spend the first
link |
00:30:06.000
two hours of every day, usually, if I'm at home and have nothing on my schedule, it's
link |
00:30:11.640
going to be up to eight hours of deep work of focus, zero distraction.
link |
00:30:17.000
And for me to analyze that, I mean, I'm very aware of the waning of that, the ups and
link |
00:30:23.360
downs of that.
link |
00:30:25.000
And it's almost like you, you're surfing the ups and downs of that as you're doing programming,
link |
00:30:29.480
as you're doing thinking about particular problems, you're trying to visualize things
link |
00:30:33.640
in your mind, you're just trying to stitch them together.
link |
00:30:37.360
You're trying to, when there's a dead end about an idea, you have to kind of calmly
link |
00:30:43.520
like walk back and start again, all those kinds of processes, it'd be interesting to
link |
00:30:48.160
get data on what my mind is actually doing.
link |
00:30:51.200
And also, I recently started doing this, I just talked to Sam Harris a few days ago and had been building
link |
00:30:57.480
up to that.
link |
00:30:58.480
I started meditating using his app, Waking Up. Very much recommend it.
link |
00:31:04.760
And it'd be interesting to get data on that, because it's like you're
link |
00:31:10.000
removing all the noise from your head, and it's very much an active process of active
link |
00:31:16.240
noise removal, active noise canceling like the headphones.
link |
00:31:21.520
And it'd be interesting to see what is going on in the mind before the meditation, during
link |
00:31:27.360
it and after all those kinds of things.
link |
00:31:29.320
And all of your examples, it's interesting that everyone who's designed an experience
link |
00:31:34.720
for you, so whether it be the meditation app or the deep work or all the things you mentioned,
link |
00:31:40.520
they constructed this product with a certain number of knowns.
link |
00:31:45.840
Now, what if we expand the number of knowns by 10x or 20x or 30x? They would reconstruct
link |
00:31:53.040
their product to incorporate those knowns.
link |
00:31:55.520
And so this is the dimensionality that I think is the promising aspect, is that
link |
00:32:01.200
people will be able to use this quantification, use this information to build more effective
link |
00:32:08.200
products.
link |
00:32:09.440
And this is, I'm not talking about better products to advertise to you or manipulate
link |
00:32:13.840
you.
link |
00:32:14.840
I'm talking about our focus is helping people, individuals have this contextual awareness
link |
00:32:20.280
and this quantification and then to engage with others who are seeking to improve people's
link |
00:32:25.440
lives, and the objective is betterment across ourselves individually and also with
link |
00:32:32.920
each other.
link |
00:32:33.920
Yeah.
link |
00:32:34.920
So it's a nice data stream to have if you're building an app, like if you're building a
link |
00:32:36.840
podcast listening app, it would be nice to know data about the listener so that like
link |
00:32:40.920
if you're bored or you fell asleep, maybe pause the podcast.
link |
00:32:43.960
Yeah.
link |
00:32:44.960
It's like really dumb, just very simple applications that could just improve the quality of the
link |
00:32:49.880
experience of using the app.
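[Illustrative aside, not part of the conversation: a toy Python sketch of the simple application Lex describes, pausing playback when an attention signal from a brain interface drops. The attention_signal() function and Player class are hypothetical placeholders, not a real Kernel or podcast-app API.]

```python
# Toy sketch: pause a podcast when a (hypothetical) attention signal drops.
import time

ATTENTION_THRESHOLD = 0.3  # below this, assume the listener is bored or asleep

def attention_signal() -> float:
    """Hypothetical placeholder for a 0-1 attention estimate from a brain interface."""
    return 0.8  # stubbed value for illustration

class Player:
    """Minimal stand-in for a podcast player."""
    def __init__(self):
        self.playing = True

    def pause(self):
        self.playing = False
        print("Attention dropped; pausing the episode.")

def monitor(player: Player, poll_seconds: float = 5.0, max_polls: int = 3):
    # Bounded loop so this sketch terminates; a real app would poll continuously.
    for _ in range(max_polls):
        if player.playing and attention_signal() < ATTENTION_THRESHOLD:
            player.pause()
        time.sleep(poll_seconds)

monitor(Player(), poll_seconds=0.1)
```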
link |
00:32:52.080
Kind of imagine if you have your neurome, this is Lex, and there's a statistical representation
link |
00:32:59.400
of you and you engage with the app and it says, Lex, you're best to engage with this
link |
00:33:05.440
meditation exercise in the following settings at this time of day after eating this kind
link |
00:33:13.720
of food, or not eating, fasting, with this level of blood glucose and this kind of night's sleep.
link |
00:33:19.800
And all these data combined to give you this contextually relevant experience just like
link |
00:33:26.840
we do with our sleep.
link |
00:33:28.040
You've optimized your entire life based upon what information you can acquire and know
link |
00:33:32.240
about yourself.
link |
00:33:33.240
And so the question is, how much do we really know of the things going on around us?
link |
00:33:38.560
And I would venture to guess, in my life experience, my self-awareness captures an extremely
link |
00:33:44.360
small percent of the things that actually influence my conscious and unconscious experience.
link |
00:33:50.320
Well, in some sense, the data would help encourage you to be more self aware, not just because
link |
00:33:56.400
you trust everything the data is saying, but it'll give you a prod to start investigating.
link |
00:34:04.880
Like I'd love to get a rating, like a ranking of all the things I do.
link |
00:34:12.200
That's probably important to do even without the data, but the data will certainly help. It's
link |
00:34:16.280
like rank all the things you do in life and which ones make you feel shitty, which ones
link |
00:34:22.200
make you feel good.
link |
00:34:23.720
Like you were talking about evenings, Bryan, like this is a good example:
link |
00:34:30.240
I do pig out at night as well.
link |
00:34:33.680
And it never makes me feel good.
link |
00:34:35.800
Like you're in a safe space.
link |
00:34:37.800
This is a safe space.
link |
00:34:39.520
Let's hear it.
link |
00:34:40.520
You know, I definitely have much less self control at night.
link |
00:34:43.640
It's interesting.
link |
00:34:44.640
And the same, you know, people might criticize this, but I know my own body.
link |
00:34:49.960
I know when I eat carnivore, just eat meat, I feel much better than if I eat more carbs.
link |
00:35:00.160
The more carbs I eat, the worse I feel.
link |
00:35:02.280
I don't know why that is.
link |
00:35:04.280
There is science supporting it, but I'm not leaning on the science.
link |
00:35:06.600
I'm leaning on personal experience.
link |
00:35:07.760
And that's really important.
link |
00:35:09.240
I don't need to read, I'm not going to go on a whole rant about nutrition science,
link |
00:35:13.800
but many of those studies are very flawed.
link |
00:35:17.480
They're doing their best, but nutrition science is a very difficult field of study because
link |
00:35:22.600
humans are so different.
link |
00:35:24.480
And the mind has so much impact on the way your body behaves.
link |
00:35:28.560
And it's so difficult from a scientific perspective to conduct really strong studies that you
link |
00:35:32.960
have to be almost like a scientist of one, you have to do these studies on yourself.
link |
00:35:39.160
That's the best way to understand what works for you and not.
link |
00:35:41.840
And I don't understand why, because it sounds unhealthy, but eating only meat always makes
link |
00:35:46.200
me feel good.
link |
00:35:47.680
Just eat meat.
link |
00:35:48.920
That's it.
link |
00:35:50.000
And I don't have any allergies, any of that kind of stuff.
link |
00:35:52.840
I'm not full-on like Jordan Peterson, where, like, if he deviates a little bit, he
link |
00:35:58.000
goes off, like if he deviates a little bit from the carnivore diet, he goes off the cliff.
link |
00:36:03.000
No, I can have like chocolate, I can go off the diet, I feel fine.
link |
00:36:07.840
It's a gradual worsening of how I feel.
link |
00:36:15.000
But when I eat only meat, I feel great.
link |
00:36:17.480
And it'd be nice to be reminded of that.
link |
00:36:19.360
Like it's a very simple fact that I feel good when I eat carnivore.
link |
00:36:24.320
And I think that repeats itself in all kinds of experiences.
link |
00:36:27.640
Like I feel really good when I exercise. Now, I hate exercise, okay.
link |
00:36:37.440
But in the rest of the day, the impact it has on my mind, on the clarity
link |
00:36:43.080
of mind, on the experiences and the happiness and all those kinds of things, I feel really
link |
00:36:47.560
good.
link |
00:36:48.560
And to be able to concretely express that through data would be, would be nice.
link |
00:36:53.600
It would be a nice reminder, almost like a statement, like remember what feels good
link |
00:36:57.680
and whatnot.
link |
00:36:58.680
And there could be many things, like you're suggesting,
link |
00:37:04.560
that I'm not aware of, that might be sitting right in front of me, that
link |
00:37:09.720
make me feel really good and make me feel not good.
link |
00:37:12.440
And the data would show that.
link |
00:37:13.920
I agree with you.
link |
00:37:14.920
I've actually employed the same strategy.
link |
00:37:17.320
I, I fired my mind entirely from being responsible for constructing my diet.
link |
00:37:23.000
And so I started doing a program where I now track over 200 biomarkers every 90 days.
link |
00:37:29.040
And it captures of course the things you would expect like cholesterol, but also DNA methylation
link |
00:37:34.280
and all kinds of things about my body, all the processes that make up me.
link |
00:37:39.520
And then I let that data generate the shopping list.
link |
00:37:43.360
And so I never actually asked my mind what it wants.
link |
00:37:45.760
It's entirely what my body is reporting that it wants.
link |
00:37:48.400
And so I call this goal alignment within Bryan.
link |
00:37:51.880
And there's 200 plus actors that I'm currently asking their opinion of.
link |
00:37:55.320
And so I'm asking my liver, how are you doing?
link |
00:37:57.960
And it's expressing via the biomarkers.
link |
00:37:59.920
And so I construct that diet and I only eat those foods until my next testing round.
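[Illustrative aside, not part of the conversation: a toy Python sketch of the idea of letting biomarker data, rather than the conscious mind, generate the shopping list. The biomarkers, target ranges, and food tags are invented placeholders; this is not Bryan's actual program or data.]

```python
# Toy sketch: let biomarker readings, not cravings, pick the shopping list.
# Every biomarker outside its target range "votes" for foods tagged as helping it.
# All values and tags below are invented placeholders for illustration.

biomarkers = {                       # name: (measured, target_low, target_high)
    "ldl_cholesterol": (140, 0, 100),
    "vitamin_d": (22, 30, 80),
    "crp_inflammation": (3.5, 0, 1.0),
}

food_tags = {                        # food: biomarkers it is assumed to help
    "salmon": {"vitamin_d", "crp_inflammation"},
    "oats": {"ldl_cholesterol"},
    "spinach": {"crp_inflammation"},
    "ice cream": set(),              # nothing votes for this one
}

out_of_range = {name for name, (value, low, high) in biomarkers.items()
                if not (low <= value <= high)}

shopping_list = sorted(
    (food for food, helps in food_tags.items() if helps & out_of_range),
    key=lambda food: -len(food_tags[food] & out_of_range),
)
print("Out-of-range markers:", sorted(out_of_range))
print("Shopping list:", shopping_list)
```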
link |
00:38:06.480
And that has changed my life more than I think anything else.
link |
00:38:11.320
Because my conscious mind, which I gave primacy to my entire life,
link |
00:38:18.160
led me astray. Because, like you're saying, the mind then goes out into the world and
link |
00:38:22.560
it navigates the dozens of different dietary regimens people put together in books.
link |
00:38:29.120
And they all have their supporting science in certain contextual settings, but
link |
00:38:34.160
it's not n of 1.
link |
00:38:35.800
And like you're saying, this dietary thing really is an n of 1.
link |
00:38:38.760
What people have published scientifically, of course, can be used for nice groundings,
link |
00:38:46.600
but it changes when you get to an n-of-1 level.
link |
00:38:48.720
And so that's what gets me excited about brain interfaces: if I could do the same thing
link |
00:38:53.600
for my brain where I can stop asking my conscious mind for its advice or for its decision making,
link |
00:38:59.080
which is flawed, and I'd rather just look at this data. And I've never had better
link |
00:39:04.680
health markers in my life than when I stopped actually asking myself to be in charge of
link |
00:39:08.640
it.
link |
00:39:09.640
And the idea of demotion of the conscious mind is such a sort of engineering
link |
00:39:18.200
way of phrasing what meditation is, that's what we're doing there, right?
link |
00:39:22.720
Yeah.
link |
00:39:23.720
That's beautiful.
link |
00:39:24.720
I mean, that's really beautifully put. By the way, a testing round, what does that look
link |
00:39:28.280
like?
link |
00:39:29.280
What's that?
link |
00:39:30.280
Well, you mentioned, yeah, the tests I do.
link |
00:39:33.400
Yes.
link |
00:39:34.400
So it includes a complete blood panel, I do a microbiome test.
link |
00:39:39.000
I do a food inflammation, a diet-induced inflammation test.
link |
00:39:43.680
So I look for cytokine expressions.
link |
00:39:45.520
So foods that produce inflammatory reactions.
link |
00:39:48.320
I look at my neuroendocrine systems.
link |
00:39:50.120
I look at all my neurotransmitters, and there's several micronutrient
link |
00:39:57.160
tests to see how I'm doing on the various nutrients.
link |
00:39:59.560
What about, like, self-report of how you feel? You know, you can't
link |
00:40:08.240
fully demote your conscious mind, you still exist within your conscious mind, right?
link |
00:40:12.600
So that lived experience is of a lot of value.
link |
00:40:16.240
So how do you measure that?
link |
00:40:17.480
I do a temporal sampling over some duration of time.
link |
00:40:20.960
So I'll think through how I feel over a week, over a month, over three months.
link |
00:40:25.600
I don't do a temporal sampling of if I'm at the grocery store in front of a cereal box
link |
00:40:29.640
and be like, you know what, Cap'n Crunch is probably the right thing for me today.
link |
00:40:33.280
Cause I'm feeling like I need a little fun in my life.
link |
00:40:36.160
Yeah.
link |
00:40:37.160
And so it's a temporal sampling.
link |
00:40:38.160
If the data set is large enough, then I smooth out the function of my natural oscillations
link |
00:40:42.120
of how I feel about life where some days I may feel upset or depressed or down or whatever.
link |
00:40:47.480
And I don't want those moments to then rule my decision making.
link |
00:40:50.600
That's why the demotion happens.
link |
00:40:52.680
And it says, really, if you're looking at health over a 90-day period of time, all my 200 voices
link |
00:40:58.720
speak up on the interval and they're all given voice to say, this is how I'm doing and this
link |
00:41:03.360
is what I want.
link |
00:41:04.360
And so it really is an accounting system for everybody.
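[Illustrative aside, not part of the conversation: a minimal Python sketch of the temporal-sampling idea described here, smoothing noisy day-to-day self-report samples with a moving average so that a single bad day doesn't drive decisions. The data and window size are invented placeholders.]

```python
# Toy sketch: smooth daily self-report samples so single bad days don't dominate.
# Scores are invented placeholders on a 0-10 "how I feel about life" scale.
import numpy as np

daily_scores = np.array([7, 8, 6, 3, 7, 8, 7, 2, 6, 7, 8, 7, 5, 7])  # two weeks of samples

window = 7  # smooth over a week
smoothed = np.convolve(daily_scores, np.ones(window) / window, mode="valid")

print("Raw daily scores:", daily_scores.tolist())
print("Weekly moving average:", [round(x, 1) for x in smoothed])
```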
link |
00:41:07.200
So that's why I think that if you think about the future of being human, there's two things
link |
00:41:15.280
I think that are really going on.
link |
00:41:18.160
One is the design, manufacturing and distribution of intelligence is heading towards a zero kind
link |
00:41:24.920
of cost curve over a certain timeframe. You know, evolution produced
link |
00:41:30.800
us, an intelligent form of intelligence.
link |
00:41:35.720
We are now designing our own intelligence systems and the design, manufacturing, distribution
link |
00:41:40.080
of that intelligence, over a certain timeframe, is going to go to a cost of zero. Design, manufacture,
link |
00:41:46.240
distribution of intelligence: the cost is going to zero.
link |
00:41:49.720
For example, just give me a second.
link |
00:41:52.640
That's brilliant.
link |
00:41:53.640
Okay.
link |
00:41:54.640
And evolution is doing the design, manufacturing, distribution of intelligence and now we are
link |
00:42:00.480
doing the design, manufacturing, distribution of intelligence and the cost of that is going
link |
00:42:05.720
to zero.
link |
00:42:06.720
That's a very nice way of looking at life on earth.
link |
00:42:10.840
So if that's going on, then in parallel to that, you say, okay,
link |
00:42:17.760
what then happens when that cost curve is heading to zero? Our existence becomes
link |
00:42:26.480
a goal alignment problem, a goal alignment function.
link |
00:42:31.560
And so the same thing I'm doing where I'm doing goal alignment within myself of these
link |
00:42:35.160
200 biomarkers, where I'm saying, when Bryan exists on a database and this entity
link |
00:42:42.720
is deciding what to eat and what to do and et cetera, it's not just my conscious mind
link |
00:42:46.920
which is opining.
link |
00:42:47.920
It's 200 biological processes and there's a whole bunch of more voices involved.
link |
00:42:52.160
So in that equation, we're going to increasingly automate the things that we spend high energy
link |
00:43:03.520
on today because it's easier and now we're going to then negotiate the terms and conditions
link |
00:43:10.040
of intelligent life.
link |
00:43:11.200
Now we say conscious existence because we're biased because that's what we have, but it
link |
00:43:15.200
will be the largest computational exercise in history because you're now doing goal alignment
link |
00:43:20.200
with planet earth, within yourself, with each other, within all the intelligent agents
link |
00:43:26.280
we're building bots and other voice assistants.
link |
00:43:29.440
You basically have to have trillions and trillions of agents working on the negotiation
link |
00:43:34.640
of goal alignment.
link |
00:43:35.640
Yeah.
link |
00:43:36.640
This, this is in fact true.
link |
00:43:39.200
And what was the second thing?
link |
00:43:40.600
That was it.
link |
00:43:41.600
So the cost of the design, manufacturing, distribution of intelligence going to zero, which then
link |
00:43:46.160
means what's really going on?
link |
00:43:48.680
What are we really doing?
link |
00:43:50.360
We're negotiating the terms and conditions of existence.
link |
00:43:55.160
Do you worry about the survival of this process, that life as we know it on earth comes to
link |
00:44:04.200
an end or at least intelligent life, that as the cost goes to zero, something happens
link |
00:44:11.520
where all of that intelligence is thrown in the trash by something like nuclear war or
link |
00:44:17.640
development of AGI systems that are very dumb, not AGI I guess, but AI that's just the paperclip
link |
00:44:24.320
thing en masse, dumb but with unintended consequences to where it destroys human civilization.
link |
00:44:31.160
Do you worry about those kinds of things?
link |
00:44:32.320
I mean, it's unsurprising that a new thing comes into the sphere of human consciousness.
link |
00:44:41.440
Humans identify the foreign object in this case, artificial intelligence.
link |
00:44:45.600
Our amygdala fires up and says, scary, foreign, we should be apprehensive about this.
link |
00:44:53.480
And so it makes sense from a biological perspective that humans, the knee jerk reaction is fear.
link |
00:45:02.200
What I don't think has been properly weighted with that is that we are the first generation
link |
00:45:12.400
of intelligent beings on this earth that has been able to look out over their expected
link |
00:45:18.120
lifetime and see there is a real possibility of evolving into entirely novel forms of consciousness.
link |
00:45:28.080
So different that it would be totally unrecognizable to us today.
link |
00:45:33.120
We don't have words for it.
link |
00:45:34.120
We can't hint at it.
link |
00:45:35.120
We can't point at it.
link |
00:45:36.120
We can't, you can't look in the sky and see that thing that is shining.
link |
00:45:39.520
We're going to go up there.
link |
00:45:41.480
You cannot even create an aspirational statement about it and instead we've had this knee jerk
link |
00:45:50.840
reaction of fear about everything that could go wrong.
link |
00:45:53.880
But in my estimation, this should be the defining aspiration of all intelligent life on earth
link |
00:46:03.360
that we would aspire that basically every generation surveys the landscape of possibilities
link |
00:46:08.800
that are afforded given the technological, cultural and other contextual situation that
link |
00:46:13.240
they're in.
link |
00:46:14.960
We're in this context.
link |
00:46:16.680
We haven't yet identified this and said, this is unbelievable.
link |
00:46:22.040
We should carefully think this thing through, not just of mitigating the things that wipe
link |
00:46:26.880
us out.
link |
00:46:27.880
Like we have this potential and so we just haven't given voice to it, even though it's
link |
00:46:32.280
within this realm of possibilities.
link |
00:46:34.000
Also, you're excited about the possibility of superintelligence systems and the opportunities
link |
00:46:38.760
they bring.
link |
00:46:39.760
I mean, there's parallels to this.
link |
00:46:41.880
You think about people before the internet as the internet was coming to life.
link |
00:46:46.520
I mean, there's kind of a fog through which you can't see.
link |
00:46:51.160
What does the future look like?
link |
00:46:54.840
Predicting collective intelligence, which I don't think we understand that we're
link |
00:46:58.080
living through now, is that we've in some sense stopped being individual
link |
00:47:04.680
intelligences and become much more like collective intelligences because ideas travel much, much
link |
00:47:12.360
faster now and they can in a viral way sweep across the populations and so it almost feels
link |
00:47:22.080
like a thought is had by many people now, thousands or millions of people as opposed
link |
00:47:28.240
to an individual person and that's changed everything.
link |
00:47:30.920
But to me, I think we're realizing how much that actually changed people or societies,
link |
00:47:36.280
but to predict that before the internet would have been very difficult and in that same
link |
00:47:41.880
way we're sitting here with the fog before us thinking, what is superintelligence systems?
link |
00:47:49.720
How is that going to change the world?
link |
00:47:51.960
What is increasing the bandwidth like plugging our brains into this whole thing?
link |
00:47:59.480
How is that going to change the world?
link |
00:48:01.480
And it seems like it's a fog, you don't know. And whatever comes
link |
00:48:10.320
to be could destroy the world, we could be the last generation, but it also could transform
link |
00:48:17.160
in ways that creates an incredibly fulfilling life experience that's unlike anything we've
link |
00:48:26.040
ever experienced.
link |
00:48:27.440
It might involve the dissolution of ego and consciousness and so on, you're no longer one individual,
link |
00:48:32.720
it might be more, that might be a certain kind of death, an ego death, but the experience
link |
00:48:39.640
might be really exciting and enriching.
link |
00:48:42.400
Maybe we'll live in a virtual, like it's funny to think about a bunch of sort of hypothetical
link |
00:48:49.360
questions of would it be more fulfilling to live in a virtual world, like if you were
link |
00:48:58.200
able to plug your brain in in a very dense way into a video game, like which world would
link |
00:49:04.400
you want to live in, in the video game or in the physical world?
link |
00:49:10.000
For most of us, we're kind of toying with the idea of the video game, but we still want
link |
00:49:14.680
to live in the physical world, have friendships and relationships in the physical world, but
link |
00:49:20.920
we don't know that.
link |
00:49:21.920
Again, it's a fog and maybe in a hundred years we're all living inside a video game, hopefully
link |
00:49:27.480
not Call of Duty, hopefully more like Sims 5, which version is it on?
link |
00:49:34.160
For you individually though, does it make you sad that your brain ends?
link |
00:49:41.880
That you die one day very soon?
link |
00:49:46.000
That the whole thing, that data source just goes offline sooner than you would like?
link |
00:49:54.880
That's a complicated question.
link |
00:49:56.040
I would have answered it differently in different times of my life.
link |
00:50:00.600
I had chronic depression for 10 years, and so in that 10 year time period, I desperately
link |
00:50:07.120
wanted lights to be off, and the thing that made it even worse is I was born into a religion.
link |
00:50:15.960
It was the only reality I ever understood, and it's difficult to articulate to people
link |
00:50:20.320
when you're born into that kind of reality and it's the only reality you're exposed to.
link |
00:50:24.960
You are literally blinded to the existence of other realities because it's so much the
link |
00:50:29.760
in group, out group thing, and so in that situation, it was not only that I desperately
link |
00:50:35.080
wanted lights out forever, it was that I couldn't have lights out forever.
link |
00:50:38.920
It was that there was an afterlife, and this afterlife had this system that would either
link |
00:50:45.920
penalize or reward you for your behaviors, and so it's almost like this indescribable
link |
00:50:56.560
hopelessness of not only being in a hopeless despair of not wanting to exist, but then
link |
00:51:02.360
also being forced to exist, and so there was a duration of my life
link |
00:51:07.920
where I'd say, yes, I have no remorse for lights being out and actually want it more
link |
00:51:14.560
than anything in the entire world.
link |
00:51:18.840
There are other times where I'm looking out at the future and I say, this is an opportunity
link |
00:51:24.900
for future evolving human conscious experience that is beyond my ability to understand, and
link |
00:51:31.880
I jump out of bed and I race to work and I can't think about anything else, but I think
link |
00:51:41.440
the reality for me is, I don't know what it's like to be in your head, but in my head, when
link |
00:51:47.160
I wake up in the morning, I don't say, good morning, Brian, I'm so happy to see you like
link |
00:51:53.040
I'm sure you're just going to be beautiful to me today.
link |
00:51:56.080
You're not going to make a huge long list of everything you should be anxious about.
link |
00:51:59.200
You're not going to repeat that list to me 400 times.
link |
00:52:01.520
You're not going to have me relive all the regrets I've made in life.
link |
00:52:04.560
I'm sure you're not going to do any of that.
link |
00:52:06.080
You're just going to just help me along all day long.
link |
00:52:08.880
I mean, it's a brutal environment in my brain, and we've just become normalized to this environment
link |
00:52:15.520
that we just accept that this is what it means to be human, but if we look at it, if we try
link |
00:52:21.080
to muster as much soberness as we can about the realities of being human, it's brutal,
link |
00:52:27.240
at least it is for me.
link |
00:52:29.360
So am I sad that the brain may be off one day?
link |
00:52:37.000
It depends on the contextual setting.
link |
00:52:38.480
How am I feeling?
link |
00:52:39.480
At what moment are you asking me that?
link |
00:52:41.000
My mind is so fickle, and this is why, again, I don't trust my conscious mind.
link |
00:52:45.160
I have been given realities, I was given a religious reality that was a video game.
link |
00:52:51.880
And then I figured out it was not a real reality.
link |
00:52:54.640
And then I lived in a depressive reality, which delivered this terrible hopelessness.
link |
00:52:59.880
That wasn't a real reality.
link |
00:53:00.880
Then I discovered behavioral psychology, and I figured out the 188 cognitive biases and
link |
00:53:06.440
how my brain is distorting reality at the time.
link |
00:53:08.800
I have gone from one reality to another.
link |
00:53:11.280
I don't trust reality.
link |
00:53:13.560
I don't trust realities are given to me.
link |
00:53:15.960
And so to try to make a decision on what I value or not value that future state, I don't
link |
00:53:20.840
trust my response.
link |
00:53:23.000
So not fully listening to the conscious mind at any one moment as the ultimate truth, but
link |
00:53:31.320
allowing it to go up and down as it does, and just kind of observing it?
link |
00:53:35.280
Yes.
link |
00:53:36.280
I assume that whatever my conscious mind delivers up to my awareness is wrong upon landing.
link |
00:53:43.400
And I just need to figure out where it's wrong, how it's wrong, how wrong it is, and then
link |
00:53:47.080
try to correct for it as best I can.
link |
00:53:49.360
But I assume that on impact, it's mistaken in some critical ways.
link |
00:53:55.640
Is there something you can say by way of advice when the mind is depressive, when the conscious
link |
00:54:01.480
mind serves up dark thoughts, how do you deal with that, like how in your own
link |
00:54:09.720
life you've overcome that, and how others who are experiencing that can overcome it?
link |
00:54:15.520
Two things.
link |
00:54:16.520
One, that those depressive states are biochemical states.
link |
00:54:25.800
It's not you.
link |
00:54:29.000
And the suggestions that this state delivers to you, the suggestion
link |
00:54:33.680
of the hopelessness of life or the meaninglessness of it or that you should hit the eject button,
link |
00:54:42.600
that's a false reality.
link |
00:54:45.200
And I completely understand the rational decision to commit suicide.
link |
00:54:53.920
It is not lost on me at all that that is an irrational situation, but the key is when
link |
00:54:58.880
you're in that situation and those thoughts are landing to be able to say, thank you,
link |
00:55:04.160
you're not real.
link |
00:55:06.660
I know you're not real.
link |
00:55:08.280
And so I'm in a situation where for whatever reason I'm having this neurochemical state,
link |
00:55:13.920
but that state can be altered.
link |
00:55:16.440
And so it again, it goes back to the realities of the difficulties of being human.
link |
00:55:21.320
And like when I was trying to solve my depression, I tried literally, if you name it, I tried
link |
00:55:26.240
it systematically and nothing would fix it.
link |
00:55:29.480
And so this is what gives me hope with brain interfaces.
link |
00:55:32.360
For example, like, could I have numbers on my brain?
link |
00:55:35.520
Can I see what's going on?
link |
00:55:36.520
I go to the doctor and it's like, how do you feel?
link |
00:55:38.920
I don't know.
link |
00:55:39.920
Terrible.
link |
00:55:40.920
Like on a scale from one to 10, how bad do you want to commit suicide?
link |
00:55:44.120
10.
link |
00:55:45.120
Okay.
link |
00:55:46.120
At this moment.
link |
00:55:47.120
Here's this bottle.
link |
00:55:48.520
How much do I take?
link |
00:55:49.520
Well, I don't know.
link |
00:55:50.520
Like just.
link |
00:55:51.520
Yeah.
link |
00:55:52.520
It's very, very crude.
link |
00:55:53.520
It opens up the, yeah, it opens up the possibility of really helping in those dark moments to
link |
00:56:01.720
first understand the ways, the ups and downs of those dark moments.
link |
00:56:06.160
On the complete flip side of that, I am very conscious in my own brain and deeply, deeply
link |
00:56:16.320
grateful that it's almost like a chemistry thing, a biochemistry thing, that I go many
link |
00:56:24.240
times throughout the day, I'll look at like this cup and I'll be overcome with joy how
link |
00:56:31.840
amazing it is to be alive.
link |
00:56:34.280
I actually think my biochemistry is such that it's not as common, like I've talked to people
link |
00:56:42.400
and I don't think that's that common, like it's a, and it's not a rational thing at all.
link |
00:56:48.400
It's like, I feel like I'm on drugs and I'll just be like, whoa, and a lot of people talk
link |
00:56:56.200
about like the meditative experience will allow you to sort of, you know, look at some basic
link |
00:57:00.800
things like the movement of your hand as deeply joyful because that's like, that's life.
link |
00:57:05.880
But I get that from just looking at a cup, like I'm waiting for the coffee to brew.
link |
00:57:10.040
And I'll just be like, fuck, life is awesome.
link |
00:57:15.160
And I'll sometimes tweet that, but then I'll like regret it later, like, God damn it,
link |
00:57:18.520
you're so ridiculous.
link |
00:57:20.160
But yeah, so, but that is purely chemistry, like there's no rational, it doesn't fit with
link |
00:57:27.480
the rest of my life.
link |
00:57:28.480
I have all this shit.
link |
00:57:29.480
I'm always late to stuff.
link |
00:57:30.640
I'm always like, there's all this stuff, you know, I'm super self critical, like really
link |
00:57:35.000
self critical about everything I do, to the point I almost hate everything I do.
link |
00:57:40.000
But there's this engine of joy for life outside of all that.
link |
00:57:43.640
And that has to be chemistry.
link |
00:57:45.160
And the flip side of that is what depression probably is, is the opposite of that feeling
link |
00:57:50.520
of, like, because I bet you that feeling of the cup being amazing would save anybody
link |
00:57:59.400
in a state of depression.
link |
00:58:01.040
Like that would be fresh water when you're in a desert, a drink of water. Shit, man.
link |
00:58:08.160
And the brain is a, it would be nice to understand where that's coming from, to be able to understand
link |
00:58:17.520
how you hit those lows and those highs that have nothing to do with the actual reality.
link |
00:58:21.320
It has to do with some very specific aspects of how you maybe see the world, maybe it could
link |
00:58:28.880
be just like basic habits you engage in and then how to walk along the line to find those
link |
00:58:34.240
experiences of joy.
link |
00:58:35.400
And this goes back to the discussion we're having, that human cognition is, in volume, the
link |
00:58:41.240
largest input of raw material into society.
link |
00:58:46.240
And it's not quantified.
link |
00:58:48.000
We have no bearings on it.
link |
00:58:50.360
And so we just, you wonder, we both articulated some of the challenges we have in our own
link |
00:58:57.040
mind.
link |
00:58:59.000
And it's likely that others would say, I have something similar.
link |
00:59:04.000
And you wonder, when you look at society, how does that contribute to all the other compounding
link |
00:59:11.480
problems that we're experiencing?
link |
00:59:12.880
How does that blind us to the opportunities we could be looking at?
link |
00:59:18.720
And so it really, it has this potential distortion effect on reality that just makes everything
link |
00:59:26.560
worse.
link |
00:59:27.560
And I hope if we can put some, if we can assign some numbers to these things and just to get
link |
00:59:35.040
our bearings, so we're aware of what's going on, if we could find greater stabilization
link |
00:59:39.960
in how we conduct our lives and how we build society, it might be the thing that enables
link |
00:59:48.360
us to scaffold, because again, humans have done a fantastic
link |
00:59:53.560
job systematically scaffolding technology, science and institutions.
link |
01:00:00.600
It's human.
link |
01:00:01.600
It's our own selves, which we have not been able to scaffold.
link |
01:00:04.840
We are the one part of this intelligence infrastructure that remains unchanged.
link |
01:00:11.720
Is there something you could say about coupling this brain data with not just the basic human
link |
01:00:19.560
but say an experience, you mentioned sleep, but the wildest experience, which is psychedelics.
link |
01:00:26.640
Is there, and there's been quite a few studies now that are being approved and run, which
link |
01:00:33.440
is exciting from a scientific perspective on psychedelics.
link |
01:00:38.160
Do you think, what do you think happens to the brain on psychedelics?
link |
01:00:44.720
And how can data about this help us understand it?
link |
01:00:48.520
And when you're on DMT, do you see elves and can we guess, can we convert that into data?
link |
01:00:53.400
Can you add aliens in there?
link |
01:00:55.880
Yeah.
link |
01:00:56.880
Aliens, definitely.
link |
01:00:57.880
Do you actually meet aliens and elves? Are elves the aliens? I'm asking for a few
link |
01:01:02.720
Austin friends that are convinced that they've actually met the elves.
link |
01:01:09.080
What are elves like?
link |
01:01:10.080
Are they friendly?
link |
01:01:11.080
Are they?
link |
01:01:12.080
I haven't met them personally.
link |
01:01:13.080
They're like the Smurfs, they're industrious and they have different
link |
01:01:15.760
skill sets, and yeah, I think they're very critical as friends.
link |
01:01:24.600
They're trolls, the elves are trolls.
link |
01:01:28.520
No, but they care about you.
link |
01:01:30.440
So there's a bunch of different version of trolls, there's loving trolls that are harsh
link |
01:01:36.680
on you, but they want you to be better and there's trolls that just enjoy your destruction.
link |
01:01:42.720
And I think they're the ones that care for you.
link |
01:01:45.120
Like, I think they're criticism for my, see, I'm talking, I haven't met them directly.
link |
01:01:49.400
So I'm talking, it's like a friend of a friend.
link |
01:01:51.240
Yeah.
link |
01:01:52.240
They're getting the telephone.
link |
01:01:53.240
Yeah.
link |
01:01:54.240
A bit of an end.
link |
01:01:55.240
The whole point is that psychedelics, and certainly DMT, this is where the brain
link |
01:02:02.120
data versus word data fails, which is, you know, words can't convey the experience.
link |
01:02:08.360
Most people that you can be poetic and so on, but it really does not convey the experience
link |
01:02:12.200
of what it actually means to meet the elves.
link |
01:02:16.240
To me, what baselines this conversation is, imagine if you, if we were interested in the
link |
01:02:21.720
health of your heart and we started and said, okay, Lex, self introspect, tell me how's the
link |
01:02:29.360
health of your heart?
link |
01:02:30.360
And you sit there and you close your eyes and you think, feels all right.
link |
01:02:34.200
Like things, things feel okay.
link |
01:02:36.840
And then you went to the cardiologist and the cardiologist like, hey, Lex, you know,
link |
01:02:40.120
tell me how you feel.
link |
01:02:41.120
And I go, actually, what I really like you to do is do an EKG and a blood panel and look
link |
01:02:46.240
at arterial plaques and let's look at my cholesterol and there's like five to 10 studies
link |
01:02:52.120
you would do.
link |
01:02:53.120
They would then give you this report and say, here's the quantified health of your heart.
link |
01:02:58.160
Now with this data, I'm going to prescribe the following regime of exercise and maybe
link |
01:03:03.360
I'll put you on a statin, like, et cetera, but the protocol is based upon this data.
link |
01:03:08.640
You would think the cardiologist is out of their mind if they just gave you a bottle
link |
01:03:12.840
of statins based upon your, well, I think something's kind of wrong, and they
link |
01:03:16.760
just kind of experiment and see what happens.
link |
01:03:19.480
But that's what we do with our mental health today.
link |
01:03:22.680
So it's, it's kind of absurd.
link |
01:03:24.680
And so if you look at psychedelics to have, again, to be able to measure the brain and
link |
01:03:29.480
get a baseline state and then to measure during a psychedelic experience and post a psychedelic
link |
01:03:34.640
experience and then do it longitudinally, you now have a quantification of what's going
link |
01:03:39.120
on.
link |
01:03:40.120
And so you could then pose questions, what molecule is appropriate at what dosages at
link |
01:03:45.280
what frequency in what contextual environment, what happens when I have this diet with this
link |
01:03:49.640
molecule with this experience, all the experimentation you do when you have good sleep data or HRV.
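To make the measurement loop described here concrete, here is a minimal sketch of what a longitudinal record might look like. Everything in it is hypothetical: the field names, the phases, and the "brain_state" summary features are illustrative stand-ins, not Kernel's actual data model.

```python
# A minimal sketch of a longitudinal record for quantified psychedelic sessions.
# Illustrative only: field names, units, and the brain_state summary are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List

@dataclass
class BrainMeasurement:
    timestamp: datetime
    phase: str                      # "baseline", "during", or "post"
    brain_state: Dict[str, float]   # summary features derived from the recording

@dataclass
class Session:
    molecule: str                   # which compound was used
    dose_mg: float
    context: str                    # setting: "clinic", "home", etc.
    diet_notes: str
    measurements: List[BrainMeasurement] = field(default_factory=list)

@dataclass
class LongitudinalStudy:
    participant_id: str
    sessions: List[Session] = field(default_factory=list)

    def baseline_drift(self, feature: str) -> List[float]:
        """Track how a chosen baseline feature changes across sessions over time."""
        return [
            m.brain_state.get(feature, float("nan"))
            for s in self.sessions
            for m in s.measurements
            if m.phase == "baseline"
        ]
```

With a structure like this, the questions posed above (which molecule, at what dose, at what frequency, in what context) become queries over data rather than guesses.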
link |
01:03:55.720
And so that's what I think happens.
link |
01:03:57.920
What we could potentially do with psychedelics is we could add this level of sophistication
link |
01:04:03.080
that is not in the industry currently.
link |
01:04:05.600
And it may improve the outcomes people experience, it may improve the safety and efficacy.
link |
01:04:11.680
And so that's what I hope we are able to achieve.
link |
01:04:14.520
And it would transform mental health because we would finally have numbers to work with
link |
01:04:21.600
the baseline ourselves.
link |
01:04:22.600
And then if you think about it, we, when we talk about things related to the mind, we
link |
01:04:27.000
talk about the modality.
link |
01:04:28.400
We use words like meditation or psychedelics or something else because we can't talk about
link |
01:04:33.060
a marker in the brain.
link |
01:04:34.180
We can't use a word to say, we can't talk about cholesterol.
link |
01:04:37.000
We don't talk about plaque in the arteries.
link |
01:04:38.400
We don't talk about HRV.
link |
01:04:40.560
And so if we have numbers, then the solutions get mapped to numbers instead of the modalities
link |
01:04:46.840
being the thing we talk about.
link |
01:04:47.840
Meditation just does good things in a crude fashion.
link |
01:04:52.680
So in your blog post, zeroth principle thinking, good title, you ponder, how do people come
link |
01:04:57.420
up with truly original ideas?
link |
01:05:00.560
What's your thoughts on this as a human and as a person who's measuring brain data?
link |
01:05:06.040
Zeroth principles are building blocks.
link |
01:05:10.160
First principles are understanding of system laws.
link |
01:05:14.600
So if you take, for example, Sherlock Holmes, he's a first principles thinker.
link |
01:05:18.920
So he says, once you've eliminated the impossible, anything that remains, however improbable,
link |
01:05:28.440
is true, whereas Dirk Gently, the holistic detective by Douglas Adams, says, I don't
link |
01:05:36.040
like eliminating the impossible.
link |
01:05:38.320
So when someone says, from a first principles perspective, they're trying to assume
link |
01:05:46.240
the fewest number of things within a given timeframe.
link |
01:05:52.080
And so after Braintree and Venmo, I set my mind to the question of what single thing
link |
01:06:00.640
can I do that would maximally increase the probability that the human race thrives beyond
link |
01:06:05.560
what we can even imagine.
link |
01:06:07.280
And I found that in my conversations with others in the books I read in my own deliberations,
link |
01:06:14.520
I had a missing piece of the puzzle.
link |
01:06:17.760
Because I didn't feel like, yeah, I didn't feel like the future could be deduced from
link |
01:06:25.440
first principles thinking.
link |
01:06:28.120
And that's when I read the book Zero: The Biography of a Dangerous Idea.
link |
01:06:33.280
And I,
link |
01:06:34.280
It's a really good book, by the way.
link |
01:06:35.280
It's, I think it's my favorite book I've ever read.
link |
01:06:38.320
It's also a really interesting number, zero.
link |
01:06:40.680
And I wasn't aware that the number zero had to be discovered.
link |
01:06:44.280
I didn't realize that it caused a revolution in philosophy and in the end just tore up math
link |
01:06:49.680
and it tore up.
link |
01:06:50.680
I mean, it built modern society, but it wrecked everything in its way.
link |
01:06:55.200
It was an unbelievable disruptor and it was so difficult for society to get their heads
link |
01:06:59.920
around it.
link |
01:07:01.440
And so zero is, of course, the representation of zeroth principle thinking, which is,
link |
01:07:07.920
it's the caliber and consequential nature of an idea.
link |
01:07:13.880
And so when you talk about what kind of ideas have civilization transforming properties,
link |
01:07:23.720
oftentimes they fall in the zeroth category.
link |
01:07:25.640
And so in thinking this through, I, I was wanting to find a quantitative structure on
link |
01:07:32.080
how to think about these zeroth principles.
link |
01:07:35.240
And that's, so I came up with that to be a coupler with first principles thinking.
link |
01:07:40.640
And so now it's a staple as part of how I think about the world and the future.
link |
01:07:45.320
So it emphasizes trying to identify the lens on that word impossible, like what is impossible,
link |
01:07:51.400
essentially trying to identify what is impossible and what is possible.
link |
01:07:55.520
And being as, how do you, I mean, this, this is the thing is most of society tells you
link |
01:08:03.280
the range of things they say is impossible is very wide.
link |
01:08:06.000
So you need to be shrinking that.
link |
01:08:07.920
I mean, that's the whole process of, of this kind of thinking is you need to be very rigorous
link |
01:08:14.520
in, in trying to be, trying to draw the lines of what is actually impossible because very
link |
01:08:23.880
few things are actually impossible.
link |
01:08:26.320
I don't know what is actually impossible, like the Joe Rogan line, it's entirely possible.
link |
01:08:32.960
I like that approach to, to science, to engineering, to entrepreneurship.
link |
01:08:39.080
It's entirely possible, basically shrink the impossible to zero, to a very small set.
link |
01:08:45.040
Yeah.
link |
01:08:46.040
Life constraints favor first principles thinking because it, it enables faster action with
link |
01:08:55.320
higher probability of success.
link |
01:08:58.040
Pursuing zeroth principle optionality is expensive and uncertain.
link |
01:09:02.680
And so in a society constrained by resources, time and money and a desire for social status
link |
01:09:09.960
and so on, it minimizes zeroth principle thinking.
link |
01:09:14.680
But the reason why I think zeroth principle thinking should be a staple of our shared
link |
01:09:20.440
cognitive infrastructure is if you look through the history of past couple of thousand years
link |
01:09:26.040
and let's just say we arbitrarily, subjectively try to assess what is a zeroth-level
link |
01:09:33.280
idea and we say how many have occurred on what time scales and what were the contextual
link |
01:09:38.680
settings for it.
link |
01:09:40.120
I would argue that if you look at AlphaGo, it played Go as if from another dimension.
link |
01:09:48.840
The human Go players, when they saw AlphaGo's moves, attributed it to
link |
01:09:54.720
playing with an alien, to playing Go with AlphaGo being from another dimension.
link |
01:10:00.320
And so if you say computational intelligence has an attribute of introducing zero-like
link |
01:10:07.960
insights, then what is going to be the occurrence of zeros in society going
link |
01:10:16.360
forward?
link |
01:10:18.040
And you could reasonably say probably a lot more than have occurred and probably more
link |
01:10:22.440
at a faster pace.
link |
01:10:24.280
So then if you say what happens if you have this computational intelligence throughout
link |
01:10:28.480
society, where the manufacturing, design and distribution of intelligence is now
link |
01:10:32.160
heading towards zero, you have an increased number of zeros being produced with a tight
link |
01:10:38.560
connection between humans and computers.
link |
01:10:41.560
That's when I got to a point and said we cannot predict the future with first principle thinking.
link |
01:10:47.600
We can't, that cannot be our imagination set.
link |
01:10:50.480
It can't be our sole anchor in the situation that basically the future of our conscious
link |
01:10:56.960
existence 20, 30, 40, 50 years is probably a zero.
link |
01:11:03.080
So just to clarify, when you say zero, you're referring to basically a truly revolutionary
link |
01:11:10.800
idea.
link |
01:11:11.800
Yes, something that is currently not a building block of our shared conscious existence, either
link |
01:11:21.640
in the form of knowledge, it's currently not manifest in what we acknowledge.
link |
01:11:28.240
So zeroth principle thinking is playing with ideas that are so revolutionary that we can't
link |
01:11:37.240
even clearly reason about the consequences once those ideas come to be.
link |
01:11:42.000
Yeah.
link |
01:11:43.000
Or for example, like Einstein, that was a zero, I would categorize it as a zeroth principle
link |
01:11:50.560
insight.
link |
01:11:51.560
You mean general relativity, space-time, of course.
link |
01:11:54.280
Yeah.
link |
01:11:55.280
Yeah.
link |
01:11:56.280
Basically, building upon what Newton had done and said, yes, also, and it just changed
link |
01:12:03.320
the fabric of our understanding of reality.
link |
01:12:06.760
And so that was unexpected, it existed.
link |
01:12:09.920
We just, it became part of our awareness.
link |
01:12:13.400
And the moves AlphaGo made existed, it just came into our awareness.
link |
01:12:19.120
And so to your point, there's this question of what do we know and what don't we know?
link |
01:12:28.480
Do we think we know 99% of all things or do we think we know 0.001% of all things?
link |
01:12:34.440
And that goes back to known knowns, known unknowns and unknown unknowns.
link |
01:12:37.560
And first principles and zeroth principle thinking give us a quantitative framework
link |
01:12:41.320
to say, there's no way for us to mathematically try to create probabilities for these things.
link |
01:12:47.600
Therefore, it would be helpful if they were just part of our standard thought processes
link |
01:12:53.320
because it may encourage different behaviors in what we do individually, collectively as
link |
01:13:00.640
a society, what we aspire to, what we talk about, the possibility sets we imagine.
link |
01:13:05.160
Yeah.
link |
01:13:06.160
I've been engaged in that kind of thinking quite a bit and thinking about engineering
link |
01:13:12.800
of consciousness.
link |
01:13:14.520
I think it's feasible.
link |
01:13:16.160
I think it's possible in the language that we're using here.
link |
01:13:19.360
And it's very difficult to reason about a world when inklings of consciousness can be
link |
01:13:25.880
engineered into artificial systems.
link |
01:13:30.160
Not from a philosophical perspective, but from an engineering perspective, I believe
link |
01:13:36.200
a good step towards engineering consciousness is creating, engineering the illusion of consciousness.
link |
01:13:45.680
I'm captivated by our natural predisposition to anthropomorphize things.
link |
01:13:55.400
And I think that's what we, I don't want to hear from the philosophers, but I think that's
link |
01:14:02.840
what we kind of do to each other, that consciousness is created socially, that much of the power
link |
01:14:14.480
of consciousness is in the social interaction.
link |
01:14:18.040
I create your consciousness by having interacted with you, and that's the display of consciousness.
link |
01:14:27.800
It's the same as the display of emotion.
link |
01:14:30.360
Emotion is created through communication.
link |
01:14:33.360
Language is created through its use.
link |
01:14:36.240
And then somehow we humans, especially philosophers, with the hard problem of consciousness, really
link |
01:14:41.160
want to believe that we possess this thing that's like, there's an elf sitting there
link |
01:14:49.400
with a hat or a name tag says consciousness, and they're feeding this subjective experience
link |
01:14:56.440
to us, as opposed to it actually being an illusion that we construct to make social
link |
01:15:03.360
communication more effective.
link |
01:15:05.520
And so I think if you focus on creating the illusion of consciousness, you can create some
link |
01:15:11.240
very fulfilling experiences in software.
link |
01:15:14.880
And so that to me is the compelling space of ideas to explore.
link |
01:15:18.680
I agree with you.
link |
01:15:19.680
And I think going back to our experience together with our interfaces on, you could imagine
link |
01:15:24.000
if we get to a certain level of maturity.
link |
01:15:26.120
So first let's take the inverse of this.
link |
01:15:28.800
So you and I text back and forth and we're sending each other emojis.
link |
01:15:33.280
That has a certain amount of information transfer rate as we're communicating with each other.
link |
01:15:39.360
And so in our communication with people via email and text and whatnot, we've taken the
link |
01:15:43.800
bandwidth of human interaction, the information transfer rate, and we've reduced it.
link |
01:15:50.000
We have less social cues.
link |
01:15:51.520
We have less information to work with.
link |
01:15:53.080
There's a lot more opportunity for misunderstanding.
link |
01:15:55.360
So that is altering the conscious experience between two individuals.
link |
01:15:59.600
And if we add interfaces to the equation, let's imagine now we amplify the dimensionality
link |
01:16:04.040
of our communications.
link |
01:16:05.720
That to me is what you're talking about, which is consciousness engineering.
link |
01:16:09.520
Perhaps I understand you with more dimensions.
link |
01:16:13.420
So maybe I understand that when you look at the cup and you experience that happiness,
link |
01:16:17.640
you can tell me you're happy.
link |
01:16:18.680
And I then do theory of mind and say, I can imagine what it might be like to be Lex and
link |
01:16:23.280
feel happy about seeing this cup.
link |
01:16:25.400
But if the interface could then quantify and give me a 50-dimensional vector space model and say, this
link |
01:16:30.240
is the version of happiness that Lex is experiencing as he looked at this cup, then it would allow
link |
01:16:35.640
me potentially to have much greater empathy for you and understand you as a human of this
link |
01:16:39.200
is how you experience joy, which is entirely unique from how I experience joy, even though
link |
01:16:44.320
we assumed ahead of time that we were having some kind of similar experience.
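As a toy illustration of that idea, a sketch like the following (purely hypothetical; the vectors are random stand-ins, not anything a real interface produces) shows how two people's 50-dimensional emotional-state vectors could be compared so that "how close is your joy to mine" becomes a number:

```python
# Toy sketch: compare two people's emotional-state vectors with cosine similarity.
# The 50-dimensional "happiness" vectors here are made up for illustration;
# a real system would have to derive them from brain recordings.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return similarity in [-1, 1]; 1 means the two states point the same way."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
lex_happiness = rng.normal(size=50)    # hypothetical 50-d state while looking at the cup
bryan_happiness = rng.normal(size=50)  # hypothetical 50-d state for the same moment

print(f"overlap between the two experiences: {cosine_similarity(lex_happiness, bryan_happiness):.2f}")
```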
link |
01:16:48.200
But I agree with you that we do consciousness engineering today in everything we do when
link |
01:16:52.880
we talk to each other, when we're building products, and that we're entering into a stage
link |
01:16:59.640
where it will be much more methodical and quantitative based and computational in how
link |
01:17:06.680
we go about doing it, which to me, I find encouraging because I think it creates better
link |
01:17:11.420
guardrails to create ethical systems on, versus right now, I feel like it's really
link |
01:17:19.920
a wild, wild West on how these interactions are happening.
link |
01:17:23.240
Yeah.
link |
01:17:24.240
And it's funny you focus on human to human, but that this kind of data enables human
link |
01:17:28.040
to machine.
link |
01:17:29.040
Yes.
link |
01:17:30.040
Interaction, which is what we're kind of talking about when we say engineering consciousness.
link |
01:17:36.720
And that will happen, of course, let's flip that on its head.
link |
01:17:40.680
Right now, we're putting humans as the central node.
link |
01:17:44.760
What if we gave GPT3 a bunch of human brains and said, hey, GPT3, learn some manners when
link |
01:17:52.080
you speak and run your algorithms on humans brains and see how they respond so you can
link |
01:17:59.160
be polite and so that you can be friendly and so that you can be conversationally appropriate.
link |
01:18:04.760
But to invert it, to give our machines a training set in real time with closed-loop feedback
link |
01:18:11.760
so that our machines were better equipped to find their way through our society in polite
link |
01:18:20.720
and kind and appropriate ways.
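A cartoon of that closed loop, under heavy assumptions, might look like the sketch below. Nothing in it is real: the "neural comfort score" is a simulated stand-in for whatever signal a brain interface might one day provide, and the reply styles are invented. It only shows the shape of a feedback loop that nudges a system toward whatever listeners' brains rate as more comfortable.

```python
# Toy closed-loop sketch: a system proposes reply styles, a simulated neural comfort
# score acts as the reward, and a running preference per style is updated.
import random

REPLY_STYLES = ["blunt", "polite", "playful"]
preferences = {style: 0.0 for style in REPLY_STYLES}
counts = {style: 0 for style in REPLY_STYLES}

def simulated_neural_comfort(style: str) -> float:
    """Stand-in for a brain-derived comfort signal; here, politeness scores higher."""
    base = {"blunt": 0.3, "polite": 0.8, "playful": 0.6}[style]
    return base + random.gauss(0, 0.1)

for step in range(200):
    # epsilon-greedy: mostly exploit the best-rated style, sometimes explore
    if random.random() < 0.1:
        style = random.choice(REPLY_STYLES)
    else:
        style = max(preferences, key=preferences.get)
    reward = simulated_neural_comfort(style)
    counts[style] += 1
    preferences[style] += (reward - preferences[style]) / counts[style]  # running mean

print(preferences)  # the loop settles on the style the simulated listeners rate highest
```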
link |
01:18:22.320
I love that.
link |
01:18:23.320
Yeah.
link |
01:18:24.320
Or better yet, teach it some, have it read the founding documents and have it visit Austin
link |
01:18:31.280
in Texas.
link |
01:18:32.280
And so that when you ask, when you tell it, why don't you learn some manners, GPT3 learns
link |
01:18:37.560
to say no.
link |
01:18:41.240
And learns what it means to be free and a sovereign individual.
link |
01:18:46.280
So that depends.
link |
01:18:47.280
So it depends what kind of a version of GPT3 you want.
link |
01:18:50.320
One that's free, one that behaves well with the social revolution.
link |
01:18:55.160
You want a socialist GPT3, you want an anarchist GPT3, you want a polite, take-it-home-
link |
01:19:02.080
to-visit-mom-and-dad GPT3, and you want a party-in-Vegas, go-to-a-strip-club
link |
01:19:09.120
GPT3.
link |
01:19:10.120
You want all flavors.
link |
01:19:11.280
And then you've got to have goal alignment between all those.
link |
01:19:14.320
Yeah.
link |
01:19:15.320
They don't want to manipulate each other for sure.
link |
01:19:21.040
So that's, I mean, you kind of spoke to ethics, though, one of the concerns that people have
link |
01:19:27.600
in this modern world, the digital data is that of privacy and security, but privacy,
link |
01:19:33.240
you know, they're concerned that when they share data, it's the same thing with you when
link |
01:19:38.640
we trust other human beings in being fragile and revealing something that we're vulnerable
link |
01:19:45.360
about, there's a leap of faith, there's a leap of trust that that's
link |
01:19:52.720
going to be just between us, that there's a privacy to it.
link |
01:19:55.520
And then the challenge is when you're in the digital space, then sharing your data with
link |
01:20:00.320
companies that use that data for advertisement, all those kinds of things, there's a hesitancy
link |
01:20:06.760
to share that much data, to share a lot of deep personal data.
link |
01:20:10.920
And if you look at brain data, that feels a whole lot like it's richly deeply personal
link |
01:20:17.040
data.
link |
01:20:18.040
So how do you think about privacy with this kind of ocean of data?
link |
01:20:22.120
I think we got off to a wrong start with the internet where the basic rules of play for
link |
01:20:31.520
the companies that be was, if you're a company, you can go out and get as much information
link |
01:20:37.700
on a person as you can find without their approval, and you can also do things to induce
link |
01:20:45.360
them to give you as much information.
link |
01:20:48.360
And you don't need to tell them what you're doing with it.
link |
01:20:51.100
You can do anything on the backside, you can make money on it.
link |
01:20:54.000
But the game is who can acquire the most information and devise the most clever schemes to do it.
link |
01:21:00.520
That was a bad starting place.
link |
01:21:02.920
And so we are in this period where we need to correct for that.
link |
01:21:07.720
And we need to say, first of all, the individual always has control over their data.
link |
01:21:15.000
It's not a free for all.
link |
01:21:16.000
It's not like a game of Hungry Hungry Hippos, where they can just go at it and grab as much as
link |
01:21:19.440
they want.
link |
01:21:20.440
So for example, when your brain data was recorded today, the first thing we did in the kernel
link |
01:21:23.920
app was you have control over your data.
link |
01:21:27.840
And so it's individual consent, it's individual control, and then you can build up on top
link |
01:21:32.800
of that.
link |
01:21:33.800
But it has to be based upon some clear rules of play.
link |
01:21:37.320
Everyone knows what's being collected, they know what's being done with it, and the person
link |
01:21:41.160
has control over it.
link |
01:21:42.160
So transparency and control.
link |
01:21:43.880
So everybody knows. What does control look like, my ability to delete the data if I want?
link |
01:21:48.920
Yeah, delete it, and to know who it's being shared with, under what terms and conditions.
link |
01:21:53.280
We haven't reached that level of sophistication with our products of if you say, for example,
link |
01:22:00.960
hey Spotify, please give me a customized playlist according to my Neurome.
link |
01:22:07.680
You could say you can have access to this vector space model, but only for this duration
link |
01:22:11.800
of time, and then you've got to delete it.
link |
01:22:15.360
We haven't gotten there to that level of sophistication, but these are ideas we need to start talking
link |
01:22:18.960
about: how would you actually structure permissions?
link |
01:22:23.120
And I think it creates a much more stable set for society to build where we understand
link |
01:22:29.920
the rules of play and people aren't vulnerable to being taken advantage of.
link |
01:22:34.840
It's not fair for an individual to be taken advantage of without their awareness with
link |
01:22:41.240
some practice that some company is doing for its sole benefit.
link |
01:22:44.720
And so hopefully we are going through a process now where we're correcting for these things
link |
01:22:48.240
and that it can be an economy-wide shift, because really these are fundamentals we need
link |
01:22:59.640
to have in place.
link |
01:23:01.400
It's kind of fun to think about like in Chrome when you install an extension or like install
link |
01:23:07.320
an app, it asks you what permissions you're willing to give, and it'd be cool if in
link |
01:23:11.240
the future
link |
01:23:12.240
it's just like: you can have access to my brain data.
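One way to imagine structuring such a permission, as a sketch only (the field names and enforcement logic are hypothetical, not the Kernel app's actual consent model), is an explicit, expiring grant that names the data scope, the purpose, and when access ends:

```python
# Sketch of an explicit, time-limited data grant for brain-derived data.
# Field names and the expiry check are hypothetical illustrations.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class BrainDataGrant:
    owner: str                 # the individual who controls the data
    recipient: str             # e.g. "spotify"
    scope: str                 # which representation is shared, e.g. "neurome_vector_v1"
    purpose: str               # what the recipient may do with it
    granted_at: datetime
    duration: timedelta        # how long access lasts before mandatory deletion

    def is_active(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return now < self.granted_at + self.duration

grant = BrainDataGrant(
    owner="lex",
    recipient="spotify",
    scope="neurome_vector_v1",
    purpose="generate a customized playlist",
    granted_at=datetime.now(timezone.utc),
    duration=timedelta(hours=24),
)
print(grant.is_active())  # True now; once expired, the recipient is obliged to delete the data
```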
link |
01:23:16.920
I mean, it's not unimaginable in the future that the big technology companies have built
link |
01:23:23.440
a business based upon acquiring data about you so that they can then create a model
link |
01:23:27.440
of you and sell that predictability.
link |
01:23:29.560
And so it's not unimaginable that you will create with a kernel device, for example,
link |
01:23:33.920
a more reliable predictor of you than they could.
link |
01:23:37.400
And that they're asking you for permission to complete their objectives and you're the
link |
01:23:40.680
one that gets to negotiate that with them and say, sure, but so it's not unimaginable
link |
01:23:45.880
that might be the case.
link |
01:23:49.160
So there's a guy named Elon Musk and he has a company, one of his many companies, called
link |
01:23:54.000
Neuralink, that's also excited about the brain.
link |
01:23:58.800
So it'd be interesting to hear your kind of opinions about a very different approach
link |
01:24:02.560
that's invasive, that requires surgery to implant a data collection device in the brain.
link |
01:24:09.160
How do you think about the difference between kernel and Neuralink in the approaches of
link |
01:24:14.960
getting that stream of brain data?
link |
01:24:17.000
Elon and I spoke about this a lot early on.
link |
01:24:20.840
We met up.
link |
01:24:21.840
I had started kernel and he had an interest in brain interfaces as well.
link |
01:24:25.880
And we explored doing something together, him joining kernel.
link |
01:24:29.280
And ultimately it wasn't the right move.
link |
01:24:32.080
And so he started Neuralink and I continued building kernel.
link |
01:24:35.400
But it was interesting because we were both at this very early time where it wasn't certain
link |
01:24:44.880
what, if there was a path to pursue, if now was the right time to do something, and then
link |
01:24:51.560
the technological choice of doing that.
link |
01:24:53.120
And so we were both, our starting point was looking at invasive technologies.
link |
01:24:58.440
And I was building invasive technology at the time, that's ultimately where he's gone.
link |
01:25:05.120
Little less than a year after Elon and I were engaged, I shifted kernel to do noninvasive.
link |
01:25:12.720
And we had this neuroscientist come to Kernel that we were talking with.
link |
01:25:16.000
He had been doing neurosurgery for 30 years, one of the most respected neuroscientists
link |
01:25:19.120
in the U.S.
link |
01:25:20.120
And we brought him to kernel to figure out the ins and outs of his profession.
link |
01:25:23.760
And at the very end of our three hour conversation, he said, you know, every 15 or so years,
link |
01:25:30.280
a new technology comes along that changes everything.
link |
01:25:34.800
He said, it's probably already here, you just can't see it yet.
link |
01:25:39.760
And my jaw dropped.
link |
01:25:40.760
I thought, because I had spoken to Bob Greenberg, who had built Second Sight, first on the
link |
01:25:47.720
optic nerve, and then he did an array on the visual cortex.
link |
01:25:53.360
And then I also became friendly with NeuroPace, who does the implant for seizure detection
link |
01:25:59.720
and remediation.
link |
01:26:01.840
And I saw in their eyes what it was like to take an implantable device
link |
01:26:09.840
through a 15 year run.
link |
01:26:11.720
They initially thought it was seven years, ended up being 15 years, and they thought
link |
01:26:15.440
it'd be 100 million, and it ended up being, you know, 300, 400 million.
link |
01:26:19.240
And I really didn't want to build invasive technology.
link |
01:26:23.760
It was the only thing that appeared to be possible.
link |
01:26:25.760
But then once I spun up an internal effort to start looking at noninvasive options, we
link |
01:26:30.440
said, is there something here?
link |
01:26:31.960
Is there anything here that, again, has the characteristics of, it has the high quality
link |
01:26:36.640
data, it could be low cost, it could be accessible?
link |
01:26:39.320
Could it make brain interfaces mainstream?
link |
01:26:42.400
And so I did a bet the company move.
link |
01:26:43.960
We shifted from invasive to noninvasive.
link |
01:26:47.560
So the answer is yes to that.
link |
01:26:49.120
There is something there, that's possible.
link |
01:26:51.640
The answer is we'll see.
link |
01:26:52.880
We've now built both technologies.
link |
01:26:55.400
And they're now, you experienced one of them today.
link |
01:26:58.240
We were applying, we're now deploying it, so we're trying to figure out what values
link |
01:27:03.920
are really there.
link |
01:27:04.920
But I'd say it's really too early to express confidence.
link |
01:27:07.960
I think it's too early to assess which technological choice is the right one on what time scales.
link |
01:27:18.600
Yeah, time scales are really important here.
link |
01:27:20.760
Very important.
link |
01:27:21.760
Because if you look at the invasive side, there's so much activity going on right now
link |
01:27:26.960
of less invasive techniques to get at the neuron firings, which is what Neuralink is building,
link |
01:27:36.960
it's possible that in 10, 15 years when they're scaling that technology, other things have
link |
01:27:41.440
come along that you'd much rather do, and that thing starts the clock again.
link |
01:27:46.720
It may not be the case.
link |
01:27:47.720
It may be the case that Neuralink has properly chosen the right technology and that that's
link |
01:27:51.840
exactly what they want to be.
link |
01:27:53.440
Totally possible.
link |
01:27:54.440
And it's also possible that the path we've chosen at noninvasive falls short for a variety
link |
01:27:58.160
of reasons.
link |
01:27:59.160
It's just unknown.
link |
01:28:00.720
And so right now, the two technologies we chose, the analogy I'd give you to create
link |
01:28:05.720
a baseline of understanding is, if you think of it like the internet in the 90s, the internet
link |
01:28:11.880
became useful when people could do a dial-up connection, and then as
link |
01:28:18.240
bandwidth increased, so did the utility of that connection and so did the ecosystem improve.
link |
01:28:23.040
And so what Kernel Flow is going to give you is a full-screen picture of
link |
01:28:29.600
information, but it's as if you're watching a movie where the image is going to be blurred
link |
01:28:34.600
and the audio is going to be muffled.
link |
01:28:37.580
So it has a lower resolution of coverage.
link |
01:28:40.920
Kernel Flux, our MEG technology, is going to give you the full movie in 1080p.
link |
01:28:48.080
And Neuralink is going to give you a circle on the screen in 4K.
link |
01:28:55.400
And so each one has their pros and cons and it's give and take.
link |
01:29:00.040
And so the decision I made with Kernel was that these two technologies, flux and flow,
link |
01:29:06.400
were basically the answer for the next seven years.
link |
01:29:10.740
And they would give rise to the ecosystem, which would become much more valuable than
link |
01:29:14.040
the hardware itself and that we would just continue to improve on the hardware over time.
link |
01:29:18.720
And you know, it's early days.
link |
01:29:20.320
So it's kind of fascinating to think about that, you know, it's very true that you don't
link |
01:29:25.960
know. Both paths are very promising.
link |
01:29:33.240
And it's like 50 years from now, we will look back and maybe not even remember one of them.
link |
01:29:40.920
And the other one might change the world.
link |
01:29:43.160
It's so cool how technology is.
link |
01:29:44.880
I mean, that's what entrepreneurship is like.
link |
01:29:48.440
It's the zeroth principle, it's like you're marching ahead into the darkness, into the fog, not
link |
01:29:53.960
knowing.
link |
01:29:54.960
It's wonderful to have someone else out there with us doing this because if you if you look
link |
01:29:59.360
at brain interfaces, anything that's off the shelf right now is inadequate.
link |
01:30:07.320
It's had its run for a couple of decades.
link |
01:30:09.640
It's still in hacker communities.
link |
01:30:11.280
It hasn't gone to the mainstream.
link |
01:30:15.000
The room size machines are on their own path.
link |
01:30:19.160
But there is no answer right now for bringing brain interfaces mainstream.
link |
01:30:23.760
And so both they and us, we've both spent over $100 million.
link |
01:30:29.720
And that's kind of what it takes to have a go at this because you need to build full
link |
01:30:34.480
stack.
link |
01:30:35.480
I mean, at Kernel, we are from the photon and the atom through the machine learning.
link |
01:30:40.120
We have just under 100 people.
link |
01:30:41.520
I think it's something like 36, 37 PhDs in these specialties, these areas that there's
link |
01:30:47.720
only a few people in the world who have these abilities.
link |
01:30:50.560
And that's what it takes to build the next generation, to make an attempt at breaking into brain
link |
01:30:57.000
interfaces.
link |
01:30:58.000
And so we'll see over the next couple of years, whether it's the right time or whether we
link |
01:31:00.360
were both too early or whether something else comes along in seven to 10 years, which
link |
01:31:04.520
is the right thing that brings it mainstream.
link |
01:31:07.320
So you see Elon as the kind of competitor or a fellow traveler along the path of uncertainty
link |
01:31:16.200
or both.
link |
01:31:17.200
It's a fellow traveler.
link |
01:31:19.240
It's like at the beginning of the internet: how many companies are going to be invited
link |
01:31:25.400
to this new ecosystem, like an endless number.
link |
01:31:30.720
Because if you think that the hardware just starts the process, and so, okay, back to
link |
01:31:37.200
your initial example, if you take the Fitbit, for example, you say, okay, now I can get
link |
01:31:40.760
measurements on the body.
link |
01:31:43.240
And what do we think the ultimate value of this device is going to be?
link |
01:31:45.920
What is the information transfer rate?
link |
01:31:48.120
And they were in the market for a certain duration of time and Google bought them for
link |
01:31:51.080
$2.5 billion.
link |
01:31:53.960
They didn't have ancillary value add.
link |
01:31:55.640
There weren't people building on top of the Fitbit device.
link |
01:31:58.600
They also didn't have increased insight with additional data streams.
link |
01:32:02.480
So it was really just the device.
link |
01:32:04.320
If you look, for example, at Apple and the device they sell, you have value in the device
link |
01:32:08.320
that someone buys.
link |
01:32:09.320
But also, you have everyone who's building on top of it, so you have this additional
link |
01:32:12.600
ecosystem value.
link |
01:32:13.600
And then you have additional data streams that come in, which increase the value of the product.
link |
01:32:17.560
And so if you say, if you look at the hardware as the instigator of value creation, you know,
link |
01:32:24.600
over time, what we've built may constitute 5% or 10% of the value of the overall ecosystem.
link |
01:32:29.600
And that's what we really care about.
link |
01:32:31.000
What we're trying to do is kickstart the mainstream adoption of quantifying the brain.
link |
01:32:38.880
And the hardware just opens the door to say what kind of ecosystem could exist.
link |
01:32:45.160
And that's why the examples are so relevant of the things you've outlined in your life.
link |
01:32:50.720
I hope those things, the books people write, the experiences people build, the conversations
link |
01:32:55.560
you have, your relationship with your AI systems, I hope those all are feeding on the insights
link |
01:33:02.040
built upon this ecosystem we've created to better your life.
link |
01:33:05.240
And so that's the thinking behind it, again, with the Drake equation being the underlying
link |
01:33:10.840
driver of value.
link |
01:33:12.280
And the people at Kernel have joined not because we have certainty of success, but because
link |
01:33:20.320
we find it to be the most exhilarating opportunity we could ever pursue in this time to be alive.
link |
01:33:27.640
You founded the payment system Braintree in 2007, which acquired Venmo in 2012, and that
link |
01:33:36.240
same year was acquired by PayPal, which was part of eBay.
link |
01:33:42.520
Can you tell me the story of the vision and the challenge of building an online payment
link |
01:33:47.560
system and just building a large successful business in general?
link |
01:33:51.480
I discovered payments by accident.
link |
01:33:54.400
As I was, when I was 21, I just returned from Ecuador living among extreme poverty for two
link |
01:34:02.360
years.
link |
01:34:03.360
I was in the U.S. and I was shocked by the opulence of the United States, and I thought
link |
01:34:10.000
this is, I couldn't believe it, and I decided I wanted to try to spend my life helping others.
link |
01:34:16.960
That was the life objective that I thought was worthwhile to pursue versus making money
link |
01:34:21.160
and whatever the case may be for its own right.
link |
01:34:24.600
And so I decided in that moment that I was going to try to make enough money by the age
link |
01:34:28.960
of 30 to never have to work again.
link |
01:34:32.360
And then with some abundance of money, I could then choose to do things that might be beneficial
link |
01:34:38.400
to others but may not meet the criteria of being a standalone business.
link |
01:34:43.560
In that process, I started a few companies, had some small successes, had some failures.
link |
01:34:49.880
In one of the endeavors, I was up to my eyeballs in debt, things were not going well, and I
link |
01:34:54.360
needed a part time job to pay my bills.
link |
01:35:00.960
One day I saw in the paper in Utah where I was living, the 50 richest people in Utah,
link |
01:35:06.080
and I emailed each one of their assistants and said, you know, I'm young, I'm resourceful,
link |
01:35:10.360
I'll do anything, I just want to, I'm entrepreneurial. I tried to get a job that would be flexible, and
link |
01:35:16.200
no one responded.
link |
01:35:17.880
And then I interviewed a few dozen places, nobody would even give me the time of day.
link |
01:35:23.800
It wouldn't want to take me seriously.
link |
01:35:25.800
And so finally, it was on monster.com that I saw this job posting for credit card sales
link |
01:35:31.120
door to door, on commission.
link |
01:35:33.800
I did not know the story.
link |
01:35:36.120
This is great.
link |
01:35:37.120
I love the head drop.
link |
01:35:38.240
That's exactly right.
link |
01:35:39.240
So those were the low points to which we were going.
link |
01:35:43.480
So I responded and, you know, the person made an attempt at suggesting that they had some
link |
01:35:49.320
kind of standards that they would consider hiring, but it's kind of like if you could
link |
01:35:53.080
fog a mirror, like come and do this because it's a hundred percent commission.
link |
01:35:57.280
And so I started walking up and down the street in my community selling credit card processing.
link |
01:36:02.760
And so what you learn immediately in doing that is if you, you walk into a business,
link |
01:36:07.920
first of all, the business owner is typically there and you walk in the door and they can
link |
01:36:12.520
tell by how you're dressed or how you walk, whatever their pattern recognition is.
link |
01:36:16.640
And they just hate you immediately.
link |
01:36:17.640
It's like, stop wasting my time.
link |
01:36:18.920
I really am trying to get stuff done.
link |
01:36:20.280
I don't want us to do a sales pitch.
link |
01:36:21.480
And so you have to overcome the initial get out.
link |
01:36:25.280
And then once you engage, when you say the word credit card processing, the person's
link |
01:36:30.200
like, I already hate you because I have been taken advantage of dozens of times because
link |
01:36:34.360
you're all weasels.
link |
01:36:37.360
And so I had to figure out an algorithm to get past all those different conditions because
link |
01:36:41.080
I was still working on my other startup for the majority of my time.
link |
01:36:44.480
I was doing this part-time.
link |
01:36:46.040
And so I figured out that the industry really was built on deceit, basically on people
link |
01:36:56.560
promising things that were not reality.
link |
01:36:58.480
And so I'd walk into a business and I'd say, look, I'll give you $100. I'd put a $100
link |
01:37:02.560
bill down and say, I'll give you $100 for three minutes of your time.
link |
01:37:05.480
If you don't say yes to what I'm saying, I'll give you $100.
link |
01:37:08.440
And then you usually crack a smile and say, okay, what do you got for me, son?
link |
01:37:12.680
And so I'd sit down, I just opened my book and I'd say, here's the credit card industry.
link |
01:37:16.360
Here's how it works.
link |
01:37:17.360
Here are the players.
link |
01:37:18.360
Here's what they do.
link |
01:37:19.360
Here's how they deceive you.
link |
01:37:20.600
Here's what I am.
link |
01:37:21.600
I'm no different than anyone else.
link |
01:37:22.600
It's like, you're going to process your credit card, you're going to get the money in the
link |
01:37:24.960
account.
link |
01:37:25.960
You're just going to get a clean statement.
link |
01:37:26.960
You're going to have someone who answers the call when someone asks and, you know, just
link |
01:37:29.720
like the basic, like you're okay.
link |
01:37:31.840
And people started saying yes.
link |
01:37:32.840
And then of course I went to the next business and be like, you know, Joe and Susie and whoever
link |
01:37:36.840
said yes too.
link |
01:37:37.840
And so I built a social proof structure.
link |
01:37:39.680
And I became the number one salesperson out of 400 people nationwide doing this.
link |
01:37:45.680
And I worked, you know, half time still doing this other startup.
link |
01:37:49.000
And that's a brilliant strategy, by the way.
link |
01:37:51.160
It's very well, very well strategized and executed.
link |
01:37:54.920
I did it for nine months.
link |
01:37:58.000
And at the time my customer base was generating around, I think it was,
link |
01:38:03.720
if I remember correctly, $62,504 a month in overall revenues.
link |
01:38:08.480
I thought, wow, that's amazing.
link |
01:38:10.080
If I built that as my own company, I would just make $62,000 a month of income passively
link |
01:38:16.840
with these merchants processing credit cards.
link |
01:38:18.720
So I thought, hmm.
link |
01:38:20.640
And so that's when I thought I'm going to create a company.
link |
01:38:23.760
And so then I started Braintree.
link |
01:38:26.200
And the idea was the online world was broken because PayPal had been acquired by eBay around
link |
01:38:35.720
I think 1999 or 2000 and eBay had not innovated much with PayPal.
link |
01:38:39.920
So it basically sat still for seven years as the software world moved along.
link |
01:38:45.280
And then authorize.net was also a company that was relatively stagnant.
link |
01:38:47.680
So you basically had software engineers who wanted modern payment tools, but there were
link |
01:38:52.160
none available for them.
link |
01:38:53.400
And so they just dealt with software they didn't like.
link |
01:38:55.120
And so with Braintree, I thought the entry point is to build software that engineers
link |
01:38:59.360
will love.
link |
01:39:00.520
And if we can find the entry point via software, make it easy and beautiful and just a magical
link |
01:39:04.840
experience and then provide customer service on top of that would be easy.
link |
01:39:07.360
That would be great.
link |
01:39:08.360
What I was really going after, though, was PayPal.
link |
01:39:11.720
They were the only company in payments making money, because they had a relationship
link |
01:39:18.000
with eBay early on, people created a PayPal account.
link |
01:39:22.280
They'd fund their account with their checking account versus their credit cards.
link |
01:39:25.560
And then when they'd use PayPal to pay a merchant, PayPal had a cost of payment of zero versus
link |
01:39:31.320
if the payment comes from a credit card, where you have to pay the banks the fees.
link |
01:39:35.200
So PayPal's margins were 3% on a transaction versus a typical payments company, which may
link |
01:39:41.760
be a nickel or a penny or a dime or something like that.
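As a rough illustration of the margin difference being described, here is a back-of-the-envelope sketch. Only the 3% fee and the "nickel to a dime" figures come from the conversation; the $100 transaction and everything else is an illustrative assumption, not PayPal's actual economics.

```python
# Back-of-the-envelope sketch of the margin difference described above.
# The 3% fee and the "nickel to a dime" figure come from the conversation;
# the rest is an illustrative assumption, not real PayPal economics.

transaction = 100.00      # a $100 purchase
merchant_fee = 0.03       # ~3% charged to the merchant

# Wallet funded from a checking account: near-zero cost of payment,
# so the processor keeps roughly the whole fee.
wallet_margin = transaction * merchant_fee           # ~$3.00

# Card-funded payment: most of the fee passes through to the banks,
# leaving the processor only a few cents per transaction.
card_margin = 0.10                                    # a nickel to a dime

print(f"wallet-funded margin: ${wallet_margin:.2f}")
print(f"card-funded margin:   ${card_margin:.2f}")
```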
link |
01:39:45.000
And so I knew PayPal really was the model to replicate, but a bunch of companies had tried
link |
01:39:49.880
to do that.
link |
01:39:50.880
They tried to come in and build a two sided marketplace to get consumers to fund the
link |
01:39:54.600
checking account and the merchants to accept it, but they'd all failed because building
link |
01:39:58.120
both sides of a two sided marketplace at the same time is very hard.
link |
01:40:01.880
So my plan was I'm going to build a company and get the best merchants in the whole world
link |
01:40:06.840
to use our service.
link |
01:40:08.720
Then in year five, I'm going to have, I'm going to acquire a consumer payments company
link |
01:40:12.760
and I'm going to bring the two together.
link |
01:40:15.200
And so focus on the merchant side and then get the payments company that does the consumer,
link |
01:40:21.920
the whatever.
link |
01:40:22.920
So the other side of it.
link |
01:40:24.680
This is the plan I presented when I was at the University of Chicago and weirdly it happened
link |
01:40:31.400
exactly like that.
link |
01:40:32.400
So four years in, our customer base included Uber, Airbnb, GitHub, 37signals, now Basecamp.
link |
01:40:40.280
We had a fantastic collection of companies that represented some
link |
01:40:47.280
of the fastest growing tech companies in the world.
link |
01:40:49.440
And then we met up with Venmo and they had done a remarkable job in building product.
link |
01:40:55.080
They had done something very counterintuitive, which is make public your private financial
link |
01:40:58.840
transactions, which people previously thought were something that should be hidden from
link |
01:41:03.000
others and we acquired Venmo.
link |
01:41:06.520
And at that point we now had, we replicated the model because now people could fund their
link |
01:41:11.880
Venmo account with their checking account, keep money in the account, and then you could
link |
01:41:15.640
just plug in Venmo as a form of payment.
link |
01:41:17.600
And so I think PayPal saw that, that we were getting the best merchants in the world.
link |
01:41:22.840
We had people using Venmo, who were the up and coming millennials at the time
link |
01:41:28.080
who had so much influence online.
link |
01:41:30.720
And so they came in and offered us an attractive number.
link |
01:41:35.200
And my goal was not to build the biggest payments company in the world.
link |
01:41:40.960
It wasn't to try to climb the Forbes billionaire list.
link |
01:41:44.200
It was, the objective was I want to earn enough money so that I can basically dedicate my
link |
01:41:51.920
attention to doing something that could potentially be useful on a society wide scale.
link |
01:41:59.120
And more importantly, that could be considered to be valuable from the vantage point of 2050,
link |
01:42:06.360
2100, and 2500.
link |
01:42:09.080
So thinking about it on a few hundred year timescale.
link |
01:42:14.480
And there was a certain amount of money I needed to do that, so I didn't require the
link |
01:42:18.520
permission of anybody to do that.
link |
01:42:21.120
And so what PayPal offered was sufficient for me to get that amount of money and basically
link |
01:42:24.960
have a go.
link |
01:42:26.120
And that's when I set off to survey everything I could identify in existence to ask, of anything
link |
01:42:34.040
in the entire world I could do, what one thing could I do that would actually have
link |
01:42:38.480
the highest value potential for the species.
link |
01:42:42.920
And so it took me a little while to arrive at brain interfaces, but you know, payments in
link |
01:42:48.280
themselves are revolutionary technologies that can change the world.
link |
01:42:54.720
Like let's not, let's not sort of, let's not forget that too easily.
link |
01:43:01.320
I mean, obviously you know this, but there's quite a few lovely folks
link |
01:43:08.320
who are now fascinated with the space of cryptocurrency.
link |
01:43:14.040
And where payments are very much connected to this, but in general just money.
link |
01:43:19.480
And many of the folks I've spoken with, they also kind of connect that to not just purely
link |
01:43:25.120
financial discussions, but philosophical and political discussions.
link |
01:43:29.520
And they see Bitcoin as a way, almost as activism, almost as a way to resist the corruption
link |
01:43:37.920
of centralized centers of power and, sort of basically, in the 21st century, decentralizing
link |
01:43:43.600
control, whether that's Bitcoin or other cryptocurrencies, they see that as one possible
link |
01:43:49.680
way to give power to those that live in regimes that are corrupt or not respectful of human
link |
01:43:57.920
rights and all those kinds of things.
link |
01:44:00.080
What's your sense, given all your expertise with payments and seeing how that changed
link |
01:44:05.120
the world, what's your sense about the lay of the land for the future of Bitcoin or other
link |
01:44:11.400
cryptocurrencies and the positive impact they may have on the world?
link |
01:44:16.040
To be clear, my communication wasn't meant to minimize payments or to denigrate
link |
01:44:22.320
it in any way.
link |
01:44:23.600
It was an attempt to communicate that when I was surveying the world, it was an algorithm
link |
01:44:31.040
of what I could individually do.
link |
01:44:35.880
So there are things that exist that have a lot of potential that can be done.
link |
01:44:41.200
And then there's a filtering of how many people are qualified to do this given thing.
link |
01:44:46.520
And then there's a further characterization that can be done of, okay, given the number
link |
01:44:49.760
of qualified people, will somebody be a unique outperformer of that group to make something
link |
01:44:56.960
truly impossible happen, something that otherwise couldn't get done?
link |
01:44:59.560
So there's, there's a process of assessing where can you add unique value in the world?
link |
01:45:04.520
And some of that has to do with, you're being very, very formal and calculative here, but
link |
01:45:09.920
some of that is just like, what do you sense, like part of that equation is how much passion
link |
01:45:16.080
you sense within yourself to be able to drive that through, to discover the impossibilities
link |
01:45:20.440
and make them possible.
link |
01:45:21.600
That's right.
link |
01:45:22.600
And so we, at Braintree, I think we were the first company to integrate Coinbase
link |
01:45:26.800
into our, I think we were the first payments company to formally incorporate crypto if
link |
01:45:34.080
I'm not mistaken.
link |
01:45:35.080
For people who are not familiar, Coinbase is a place where you can trade cryptocurrencies.
link |
01:45:39.320
Yeah.
link |
01:45:40.320
Which was one of the only places you could.
link |
01:45:42.120
So we were early in doing that and of course this was in the year 2013.
link |
01:45:49.200
So an eternity ago in cryptocurrency land.
link |
01:45:52.360
I concur with the, the statement you made of the potential of the principles underlying
link |
01:46:02.000
cryptocurrencies and that many of the things that they're building in the name of money
link |
01:46:08.880
and of moving value are equally applicable to the brain, and equally applicable to how
link |
01:46:18.040
the brain interacts with the rest of the world and how we would imagine doing goal alignment
link |
01:46:24.120
with people.
link |
01:46:25.620
So it's, to me, it's a continuous spectrum of possibility.
link |
01:46:29.240
And we're talking, your question is isolated on the money.
link |
01:46:32.240
And I think it just is basically a scaffolding layer for all of society.
link |
01:46:35.720
So you don't see this as particularly distinct from the money?
link |
01:46:39.520
I don't.
link |
01:46:40.520
I think we at Kernel will benefit greatly from the progress being made in cryptocurrency
link |
01:46:47.000
because it will be a similar technology stack we will want to use for many things we want
link |
01:46:51.040
to accomplish.
link |
01:46:52.040
And so I'm bullish on what's going on and I think it could greatly enhance brain interfaces
link |
01:46:58.720
and the value of the brain interface ecosystem.
link |
01:47:01.480
Is there something you could say about, first of all, bullish on cryptocurrency versus fiat
link |
01:47:05.240
money?
link |
01:47:06.240
So do you have a sense that in the 21st century cryptocurrency will be embraced by
link |
01:47:11.640
governments and change the face of government, the structure of government?
link |
01:47:17.920
It's the, it's the same way I think about my diet, where previously it was conscious
link |
01:47:26.320
Brian looking at foods in certain biochemical states: am I hungry, am I irritated, am I
link |
01:47:33.520
depressed.
link |
01:47:34.520
And then I choose based upon those momentary windows.
link |
01:47:37.480
Do I eat at night when I'm fatigued and I have low willpower, am I going to pig out
link |
01:47:41.280
on something?
link |
01:47:43.080
And the current monetary system is based upon human conscious decision making and politics
link |
01:47:49.160
and power and this whole mess of things.
link |
01:47:51.760
And what I like about the building blocks of cryptocurrency is it's methodical, it's
link |
01:47:57.840
structured, it is accountable, it's transparent.
link |
01:48:01.600
And so it introduces this scaffolding, which I think again is the right starting point for
link |
01:48:07.720
how we think about building next generation institutions for society.
link |
01:48:13.560
And that's why I think it's much broader, much broader than money.
link |
01:48:16.520
So I guess what you're saying is Bitcoin is the demotion of the conscious mind as well.
link |
01:48:23.680
In the same way you were talking about diet is like giving less priority to the, the ups
link |
01:48:28.840
and downs of any one particular human mind, in this case your own, and giving more power
link |
01:48:34.480
to the sort of data driven.
link |
01:48:36.600
Yes, yeah, I think that is accurate that cryptocurrency is a version of what I would
link |
01:48:48.200
call my autonomous self that I'm trying to build.
link |
01:48:51.440
It is an introduction of an autonomous system of value exchange and the process
link |
01:48:58.880
of value creation in society, yes, there's similarities.
link |
01:49:04.960
So I guess what you're saying is Bitcoin will somehow help me not pig out at night or the
link |
01:49:08.880
equivalent of that. Speaking of diet,
link |
01:49:11.920
if we could just linger on that topic a little bit, we already talked about
link |
01:49:17.680
your blog post of I fired myself, I fired evening Brian, the Brian who is too willing
link |
01:49:26.000
to not make good decisions for the long term well being and happiness of the entirety
link |
01:49:33.000
of the organism.
link |
01:49:34.000
Now, basically you were like pigging out at night, but it's interesting because I
link |
01:49:39.080
do this, I do the same.
link |
01:49:40.720
In fact, I often eat one meal a day and like I have been this, this week actually, especially
link |
01:49:50.960
when I travel and it's, it's funny that it never occurred to me to just basically look
link |
01:49:59.400
at the fact that I'm able to be much smarter about my eating decisions in the morning and
link |
01:50:04.320
the afternoon than I am at night.
link |
01:50:06.800
So if I eat one meal a day, why not eat that one meal a day in the morning?
link |
01:50:11.600
Like, it never occurred to me, and it's revolutionary, until you've
link |
01:50:20.520
outlined that.
link |
01:50:21.520
So maybe can you give some details? And this is just you, this is one person, Brian,
link |
01:50:27.400
arriving at a particular thing that they do, but it's fascinating to kind of look at this
link |
01:50:32.720
one particular case study.
link |
01:50:34.160
So what works for you diet wise?
link |
01:50:36.720
What's your actual diet?
link |
01:50:37.840
What do you eat?
link |
01:50:38.840
How often do you eat?
link |
01:50:40.280
My current protocol is basically the result of thousands of experiments and decision making.
link |
01:50:48.840
So I do this every 90 days, I do the tests, I do the cycle throughs, then I measure
link |
01:50:53.920
again, and then I'm measuring all the time, and so of course I'm optimizing
link |
01:50:59.360
for my biomarkers.
link |
01:51:00.360
I want perfect cholesterol and perfect blood glucose levels and perfect
link |
01:51:04.160
DNA methylation, you know, processes.
link |
01:51:10.400
I also want perfect sleep.
link |
01:51:12.560
And so for example, recently in the past two weeks, my resting heart rate has been at 42
link |
01:51:18.880
when I sleep and when my resting heart rate is at 42, my HRV is at its highest and I wake
link |
01:51:24.480
up in the morning feeling more energized than any other configuration.
link |
01:51:30.520
And so I know from all these processes that eating at roughly 8:30 in the morning, right
link |
01:51:34.800
after I work out on an empty stomach creates enough distance between that completed eating
link |
01:51:41.240
and bedtime where I have almost no digestion processes going on in my body.
link |
01:51:47.560
So my resting heart rate goes very low and when my resting heart rate is very low, I
link |
01:51:51.360
sleep with high quality.
link |
01:51:52.360
And so basically I've been trying to optimize the entirety of what I eat to my sleep quality.
link |
01:51:58.360
My sleep quality then of course feeds into my willpower so it creates this virtuous cycle.
link |
01:52:02.640
And so at 8:30, what I do is I eat what I call super veggie, which is a pudding
link |
01:52:08.200
of 250 grams of broccoli, 150 grams of cauliflower and a whole bunch of other vegetables. Then
link |
01:52:13.400
I eat what I call nutty pudding, which is, you make the pudding itself.
link |
01:52:17.080
Like what you call it, like a veggie mix, whatever thing.
link |
01:52:22.720
Like a blender.
link |
01:52:23.720
Yeah.
link |
01:52:24.720
It can be made in a high speed blender.
link |
01:52:25.720
But basically I eat the same thing every day, a veggie bowl in the form of pudding
link |
01:52:30.920
and then a bowl in the form of nuts.
link |
01:52:34.600
And then I have.
link |
01:52:35.600
Vegan.
link |
01:52:36.600
Vegan, yes.
link |
01:52:37.600
Vegan.
link |
01:52:38.600
So that's the fat, and that's the carbs, and that's the protein and so on.
link |
01:52:43.760
Does it taste good?
link |
01:52:45.240
I love it.
link |
01:52:46.240
Yeah.
link |
01:52:47.240
I love it so much.
link |
01:52:48.240
I dream about it.
link |
01:52:49.240
Yeah.
link |
01:52:50.240
That's awesome.
link |
01:52:51.240
This is a.
link |
01:52:52.240
And then I have a third dish, which changes every day.
link |
01:52:55.480
Today it was kale and spinach and sweet potato.
link |
01:52:58.360
And then I take about 20 supplements that hopefully constitute a perfect nutritional
link |
01:53:08.720
profile.
link |
01:53:09.720
So what I'm trying to do is create the perfect diet for my body every single day.
link |
01:53:13.840
And sleep is part of the optimization?
link |
01:53:16.400
That's right.
link |
01:53:17.400
Like, that's one of the things you're really tracking, I mean.
link |
01:53:19.320
Can you?
link |
01:53:20.320
Well, I have a million questions, but 20 supplements, like what kind, which would you say are
link |
01:53:24.320
essential?
link |
01:53:25.320
Because I only take Athletic Greens, athleticgreens.com slash what.
link |
01:53:30.720
That's like the multivitamin essentially.
link |
01:53:33.280
That's like the lazy man.
link |
01:53:34.360
You know, like if you don't actually want to think about shit, that's what you take.
link |
01:53:38.760
And then fish oil and that's it.
link |
01:53:40.600
That's all I take.
link |
01:53:41.600
Yeah, you know, Alfred North Whitehead said, civilization advances as it extends the
link |
01:53:48.440
number of important operations it can do without thinking about them.
link |
01:53:53.080
And so my objective on this is I want an algorithm for perfect health that I never have to think
link |
01:53:59.920
about.
link |
01:54:01.040
And then I want that system to be scalable to anybody so that they don't have to think
link |
01:54:04.600
about it.
link |
01:54:05.720
And right now it's expensive for me to do it.
link |
01:54:07.720
It's time consuming for me to do it.
link |
01:54:09.120
And I have infrastructure to do it, but the future of being human is not going to the
link |
01:54:14.920
grocery store and deciding what to eat.
link |
01:54:17.640
It's also not reading scientific papers, trying to decide this thing or that thing.
link |
01:54:21.520
It's all N of one.
link |
01:54:23.200
So it's devices on the outside and inside your body assessing real time what your body
link |
01:54:27.720
needs and then creating closed loop systems for that to happen.
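A minimal sketch of the closed-loop idea being described, purely for illustration: measure a signal, compare it to a target, and adjust an input automatically. The biomarker, target, and adjustment rule below are hypothetical placeholders, not anything Kernel or Brian actually uses.

```python
# Minimal closed-loop sketch: measure -> compare to a target -> adjust an input.
# The glucose target and carb adjustment rule are hypothetical placeholders.

def closed_loop_step(measured_glucose_mg_dl: float, carbs_g: float) -> float:
    """Return tomorrow's carbohydrate budget given today's glucose reading."""
    target = 90.0  # hypothetical fasting glucose target (mg/dL)
    if measured_glucose_mg_dl > target + 10:
        return max(carbs_g - 10, 0)   # trim carbs when readings run high
    if measured_glucose_mg_dl < target - 10:
        return carbs_g + 10           # allow more when readings run low
    return carbs_g                    # within range: leave the plan alone

# Example: a 105 mg/dL reading against a 120 g plan trims the budget to 110 g.
print(closed_loop_step(105.0, 120.0))
```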
link |
01:54:30.600
Yeah.
link |
01:54:31.600
So right now you're doing the data collection and you're being the scientist, it'd be much
link |
01:54:36.320
better if the data collection was essentially being done for you and
link |
01:54:40.960
you can outsource that to another scientist that's doing the N of one study of you.
link |
01:54:46.160
That's right, because every time I spend time thinking about this or executing, spending
link |
01:54:50.000
time on it, I'm spending less time thinking about building Kernel or the future of being
link |
01:54:54.680
human.
link |
01:54:55.680
And so we just all have the budget of our capacity on an everyday basis and we will scaffold
link |
01:55:04.320
our way up out of this.
link |
01:55:05.680
And so yeah, hopefully what I'm doing is really, it serves as a model that others can also
link |
01:55:10.320
build.
link |
01:55:11.320
That's why I wrote about it, hopefully people can then take it and improve upon it.
link |
01:55:15.480
I hold nothing sacred.
link |
01:55:16.480
I change my diet almost every day based upon some new test results or science or something
link |
01:55:21.920
like that.
link |
01:55:22.920
Can you maybe elaborate on the sleep thing?
link |
01:55:24.960
Why is sleep so important?
link |
01:55:27.480
And why, presumably, what does good sleep mean to you?
link |
01:55:34.640
I think sleep is a contender for being the most powerful health intervention in existence.
link |
01:55:48.600
It's a contender.
link |
01:55:49.600
I mean, it's magical what it does if you're well rested and what your body can do.
link |
01:55:57.080
And I mean, for example, I know when I eat close to my bedtime and I've done a systematic
link |
01:56:02.680
study for years looking at 15 minute increments on the time of day when I eat my last meal,
link |
01:56:09.920
my willpower is directly correlated to the amount of deep sleep I get.
link |
01:56:15.480
So my ability to not binge eat at night when Rascal Bryan's out and about is based upon
link |
01:56:22.880
how much deep sleep I got the night before.
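To make the kind of N-of-1 analysis being described concrete, here is a small, purely illustrative sketch: log the gap between the last meal and bedtime alongside that night's deep sleep, then check how strongly the two move together. The data points are invented; only the idea of the 15-minute-increment study comes from the conversation.

```python
# Illustrative N-of-1 sketch: does eating earlier line up with more deep sleep?
# The data points below are invented for the example.
from statistics import correlation  # Python 3.10+

meal_to_bed_hours = [2.0, 3.5, 5.0, 6.5, 8.0, 9.5, 11.0]   # last meal -> bedtime gap
deep_sleep_minutes = [45, 55, 70, 80, 95, 100, 110]          # measured deep sleep

r = correlation(meal_to_bed_hours, deep_sleep_minutes)
print(f"Pearson r between eating earlier and deep sleep: {r:.2f}")
```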
link |
01:56:25.040
Yeah.
link |
01:56:26.040
There's a lot to that, yeah.
link |
01:56:27.440
And so I've seen it manifest itself and so I think the way I summarize this is in society
link |
01:56:34.680
we've had this myth of we tell stories, for example, of entrepreneurship where this person
link |
01:56:40.480
was so amazing, they stayed at the office for three days and slept under their desk.
link |
01:56:45.000
And we say, wow, that's amazing, that's amazing.
link |
01:56:50.000
And now I think we're headed towards a state where we'd say that's primitive and really
link |
01:56:55.720
not a good idea on every level.
link |
01:56:58.960
And so the new mythology is going to be the exact opposite.
link |
01:57:04.080
Yeah.
link |
01:57:05.080
By the way, just to sort of maybe push back a little bit on that idea.
link |
01:57:11.120
Did you sleep under your desk, Lex?
link |
01:57:13.800
Well, yeah, a lot.
link |
01:57:14.800
I'm a big believer in that actually.
link |
01:57:16.000
I'm a big believer in chaos and giving in to your passion, and sometimes
link |
01:57:23.080
doing things that are out of the ordinary that are not trying to optimize health for
link |
01:57:29.360
certain periods of time in service of your passions, is a signal to yourself that you're throwing
link |
01:57:38.720
everything away.
link |
01:57:39.720
So I think what you're referring to is how to have good performance for prolonged periods
link |
01:57:46.080
of time.
link |
01:57:47.520
I think there's moments in life when you need to throw all of that away.
link |
01:57:52.840
All the plans away, all the structure away.
link |
01:57:55.040
So I'm not sure I have an eloquent way of describing exactly what I'm talking about,
link |
01:58:02.480
but it all depends on different people, people are different.
link |
01:58:07.880
But there's a danger of over optimization to where you don't just give in to the madness
link |
01:58:14.360
of the way your brain flows.
link |
01:58:17.800
I mean, to push back on my pushback, it is nice to have a setup where the foundations of your brain
link |
01:58:30.640
are not messed with.
link |
01:58:31.640
So you have a fixed foundation where the diet is fixed, where the sleep is fixed, and all
link |
01:58:36.880
that is optimal.
link |
01:58:37.880
And the chaos happens in the space of ideas as opposed to the space of biology.
link |
01:58:43.320
But I'm not sure if that requires real discipline in forming habits.
link |
01:58:51.040
There's some aspect to which some of the best days and weeks of my life have been sleeping
link |
01:58:56.800
under a desk kind of thing.
link |
01:58:58.880
And I'm not too willing to let go of things that empirically worked for things that work
link |
01:59:09.520
in theory.
link |
01:59:12.000
So again, I'm absolutely with you on sleep.
link |
01:59:16.400
Also I'm with you on sleep conceptually, but I'm also very humbled to understand that
link |
01:59:24.000
for different people, good sleep means different things.
link |
01:59:28.720
I'm very hesitant to trust science on sleep.
link |
01:59:32.960
I think you should also be a scholar of your body, again, the experiment of N of 1.
link |
01:59:38.160
I'm not so sure that a full night's sleep is great for me.
link |
01:59:44.240
There is something about that power nap that I just have not fully studied yet.
link |
01:59:50.240
But that nap is something special.
link |
01:59:52.720
I'm not sure I found the optimal thing.
link |
01:59:56.040
So there's a lot to be explored in what exactly is the optimal amount of sleep, the optimal
link |
02:00:00.520
kind of sleep, combined with diet and all those kinds of things.
link |
02:00:03.440
That all maps to the data, at least, exactly everything you're referring to.
link |
02:00:09.000
Here's a data point for your consideration.
link |
02:00:12.800
The progress in biology over the past, say decade, has been stunning.
link |
02:00:19.800
And it now appears as if we will be able to replace our organs through xenotransplantation.
link |
02:00:28.480
And so we probably have a path to replace and regenerate every organ of your body, except
link |
02:00:38.160
for your brain.
link |
02:00:43.040
You can lose your hand and your arm and a leg, you can have an artificial heart.
link |
02:00:47.560
You can't operate without your brain.
link |
02:00:49.720
And so when you make that trade off decision of whether you're going to sleep under the
link |
02:00:54.080
desk or not and go all out for a four day marathon, there's a cost benefit trade off
link |
02:01:02.040
of what's going on, what's happening to your brain in that situation.
link |
02:01:05.840
We don't know the consequences of modern day life on our brain.
link |
02:01:10.000
We don't, it's the most valuable organ in our existence.
link |
02:01:15.040
And we don't know what's going on in how we're treating it today with stress and with
link |
02:01:21.080
sleep and with diet.
link |
02:01:23.480
And to me, then, if you say that you're trying to optimize life
link |
02:01:29.920
for whatever things you're trying to do.
link |
02:01:33.080
With the progress in anti aging and biology, the game is very soon going
link |
02:01:37.480
to become different than what it is right now with organ rejuvenation, organ replacement.
link |
02:01:42.960
And I would conjecture that we will value the health status of our brain above
link |
02:01:51.680
all things.
link |
02:01:52.680
Absolutely.
link |
02:01:53.680
Everything you're saying is true, but we die, we die pretty quickly.
link |
02:01:59.960
Life is short.
link |
02:02:01.600
And I'm one of those people that would rather die in battle than stay safe at home.
link |
02:02:11.320
It's like, yeah, you look at kind of, there's a lot of things that you can reasonably say,
link |
02:02:17.240
this is the smart thing to do, but that becomes conservative,
link |
02:02:21.680
that can prevent you from fully embracing life.
link |
02:02:24.400
I think ultimately you can be very intelligent and data driven and also embrace life.
link |
02:02:30.960
But I err on the side of embracing life.
link |
02:02:33.520
It takes a very skillful person to not be that sort of hovering parent that
link |
02:02:39.760
says, no, you know what?
link |
02:02:41.160
There's a 3% chance that if you go out, if you go out by yourself and play, you're going
link |
02:02:46.720
to die, get run over by a car, come to a slow or a sudden end.
link |
02:02:52.840
And I am more a supporter of just go out there.
link |
02:02:57.520
If you die, you die.
link |
02:02:59.600
And that's a, it's a balance you have to strike.
link |
02:03:02.800
I think there's a balance to strike between long term optimization and short term freedom.
link |
02:03:11.880
For me, for a programmer, for a programming mind, I tend to over optimize and I'm very
link |
02:03:17.600
cautious and afraid of that, of over optimizing and thereby being overly cautious, suboptimally
link |
02:03:25.000
cautious about everything I do.
link |
02:03:27.680
And then the ultimate thing I'm trying to optimize for, it's funny you mentioned sleep
link |
02:03:31.640
and all those kinds of things.
link |
02:03:33.360
I tend to think this is a, you're being more precise than I am, but I think I tend to want
link |
02:03:43.120
to minimize stress, and everything comes into that, from your sleep and all those kinds
link |
02:03:50.400
of things.
link |
02:03:51.400
But I worry that whenever I'm trying to be too strict with myself, then the stress goes
link |
02:03:56.800
up when I don't follow the strictness.
link |
02:04:00.560
And so you have to kind of, it's weird, there are so many variables in the objective
link |
02:04:05.240
function that it's hard to get right.
link |
02:04:07.600
And sort of not giving a damn about sleep and not giving a damn about diet is a good
link |
02:04:11.640
thing to inject in there every once in a while for somebody who's trying to optimize everything.
link |
02:04:17.160
But that's just me, and it's exactly like you said, I'm
link |
02:04:21.480
a scientist of myself, you're a scientist of yourself.
link |
02:04:24.200
It'd be nice if somebody else was doing it and had much better data, because I don't
link |
02:04:28.360
trust my conscious mind, and I pigged out last night on some brisket in LA that I regret
link |
02:04:33.480
deeply.
link |
02:04:34.480
There's no point to anything I just said.
link |
02:04:39.520
What, what is the nature of your regret on the brisket?
link |
02:04:46.800
Is it, do you wish you hadn't eaten it entirely?
link |
02:04:49.880
Is it that you wish you hadn't eaten as much as you did?
link |
02:04:51.920
Is it that?
link |
02:04:55.200
I think, well, most regret, I mean, if we want to be specific, I drank way too much
link |
02:05:03.360
like diet soda.
link |
02:05:05.400
My biggest regret is having drunk so much diet soda, that's the thing that really
link |
02:05:09.360
was the problem.
link |
02:05:10.360
I had trouble sleeping because of that because I was like programming and then I was editing
link |
02:05:14.600
and so I stayed up late at night and then I had to get up to go pee a few times and it
link |
02:05:18.640
was just a mess.
link |
02:05:19.640
A mess of a night.
link |
02:05:20.640
Well, it's not really a mess, but like it's so many, it's like the little things.
link |
02:05:25.960
I know if I just eat, drink a little bit of water and that's it. And there's a certain,
link |
02:05:31.880
all of us have perfect days that we know, diet wise and so on, that are good to follow, where you
link |
02:05:38.200
feel good.
link |
02:05:39.200
I know what it takes for me to do that.
link |
02:05:41.600
I didn't fully do that, and there's an avalanche effect where the other
link |
02:05:48.640
sources of stress, all the other to do items I have, pile on my failure to execute on some
link |
02:05:54.760
basic things that I know make me feel good and all of that combines to create a mess
link |
02:06:01.400
of a day.
link |
02:06:02.560
But some of that chaos, you have to be okay with it, but some of it I wish was a little
link |
02:06:06.840
bit more optimal and your ideas about eating in the morning are quite interesting as an
link |
02:06:12.960
experiment to try.
link |
02:06:14.440
Can you elaborate, are you eating once a day?
link |
02:06:18.160
Yes.
link |
02:06:19.160
In the morning and that's it.
link |
02:06:22.360
Can you maybe speak to how that, it's funny, you spoke about the metrics of sleep,
link |
02:06:30.880
but you also, you know, run a business, you're incredibly intelligent, and
link |
02:06:37.800
mostly your happiness and success rely on you thinking clearly.
link |
02:06:43.320
So how does that affect your mind and your body in terms of performance?
link |
02:06:46.800
Yes.
link |
02:06:47.800
Not the body really, but actually, like, mental performance.
link |
02:06:51.080
As you were explaining your objective function of, for example, in the criteria you are including,
link |
02:06:56.880
you like certain neurochemical states, like you like feeling like you're living life,
link |
02:07:02.760
that life has enjoyment, that sometimes you want to disregard certain rules to have a
link |
02:07:08.200
moment of passion, of focus.
link |
02:07:10.440
There's this architecture of the way Lex is, which makes you happy as a story you tell,
link |
02:07:16.400
as something you kind of experience, maybe the experience is a bit more complicated,
link |
02:07:19.400
but it's in this idea you have, this is a version of you.
link |
02:07:22.840
And the reason why I maintain the schedule I do is I've chosen a game to say, I would
link |
02:07:30.120
like to live a life where I care more about what people who live in
link |
02:07:39.280
the year 2500 think of me than what people think of me today.
link |
02:07:45.120
That's the game I'm trying to play.
link |
02:07:46.840
And so therefore, the only thing I really care about on this optimization is trying
link |
02:07:54.000
to see past myself, past my limitations, using zeroth principles thinking, to pull myself out
link |
02:08:02.160
of this contextual mesh we're in right now and say, what will matter 100 years from now
link |
02:08:07.200
and 200 years from now?
link |
02:08:08.640
What are the big things really going on that are defining reality?
link |
02:08:13.600
And I find that if I were to hang out with Diet Soda Lex and Diet Soda Brian were to
link |
02:08:24.040
play along with that, and my deep sleep were to get crushed as a result, my mind would
link |
02:08:29.280
not be on what matters in 100 years or 200 years or 300 years, I would be irritable,
link |
02:08:34.320
I would be, you know, I'd be in a different state.
link |
02:08:37.640
And so it's just gameplay selection.
link |
02:08:41.400
It's what you and I have chosen to think about.
link |
02:08:43.800
It's what we've chosen to work on.
link |
02:08:47.880
And this is why I'm saying that no generation of humans has ever been afforded the opportunity
link |
02:08:54.520
to look at their lifespan and contemplate that they will have the possibility of experiencing
link |
02:09:03.160
an evolved form of consciousness that is unimaginable, that would fall into the zeroth category
link |
02:09:09.200
of potential.
link |
02:09:11.200
That to me is the most exciting thing in existence.
link |
02:09:14.320
And I would not trade any momentary neurochemical state right now in exchange for that.
link |
02:09:21.520
I would, I'd be willing to deprive myself of all momentary joy in pursuit of that goal
link |
02:09:27.080
because that's what makes me happy.
link |
02:09:29.120
That's brilliant.
link |
02:09:31.000
But I'm a bit, I just looked it up, the Braveheart speech from
link |
02:09:38.760
William Wallace, I don't know if you've seen it.
link |
02:09:42.080
Fight and you may die, run and you'll live at least a while and dying in your beds many
link |
02:09:47.080
years from now.
link |
02:09:48.240
Would you be willing to trade all the days from this day to that for one chance?
link |
02:09:53.960
Just one chance, picture Mel Gibson saying this, to come back here and tell our enemies
link |
02:09:59.000
that they may take our lives with growing excitement, but they'll never take our freedom.
link |
02:10:06.440
I get excited every time I see that in the movie, but that's kind of how I approach life.
link |
02:10:11.040
Do you think they were tracking their sleep?
link |
02:10:12.680
They were not tracking their sleep and they ate way too much brisket and they were fat,
link |
02:10:17.280
unhealthy, died early and were primitive, but there's something in my ape brain that's
link |
02:10:25.640
attracted to that even though most of my life is fully aligned with the way you see yours.
link |
02:10:32.800
Part of it is for comedy, of course, but part of it is like I'm almost afraid of over optimization.
link |
02:10:39.040
Really what you're saying though, if we're looking at this, let's say from a first principles
link |
02:10:42.640
perspective, when you read those words, they conjure up certain life experiences, but you're
link |
02:10:46.800
basically saying, I experienced a certain neurotransmitter state when these things are
link |
02:10:51.600
in action.
link |
02:10:52.600
Yeah.
link |
02:10:53.600
That's all you're saying.
link |
02:10:54.600
So whether it's that or something else, you're just saying you have a selection for a
link |
02:10:58.000
state for your body, and so if you're an engineer of consciousness, that should just be engineerable.
link |
02:11:05.600
That's just triggering certain chemical reactions.
link |
02:11:09.120
So it doesn't mean they have to be mutually exclusive.
link |
02:11:11.400
You can have that and experience that and also not sacrifice long term health.
link |
02:11:15.840
I think that's the potential of where we're going is we don't have to assume they are
link |
02:11:23.280
trade offs that must be had.
link |
02:11:25.880
Absolutely.
link |
02:11:26.880
I guess from my particular brain, it's useful to have the outlier experiences that also
link |
02:11:33.160
come along with the illusion of free will where I chose those experiences that make me feel
link |
02:11:38.720
like it's freedom.
link |
02:11:39.720
Listen, going to Texas made me realize, so I was, I still am, but I lived in Cambridge
link |
02:11:45.480
at MIT and I never felt like home there.
link |
02:11:49.160
I felt like home in the space of ideas with the colleagues, like when I was actually discussing
link |
02:11:53.800
ideas, but there is something about the constraints, how cautious people are, how much they value
link |
02:12:01.080
material success, career success.
link |
02:12:07.000
When I showed up to Texas, it felt like I belonged.
link |
02:12:12.200
That was very interesting, but that's my neurochemistry, whatever the hell that is, whatever, maybe
link |
02:12:17.480
probably rooted in the fact that I grew up in the Soviet Union, it was such a constrained
link |
02:12:21.600
system that you really deeply value freedom and you always want to escape the man and
link |
02:12:27.520
the control of centralized systems.
link |
02:12:29.080
I don't know what it is, but at the same time, I love strictness.
link |
02:12:33.840
I love the dogmatic authoritarianism of diet, of the same habit, exactly the habit you have.
link |
02:12:41.280
I think that's actually when bodies perform optimally, my body performs optimally.
link |
02:12:45.680
So balancing those two, I think if I have the data, every once in a while, party with
link |
02:12:50.520
some wild people, but most of the time, eat once a day, perhaps in the morning, I'm going
link |
02:12:55.680
to try that.
link |
02:12:56.680
That might be very interesting, but I'd rather not try it.
link |
02:13:00.000
I'd rather have the data that tells me to do it. But in general, you're able to, eating
link |
02:13:05.440
once a day, think deeply about stuff like this.
link |
02:13:10.400
A concern that people have is, does your energy wane, all those kinds of things?
link |
02:13:15.280
Do you find that it works, especially because it's unique, it's vegan as well?
link |
02:13:21.160
So you find that you're able to have a clear mind, a focus, and feel good physically and mentally
link |
02:13:27.040
throughout.
link |
02:13:28.040
Yeah.
link |
02:13:29.040
And I find my personal experience in thinking about hard things is like oftentimes, I feel
link |
02:13:36.840
like I'm looking through a telescope and like I'm aligning two or three telescopes and you
link |
02:13:42.400
kind of have to close one eye and move back and forth a little bit and just find just
link |
02:13:47.280
the right alignment thing.
link |
02:13:48.280
You find just a sneak peek at the thing you're trying to find, but it's fleeting.
link |
02:13:51.720
If you move just one little bit, it's gone.
link |
02:13:54.920
And oftentimes what I feel like are the ideas I value the most are like that.
link |
02:14:00.760
They're so fragile and fleeting and slippery and elusive.
link |
02:14:07.080
And it requires a sensitivity to thinking and a sensitivity to maneuver through these
link |
02:14:16.120
things.
link |
02:14:17.120
If I concede to a world where I'm on my phone texting, I'm also on social media, I'm also
link |
02:14:25.840
doing 15 things at the same time because I'm running the company and I'm also feeling terrible
link |
02:14:30.840
from the last night, it all just comes crashing down and the quality of my thoughts goes to
link |
02:14:37.960
a zero.
link |
02:14:38.960
I'm a functional person, I can respond to basic level things, but I don't feel like I am
link |
02:14:44.640
doing anything interesting.
link |
02:14:47.880
I think that's a good word, sensitivity, because that's what thinking deeply feels like is
link |
02:14:53.920
you're sensitive to the fragile thoughts and you're right.
link |
02:14:56.640
All those other distractions kind of dull your ability to be sensitive to the fragile
link |
02:15:02.280
thoughts.
link |
02:15:03.280
It's a really good word.
link |
02:15:05.640
Out of all the things you've done, you've also climbed Mount Kilimanjaro.
link |
02:15:13.200
Is this true?
link |
02:15:14.200
It's true.
link |
02:15:19.840
Why, and how, and what do you take from that experience?
link |
02:15:25.360
I guess the backstory is relevant because in that moment, it was the darkest time in
link |
02:15:31.800
my life.
link |
02:15:32.800
I was ending a 13 year marriage, I was leaving my religion, I sold brain tree and I was battling
link |
02:15:37.920
depression where I was just like at the end and I got invited to go to Tanzania as part
link |
02:15:46.120
of a group that was raising money to build clean water wells.
link |
02:15:50.160
I had made some money from Braintree and so I was able to donate $25,000.
link |
02:15:54.560
It was the first time I had ever had money to donate outside of paying tithing in my
link |
02:16:00.480
religion and it was such a phenomenal experience to contribute something meaningful to someone
link |
02:16:09.000
else in that form and as part of this process, we were going to climb the mountain and so
link |
02:16:15.160
we went there and we saw the clean water wells we were building.
link |
02:16:17.360
We spoke to the people there and it was very energizing and then we climbed Kilimanjaro
link |
02:16:21.880
and I came down with a stomach flu on day three and I also had altitude sickness but
link |
02:16:30.840
I became so sick that on day four, or maybe it was day five, I came into the camp,
link |
02:16:37.080
base camp at 15,000 feet, just going to the bathroom on myself and falling all over.
link |
02:16:44.480
I was just a disaster, I was so sick.
link |
02:16:47.120
So stomach flu and altitude sickness.
link |
02:16:50.120
Yeah, and I just was destroyed from the situation and plus psychologically one of the lowest
link |
02:17:00.360
points.
link |
02:17:01.360
Yeah, and I think that was probably a big contributor.
link |
02:17:03.360
I was just smoked as a human, just absolutely done and I had three young children and so
link |
02:17:07.960
I was trying to reconcile that whether I live or not is not my decision alone.
link |
02:17:14.920
I'm now intertwined with these three little people and I have an obligation whether I
link |
02:17:21.200
like it or not, I need to be there. And so it did
link |
02:17:25.080
feel like I was just stuck in a straitjacket, and I had to decide whether I was going
link |
02:17:31.280
to summit the next day with the team and it was a difficult decision because once you
link |
02:17:37.440
start hiking, there's no way to get off the mountain. And midnight came and our guide
link |
02:17:43.480
came in and he said, where are you at?
link |
02:17:45.000
And I said, I think I'm okay, I think I can try. And so we went, and from midnight
link |
02:17:54.320
I made it to the summit at 5am.
link |
02:17:56.920
It was one of the most transformational moments of my existence and the mountain became my
link |
02:18:05.400
problem.
link |
02:18:06.600
It became everything that I was struggling with and when I started hiking, the pain
link |
02:18:12.880
got so ferocious that it was kind of like this.
link |
02:18:19.160
It became so ferocious that I turned my music to Eminem and he was the only person in existence
link |
02:18:29.160
that spoke to my soul, and it was something about his anger and his vibrancy, in his own
link |
02:18:37.640
way.
link |
02:18:38.640
He's the only person who I could turn on and I could say, I feel some relief.
link |
02:18:42.720
I turned on Eminem and I made it to the summit after 5 hours but just a hundred yards from
link |
02:18:49.840
the top.
link |
02:18:50.840
I was with my guide Ike and I started getting very dizzy and I felt like I was going to
link |
02:18:56.320
fall backwards off this cliff area we were on and I was like, this is dangerous.
link |
02:19:00.800
And he said, look Brian, I know where you're at.
link |
02:19:04.600
I know where you're at and I can tell you you've got it in you so I want you to look
link |
02:19:09.760
up, take a step, take a breath and then look up, take a breath and take a step and I did
link |
02:19:17.520
and I made it.
link |
02:19:19.360
And so I got there and I just sat down with him at the top and I just cried like a baby.
link |
02:19:24.360
Broke down.
link |
02:19:25.360
I just lost it and so he let me do my thing and then we pulled out the pulse oximeter
link |
02:19:31.840
and he measured my blood oxygen levels and it was like 50 something percent and it was
link |
02:19:36.280
a danger zone so he looked at it and I think he was really alarmed that I was in this situation.
link |
02:19:41.520
And so he said, we can't get a helicopter here and we can't get you emergency evacuated
link |
02:19:46.080
you've got to go down, you've got to hike down to 15,000 feet to get to base camp.
link |
02:19:49.840
And so we went down the mountain, I got back down to base camp, and again that was
link |
02:19:56.800
pretty difficult and then they put me on a stretcher, this metal stretcher with this
link |
02:20:00.240
one wheel and a team of six people wheeled me down the mountain.
link |
02:20:05.000
And it was pretty torturous.
link |
02:20:06.440
I'm very appreciative they did it, but also the trail is very bumpy, so they'd go over the big rocks
link |
02:20:10.160
and so my head would just slam against this metal thing for hours and so I just felt awful
link |
02:20:15.880
plus again my head slammed every couple of seconds.
link |
02:20:18.920
So the whole experience was really a life changing moment and that's it.
link |
02:20:23.000
That was the demarcation of me basically rebuilding my life, of basically saying I'm going to
link |
02:20:28.680
reconstruct Brian, my understanding of reality, my existential realities, what I want to go
link |
02:20:35.800
after. And I tried, I mean, as much as that's possible as a human, but that's when I set
link |
02:20:40.960
out to rebuild everything.
link |
02:20:44.520
Was it the struggle of that?
link |
02:20:46.000
I mean, there's also just the romantic, poetic aspect of it: it's a fricking mountain, it's a man in
link |
02:20:54.800
pain, psychological and physical, struggling up a mountain. But was it just the struggle,
link |
02:21:01.800
just pushing through in the face of hardship, or nature, or something much
link |
02:21:10.200
bigger than you?
link |
02:21:12.520
Was that the thing that just clicked?
link |
02:21:14.960
For me it felt like I was just locked in with reality and it was a death match.
link |
02:21:21.360
It was in that moment one of us is going to die.
link |
02:21:24.080
So you were pondering death like not surviving.
link |
02:21:28.960
And that was the moment. The summit to me was, I'm going to come out on top and
link |
02:21:35.080
I can do this, and giving in was, it's like, I'm just done. And so I locked in and
link |
02:21:42.600
that's why mountains are magical to me.
link |
02:21:49.120
I didn't expect that.
link |
02:21:50.240
I didn't design that.
link |
02:21:51.240
I didn't know that was going to be the case.
link |
02:21:54.040
It would not have been something I would have anticipated.
link |
02:21:58.320
But you are not the same man afterwards.
link |
02:22:02.880
Is there advice you can give to young people today who look at your story, which is successful
link |
02:22:08.120
in many dimensions?
link |
02:22:10.920
Advice you can give to them about how to be successful in their career successful in
link |
02:22:14.640
life and whatever path they choose.
link |
02:22:18.200
Yes, I would say listen to advice and see it for what it is, a mirror of that person
link |
02:22:29.000
and then map it, and know that your future is going to be built on zeroth principles, and so what
link |
02:22:35.760
you're hearing today is a representation of what may have been the right principles to
link |
02:22:39.840
build upon previously, but they're likely depreciating very fast.
link |
02:22:45.800
And so I am a strong proponent that people ask for advice, but they don't take advice.
link |
02:22:55.960
So, how do you take advice properly?
link |
02:23:01.080
It's in the careful examination of the advice.
link |
02:23:03.840
It's actually, the person makes a statement about a given thing somebody should follow.
link |
02:23:09.200
The value is not doing that.
link |
02:23:10.680
The value is understanding the assumption stack they built, the assumption and knowledge
link |
02:23:14.480
stack they built around that body of knowledge.
link |
02:23:18.280
That's the value.
link |
02:23:19.280
It's not doing what they say.
link |
02:23:21.280
Considering the advice, but digging deeper to understand the assumption stack, like
link |
02:23:28.120
the full person.
link |
02:23:29.120
I mean, this is deep empathy, essentially, to understand the journey of the person that
link |
02:23:34.800
arrived at the advice and the advice is just the tip of the iceberg that ultimately is
link |
02:23:39.640
not the thing that gives you.
link |
02:23:41.600
That's right.
link |
02:23:42.600
The right thing to do, it could be the complete wrong thing to do, depending on the assumption
link |
02:23:46.040
stack.
link |
02:23:47.040
So you need to investigate the whole thing.
link |
02:23:49.560
Is there some, have there been people in your startup, in your business journey, that have
link |
02:23:57.480
served that role of advice giver that's been helpful?
link |
02:24:02.720
Or do you feel like your journey felt like a lonely path, or was it one that was, of course,
link |
02:24:11.640
we're all, in a way, born and die alone, but do you fundamentally remember the experiences
link |
02:24:21.240
when you leaned on people at a particular moment or a time that changed everything?
link |
02:24:27.240
The most significant moments of my memory, for example, like on Kilimanjaro, when Ike,
link |
02:24:34.360
some person I'd never met in Tanzania, was able to, in that moment, apparently see my
link |
02:24:42.080
soul when I was in this death match with reality.
link |
02:24:45.320
And he gave me the instructions, look up, step.
link |
02:24:48.880
And so there's magical people in my life that have done things like that.
link |
02:24:58.080
And I suspect they probably don't know.
link |
02:25:02.280
I probably should be better at identifying those things. But yeah, hopefully,
link |
02:25:16.000
I suppose like a wisdom I would aspire to is to have the awareness and the empathy to
link |
02:25:23.400
be that for other people, and not a retail advertiser of advice, of tricks for life,
link |
02:25:36.000
but deeply meaningful and empathetic in a one on one context with people where it really
link |
02:25:43.280
could make a difference.
link |
02:25:44.800
Yeah, I actually kind of experienced, I think about that sometimes, you know, you have like
link |
02:25:49.440
an 18 year old kid come up to you, it's not always obvious, it's not always easy to really
link |
02:25:58.640
listen to them, not just the facts, but to see who that person is.
link |
02:26:04.560
I think people say that about being a parent, you know, you
link |
02:26:12.680
may want to be the authority figure, but in a sense you really want to consider that there's
link |
02:26:16.400
a special unique human being there with a unique brain that may be brilliant in ways
link |
02:26:23.840
that you are not understanding that you'll never be and really try to hear that.
link |
02:26:29.440
So when giving advice or something like that, both sides should be deeply empathetic
link |
02:26:33.880
about the assumption stack.
link |
02:26:35.880
I love that terminology.
link |
02:26:38.920
What do you think is the meaning of this whole thing of life?
link |
02:26:43.480
Why the hell are we here?
link |
02:26:44.960
Alright, Johnson, we've been talking about brains and studying brains and you had this
link |
02:26:49.520
very eloquent way of describing life on earth as an optimization problem of the cost of
link |
02:26:56.840
intelligence going to zero at first through the evolutionary process and then eventually
link |
02:27:02.640
through our technology, building more and more intelligent systems.
link |
02:27:09.640
Do you ever ask yourself why it's doing that?
link |
02:27:13.120
Yeah, I think the answer to this question, again, the information value is more in the
link |
02:27:20.920
mirror it provides of that person, which is a representation of the technological, social,
link |
02:27:28.640
political context of the time.
link |
02:27:29.640
So if you ask this question 100 years ago, you would get a certain answer that reflects
link |
02:27:34.080
that time period.
link |
02:27:35.080
Same thing would be true for 1000 years ago.
link |
02:27:37.120
It's rare.
link |
02:27:39.120
It's difficult for a person to pull themselves out of their contextual awareness.
link |
02:27:42.240
Very true.
link |
02:27:43.240
And offer a truly original response.
link |
02:27:45.360
And so knowing that I am contextually influenced by the situation, that I am a mirror for our
link |
02:27:51.280
reality, I would say that in this moment, I think the real game going on is that evolution
link |
02:28:06.920
built a system of scaffolding intelligence that produced us.
link |
02:28:11.800
We are now building intelligent systems that are scaffolding higher dimensional intelligence
link |
02:28:18.280
that's developing more robust systems of intelligence.
link |
02:28:28.160
In doing that, in that process, with the cost going to zero, the meaning of life becomes
link |
02:28:36.440
goal alignment, which is the negotiation of our conscious and unconscious existence.
link |
02:28:43.800
And then I'd say the third thing, if we're thinking that we want to be explorers, is that
link |
02:28:50.480
our technological progress is getting to a point where we could aspirationally say we
link |
02:28:57.560
want to figure out what is really going on.
link |
02:29:01.840
Because does any of this really make sense?
link |
02:29:06.680
Now we may be 100, 200, 500, a thousand years away from being able to poke our way out of
link |
02:29:13.320
whatever is going on, but it's interesting that we could even state an aspiration to
link |
02:29:20.200
say we want to poke at this question.
link |
02:29:23.080
But I'd say in this moment of time, the meaning of life is that we can build a future state
link |
02:29:31.920
of existence that is more fantastic than anything we could ever imagine.
link |
02:29:38.080
The striving for something more amazing.
link |
02:29:43.720
And that defies expectations that we would consider bewildering and all the things.
link |
02:29:56.200
And I guess the last thing, if there's multiple meanings of life, it would be infinite games.
link |
02:30:00.440
James Carse wrote the book Finite and Infinite Games.
link |
02:30:04.200
The only game to play right now is to keep playing the game.
link |
02:30:09.720
And so this goes back to the Lex algorithm of diet soda and brisket
link |
02:30:14.520
and pursuing the passion.
link |
02:30:16.280
What I'm suggesting is there's a moment here where we can contemplate playing infinite
link |
02:30:21.000
games.
link |
02:30:22.880
Therefore it may make sense to err on the side of making sure one is in a situation to
link |
02:30:27.760
be playing infinite games if that opportunity arises.
link |
02:30:31.240
So the landscape of possibility is changing very, very fast.
link |
02:30:35.400
And therefore our old algorithms of how we might assess risk and what things
link |
02:30:38.680
we might pursue and why, those assumptions may fall away very quickly.
link |
02:30:43.080
Well, I think I speak for a lot of people when I say that the game you, Mr. Brian Johnson,
link |
02:30:50.480
have been playing is quite incredible.
link |
02:30:53.280
Thank you so much for talking to me.
link |
02:30:54.280
Thanks, Lex.
link |
02:30:56.280
Thanks for listening to this conversation with Brian Johnson.
link |
02:30:58.920
And thank you to FourSigmatic, Netsuite, Grammarly, and ExpressVPN.
link |
02:31:05.120
Check them out in the description to support this podcast.
link |
02:31:08.400
And now let me leave you with some words from Diane Ackerman.
link |
02:31:12.120
Our brain is a crowded chemistry lab bustling with nonstop neural conversations.
link |
02:31:18.840
Thank you for listening and hope to see you next time.