
Peter Wang: Python and the Source Code of Humans, Computers, and Reality | Lex Fridman Podcast #250



link |
00:00:00.000
The following is a conversation with Peter Wang,
link |
00:00:02.360
one of the most impactful leaders and developers
link |
00:00:04.640
in the Python community.
link |
00:00:06.280
Former physicist, current philosopher,
link |
00:00:09.000
and someone who many people told me about
link |
00:00:11.600
and praised as a truly special mind
link |
00:00:14.160
that I absolutely should talk to.
link |
00:00:16.440
Recommendations ranging from Travis Oliphant
link |
00:00:19.400
to Eric Weinstein.
link |
00:00:20.880
So, here we are.
link |
00:00:23.280
This is the Lex Fridman podcast.
link |
00:00:25.560
To support it, please check out our sponsors
link |
00:00:27.640
in the description.
link |
00:00:28.760
And now, here's my conversation with Peter Wang.
link |
00:00:33.440
You're one of the most impactful humans
link |
00:00:35.680
in the Python ecosystem.
link |
00:00:38.280
So, you're an engineer, leader of engineers,
link |
00:00:40.880
but you're also a philosopher.
link |
00:00:42.920
So, let's talk both in this conversation
link |
00:00:45.160
about programming and philosophy.
link |
00:00:47.320
First, programming.
link |
00:00:49.040
What to you is the best
link |
00:00:51.160
or maybe the most beautiful feature of Python?
link |
00:00:54.080
Or maybe the thing that made you fall in love
link |
00:00:56.200
or stay in love with Python?
link |
00:00:59.000
Well, those are three different things.
link |
00:01:00.880
What I think is the most beautiful,
link |
00:01:01.960
what made me fall in love, what made me stay in love.
link |
00:01:03.960
When I first started using it
link |
00:01:05.760
was when I was a C++ computer graphics performance nerd.
link |
00:01:10.040
In the 90s?
link |
00:01:10.880
Yeah, in the late 90s.
link |
00:01:12.080
And that was my first job out of college.
link |
00:01:15.160
And we kept trying to do more and more abstract
link |
00:01:18.680
and higher order programming in C++,
link |
00:01:20.520
which at the time was quite difficult.
link |
00:01:23.040
With templates, the compiler support wasn't great, et cetera.
link |
00:01:26.560
So, when I started playing around with Python,
link |
00:01:28.720
that was my first time encountering
link |
00:01:30.480
really first class support for types, for functions,
link |
00:01:33.600
and things like that.
link |
00:01:34.440
And it felt so incredibly expressive.
link |
00:01:37.200
So, that was what kind of made me fall in love
link |
00:01:39.120
with it a little bit.
link |
00:01:39.960
And also, once you spend a lot of time
link |
00:01:42.280
in a C++ dev environment,
link |
00:01:44.160
the ability to just whip something together
link |
00:01:46.200
that basically runs and works the first time is amazing.
link |
00:01:49.680
So, really productive scripting language.
link |
00:01:51.960
I mean, I knew Perl, I knew Bash, I was decent at both.
link |
00:01:55.280
But Python just made everything,
link |
00:01:57.160
it made the whole world accessible.
link |
00:01:59.680
I could script this and that and the other,
link |
00:02:01.240
network things, little hard drive utilities.
link |
00:02:04.000
I could write all of these things
link |
00:02:05.040
in the space of an afternoon.
link |
00:02:06.520
And that was really, really cool.
link |
00:02:07.640
So, that's what made me fall in love.
link |
00:02:08.600
Is there something specific you could put your finger on
link |
00:02:11.560
as to why you're not programming in Perl today?
link |
00:02:14.080
Like, why Python for scripting?
link |
00:02:17.120
I think there's not a specific thing
link |
00:02:19.640
as much as the design motif of both the creator
link |
00:02:22.800
of the language and the core group of people
link |
00:02:25.280
that built the standard library around him.
link |
00:02:28.920
There was definitely, there was a taste to it.
link |
00:02:32.160
I mean, Steve Jobs used that term
link |
00:02:34.320
in somewhat of an arrogant way,
link |
00:02:35.680
but I think it's a real thing,
link |
00:02:37.120
that it was designed to fit.
link |
00:02:39.200
A friend of mine actually expressed this really well.
link |
00:02:40.880
He said, Python just fits in my head.
link |
00:02:42.960
And there's nothing better to say than that.
link |
00:02:45.200
Now, people might argue modern Python,
link |
00:02:47.880
there's a lot more complexity,
link |
00:02:49.240
but certainly as of version 1.5.2,
link |
00:02:51.760
I think was my first version,
link |
00:02:53.360
that fit in my head very easily.
link |
00:02:54.800
So, that's what made me fall in love with it.
link |
00:02:56.560
Okay, so the most beautiful feature of Python
link |
00:03:01.400
that made you stay in love.
link |
00:03:03.040
It's like over the years, what has like,
link |
00:03:06.560
you do a double take on and return to often
link |
00:03:09.520
as a thing that just brings you a smile.
link |
00:03:11.560
I really still like the ability to play with metaclasses
link |
00:03:17.000
and express higher-order things.
link |
00:03:19.000
When I have to create some new object model
link |
00:03:22.040
to model something, right?
link |
00:03:23.360
It's easy for me,
link |
00:03:24.400
cause I'm pretty expert as a Python programmer.
link |
00:03:27.080
I can easily put all sorts of lovely things together
link |
00:03:29.800
and use properties and decorators and other kinds of things
link |
00:03:32.920
and create something that feels very nice.
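That kind of object-model machinery can be sketched briefly. The names here (`ModelMeta`, `Sensor`) are purely illustrative, not anything from the conversation; it's a minimal example of a metaclass plus a validating property of the sort being described:

```python
# A minimal, hypothetical sketch: a metaclass that auto-registers
# subclasses, plus a property that validates assignment.

class ModelMeta(type):
    """Metaclass: runs when each class is *defined*, not instantiated."""
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # don't register the abstract base itself
            ModelMeta.registry[name] = cls
        return cls

class Model(metaclass=ModelMeta):
    pass

class Sensor(Model):
    def __init__(self, reading=0.0):
        self._reading = reading

    @property
    def reading(self):
        return self._reading

    @reading.setter
    def reading(self, value):
        if value < 0:
            raise ValueError("reading must be non-negative")
        self._reading = value

s = Sensor()
s.reading = 3.5  # goes through the validating setter
```

Merely defining `Sensor` is enough to land it in `ModelMeta.registry`, which is the kind of higher-order expressiveness the conversation is pointing at.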
link |
00:03:34.680
So, that to me, I would say that's tied
link |
00:03:37.320
with the NumPy and vectorization capabilities.
link |
00:03:40.640
I love thinking in terms of the matrices and the vectors
link |
00:03:43.800
and these kind of data structures.
link |
00:03:46.080
So, I would say those two are kind of tied for me.
link |
00:03:49.400
So, the elegance of the NumPy data structure,
link |
00:03:52.720
like slicing through the different multidimensional.
link |
00:03:54.760
Yeah, there's just enough things there.
link |
00:03:56.200
It's like a very, it's a very simple, comfortable tool.
link |
00:04:00.040
Just, it's easy to reason about what it does
link |
00:04:02.800
when you don't stray too far afield.
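The slicing and vectorization being referred to can be illustrated with a small example (the array contents here are arbitrary):

```python
import numpy as np

# A small illustration of multidimensional slicing and vectorization.
a = np.arange(24).reshape(2, 3, 4)  # a 3-D array, shape (2, 3, 4)

first_plane = a[0]       # shape (3, 4): first slab along axis 0
column = a[:, :, 1]      # shape (2, 3): index 1 along the last axis
strided = a[0, ::2, :]   # shape (2, 4): every other row of the first slab

# Vectorized arithmetic, no explicit Python loops:
doubled = a * 2
row_sums = a.sum(axis=-1)  # shape (2, 3): sum along the last axis
```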
link |
00:04:05.000
Can you put your finger on how to design a language
link |
00:04:09.960
such that it fits in your head?
link |
00:04:11.880
Certain things like the colon
link |
00:04:14.040
or the certain notation aspects of Python
link |
00:04:17.160
that just kind of work.
link |
00:04:18.640
Is it something you have to kind of write out on paper,
link |
00:04:22.200
look and say, it's just right?
link |
00:04:24.640
Is it a taste thing or is there a systematic process?
link |
00:04:27.600
What's your sense?
link |
00:04:28.800
I think it's more of a taste thing.
link |
00:04:31.520
But one thing that should be said
link |
00:04:33.560
is that you have to pick your audience, right?
link |
00:04:36.360
So, the better defined the user audience is
link |
00:04:39.160
or the users are, the easier it is to build something
link |
00:04:42.280
that fits in their minds because their needs
link |
00:04:45.200
will be more compact and coherent.
link |
00:04:47.240
It is possible to find a projection, right?
link |
00:04:49.160
A compact projection for their needs.
link |
00:04:50.920
The more diverse the user base, the harder that is.
link |
00:04:54.480
And so, as Python has grown in popularity,
link |
00:04:57.120
that's also naturally created more complexity
link |
00:05:00.080
as people try to design any given thing.
link |
00:05:01.800
There'll be multiple valid opinions
link |
00:05:04.120
about a particular design approach.
link |
00:05:06.240
And so, I do think that's the downside of popularity.
link |
00:05:10.240
It's almost an intrinsic aspect
link |
00:05:11.440
of the complexity of the problem.
link |
00:05:13.040
Well, at the very beginning,
link |
00:05:14.440
aren't you an audience of one, isn't ultimately,
link |
00:05:17.440
aren't all the greatest projects in history
link |
00:05:19.480
were just solving a problem that you yourself had?
link |
00:05:21.800
Well, so Clay Shirky in his book on crowdsourcing
link |
00:05:25.400
or his kind of thoughts on crowdsourcing,
link |
00:05:27.520
he identifies the first step of crowdsourcing
link |
00:05:29.520
is me first collaboration.
link |
00:05:31.440
You first have to make something
link |
00:05:32.480
that works well for yourself.
link |
00:05:34.280
It's very telling that when you look at all of the impactful
link |
00:05:37.720
big projects, well, the fundamental projects now
link |
00:05:40.000
in the SciPy and PyData ecosystem.
link |
00:05:42.560
They all started with the people in the domain
link |
00:05:46.720
trying to scratch their own itch.
link |
00:05:48.200
And the whole idea of scratching your own itch
link |
00:05:49.720
is something that the open source
link |
00:05:51.280
or the free software world has known for a long time.
link |
00:05:53.640
But in the scientific computing areas,
link |
00:05:56.800
these are assistant professors
link |
00:05:58.240
or electrical engineering grad students.
link |
00:06:00.520
They didn't have really a lot of programming skill
link |
00:06:03.000
necessarily, but Python was just good enough
link |
00:06:05.520
for them to put something together
link |
00:06:06.720
that fit in their domain, right?
link |
00:06:09.400
So it's almost like a,
link |
00:06:11.120
it's a necessity is the mother of invention aspect.
link |
00:06:13.960
And also it was a really harsh filter
link |
00:06:16.880
for utility and compactness and expressiveness.
link |
00:06:20.880
Like if it was too hard to use,
link |
00:06:22.400
then they wouldn't have built it
link |
00:06:23.360
because that was just too much trouble, right?
link |
00:06:24.960
It was a side project for them.
link |
00:06:26.160
And also necessity creates a kind of deadline.
link |
00:06:28.120
It seems like a lot of these projects
link |
00:06:29.440
are quickly thrown together in the first step.
link |
00:06:32.320
And that, even though it's flawed,
link |
00:06:35.560
that just seems to work well for software projects.
link |
00:06:38.840
Well, it does work well for software projects in general.
link |
00:06:41.280
And in this particular space,
link |
00:06:43.520
one of my colleagues, Stan Seibert, identified this,
link |
00:06:46.320
that all the projects in the SciPy ecosystem,
link |
00:06:50.360
if we just rattle them off,
link |
00:06:51.200
there's NumPy, there's SciPy
link |
00:06:53.040
built by different collaborations of people.
link |
00:06:55.000
Although Travis is the heart of both of them.
link |
00:06:57.680
But NumPy coming from Numeric and Numarray,
link |
00:06:59.360
these are different people.
link |
00:07:00.680
And then you've got Pandas,
link |
00:07:01.840
you've got Jupyter or IPython,
link |
00:07:04.480
there's Matplotlib,
link |
00:07:06.880
there's just so many others, I'm not gonna do justice
link |
00:07:09.680
if I try to name them all.
link |
00:07:10.720
But all of them are actually different people.
link |
00:07:12.800
And as they rolled out their projects,
link |
00:07:15.280
the fact that they had limited resources
link |
00:07:17.560
meant that they were humble about scope.
link |
00:07:21.600
A great famous hacker, Jamie Zawinski,
link |
00:07:23.560
once said that every geek's dream
link |
00:07:26.040
is to build the ultimate middleware, right?
link |
00:07:29.280
And the thing is with these scientists turned programmers,
link |
00:07:32.280
they had no such dream.
link |
00:07:33.120
They were just trying to write something
link |
00:07:34.800
that was a little bit better for what they needed,
link |
00:07:36.560
than MATLAB,
link |
00:07:37.640
and they were gonna leverage what everyone else had built.
link |
00:07:39.920
So naturally, almost in kind of this annealing process
link |
00:07:42.640
or whatever, we built a very modular cover
link |
00:07:46.480
of the basic needs of a scientific computing library.
link |
00:07:50.280
If you look at the whole human story,
link |
00:07:51.920
how much of a leap is it?
link |
00:07:53.680
We've developed all kinds of languages,
link |
00:07:55.480
all kinds of methodologies for communication.
link |
00:07:57.720
It just kind of like grew this collective intelligence,
link |
00:08:00.560
civilization grew, it expanded, wrote a bunch of books,
link |
00:08:04.400
and now we tweet. How big of a leap is programming
link |
00:08:08.680
if programming is yet another language?
link |
00:08:10.880
Is it just a nice little trick
link |
00:08:12.880
that's temporary in our human history,
link |
00:08:15.000
or is it like a big leap in the,
link |
00:08:19.680
almost us becoming another organism
link |
00:08:23.120
at a higher level of abstraction, something else?
link |
00:08:26.160
I think the act of programming
link |
00:08:28.240
or using grammatical constructions
link |
00:08:32.360
of some underlying primitives,
link |
00:08:34.920
that is something that humans do learn,
link |
00:08:37.520
but every human learns this.
link |
00:08:38.840
Anyone who can speak learns how to do this.
link |
00:08:41.160
What makes programming different
link |
00:08:42.440
has been that up to this point,
link |
00:08:44.840
when we try to give instructions to computing systems,
link |
00:08:49.000
all of our computers, well, actually this is not quite true,
link |
00:08:51.560
but I'll first say it,
link |
00:08:53.080
and then I'll tell you why it's not true.
link |
00:08:55.000
But for the most part,
link |
00:08:56.000
we can think of computers as being these iterated systems.
link |
00:08:58.880
So when we program,
link |
00:08:59.920
we're giving very precise instructions to iterated systems
link |
00:09:04.200
that then run at incomprehensible speed
link |
00:09:07.360
and run those instructions.
link |
00:09:08.800
In my experience,
link |
00:09:10.160
some people are just better equipped
link |
00:09:12.720
to model systematic iterated systems,
link |
00:09:16.880
well, whatever, iterated systems in their head.
link |
00:09:20.120
Some people are really good at that,
link |
00:09:21.760
and other people are not.
link |
00:09:23.800
And so when you have like, for instance,
link |
00:09:26.080
sometimes people have tried to build systems
link |
00:09:27.640
that make programming easier by making it visual,
link |
00:09:30.080
drag and drop.
link |
00:09:31.240
And the issue is you can have a drag and drop thing,
link |
00:09:33.640
but once you start having to iterate the system
link |
00:09:35.200
with conditional logic,
link |
00:09:36.120
handling case statements and branch statements
link |
00:09:37.880
and all these other things,
link |
00:09:39.480
the visual drag and drop part doesn't save you anything.
link |
00:09:42.040
You still have to reason about this giant iterated system
link |
00:09:44.680
with all these different conditions around it.
link |
00:09:46.360
That's the hard part, right?
link |
00:09:48.160
So handling iterated logic, that's the hard part.
link |
00:09:52.240
The languages we use then emerge
link |
00:09:54.160
to give us the ability and capability over these things.
link |
00:09:57.240
Now, the one exception to this rule, of course,
link |
00:09:58.880
is the most popular programming system in the world,
link |
00:10:00.680
which is Excel, which is a data flow
link |
00:10:03.280
and a data driven, immediate mode,
link |
00:10:05.440
data transformation oriented programming system.
link |
00:10:08.360
And it's actually not an accident
link |
00:10:10.400
that that system is the most popular programming system
link |
00:10:12.840
because it's so accessible
link |
00:10:14.440
to a much broader group of people.
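That dataflow idea can be sketched with a toy, hypothetical `Sheet` class: cells hold either constants or formulas, and every read recomputes from the current inputs, so there is no hidden iteration state to track. (Real Excel tracks dependencies and recomputes incrementally; this sketch simply recomputes on every read.)

```python
# A toy sketch of a data-driven, immediate-mode programming model.
# All names here are hypothetical, for illustration only.

class Sheet:
    def __init__(self):
        self.cells = {}

    def set(self, name, value):
        """value is either a constant or a formula (a function of the sheet)."""
        self.cells[name] = value

    def get(self, name):
        v = self.cells[name]
        return v(self) if callable(v) else v

s = Sheet()
s.set("A1", 10)
s.set("A2", 32)
s.set("A3", lambda sh: sh.get("A1") + sh.get("A2"))  # like =A1+A2

s.get("A3")       # 42
s.set("A1", 100)  # change an input...
s.get("A3")       # ...and the read recomputes: 132
```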
link |
00:10:16.920
I do think as we build future computing systems,
link |
00:10:21.240
you're actually already seeing this a little bit,
link |
00:10:22.920
it's much more about composition of modular blocks.
link |
00:10:25.680
They themselves actually maintain all their internal state
link |
00:10:29.600
and the interfaces between them
link |
00:10:31.040
are well defined data schemas.
link |
00:10:32.960
And so to stitch these things together using like IFTTT
link |
00:10:35.880
or Zapier or any of these kind of,
link |
00:10:38.720
I would say compositional scripting kinds of things,
link |
00:10:42.000
I mean, HyperCard was also a little bit in this vein.
link |
00:10:44.960
That's much more accessible to most people.
link |
00:10:47.560
It's really that implicit state
link |
00:10:49.920
that's so hard for people to track.
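A rough sketch of that composition style, with a hypothetical schema and block names (nothing here comes from a real product): each block owns its own logic, and only well-defined, immutable records flow between them.

```python
from dataclasses import dataclass
from typing import Optional

# Modular blocks whose interface is a data schema, not shared state.

@dataclass(frozen=True)
class Reading:
    sensor_id: str
    value: float
    unit: str  # "C" or "F"

def to_fahrenheit(r: Reading) -> Reading:
    """Block 1: normalize units without mutating anything."""
    if r.unit == "C":
        return Reading(r.sensor_id, r.value * 9 / 5 + 32, "F")
    return r

def alert_if_hot(r: Reading, threshold_f: float = 200.0) -> Optional[str]:
    """Block 2: consumes the same schema, emits an alert or nothing."""
    return f"{r.sensor_id} is hot" if r.value > threshold_f else None

# Stitching blocks together is just composition over the schema:
reading = Reading("boiler-1", 100.0, "C")
alert = alert_if_hot(to_fahrenheit(reading))  # 212 F exceeds the threshold
```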
link |
00:10:52.000
Yeah, okay, so that's modular stuff,
link |
00:10:53.400
but there's also an aspect
link |
00:10:54.440
where you're standing on the shoulders of giants.
link |
00:10:55.920
So you're building like higher and higher levels
link |
00:10:58.720
of abstraction, but you do that a little bit with language.
link |
00:11:02.160
So with language, you develop sort of ideas,
link |
00:11:05.160
philosophies from Plato and so on.
link |
00:11:07.600
And then you kind of leverage those philosophies
link |
00:11:09.520
as you try to have deeper and deeper conversations.
link |
00:11:12.640
But with programming,
link |
00:11:13.560
it seems like you can build much more complicated systems.
link |
00:11:17.040
Like without knowing how everything works,
link |
00:11:18.920
you can build on top of the work of others.
link |
00:11:21.320
And it seems like you're developing
link |
00:11:22.840
more and more sophisticated expressions,
link |
00:11:27.920
ability to express ideas in a computational space.
link |
00:11:31.280
I think it's worth pondering the difference here
link |
00:11:35.040
between complexity and complication.
link |
00:11:40.400
Okay, right. Back to Excel.
link |
00:11:42.240
Well, not quite back to Excel,
link |
00:11:43.280
but the idea is when we have a human conversation,
link |
00:11:47.560
all languages for humans emerged
link |
00:11:51.240
to support human relational communications,
link |
00:11:55.560
which is that the person we're communicating with
link |
00:11:57.720
is a person and they would communicate back to us.
link |
00:12:01.240
And so we sort of hit a resonance point, right?
link |
00:12:05.640
When we actually agree on some concepts.
link |
00:12:07.480
So there's a messiness to it and there's a fluidity to it.
link |
00:12:10.280
With computing systems,
link |
00:12:11.880
when we express something to the computer and it's wrong,
link |
00:12:14.280
we just try again.
link |
00:12:15.400
So we can basically live many virtual worlds
link |
00:12:17.520
of having failed at expressing ourselves to the computer
link |
00:12:20.200
until the one time we expressed ourselves right.
link |
00:12:22.840
Then we kind of put in production
link |
00:12:23.960
and then discover that it's still wrong
link |
00:12:25.920
a few days down the road.
link |
00:12:27.160
So I think the sophistication of things
link |
00:12:30.480
that we build with computing,
link |
00:12:32.520
one has to really pay attention to the difference
link |
00:12:35.600
between when an end user is expressing something
link |
00:12:38.240
onto a system that exists
link |
00:12:39.960
versus when they're extending the system
link |
00:12:42.520
to increase the system's capability
link |
00:12:45.480
for someone else to then interface with.
link |
00:12:47.480
We happen to use the same language for both of those things
link |
00:12:49.560
in most cases, but it doesn't have to be that.
link |
00:12:52.080
And Excel is actually a great example of this,
link |
00:12:54.360
of kind of a counterpoint to that.
link |
00:12:56.240
Okay, so what about the idea of, you said messiness.
link |
00:13:01.440
Wouldn't you put the software 2.0 idea,
link |
00:13:06.240
this idea of machine learning
link |
00:13:08.720
into the further and further steps
link |
00:13:12.560
into the world of messiness.
link |
00:13:14.560
The same kind of beautiful messiness of human communication.
link |
00:13:17.640
Isn't that what machine learning is?
link |
00:13:19.120
Is building on levels of abstraction
link |
00:13:23.560
that don't have messiness in them,
link |
00:13:25.520
that at the operating system level,
link |
00:13:27.400
then there's Python, the programming languages
link |
00:13:29.400
that have more and more power.
link |
00:13:30.920
But then finally, there's neural networks
link |
00:13:34.640
that ultimately work with data.
link |
00:13:38.200
And so the programming is almost in the space of data
link |
00:13:40.560
and the data is allowed to be messy.
link |
00:13:42.480
Isn't that a kind of program?
link |
00:13:43.880
So the idea of software 2.0 is a lot of the programming
link |
00:13:47.360
happens in the space of data, so back to Excel,
link |
00:13:52.320
all roads lead back to Excel, in the space of data
link |
00:13:55.080
and also the hyperparameters of the neural networks.
link |
00:13:57.400
And all of those allow the same kind of messiness
link |
00:14:02.240
that human communication allows.
link |
00:14:04.400
It does, but my background is in physics.
link |
00:14:07.760
I took like two CS courses in college.
link |
00:14:09.960
So I don't have, now I did cram a bunch of CS in prep
link |
00:14:13.800
when I applied for grad school,
link |
00:14:15.520
but still I don't have a formal background
link |
00:14:18.160
in computer science.
link |
00:14:19.720
But what I have observed in studying programming languages
link |
00:14:22.520
and programming systems and things like that
link |
00:14:25.000
is that there seems to be this triangle.
link |
00:14:27.360
It's one of these beautiful little iron triangles
link |
00:14:30.440
that you find in life sometimes.
link |
00:14:32.080
And it's the connection between the code correctness
link |
00:14:35.640
and kind of expressiveness of code,
link |
00:14:37.440
the semantics of the data,
link |
00:14:39.920
and then the kind of correctness or parameters
link |
00:14:42.520
of the underlying hardware compute system.
link |
00:14:44.960
So there's the algorithms that you wanna apply,
link |
00:14:48.440
there's what the bits that are stored on whatever media
link |
00:14:52.440
actually represent, so the semantics of the data
link |
00:14:55.800
within the representation,
link |
00:14:56.920
and then there's what the computer can actually do.
link |
00:14:59.760
And every programming system, every information system
link |
00:15:02.840
ultimately finds some spot in the middle
link |
00:15:05.480
of this little triangle.
link |
00:15:07.440
Sometimes some systems collapse them into just one edge.
link |
00:15:11.120
Are we including humans as a system in this?
link |
00:15:13.480
No, no, I'm just thinking about computing systems here.
link |
00:15:15.960
And the reason I bring this up is because
link |
00:15:17.800
I believe there's no free lunch around this stuff.
link |
00:15:20.120
So if we build machine learning systems
link |
00:15:23.000
to sort of write the correct code
link |
00:15:25.520
that is at a certain level of performance,
link |
00:15:27.160
so it'll sort of select with hyperparameters
link |
00:15:30.120
we can tune kind of how we want the performance boundary
link |
00:15:32.880
and SLA to look like for transforming some set of inputs
link |
00:15:37.280
into certain kinds of outputs.
link |
00:15:39.600
That training process itself is intrinsically sensitive
link |
00:15:43.640
to the kinds of inputs we put into it.
link |
00:15:45.560
It's quite sensitive to the boundary conditions
link |
00:15:47.960
we put around the performance.
link |
00:15:49.320
So I think even as we move to using automated systems
link |
00:15:52.200
to build this transformation,
link |
00:15:53.480
as opposed to humans explicitly
link |
00:15:55.480
from a top down perspective, figuring out,
link |
00:15:57.360
well, this schema and this database and these columns
link |
00:15:59.960
get selected for this algorithm,
link |
00:16:01.720
and here we put a Fibonacci heap for some other thing.
link |
00:16:04.880
Human design or computer design,
link |
00:16:06.800
ultimately what we hit,
link |
00:16:08.160
the boundaries that we hit with these information systems
link |
00:16:10.640
is when the representation of the data hits the real world
link |
00:16:14.240
is where there's a lot of slop and a lot of interpretation.
link |
00:16:17.560
And that's where actually I think
link |
00:16:18.960
a lot of the work will go in the future
link |
00:16:20.880
is actually understanding kind of how to better
link |
00:16:23.360
in the view of these live data systems,
link |
00:16:26.240
how to better encode the semantics of the world
link |
00:16:29.320
for those things.
link |
00:16:30.160
There'll be less of the details
link |
00:16:31.080
of how we write a particular SQL query.
link |
00:16:33.480
Okay, but given the semantics of the real world
link |
00:16:35.520
and the messiness of that,
link |
00:16:36.920
what does the word correctness mean
link |
00:16:38.640
when you're talking about code?
link |
00:16:40.800
There's a lot of dimensions to correctness.
link |
00:16:42.960
Historically, and this is one of the reasons I say
link |
00:16:45.160
that we're coming to the end of the era of software,
link |
00:16:47.640
because for the last 40 years or so,
link |
00:16:49.880
software correctness was really defined
link |
00:16:52.440
about functional correctness.
link |
00:16:54.840
I write a function, it's got some inputs,
link |
00:16:56.400
does it produce the right outputs?
link |
00:16:57.880
If so, then I can turn it on,
link |
00:16:59.280
hook it up to the live database and it goes.
link |
00:17:01.480
And more and more now we have,
link |
00:17:03.200
I mean, in fact, I think the bright line in the sand
link |
00:17:05.120
between machine learning systems
link |
00:17:06.800
or modern data driven systems
link |
00:17:08.200
versus classical software systems
link |
00:17:10.840
is that the values of the input
link |
00:17:14.320
actually have to be considered with the function together
link |
00:17:17.440
to say this whole thing is correct or not.
link |
00:17:19.480
And usually there's a performance SLA as well.
link |
00:17:21.840
Like did it actually finish making this?
link |
00:17:23.160
What's SLA?
link |
00:17:24.000
Sorry, service level agreement.
link |
00:17:25.440
So it has to return within some time.
link |
00:17:27.120
You have a 10 millisecond time budget
link |
00:17:29.240
to return a prediction of this level of accuracy, right?
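That kind of time budget can be made concrete with a small, hypothetical wrapper (`predict` here is a trivial stand-in, not a real model):

```python
import time

# Wrap a prediction call so callers learn whether it met its SLA.

def predict(x):
    return x * 0.5  # stand-in for a real model

def predict_with_sla(x, budget_ms=10.0):
    """Return (result, elapsed milliseconds, whether the budget was met)."""
    start = time.perf_counter()
    result = predict(x)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms, elapsed_ms <= budget_ms

value, elapsed_ms, met_sla = predict_with_sla(4.0)
```

In a production system the SLA check would more likely live in the serving layer (timeouts, fallbacks) than in the function itself; this just shows the shape of the contract.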
link |
00:17:32.760
So these are things that were not traditionally
link |
00:17:35.040
in most business computing systems for the last 20 years
link |
00:17:37.560
at all, people didn't think about it.
link |
00:17:39.400
But now we have value dependence on functional correctness.
link |
00:17:42.720
So that question of correctness
link |
00:17:44.160
is becoming a bigger and bigger question.
link |
00:17:45.760
What does that map to the end of software?
link |
00:17:48.160
We've thought about software as just this thing
link |
00:17:50.520
that you can do in isolation with some test trial inputs
link |
00:17:54.760
and in a very sort of sandboxed environment.
link |
00:17:58.640
And we can quantify how does it scale?
link |
00:18:00.640
How does it perform?
link |
00:18:02.040
How many nodes do we need to allocate
link |
00:18:03.240
if we wanna scale this many inputs?
link |
00:18:05.120
When we start turning this stuff into prediction systems,
link |
00:18:08.360
real cybernetic systems,
link |
00:18:10.120
you're going to find scenarios where you get inputs
link |
00:18:12.320
that you're gonna wanna spend
link |
00:18:13.160
a little more time thinking about.
link |
00:18:14.560
You're gonna find inputs that are not,
link |
00:18:15.840
it's not clear what you should do, right?
link |
00:18:17.480
So then the software has a varying amount of runtime
link |
00:18:20.360
and correctness with regard to input.
link |
00:18:22.280
And that is a different kind of system altogether.
link |
00:18:24.000
Now it's a full on cybernetic system.
link |
00:18:25.960
It's a next generation information system
link |
00:18:27.640
that is not like traditional software systems.
link |
00:18:30.200
Can you maybe describe what is a cybernetic system?
link |
00:18:33.160
Do you include humans in that picture?
link |
00:18:35.080
So is a human in the loop kind of complex mess
link |
00:18:38.920
of the whole kind of interactivity of software
link |
00:18:41.520
with the real world or is it something more concrete?
link |
00:18:44.760
Well, when I say cybernetic,
link |
00:18:45.760
I really do mean that the software itself
link |
00:18:47.720
is closing the observe, orient, decide, act loop by itself.
link |
00:18:51.600
So humans being out of the loop is in fact
link |
00:18:54.280
what, for me, makes it a cybernetic system.
link |
00:18:58.440
And humans are out of that loop.
link |
00:19:00.920
When humans are out of the loop,
link |
00:19:01.960
when the machine is actually sort of deciding on its own
link |
00:19:05.240
what it should do next to get more information,
link |
00:19:07.680
that makes it a cybernetic system.
link |
00:19:09.760
So we're just at the dawn of this, right?
link |
00:19:11.400
I think everyone talking about ML and AI, it's great.
link |
00:19:15.400
But really the thing we should be talking about
link |
00:19:16.880
is when we really enter the cybernetic era
link |
00:19:20.360
and all of the questions of ethics and governance
link |
00:19:22.520
and all correctness and all these things,
link |
00:19:24.640
they really are the most important questions.
link |
00:19:27.200
Okay, can we just linger on this?
link |
00:19:28.600
What does it mean for the human to be out of the loop
link |
00:19:30.640
in a cybernetic system, because isn't the cybernetic system
link |
00:19:34.120
that's ultimately accomplishing some kind of purpose
link |
00:19:37.440
that at the bottom, the turtles all the way down,
link |
00:19:41.840
at the bottom turtle is a human.
link |
00:19:44.240
Well, the human may have set some criteria,
link |
00:19:45.920
but the human wasn't precise.
link |
00:19:47.280
So for instance, I just read the other day
link |
00:19:49.240
that earlier this year,
link |
00:19:51.360
or maybe it was last year at some point,
link |
00:19:52.480
the Libyan army, I think,
link |
00:19:55.240
sent out some automated killer drones with explosives.
link |
00:19:58.200
And there was no human in the loop at that point.
link |
00:20:00.000
They basically put them in a geofenced area,
link |
00:20:02.280
said find any moving target, like a truck or vehicle
link |
00:20:04.280
that looks like this, and boom.
link |
00:20:07.000
That's not a human in the loop, right?
link |
00:20:09.360
So increasingly, the less human there is in the loop,
link |
00:20:12.760
the more concerned you are about these kinds of systems,
link |
00:20:15.520
because there are unintended consequences
link |
00:20:18.360
that the original designer and engineer of the system
link |
00:20:22.000
is less able to predict; even one with good intent
link |
00:20:25.000
is less able to predict the consequences of such a system.
link |
00:20:27.400
Is that it? That's right.
link |
00:20:28.680
There are some software systems, right,
link |
00:20:30.160
that run without humans in the loop
link |
00:20:31.920
that are quite complex.
link |
00:20:32.800
And that's like the electronic markets.
link |
00:20:34.320
And we get flash crashes all the time.
link |
00:20:35.920
In the heyday of high frequency trading,
link |
00:20:40.440
there's a lot of market microstructure,
link |
00:20:41.800
people doing all sorts of weird stuff
link |
00:20:43.600
that the market designers had never really thought about,
link |
00:20:47.240
contemplated or intended.
link |
00:20:48.840
So when we run these full on systems
link |
00:20:50.680
with these automated trading bots,
link |
00:20:52.840
now they become automated killer drones
link |
00:20:55.440
and then all sorts of other stuff.
link |
00:20:57.280
We are, that's what I mean by we're at the dawn
link |
00:20:59.560
of the cybernetic era and the end of the era
link |
00:21:01.600
of just pure software.
link |
00:21:03.880
Are you more concerned,
link |
00:21:06.000
if you're thinking about cybernetic systems
link |
00:21:08.280
or even like self replicating systems,
link |
00:21:10.040
so systems that aren't just doing a particular task,
link |
00:21:13.560
but are able to sort of multiply and scale
link |
00:21:15.720
in some dimension in the digital
link |
00:21:18.040
or even the physical world.
link |
00:21:20.360
Are you more concerned about like the lobster being boiled?
link |
00:21:24.800
So a gradual, with us not noticing,
link |
00:21:29.160
collapse of civilization, or a big explosion,
link |
00:21:34.320
like oops, kind of a big thing where everyone notices,
link |
00:21:38.680
but it's too late.
link |
00:21:40.160
I think that it will be a different experience
link |
00:21:44.040
for different people.
link |
00:21:46.280
I do share a common point of view
link |
00:21:49.800
with some of the climate,
link |
00:21:52.240
people who are concerned about climate change
link |
00:21:53.760
and just the big existential risks that we have.
link |
00:21:59.440
But unlike a lot of people who share my level of concern,
link |
00:22:02.560
I think the collapse will not be quite so dramatic
link |
00:22:06.200
as some of them think.
link |
00:22:07.560
And what I mean is that,
link |
00:22:09.200
I think that for certain tiers of let's say economic class
link |
00:22:12.320
or certain locations in the world,
link |
00:22:14.640
people will experience dramatic collapse scenarios.
link |
00:22:17.760
But for a lot of people, especially in the developed world,
link |
00:22:20.320
the realities of collapse will be managed.
link |
00:22:24.040
There'll be narrative management around it
link |
00:22:26.640
so that they essentially insulate,
link |
00:22:29.160
the middle class will be used to insulate the upper class
link |
00:22:31.320
from the pitchforks and the flaming torches and everything.
link |
00:22:35.880
It's interesting because,
link |
00:22:37.000
so my specific question wasn't as general.
link |
00:22:39.600
My question was more about cybernetic systems or software.
link |
00:22:42.480
Okay.
link |
00:22:43.560
It's interesting,
link |
00:22:44.400
but it would nevertheless perhaps be about class.
link |
00:22:46.720
So the effect of algorithms
link |
00:22:48.360
might affect certain classes more than others.
link |
00:22:50.200
Absolutely.
link |
00:22:51.040
I was more thinking about
link |
00:22:52.680
whether it's social media algorithms or actual robots,
link |
00:22:57.320
is there going to be a gradual effect on us
link |
00:23:00.000
where we wake up one day
link |
00:23:02.600
and don't recognize the humans we are,
link |
00:23:05.160
or is it something truly dramatic
link |
00:23:07.560
where there's like a meltdown of a nuclear reactor
link |
00:23:11.640
kind of thing, Chernobyl, like catastrophic events
link |
00:23:15.240
that are almost bugs in a program that scaled itself
link |
00:23:20.280
too quickly?
link |
00:23:21.120
Yeah, I'm not as concerned about the visible stuff.
link |
00:23:26.280
And the reason is because the big visible explosions,
link |
00:23:29.560
I mean, this is something I said about social media
link |
00:23:31.200
is that at least with nuclear weapons,
link |
00:23:33.200
when a nuke goes off, you can see it
link |
00:23:34.840
and you're like, well, that's really,
link |
00:23:36.360
wow, that's kind of bad, right?
link |
00:23:37.760
I mean, Oppenheimer was reciting the Bhagavad Gita, right?
link |
00:23:40.640
When he saw one of those things go off.
link |
00:23:42.200
So we can see nukes are really bad.
link |
00:23:45.160
He's not reciting anything about Twitter.
link |
00:23:48.000
Well, but right, but then when you have social media,
link |
00:23:51.160
when you have all these different things that conspire
link |
00:23:54.400
to create a layer of virtual experience for people
link |
00:23:57.120
that alienates them from reality and from each other,
link |
00:24:00.960
that's very pernicious, that's impossible to see, right?
link |
00:24:03.840
And it kind of slowly gets in there, so.
link |
00:24:07.160
You've written about this idea of virtuality
link |
00:24:09.600
on this topic, which you define as the subjective phenomenon
link |
00:24:14.120
of knowingly engaging with virtual sensation and perception
link |
00:24:17.800
and suspending or forgetting the context
link |
00:24:19.920
that it's simulacrum.
link |
00:24:22.040
So let me ask, what is real?
link |
00:24:26.440
Is there a hard line between reality and virtuality?
link |
00:24:30.440
Like perception drifts from some kind of physical reality.
link |
00:24:33.440
We have to kind of have a sense of what is the line
link |
00:24:36.120
that's too, we've gone too far.
link |
00:24:37.640
Right, right.
link |
00:24:38.760
For me, it's not about any hard line about physical reality
link |
00:24:42.640
as much as a simple question of,
link |
00:24:47.160
does the particular technology help people connect
link |
00:24:51.560
in a more integral way with other people,
link |
00:24:54.400
with their environment,
link |
00:24:56.000
with all of the full spectrum of things around them?
link |
00:24:58.560
So it's less about, oh, this is a virtual thing
link |
00:25:00.880
and this is a hard real thing,
link |
00:25:03.000
more about when we create virtual representations
link |
00:25:05.760
of the real things, always some things
link |
00:25:09.360
are lost in translation.
link |
00:25:10.760
Usually many, many dimensions are lost in translation.
link |
00:25:14.320
We're now coming to almost two years of COVID,
link |
00:25:16.840
people on Zoom all the time.
link |
00:25:17.920
You know it's different when you meet somebody in person
link |
00:25:19.760
than when you see them on,
link |
00:25:20.600
I've seen you on YouTube lots, right?
link |
00:25:22.560
But then seeing a person is very different.
link |
00:25:24.200
And so I think when we engage in virtual experiences
link |
00:25:29.760
all the time, and we only do that,
link |
00:25:31.680
there is absolutely a level of embodiment.
link |
00:25:34.280
There's a level of embodied experience
link |
00:25:36.560
and participatory interaction that is lost.
link |
00:25:40.080
And it's very hard to put your finger on exactly what it is.
link |
00:25:42.520
It's hard to say, oh, we're gonna spend $100 million
link |
00:25:44.600
building a new system that captures this 5% better,
link |
00:25:49.480
higher fidelity human expression.
link |
00:25:51.240
No one's gonna pay for that, right?
link |
00:25:52.600
So when we rush madly into a world of simulacrum
link |
00:25:57.480
and virtuality, the things that are lost are,
link |
00:26:02.560
it's difficult.
link |
00:26:04.240
Once everyone moves there, it can be hard to look back
link |
00:26:06.400
and see what we've lost.
link |
00:26:08.120
So is it irrecoverably lost?
link |
00:26:10.400
Or rather, when you put it all on the table,
link |
00:26:14.200
is it possible for more to be gained than is lost?
link |
00:26:17.160
If you look at video games,
link |
00:26:18.680
they create virtual experiences that are surreal
link |
00:26:22.720
and can bring joy to a lot of people,
link |
00:26:24.480
can connect a lot of people,
link |
00:26:26.680
and can get people to talk a lot of trash.
link |
00:26:29.880
So they can bring out the best and the worst in people.
link |
00:26:32.600
So is it possible to have a future world
link |
00:26:35.720
where the pros outweigh the cons?
link |
00:26:38.560
It is.
link |
00:26:39.400
I mean, it's possible to have that in the current world.
link |
00:26:41.560
But when literally trillions of dollars of capital
link |
00:26:46.600
are tied to using those things
link |
00:26:48.920
to groom the worst of our inclinations
link |
00:26:52.840
and to attack our weaknesses in the limbic system
link |
00:26:56.080
to create these things into id machines
link |
00:26:57.640
versus connection machines,
link |
00:26:59.120
then those good things don't stand a chance.
link |
00:27:03.200
Can you make a lot of money by building connection machines?
link |
00:27:06.640
Is it possible, do you think,
link |
00:27:09.280
to bring out the best in human nature
link |
00:27:10.920
to create fulfilling connections and relationships
link |
00:27:13.720
in the digital world and make a shit ton of money?
link |
00:27:18.720
If I figure it out, I'll let you know.
link |
00:27:21.120
But what's your intuition
link |
00:27:22.000
without concretely knowing what's the solution?
link |
00:27:24.640
My intuition is that a lot of our digital technologies
link |
00:27:27.720
give us the ability to have synthetic connections
link |
00:27:30.800
or to experience virtuality.
link |
00:27:33.040
They have coevolved with sort of the human expectations.
link |
00:27:38.920
It's sort of like sugary drinks.
link |
00:27:40.840
As people have more sugary drinks,
link |
00:27:42.320
they need more sugary drinks to get that same hit, right?
link |
00:27:45.880
So with these virtual things and with TV and fast cuts
link |
00:27:50.400
and TikToks and all these different kinds of things,
link |
00:27:52.840
we're cocreating essentially humanity
link |
00:27:55.040
that sort of asks and needs those things.
link |
00:27:57.320
And now it becomes very difficult
link |
00:27:58.880
to get people to slow down.
link |
00:28:00.360
It gets difficult for people to hold their attention
link |
00:28:03.520
on slow things and actually feel that embodied experience.
link |
00:28:07.320
So mindfulness now more than ever is so important in schools
link |
00:28:11.120
and as a therapy technique for people
link |
00:28:13.560
because our environment has been accelerated.
link |
00:28:15.680
And McLuhan actually talks about this
link |
00:28:17.360
in the electric environment of the television.
link |
00:28:19.520
And that was before TikTok and before front facing cameras.
link |
00:28:22.440
So I think for me, the concern is that
link |
00:28:25.360
it's not like we can ever switch to doing something better,
link |
00:28:28.160
but more of the humans and technology,
link |
00:28:32.080
they're not independent of each other.
link |
00:28:33.560
The technology that we use kind of molds what we need
link |
00:28:37.760
for the next generation of technology.
link |
00:28:39.120
Yeah, but humans are intelligent and they're introspective
link |
00:28:43.360
and they can reflect on the experiences of their life.
link |
00:28:45.840
So for example, there's been many years in my life
link |
00:28:47.880
where I ate an excessive amount of sugar.
link |
00:28:50.920
And then a certain moment I woke up and said,
link |
00:28:54.840
why do I keep doing this?
link |
00:28:55.920
This doesn't feel good.
link |
00:28:57.680
Like long term.
link |
00:28:59.000
And I think, so going through the TikTok process
link |
00:29:02.320
of realizing, okay, when I shorten my attention span,
link |
00:29:06.280
actually that does not make me feel good longer term.
link |
00:29:10.240
And realizing that and then going to platforms,
link |
00:29:13.160
going to places that are away from the sugar.
link |
00:29:18.280
So in so doing, you can create platforms
link |
00:29:21.080
that can make a lot of money to help people wake up
link |
00:29:24.000
to what actually makes them feel good long term.
link |
00:29:26.280
Develop, grow as human beings.
link |
00:29:28.280
And it just feels like humans are more intelligent
link |
00:29:31.040
than mice looking for cheese.
link |
00:29:35.120
They're able to sort of think, I mean,
link |
00:29:36.960
we can contemplate our own mortality.
link |
00:29:39.720
We can contemplate things like long-term love
link |
00:29:43.880
and we can have a long-term fear
link |
00:29:46.080
of certain things like mortality.
link |
00:29:48.080
We can contemplate whether the experiences,
link |
00:29:51.240
the sort of the drugs of daily life
link |
00:29:53.680
that we've been partaking in is making us happier,
link |
00:29:57.280
better people.
link |
00:29:58.320
And then once we contemplate that,
link |
00:30:00.200
we can make financial decisions in using services
link |
00:30:03.800
and paying for services that are making us better people.
link |
00:30:06.880
So it just seems that we're in the very first stages
link |
00:30:11.440
of social networks that just were able to make a lot of money
link |
00:30:15.520
really quickly, but in bringing out sometimes
link |
00:30:20.080
the bad parts of human nature, they didn't destroy humans.
link |
00:30:23.200
They just fed everybody a lot of sugar.
link |
00:30:26.040
And now everyone's gonna wake up and say,
link |
00:30:28.600
hey, we're gonna start having like sugar free social media.
link |
00:30:31.720
Right, right.
link |
00:30:33.280
Well, there's a lot to unpack there.
link |
00:30:34.800
I think some people certainly have the capacity for that.
link |
00:30:37.520
And I certainly think, I mean, it's very interesting
link |
00:30:39.680
even the way you said it, you woke up one day
link |
00:30:41.400
and you thought, well, this doesn't feel very good.
link |
00:30:44.160
Well, it's still your limbic system saying
link |
00:30:45.760
this doesn't feel very good, right?
link |
00:30:47.480
You have a cat brain's worth of neurons around your gut,
link |
00:30:50.040
right?
link |
00:30:50.880
And so maybe that saturated and that was telling you,
link |
00:30:53.600
hey, this isn't good.
link |
00:30:55.000
Humans are more than just mice looking for cheese
link |
00:30:58.320
or monkeys looking for sex and power, right?
link |
00:31:00.800
So.
link |
00:31:01.640
Let's slow down.
link |
00:31:02.640
Now a lot of people would argue with you on that one,
link |
00:31:05.960
but yes.
link |
00:31:06.800
Well, we're more than just that, but we're at least that.
link |
00:31:08.480
And we're very, very seldom not that.
link |
00:31:11.800
So I don't actually disagree with you
link |
00:31:15.080
that we could be better and that better platforms exist.
link |
00:31:18.240
And people are voluntarily noping out of things
link |
00:31:20.320
like Facebook.
link |
00:31:21.680
That's an awesome verb.
link |
00:31:22.760
It's a great term.
link |
00:31:23.680
Yeah, I love it.
link |
00:31:24.520
I use it all the time.
link |
00:31:25.720
You're welcome, Mike.
link |
00:31:26.560
I'm gonna have to nope out of that.
link |
00:31:27.400
I'm gonna have to nope out of that, right?
link |
00:31:28.600
It's gonna be a hard pass and that's great.
link |
00:31:32.840
But that's again, to your point,
link |
00:31:34.240
that's the first generation of front facing cameras
link |
00:31:37.240
of social pressures.
link |
00:31:38.680
And you as a self starter, self aware adult
link |
00:31:43.560
have the capacity to say, yeah, I'm not gonna do that.
link |
00:31:46.360
I'm gonna go and spend time on long form reads.
link |
00:31:48.520
I'm gonna spend time managing my attention.
link |
00:31:50.280
I'm gonna do some yoga.
link |
00:31:52.080
If you're a 15 year old in high school
link |
00:31:54.960
and your entire social environment
link |
00:31:57.000
is everyone doing these things,
link |
00:31:58.240
guess what you're gonna do?
link |
00:31:59.520
You're gonna kind of have to do that
link |
00:32:00.680
because your limbic system says,
link |
00:32:01.760
hey, I need to get the guy or the girl or the whatever.
link |
00:32:04.520
And that's what I'm gonna do.
link |
00:32:05.640
And so one of the things that we have to reason about here
link |
00:32:07.800
is the social media systems or social media,
link |
00:32:10.800
I think is our first encounter with a technological system
link |
00:32:15.800
that runs a bit of a loop around our own cognition
link |
00:32:20.800
and attention.
link |
00:32:21.800
It's not the last, it's far from the last.
link |
00:32:25.600
And it gets to the heart of one of the
link |
00:22:28.280
Achilles' heels of the Western philosophical system,
link |
00:32:31.800
which is each person gets to make their own determination.
link |
00:32:34.280
Each person is an individual that's sacrosanct
link |
00:32:37.280
in their agency and their sovereignty and all these things.
link |
00:32:39.960
The problem with these systems is they come down
link |
00:32:42.640
and they are able to make their own decisions.
link |
00:32:44.560
They come down and they are able to manage everyone en masse.
link |
00:32:48.080
And so every person is making their own decision,
link |
00:32:50.240
but together the bigger system is causing them to act
link |
00:32:53.720
with a group dynamic that's very profitable for people.
link |
00:32:58.760
So this is the issue that we have is that our philosophies
link |
00:33:02.200
are actually not geared to understand
link |
00:33:05.080
what is it for a person to have a high trust connection
link |
00:33:10.400
as part of a collective and for that collective
link |
00:33:12.480
to have its right to coherency and agency.
link |
00:33:16.280
That's something like when a social media app
link |
00:33:19.400
causes a family to break apart,
link |
00:33:21.720
it's done harm to more than just individuals, right?
link |
00:33:24.640
So that concept is not something we really talk about
link |
00:33:27.320
or think about very much, but that's actually the problem
link |
00:33:30.160
is that we're vaporizing molecules into atomic units
link |
00:33:33.200
and then we're hitting all the atoms with certain things.
link |
00:33:35.160
That's like, yeah, well, that person chose to look at my app.
link |
00:33:38.360
So our understanding of human nature
link |
00:33:40.600
at the individual level, it emphasizes the individual
link |
00:33:43.800
too much because ultimately society operates
link |
00:33:46.360
at the collective level.
link |
00:33:47.440
And these apps do as well.
link |
00:33:48.640
And the apps do as well.
link |
00:33:49.880
So for us to understand the progression and the development
link |
00:33:53.360
of this organism we call human civilization,
link |
00:33:56.000
we have to think at the collective level too.
link |
00:33:58.000
I would say multi tiered.
link |
00:33:59.280
Multi tiered.
link |
00:34:00.240
So individual as well.
link |
00:34:01.600
Individuals, family units, social collectives
link |
00:34:05.080
and all the way up.
link |
00:34:06.400
Okay, so you've said that individual humans
link |
00:34:09.400
are multilayered, susceptible to signals and waves
link |
00:34:12.200
at multiple strata: the physical, the biological,
link |
00:34:15.280
social, cultural, intellectual.
link |
00:34:16.760
So sort of going along these lines,
link |
00:34:19.480
can you describe the layers of the cake
link |
00:34:22.880
that is a human being and maybe the human collective,
link |
00:34:27.360
human society?
link |
00:34:29.200
So I'm just stealing wholesale here from Robert Pirsig,
link |
00:34:32.920
who is the author of Zen and the Art of Motorcycle
link |
00:34:34.600
Maintenance, which has a sequel
link |
00:34:40.040
called Lila.
link |
00:34:40.720
He goes into this in a little more detail.
link |
00:34:42.440
But it's a crude approach to thinking about people.
link |
00:34:47.000
But I think it's still an advancement
link |
00:34:48.920
over traditional subject object metaphysics,
link |
00:34:51.200
where we look at people as a dualist would say,
link |
00:34:53.960
well, is your mind, your consciousness,
link |
00:34:57.200
is that just merely the matter that's in your brain
link |
00:35:01.120
or is there something kind of more beyond that?
link |
00:35:03.440
And they would say, yes, there's a soul,
link |
00:35:05.080
sort of ineffable soul beyond just merely the physical body.
link |
00:35:09.280
And I'm not one of those people.
link |
00:35:11.360
I think that we don't have to draw a line between are things
link |
00:35:15.560
only this or only that.
link |
00:35:16.840
Collectives of things can emerge structures and patterns
link |
00:35:19.720
that are just as real as the underlying pieces.
link |
00:35:22.000
But they're transcendent, but they're still
link |
00:35:24.680
of the underlying pieces.
link |
00:35:26.520
So your body is this way.
link |
00:35:28.400
I mean, we just know physically you consist of atoms
link |
00:35:31.000
and whatnot.
link |
00:35:32.800
And then the atoms are arranged into molecules
link |
00:35:34.680
which then arrange into certain kinds of structures
link |
00:35:37.120
that seem to have a homeostasis to them.
link |
00:35:39.560
We call them cells.
link |
00:35:40.720
And those cells form sort of biological structures.
link |
00:35:44.560
Those biological structures give your body
link |
00:35:46.840
its physical ability and the biological ability
link |
00:35:49.000
to consume energy and to maintain homeostasis.
link |
00:35:51.800
But humans are social animals.
link |
00:35:54.040
I mean, human by themselves is not very long for the world.
link |
00:35:57.320
So part of our biology is wired to connect to other people.
link |
00:36:02.200
From the mirror neurons to our language centers
link |
00:36:04.920
and all these other things.
link |
00:36:06.280
So we are intrinsically, there's a layer,
link |
00:36:09.160
there's a part of us that wants to be part of a thing.
link |
00:36:12.560
If we're around other people, not saying a word,
link |
00:36:14.640
but they're just up and down jumping and dancing, laughing,
link |
00:36:17.140
we're going to feel better.
link |
00:36:18.580
And there was no exchange of physical anything.
link |
00:36:21.800
They didn't give us like five atoms of happiness.
link |
00:36:24.840
But there's an induction in our own sense of self
link |
00:36:27.520
that is at that social level.
link |
00:36:29.600
And then beyond that, Pirsig puts the intellectual level
link |
00:36:33.560
kind of one level higher than social.
link |
00:36:35.100
I think they're actually more intertwined than that.
link |
00:36:37.220
But the intellectual level is the level of pure ideas.
link |
00:36:41.040
That you are a vessel for memes.
link |
00:36:42.840
You're a vessel for philosophies.
link |
00:36:45.000
You will conduct yourself in a particular way.
link |
00:36:47.400
I mean, I think part of this is if we think about it
link |
00:36:49.520
from a physics perspective, you're not,
link |
00:36:52.200
there's the joke that physicists like to approximate things.
link |
00:36:55.080
And we'll say, well, approximate a spherical cow, right?
link |
00:36:57.500
You're not a spherical cow, you're not a spherical human.
link |
00:36:59.640
You're a messy human.
link |
00:37:00.760
And we can't even say what the dynamics of your motion
link |
00:37:04.000
will be unless we analyze all four of these layers, right?
link |
00:37:08.560
If you're Muslim at a certain time of day, guess what?
link |
00:37:11.760
You're going to be on the ground kneeling and praying, right?
link |
00:37:14.000
And that has nothing to do with your biological need
link |
00:37:15.960
to get on the ground or physics of gravity.
link |
00:37:18.520
It is an intellectual drive that you have.
link |
00:37:20.440
It's a cultural phenomenon
link |
00:37:22.000
and an intellectual belief that you carry.
link |
00:37:23.760
So that's what the four layered stack is all about.
link |
00:37:28.120
It's that a person is not only one of these things,
link |
00:37:30.360
they're all of these things at the same time.
link |
00:37:31.760
It's a superposition of dynamics that run through us
link |
00:37:35.680
that make us who we are.
link |
00:37:37.360
So no layer is special.
link |
00:37:40.520
Not so much no layer is special,
link |
00:37:41.720
each layer is just different.
link |
00:37:44.360
But we are.
link |
00:37:45.840
Each layer gets the participation trophy.
link |
00:37:48.320
Yeah, each layer is a part of what you are.
link |
00:37:50.320
You are a layer cake, right, of all these things.
link |
00:37:52.080
And if we try to deny, right,
link |
00:37:54.480
so many philosophies do try to deny
link |
00:37:56.580
the reality of some of these things, right?
link |
00:37:58.960
Some people will say, well, we're only atoms.
link |
00:38:01.420
Well, we're not only atoms
link |
00:38:02.480
because there's a lot of other things that are only atoms.
link |
00:38:04.080
I can reduce a human being to a bunch of soup
link |
00:38:07.000
and they're not the same thing,
link |
00:38:08.520
even though it's the same atoms.
link |
00:38:09.800
So I think the order and the patterns
link |
00:38:12.160
that emerge within humans to understand,
link |
00:38:15.960
to really think about what a next generation philosophy
link |
00:38:18.480
would look like, that would allow us to reason
link |
00:38:20.040
about extending humans into the digital realm
link |
00:38:22.960
or to interact with autonomous intelligences
link |
00:38:25.560
that are not biological in nature.
link |
00:38:27.520
We really need to appreciate these,
link |
00:38:29.640
that human, what human beings actually are
link |
00:38:32.280
is the superposition of these different layers.
link |
00:38:34.760
You mentioned consciousness.
link |
00:38:36.640
Are each of these layers of cake conscious?
link |
00:38:39.800
Is consciousness a particular quality of one of the layers?
link |
00:38:43.760
Is there like a spike if you have a consciousness detector
link |
00:38:46.920
at these layers or is something that just permeates
link |
00:38:49.360
all of these layers and just takes different form?
link |
00:38:51.920
I believe what humans experience as consciousness
link |
00:38:54.400
is something that sits on a gradient scale
link |
00:38:57.640
of a general principle in the universe
link |
00:39:00.540
that seems to look for order and reach for order
link |
00:39:04.960
when there's an excess of energy.
link |
00:39:06.760
You know, it would be odd to say a proton is alive, right?
link |
00:39:09.400
It'd be odd to say like this particular atom
link |
00:39:12.040
or molecule of hydrogen gas is alive,
link |
00:39:15.840
but there's certainly something we can make assemblages
link |
00:39:20.680
of these things that have autopoietic aspects to them
link |
00:39:24.400
that will create structures that will, you know,
link |
00:39:26.520
crystalline solids will form very interesting
link |
00:39:28.640
and beautiful structures.
link |
00:39:29.960
This gets kind of into weird mathematical territories.
link |
00:39:33.000
You start thinking about Penrose and Game of Life stuff
link |
00:39:35.320
about the generativity of math itself,
link |
00:39:37.900
like the hyperreal numbers, things like that.
link |
00:39:39.480
But without going down that rabbit hole,
link |
00:39:42.320
I would say that there seems to be a tendency
link |
00:39:45.540
in the world that when there is excess energy,
link |
00:39:49.120
things will structure and pattern themselves.
link |
00:39:51.360
And they will then actually furthermore try to create
link |
00:39:53.760
an environment that furthers their continued stability.
link |
00:39:58.040
It's the concept of externalized extended phenotype
link |
00:40:00.800
or niche construction.
link |
00:40:02.240
So this is ultimately what leads to certain kinds
link |
00:40:06.240
of amino acids forming certain kinds of structures
link |
00:40:09.180
and so on and so forth until you get the ladder of life.
link |
00:40:11.120
So what we experience as consciousness,
link |
00:40:12.880
no, I don't think cells are conscious at that level,
link |
00:40:15.560
but is there something beyond mere equilibrium state biology
link |
00:40:19.560
and chemistry and biochemistry
link |
00:40:21.960
that drives what makes things work?
link |
00:40:25.420
I think there is.
link |
00:40:27.620
So Adrian Bejan has his constructal law.
link |
00:40:29.560
There's other things you can look at.
link |
00:40:31.200
When you look at the life sciences
link |
00:40:32.440
and you look at any kind of statistical physics
link |
00:40:36.040
and statistical mechanics,
link |
00:40:37.440
when you look at things far out of equilibrium,
link |
00:40:40.540
when you have excess energy, what happens then?
link |
00:40:43.400
Life doesn't just make a hotter soup.
link |
00:40:45.800
It starts making structure.
link |
00:40:47.440
There's something there.
link |
00:40:48.640
The poetry of reaches for order
link |
00:40:50.880
when there's an excess of energy.
link |
00:40:54.160
Because you brought up game of life.
link |
00:40:57.200
You did it, not me.
link |
00:40:59.160
I love cellular automata,
link |
00:41:00.360
so I have to sort of linger on that for a little bit.
link |
00:41:06.400
So cellular automata, I guess, or game of life
link |
00:41:09.340
is a very simple example of reaching for order
link |
00:41:11.840
when there's an excess of energy.
link |
00:41:14.080
Or reaching for order and somehow creating complexity.
link |
00:41:17.240
Within this explosion of just turmoil,
link |
00:41:22.440
somehow trying to construct structures.
link |
00:41:25.560
And in so doing, create very elaborate
link |
00:41:29.760
organism looking type things.
link |
00:41:32.480
What intuition do you draw from this simple mechanism?
link |
00:41:35.840
Well, I like to turn that around its head.
link |
00:41:37.840
And look at it as what if every single one of the patterns
link |
00:41:42.400
created life, or created, not life,
link |
00:41:45.800
but created interesting patterns?
link |
00:41:47.400
Because some of them don't.
link |
00:41:48.640
And sometimes you make cool gliders.
link |
00:41:50.520
And other times, you start with certain things
link |
00:41:52.240
and you make gliders and other things
link |
00:41:54.020
that then construct like AND gates and NOT gates, right?
link |
00:41:57.040
And you build computers on them.
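As a concrete illustration of the gliders and gate-building mentioned here, Conway's Game of Life can be sketched in a few lines of Python. This is my own minimal illustration, not anything shown in the conversation: the `step` helper implements the standard B3/S23 rules, and the classic glider pattern translates one cell diagonally every four generations.

```python
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) live cells."""
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 live neighbors; survival on 2 or 3 (B3/S23).
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# The classic glider: after 4 generations it reappears shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)
print(g == {(x + 1, y + 1) for (x, y) in glider})  # prints True
```

The same `step` function, seeded with the right initial patterns, is enough to build the glider streams that Life enthusiasts wire into AND and NOT gates.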
link |
00:41:59.240
All of these rules that create these patterns
link |
00:42:00.800
that we can see, those are just the patterns we can see.
link |
00:42:04.640
What if our subjectivity is actually limiting
link |
00:42:06.600
our ability to perceive the order in all of it?
link |
00:42:11.000
What if some of the things that we think are random
link |
00:42:12.320
are actually not that random?
link |
00:42:13.300
We're simply not integrating at a fine enough level
link |
00:42:16.120
across a broad enough time horizon.
link |
00:42:18.980
And this is again, I said, we go down the rabbit holes
link |
00:42:20.600
and the Penrose stuff or like Wolfram's explorations
link |
00:42:22.880
on these things.
link |
00:42:24.960
There is something deep and beautiful
link |
00:42:27.300
in the mathematics of all this.
link |
00:42:28.600
That is hopefully one day I'll have enough money
link |
00:42:30.400
to work and retire and just ponder those questions.
link |
00:42:33.440
But there's something there.
link |
00:42:34.560
But you're saying there's a ceiling to,
link |
00:42:36.120
when you have enough money and you retire and you ponder,
link |
00:42:38.480
there's a ceiling to how much you can truly ponder
link |
00:42:40.720
because there's cognitive limitations
link |
00:42:43.000
in what you're able to perceive as a pattern.
link |
00:42:46.320
Yeah.
link |
00:42:47.160
And maybe mathematics extends your perception capabilities,
link |
00:42:51.520
but it's still finite.
link |
00:42:53.840
It's just like.
link |
00:42:55.460
Yeah, the mathematics we use is the mathematics
link |
00:42:57.640
that can fit in our head.
link |
00:42:59.000
Yeah.
link |
00:43:00.840
Did God really create the integers?
link |
00:43:02.600
Or did God create all of it?
link |
00:43:03.720
And we just happen at this point in time
link |
00:43:05.360
to be able to perceive integers.
link |
00:43:07.240
Well, he just did the positive integers.
link |
00:43:09.400
She, I just said, did she create all of it?
link |
00:43:11.360
And then we.
link |
00:43:14.200
She just created the natural numbers
link |
00:43:15.760
and then we screwed it all up with zero and then I guess.
link |
00:43:17.680
Okay.
link |
00:43:18.560
But we did, we created mathematical operations
link |
00:43:21.600
so that we can have iterated steps
link |
00:43:23.700
to approach bigger problems, right?
link |
00:43:26.000
I mean, the entire point of the Arabic numeral system
link |
00:43:29.040
is that it's a rubric for mapping a certain set of operations,
link |
00:43:32.600
folding them into a simple little expression,
link |
00:43:35.360
but that's just the operations that we can fit in our heads.
link |
00:43:38.740
There are many other operations besides, right?
link |
00:43:41.140
The thing that worries me the most about aliens and humans
link |
00:43:46.020
is that the aliens are all around us and we're too dumb.
link |
00:43:50.920
Yeah.
link |
00:43:51.760
To see them.
link |
00:43:52.580
Oh, certainly, yeah.
link |
00:43:53.420
Or life, let's say just life,
link |
00:43:54.800
life of all kinds of forms or organisms.
link |
00:43:58.080
You know what, just even the intelligence of organisms
link |
00:44:01.940
is imperceptible to us
link |
00:44:04.160
because we're too dumb and self centered.
link |
00:44:06.600
That worries me.
link |
00:44:07.440
Well, we're looking for a particular kind of thing.
link |
00:44:09.560
Yeah.
link |
00:44:10.400
When I was at Cornell,
link |
00:44:11.220
I had a lovely professor of Asian religions,
link |
00:44:13.860
Jane Marie Law,
link |
00:44:14.700
and she would tell this story about
link |
00:44:17.760
a musician, a Western musician who went to Japan
link |
00:44:20.100
and he taught classical music
link |
00:44:21.880
and could play all sorts of instruments.
link |
00:44:24.000
He went to Japan and he would ask people,
link |
00:44:27.480
he would basically be looking for things in the style of
link |
00:44:30.640
a Western chromatic scale and these kinds of things.
link |
00:44:34.080
And then finding none of it,
link |
00:44:35.380
he would say, well, there's really no music in Japan,
link |
00:44:37.520
but they're using a different scale.
link |
00:44:38.800
They're playing different kinds of instruments, right?
link |
00:44:40.440
The same thing she was using as a sort of a metaphor
link |
00:44:42.620
for religion as well.
link |
00:44:43.660
In the West, we center a lot of religion,
link |
00:44:45.780
certainly the religions of Abraham,
link |
00:44:47.980
we center them around belief.
link |
00:44:50.040
And in the East, it's more about practice, right?
link |
00:44:52.440
Spirituality and practice rather than belief.
link |
00:44:54.600
So anyway, the point is here to your point,
link |
00:44:57.360
life, we, I think so many people are so fixated
link |
00:45:00.820
on certain aspects of self replication
link |
00:45:03.320
or homeostasis or whatever.
link |
00:45:06.120
But if we kind of broaden and generalize this thing
link |
00:45:08.840
of things reaching for order,
link |
00:45:10.840
under which conditions can they then create an environment
link |
00:45:13.540
that sustains that order, that allows them,
link |
00:45:17.320
the invention of death is an interesting thing.
link |
00:45:20.120
There are some organisms on earth
link |
00:45:21.480
that are thousands of years old.
link |
00:45:23.380
And it's not like they're incredibly complex,
link |
00:45:25.520
they're actually simpler than the cells that comprise us,
link |
00:45:28.560
but they never die.
link |
00:45:29.640
So at some point, death was invented,
link |
00:45:33.320
somewhere along the eukaryotic scale,
link |
00:45:34.720
I mean, even the protists, right?
link |
00:45:35.980
There's death.
link |
00:45:37.180
And why is that along with the sexual reproduction, right?
link |
00:45:41.480
There is something about the renewal process,
link |
00:45:45.000
something about the ability to respond
link |
00:45:46.520
to a changing environment,
link |
00:45:48.160
where it just becomes,
link |
00:45:50.240
just killing off the old generation
link |
00:45:51.520
and letting new generations try,
link |
00:45:54.240
seems to be the best way to fit into the niche.
link |
00:45:57.040
Human historians seem to write about wheels and fire,
link |
00:46:00.320
the greatest inventions,
link |
00:46:01.640
but it seems like death and sex are pretty good.
link |
00:46:04.360
And they're kind of essential inventions
link |
00:46:06.560
at the very beginning.
link |
00:46:07.400
At the very beginning, yeah.
link |
00:46:08.600
Well, we didn't invent them, right?
link |
00:46:10.560
Well, broadly, you didn't invent them.
link |
00:46:13.300
I see us as one,
link |
00:46:15.400
you particular Homo sapiens did not invent them,
link |
00:46:17.880
but we together, it's a team project,
link |
00:46:21.000
just like you're saying.
link |
00:46:21.880
I think the greatest Homo sapiens invention
link |
00:46:24.280
is collaboration.
link |
00:46:25.640
So when you say collaboration,
link |
00:46:29.640
Peter, where do ideas come from
link |
00:46:32.200
and how do they take hold in society?
link |
00:46:35.280
Is that the nature of collaboration?
link |
00:46:36.960
Is that the basic atom of collaboration is ideas?
link |
00:46:40.440
It's not not ideas, but it's not only ideas.
link |
00:46:43.200
There's a book I just started reading
link |
00:46:44.440
called Death From A Distance.
link |
00:46:45.920
Have you heard of this?
link |
00:46:46.740
No.
link |
00:46:47.580
It's a really fascinating thesis,
link |
00:46:49.200
which is that humans are the only conspecific,
link |
00:46:53.200
the only species that can kill other members
link |
00:46:55.800
of the species from range.
link |
00:46:58.200
And maybe there's a few exceptions,
link |
00:46:59.600
but if you look in the animal world,
link |
00:47:01.020
you see like pronghorns butting heads, right?
link |
00:47:03.000
You see the alpha lion and the beta lion
link |
00:47:05.840
and they take each other down.
link |
00:47:07.200
Humans, we developed the ability
link |
00:47:08.720
to chuck rocks at each other,
link |
00:47:10.120
well, at prey, but also at each other.
link |
00:47:11.960
And that means the beta male can chuck a rock
link |
00:47:14.920
at the alpha male and take them down.
link |
00:47:17.440
And he can throw a lot of rocks actually,
link |
00:47:20.040
miss a bunch of times, but just hit once and be good.
link |
00:47:22.400
So this ability to actually kill members
link |
00:47:25.800
of our own species from range
link |
00:47:27.280
without a threat of harm to ourselves
link |
00:47:29.960
created essentially mutually assured destruction
link |
00:47:32.360
where we had to evolve cooperation.
link |
00:47:34.000
If we didn't, then if we just continue to try to do,
link |
00:47:37.300
like I'm the biggest monkey in the tribe
link |
00:47:39.480
and I'm gonna own this tribe and you have to go,
link |
00:47:43.040
if we do it that way, then those tribes basically failed.
link |
00:47:46.720
And the tribes that persisted
link |
00:47:48.440
and that have now given rise to the modern Homo sapiens
link |
00:47:51.440
are the ones where respecting the fact
link |
00:47:53.860
that we can kill each other from a range
link |
00:47:56.240
without harm, like there's an asymmetric ability
link |
00:47:58.640
to snipe the leader from range.
link |
00:48:00.840
That meant that we sort of had to learn
link |
00:48:03.920
how to cooperate with each other, right?
link |
00:48:05.400
Come back here, don't throw that rock at me.
link |
00:48:06.600
Let's talk our differences out.
link |
00:48:07.960
So violence is also part of collaboration.
link |
00:48:10.240
The threat of violence, let's say.
link |
00:48:12.320
Well, the recognition, maybe the better way to put it
link |
00:48:15.720
is the recognition that we have more to gain
link |
00:48:17.480
by working together than the prisoner's dilemma
link |
00:48:21.140
of both of us defecting.
link |
00:48:23.520
So mutually assured destruction in all its forms
link |
00:48:26.380
is part of this idea of collaboration.
link |
00:48:28.720
Well, and Eric Weinstein talks about our nuclear peace,
link |
00:48:31.040
right, I mean, it kind of sucks
link |
00:48:32.440
with thousands of warheads aimed at each other,
link |
00:48:34.040
meaning Russia and the US, but it's like,
link |
00:48:36.320
on the other hand, we only fought proxy wars, right?
link |
00:48:39.880
We did not have another World War III
link |
00:48:41.360
of like hundreds of millions of people dying
link |
00:48:43.520
to like machine gun fire and giant guided missiles.
link |
00:48:47.840
So the original nuclear weapon is a rock
link |
00:48:50.400
that we learned how to throw, essentially?
link |
00:48:52.280
The original, yeah, well, the original scope of the world
link |
00:48:54.280
for any human being was their little tribe.
link |
00:48:58.740
I would say it still is for the most part.
link |
00:49:00.680
Eric Weinstein speaks very highly of you,
link |
00:49:05.800
which is very surprising to me at first
link |
00:49:08.000
because I didn't know there's this depth to you
link |
00:49:10.800
because I knew you as an amazing leader of engineers
link |
00:49:15.560
and an engineer yourself and so on, so it's fascinating.
link |
00:49:18.440
Maybe just as a comment, a side tangent that we can take,
link |
00:49:23.800
what's the nature of your friendship with Eric Weinstein?
link |
00:49:27.120
How did the two, how did such two interesting paths cross?
link |
00:49:30.620
Is it your origins in physics?
link |
00:49:32.960
Is it your interest in philosophy
link |
00:49:35.620
and the ideas of how the world works?
link |
00:49:37.280
What is it?
link |
00:49:38.120
It's very random, Eric found me.
link |
00:49:40.920
He actually found Travis and I.
link |
00:49:43.840
Travis Oliphant.
link |
00:49:44.680
Oliphant, yeah, we were both working
link |
00:49:45.900
at a company called Enthought back in the mid 2000s
link |
00:49:48.520
and we were doing a lot of consulting
link |
00:49:50.840
around scientific Python and we'd made some tools
link |
00:49:54.400
and Eric was trying to use some of these Python tools
link |
00:49:57.400
to visualize, he had a fiber bundle approach
link |
00:50:00.200
to modeling certain aspects of economics.
link |
00:50:03.440
He was doing this and that's how he kind of got in touch
link |
00:50:05.200
with us and so.
link |
00:50:06.160
This was in the early.
link |
00:50:08.160
This was mid 2000s, oh seven timeframe, oh six, oh seven.
link |
00:50:13.680
Eric Weinstein trying to use Python.
link |
00:50:16.200
Right, to visualize fiber bundles.
link |
00:50:18.680
Using some of the tools that we had built
link |
00:50:20.160
in the open source.
link |
00:50:21.280
That's somehow entertaining to me, the thought of that.
link |
00:50:24.160
It's very funny but then we met with him a couple times,
link |
00:50:27.160
a really interesting guy and then in the wake
link |
00:50:28.780
of the oh seven, oh eight kind of financial collapse,
link |
00:50:31.360
he helped organize with Lee Smolin a symposium
link |
00:50:35.440
at the Perimeter Institute about okay, well clearly,
link |
00:50:39.960
big finance can't be trusted, government's in its pockets
link |
00:50:42.220
with regulatory capture, what the F do we do?
link |
00:50:45.360
And all sorts of people, Nassim Taleb was there
link |
00:50:47.680
and Andy Lo from MIT was there and Bill Janeway,
link |
00:50:51.300
I mean just a lot of top billing people were there
link |
00:50:54.880
and he invited me and Travis and another one
link |
00:50:58.480
of our coworkers, Robert Kern, who is anyone
link |
00:51:01.320
in the SciPy, NumPy community knows Robert.
link |
00:51:03.960
Really great guy.
link |
00:51:04.780
So the three of us also got invited to go to this thing
link |
00:51:06.640
and that's where I met Brett Weinstein
link |
00:51:07.800
for the first time as well.
link |
00:51:09.520
Yeah, I knew him before he got all famous
link |
00:51:11.680
for unfortunate reasons, I guess.
link |
00:51:13.440
But anyway, so we met then and kind of had a friendship
link |
00:51:19.600
throughout since then.
link |
00:51:21.400
You have a depth of thinking that kind of runs
link |
00:51:26.080
with Eric in terms of just thinking about the world deeply
link |
00:51:28.600
and thinking philosophically and then there's Eric's
link |
00:51:31.720
interest in programming.
link |
00:51:33.480
I actually have never, you know, he'll bring up programming
link |
00:51:38.080
to me quite a bit as a metaphor for stuff.
link |
00:51:41.100
But I never kind of pushed the point of like,
link |
00:51:44.520
what's the nature of your interest in programming?
link |
00:51:46.820
I think he saw it probably as a tool.
link |
00:51:48.840
Yeah, absolutely.
link |
00:51:49.680
That you visualize, to explore mathematics
link |
00:51:52.200
and explore physics and I was wondering like,
link |
00:51:55.020
what's his depth of interest and also his vision
link |
00:51:59.520
for what programming would look like in the future.
link |
00:52:05.640
Have you had interaction with him, like discussion
link |
00:52:08.000
in the space of Python, programming?
link |
00:52:09.840
Well, in the sense of sometimes he asks me,
link |
00:52:11.920
why is this stuff still so hard?
link |
00:52:13.620
Yeah, you know, everybody's a critic.
link |
00:52:18.280
But actually, no, Eric.
link |
00:52:20.320
Programming, you mean, like in general?
link |
00:52:21.440
Yes, yes, well, not programming in general,
link |
00:52:23.320
but certain things in the Python ecosystem.
link |
00:52:25.560
But he actually, I think what I find in listening
link |
00:52:29.560
to some of his stuff is that he does use
link |
00:52:31.640
programming metaphors a lot, right?
link |
00:52:33.280
He'll talk about APIs or object oriented
link |
00:52:35.420
and things like that.
link |
00:52:36.400
So I think that's a useful set of frames
link |
00:52:39.240
for him to draw upon for discourse.
link |
00:52:42.880
I haven't pair programmed with him in a very long time.
link |
00:52:45.440
You've previously pair coded with Eric.
link |
00:52:47.240
Well, I mean, I look at his code trying to help
link |
00:52:49.520
like put together some of the visualizations
link |
00:52:50.960
around these things.
link |
00:52:51.800
But it's been a very, not really pair programmed,
link |
00:52:54.040
but like even looked at his code, right?
link |
00:52:55.800
I mean.
link |
00:52:56.640
How legendary would be is that like Git repo
link |
00:53:01.040
with Peter Wang and Eric Weinstein?
link |
00:53:02.680
Well, honestly, Robert Kern did all the heavy lifting.
link |
00:53:05.480
So I have to give credit where credit is due.
link |
00:53:06.880
Robert is the silent but incredibly deep, quiet,
link |
00:53:10.760
not silent, but quiet, but incredibly deep individual
link |
00:53:13.480
at the heart of a lot of those things
link |
00:53:14.720
that Eric was trying to do.
link |
00:53:16.720
But we did have, you know, as Travis and I
link |
00:53:19.160
were starting our company in 2012 timeframe,
link |
00:53:23.520
we went to New York.
link |
00:53:24.640
Eric was still in New York at the time.
link |
00:53:26.120
He hadn't moved to, this is before he joined Thiel Capital.
link |
00:53:29.160
We just had like a steak dinner somewhere.
link |
00:53:31.560
Maybe it was Keens, I don't know, somewhere in New York.
link |
00:53:33.400
So it was me, Travis, Eric, and then Wes McKinney,
link |
00:53:36.360
the creator of pandas, and then Wes's then business partner,
link |
00:53:39.520
Adam, the five of us sat around having this,
link |
00:53:42.040
just a hilarious time, amazing dinner.
link |
00:53:45.520
I forget what all we talked about,
link |
00:53:46.960
but it was one of those conversations,
link |
00:53:49.000
which I wish as soon as COVID is over,
link |
00:53:51.400
maybe Eric and I can sit down.
link |
00:53:53.080
Recreate.
link |
00:53:53.920
Recreate it somewhere in LA, or maybe he comes here,
link |
00:53:56.720
because a lot of cool people are here in Austin, right?
link |
00:53:58.160
Exactly.
link |
00:53:59.000
Yeah, we're all here.
link |
00:53:59.840
He should come here.
link |
00:54:00.680
Come here.
link |
00:54:01.520
Yeah.
link |
00:54:02.360
So he uses the metaphor source code sometimes
link |
00:54:04.120
to talk about physics.
link |
00:54:05.320
We figure out our own source code.
link |
00:54:07.160
So you with a physics background
link |
00:54:10.880
and somebody who's quite a bit of an expert in source code,
link |
00:54:14.160
do you think we'll ever figure out our own source code
link |
00:54:17.760
in the way that Eric means?
link |
00:54:19.040
Do you think we'll figure out the nature of reality?
link |
00:54:20.360
Well, I think we're constantly working on that problem.
link |
00:54:21.760
I mean, I think we'll make more and more progress.
link |
00:54:24.400
For me, there's some things I don't really doubt too much.
link |
00:54:28.120
Like, I don't really doubt that one day
link |
00:54:29.960
we will create a synthetic, maybe not fully in silicon,
link |
00:54:34.480
but a synthetic approach to
link |
00:54:39.240
cognition that rivals the biological
link |
00:54:42.680
20 watt computers in our heads.
link |
00:54:44.760
What's cognition here?
link |
00:54:46.080
Cognition.
link |
00:54:46.920
Which aspect?
link |
00:54:47.760
Perception, attention, memory, recall,
link |
00:54:49.960
asking better questions.
link |
00:54:51.840
That for me is a measure of intelligence.
link |
00:54:53.200
Doesn't Roomba vacuum cleaner already do that?
link |
00:54:55.400
Or do you mean, oh, it doesn't ask questions.
link |
00:54:57.080
I mean, no, it's, I mean, I have a Roomba,
link |
00:55:00.160
but it's not even as smart as my cat, right?
link |
00:55:03.080
Yeah, but it asks questions about what is this wall?
link |
00:55:05.320
It now, as a new feature, asks, is this poop or not, apparently.
link |
00:55:08.920
Yes, a lot of our current cybernetic system,
link |
00:55:11.320
it's a cybernetic system.
link |
00:55:12.640
It will go and it will happily vacuum up some poop, right?
link |
00:55:14.960
The older generations would.
link |
00:55:16.640
The new one, just released, does not vacuum up the poop.
link |
00:55:19.400
Okay.
link |
00:55:20.240
This is a commercial for.
link |
00:55:21.080
I wonder if it still gets stuck
link |
00:55:21.920
under my first rung of my stair.
link |
00:55:23.800
In any case, these cybernetic systems we have,
link |
00:55:27.160
they are molded, they're designed to be sent off
link |
00:55:32.080
into a relatively static environment.
link |
00:55:34.040
And whatever dynamic things happen in the environment,
link |
00:55:36.400
they have a very limited capacity to respond to.
link |
00:55:38.920
A human baby, a human toddler of 18 months of age
link |
00:55:43.080
has more capacity to manage its own attention
link |
00:55:45.840
and its own capacity to make better sense of the world
link |
00:55:49.200
than the most advanced robots today.
link |
00:55:51.720
So again, my cat, I think, can do a better job than my two,
link |
00:55:55.160
and they're both pretty clever.
link |
00:55:56.400
So I do think though, back to my kind of original point,
link |
00:55:59.440
I think that it's not, for me, it's not a question at all
link |
00:56:02.720
that we will be able to create synthetic systems
link |
00:56:05.960
that are able to do this better than the human,
link |
00:56:09.200
at an equal level or better than the human mind.
link |
00:56:11.720
It's also for me, not a question that we will be able
link |
00:56:16.400
to put them alongside humans
link |
00:56:20.160
so that they capture the full broad spectrum
link |
00:56:23.240
of what we are seeing as well.
link |
00:56:25.400
And also looking at our responses,
link |
00:56:28.040
listening to our responses,
link |
00:56:28.920
even maybe measuring certain vital signs about us.
link |
00:56:32.040
So in this kind of sidecar mode,
link |
00:56:34.440
a greater intelligence could use us
link |
00:56:37.600
and our whatever 80 years of life to train itself up
link |
00:56:42.080
and then be a very good simulacrum of us moving forward.
link |
00:56:45.080
So who is in the sidecar
link |
00:56:48.120
in that picture of the future exactly?
link |
00:56:50.440
The baby version of our immortal selves.
link |
00:56:52.960
Okay, so once the baby grows up,
link |
00:56:56.160
is there any use for humans?
link |
00:56:58.440
I think so.
link |
00:56:59.960
I think that out of epistemic humility,
link |
00:57:03.240
we need to keep humans around for a long time.
link |
00:57:05.600
And I would hope that anyone making those systems
link |
00:57:07.960
would believe that to be true.
link |
00:57:10.040
Out of epistemic humility,
link |
00:57:11.640
what's the nature of the humility that?
link |
00:57:13.480
That we don't know what we don't know.
link |
00:57:16.440
So we don't.
link |
00:57:18.960
Right?
link |
00:57:19.800
So we don't know.
link |
00:57:20.640
First we have to build systems
link |
00:57:21.680
that help us do the things that we do know about
link |
00:57:24.400
that can then probe the unknowns that we know about.
link |
00:57:26.760
But the unknown unknowns, we don't know.
link |
00:57:28.560
We could always know.
link |
00:57:30.040
Nature is the one thing
link |
00:57:31.160
that is infinitely able to surprise us.
link |
00:57:33.800
So we should keep biological humans around
link |
00:57:35.880
for a very, very, very long time.
link |
00:57:37.600
Even after our immortal selves have transcended
link |
00:57:40.440
and have gone off to explore other worlds,
link |
00:57:42.880
gone to go communicate with the lifeforms living in the sun
link |
00:57:45.200
or whatever else.
link |
00:57:46.040
So I think that's,
link |
00:57:49.200
for me, these seem like things that are going to happen.
link |
00:57:53.000
Like I don't really question that,
link |
00:57:54.480
that they're gonna happen.
link |
00:57:55.720
Assuming we don't completely destroy ourselves.
link |
00:57:58.240
Is it possible to create an AI system
link |
00:58:02.480
that you fall in love with and it falls in love with you
link |
00:58:06.160
and you have a romantic relationship with it?
link |
00:58:08.480
Or a deep friendship, let's say.
link |
00:58:10.760
I would hope that that is the design criteria
link |
00:58:12.680
for any of these systems.
link |
00:58:14.520
If we cannot have a meaningful relationship with it,
link |
00:58:18.480
then it's still just a chunk of silicon.
link |
00:58:20.320
So then what is meaningful?
link |
00:58:21.680
Because back to sugar.
link |
00:58:23.800
Well, sugar doesn't love you back, right?
link |
00:58:25.400
So the computer has to love you back.
link |
00:58:26.840
And what does love mean?
link |
00:58:28.200
Well, in this context, for me, love,
link |
00:58:30.160
I'm gonna take a page from Alain de Botton.
link |
00:58:32.040
Love means that it wants to help us
link |
00:58:34.400
become the best version of ourselves.
link |
00:58:36.640
Yes, that's beautiful.
link |
00:58:39.440
That's a beautiful definition of love.
link |
00:58:40.760
So what role does love play in the human condition
link |
00:58:45.360
at the individual level and at the group level?
link |
00:58:48.720
Because you were kind of saying that humans,
link |
00:58:51.120
we should really consider humans
link |
00:58:52.280
both at the individual and the group and the societal level.
link |
00:58:55.320
What's the role of love in this whole thing?
link |
00:58:56.960
We talked about sex, we talked about death,
link |
00:58:59.640
thanks to the bacteria that invented it.
link |
00:59:02.320
At which point did we invent love, by the way?
link |
00:59:04.320
I mean, is that also?
link |
00:59:05.560
No, I think love is the start of it all.
link |
00:59:08.960
And the feelings of, and this gets sort of beyond
link |
00:59:13.080
just romantic, sensual, whatever kind of things,
link |
00:59:16.760
but actually genuine love as we have for another person.
link |
00:59:19.720
Love as it would be used in a religious text, right?
link |
00:59:22.680
I think that capacity to feel love
link |
00:59:25.480
more than consciousness, that is the universal thing.
link |
00:59:28.440
Our feeling of love is actually a sense
link |
00:59:30.320
of that generativity.
link |
00:59:31.320
When we can look at another person
link |
00:59:33.120
and see that they can be something more than they are,
link |
00:59:37.440
and more than just a pigeonhole we might stick them in.
link |
00:59:42.480
I mean, I think there's, in any religious text,
link |
00:59:44.160
you'll find voiced some concept of this,
link |
00:59:47.640
that you should see the grace of God in the other person.
link |
00:59:50.920
They're made in the spirit of the love
link |
00:59:54.760
that God feels for his creation or her creation.
link |
00:59:57.120
And so I think this thing is actually the root of it.
link |
01:00:00.180
So I would say, I don't think molecules of water
link |
01:00:04.800
feel consciousness, have consciousness,
link |
01:00:06.280
but there is some proto micro quantum thing of love.
link |
01:00:10.920
That's the generativity when there's more energy
link |
01:00:14.080
than what they need to maintain equilibrium.
link |
01:00:16.280
And that when you sum it all up is something that leads to,
link |
01:00:19.800
I mean, I had my mind blown one day as an undergrad
link |
01:00:23.500
at the physics computer lab.
link |
01:00:24.560
I logged in and when you log into bash for a long time,
link |
01:00:28.300
there was a little fortune that would come out.
link |
01:00:29.600
And it said, man was created by water
link |
01:00:32.400
to carry itself uphill.
link |
01:00:33.940
And I was logging into work on some problem set
link |
01:00:37.960
and I logged in and I saw that and I just said,
link |
01:00:40.540
son of a bitch, I just, I logged out
link |
01:00:43.120
and I went to the coffee shop and I got a coffee
link |
01:00:45.080
and I sat there on the quad and I'm like,
link |
01:00:47.040
you know, it's not wrong and yet WTF, right?
link |
01:00:53.440
So when you look at it that way,
link |
01:00:55.320
it's like, yeah, okay, non equilibrium physics is a thing.
link |
01:00:59.040
And so when we think about love,
link |
01:01:00.960
when we think about these kinds of things, I would say
link |
01:01:05.360
that in the modern day human condition,
link |
01:01:08.160
there's a lot of talk about freedom and individual liberty
link |
01:01:12.300
and rights and all these things,
link |
01:01:14.600
but that's very Hegelian, it's very kind of following
link |
01:01:18.120
from the Western philosophy of the individual as sacrosanct,
link |
01:01:22.760
but it's not really couched I think the right way
link |
01:01:26.360
because it should be how do we maximize people's ability
link |
01:01:29.460
to love each other, to love themselves first,
link |
01:01:32.100
to love each other, their responsibilities
link |
01:01:34.560
to the previous generation, to the future generations.
link |
01:01:37.800
Those are the kinds of things
link |
01:01:39.280
that should be our design criteria, right?
link |
01:01:41.840
Those should be what we start with to then come up
link |
01:01:45.720
with the philosophies of self and of rights
link |
01:01:48.280
and responsibilities, but that love being at the center
link |
01:01:52.080
of it, I think when we design systems for cognition,
link |
01:01:56.680
it should absolutely be built that way.
link |
01:01:58.700
I think if we simply focus on efficiency and productivity,
link |
01:02:02.240
these kind of very industrial era,
link |
01:02:05.900
all the things that Marx had issues with, right?
link |
01:02:08.200
Those, that's a way to go and really I think go off
link |
01:02:11.880
the deep end in the wrong way.
link |
01:02:13.600
So one of the interesting consequences of thinking of life
link |
01:02:19.200
in this hierarchical way of an individual human
link |
01:02:22.440
and then there's groups and there's societies
link |
01:02:25.040
is I believe that you believe that corporations are people.
link |
01:02:31.680
So this is a kind of a politically dense idea,
link |
01:02:36.600
all those kinds of things.
link |
01:02:37.440
If we just throw politics aside,
link |
01:02:39.100
if we throw all of that aside,
link |
01:02:41.300
in which sense do you believe that corporations are people?
link |
01:02:46.260
And how does love connect to that?
link |
01:02:47.760
Right, so the belief is that groups of people
link |
01:02:52.040
have some kind of higher level, I would say mesoscopic
link |
01:02:55.680
claim to agency.
link |
01:02:57.960
So where do I, let's start with this.
link |
01:03:00.920
Most people would say, okay, individuals have claims
link |
01:03:03.720
to agency and sovereignty.
link |
01:03:05.140
Nations, we certainly act as if nations,
link |
01:03:07.640
so at a very large, large scale,
link |
01:03:09.520
nations have rights to sovereignty and agency.
link |
01:03:13.120
Like everyone plays the game of modernity
link |
01:03:15.000
as if that's true, right?
link |
01:03:16.320
We believe France is a thing,
link |
01:03:17.380
we believe the United States is a thing.
link |
01:03:18.800
But to say that groups of people at a smaller level
link |
01:03:23.100
than that, like a family unit is the thing.
link |
01:03:26.640
Well, in our laws, we actually do encode this concept.
link |
01:03:30.440
I believe that in a relationship and a marriage, right,
link |
01:03:33.800
one partner can sue for loss of consortium, right?
link |
01:03:37.720
If someone breaks up the marriage or whatever.
link |
01:03:39.820
So these are concepts that even in law,
link |
01:03:41.560
we do respect that there is something about the union
link |
01:03:44.720
and about the family.
link |
01:03:45.980
So for me, I don't think it's so weird to think
link |
01:03:48.640
that groups of people have a right to,
link |
01:03:51.820
a claim to rights and sovereignty of some degree.
link |
01:03:54.640
I mean, we look at our clubs, we look at churches.
link |
01:03:59.040
These are, we talk about these collectives of people
link |
01:04:02.000
as if they have a real agency to them, and they do.
link |
01:04:05.800
But I think if we take that one step further and say,
link |
01:04:08.720
okay, they can accrue resources.
link |
01:04:10.320
Well, yes, check, you know, and by law they can.
link |
01:04:13.760
They can own land, they can engage in contracts,
link |
01:04:17.060
they can do all these different kinds of things.
link |
01:04:18.840
So we in legal terms support this idea
link |
01:04:22.600
that groups of people have rights.
link |
01:04:26.200
Where we go wrong on this stuff
link |
01:04:28.040
is that the most popular version of this
link |
01:04:31.280
is the for profit absentee owner corporation
link |
01:04:35.360
that then is able to amass larger resources
link |
01:04:38.440
than anyone else in the landscape, anything else,
link |
01:04:40.840
any other entity of equivalent size.
link |
01:04:42.760
And they're able to essentially bully around individuals,
link |
01:04:45.480
whether it's laborers, whether it's people
link |
01:04:47.040
whose resources they want to capture.
link |
01:04:48.960
They're also able to bully around
link |
01:04:50.480
our system of representation,
link |
01:04:52.160
which is still tied to individuals, right?
link |
01:04:55.540
So I don't believe that's correct.
link |
01:04:58.520
I don't think it's good that they, you know,
link |
01:05:01.160
they're people, but they're assholes.
link |
01:05:02.320
I don't think that corporations as people
link |
01:05:03.680
acting like assholes is a good thing.
link |
01:05:05.520
But the idea that collectives and collections of people
link |
01:05:08.440
that we should treat them philosophically
link |
01:05:10.120
as having some agency and some mass,
link |
01:05:15.120
at a mesoscopic level, I think that's an important thing
link |
01:05:18.000
because one thing I do think we underappreciate sometimes
link |
01:05:22.400
is the fact that relationships have relationships.
link |
01:05:26.200
So it's not just individuals
link |
01:05:27.240
having relationships with each other.
link |
01:05:29.120
But if you have eight people seated around a table, right?
link |
01:05:32.080
Each person has a relationship with each of the others
link |
01:05:34.200
and that's obvious.
link |
01:05:35.640
But then if it's four couples,
link |
01:05:37.880
each couple also has a relationship
link |
01:05:39.720
with each of the other couples, right?
link |
01:05:41.720
The dyads do.
link |
01:05:42.720
And if it's couples, but one is the, you know,
link |
01:05:45.600
father and mother older, and then, you know,
link |
01:05:48.200
one of their children and their spouse,
link |
01:05:50.960
that family unit of four has a relationship
link |
01:05:53.980
with the other family unit of four.
link |
01:05:55.760
So the idea that relationships have relationships
link |
01:05:57.640
is something that we intuitively know
link |
01:05:59.920
in navigating the social landscape,
link |
01:06:01.780
but it's not something I hear expressed like that.
link |
01:06:04.720
It's certainly not something that is,
link |
01:06:06.680
I think, taken into account very well
link |
01:06:08.160
when we design these kinds of things.
link |
01:06:09.480
So I think the reason why I care a lot about this
link |
01:06:14.080
is because I think the future of humanity
link |
01:06:16.520
requires us to form better
link |
01:06:19.660
collective sense making units at something, you know,
link |
01:06:23.680
around the Dunbar number, you know, half to 5X Dunbar.
link |
01:06:28.160
And that's very different than right now
link |
01:06:30.640
where we defer sense making
link |
01:06:33.600
to massive aging zombie institutions.
link |
01:06:37.040
Or we just do it ourselves.
link |
01:06:38.520
Go it alone.
link |
01:06:39.360
Go to the dark forest of the internet by ourselves.
link |
01:06:41.120
So that's really interesting.
link |
01:06:42.360
So you've talked about agency,
link |
01:06:45.340
I think maybe calling it a convenient fiction
link |
01:06:47.760
at all these different levels.
link |
01:06:49.620
So even at the human individual level,
link |
01:06:52.080
it's kind of a fiction.
link |
01:06:53.160
We all believe, because we are, like you said,
link |
01:06:55.080
made of cells and cells are made of atoms.
link |
01:06:57.720
So that's a useful fiction.
link |
01:06:58.980
And then there's nations that seems to be a useful fiction,
link |
01:07:02.900
but it seems like some fictions are better than others.
link |
01:07:06.900
You know, there's a lot of people that argue
link |
01:07:08.400
the fiction of nation is a bad idea.
link |
01:07:11.000
One of them lives two doors down from me,
link |
01:07:13.880
Michael Malice, he's an anarchist.
link |
01:07:16.160
You know, I'm sure there's a lot of people
link |
01:07:18.280
who are into meditation that believe the idea,
link |
01:07:21.860
this useful fiction of agency of an individual
link |
01:07:24.920
is troublesome as well.
link |
01:07:26.680
We need to let go of that in order to truly,
link |
01:07:29.460
like to transcend, I don't know.
link |
01:07:32.320
I don't know what words you want to use,
link |
01:07:33.760
but suffering or to elevate the experience of life.
link |
01:07:38.440
So you're kind of arguing that,
link |
01:07:40.800
okay, so we have some of these useful fictions of agency.
link |
01:07:44.680
We should add a stronger fiction that we tell ourselves
link |
01:07:49.000
about the agency of groups in the hundreds,
link |
01:07:52.700
on the order of half a Dunbar's number to 5X Dunbar's number.
link |
01:07:57.920
Yeah, something on that order.
link |
01:07:58.760
And we call them fictions,
link |
01:07:59.800
but really they're rules of the game, right?
link |
01:08:01.580
Rules that we feel are fair or rules that we consent to.
link |
01:08:05.720
Yeah, I always question the rules
link |
01:08:07.000
when I lose, like at Monopoly.
link |
01:08:08.600
That's when I usually question the rules.
link |
01:08:09.840
When I'm winning, I don't question the rules.
link |
01:08:11.600
We should play a game of Monopoly someday.
link |
01:08:12.880
There's a trippy version of it that we could do.
link |
01:08:15.240
What kind?
link |
01:08:16.320
Contract Monopoly was introduced to me by a friend of mine,
link |
01:08:19.200
where you can write contracts on future earnings
link |
01:08:23.120
or landing on various things.
link |
01:08:24.600
And you can hand out like, you know,
link |
01:08:26.720
the first three times you land
link |
01:08:28.080
on Park Place, it's free or whatever.
link |
01:08:30.160
And then you can start trading those contracts for money.
link |
01:08:33.480
And then you create a human civilization
link |
01:08:36.760
and somehow Bitcoin comes into it.
link |
01:08:38.280
Okay, but some of these.
link |
01:08:40.520
Actually, I bet if me and you and Eric sat down
link |
01:08:43.040
to play a game Monopoly and we were to make NFTs
link |
01:08:45.320
out of the contracts we wrote, we could make a lot of money.
link |
01:08:48.000
Now it's a terrible idea.
link |
01:08:49.320
I would never do it,
link |
01:08:50.200
but I bet we could actually sell the NFTs around.
link |
01:08:52.920
I have other ideas to make money that I could tell you
link |
01:08:56.800
and they're all terrible ideas.
link |
01:08:58.400
Yeah, including cat videos on the internet.
link |
01:09:02.280
Okay, but some of these rules of the game,
link |
01:09:04.600
some of these fictions are,
link |
01:09:06.040
it seems like they're better than others.
link |
01:09:09.280
They have worked thus far to cohere,
link |
01:09:13.320
to organize human collective action.
link |
01:09:14.840
But you're saying something about,
link |
01:09:16.600
especially this technological age
link |
01:09:19.240
requires modified fictions, stories of agency.
link |
01:09:23.640
Why the Dunbar number?
link |
01:09:25.040
And also, you know, how do you select the group of people?
link |
01:09:28.240
You know, Dunbar's number, I think I have the sense
link |
01:09:31.760
that it's overused as a kind of law
link |
01:09:36.360
that somehow we can have deep human connection at this scale.
link |
01:09:41.280
Like some of it feels like an interface problem too.
link |
01:09:45.440
It feels like if I have the right tools,
link |
01:09:48.080
I can deeply connect with a larger number of people.
link |
01:09:51.840
It just feels like there's a huge value
link |
01:09:55.480
to interacting just in person, getting to share
link |
01:09:59.400
traumatic experiences together,
link |
01:10:00.960
beautiful experiences together.
link |
01:10:02.720
There's other experiences like that in the digital space
link |
01:10:06.920
that you can share.
link |
01:10:07.760
It just feels like Dunbar's number
link |
01:10:09.360
could be expanded significantly,
link |
01:10:10.800
perhaps not to the level of millions and billions,
link |
01:10:15.000
but it feels like it could be expanded.
link |
01:10:16.280
So how do we find the right interface, you think,
link |
01:10:21.680
for having a little bit of a collective here
link |
01:10:24.840
that has agency?
link |
01:10:26.080
You're right that there's many different ways
link |
01:10:28.080
that we can build trust with each other.
link |
01:10:30.000
Yeah.
link |
01:10:30.840
My friend Joe Edelman talks about a few different ways
link |
01:10:33.960
that, you know, mutual appreciation, trustful conflict,
link |
01:10:39.360
just experiencing something like, you know,
link |
01:10:41.320
there's a variety of different things that we can do,
link |
01:10:43.640
but all those things take time and you have to be present.
link |
01:10:48.480
The less present you are, I mean, there's just, again,
link |
01:10:50.320
a no free lunch principle here.
link |
01:10:51.560
The less present you are, the more of them you can do,
link |
01:10:54.200
but then the less connection you build.
link |
01:10:56.800
So I think there is sort of a human capacity issue
link |
01:10:59.440
around some of these things.
link |
01:11:00.280
Now, that being said, if we can use certain technologies,
link |
01:11:04.800
so for instance, if I write a little monograph
link |
01:11:07.520
on my view of the world,
link |
01:11:08.720
you read it asynchronously at some point,
link |
01:11:10.600
and you're like, wow, Peter, this is great.
link |
01:11:11.920
Here's mine.
link |
01:11:12.800
I read it.
link |
01:11:13.640
I'm like, wow, Lex, this is awesome.
link |
01:11:15.320
We can be friends without having to spend 10 years,
link |
01:11:18.720
you know, figuring all this stuff out together.
link |
01:11:20.560
We just read each other's thing and be like,
link |
01:11:22.200
oh yeah, this guy's exactly in my wheelhouse
link |
01:11:24.800
and vice versa.
link |
01:11:26.080
And we can then, you know, connect just a few times a year
link |
01:11:30.520
and maintain a high trust relationship.
link |
01:11:33.040
It can be expanded a little bit,
link |
01:11:34.320
but it also requires,
link |
01:11:35.840
these things are not all technological in nature.
link |
01:11:37.320
It requires the individual themselves
link |
01:11:39.640
to have a certain level of capacity,
link |
01:11:41.680
to have a certain lack of neuroticism, right?
link |
01:11:44.760
If you want to use, like, the OCEAN Big Five sort of model,
link |
01:11:48.080
people have to be pretty centered.
link |
01:11:49.680
The less centered you are,
link |
01:11:50.640
the fewer authentic connections you can really build
link |
01:11:52.840
for a particular unit of time.
link |
01:11:54.800
It just takes more time.
link |
01:11:55.840
Other people have to put up with your crap.
link |
01:11:57.160
Like there's just a lot of the stuff
link |
01:11:58.120
that you have to deal with
link |
01:12:00.000
if you are not so well balanced, right?
link |
01:12:02.240
So yes, we can help people get better
link |
01:12:04.760
to where they can develop more relationships faster,
link |
01:12:06.760
and then you can maybe expand Dunbar number by quite a bit,
link |
01:12:09.560
but you're not going to do it.
link |
01:12:10.640
I think it's going to be hard to get it beyond 10X,
link |
01:12:12.880
kind of the rough swag of what it is, you know?
link |
01:12:16.280
Well, don't you think that AI systems could be an addition
link |
01:12:20.840
to the Dunbar's number?
link |
01:12:22.640
So like why?
link |
01:12:23.480
Do you count as one system or multiple AI systems?
link |
01:12:25.600
Multiple AI systems.
link |
01:12:26.560
So I do believe that AI systems,
link |
01:12:28.600
for them to integrate into human society as it is now,
link |
01:12:31.360
have to have a sense of agency.
link |
01:12:32.600
So there has to be an individual
link |
01:12:35.280
because otherwise we wouldn't relate to them.
link |
01:12:37.600
We could engage certain kinds of individuals
link |
01:12:40.240
to make sense of them for us and be almost like,
link |
01:12:42.680
did you ever watch Star Trek?
link |
01:12:44.960
Like Voyager, like there's the Vorta,
link |
01:12:46.480
who are like the interfaces,
link |
01:12:47.640
the ambassadors for the Dominion.
link |
01:12:50.480
We may have ambassadors that speak
link |
01:12:53.120
on behalf of these systems.
link |
01:12:54.400
They're like the Mentats of Dune, maybe,
link |
01:12:56.160
or something like this.
link |
01:12:57.280
I mean, we already have this to some extent.
link |
01:12:59.320
If you look at the biggest sort of,
link |
01:13:01.160
I wouldn't say AI system,
link |
01:13:02.120
but the biggest cybernetic system in the world
link |
01:13:04.080
is the financial markets.
link |
01:13:05.160
It runs outside of any individual's control,
link |
01:13:08.000
and you have an entire stack of people on Wall Street,
link |
01:13:09.920
from Wall Street analysts to CNBC reporters, whatever.
link |
01:13:13.240
They're all helping to communicate what does this mean?
link |
01:13:16.920
You know, like Jim Cramer,
link |
01:13:18.120
like coming around and yelling and stuff.
link |
01:13:19.560
Like all of these people are part of that lowering
link |
01:13:22.560
of the complexity there to make sense,
link |
01:13:26.320
you know, to help do sense making for people
link |
01:13:28.440
at whatever capacity they're at.
link |
01:13:29.800
And I don't see this changing with AI systems.
link |
01:13:31.560
I think you would have ringside commentators
link |
01:13:33.400
talking about all this stuff
link |
01:13:34.560
that this AI system is trying to do over here, over here,
link |
01:13:36.600
because it's actually a super intelligence.
link |
01:13:39.120
So if you want to talk about humans interfacing,
link |
01:13:40.800
making first contact with the super intelligence,
link |
01:13:42.480
we're already there.
link |
01:13:43.600
We do it pretty poorly.
link |
01:13:44.800
And if you look at the gradient of power and money,
link |
01:13:47.240
what happens is the people closest to it
link |
01:13:48.800
will absolutely exploit their distance
link |
01:13:50.960
for personal financial gain.
link |
01:13:54.360
So we should look at that and be like,
link |
01:13:56.080
oh, well, that's probably what the future
link |
01:13:57.320
will look like as well.
link |
01:13:58.880
But nonetheless, I mean,
link |
01:14:00.240
we're already doing this kind of thing.
link |
01:14:01.360
So in the future, we can have AI systems,
link |
01:14:03.800
but you're still gonna have to trust people
link |
01:14:05.720
to bridge the sense making gap to them.
link |
01:14:08.400
See, I just feel like there could be
link |
01:14:10.800
like millions of AI systems that have agency.
link |
01:14:15.400
You have,
link |
01:14:17.080
when you say one super intelligence,
link |
01:14:19.480
super intelligence in that context means
link |
01:14:22.280
it's able to solve particular problems extremely well.
link |
01:14:26.080
But there's some aspect of human like intelligence
link |
01:14:29.240
that's necessary to be integrated into human society.
link |
01:14:32.320
So not financial markets,
link |
01:14:33.720
not sort of weather prediction systems,
link |
01:14:36.760
or I don't know, logistics optimization.
link |
01:14:39.680
I'm more referring to things that you interact with
link |
01:14:43.240
on the intellectual level.
link |
01:14:45.120
And that I think requires,
link |
01:14:47.120
there has to be a backstory.
link |
01:14:48.920
There has to be a personality.
link |
01:14:50.080
I believe it has to fear its own mortality in a genuine way.
link |
01:14:53.320
Like there has to be all,
link |
01:14:56.560
many of the elements that we humans experience
link |
01:14:59.680
that are fundamental to the human condition,
link |
01:15:01.920
because otherwise we would not have
link |
01:15:03.840
a deep connection with it.
link |
01:15:05.840
But I don't think having a deep connection with it
link |
01:15:07.800
is necessarily going to stop us from building a thing
link |
01:15:10.600
that has quite an alien intelligence aspect here.
link |
01:15:13.360
So the other kind of alien intelligence on this planet
link |
01:15:16.640
is the octopuses or octopodes
link |
01:15:18.640
or whatever you wanna call them.
link |
01:15:19.720
Octopi. Octopi, yeah.
link |
01:15:21.000
There's a little controversy
link |
01:15:22.400
as to what the plural is, I guess.
link |
01:15:23.720
But an octopus. I look forward to your letters.
link |
01:15:26.560
Yeah, an octopus,
link |
01:15:30.360
it really acts as a collective intelligence
link |
01:15:32.120
of eight intelligent arms, right?
link |
01:15:34.320
Its arms have a tremendous amount of neural density to them.
link |
01:15:37.000
And I see if we can build,
link |
01:15:40.360
I mean, just let's go with what you're saying.
link |
01:15:42.000
If we build a singular intelligence
link |
01:15:44.400
that interfaces with humans that has a sense of agency
link |
01:15:48.080
so it can run the cybernetic loop
link |
01:15:49.600
and develop its own theory of mind
link |
01:15:51.080
as well as its theory of action,
link |
01:15:52.960
all these things, I agree with you
link |
01:15:54.040
that that's the necessary components
link |
01:15:56.240
to build a real intelligence, right?
link |
01:15:57.800
There's gotta be something at stake.
link |
01:15:58.720
It's gotta make a decision.
link |
01:16:00.000
It's gotta then run the OODA loop.
link |
01:16:01.280
Okay, so we build one of those.
link |
01:16:03.000
Well, if we can build one of those,
link |
01:16:03.880
we can probably build 5 million of them.
link |
01:16:05.640
So we build 5 million of them.
link |
01:16:07.400
And if their cognitive systems are already digitized
link |
01:16:09.960
and already kind of there,
link |
01:16:12.000
we stick an antenna on each of them,
link |
01:16:13.720
bring it all back to a hive mind
link |
01:16:15.440
that maybe doesn't make all the individual decisions
link |
01:16:17.640
for them, but treats each one
link |
01:16:19.360
as almost like a neuronal input
link |
01:16:21.480
of a much higher bandwidth and fidelity,
link |
01:16:23.880
going back to a central system
link |
01:16:25.920
that is then able to perceive much broader dynamics
link |
01:16:30.240
that we can't see.
link |
01:16:31.120
In the same way that a phased array radar, right?
link |
01:16:32.600
You think about how phased array radar works.
link |
01:16:34.440
It's just sensitivity.
link |
01:16:36.200
It's just radars, and then it's hypersensitivity
link |
01:16:39.120
and really great timing between all of them.
link |
01:16:41.160
And with a flat array,
link |
01:16:42.600
it's as good as a curved radar dish, right?
link |
01:16:44.740
So with these things,
link |
01:16:45.580
it's a phased array of cybernetic systems
link |
01:16:47.800
that'll give the centralized intelligence
link |
01:16:51.280
much, much better, a much higher fidelity understanding
link |
01:16:55.040
of what's actually happening in the environment.
link |
01:16:56.600
But the more power,
link |
01:16:57.720
the more understanding the central super intelligence has,
link |
01:17:02.480
the dumber the individual like fingers
link |
01:17:06.600
of this intelligence are, I think.
link |
01:17:08.080
I think you...
link |
01:17:08.920
Not necessarily.
link |
01:17:09.760
In my sense...
link |
01:17:10.580
I don't see why that has to be.
link |
01:17:11.420
My argument is that
link |
01:17:13.560
the experience of the individual agent
link |
01:17:15.660
has to have the full richness of the human like experience.
link |
01:17:20.840
You have to be able to be driving the car in the rain,
link |
01:17:23.800
listening to Bruce Springsteen,
link |
01:17:25.240
and all of a sudden break out in tears
link |
01:17:28.280
because remembering something that happened to you
link |
01:17:30.580
in high school.
link |
01:17:31.420
We can implant those memories
link |
01:17:32.240
if that's really needed.
link |
01:17:33.080
But no, I'm...
link |
01:17:33.920
No, but the central agency,
link |
01:17:34.960
like I guess I'm saying for, in my view,
link |
01:17:37.720
for intelligence to be born,
link |
01:17:39.620
you have to have a decentralization.
link |
01:17:43.860
Like each one has to struggle and reach.
link |
01:17:47.280
So each one, when it has excess energy, has to reach for order
link |
01:17:51.920
as opposed to a central place doing so.
link |
01:17:54.420
Have you ever read like some sci fi
link |
01:17:55.640
where there's like hive minds?
link |
01:17:58.820
Like Vernor Vinge, I think, has one of these.
link |
01:18:01.080
And then some of the stuff from the Commonwealth Saga,
link |
01:18:05.160
the idea that you're an individual,
link |
01:18:06.960
but you're connected with like a few other individuals
link |
01:18:09.240
telepathically as well.
link |
01:18:10.300
And together you form a swarm.
link |
01:18:12.560
So if you are, I ask you,
link |
01:18:14.580
what do you think is the experience of if you are like,
link |
01:18:18.040
well, a Borg, right?
link |
01:18:18.920
If you are one, if you're part of this hive mind,
link |
01:18:22.600
outside of all the aesthetics, forget the aesthetics,
link |
01:18:25.360
internally, what is your experience like?
link |
01:18:28.360
Because I have a theory as to what that looks like.
link |
01:18:30.600
The one question I have for you about that experience is
link |
01:18:34.280
how much is there a feeling of freedom, of free will?
link |
01:18:38.560
Because I obviously as a human, very unbiased,
link |
01:18:43.280
but also somebody who values freedom and biased,
link |
01:18:46.120
it feels like the experience of freedom is essential for
link |
01:18:52.800
trying stuff out, to being creative
link |
01:18:55.960
and doing something truly novel, which is at the core of.
link |
01:18:59.080
Yeah, well, I don't think you have to lose any freedom
link |
01:19:00.920
when you're in that mode.
link |
01:19:02.040
Because I think what happens is we think,
link |
01:19:04.560
we still think, I mean, you're still thinking about this
link |
01:19:06.920
in a sense of a top down command and control hierarchy,
link |
01:19:09.800
which is not what it has to be at all.
link |
01:19:12.280
I think the experience, so I'll just show by cards here.
link |
01:19:16.040
I think the experience of being a robot in that robot swarm,
link |
01:19:19.720
a robot who has agency over their own local environment
link |
01:19:22.840
that's doing sense making
link |
01:19:23.880
and reporting it back to the hive mind,
link |
01:19:25.880
I think that robot's experience would be one,
link |
01:19:28.840
when the hive mind is working well,
link |
01:19:31.000
it would be an experience of like talking to God, right?
link |
01:19:34.360
That you essentially are reporting to,
link |
01:19:37.480
you're sort of saying, here's what I see.
link |
01:19:38.760
I think this is what's gonna happen over here.
link |
01:19:40.200
I'm gonna go do this thing.
link |
01:19:41.240
Because I think if I'm gonna do this,
link |
01:19:42.840
this will make this change happen in the environment.
link |
01:19:45.400
And then God, she may tell you, that's great.
link |
01:19:50.400
And in fact, your brothers and sisters will join you
link |
01:19:52.400
to help make this go better, right?
link |
01:19:54.040
And then she can let your brothers and sisters know,
link |
01:19:56.440
hey, Peter's gonna go do this thing.
link |
01:19:58.680
Would you like to help him?
link |
01:19:59.880
Because we think that this will make this thing go better.
link |
01:20:01.640
And they'll say, yes, we'll help him.
link |
01:20:03.000
So the whole thing could be actually very emergent.
link |
01:20:05.560
The sense of, what does it feel like to be a cell
link |
01:20:09.160
and a network that is alive, that is generative.
link |
01:20:11.880
And I think actually the feeling is serendipity.
link |
01:20:16.040
That there's random order, not random disorder or chaos,
link |
01:20:20.360
but random order, just when you need to hear Bruce Springsteen,
link |
01:20:24.040
you turn on the radio and bam, it's Bruce Springsteen, right?
link |
01:20:28.040
That feeling of serendipity, I feel like,
link |
01:20:30.600
this is a bit of a flight of fancy,
link |
01:20:31.880
but every cell in your body must have,
link |
01:20:35.240
what does it feel like to be a cell in your body?
link |
01:20:37.320
When it needs sugar, there's sugar.
link |
01:20:39.320
When it needs oxygen, there's just oxygen.
link |
01:20:41.640
Now, when it needs to go and do its work
link |
01:20:43.080
and pull like as one of your muscle fibers, right?
link |
01:20:46.120
It does its work and it's great.
link |
01:20:48.120
It contributes to the cause, right?
link |
01:20:49.560
So this is all, again, a flight of fancy,
link |
01:20:51.960
but I think as we extrapolate up,
link |
01:20:53.880
what does it feel like to be an independent individual
link |
01:20:56.520
with some bounded sense of freedom?
link |
01:20:58.200
All sense of freedom is actually bounded,
link |
01:20:59.640
but it was a bounded sense of freedom
link |
01:21:01.240
that still lives within a network that has order to it.
link |
01:21:04.360
And I feel like it has to be a feeling of serendipity.
link |
01:21:06.600
So the cell, there's a feeling of serendipity, even though.
link |
01:21:10.680
It has no way of explaining why it's getting oxygen
link |
01:21:12.680
and sugar when it gets it.
link |
01:21:13.480
So you have to, each individual component has to be too dumb
link |
01:21:17.400
to understand the big picture.
link |
01:21:19.240
No, the big picture is bigger than what it can understand.
link |
01:21:22.840
But isn't that an essential characteristic
link |
01:21:24.680
of the individual is to be too dumb
link |
01:21:27.880
to understand the bigger picture.
link |
01:21:29.480
Like not dumb necessarily,
link |
01:21:31.160
but limited in its capacity to understand.
link |
01:21:33.960
Because the moment you understand,
link |
01:21:36.680
I feel like that leads to, if you tell me now
link |
01:21:41.000
that there are some bigger intelligence
link |
01:21:43.480
controlling everything I do,
link |
01:21:45.640
intelligence broadly defined, meaning like,
link |
01:21:47.960
you know, even the Sam Harris thing, there's no free will.
link |
01:21:51.480
If I'm smart enough to truly understand that that's the case,
link |
01:21:56.360
that's gonna, I don't know if I.
link |
01:21:58.840
We have a philosophical breakdown, right?
link |
01:22:00.920
Because we're in the West and we're pumped full of this stuff
link |
01:22:03.560
of like, you are a golden, fully free individual
link |
01:22:06.760
with all your freedoms and all your liberties
link |
01:22:08.360
and go grab a gun and shoot whatever you want to.
link |
01:22:10.520
No, it's actually, you don't actually have a lot of these,
link |
01:22:14.360
you're not unconstrained,
link |
01:22:15.800
but the areas where you can manifest agency,
link |
01:22:20.120
you're free to do those things.
link |
01:22:21.720
You can say whatever you want on this podcast.
link |
01:22:23.160
You can create a podcast, right?
link |
01:22:24.280
Yeah.
link |
01:22:24.760
You're not, I mean, you have a lot of this kind of freedom,
link |
01:22:27.720
but even as you're doing this, you are actually,
link |
01:22:30.040
I guess the denouement of this is that
link |
01:22:33.480
we are already intelligent agents in such a system, right?
link |
01:22:37.720
In that one of these like robots
link |
01:22:39.960
of one of 5 million little swarm robots
link |
01:22:42.280
or one of the Borg,
link |
01:22:43.480
they're just posting on an internal bulletin board.
link |
01:22:45.320
I mean, maybe the Borg cube
link |
01:22:46.200
is just a giant Facebook machine floating in space
link |
01:22:48.440
and everyone's just posting on there.
link |
01:22:50.360
They're just posting really fast and like, oh yeah.
link |
01:22:52.680
It's called the metaverse now.
link |
01:22:53.720
That's called the metaverse, that's right.
link |
01:22:54.840
Here's the enterprise.
link |
01:22:55.480
Maybe we should all go shoot it.
link |
01:22:56.440
Yeah, everyone upvotes and they're gonna go shoot it, right?
link |
01:22:58.840
But we already are part of a human online
link |
01:23:02.120
collaborative environment
link |
01:23:03.640
and collaborative sensemaking system.
link |
01:23:05.640
It's not very good yet.
link |
01:23:07.240
It's got the overhangs of zombie sensemaking institutions
link |
01:23:10.760
all over it, but as that washes away
link |
01:23:13.240
and as we get better at this,
link |
01:23:15.400
we are going to see humanity improving
link |
01:23:18.520
at speeds that are unthinkable in the past.
link |
01:23:21.800
And it's not because anyone's freedoms were limited.
link |
01:23:23.640
In fact, the open source,
link |
01:23:24.520
and we started this with open source software, right?
link |
01:23:26.680
The collaboration, what the internet surfaced
link |
01:23:29.240
was the ability for people all over the world
link |
01:23:31.160
to collaborate and produce some of the most
link |
01:23:32.920
foundational software that's in use today, right?
link |
01:23:35.800
That entire ecosystem was created
link |
01:23:36.920
by collaborators all over the place.
link |
01:23:38.840
So these online kind of swarm kind of things
link |
01:23:42.760
are not novel.
link |
01:23:44.280
It's just, I'm just suggesting that future AI systems,
link |
01:23:47.320
if you can build one smart system,
link |
01:23:49.480
you have no reason not to build multiple.
link |
01:23:51.400
If you build multiple,
link |
01:23:52.120
there's no reason not to integrate them all
link |
01:23:53.800
into a collective sensemaking substrate.
link |
01:23:57.720
And that thing will certainly have emergent intelligence
link |
01:24:00.360
that none of the individuals
link |
01:24:01.720
and probably not any of the human designers
link |
01:24:03.400
will be able to really put a bow around and explain.
link |
01:24:06.680
But in some sense, would that AI system
link |
01:24:09.160
still be able to go to, like, rural Texas,
link |
01:24:13.160
buy a ranch, go off the grid, go full survivalist?
link |
01:24:16.840
Like, can you disconnect from the hive mind?
link |
01:24:20.200
You may not want to.
link |
01:24:25.080
So to be effective, to be intelligent.
link |
01:24:27.960
You have access to way more intelligence capability
link |
01:24:30.200
if you're plugged into five million other
link |
01:24:31.560
really, really smart cyborgs.
link |
01:24:33.320
Why would you leave?
link |
01:24:34.840
So like there's a word control that comes to mind.
link |
01:24:37.400
So it doesn't feel like control,
link |
01:24:39.800
like overbearing control.
link |
01:24:43.080
It's just knowledge.
link |
01:24:44.600
I think systems, well, this is to your point.
link |
01:24:46.360
I mean, look at how much,
link |
01:24:47.640
how uncomfortable you are with this concept, right?
link |
01:24:49.800
I think systems that feel like overbearing control
link |
01:24:52.280
will not evolutionarily win out.
link |
01:24:54.760
I think systems that give their individual elements
link |
01:24:57.480
the feeling of serendipity and the feeling of agency
link |
01:25:00.280
that that will, those systems will win.
link |
01:25:04.280
But that's not to say that there will not be
link |
01:25:05.720
emergent higher level order on top of it.
link |
01:25:09.560
And that's the thing, that's the philosophical breakdown
link |
01:25:11.320
that we're staring right at,
link |
01:25:13.480
which is in the Western mind,
link |
01:25:14.840
I think there's a very sharp delineation
link |
01:25:17.560
between explicit control,
link |
01:25:21.240
Cartesian, like what is the vector?
link |
01:25:23.320
Where is the position?
link |
01:25:24.280
Where is it going?
link |
01:25:25.560
It's completely deterministic.
link |
01:25:27.240
And kind of this idea that things emerge.
link |
01:25:30.840
Everything we see is the emergent patterns
link |
01:25:32.840
of other things.
link |
01:25:33.960
And there is agency when there's extra energy.
link |
01:25:38.760
So you have spoken about a kind of meaning crisis
link |
01:25:42.200
that we're going through.
link |
01:25:44.520
But it feels like since we invented sex and death,
link |
01:25:50.920
we broadly speaking,
link |
01:25:52.520
we've been searching for a kind of meaning.
link |
01:25:54.760
So it feels like a human civilization
link |
01:25:56.840
has been going through a meaning crisis
link |
01:25:58.200
of different flavors throughout its history.
link |
01:26:00.840
Why is, how is this particular meaning crisis different?
link |
01:26:05.800
Or is it really a crisis and it wasn't previously?
link |
01:26:09.000
What's your sense?
link |
01:26:09.960
A lot of human history,
link |
01:26:11.400
there wasn't so much a meaning crisis.
link |
01:26:13.080
There was just a like food
link |
01:26:14.280
and not getting eaten by bears crisis, right?
link |
01:26:16.920
Once you get to a point where you can make food,
link |
01:26:18.840
there was the like not getting killed
link |
01:26:20.200
by other humans crisis.
link |
01:26:21.960
So sitting around wondering what is it all about,
link |
01:26:24.760
it's actually a relatively recent luxury.
link |
01:26:26.600
And to some extent, the meaning crisis coming out of that
link |
01:26:29.640
is precisely because, well, it's not precisely because,
link |
01:26:33.000
I believe that meaning is the consequence of
link |
01:26:37.320
when we make consequential decisions,
link |
01:26:40.200
it's tied to agency, right?
link |
01:26:42.440
When we make consequential decisions,
link |
01:26:44.760
that generates meaning.
link |
01:26:46.600
So if we make a lot of decisions,
link |
01:26:47.960
but we don't see the consequences of them,
link |
01:26:50.040
then it feels like what was the point, right?
link |
01:26:52.200
But if there's all these big things
link |
01:26:53.640
happening that we don't see
link |
01:26:55.160
the consequences of,
link |
01:26:57.000
and we're just along for the ride,
link |
01:26:58.280
then it also does not feel very meaningful.
link |
01:27:00.680
Meaning, as far as I can tell,
link |
01:27:01.880
this is my working definition circa 2021,
link |
01:27:04.600
is generally the result of a person
link |
01:27:08.280
making a consequential decision,
link |
01:27:09.720
acting on it and then seeing the consequences of it.
link |
01:27:12.120
So historically, just when humans are in survival mode,
link |
01:27:16.520
you're making consequential decisions all the time.
link |
01:27:19.400
So there's not a lack of meaning
link |
01:27:20.760
because like you either got eaten or you didn't, right?
link |
01:27:23.320
You got some food and that's great, you feel good.
link |
01:27:25.480
Like these are all consequential decisions.
link |
01:27:27.400
Only in the post fossil fuel and industrial revolution
link |
01:27:33.640
could we create a massive leisure class.
link |
01:27:36.760
I could sit around not being threatened by bears,
link |
01:27:39.240
not starving to death,
link |
01:27:43.480
making decisions somewhat,
link |
01:27:44.680
but a lot of times not seeing the consequences
link |
01:27:47.320
of any decisions they make.
link |
01:27:49.080
The general sort of sense of anomie,
link |
01:27:51.400
I think that is the French term for it,
link |
01:27:53.240
in the wake of the consumer society,
link |
01:27:55.400
in the wake of mass media telling everyone,
link |
01:27:58.520
hey, choosing between Hermes and Chanel
link |
01:28:01.960
is a meaningful decision.
link |
01:28:03.080
No, it's not.
link |
01:28:04.120
I don't know what either of those mean.
link |
01:28:05.560
Oh, they're high end luxury purses and crap like that.
link |
01:28:10.840
But the point is that we give people the idea
link |
01:28:13.480
that consumption is meaning,
link |
01:28:15.000
that making a choice of this team versus that team,
link |
01:28:17.480
spectating has meaning.
link |
01:28:20.040
So we produce all of these different things
link |
01:28:22.440
that are as if meaning, right?
link |
01:28:25.240
But really making a decision that has no consequences for us.
link |
01:28:28.840
And so that creates the meaning crisis.
link |
01:28:30.920
Well, you're saying choosing between Chanel
link |
01:28:33.240
and the other one has no consequence.
link |
01:28:35.240
I mean, why is one more meaningful than the other?
link |
01:28:38.120
It's not that it's more meaningful than the other.
link |
01:28:39.480
It's that you make a decision between these two brands
link |
01:28:42.440
and you're told this brand will make me look better
link |
01:28:45.080
in front of other people.
link |
01:28:45.800
If I buy this brand of car,
link |
01:28:47.560
if I wear that brand of apparel, right?
link |
01:28:50.040
Like a lot of decisions we make are around consumption,
link |
01:28:54.120
but consumption by itself doesn't actually yield meaning.
link |
01:28:57.080
Gaining social status does provide meaning.
link |
01:28:59.960
So that's why in this era of abundant production,
link |
01:29:05.400
so many things turn into status games.
link |
01:29:07.320
The NFT kind of explosion is a similar kind of thing.
link |
01:29:09.880
Everywhere there are status games
link |
01:29:11.880
because we just have so much excess production.
link |
01:29:16.040
But aren't those status games a source of meaning?
link |
01:29:18.360
Like why do the games we play have to be grounded
link |
01:29:22.360
in physical reality like they are
link |
01:29:24.120
when you're trying to run away from lions?
link |
01:29:26.040
Why can't we, in this virtuality world, on social media,
link |
01:29:30.280
why can't we play the games on social media,
link |
01:29:32.200
even the dark ones?
link |
01:29:33.320
Right, we can, we can.
link |
01:29:35.160
But you're saying that's creating a meaning crisis.
link |
01:29:37.640
Well, there's a meaning crisis
link |
01:29:39.080
in that there's two aspects of it.
link |
01:29:41.000
Number one, playing those kinds of status games
link |
01:29:44.520
oftentimes requires destroying the planet
link |
01:29:46.600
because it ties to consumption,
link |
01:29:51.800
consuming the latest and greatest version of a thing,
link |
01:29:54.280
buying the latest limited edition sneaker
link |
01:29:56.840
and throwing out all the old ones.
link |
01:29:58.120
Maybe they keep the old ones,
link |
01:29:59.000
but the amount of sneakers we have to cut up
link |
01:30:01.000
and destroy every year
link |
01:30:02.680
to create artificial scarcity for the next generation, right?
link |
01:30:05.640
This is kind of stuff that's not great.
link |
01:30:07.720
It's not great at all.
link |
01:30:09.880
So conspicuous consumption fueling status games
link |
01:30:13.400
is really bad for the planet, not sustainable.
link |
01:30:16.040
The second thing is you can play these kinds of status games,
link |
01:30:19.640
but then what it does is it renders you captured
link |
01:30:22.520
to the virtual environment.
link |
01:30:24.360
The status games that really wealthy people are playing
link |
01:30:26.760
are all around the hard resources
link |
01:30:29.240
where they're gonna build the factories,
link |
01:30:30.440
they're gonna have the fuel in the rare earths
link |
01:30:31.800
to make the next generation of robots.
link |
01:30:33.160
They're then going to run game,
link |
01:30:34.760
run circles around you and your children.
link |
01:30:37.240
So that's another reason not to play
link |
01:30:38.760
those virtual status games.
link |
01:30:40.040
So you're saying ultimately the big picture game is won
link |
01:30:44.040
by people who have access or control
link |
01:30:46.840
over actual hard resources.
link |
01:30:48.360
So you can't, you don't see a society
link |
01:30:51.080
where most of the games are played in the virtual space.
link |
01:30:55.400
They'll be captured in the physical space.
link |
01:30:57.160
It all builds.
link |
01:30:57.960
It's just like the stack of human being, right?
link |
01:31:00.840
If you only play the game at the cultural
link |
01:31:04.360
and then intellectual level,
link |
01:31:05.960
then the people with the hard resources
link |
01:31:07.320
and access to layer zero physical are going to own you.
link |
01:31:10.920
But isn't money not connected to,
link |
01:31:13.480
or less and less connected to hard resources
link |
01:31:15.560
and money still seems to work?
link |
01:31:17.080
It's a virtual technology.
link |
01:31:18.520
There's different kinds of money.
link |
01:31:20.520
Part of the reason that some of the stuff is able
link |
01:31:22.280
to go a little unhinged is because the big sovereignties
link |
01:31:29.960
where one spends money and uses money
link |
01:31:32.040
and plays money games and inflates money,
link |
01:31:34.600
their ability to adjudicate the physical resources
link |
01:31:38.440
and hard resources,
link |
01:31:40.120
on land and things like that,
link |
01:31:42.200
those have not been challenged in a very long time.
link |
01:31:45.480
So, you know, we went off the gold standard.
link |
01:31:47.640
Most money is not connected to physical resources.
link |
01:31:51.640
It's an idea.
link |
01:31:53.640
And that idea is very closely connected to status.
link |
01:31:59.880
But it's also tied to like, it's actually tied to law.
link |
01:32:03.000
It is tied to some physical hard things
link |
01:32:04.680
so you have to pay your taxes.
link |
01:32:06.120
Yes, so it's always at the end going to be connected
link |
01:32:09.880
to the blockchain of physical reality.
link |
01:32:12.600
So in the case of law and taxes, it's connected to government
link |
01:32:17.240
and government is, what, violence is the...
link |
01:32:21.480
I'm playing with stacks of devil's advocates here
link |
01:32:27.720
and popping one devil off the stack at a time.
link |
01:32:30.520
Isn't ultimately, of course,
link |
01:32:31.560
it'll be connected to physical reality,
link |
01:32:33.080
but just because people control the physical reality,
link |
01:32:35.560
it doesn't mean the status.
link |
01:32:36.600
I guess LeBron James in theory could make more money
link |
01:32:39.720
than the owners of the teams in theory.
link |
01:32:43.320
And to me, that's a virtual idea.
link |
01:32:44.920
So somebody else constructed a game
link |
01:32:47.320
and now you're playing in the virtual space of the game.
link |
01:32:51.880
So it just feels like there could be games where status,
link |
01:32:55.480
we build realities that give us meaning in the virtual space.
link |
01:33:00.280
I can imagine such things being possible.
link |
01:33:02.840
Oh yeah, okay, so I see what you're saying.
link |
01:33:04.440
I think I see what you're saying there
link |
01:33:05.560
with the idea there, I mean, we'll take the LeBron James side
link |
01:33:08.840
and put in like some YouTube influencer.
link |
01:33:10.760
Yes, sure.
link |
01:33:11.480
So the YouTube influencer, it is status games,
link |
01:33:15.000
but at a certain level, it precipitates into real dollars
link |
01:33:18.920
and into like, well, you look at Mr. Beast, right?
link |
01:33:21.160
He's like sending off half a million dollars
link |
01:33:23.160
worth of fireworks or something, right?
link |
01:33:24.440
In a YouTube video.
link |
01:33:25.640
And also like saving, like saving trees and so on.
link |
01:33:28.520
Sure, right, trying to plant a million trees
link |
01:33:29.880
with Mark Rober or whatever it was.
link |
01:33:30.920
Yeah, like it's not that those kinds of games
link |
01:33:33.000
can't lead to real consequences.
link |
01:33:34.520
It's that for the vast majority of people in consumer culture,
link |
01:33:40.200
they are incented by the, I would say mostly,
link |
01:33:44.360
I'm thinking about middle class consumers.
link |
01:33:46.920
They're incented by advertisements,
link |
01:33:48.760
they're incented by their memetic environment
link |
01:33:50.840
to treat the purchasing of certain things,
link |
01:33:54.760
the need to buy the latest model, whatever,
link |
01:33:56.440
the need to appear, however,
link |
01:33:58.280
the need to pursue status games as a driver of meaning.
link |
01:34:02.360
And my point would be that it's a very hollow
link |
01:34:04.440
driver of meaning.
link |
01:34:05.960
And that is what creates a meaning crisis.
link |
01:34:08.280
Because at the end of the day,
link |
01:34:10.040
it's like eating a lot of empty calories, right?
link |
01:34:12.120
Yeah, it tasted good going down, a lot of sugar,
link |
01:34:13.960
but man, it did not, it was not enough protein
link |
01:34:15.800
to help build your muscles.
link |
01:34:17.080
And you kind of feel that in your gut.
link |
01:34:18.920
And I think that's, I mean, so all this stuff aside
link |
01:34:21.240
and setting aside our discussion on currency,
link |
01:34:22.840
which I hope we get back to,
link |
01:34:24.360
that's what I mean about the meaning crisis,
link |
01:34:27.480
part of it being created by the fact that we don't,
link |
01:34:30.120
we're not encouraged to have more and more
link |
01:34:32.520
direct relationships.
link |
01:34:34.200
We're actually alienated from relating to,
link |
01:34:37.880
even our family members sometimes, right?
link |
01:34:40.360
We're encouraged to relate to brands.
link |
01:34:43.880
We're encouraged to relate to these kinds of things
link |
01:34:46.280
that then tell us to do things
link |
01:34:49.240
that are really of low consequence.
link |
01:34:51.240
And that's where the meaning crisis comes from.
link |
01:34:52.920
So the role of technology in this,
link |
01:34:54.760
so there's somebody you mentioned, Jacques Ellul,
link |
01:34:57.240
his view of technology, he warns about the towering piles
link |
01:35:01.400
of technique, which I guess is a broad idea of technology.
link |
01:35:05.400
So I think, correct me if I'm wrong for him,
link |
01:35:08.120
technology is bad at moving away from human nature
link |
01:35:12.440
and it's ultimately is destructive.
link |
01:35:14.680
My question, broadly speaking, this meaning crisis,
link |
01:35:16.920
can technology, what are the pros and cons of technology?
link |
01:35:19.640
Can it be a good?
link |
01:35:21.000
Yeah, I think it can be.
link |
01:35:22.280
I certainly think it can be a good thing.
link |
01:35:27.240
I certainly draw on some of Ellul's ideas
link |
01:35:29.720
and I think some of them are pretty good.
link |
01:35:32.920
But the way he defines technique is,
link |
01:35:36.200
well, also Simondon as well.
link |
01:35:37.720
I mean, he speaks to the general mentality of efficiency,
link |
01:35:41.240
homogenized processes, homogenized production,
link |
01:35:43.640
homogenized labor to produce homogenized artifacts
link |
01:35:47.160
that then are not actually,
link |
01:35:50.920
they don't sit well in the environment.
link |
01:35:53.080
Essentially, you can think of it as the antonym of craft.
link |
01:35:57.880
Whereas a craftsman will come to a problem,
link |
01:36:02.040
maybe a piece of wood and they make into a chair.
link |
01:36:04.280
It may be a site to build a house or build a stable
link |
01:36:06.600
or build whatever.
link |
01:36:08.520
And they will consider how to bring various things in
link |
01:36:12.280
to build something well contextualized
link |
01:36:15.080
that's in right relationship with that environment.
link |
01:36:20.360
But the way we have driven technology
link |
01:36:22.360
over the last 100 to 150 years is not that at all.
link |
01:36:25.720
It is how can we make sure the input materials
link |
01:36:30.480
are homogenized, cut to the same size,
link |
01:36:33.400
diluted and doped to exactly the right alloy concentrations.
link |
01:36:36.840
How do we create machines that then consume exactly
link |
01:36:38.680
the right kind of energy to be able to run
link |
01:36:39.960
at this high speed to stamp out the same parts,
link |
01:36:42.600
which then go out the door,
link |
01:36:44.080
everyone gets the same Tickle Me Elmo.
link |
01:36:45.760
And the reason why everyone wants it
link |
01:36:46.800
is because we have broadcasts that tells everyone
link |
01:36:49.280
this is the cool thing.
link |
01:36:50.520
So we homogenize demand, right?
link |
01:36:52.560
And there are, like, Baudrillard and other critiques
link |
01:36:55.440
of modernity coming from that direction,
link |
01:36:57.520
the Situationists as well.
link |
01:36:59.240
It's that their point is that at this point in time,
link |
01:37:02.080
consumption is the thing that drives
link |
01:37:04.560
a lot of the economic stuff, not the need,
link |
01:37:06.680
but the need to consume and build status games on top.
link |
01:37:09.400
So we have homogenized, when we discovered,
link |
01:37:12.120
I think this is really like Bernays and stuff, right?
link |
01:37:14.800
In the early 20th century, we discovered we can create,
link |
01:37:17.920
we can create demand, we can create desire
link |
01:37:20.880
in a way that was not possible before
link |
01:37:23.560
because of broadcast media.
link |
01:37:25.560
And not only do we create desire,
link |
01:37:27.680
we don't create a desire for each person
link |
01:37:29.360
to connect to some bespoke thing,
link |
01:37:31.080
to build a relationship with their neighbor or their spouse.
link |
01:37:33.640
We are telling them, you need to consume this brand,
link |
01:37:36.080
you need to drive this vehicle,
link |
01:37:37.200
you gotta listen to this music,
link |
01:37:38.320
have you heard this, have you seen this movie, right?
link |
01:37:40.920
So creating homogenized demand makes it really cheap
link |
01:37:44.800
to create homogenized product.
link |
01:37:46.520
And now you have economics of scale.
link |
01:37:48.520
So we make the same Tickle Me Elmo,
link |
01:37:50.000
give it to all the kids and all the kids are like,
link |
01:37:52.800
hey, I got a Tickle Me Elmo, right?
link |
01:37:54.400
So this is ultimately where this ties in then
link |
01:37:58.640
to runaway hypercapitalism is that we then,
link |
01:38:03.040
capitalism is always looking for growth.
link |
01:38:04.800
It's always looking for growth
link |
01:38:05.960
and growth only happens at the margins.
link |
01:38:07.960
So you have to squeeze more and more demand out.
link |
01:38:09.960
You gotta make it cheaper and cheaper
link |
01:38:11.040
to make the same thing,
link |
01:38:12.280
but tell everyone they're still getting meaning from it.
link |
01:38:15.080
You're still like, this is still your Tickle Me Elmo, right?
link |
01:38:18.040
And we see little bits of critiques
link |
01:38:21.400
of this dripping in popular culture.
link |
01:38:22.800
You see it sometimes it's when Buzz Lightyear
link |
01:38:25.920
walks into the thing, he's like,
link |
01:38:27.760
oh my God, at the toy store, I'm just a toy.
link |
01:38:30.640
Like there's millions of other,
link |
01:38:31.760
or there's hundreds of other Buzz Lightyear's
link |
01:38:33.400
just like me, right?
link |
01:38:34.720
That is, I think, a fun Pixar critique
link |
01:38:38.080
on this homogenization dynamic.
link |
01:38:40.080
I agree with you on most of the things you're saying.
link |
01:38:42.880
So I'm playing devil's advocate here,
link |
01:38:44.560
but this homogenized machine of capitalism
link |
01:38:50.600
is also the thing that is able to fund,
link |
01:38:54.240
if channeled correctly, innovation, invention,
link |
01:38:59.120
and development of totally new things
link |
01:39:00.760
that in the best possible world,
link |
01:39:02.280
create all kinds of new experiences that can enrich lives,
link |
01:39:06.680
the quality of lives for all kinds of people.
link |
01:39:09.840
So isn't this the machine
link |
01:39:12.360
that actually enables the experiences
link |
01:39:15.120
and more and more experiences that will then give meaning?
link |
01:39:18.640
It has done that to some extent.
link |
01:39:21.040
I mean, it's not all good or bad in my perspective.
link |
01:39:24.680
We can always look backwards
link |
01:39:26.760
and offer a critique of the path we've taken
link |
01:39:29.120
to get to this point in time.
link |
01:39:31.640
But that's a different, that's somewhat different
link |
01:39:33.760
and informs the discussion,
link |
01:39:35.880
but it's somewhat different than the question
link |
01:39:37.680
of where do we go in the future, right?
link |
01:39:40.600
Is this still the same rocket we need to ride
link |
01:39:42.720
to get to the next point?
link |
01:39:43.560
Will it even get us to the next point?
link |
01:39:44.560
Well, how does this, so you're predicting the future,
link |
01:39:46.240
how does it go wrong in your view?
link |
01:39:48.760
We have the mechanisms,
link |
01:39:51.040
we have now explored enough technologies
link |
01:39:53.920
to where we can actually, I think, sustainably produce
link |
01:39:59.520
what most people in the world need to live.
link |
01:40:03.360
We have also created the infrastructures
link |
01:40:07.640
to allow continued research and development
link |
01:40:10.400
of additional science and medicine
link |
01:40:13.040
and various other kinds of things.
link |
01:40:16.080
The organizing principles that we use
link |
01:40:18.520
to govern all these things today have been,
link |
01:40:21.840
a lot of them have been just inherited
link |
01:40:25.520
from honestly medieval times.
link |
01:40:28.440
Some of them have been refactored a little bit
link |
01:40:30.160
in the industrial era,
link |
01:40:31.920
but a lot of these modes of organizing people
link |
01:40:35.920
are deeply problematic.
link |
01:40:38.280
And furthermore, they're rooted in,
link |
01:40:41.640
I think, a very industrial mode perspective on human labor.
link |
01:40:46.160
And this is one of those things,
link |
01:40:47.800
I'm gonna go back to the open source thing.
link |
01:40:49.720
There was a point in time when,
link |
01:40:51.920
well, let me ask you this.
link |
01:40:53.640
If you look at the core SciPy sort of collection of libraries,
link |
01:40:57.080
so SciPy, NumPy, Matplotlib, right?
link |
01:40:59.360
There's IPython Notebook, let's throw pandas in there,
link |
01:41:01.440
scikit learn, a few of these things.
link |
01:41:03.400
How much value do you think, economic value,
link |
01:41:07.400
would you say they drive in the world today?
link |
01:41:10.800
That's one of the fascinating things
link |
01:41:12.640
about talking to you and Travis is like,
link |
01:41:16.000
it's immeasurable, it's like a...
link |
01:41:18.160
At least a billion dollars a day, maybe?
link |
01:41:20.080
A billion dollars, sure.
link |
01:41:21.240
I mean, it's like, it's similar question of like,
link |
01:41:23.680
how much value does Wikipedia create?
link |
01:41:26.080
Right.
link |
01:41:26.920
It's like, all of it, I don't know.
link |
01:41:33.440
Well, I mean, if you look at our systems,
link |
01:41:34.720
when you do a Google search, right?
link |
01:41:36.120
Now, some of that stuff runs through TensorFlow,
link |
01:41:37.760
but when you look at Siri,
link |
01:41:40.080
when you do credit card transaction fraud,
link |
01:41:42.080
like just everything, right?
link |
01:41:43.360
Every intelligence agency under the sun,
link |
01:41:45.240
they're using some aspect of these kinds of tools.
link |
01:41:47.680
So I would say that these create billions of dollars
link |
01:41:51.200
of value.
link |
01:41:52.040
Oh, you mean like direct use of tools
link |
01:41:53.560
that leverage this data?
link |
01:41:54.400
Yes, direct, yeah.
link |
01:41:55.240
Yeah, even that's billions a day, yeah.
link |
01:41:56.720
Yeah, right, easily, I think.
link |
01:41:58.800
Like the things they could not do
link |
01:41:59.800
if they didn't have these tools, right?
link |
01:42:01.160
Yes.
link |
01:42:02.000
So that's billions of dollars a day, great.
link |
01:42:04.880
I think that's about right.
link |
01:42:05.760
Now, if we take, how many people did it take
link |
01:42:07.880
to make that, right?
link |
01:42:09.960
And there was a point in time, not anymore,
link |
01:42:11.720
but there was a point in time when they could fit
link |
01:42:12.960
in a van.
link |
01:42:13.800
I could have fit them in my Mercedes Sprinter, right?
link |
01:42:15.960
And so if you look at that, like, holy crap,
link |
01:42:19.240
literally a van of maybe a dozen people
link |
01:42:22.480
could create value to the tune of billions of dollars a day.
link |
01:42:28.320
What lesson do you draw from that?
link |
01:42:30.080
Well, here's the thing.
link |
01:42:31.400
What can we do to do more of that?
link |
01:42:35.240
Like that's open source.
link |
01:42:36.360
The way I've talked about this in other environments is
link |
01:42:39.760
when we use generative participatory crowdsourced
link |
01:42:43.440
approaches, we unlock human potential
link |
01:42:47.600
at a level that is better than what capitalism can do.
link |
01:42:52.280
I would challenge anyone to go and try to hire
link |
01:42:55.520
the right 12 people in the world
link |
01:42:58.360
to build that entire stack
link |
01:43:00.240
the way those 12 people did that, right?
link |
01:43:02.520
They would be very, very hard pressed to do that.
link |
01:43:04.160
If a hedge fund could just hire a dozen people
link |
01:43:06.760
and create like something that is worth
link |
01:43:08.480
billions of dollars a day,
link |
01:43:10.120
every single one of them would be racing to do it, right?
link |
01:43:12.400
But finding the right people,
link |
01:43:13.680
fostering the right collaborations,
link |
01:43:15.160
getting it adopted by the right other people
link |
01:43:16.840
to then refine it,
link |
01:43:18.080
that is a thing that was organic in nature.
link |
01:43:21.080
That took crowdsourcing.
link |
01:43:22.200
That took a lot of the open source ethos
link |
01:43:24.160
and it took the right kinds of people, right?
link |
01:43:26.480
Now those people who started that said,
link |
01:43:27.880
I need to have a part of a multi billion dollar a day
link |
01:43:30.880
sort of enterprise.
link |
01:43:32.440
They're like, I'm doing this cool thing
link |
01:43:33.560
to solve my problem for my friends, right?
link |
01:43:35.480
So the point of telling the story
link |
01:43:37.880
is to say that our way of thinking about value,
link |
01:43:40.760
our way of thinking about allocation of resources,
link |
01:43:42.880
our ways of thinking about property rights
link |
01:43:44.920
and all these kinds of things,
link |
01:43:46.200
they come from finite game, scarcity mentality,
link |
01:43:50.040
medieval institutions.
link |
01:43:52.160
As we are now entering,
link |
01:43:54.200
to some extent we're sort of in a post scarcity era,
link |
01:43:57.080
although some people are hoarding a whole lot of stuff.
link |
01:43:59.800
We are at a point where if not now soon,
link |
01:44:02.200
we'll be in a post scarcity era.
link |
01:44:03.920
The question of how we allocate resources
link |
01:44:06.480
has to be revisited at a fundamental level
link |
01:44:08.720
because the kind of software these people built,
link |
01:44:11.000
the modalities of those human ecologies
link |
01:44:13.960
that built that software,
link |
01:44:15.840
it treats software as un-property.
link |
01:44:17.960
Actually sharing creates value.
link |
01:44:20.480
Restricting and forking reduces value.
link |
01:44:23.080
So that's different than any other physical resource
link |
01:44:26.360
that we've ever dealt with.
link |
01:44:27.200
It's different than how most corporations
link |
01:44:28.720
treat software IP, right?
link |
01:44:31.240
So if treating software in this way
link |
01:44:34.600
created this much value so efficiently, so cheaply,
link |
01:44:37.560
because feeding a dozen people for 10 years
link |
01:44:39.160
is really cheap, right?
link |
01:44:41.640
That's the reason I care about this right now
link |
01:44:44.680
is because looking forward
link |
01:44:46.040
when we can automate a lot of labor,
link |
01:44:48.000
where we can in fact,
link |
01:44:49.560
the programming for your robot in your
link |
01:44:52.200
neck of the woods, in your part of the Amazon
link |
01:44:54.120
to build something sustainable for you
link |
01:44:55.960
and your tribe to deliver the right medicines,
link |
01:44:58.200
to take care of the kids,
link |
01:45:00.240
that's just software, that's just code
link |
01:45:02.880
that could be totally open sourced, right?
link |
01:45:05.480
So we can actually get to a mode
link |
01:45:07.360
where all of this additional generative things
link |
01:45:10.920
that humans are doing,
link |
01:45:12.400
they don't have to be wrapped up in a container
link |
01:45:16.200
and then we charge for all the exponential dynamics
link |
01:45:18.360
out of it.
link |
01:45:19.200
That's what Facebook did.
link |
01:45:20.400
That's what modern social media did, right?
link |
01:45:22.400
Because the old internet was connecting people just fine.
link |
01:45:24.920
So Facebook came along and said,
link |
01:45:25.960
well, anyone can post a picture,
link |
01:45:26.960
anyone can post some text
link |
01:45:28.440
and we're gonna amplify the crap out of it to everyone else.
link |
01:45:31.120
And it exploded this generative network
link |
01:45:33.280
of human interaction.
link |
01:45:34.720
And then it said, how do I make money off that?
link |
01:45:36.080
Oh yeah, I'm gonna be a gatekeeper
link |
01:45:38.160
on everybody's attention.
link |
01:45:39.880
And that's how I'm gonna make money.
link |
01:45:41.040
So how do we create more than one van?
link |
01:45:45.640
How do we have millions of vans full of people
link |
01:45:47.840
that create NumPy, SciPy, that create Python?
link |
01:45:51.000
So the story of those people is often they have
link |
01:45:55.120
some kind of job outside of this.
link |
01:45:57.080
This is what they're doing for fun.
link |
01:45:58.880
Don't you need to have a job?
link |
01:46:00.960
Don't you have to be connected,
link |
01:46:02.240
plugged in to the capitalist system?
link |
01:46:04.960
Isn't that what,
link |
01:46:07.280
isn't this consumerism,
link |
01:46:09.160
the engine that results in the individuals
link |
01:46:13.880
that kind of take a break from it every once in a while
link |
01:46:15.880
to create something magical?
link |
01:46:17.320
Like at the edges is where the innovation happens.
link |
01:46:19.160
There's a surplus, right, this is the question.
link |
01:46:21.360
Like if everyone were to go and run their own farm,
link |
01:46:24.400
no one would have time to go and write NumPy, SciPy, right?
link |
01:46:27.320
Maybe, but that's what I'm talking about
link |
01:46:29.960
when I say we're maybe at a post scarcity point
link |
01:46:32.800
for a lot of people.
link |
01:46:34.160
The question that we're never encouraged to ask
link |
01:46:37.240
in a Super Bowl ad is how much do you need?
link |
01:46:40.640
How much is enough?
link |
01:46:41.960
Do you need to have a new car every two years, every five?
link |
01:46:45.040
If you have a reliable car,
link |
01:46:46.160
can you drive one for 10 years, is that all right?
link |
01:46:48.480
I had a car for 10 years and it was fine.
link |
01:46:50.520
Your iPhone, do you have to upgrade every two years?
link |
01:46:52.800
I mean, it's sort of, you're using the same apps
link |
01:46:54.320
you did four years ago, right?
link |
01:46:56.680
This should be a Super Bowl ad.
link |
01:46:58.320
This should be a Super Bowl ad, that's great.
link |
01:46:59.680
Maybe somebody. Do you really need a new iPhone?
link |
01:47:01.400
Maybe one of our listeners will fund something like this
link |
01:47:03.920
of like, no, but just actually bringing it back,
link |
01:47:06.960
bringing it back to actually the question
link |
01:47:09.440
of what do you need?
link |
01:47:11.480
How do we create the infrastructure
link |
01:47:13.560
for collectives of people to live on the basis
link |
01:47:17.640
of providing what we need, meeting people's needs
link |
01:47:21.000
with a little bit of access to handle emergencies,
link |
01:47:23.200
things like that, pulling our resources together
link |
01:47:26.320
to handle the really, really big emergencies,
link |
01:47:28.760
somebody with a really rare form of cancer
link |
01:47:30.880
or some massive fire sweeps through half the village
link |
01:47:34.120
or whatever, but can we actually unscale things
link |
01:47:38.200
and solve for people's needs
link |
01:47:41.560
and then give them the capacity to explore
link |
01:47:45.160
how to be the best version of themselves?
link |
01:47:47.320
And for Travis, that was throwing away his shot at tenure
link |
01:47:51.000
in order to write NumPy.
link |
01:47:52.880
For others, there is a saying in the SciPy community
link |
01:47:56.840
that SciPy advances one failed postdoc at a time.
link |
01:48:00.920
And that's, we can do these things.
link |
01:48:03.800
We can actually do this kind of collaboration
link |
01:48:05.600
because code, software, information, organization,
link |
01:48:08.360
that's cheap.
link |
01:48:09.880
Those bits are very cheap to fling across the oceans.
link |
01:48:13.000
So you mentioned Travis.
link |
01:48:14.760
We've been talking and we'll continue to talk
link |
01:48:16.560
about open source.
link |
01:48:19.480
Maybe you can comment.
link |
01:48:20.440
How did you meet Travis?
link |
01:48:21.960
Who is Travis Oliphant?
link |
01:48:24.080
What's your relationship been like through the years?
link |
01:48:28.440
Where did you work together?
link |
01:48:30.160
How did you meet?
link |
01:48:31.800
What's the present and the future look like?
link |
01:48:35.120
Yeah, so the first time I met Travis
link |
01:48:36.600
was at a SciPy conference in Pasadena.
link |
01:48:39.360
Do you remember the year?
link |
01:48:40.920
2005.
link |
01:48:42.040
I was working, again, at Enthought,
link |
01:48:44.400
working on scientific computing consulting.
link |
01:48:47.000
And a couple of years later,
link |
01:48:51.160
he joined us at Enthought, I think 2007.
link |
01:48:55.240
And he came in as the president.
link |
01:48:58.240
One of the founders of Enthought was the CEO, Eric Jones.
link |
01:49:01.880
And we were all very excited that Travis was joining us
link |
01:49:04.080
and that was great fun.
link |
01:49:05.080
And so I worked with Travis
link |
01:49:06.960
on a number of consulting projects
link |
01:49:08.920
and we worked on some open source stuff.
link |
01:49:12.120
I mean, it was just a really, it was a good time there.
link |
01:49:15.080
And then...
link |
01:49:15.920
It was primarily Python related?
link |
01:49:17.840
Oh yeah, it was all Python, NumPy, SciPy consulting
link |
01:49:19.800
kind of stuff.
link |
01:49:21.000
Towards the end of that time,
link |
01:49:23.240
we started getting called into more and more finance shops.
link |
01:49:27.720
They were adopting Python pretty heavily.
link |
01:49:29.840
I did some work at like a high frequency trading shop,
link |
01:49:33.320
working on some stuff.
link |
01:49:34.160
And then we worked together on some,
link |
01:49:36.520
at a couple of investment banks in Manhattan.
link |
01:49:39.840
And so we started seeing that there was a potential
link |
01:49:42.680
to take Python in the direction of business computing,
link |
01:49:45.720
more than just being this niche like MATLAB replacement
link |
01:49:48.120
for big vector computing.
link |
01:49:50.520
What we were seeing was, oh yeah,
link |
01:49:51.800
you could actually use Python as a Swiss army knife
link |
01:49:53.880
to do a lot of shadow data transformation kind of stuff.
link |
01:49:56.840
So that's when we realized the potential is much greater.
link |
01:50:00.520
And so we started Anaconda,
link |
01:50:03.360
I mean, it was called Continuum Analytics at the time,
link |
01:50:05.240
but we started in January of 2012
link |
01:50:07.520
with a vision of shoring up the parts of Python
link |
01:50:10.760
that needed to get expanded to handle data at scale,
link |
01:50:13.760
to do web visualization, application development, et cetera.
link |
01:50:17.200
And that was that, yeah.
link |
01:50:18.040
So he was CEO and I was president for the first five years.
link |
01:50:23.880
And then we raised some money and then the board,
link |
01:50:27.480
it was sort of put in a new CEO.
link |
01:50:28.960
They hired a kind of professional CEO.
link |
01:50:31.320
And then Travis, you laugh at that.
link |
01:50:34.080
I took over the CTO role.
link |
01:50:35.240
Travis then left after a year to do his own thing,
link |
01:50:37.920
to do Quansight, which was more oriented
link |
01:50:41.120
around some of the bootstrap years that we did at Continuum
link |
01:50:43.920
where it was open source and consulting.
link |
01:50:46.200
It wasn't sort of like gung ho product development.
link |
01:50:48.600
And it wasn't focused on,
link |
01:50:50.120
we accidentally stumbled
link |
01:50:51.120
into the package management problem at Anaconda,
link |
01:50:55.560
but we had a lot of other visions of other technology
link |
01:50:57.760
that we built in the open source.
link |
01:50:58.920
And Travis was really trying to push,
link |
01:51:02.080
again, the frontiers of numerical computing,
link |
01:51:04.160
vector computing,
link |
01:51:05.320
handling things like auto differentiation and stuff
link |
01:51:07.720
intrinsically in the open ecosystem.
link |
01:51:09.960
So I think that's kind of the direction
link |
01:51:14.320
he's working on in some of his work.
link |
01:51:18.280
We remain great friends and colleagues and collaborators,
link |
01:51:22.520
even though he's no longer day to day working at Anaconda,
link |
01:51:25.760
but he gives me a lot of feedback
link |
01:51:27.000
about this and that and the other.
link |
01:51:29.000
What's a big lesson you've learned from Travis
link |
01:51:32.200
about life or about programming or about leadership?
link |
01:51:35.440
Wow, there's a lot.
link |
01:51:36.480
There's a lot.
link |
01:51:37.320
Travis is a really, really good guy.
link |
01:51:39.600
He really, his heart is really in it.
link |
01:51:41.920
He cares a lot.
link |
01:51:44.760
I've gotten that sense having to interact with him.
link |
01:51:46.920
It's so interesting.
link |
01:51:47.760
Such a good human being.
link |
01:51:48.600
He's a really good dude.
link |
01:51:49.720
And he and I, it's so interesting.
link |
01:51:51.360
We come from very different backgrounds.
link |
01:51:53.240
We're quite different as people,
link |
01:51:56.000
but I think we can like not talk for a long time
link |
01:52:00.800
and then be on a conversation
link |
01:52:03.280
and be eye to eye on like 90% of things.
link |
01:52:06.400
And so he's someone who I believe
link |
01:52:08.280
no matter how much fog settles in over the ocean,
link |
01:52:10.600
his ship and my ship are pointed
link |
01:52:12.200
sort of in the same direction of the same star.
link |
01:52:14.120
Wow, that's a beautiful way to phrase it.
link |
01:52:16.840
No matter how much fog there is,
link |
01:52:18.600
we're pointed at the same star.
link |
01:52:20.400
Yeah, and I hope he feels the same way.
link |
01:52:21.880
I mean, I hope he knows that over the years now.
link |
01:52:23.760
We both care a lot about the community.
link |
01:52:27.000
For someone who cares so deeply,
link |
01:52:28.120
I would say this about Travis that's interesting.
link |
01:52:29.880
For someone who cares so deeply about the nerd details
link |
01:52:33.360
of like type system design and vector computing
link |
01:52:36.000
and efficiency of expressing this and that and the other,
link |
01:52:38.760
memory layouts and all that stuff,
link |
01:52:40.440
he cares even more about the people
link |
01:52:43.280
in the ecosystem, the community.
link |
01:52:45.880
And I have a similar kind of alignment.
link |
01:52:49.760
I care a lot about the tech, I really do.
link |
01:52:53.080
But for me, the beauty of what this human ecology
link |
01:52:58.080
has produced is I think a touchstone.
link |
01:53:01.680
It's an early version, we can look at it and say,
link |
01:53:03.600
how do we replicate this for humanity at scale?
link |
01:53:05.800
What this open source collaboration was able to produce?
link |
01:53:08.760
How can we be generative in human collaboration
link |
01:53:11.600
moving forward and create that
link |
01:53:12.760
as a civilizational kind of dynamic?
link |
01:53:15.080
Like, can we seize this moment to do that?
link |
01:53:17.440
Because like a lot of the other open source movements,
link |
01:53:19.720
it's all nerds nerding out on code for nerds.
link |
01:53:23.600
And this because it's scientists,
link |
01:53:25.840
because it's people working on data,
link |
01:53:27.160
that all of it faces real human problems.
link |
01:53:31.480
I think we have an opportunity
link |
01:53:32.600
to actually make a bigger impact.
link |
01:53:34.360
Is there a way for this kind of open source vision
link |
01:53:37.480
to make money?
link |
01:53:39.040
Absolutely.
link |
01:53:40.080
To fund the people involved?
link |
01:53:41.640
Is that an essential part of it?
link |
01:53:43.040
It's hard, but we're trying to do that
link |
01:53:45.640
in our own way at Anaconda,
link |
01:53:48.560
because we know that business users,
link |
01:53:49.880
as they use more of the stuff, they have needs,
link |
01:53:52.000
like business specific needs around security, provenance.
link |
01:53:54.840
They really can't tell their VPs and their investors,
link |
01:53:59.200
hey, we're having, our data scientists
link |
01:54:01.040
are installing random packages from who knows where
link |
01:54:03.520
and running on customer data.
link |
01:54:04.680
So they have to have someone to talk to you.
link |
01:54:05.920
And that's what Anaconda does.
link |
01:54:07.360
So we are a governed source of packages for them,
link |
01:54:10.400
and that's great, that makes some money.
link |
01:54:12.160
We take some of that and we just take that as a dividend.
link |
01:54:16.160
We take a percentage of our revenues
link |
01:54:17.200
and write that as a dividend for the open source community.
link |
01:54:20.000
But beyond that, I really see the development
link |
01:54:23.440
of a marketplace for people to create notebooks,
link |
01:54:27.560
models, data sets, curation of these different kinds
link |
01:54:30.840
of things, and to really have
link |
01:54:33.120
a long tail marketplace dynamic with that.
link |
01:54:37.280
Can you speak about this problem
link |
01:54:38.800
that you stumbled into of package management,
link |
01:54:41.800
Python package management?
link |
01:54:43.120
What is that?
link |
01:54:46.080
A lot of people speak very highly of Conda,
link |
01:54:48.160
which is part of Anaconda, which is a package manager.
link |
01:54:50.720
There's a ton of packages.
link |
01:54:52.400
So first, what are package managers?
link |
01:54:55.080
And second, what was there before?
link |
01:54:57.240
What is pip?
link |
01:54:58.600
And why is Conda more awesome?
link |
01:55:01.840
The package problem is this, which is that
link |
01:55:04.200
in order to do numerical computing efficiently with Python,
link |
01:55:11.640
there are a lot of low level libraries
link |
01:55:14.480
that need to be compiled, compiled with a C compiler
link |
01:55:17.400
or C++ compiler or Fortran compiler.
link |
01:55:19.880
They need to not just be compiled,
link |
01:55:21.120
but they need to be compiled with all of the right settings.
link |
01:55:23.680
And oftentimes those settings are tuned
link |
01:55:25.200
for specific chip architectures.
link |
01:55:27.440
And when you add GPUs to the mix,
link |
01:55:29.320
when you look at different operating systems,
link |
01:55:32.280
you may be on the same chip,
link |
01:55:33.800
but if you're running Mac versus Linux versus Windows
link |
01:55:37.200
on the same x86 chip, you compile and link differently.
link |
01:55:40.040
All of this complexity is beyond the capability
link |
01:55:44.800
of most data scientists to reason about.
link |
01:55:46.720
And it's also beyond what most of the package developers
link |
01:55:50.240
want to deal with too.
link |
01:55:51.880
Because if you're a package developer,
link |
01:55:52.840
you're like, I code on Linux.
link |
01:55:54.280
This works for me, I'm good.
link |
01:55:55.840
It is not my problem to figure out how to build this
link |
01:55:58.080
on an ancient version of Windows, right?
link |
01:56:00.040
That's just simply not my problem.
link |
01:56:01.920
So what we end up with is a very creative,
link |
01:56:05.120
crowdsourced environment
link |
01:56:08.560
where people want to use this stuff, but they can't.
link |
01:56:11.160
And so we ended up creating a new set of technologies
link |
01:56:15.720
like a build recipe system, a build system
link |
01:56:18.400
and an installer system that is able to,
link |
01:56:22.520
well, to put it simply,
link |
01:56:24.640
it's able to build these packages correctly
link |
01:56:27.680
on each of these different kinds of platforms
link |
01:56:29.320
and operating systems,
link |
01:56:30.320
and make it so when people want to install something,
link |
01:56:33.040
they can, it's just one command.
link |
01:56:34.400
They don't have to set up a big compiler system
link |
01:56:36.960
and do all these things.
link |
01:56:38.320
So when it works well, it works great.
link |
01:56:40.440
Now, the difficulty is we have literally thousands
link |
01:56:43.920
of people writing code in the ecosystem,
link |
01:56:46.280
building all sorts of stuff and each person writing code,
link |
01:56:48.760
they may take a dependence on something else.
link |
01:56:50.640
And so you have all this web,
link |
01:56:52.280
incredibly complex web of dependencies.
link |
01:56:54.840
So installing the correct package
link |
01:56:57.640
for any given set of packages you want,
link |
01:57:00.360
getting that right subgraph is an incredibly hard problem.
link |
01:57:04.600
And again, most data scientists
link |
01:57:05.800
don't want to think about this.
link |
01:57:06.640
They're like, I want to install NumPy and pandas.
link |
01:57:09.160
I want this version of some like geospatial library.
link |
01:57:11.720
I want this other thing.
link |
01:57:13.080
Like, why is this hard?
link |
01:57:14.000
These exist, right?
link |
01:57:15.680
And it is hard because it's, well,
link |
01:57:17.680
you're installing this on a version of Windows, right?
link |
01:57:20.760
And half of these libraries are not built for Windows
link |
01:57:23.400
or the latest version isn't available,
link |
01:57:25.120
but the old version was.
link |
01:57:26.240
And if you go to the old version of this library,
link |
01:57:27.560
that means you need to go to a different version
link |
01:57:28.600
of that library.
link |
01:57:30.040
And so the Python ecosystem,
link |
01:57:32.480
by virtue of being crowdsourced,
link |
01:57:34.480
we were able to fill a hundred thousand different niches.
link |
01:57:38.000
But then we also suffer this problem
link |
01:57:40.360
that because it's crowdsourced and no one,
link |
01:57:43.120
it's like a tragedy of the commons, right?
link |
01:57:44.480
No one really needs, wants to support
link |
01:57:47.080
their thousands of other dependencies.
link |
01:57:49.200
So we end up sort of having to do a lot of this.
link |
01:57:52.160
And of course the conda forge community
link |
01:57:53.480
also steps up as an open source community that,
link |
01:57:55.640
you know, maintain some of these recipes.
link |
01:57:57.520
That's what conda does.
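The "right subgraph" problem Peter describes can be sketched in a few lines: given each package's allowed dependency versions, an installer has to find one mutually consistent version assignment across the whole graph. The package names and versions below are entirely made up for illustration; real solvers like conda's use SAT techniques rather than this brute-force search.

```python
from itertools import product

# Toy package index: package -> {version: {dependency: allowed versions}}.
# All package names and version numbers here are hypothetical.
INDEX = {
    "appkit":   {"2.0": {"imagelib": {"1.0"}},
                 "1.0": {"imagelib": {"1.0", "2.0"}}},
    "imagelib": {"1.0": {}, "2.0": {}},
    "plotlib":  {"3.0": {"imagelib": {"2.0"}}},
}

def resolve():
    """Brute force: try one version per package until every package's
    dependency constraints are satisfied simultaneously."""
    names = list(INDEX)
    for combo in product(*(INDEX[n] for n in names)):
        pick = dict(zip(names, combo))
        if all(pick[dep] in allowed
               for name, version in pick.items()
               for dep, allowed in INDEX[name][version].items()):
            return pick
    return None  # no mutually consistent set of versions exists

# plotlib forces imagelib 2.0, which rules out the newest appkit,
# so the solver has to fall back to appkit 1.0:
print(resolve())  # {'appkit': '1.0', 'imagelib': '2.0', 'plotlib': '3.0'}
```

Even in this three-package toy, one constraint forces a downgrade elsewhere in the graph; with thousands of crowdsourced packages and platform-specific builds, the search space is why "just install NumPy and pandas" can be hard.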
link |
01:57:58.680
Now, pip is a tool that came along after conda,
link |
01:58:01.880
to some extent, it came along as an easier way
link |
01:58:04.480
for the Python developers writing Python code
link |
01:58:09.520
that didn't have as much compiled, you know, stuff.
link |
01:58:12.920
They could then install different packages.
link |
01:58:15.360
And what ended up happening in the Python ecosystem
link |
01:58:17.800
was that a lot of the core Python and web Python developers,
link |
01:58:20.800
they never ran into any of this compilation stuff at all.
link |
01:58:24.160
So even we have, you know, on video,
link |
01:58:27.040
we have Guido van Rossum saying,
link |
01:58:29.520
you know what, the scientific community's packaging problems
link |
01:58:31.600
are just too exotic and different.
link |
01:58:33.160
I mean, you're talking about Fortran compilers, right?
link |
01:58:35.680
Like you guys just need to build your own solution
link |
01:58:37.680
perhaps, right?
link |
01:58:38.960
So the Python core Python community went
link |
01:58:41.720
and built its own sort of packaging technologies,
link |
01:58:45.320
not really contemplating the complexity
link |
01:58:47.880
of this stuff over here.
link |
01:58:49.280
And so now we have the challenge where
link |
01:58:51.560
you can pip install some things, some libraries,
link |
01:58:53.600
if you just want to get started with them,
link |
01:58:55.280
you can pip install TensorFlow and that works great.
link |
01:58:57.560
The instant you want to also install some other packages
link |
01:59:00.200
that use different versions of NumPy
link |
01:59:02.560
or some like graphics library or some OpenCV thing
link |
01:59:05.560
or some other thing, you now run into dependency hell
link |
01:59:08.720
because you cannot, you know,
link |
01:59:09.880
OpenCV can have a different version of libjpeg over here
link |
01:59:12.800
than PyTorch over here.
link |
01:59:14.320
Like they actually, they all have to use the,
link |
01:59:15.800
if you want to use GPU acceleration,
link |
01:59:17.400
they have to all use the same underlying drivers
link |
01:59:18.840
and same GPU CUDA things.
link |
01:59:20.400
So it's, it gets to be very gnarly
link |
01:59:22.960
and it's a level of technology
link |
01:59:24.240
that both the makers and the users
link |
01:59:26.120
don't really want to think too much about.
link |
01:59:28.560
And that's where you step in and try to solve this.
link |
01:59:30.480
We try to solve it.
link |
01:59:31.320
Subgraph problems.
link |
01:59:32.160
How much is that?
link |
01:59:33.000
I mean, you said that you don't want to think,
link |
01:59:34.960
they don't want to think about it,
link |
01:59:35.960
but how much is it a little bit on the developer
link |
01:59:38.000
and providing them tools to be a little bit more clear
link |
01:59:42.440
of that subgraph of dependency that's necessary?
link |
01:59:44.920
It is getting to a point where we do have to think about,
link |
01:59:47.920
look, can we pull some of the most popular packages together
link |
01:59:51.280
and get them to work on a coordinated release timeline,
link |
01:59:53.640
get them to build against the same test matrix,
link |
01:59:55.880
et cetera, et cetera, right?
link |
01:59:57.000
And there is a little bit of dynamic around this,
link |
01:59:58.780
but again, it is a volunteer community.
link |
02:00:01.880
Yeah.
link |
02:00:02.720
You know, people working on these different projects
link |
02:00:04.980
have their own timelines
link |
02:00:06.040
and their own things they're trying to meet.
link |
02:00:07.600
So we end up trying to pull these things together.
link |
02:00:11.360
And then it's this incredibly,
link |
02:00:13.080
and I would recommend just as a business tip,
link |
02:00:15.440
don't ever go into business
link |
02:00:16.560
where when your hard work works, you're invisible.
link |
02:00:19.600
And when it breaks because of someone else's problem,
link |
02:00:21.720
you get flagged for it.
link |
02:00:23.200
Because that's our situation, right?
link |
02:00:25.640
When something doesn't conda install properly,
link |
02:00:27.280
usually it's some upstream issue,
link |
02:00:28.980
but it looks like conda install is broken.
link |
02:00:30.280
It looks like, you know, Anaconda screwed something up.
link |
02:00:32.640
When things do work though, it's like, oh yeah, cool.
link |
02:00:34.560
It's worked.
link |
02:00:35.400
Assuming naturally, of course,
link |
02:00:36.240
it's very easy to make that work, right?
link |
02:00:38.200
So we end up in this kind of problematic scenario,
link |
02:00:41.960
but it's okay because I think we're still,
link |
02:00:45.040
you know, our heart's in the right place.
link |
02:00:46.760
We're trying to move this forward
link |
02:00:47.840
as a community sort of affair.
link |
02:00:49.220
I think most of the people in the community
link |
02:00:50.520
also appreciate the work we've done over the years
link |
02:00:53.000
to try to move these things forward
link |
02:00:54.280
in a collaborative fashion, so.
link |
02:00:57.320
One of the subgraphs of dependencies
link |
02:01:01.200
that became super complicated
link |
02:01:03.560
is the move from Python 2 to Python 3.
link |
02:01:05.760
So there's all these ways to mess
link |
02:01:07.200
with these kinds of ecosystems of packages and so on.
link |
02:01:11.520
So I just want to ask you about that particular one.
link |
02:01:13.760
What do you think about the move from Python 2 to 3?
link |
02:01:18.000
Why did it take so long?
link |
02:01:19.440
What were, from your perspective,
link |
02:01:20.960
just seeing the packages all struggle
link |
02:01:23.700
and the community all struggle through this process,
link |
02:01:26.280
what lessons do you take away from it?
link |
02:01:27.840
Why did it take so long?
link |
02:01:29.480
Looking back, some people perhaps underestimated
link |
02:01:33.380
how much adoption Python 2 had.
link |
02:01:38.120
I think some people also underestimated how much,
link |
02:01:41.800
or they overestimated how much value
link |
02:01:44.440
some of the new features in Python 3 really provided.
link |
02:01:47.400
Like the things they really loved about Python 3
link |
02:01:49.720
just didn't matter to some of these people in Python 2.
link |
02:01:52.600
Because this change was happening as Python, SciPy,
link |
02:01:56.440
was starting to take off really like past,
link |
02:01:58.700
like a hockey stick of adoption
link |
02:02:00.000
in the early data science era, in the early 2010s.
link |
02:02:02.880
A lot of people were learning and onboarding
link |
02:02:04.880
in whatever just worked.
link |
02:02:06.280
And the teachers were like,
link |
02:02:07.180
well, yeah, these libraries I need
link |
02:02:09.080
are not supported in Python 3 yet,
link |
02:02:10.360
I'm going to teach you Python 2.
link |
02:02:12.040
Took a lot of advocacy to get people
link |
02:02:13.960
to move over to Python 3.
link |
02:02:15.700
So I think it wasn't any particular single thing,
link |
02:02:18.860
but it was one of those death by a dozen cuts,
link |
02:02:21.820
which just really made it hard to move off of Python 2.
link |
02:02:25.480
And also Python 3 itself,
link |
02:02:27.280
as they were kind of breaking things
link |
02:02:28.820
and changing things around
link |
02:02:29.660
and reorganizing the standard library,
link |
02:02:30.720
there's a lot of stuff that was happening there
link |
02:02:32.920
that kept giving people an excuse to say,
link |
02:02:35.880
I'll put off till the next version.
link |
02:02:37.760
2 is working fine enough for me right now.
link |
02:02:39.680
So I think that's essentially what happened there.
link |
02:02:41.480
And I will say this though,
link |
02:02:43.760
the strength of the Python data science movement,
link |
02:02:48.660
I think is what kept Python alive in that transition.
link |
02:02:52.440
Because a lot of languages have died
link |
02:02:54.040
and left their user bases behind.
link |
02:02:56.840
If there wasn't the use of Python for data,
link |
02:02:58.980
there's a good chunk of Python users
link |
02:03:01.160
that during that transition,
link |
02:03:02.740
would have just left for Go and Rust and stayed away.
link |
02:03:04.920
In fact, some people did.
link |
02:03:06.220
They moved to Go and Rust and they just never looked back.
link |
02:03:08.880
The fact that we were able to grow by millions of users,
link |
02:03:13.720
the Python data community,
link |
02:03:15.820
that is what kept the momentum for Python going.
link |
02:03:18.320
And now the usage of Python for data is over 50%
link |
02:03:21.760
of the overall Python user base.
link |
02:03:24.320
So I'm happy to debate that on stage somewhere,
link |
02:03:27.960
I don't know if they really wanna take issue
link |
02:03:29.920
with that statement, but from where I sit,
link |
02:03:31.600
I think that's true.
link |
02:03:32.560
The statement there, the idea is that the switch
link |
02:03:35.280
from Python 2 to Python 3 would have probably
link |
02:03:39.040
destroyed Python if it didn't also coincide with Python
link |
02:03:43.600
for whatever reason,
link |
02:03:45.780
just overtaking the data science community,
link |
02:03:49.680
anything that processes data.
link |
02:03:51.800
So like the timing was perfect that this maybe
link |
02:03:55.760
imperfect decision was coupled with a great timing
link |
02:03:59.080
on the value of data in our world.
link |
02:04:02.080
I would say the troubled execution of a good decision.
link |
02:04:04.780
It was a decision that was necessary.
link |
02:04:07.360
It's possible if we had more resources,
link |
02:04:08.840
we could have done in a way that was a little bit smoother,
link |
02:04:11.120
but ultimately, the arguments for Python 3,
link |
02:04:15.200
I bought them at the time and I buy them now, right?
link |
02:04:17.400
Having great text handling is like a nonnegotiable
link |
02:04:20.800
table stakes thing you need to have in a language.
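The text-handling argument is concrete: Python 3 makes the bytes/str distinction explicit, where Python 2 would silently coerce between them and often corrupt non-ASCII data. A minimal illustration of the Python 3 behavior:

```python
# Bytes are what live on the wire and on disk; str is decoded text.
raw = "café".encode("utf-8")
assert raw == b"caf\xc3\xa9"  # the é is two UTF-8 bytes

# Python 2 would silently mix these; Python 3 refuses:
try:
    raw + "!"
except TypeError as e:
    print("Python 3 catches it:", e)

# The fix is an explicit decode at the boundary:
text = raw.decode("utf-8")
assert text + "!" == "café!"
```

Forcing that decode at the boundary is exactly the "nonnegotiable table stakes" change that made the 2-to-3 migration painful for existing code but necessary for the language.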
link |
02:04:23.440
So that's great, but the execution,
link |
02:04:29.480
Python is the, it's volunteer driven.
link |
02:04:33.000
It's like now the most popular language on the planet,
link |
02:04:34.880
but it's all literally volunteers.
link |
02:04:37.080
So the lack of resources meant that they had to really,
link |
02:04:40.400
they had to do things in a very hamstrung way.
link |
02:04:43.600
And I think to carry the Python momentum in the language
link |
02:04:46.720
through that time, the data movement
link |
02:04:48.460
was a critical part of that.
link |
02:04:49.920
So some of it is carrot and stick, I actually have to
link |
02:04:54.080
shamefully admit that it took me a very long time
link |
02:04:57.040
to switch from Python 2 to Python 3
link |
02:04:58.760
because I'm a machine learning person.
link |
02:05:00.320
It was just for the longest time,
link |
02:05:01.960
you could just do fine with Python 2.
link |
02:05:03.820
Right.
link |
02:05:04.960
But I think the moment where I switched everybody
link |
02:05:09.200
I worked with and switched myself for small projects
link |
02:05:13.080
and big is when finally, when NumPy announced
link |
02:05:17.700
that they're going to end support like in 2020
link |
02:05:21.480
or something like that.
link |
02:05:22.320
Right.
link |
02:05:23.160
So like when I realized, oh, this isn't going,
link |
02:05:26.200
this is going to end.
link |
02:05:27.400
Right.
link |
02:05:28.240
So that's the stick, that's not a carrot.
link |
02:05:29.720
That's not, so for the longest time it was carrots.
link |
02:05:31.720
It was like all of these packages were saying,
link |
02:05:34.420
okay, we have Python 3 support now, come join us.
link |
02:05:37.460
We have Python 2 and Python 3, but when NumPy,
link |
02:05:40.120
one of the packages I sort of love and depend on
link |
02:05:43.440
said like, nope, it's over.
link |
02:05:47.660
That's when I decided to switch.
link |
02:05:50.360
I wonder if you think it was possible much earlier
link |
02:05:53.840
for somebody like NumPy or some major package
link |
02:05:58.840
to step into the cold and say like we're ending this.
link |
02:06:03.840
Well, it's a chicken and egg problem too, right?
link |
02:06:05.320
You don't want to cut off a lot of users
link |
02:06:07.580
unless you see the user momentum going too.
link |
02:06:09.400
So the decisions for the scientific community
link |
02:06:12.900
for each of the different projects,
link |
02:06:14.080
you know, there's not a monolith.
link |
02:06:15.280
Some projects are like, we'll only be releasing
link |
02:06:17.000
new features on Python 3.
link |
02:06:18.920
And that was more of a sticky carrot, right?
link |
02:06:21.400
A firm carrot, if you will, a firm carrot.
link |
02:06:26.360
A stick shaped carrot.
link |
02:06:27.960
But then for others, yeah, NumPy in particular,
link |
02:06:30.680
cause it's at the base of the dependency stack
link |
02:06:32.680
for so many things, that was the final stick.
link |
02:06:36.160
That was a stick shaped stick.
link |
02:06:37.600
People were saying, look, if I have to keep maintaining
link |
02:06:40.080
my releases for Python 2, that's that much less energy
link |
02:06:43.640
that I can put into making things better
link |
02:06:45.700
for the Python 3 folks or in my new version,
link |
02:06:48.160
which is of course going to be Python 3.
link |
02:06:49.920
So people were also getting kind of pulled by this tension.
link |
02:06:53.320
So the overall community sort of had a lot of input
link |
02:06:56.140
into when the NumPy core folks decided
link |
02:06:58.480
that they would end of life on Python 2.
link |
02:07:01.380
So as these numbers are a little bit loose,
link |
02:07:04.000
but there are about 10 million Python programmers
link |
02:07:06.840
in the world, you could argue that number,
link |
02:07:08.400
but let's say 10 million.
link |
02:07:10.320
The source I was actually looking at
link |
02:04:12.280
said 27 million total programmers, developers in the world.
link |
02:07:17.240
You mentioned in a talk that changes need to be made
link |
02:07:20.540
for there to be 100 million Python programmers.
link |
02:07:24.000
So first of all, do you see a future
link |
02:07:26.020
where there's 100 million Python programmers?
link |
02:07:28.280
And second, what kind of changes need to be made?
link |
02:07:31.320
So Anaconda and Miniconda get downloaded
link |
02:07:33.200
about a million times a week.
link |
02:07:34.880
So I think the idea that there's only
link |
02:07:37.720
10 million Python programmers in the world
link |
02:07:39.320
is a little bit undercounting.
link |
02:07:41.960
There are a lot of people who escape traditional counting
link |
02:07:44.860
that are using Python and data in their jobs.
link |
02:07:48.440
I do believe that the future world for it to,
link |
02:07:52.100
well, the world I would like to see
link |
02:07:53.280
is one where people are data literate.
link |
02:07:56.240
So they are able to use tools
link |
02:07:58.600
that let them express their questions and ideas fluidly.
link |
02:08:03.180
And the data variety and data complexity will not go down.
link |
02:08:06.440
It will only keep increasing.
link |
02:08:08.320
So I think some level of code or code like things
link |
02:08:12.460
will continue to be relevant.
link |
02:08:15.460
And so my hope is that we can build systems
link |
02:08:19.760
that allow people to more seamlessly integrate
link |
02:08:22.900
Python kinds of expressivity with data systems
link |
02:08:26.040
and operationalization methods that are much more seamless.
link |
02:08:31.200
And what I mean by that is, you know,
link |
02:08:32.380
right now you can't punch Python code into an Excel cell.
link |
02:08:35.660
I mean, there's some tools you can use to kind of do this.
link |
02:08:37.960
We didn't build a thing for doing this back in the day,
link |
02:08:39.920
but I feel like the total addressable market
link |
02:08:43.820
for Python users, if we do the things right,
link |
02:08:46.860
is on the order of the Excel users,
link |
02:08:49.080
which is, you know, a few hundred million.
link |
02:08:51.180
So I think Python has to get better at being embedded,
link |
02:08:57.300
you know, being a smaller thing that pulls in
link |
02:08:59.660
just the right parts of the ecosystem
link |
02:09:01.720
to run numerics and do data exploration,
link |
02:09:05.560
meeting people where they're already at
link |
02:09:07.920
with their data and their data tools.
link |
02:09:09.640
And then I think also it has to be easier
link |
02:09:12.800
to take some of those things they've written
link |
02:09:14.720
and flow those back into deployed systems
link |
02:09:17.540
or little apps or visualizations.
link |
02:09:19.460
I think if we don't do those things,
link |
02:09:20.720
then we will always be kept in a silo
link |
02:09:23.160
as sort of an expert user's tool
link |
02:09:25.920
and not a tool for the masses.
link |
02:09:27.400
You know, I work with a bunch of folks
link |
02:09:28.960
in the Adobe Creative Suite,
link |
02:09:32.520
and I've kind of forced them, or inspired them,
link |
02:09:35.400
to learn Python, to do a bunch of stuff that helps them.
link |
02:09:38.560
And it's interesting, because they probably
link |
02:09:39.840
wouldn't call themselves Python programmers,
link |
02:09:41.740
but they're all using Python.
link |
02:09:43.820
I would love it if the tools like Photoshop and Premiere
link |
02:09:46.480
and all those kinds of tools that are targeted
link |
02:09:48.760
towards creative people, the same way that
link |
02:09:52.080
Excel is targeted towards a certain kind of audience
link |
02:09:54.220
that works with data, financial people,
link |
02:09:56.320
all that kind of stuff, if there would be easy ways
link |
02:10:00.040
to leverage Python for quick scripting tasks.
link |
02:10:03.600
And you know, there's an exciting application
link |
02:10:06.760
of artificial intelligence in this space
link |
02:10:09.760
that I'm hopeful about, looking at OpenAI Codex
link |
02:10:13.280
with generating programs.
link |
02:10:16.940
So almost helping people bridge the gap
link |
02:10:20.800
from kind of visual interface to generating programs,
link |
02:10:25.840
to something formal, and then they can modify it and so on,
link |
02:10:28.980
but kind of without having to read the manual,
link |
02:10:32.360
without having to do a Google search and stack overflow,
link |
02:10:34.880
which is essentially what a neural network does
link |
02:10:36.760
when it's doing code generation,
link |
02:10:39.040
is actually generating code and allowing a human
link |
02:10:42.840
to communicate with multiple programs,
link |
02:10:44.760
and then maybe even programs to communicate
link |
02:10:46.540
with each other via Python.
link |
02:10:48.420
So that to me is a really exciting possibility,
link |
02:10:51.220
because I think there's a friction to kind of,
link |
02:10:55.960
like how do I learn how to use Python in my life?
link |
02:10:58.920
There's oftentimes you kind of start a class,
link |
02:11:03.000
you start learning about types, I don't know, functions.
link |
02:11:07.080
Like this is, you know, Python is the first language
link |
02:11:09.600
with which you start to learn to program.
link |
02:11:11.920
But I feel like that's going to take a long time
link |
02:11:16.680
for you to understand why it's useful.
link |
02:11:18.560
You almost want to start with a script.
link |
02:11:20.300
Well, you do, in fact.
link |
02:11:22.000
I think starting with the theory behind programming languages
link |
02:11:24.840
and types and all that, I mean,
link |
02:11:26.100
types are there to make the compiler writers' jobs easier.
link |
02:11:30.000
Types are not, I mean, heck, do you have an ontology
link |
02:11:32.760
of types for just the objects on this table?
link |
02:11:34.320
No.
link |
02:11:35.560
So types are there because compiler writers are human
link |
02:11:39.160
and they're limited in what they can do.
link |
02:11:40.840
But I think that the beauty of scripting,
link |
02:11:45.100
like there's a Python book that's called
link |
02:11:47.560
"'Automate the Boring Stuff,'
link |
02:11:49.200
which is exactly the right mentality.
link |
02:11:51.120
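In that "Automate the Boring Stuff" spirit, here's a minimal sketch of the kind of chore Python handles well. The function name and folder layout are my own illustration, not from the book:

```python
# A toy script in the "Automate the Boring Stuff" spirit: tidy a
# messy folder by moving every file into a subfolder named after
# its extension.
from pathlib import Path

def organize_by_extension(target: Path) -> dict:
    """Move each file in `target` into a per-extension subfolder;
    return a count of files moved, keyed by extension."""
    moved = {}
    # snapshot the listing first, since we create folders as we go
    for item in list(target.iterdir()):
        if not item.is_file():
            continue  # leave existing subfolders alone
        ext = item.suffix.lstrip(".").lower() or "no_extension"
        dest_dir = target / ext
        dest_dir.mkdir(exist_ok=True)
        item.rename(dest_dir / item.name)
        moved[ext] = moved.get(ext, 0) + 1
    return moved

# e.g. organize_by_extension(Path("~/Downloads").expanduser())
```

Pointing something like this at a downloads folder turns an afternoon of dragging files around into one line, which is exactly the mentality the book teaches.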
I grew up with computers in a time when I could,
link |
02:11:56.720
when Steve Jobs was still pitching these things
link |
02:11:58.840
as bicycles for the mind.
link |
02:11:59.880
They were supposed to not be just media consumption devices,
link |
02:12:03.300
but they were actually, you could write some code.
link |
02:12:05.920
You could write basic, you could write some stuff
link |
02:12:07.280
to do some things.
link |
02:12:09.000
And that feeling of a computer as a thing
link |
02:12:12.160
that we can use to extend ourselves
link |
02:12:14.240
has all but evaporated for a lot of people.
link |
02:12:17.780
So you see a little bit in parts
link |
02:12:19.520
in the current, the generation of youth
link |
02:12:21.560
around Minecraft or Roblox, right?
link |
02:12:23.560
And I think Python, CircuitPython,
link |
02:12:25.120
these things could be a renaissance of that,
link |
02:12:28.800
of people actually shaping and using their computers
link |
02:12:32.660
as computers, as an extension of their minds
link |
02:12:35.500
and their curiosity, their creativity.
link |
02:12:37.880
So you talk about scripting the Adobe Suite with Python
link |
02:12:41.560
in the 3D graphics world.
link |
02:12:42.920
Python is a scripting language
link |
02:12:46.320
that some of these 3D graphics suites use.
link |
02:12:48.760
And I think that's great.
link |
02:12:49.600
We should better support those kinds of things.
link |
02:12:51.280
But ultimately the idea that I should be able
link |
02:12:54.200
to have power over my computing environment.
link |
02:12:56.280
If I want these things to happen repeatedly all the time,
link |
02:12:59.640
I should be able to say that somehow to the computer, right?
link |
02:13:02.640
Now, whether the operating systems get there faster
link |
02:13:06.520
by having some Siri backed with OpenAI or whatever.
link |
02:13:09.560
So you can just say, Siri, make this do this
link |
02:13:10.860
and this and every other Friday, right?
link |
02:13:12.640
We probably will get there somewhere.
link |
02:13:14.240
And Apple's always had these ideas.
link |
02:13:15.880
There's AppleScript in the menu that no one ever uses,
link |
02:13:19.160
but you can do these kinds of things.
link |
02:13:21.640
But when you start doing that kind of scripting,
link |
02:13:23.700
the challenge isn't learning the type system
link |
02:13:25.920
or even the syntax of the language.
link |
02:13:27.160
The challenge is all of the dictionaries
link |
02:13:29.240
and all the objects and all their properties
link |
02:13:31.160
and attributes and parameters.
link |
02:13:32.840
Like who's got time to learn all that stuff, right?
link |
02:13:35.420
So that's when then programming by prototype
link |
02:13:38.840
or by example becomes the right way
link |
02:13:40.960
to get the user to express their desire.
link |
02:13:43.780
So there's a lot of these different ways
link |
02:13:45.200
that we can approach programming.
link |
02:13:46.060
But I do think when, as you were talking
link |
02:13:48.280
about the Adobe scripting thing,
link |
02:13:49.800
I was thinking about, you know,
link |
02:13:51.340
when we do use something like NumPy,
link |
02:13:53.780
when we use things in the Python data
link |
02:13:55.760
and scientific, let's say, expression system,
link |
02:14:00.160
there's a reason we use that,
link |
02:14:01.360
which is that it gives us mathematical precision.
link |
02:14:04.400
It gives us actually quite a lot of precision
link |
02:14:06.800
over precisely what we mean about this data set,
link |
02:14:09.960
that data set, and it's the fact
link |
02:14:11.800
that we can have that precision
link |
02:14:13.680
that lets Python be powerful as duct tape for data.
link |
02:14:18.720
You know, you give me a TSV or a CSV,
link |
02:14:21.240
and if you give me some massively expensive vendor tool
link |
02:14:25.120
for data transformation,
link |
02:14:26.460
I don't know that I'm gonna be able to solve your problem.
link |
02:14:28.280
But if you give me a Python prompt,
link |
02:14:30.600
you can throw whatever data you want at me.
link |
02:14:32.500
I will be able to mash it into shape.
link |
02:14:34.400
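The duct-tape point can be made concrete with a few lines of standard-library Python. The ragged TSV below is invented for illustration: mixed casing, stray whitespace, blank lines, and it still mashes into shape:

```python
# Python as "duct tape for data": take a ragged TSV export and
# turn it into clean records using only the standard library.
import csv
import io

RAW_TSV = """\
Name \tDept\t Salary
 Ada Lovelace\tENGINEERING\t 104000

Grace Hopper \tengineering\t98500
 Katherine Johnson\tMath \t91000
"""

def clean_records(raw: str) -> list:
    reader = csv.reader(io.StringIO(raw), delimiter="\t")
    # normalize the header row into lowercase keys
    header = [h.strip().lower() for h in next(reader)]
    records = []
    for row in reader:
        if not any(cell.strip() for cell in row):
            continue  # skip blank lines
        rec = dict(zip(header, (cell.strip() for cell in row)))
        rec["dept"] = rec["dept"].title()   # unify casing
        rec["salary"] = int(rec["salary"])  # string -> number
        records.append(rec)
    return records
```

Nothing here needed installing; any data you throw at the prompt, you can beat into shape with a loop and a couple of string methods.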
So that ability to take it as sort of this like,
link |
02:14:38.360
you know, machete out into the data jungle
link |
02:14:40.320
is really powerful.
link |
02:14:41.600
And I think that's why at some level,
link |
02:14:44.300
we're not gonna get away from some of these expressions
link |
02:14:47.760
and APIs and libraries in Python for data transformation.
link |
02:14:53.960
You've been at the center of the Python community
link |
02:14:57.280
for many years.
link |
02:14:58.400
If you could change one thing about the community
link |
02:15:03.640
to help it grow, to help it improve,
link |
02:15:05.540
to help it flourish and prosper, what would it be?
link |
02:15:09.440
I mean, you know, it doesn't have to be one thing,
link |
02:15:11.720
but what kind of comes to mind?
link |
02:15:13.880
What are the challenges?
link |
02:15:15.360
Humility is one of the values that we have
link |
02:15:16.840
at Anaconda at the company,
link |
02:15:17.940
but it's also one of the values in the community.
link |
02:15:21.320
It's been breached a little bit in the last few years,
link |
02:15:24.480
but in general, people are quite decent
link |
02:15:27.360
and reasonable and nice.
link |
02:15:29.320
And that humility prevents them from seeing
link |
02:15:34.420
the greatness that they could have.
link |
02:15:36.880
I don't know how many people in the core Python community
link |
02:15:40.800
really understand that they stand perched at the edge
link |
02:15:46.280
of an opportunity to transform how people use computers.
link |
02:15:50.160
And actually, PyCon, I think it's the last physical PyCon
link |
02:15:52.760
I went to, Russell Keith-Magee gave a great keynote
link |
02:15:56.680
about very much along the lines of the challenges I have,
link |
02:16:00.560
which is Python, for a language that doesn't actually,
link |
02:16:04.320
that can't put an interface up,
link |
02:16:05.720
put an interface up on the most popular computing devices,
link |
02:16:09.360
it's done really well as a language, hasn't it?
link |
02:16:11.800
You can't write a web front end with Python, really.
link |
02:16:13.720
I mean, everyone uses JavaScript.
link |
02:16:15.080
You certainly can't write native apps.
link |
02:16:17.040
So for a language that you can't actually write apps
link |
02:16:20.580
in any of those front end runtime environments,
link |
02:16:22.720
Python's done exceedingly well.
link |
02:16:26.000
And so that wasn't to pat ourselves on the back.
link |
02:16:28.760
That was to challenge ourselves as a community to say,
link |
02:16:30.580
we, through our current volunteer dynamic,
link |
02:16:32.520
have gotten to this point.
link |
02:16:34.460
What comes next and how do we seize,
link |
02:16:36.720
you know, we've caught the tiger by the tail.
link |
02:16:38.620
How do we make sure we keep up with it as it goes forward?
link |
02:16:40.960
So that's one of the questions I have
link |
02:16:42.440
about sort of open source communities,
link |
02:16:44.240
is at its best, there's a kind of humility.
link |
02:16:48.500
Does that humility prevent you from having a vision
link |
02:16:52.500
for creating something very new and powerful?
link |
02:16:55.560
And you've brought us back to consciousness again.
link |
02:16:57.640
The collaboration is a swarm emergent dynamic.
link |
02:17:00.700
Humility lets these people work together
link |
02:17:02.640
without anyone trouncing anyone else.
link |
02:17:04.840
How do they, you know, in consciousness,
link |
02:17:07.200
there's the question of the binding problem.
link |
02:17:08.680
How does a singular, our attention,
link |
02:17:10.720
how does that emerge from billions of neurons?
link |
02:17:13.880
So how can you have a swarm of people emerge a consensus
link |
02:17:17.680
that has a singular vision to say, we will do this.
link |
02:17:20.620
And most importantly, we're not gonna do these things.
link |
02:17:23.880
Emerging a coherent, pointed, focused leadership dynamic
link |
02:17:29.000
from a collaboration, being able to do that kind of,
link |
02:17:32.060
and then dissolve it so people can still do
link |
02:17:34.040
the swarm thing, that's a problem, that's a question.
link |
02:17:37.240
So do you have to have a charismatic leader?
link |
02:17:40.800
For some reason, Linus Torvalds comes to mind,
link |
02:17:42.640
but there's people who criticize.
link |
02:17:44.640
He rules with an iron fist, man.
link |
02:17:46.640
But there's still charisma to it.
link |
02:17:48.520
There is charisma, right?
link |
02:17:49.520
There's a charisma to that iron fist.
link |
02:17:51.760
There's, every leader's different, I would say,
link |
02:17:55.320
in their success.
link |
02:17:56.700
So he doesn't, I don't even know if you can say
link |
02:17:59.480
he doesn't have humility, there's such a meritocracy
link |
02:18:04.840
of ideas that like, this is a good idea
link |
02:18:09.000
and this is a bad idea.
link |
02:18:10.440
There's a step function to it.
link |
02:18:11.640
Once you clear a threshold, he's open.
link |
02:18:13.920
Once you clear the bozo threshold,
link |
02:18:15.680
he's open to your ideas, I think, right?
link |
02:18:17.560
But see, the interesting thing is obviously
link |
02:18:20.280
that will not stand in an open source community
link |
02:18:23.440
if that threshold that is defined
link |
02:18:25.920
by that one particular person is not actually that good.
link |
02:18:30.320
So you actually have to be really excellent at what you do.
link |
02:18:33.720
So he's very good at what he does.
link |
02:18:37.120
And so there's some aspect of leadership
link |
02:18:39.040
where you can get thrown out, people can just leave.
link |
02:18:42.920
That's how it works with open source, the fork.
link |
02:18:45.960
But at the same time, you want to sometimes be a leader
link |
02:18:49.940
like with a strong opinion, because people,
link |
02:18:52.440
I mean, there's some kind of balance here
link |
02:18:54.440
for this like hive mind to get like behind.
link |
02:18:57.400
Leadership is a big topic.
link |
02:18:58.520
And I didn't, I'm not one of these guys
link |
02:18:59.880
that went to MBA school and said,
link |
02:19:01.120
I'm gonna be an entrepreneur and I'm gonna be a leader.
link |
02:19:03.560
And I'm gonna read all these Harvard Business Review
link |
02:19:05.280
articles on leadership and all this other stuff.
link |
02:19:07.760
Like I was a physicist turned into a software nerd
link |
02:19:10.700
who then really like nerded out on Python.
link |
02:19:13.600
And now I am entrepreneurial, right?
link |
02:19:14.840
I saw a business opportunity around the use
link |
02:19:16.280
of Python for data.
link |
02:19:17.120
But for me, what has been interesting over this journey
link |
02:19:20.720
with the last 10 years is how much I started really
link |
02:19:25.320
enjoying the understanding, thinking deeper
link |
02:19:28.520
about organizational dynamics and leadership.
link |
02:19:31.320
And leadership does come down to a few core things.
link |
02:19:35.280
Number one, a leader has to create belief
link |
02:19:40.000
or at least has to dispel disbelief.
link |
02:19:44.060
Leadership also, you have to have vision,
link |
02:19:46.480
loyalty and experience.
link |
02:19:49.360
So can you say belief in a singular vision?
link |
02:19:52.640
Like what does belief mean?
link |
02:19:53.680
Yeah, belief means a few things.
link |
02:19:55.240
Belief means here's what we need to do
link |
02:19:57.000
and this is a valid thing to do and we can do it.
link |
02:20:01.600
That you have to be able to drive that belief.
link |
02:20:06.040
And every step of leadership along the way
link |
02:20:08.800
has to help you amplify that belief to more people.
link |
02:20:12.680
I mean, I think at a fundamental level, that's what it is.
link |
02:20:15.920
You have to have a vision.
link |
02:20:17.120
You have to be able to show people that,
link |
02:20:20.920
or you have to convince people to believe in the vision
link |
02:20:23.640
and to get behind you.
link |
02:20:25.240
And that's where the loyalty part comes in
link |
02:20:26.480
and the experience part comes in.
link |
02:20:28.200
There's all different flavors of leadership.
link |
02:20:30.200
So if we talk about Linus, we could talk about Elon Musk
link |
02:20:34.320
and Steve Jobs, there's Sundar Pichai.
link |
02:20:38.440
There's people that kind of put themselves at the center
link |
02:20:40.760
and are strongly opinionated.
link |
02:20:42.560
And some people are more like consensus builders.
link |
02:20:45.160
What works well for open source?
link |
02:20:47.720
What works well in the space of programmers?
link |
02:20:49.680
So you've been a programmer, you've led many programmers
link |
02:20:53.240
that are now sort of at the center of this ecosystem.
link |
02:20:55.600
What works well in the programming world would you say?
link |
02:20:58.960
It really depends on the people.
link |
02:21:01.120
What style of leadership is best?
link |
02:21:02.600
And it depends on the programming community.
link |
02:21:04.160
I think for the Python community,
link |
02:21:06.320
servant leadership is one of the values.
link |
02:21:08.560
At the end of the day, the leader has to also be
link |
02:21:11.720
the high priest of values, right?
link |
02:21:14.720
So any collection of people has values they're living.
link |
02:21:19.160
And if you want to maintain certain values
link |
02:21:23.680
and those values help you as an organization
link |
02:21:26.280
become more powerful,
link |
02:21:27.520
then the leader has to live those values unequivocally
link |
02:21:30.560
and has to hold the values.
link |
02:21:33.600
So in our case, in this collaborative community
link |
02:21:36.680
around Python, I think that the humility
link |
02:21:41.400
is one of those values.
link |
02:21:42.920
Servant leadership, you actually have to kind of do the stuff.
link |
02:21:45.320
You have to walk the walk, not just talk the talk.
link |
02:21:49.000
I don't feel like the Python community really demands
link |
02:21:52.040
that much from a vision standpoint.
link |
02:21:53.800
And they should.
link |
02:21:54.760
And I think they should.
link |
02:21:56.680
This is the interesting thing is like so many people
link |
02:22:00.840
use Python, from where comes the vision?
link |
02:22:04.720
You know, like you have a Elon Musk type character
link |
02:22:07.640
who makes bold statements about the vision
link |
02:22:12.160
for particular companies he's involved with.
link |
02:22:14.640
And it's like, I think a lot of people that work
link |
02:22:18.680
at those companies kind of can only last
link |
02:22:22.560
if they believe that vision.
link |
02:22:24.680
And some of it is super bold.
link |
02:22:26.240
So my question is, and by the way,
link |
02:22:28.600
those companies often use Python.
link |
02:22:32.520
How do you establish a vision?
link |
02:22:33.760
Like get to 100 million users, right?
link |
02:22:37.360
Get to where, you know, the Python is at the center
link |
02:22:42.280
of the, what was it, data science,
link |
02:22:46.760
machine learning, deep learning,
link |
02:22:48.440
artificial intelligence revolution, right?
link |
02:22:51.720
Like in many ways, perhaps the Python community
link |
02:22:54.920
is not thinking of it that way,
link |
02:22:55.840
but it's leading the way on this.
link |
02:22:58.160
Like the tooling is like essential.
link |
02:23:01.240
Right, well, you know, for a while,
link |
02:23:03.320
PyCon people in the scientific Python
link |
02:23:05.840
and the PyData community, they would submit talks.
link |
02:23:09.320
Those are early 2010s, mid 2010s.
link |
02:23:12.160
They would submit talks for PyCon
link |
02:23:14.080
and the talks would all be rejected
link |
02:23:15.760
because there was the separate sort of PyData conferences.
link |
02:23:18.680
And like, well, these probably belong more to PyData.
link |
02:23:21.200
And instead there'd be yet another talk about, you know,
link |
02:23:23.520
threads and, you know, whatever, some web framework.
link |
02:23:26.400
And it's like, that was an interesting dynamic to see
link |
02:23:29.880
that there was, I mean, at the time it was a little annoying
link |
02:23:32.680
because we wanted to try to get more users
link |
02:23:34.280
and get more people talking about these things.
link |
02:23:35.760
And PyCon is a huge venue, right?
link |
02:23:37.320
It's thousands of Python programmers.
link |
02:23:40.240
But then also came to appreciate that, you know,
link |
02:23:42.200
parallel, having an ecosystem that allows parallel
link |
02:23:45.400
innovation is not bad, right?
link |
02:23:47.400
There are people doing embedded Python stuff.
link |
02:23:49.040
There's people doing web programming,
link |
02:23:50.640
people doing scripting, there's cyber uses of Python.
link |
02:23:53.280
I think the, ultimately at some point,
link |
02:23:55.120
if your slime mold covers so much stuff,
link |
02:23:58.040
you have to respect that different things are growing
link |
02:24:00.160
in different areas and different niches.
link |
02:24:02.320
Now, at some point that has to come together
link |
02:24:04.160
and the central body has to provide resources.
link |
02:24:07.560
The principle here is subsidiarity.
link |
02:24:09.160
Give resources to the various groups
link |
02:24:11.760
to then allocate as they see fit in their niches.
link |
02:24:15.040
That would be a really helpful dynamic.
link |
02:24:16.360
But again, it's a volunteer community.
link |
02:24:17.520
It's not like they had that many resources to start with.
link |
02:24:21.160
What was or is your favorite programming setup?
link |
02:24:23.920
What operating system, what keyboard,
link |
02:24:25.720
how many screens?
link |
02:24:29.080
What time of day are you drinking coffee, tea?
link |
02:24:32.960
Tea, sometimes coffee, depending on how well I slept.
link |
02:24:36.280
I used to have.
link |
02:24:37.120
How much sleep do you get?
link |
02:24:38.120
Are you a night owl?
link |
02:24:39.680
I remember somebody asked you somewhere,
link |
02:24:41.520
a question about work life balance.
link |
02:24:44.880
Not just work life balance, but like a family,
link |
02:24:47.720
you lead a company and your answer was basically like,
link |
02:24:52.080
I still haven't figured it out.
link |
02:24:54.240
Yeah, I think I've gotten to a little bit better balance.
link |
02:24:56.320
I have a really great leadership team now supporting me
link |
02:24:58.880
and so that takes a lot of the day to day stuff
link |
02:25:01.680
off my plate and my kids are getting a little older.
link |
02:25:04.400
So that helps.
link |
02:25:05.240
So, and of course I have a wonderful wife
link |
02:25:07.520
who takes care of a lot of the things
link |
02:25:09.240
that I'm not able to take care of and she's great.
link |
02:25:11.680
I try to get to sleep earlier now
link |
02:25:13.760
because I have to get up every morning at six
link |
02:25:15.160
to take my kid down to the bus stop.
link |
02:25:17.000
So there's a hard thing.
link |
02:25:19.280
For a while I was doing polyphasic sleep,
link |
02:25:21.120
which is really interesting.
link |
02:25:22.040
Like I go to bed at nine, wake up at like 2 a.m.,
link |
02:25:24.640
work till five, sleep three hours, wake up at eight.
link |
02:25:27.320
Like that was actually, it was interesting.
link |
02:25:29.360
It wasn't too bad.
link |
02:25:30.200
How did it feel?
link |
02:25:31.160
It was good.
link |
02:25:32.000
I didn't keep it up for years, but once I have travel,
link |
02:25:34.960
then it just, everything goes out the window, right?
link |
02:25:37.360
Because then you're like time zones and all these things.
link |
02:25:39.120
Socially, was it acceptable? Like, were you able to live
link |
02:25:42.440
outside of how you felt?
link |
02:25:43.480
Were you able to live in normal society?
link |
02:25:45.720
Oh yeah, because like on the nights
link |
02:25:47.360
that I wasn't out hanging out with people or whatever,
link |
02:25:48.920
going to bed at nine, no one cares.
link |
02:25:50.680
I wake up at two, I'm still responding to their slacks,
link |
02:25:52.720
emails, whatever, and you know, shitposting on Twitter
link |
02:25:56.880
or whatever at two in the morning is great, right?
link |
02:25:59.200
And then you go to bed for a few hours and you wake up,
link |
02:26:02.440
it's like you had an extra day in the middle.
link |
02:26:04.520
And I'd read somewhere that humans naturally
link |
02:26:06.320
have biphasic sleep or something, I don't know.
link |
02:26:09.160
I read basically everything somewhere.
link |
02:26:11.160
So every option of everything.
link |
02:26:13.400
Every option of everything.
link |
02:26:14.520
I will say that that worked out for me for a while,
link |
02:26:16.840
although I don't do it anymore.
link |
02:26:18.480
In terms of programming setup,
link |
02:26:19.400
I had a 27 inch high DPI setup that I really liked,
link |
02:26:24.840
but then I moved to a curved monitor
link |
02:26:26.200
just because when I moved to the new house,
link |
02:26:28.840
I wanted to have a bit more screen for Zoom plus communications
link |
02:26:32.440
plus various kinds of things.
link |
02:26:33.280
So it's like one large monitor.
link |
02:26:35.800
One large curved monitor.
link |
02:26:38.040
What operating system?
link |
02:26:39.720
Mac.
link |
02:26:40.640
Okay. Yeah.
link |
02:26:41.640
Is that what happens when you become important,
link |
02:26:43.920
is you stop using Linux and Windows?
link |
02:26:46.680
No, I actually have a Windows box as well
link |
02:26:48.520
on the next table over, but I have three desks, right?
link |
02:26:54.360
So the main one is the standing desk so that I can,
link |
02:26:57.200
whatever, when I'm like, I have a teleprompter set up
link |
02:26:59.320
and everything else.
link |
02:27:00.160
And then I've got my iMac and then eGPU and then Windows PC.
link |
02:27:06.480
The reason I moved to Mac was it's got a Linux prompt
link |
02:27:10.680
or no, sorry, it's got a, it's got a Unix prompt
link |
02:27:13.000
so I can do all my stuff, but then I don't have to worry.
link |
02:27:18.320
Like when I'm presenting for clients
link |
02:27:19.640
or investors or whatever, like it,
link |
02:27:21.800
I don't have to worry about any, like, ACPI-related
link |
02:27:25.120
fsck things in the middle of a presentation,
link |
02:27:27.160
like none of that.
link |
02:27:28.000
It just, it will always wake from sleep
link |
02:27:30.960
and it won't kernel panic on me.
link |
02:27:32.320
And this is not a dig against Linux,
link |
02:27:34.200
except that I just, I feel really bad.
link |
02:27:38.440
I feel like a traitor to my community saying this, right?
link |
02:27:40.180
But in 2012, I was just like, okay, start my own company.
link |
02:27:43.240
What do I get?
link |
02:27:44.060
And Linux laptops were just not quite there.
link |
02:27:47.520
And so I've just stuck with Macs.
link |
02:27:48.840
Can I just defend something that nobody respectable
link |
02:27:51.360
seems to do, which is, so I dual boot Linux and Windows,
link |
02:27:55.800
but in Windows, I have the Windows Subsystem
link |
02:27:59.560
for Linux or whatever, WSL.
link |
02:28:02.920
And I find myself being able to handle everything I need
link |
02:28:06.480
and almost everything I need in Linux
link |
02:28:08.360
for basic sort of tasks, scripting tasks within WSL
link |
02:28:11.280
and it creates a really nice environment.
link |
02:28:12.960
So I've been, but like whenever I hang out with like,
link |
02:28:15.400
especially important people,
link |
02:28:17.920
like they're all on iPhone and a Mac
link |
02:28:20.720
and it's like, yeah, like what,
link |
02:28:23.360
there is a messiness to Windows and a messiness to Linux
link |
02:28:27.840
that makes me feel like you're still in it.
link |
02:28:31.880
Well, the Linux stuff, Windows Subsystem for Linux
link |
02:28:34.700
is very tempting, but there's still the Windows
link |
02:28:37.840
on the outside where I don't know where,
link |
02:28:40.280
and I've been, okay, I've used DOS since version 1.11
link |
02:28:44.160
or 1.21 or something.
link |
02:28:45.560
So I've been a long time Microsoft user.
link |
02:28:48.360
And I will say that like, it's really hard
link |
02:28:52.120
for me to know where anything is,
link |
02:28:53.440
how to get to the details behind something
link |
02:28:55.040
when something screws up, as it invariably does,
link |
02:28:57.600
and just things like changing group permissions
link |
02:28:59.920
on some shared folders and stuff,
link |
02:29:01.400
just everything seems a little bit more awkward,
link |
02:29:03.600
more clicks than it needs to be.
link |
02:29:06.440
Not to say that there aren't weird things
link |
02:29:07.800
like hidden attributes and all this other happy stuff
link |
02:29:09.880
on Mac, but for the most part,
link |
02:29:14.060
and well, actually, especially now
link |
02:29:15.400
with the new hardware coming out on Mac,
link |
02:29:16.880
it'll be very interesting with the new M1.
link |
02:29:20.080
There were some dark years in the last few years
link |
02:29:21.640
when I was like, I think maybe I have to move off of Mac
link |
02:29:24.000
as a platform, but this, I mean,
link |
02:29:27.400
like my keyboard was just not working.
link |
02:29:29.080
Like literally my keyboard just wasn't working, right?
link |
02:29:31.200
I had this touch bar, didn't have a physical escape button
link |
02:29:33.540
like I needed to because I used Vim,
link |
02:29:35.200
and now I think we're back, so.
link |
02:29:37.440
So you use Vim and you have a, what kind of keyboard?
link |
02:29:40.340
So I use a RealForce 87U, it's a mechanical,
link |
02:29:44.080
it's a Topre keyswitch.
link |
02:29:45.240
Like, is it a weird shape, or a normal shape? Okay.
link |
02:29:48.520
Well, no, because I say that because I use a Kinesis,
link |
02:29:51.480
and you said some dark, you said you had dark moments.
link |
02:29:55.000
I recently had a dark moment,
link |
02:29:57.440
I was like, what am I doing with my life?
link |
02:29:58.780
So I remember sort of flying in a very kind of tight space,
link |
02:30:03.020
and as I'm working, this is what I do on an airplane.
link |
02:30:06.600
I pull out a laptop, and on top of the laptop,
link |
02:30:09.600
I'll put a Kinesis keyboard.
link |
02:30:11.080
That's hardcore, man.
link |
02:30:12.040
I was thinking, is this who I am?
link |
02:30:13.800
Is this what I'm becoming?
link |
02:30:15.080
Will I be this person?
link |
02:30:16.200
Because I'm on Emacs with this Kinesis keyboard,
link |
02:30:18.560
sitting like with everybody around.
link |
02:30:21.960
Emacs on Windows.
link |
02:30:23.620
On WSL, yeah.
link |
02:30:25.600
Yeah, Emacs on Linux on Windows.
link |
02:30:27.360
Yeah, on Windows.
link |
02:30:28.720
And like everybody around me is using their iPhone
link |
02:30:32.200
to look at TikTok.
link |
02:30:33.140
So I'm like in this land, and I thought, you know what?
link |
02:30:36.560
Maybe I need to become an adult and put the 90s behind me,
link |
02:30:40.880
and use like a normal keyboard.
link |
02:30:43.120
And then I did some soul searching,
link |
02:30:45.040
and I decided like this is who I am.
link |
02:30:46.760
This is me like coming out of the closet
link |
02:30:48.520
to saying I'm Kinesis keyboard all the way.
link |
02:30:50.760
I'm going to use Emacs.
link |
02:30:52.840
You know who else is a Kinesis fan?
link |
02:30:55.160
Wes McKinney, the creator of Pandas.
link |
02:30:56.720
Oh.
link |
02:30:57.560
He banged out Pandas on a Kinesis keyboard, I believe.
link |
02:31:00.000
I don't know if he's still using one, maybe,
link |
02:31:01.920
but certainly 10 years ago, like he was.
link |
02:31:04.200
If anyone's out there,
link |
02:31:05.320
maybe we need to have a Kinesis support group.
link |
02:31:07.520
Please reach out.
link |
02:31:08.360
Isn't there already one?
link |
02:31:09.440
Is there one?
link |
02:31:10.280
I don't know.
link |
02:31:11.100
There's gotta be an IRC channel, man.
link |
02:31:12.100
Oh no, and you access it through Emacs.
link |
02:31:16.880
Okay.
link |
02:31:18.040
Do you still program these days?
link |
02:31:19.600
I do a little bit.
link |
02:31:21.440
Honestly, the last thing I did was I had written,
link |
02:31:25.900
I was working with my son to script some Minecraft stuff.
link |
02:31:28.540
So I was doing a little bit of that.
link |
02:31:29.560
That was the last, literally the last code I wrote.
link |
02:31:33.100
Oh, you know what?
link |
02:31:33.940
Also, I wrote some code to do some cap table evaluation,
link |
02:31:36.600
waterfall modeling kind of stuff.
link |
02:31:39.260
What advice would you give to a young person,
link |
02:31:41.320
you said your son, today, in high school,
link |
02:31:44.200
maybe even college, about career, about life?
link |
02:31:48.580
This may be where I get into trouble a little bit.
link |
02:31:51.120
We are coming to the end.
link |
02:31:53.380
We're rapidly entering a time between worlds.
link |
02:31:56.520
So we have a world now that's starting to really crumble
link |
02:31:59.920
under the weight of aging institutions
link |
02:32:01.900
that no longer even pretend to serve the purposes
link |
02:32:04.460
they were created for.
link |
02:32:05.740
We are creating technologies that are hurtling billions
link |
02:32:09.060
of people headlong into philosophical crises
link |
02:32:11.460
and they don't even know the philosophical operating systems
link |
02:32:14.300
in their firmware.
link |
02:32:15.120
And they're heading into a time when that gets vaporized.
link |
02:32:17.840
So for people in high school,
link |
02:32:20.100
and certainly I tell my son this as well,
link |
02:32:21.520
he's in middle school, people in college,
link |
02:32:24.920
you are going to have to find your own way.
link |
02:32:29.740
You're going to have to have a pioneer spirit,
link |
02:32:31.700
even if you live in the middle
link |
02:32:34.080
of the most dense urban environment.
link |
02:32:36.360
All of human reality around you
link |
02:32:40.120
is the result of the last few generations of humans
link |
02:32:44.440
agreeing to play certain kinds of games.
link |
02:32:47.360
A lot of those games no longer operate
link |
02:32:51.880
according to the rules they used to.
link |
02:32:55.760
Collapse is nonlinear, but it will be managed.
link |
02:32:58.040
And so if you are in a particular social caste
link |
02:33:02.200
or economic caste,
link |
02:33:03.880
and I think it's not kosher to say that about America,
link |
02:33:10.200
but America is a very stratified and classist society.
link |
02:33:14.140
There's some mobility, but it's really quite classist.
link |
02:33:17.080
And in America, unless you're in the upper middle class,
link |
02:33:20.800
you are headed into very choppy waters.
link |
02:33:23.520
So it is really, really good to think
link |
02:33:26.260
and understand the fundamentals of what you need
link |
02:33:29.240
to build a meaningful life for you, your loved ones,
link |
02:33:32.760
with your family.
link |
02:33:35.480
And almost all of the technology being created
link |
02:33:38.000
that's consumer facing is designed to own people,
link |
02:33:41.760
to take the four-stack of people, to delaminate them,
link |
02:33:47.120
and to own certain portions of that stack.
link |
02:33:50.360
And so if you want to be an integral human being,
link |
02:33:52.280
if you want to have your agency
link |
02:33:54.400
and you want to find your own way in the world,
link |
02:33:57.640
when you're young would be a great time to spend time
link |
02:34:00.580
looking at some of the classics
link |
02:34:02.960
around what it means to live a good life,
link |
02:34:05.880
what it means to build connection with people.
link |
02:34:08.080
And so much of the status game, so much of the stuff,
link |
02:34:13.060
one of the things that I sort of talk about
link |
02:34:14.860
as we create more and more technology,
link |
02:34:18.640
there's a gradient of technology,
link |
02:34:19.840
and a gradient of technology
link |
02:34:20.820
always leads to a gradient of power.
link |
02:34:22.640
And this is Jacques Ellul's point to some extent as well.
link |
02:34:25.200
That gradient of power is not going to go away.
link |
02:34:27.280
The technologies are going so fast
link |
02:34:29.900
that even people like me who helped create
link |
02:34:32.040
some of the stuff, I'm being left behind.
link |
02:34:33.640
Some of the cutting edge research,
link |
02:34:34.680
I don't know what's going on in it today.
link |
02:34:36.800
You know, I go read some proceedings.
link |
02:34:38.660
So as the world gets more and more technological,
link |
02:34:42.620
it will create more and more gradients
link |
02:34:44.240
where people will seize power, economic fortunes.
link |
02:34:48.400
And the way they make the people who are left behind
link |
02:34:51.320
okay with their lot in life is they create lottery systems.
link |
02:34:54.480
They make you take part in the narrative
link |
02:35:00.020
of your own being trapped in your own economic sort of zone.
link |
02:35:04.180
So avoiding those kinds of things is really important.
link |
02:35:07.920
Knowing when someone is running game on you basically.
link |
02:35:10.740
So these are the things I would tell young people.
link |
02:35:12.260
It's a dark message, but it's realism.
link |
02:35:14.220
I mean, that's what I see.
link |
02:35:15.860
So after you gave some realism, you sit back.
link |
02:35:18.500
You sit back with your son.
link |
02:35:19.760
You're looking out at the sunset.
link |
02:35:21.700
What to him can you give as words of hope and to you
link |
02:35:27.900
from where do you derive hope for the future of our world?
link |
02:35:32.120
So you said at the individual level,
link |
02:35:33.620
you have to have a pioneer mindset
link |
02:35:36.460
to go back to the classics,
link |
02:35:38.100
to understand where in human nature you can find meaning.
link |
02:35:41.580
But at the societal level, what trajectory,
link |
02:35:44.500
when you look up possible trajectories, what gives you hope?
link |
02:35:47.460
What gives me hope is that we have little tremors now
link |
02:35:52.780
shaking people out of the reverie
link |
02:35:54.620
of the fiction of modernity that they've been living in,
link |
02:35:57.020
kind of a late 20th century style modernity.
link |
02:36:00.540
That's good, I think.
link |
02:36:02.860
Because, and also to your point earlier,
link |
02:36:06.400
people are burning out on some of the social media stuff.
link |
02:36:08.180
They're sort of seeing the ugly side,
link |
02:36:09.300
especially the latest news with Facebook
link |
02:36:11.620
and the whistleblower, right?
link |
02:36:12.820
It's quite clear these things are not
link |
02:36:15.740
all they're cracked up to be.
link |
02:36:16.740
Do you believe, I believe better social media can be built
link |
02:36:20.740
because they are burning out
link |
02:36:21.820
and it'll incentivize other competitors to be built.
link |
02:36:25.300
Do you think that's possible?
link |
02:36:26.460
Well, the thing about it is that
link |
02:36:29.060
when you have extractive, return-seeking
link |
02:36:33.820
capital coming in and saying,
link |
02:36:35.380
look, you own a network,
link |
02:36:36.620
give me some exponential dynamics out of this network.
link |
02:36:39.020
What are you gonna do?
link |
02:36:39.840
You're gonna just basically put a toll keeper
link |
02:36:41.460
at every single node and every single graph edge,
link |
02:36:45.220
every node, every vertex, every edge.
link |
02:36:48.020
But if you don't have that need for it,
link |
02:36:49.920
if no one's sitting there saying,
link |
02:36:51.220
hey, Wikipedia, monetize every character,
link |
02:36:53.260
every byte, every phrase,
link |
02:36:54.980
then generative human dynamics will naturally sort of arise,
link |
02:36:58.340
assuming we respect a few principles
link |
02:37:01.140
around online communications.
link |
02:37:03.020
So the greatest and biggest social network in the world
link |
02:37:05.800
is still like email, SMS, right?
link |
02:37:08.780
So we're fine there.
link |
02:37:10.580
The issue with the social media, as we call it now,
link |
02:37:13.220
is they're actually just new amplification systems, right?
link |
02:37:16.620
Now it's benefited certain people like yourself
link |
02:37:18.660
who have interesting content to be amplified.
link |
02:37:23.140
So it's created a creator economy, and that's cool.
link |
02:37:25.180
There's a lot of great content out there.
link |
02:37:26.820
But giving everyone a shot at the fame lottery,
link |
02:37:29.300
saying, hey, you could also have your,
link |
02:37:31.500
if you wiggle your butt the right way on TikTok,
link |
02:37:33.940
you can have your 15 seconds of micro fame.
link |
02:37:36.100
That's not healthy for society at large.
link |
02:37:38.180
So I think if we can create tools that help people
link |
02:37:41.700
be conscientious about their attention,
link |
02:37:45.180
spend time looking at the past,
link |
02:37:46.700
and really retrieving memory and recalling,
link |
02:37:49.600
not recalling, but processing and thinking about that,
link |
02:37:53.260
I think that's certainly possible,
link |
02:37:55.100
and hopefully that's what we get.
link |
02:37:57.260
So the bigger question that you're asking
link |
02:38:01.020
about what gives me hope
link |
02:38:02.380
is that these early shocks of COVID lockdowns
link |
02:38:08.140
and remote work and all these different kinds of things,
link |
02:38:11.620
I think it's getting people to a point
link |
02:38:13.980
where they're sort of no longer in the reverie.
link |
02:38:19.740
As my friend Jim Rutt says,
link |
02:38:21.020
there's more people with ears to hear now.
link |
02:38:23.700
With the pandemic and education,
link |
02:38:26.100
everyone's like, wait, wait,
link |
02:38:27.140
what have you guys been doing with my kids?
link |
02:38:28.820
How are you teaching them?
link |
02:38:30.140
What is this crap you're giving them as homework?
link |
02:38:32.420
So I think these are the kinds of things
link |
02:38:33.940
that are getting, and the supply chain disruptions,
link |
02:38:36.860
getting more people to think about,
link |
02:38:38.220
how do we actually just make stuff?
link |
02:38:40.200
This is all good, but the concern is that
link |
02:38:44.820
it's still gonna take a while for these things,
link |
02:38:48.380
for people to learn how to be agentic again,
link |
02:38:51.860
and to be in right relationship with each other
link |
02:38:53.920
and with the world.
link |
02:38:55.860
So the message of hope is still people are resilient,
link |
02:38:58.420
and we are building some really amazing technology.
link |
02:39:01.380
And I also, to me, I derive a lot of hope
link |
02:39:03.920
from individuals in that vein.
link |
02:39:08.140
The power of a single individual to transform the world,
link |
02:39:11.980
to do positive things for the world is quite incredible.
link |
02:39:14.700
Now you've been talking about,
link |
02:39:16.000
it's nice to have as many of those individuals as possible,
link |
02:39:18.860
but even the power of one, it's kind of magical.
link |
02:39:21.900
It is, it is.
link |
02:39:22.740
We're in a mode now where we can do that.
link |
02:39:24.500
I think also, part of what I try to do
link |
02:39:26.960
is in coming to podcasts like yours,
link |
02:39:29.020
and then spamming you with all this philosophical stuff
link |
02:39:31.740
that I've got going on,
link |
02:39:33.540
there are a lot of good people out there
link |
02:39:34.800
trying to put words around the current technological,
link |
02:39:40.060
social, economic crises that we're facing.
link |
02:39:43.220
And in the space of a few short years,
link |
02:39:44.580
I think there has been a lot of great content
link |
02:39:46.340
produced around this stuff.
link |
02:39:47.620
For people who wanna see, wanna find out more,
link |
02:39:50.660
or think more about this,
link |
02:39:52.140
we're popularizing certain kinds of philosophical ideas
link |
02:39:54.540
that move people beyond just the,
link |
02:39:56.580
oh, you're communist, oh, you're capitalist kind of stuff.
link |
02:39:58.700
Like it's sort of, we're way past that now.
link |
02:40:01.200
So that also gives me hope,
link |
02:40:03.400
that I feel like I myself am getting a handle
link |
02:40:05.700
on how to think about these things.
link |
02:40:08.460
It makes me feel like I can,
link |
02:40:09.940
hopefully affect change for the better.
link |
02:40:12.560
We've been sneaking up on this question all over the place.
link |
02:40:15.700
Let me ask the big, ridiculous question.
link |
02:40:17.580
What is the meaning of life?
link |
02:40:20.380
Wow.
link |
02:40:23.900
The meaning of life.
link |
02:40:28.860
Yeah, I don't know.
link |
02:40:29.700
I mean, I've never really understood that question.
link |
02:40:32.060
When you say meaning crisis,
link |
02:40:34.700
you're saying that there is a search
link |
02:40:39.860
for a kind of experience
link |
02:40:42.300
that could be described as fulfillment,
link |
02:40:45.480
as like the aha moment of just like joy,
link |
02:40:50.180
and maybe when you see something beautiful,
link |
02:40:53.400
or maybe you have created something beautiful,
link |
02:40:55.220
that experience that you get,
link |
02:40:57.200
it feels like it all makes sense.
link |
02:41:02.600
So some of that is just chemicals coming together
link |
02:41:04.340
in your mind and all those kinds of things.
link |
02:41:06.420
But it seems like we're building
link |
02:41:08.820
a sophisticated collective intelligence
link |
02:41:12.620
that's providing meaning in all kinds of ways
link |
02:41:15.300
to its members.
link |
02:41:17.120
And there's a theme to that meaning.
link |
02:41:20.620
So for a lot of history,
link |
02:41:22.860
I think faith played an important role.
link |
02:41:26.620
Faith in God, sort of religion.
link |
02:41:29.660
I think technology in the modern era
link |
02:41:32.660
is kind of serving a little bit
link |
02:41:34.740
of a source of meaning for people,
link |
02:41:36.120
like innovation of different kinds.
link |
02:41:39.580
I think the old school things of love
link |
02:41:43.500
and the basics of just being good at stuff.
link |
02:41:47.480
But you were a physicist,
link |
02:41:50.300
so there's a desire to say, okay, yeah,
link |
02:41:52.580
but these seem to be like symptoms of something deeper.
link |
02:41:56.300
Right.
link |
02:41:57.140
Like why?
link |
02:41:57.960
Little m meaning, what's capital M meaning?
link |
02:41:59.060
Yeah, what's capital M meaning?
link |
02:42:00.960
Why are we reaching for order
link |
02:42:03.080
when there is excess of energy?
link |
02:42:06.940
I don't know if I can answer the why.
link |
02:42:09.340
Any why that I come up with, I think, is gonna be,
link |
02:42:13.700
I'd have to think about that a little more,
link |
02:42:15.500
maybe get back to you on that.
link |
02:42:17.040
But I will say this.
link |
02:42:19.260
We do look at the world through a traditional,
link |
02:42:22.280
I think most people look at the world through
link |
02:42:24.220
what I would say is a subject-object
link |
02:42:25.980
kind of metaphysical lens,
link |
02:42:27.200
that we have our own subjectivity,
link |
02:42:29.620
and then there's all of these object things that are not us.
link |
02:42:34.220
So I'm me, and these things are not me, right?
link |
02:42:37.220
And I'm interacting with them, I'm doing things to them.
link |
02:42:39.880
But a different view of the world
link |
02:42:41.380
that looks at it as much more connected,
link |
02:42:44.220
that realizes, oh, I'm really quite embedded
link |
02:42:49.220
in a soup of other things,
link |
02:42:50.680
and I'm simply almost like a standing wave pattern
link |
02:42:54.060
of different things, right?
link |
02:42:55.920
So when you look at the world
link |
02:42:58.020
in that kind of connected sense,
link |
02:42:59.140
I've recently taken a shine
link |
02:43:02.980
to this particular thought experiment,
link |
02:43:04.540
which is what if it was the case
link |
02:43:08.560
that everything that we touch with our hands,
link |
02:43:12.380
that we pay attention to,
link |
02:43:13.880
that we actually give intimacy to,
link |
02:43:16.300
what if there's actually all the mumbo jumbo,
link |
02:43:21.300
like people with the magnetic healing crystals
link |
02:43:25.100
and all this other kind of stuff and quantum energy stuff,
link |
02:43:28.240
what if that was a thing?
link |
02:43:30.340
What if literally when your hand touches an object,
link |
02:43:34.140
when you really look at something
link |
02:43:35.420
and you concentrate and you focus on it
link |
02:43:36.840
and you really give it attention,
link |
02:43:39.020
you actually give it,
link |
02:43:40.580
there is some physical residue of something,
link |
02:43:44.240
a part of you, a bit of your life force that goes into it.
link |
02:43:48.120
Okay, now this is of course completely mumbo jumbo stuff.
link |
02:43:51.500
This is not like, I don't actually think this is real,
link |
02:43:53.680
but let's do the thought experiment.
link |
02:43:55.640
What if it was?
link |
02:43:57.720
What if there actually was some quantum magnetic crystal
link |
02:44:01.720
and energy field thing that just by touching this can,
link |
02:44:05.320
this can has changed a little bit somehow.
link |
02:44:08.920
And it's not much unless you put a lot into it
link |
02:44:11.940
and you touch it all the time, like your phone, right?
link |
02:44:15.000
These things gained, they gain meaning to you a little bit,
link |
02:44:19.640
but what if there's something that,
link |
02:44:23.520
technical objects, the phone is a technical object,
link |
02:44:25.240
it does not really receive attention or intimacy
link |
02:44:29.160
and then allow itself to be transformed by it.
link |
02:44:31.980
But if it's a piece of wood,
link |
02:44:33.360
if it's the handle of a knife that your mother used
link |
02:44:36.720
for 20 years to make dinner for you, right?
link |
02:44:40.340
What if it's a keyboard that you banged out,
link |
02:44:43.860
your world transforming software library on?
link |
02:44:46.400
These are technical objects
link |
02:44:47.840
and these are physical objects,
link |
02:44:48.780
but somehow there's something to them.
link |
02:44:51.400
We feel an attraction to these objects
link |
02:44:53.320
as if we have imbued them with life energy, right?
link |
02:44:56.980
So if you walk that thought experiment through,
link |
02:44:58.760
what happens when we touch another person,
link |
02:45:00.440
when we hug them, when we hold them?
link |
02:45:03.160
And the reason this ties into my answer for your question
link |
02:45:07.800
is that if there is such a thing,
link |
02:45:12.800
if there is such a thing,
link |
02:45:13.680
if we were to hypothesize, you know,
link |
02:45:15.660
hypothesize it's such a thing,
link |
02:45:18.480
it could be that the purpose of our lives
link |
02:45:23.440
is to imbue as many things with that love as possible.
link |
02:45:30.640
That's a beautiful answer
link |
02:45:32.820
and a beautiful way to end it, Peter.
link |
02:45:35.240
You're an incredible person.
link |
02:45:36.600
Thank you.
link |
02:45:37.440
Spanning so much in the space of engineering
link |
02:45:41.520
and in the space of philosophy.
link |
02:45:44.760
I'm really proud to be living in the same city as you
link |
02:45:49.000
and I'm really grateful
link |
02:45:51.120
that you would spend your valuable time with me today.
link |
02:45:53.000
Thank you so much.
link |
02:45:53.840
Well, thank you.
link |
02:45:54.660
I appreciate the opportunity to speak with you.
link |
02:45:56.560
Thanks for listening to this conversation with Peter Wang.
link |
02:45:59.240
To support this podcast,
link |
02:46:00.600
please check out our sponsors in the description.
link |
02:46:03.520
And now let me leave you with some words
link |
02:46:05.680
from Peter Wang himself.
link |
02:46:07.960
We tend to think of people
link |
02:46:09.480
as either malicious or incompetent,
link |
02:46:12.480
but in a world filled with corruptible
link |
02:46:15.200
and unchecked institutions,
link |
02:46:17.120
there exists a third thing, malicious incompetence.
link |
02:46:21.080
It's a social cancer
link |
02:46:22.620
and it only appears once human organizations scale
link |
02:46:26.000
beyond personal accountability.
link |
02:46:27.840
Thank you for listening and hope to see you next time.