Peter Wang: Python and the Source Code of Humans, Computers, and Reality | Lex Fridman Podcast #250
The following is a conversation with Peter Wang, one of the most impactful leaders and developers in the Python community, former physicist, current philosopher, and someone who many people told me about and praised as a truly special mind that I absolutely should talk to. Recommendations ranged from Travis Oliphant to Eric Weinstein. So, here we are. This is the Lex Fridman podcast. To support it, please check out our sponsors in the description. And now, here's my conversation with Peter Wang.
You're one of the most impactful humans in the Python ecosystem. You're an engineer, a leader of engineers, but you're also a philosopher. So let's talk in this conversation about both programming and philosophy. First, programming. What to you is the best, or maybe the most beautiful, feature of Python? Or maybe the thing that made you fall in love, or stay in love, with Python?
Well, those are three different things: what I think is most beautiful, what made me fall in love, and what made me stay in love. When I first started using it, I was a C++ computer graphics performance nerd.

In the 90s?

Yeah, in the late 90s. That was my first job out of college. We kept trying to do more and more abstract and higher-order programming in C++, which at the time was quite difficult with templates. The compiler support wasn't great, et cetera. So when I started playing around with Python, that was my first time encountering really first-class support for types, for functions, and things like that. It felt so incredibly expressive. That was part of what made me fall in love with it. And also, once you've spent a lot of time in a C++ dev environment, the ability to just whip something together that basically runs and works the first time is amazing. It's a really productive scripting language. I knew Perl, I knew Bash, and I was decent at both. But Python just made everything accessible; it made the whole world accessible. I could script this, that, and the other network things, little hard drive utilities; I could write all of these things in the space of an afternoon. And that was really, really cool. So that's what made me fall in love.
Is there something specific you could put your finger on for why you're not programming in Perl today? Why Python for scripting?

Oh, I think it's not a specific thing so much as the design motif of both the creator of the language and the core group of people who built the standard library around him. There was definitely a taste to it. I mean, Steve Jobs used that term in a somewhat arrogant way, but I think it's a real thing: it was designed to fit. A friend of mine actually expressed this really well. He said, Python just fits in my head. And there's nothing better to say than that. Now, people might argue that modern Python has a lot more complexity, but certainly version 1.5.2, which I think was my first version, fit in my head very easily. So that's what made me fall in love with it.
Okay. So, the most beautiful feature of Python, the one that made you stay in love: over the years, what do you do a double take on and return to often as a thing that just brings you a smile?

I really still like the ability to play with metaclasses and express higher-order things when I have to create some new object model to model something, right? It's easy for me because I'm a pretty expert Python programmer. I can easily put all sorts of lovely things together, use properties and decorators and other kinds of things, and create something that feels very nice. So for me, I would say that's tied with the NumPy and vectorization capabilities. I love thinking in terms of matrices and vectors and those kinds of data structures. I would say those two are tied for me.
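As an aside, here is a minimal sketch (my illustration, not from the conversation) of the kind of object-model plumbing being described: a metaclass, a property, and a decorator working together. All the names are hypothetical.

```python
def logged(fn):
    """Decorator: wrap a method so calls are recorded on the instance."""
    def wrapper(self, *args, **kwargs):
        self.calls.append(fn.__name__)
        return fn(self, *args, **kwargs)
    return wrapper

class Registry(type):
    """Metaclass: every concrete subclass registers itself by name."""
    models = {}
    def __new__(mcls, name, bases, ns):
        cls = super().__new__(mcls, name, bases, ns)
        if bases:  # skip the abstract base class itself
            Registry.models[name] = cls
        return cls

class Model(metaclass=Registry):
    def __init__(self, value):
        self.calls = []
        self._value = value

    @property
    def value(self):
        # read-only attribute access, computed via the property protocol
        return self._value

    @logged
    def double(self):
        return self._value * 2

class Point(Model):
    pass

p = Point(21)
print(p.double())       # 42
print(p.calls)          # ['double']
print(Registry.models)  # {'Point': <class '__main__.Point'>}
```

A few lines of metaclass code give every subclass automatic registration, while the decorator and property express behavior that would take far more ceremony in C++.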
So, the elegance of the NumPy data structure, like slicing through the different multidimensional...

Yeah, there's just enough there. It's a very simple, comfortable tool. It's easy to reason about what it does when you don't stray too far afield.
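The slicing being referred to looks like this (a small illustration of mine, not from the conversation):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)  # 3x4 matrix of 0..11

print(a[1, :])      # second row: [4 5 6 7]
print(a[:, 2])      # third column: 2, 6, 10
print(a[::2, 1:3])  # every other row, columns 1-2

# Vectorized arithmetic: no explicit loops, applied elementwise
print((a * 2 + 1).sum(axis=0))  # per-column sums
```

The same index notation scales from one dimension to many, which is a large part of why it "fits in the head."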
Can you put your finger on how to design a language such that it fits in your head? Certain things, like the colon or certain notational aspects of Python, just kind of work. Is it something you have to write out on paper, look at, and say, it's just right? Is it a taste thing, or is there a systematic process? What's your sense?
I think it's more of a taste thing. But one thing that should be said is that you have to pick your audience. The better defined the user audience is, the easier it is to build something that fits in their minds, because their needs will be more compact and coherent; it is possible to find a compact projection for their needs. The more diverse the user base, the harder that is. And so, as Python has grown in popularity, that's also naturally created more complexity: as people try to design any given thing, there will be multiple valid opinions about a particular design approach. So I do think that's the downside of popularity. It's almost an intrinsic aspect of the complexity of the problem.

Well, at the very beginning, aren't you an audience of one? Weren't all the greatest projects in history just solving a problem that you yourself had?
Well, Clay Shirky, in his thoughts on crowdsourcing, identifies the first step of crowdsourcing as me-first collaboration: you first have to make something that works well for yourself. It's very telling that when you look at all of the impactful, big projects, which are now the fundamental projects in the SciPy and PyData ecosystem, they all started with people in the domain trying to scratch their own itch. And the whole idea of scratching your own itch is something the open source, or free software, world has known for a long time. But in the scientific computing areas, these were assistant professors or electrical engineering grad students. They didn't necessarily have a lot of programming skill, but Python was just good enough for them to put something together that fit their domain. So it's almost a necessity-is-the-mother-of-invention aspect. It was also a really harsh filter for utility, compactness, and expressiveness: if it had been too hard to use, they wouldn't have built it, because it was just too much trouble. It was a side project for them.

And necessity also creates a kind of deadline. It seems like a lot of these projects were quickly thrown together in the first step, and even though the result is flawed, that just seems to work well for software projects.
Well, it does work well for software projects in general, and in this particular space. One of my colleagues, Stan Siebert, identified this: all the projects in the SciPy ecosystem, if we just rattle them off, there's NumPy, there's SciPy, built by different collaborations of people, although Travis is at the heart of both of them. NumPy came from Numeric and Numarray, and those were different people. And then you've got Pandas, you've got Jupyter, or IPython, there's Matplotlib; there are so many others I'm not going to do justice to if I try to name them all. But all of them are actually different people. And as they rolled out their projects, the fact that they had limited resources meant that they were humble about scope. The great, famous hacker Jamie Zawinski once said that every geek's dream is to build the ultimate middleware, right? The thing is, these scientists-turned-programmers had no such ambition. They were just trying to write something that was a little bit better for what they needed than MATLAB, and they were going to leverage what everyone else had built. So naturally, almost through a kind of annealing process, we built a very modular cover of the basic needs of a scientific computing library.

If you look at the whole human story, how much of a leap is it? We've developed all kinds of languages, all kinds of methodologies for communication, and kind of grew this collective intelligence; the civilization grew, it expanded, we wrote a bunch of books, and now we tweet. How big of a leap is programming, if programming is yet another language? Is it just a nice little trick that's temporary in our human history? Or is it a big leap, almost us becoming another organism at a higher level of abstraction, something else?
I think the act of programming, or using grammatical constructions of some underlying primitives, is something that humans learn, but every human learns this; anyone who can speak learns how to do this. What has made programming different, up to this point, is that when we try to give instructions to computing systems, all of our computers... well, actually, this is not quite true, but I'll first say it and then I'll tell you why it's not true. For the most part, we can think of computers as iterated systems. So when we program, we're giving very precise instructions to iterated systems that then run those instructions at incomprehensible speed. In my experience, some people are just better equipped to model systematic, iterated systems in their head; some people are really good at that and other people are not. For instance, people have sometimes tried to build systems that make programming easier by making it visual, drag and drop. The issue is, you can have a drag-and-drop thing, but once you start having to iterate the system with conditional logic, handling case statements and branch statements and all these other things, the visual drag-and-drop part doesn't save you anything. You still have to reason about this giant iterated system with all these different conditions around it. That's the hard part. Handling iterated logic, that's the hard part. The languages we use then emerge to give us ability and capability over these things.
Now, the one exception to this rule, of course, is the most popular programming system in the world, which is Excel: a data-flow, data-driven, immediate-mode, data-transformation-oriented programming system. And it's not an accident that it is the most popular programming system, because it's so accessible to a much broader group of people.
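The Excel-style dataflow model can be caricatured in a few lines (a toy sketch of my own, not from the conversation): cells hold constants or formulas over other cells, and values are recomputed on demand, with no explicit control flow for the user to manage.

```python
class Cell:
    """A spreadsheet-like cell: holds a constant or a formula over other cells."""
    def __init__(self, formula):
        # formula is either a plain value or a zero-argument callable
        self.formula = formula

    @property
    def value(self):
        # Recompute from current inputs on every read (immediate mode)
        return self.formula() if callable(self.formula) else self.formula

a = Cell(2)
b = Cell(3)
c = Cell(lambda: a.value + b.value)  # like writing "=A1+B1"

print(c.value)    # 5
a.formula = 10    # edit a cell; dependents see the change on the next read
print(c.value)    # 13
```

The user only ever states relationships between values; the "program" is the dependency graph, which is exactly what makes the model approachable.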
I do think that as we build future computing systems, and you're already seeing this a little bit, it's much more about composition of modular blocks that themselves maintain all their internal state, where the interfaces between them are well-defined data schemas. Stitching these things together using IFTTT or Zapier or any of these, I would say, compositional scripting kinds of things, and HyperCard was also a little bit in this vein, is much more accessible to most people. It's really that implicit state that's so hard for people to track.

Yeah, okay. So that's modular stuff. But there's also an
aspect where you're standing on the shoulders of giants. You're building higher and higher levels of abstraction. You do that a little bit with language: you develop ideas, philosophies, from Plato and so on, and then you leverage those philosophies as you try to have deeper and deeper conversations. But with programming, it seems like you can build much more complicated systems. Without knowing how everything works, you can build on top of the work of others, and it seems like you're developing more and more sophisticated expressions, the ability to express ideas in a computational space.

I think it's worth pondering the difference here between complexity and complication. Back to Excel. Well, not quite back to Excel, but the idea is this: when we have a human conversation, all languages for humans emerged to support human relational communication, where the person we're communicating with is a person who communicates back to us. And so we hit a resonance point when we actually agree on some concepts. So there's a messiness to it, and there's a fluidity to it. With computing systems, when we express something to the computer and it's wrong, we just try again. We can basically live through many virtual worlds of having failed to express ourselves to the computer, until the one time we express ourselves right. Then we put it in production and discover that it's still wrong a few days down the road. So with the sophistication of things we build with computing, one has to really pay attention to the difference between when an end user is expressing something onto a system that exists, versus when they're extending the system to increase its capability for someone else to interface with. We happen to use the same language for both of those things in most cases, but it doesn't have to be that way. And Excel is actually
a great example of this, of kind of a counterpoint to that.

Okay. So what about this idea: you said messiness. Wouldn't you put the software 2.0 idea, this idea of machine learning, further and further into the world of messiness, the same kind of beautiful messiness as human communication? Isn't that what machine learning is: building on levels of abstraction that don't have messiness in them? At the operating system level, and then Python and the programming languages that have more and more power, but then finally there are neural networks that ultimately work with data. And so the programming is almost in the space of data, and the data is allowed to be messy. Isn't that a kind of program? The idea of software 2.0 is that a lot of the programming happens in the space of data, all roads lead back to Excel, in the space of data and also the hyperparameters of the neural networks. And all of those allow the same kind of messiness that human communication allows.
It does, but my background is in physics; I took like two CS courses in college. Now, I did cram a bunch of CS in prep when I applied for grad school, but still, I don't have a formal background in computer science. What I have observed in studying programming languages and programming systems is that there seems to be this triangle, one of these beautiful little iron triangles that you find in life sometimes. It's the connection between the correctness and expressiveness of the code, the semantics of the data, and the correctness or parameters of the underlying hardware compute system. So there are the algorithms that you want to apply; there's what the bits stored on whatever media actually represent, the semantics of the data within their representation; and then there's what the computer can actually do. Every programming system, every information system, ultimately finds some spot in the middle of this little triangle. Sometimes systems collapse them into just one edge.

Are we including humans as a system in this?

No, no, I'm just thinking about computing systems here.
Okay.

And the reason I bring this up is because I believe there's no free lunch around this stuff. If we build machine learning systems to write correct code at a certain level of performance, where with the hyperparameters we can tune how we want the performance boundary and SLA to look for transforming some set of inputs into certain kinds of outputs, that training process is itself intrinsically sensitive to the kinds of inputs we put into it, and quite sensitive to the boundary conditions we put around the performance. So even as we move to using automated systems to build this transformation, as opposed to humans explicitly, from a top-down perspective, figuring out, well, this schema and this database and these columns get selected for this algorithm, and here we put a Fibonacci heap for some other thing, whether human-designed or computer-designed, ultimately the boundaries that we hit with these information systems are where the representation of the data hits the real world, because that's where there's a lot of slop and a lot of interpretation. And that's where I think a lot of the work will go in the future: understanding, in the view of these live data systems, how to better encode the semantics of the world for those things. There will be less of the details of how we write a particular SQL query.
Okay. But given the semantics of the real world and the messiness of that, what does the word correctness mean when you're talking about code?
There are a lot of dimensions to correctness. Historically, and this is one of the reasons I say that we're coming to the end of the era of software, for the last 40 years or so software correctness was really defined as functional correctness: I write a function, it's got some inputs, does it produce the right outputs? If so, then I can turn it on, hook it up to the live database, and it goes. More and more now, and in fact I think this is the bright line in the sand between machine learning systems, or modern data-driven systems, and classical software systems, the values of the input actually have to be considered together with the function to say whether this whole thing is correct or not. And usually there's a performance SLA as well, a service level agreement: did it actually finish within the SLA? So it has to return within some time; you have a 10 millisecond time budget to return a prediction at this level of accuracy. These are things that were not in most traditional business computing systems for the last 20 years at all; people didn't think about it. But now we have value dependence on functional correctness, so that question of correctness is becoming a bigger and bigger question.

How does that map to the end of software?
We've thought about software as just this thing that you can do in isolation, with some test trial inputs, in a very sandboxed environment. And we can quantify: how does it scale, how does it perform, how many nodes do we need to allocate if we want to handle this many inputs? When we start turning this stuff into prediction systems, real cybernetic systems, you're going to find scenarios where you get inputs that you want to spend a little more time thinking about. You're going to find inputs where it's not clear what you should do, right? So then the software has a varying amount of runtime and correctness with regard to its input. And that is a different kind of system altogether. Now it's a full-on cybernetic system, a next-generation information system that is not like traditional software systems.

Can you maybe describe what is a cybernetic system?
Do you include humans in that picture? Is it the human-in-the-loop kind of complex mess of the whole interactivity of software with the real world, or is it something more concrete?

Well, when I say cybernetic, I really do mean that the software itself is closing the Observe-Orient-Decide-Act (OODA) loop by itself. Humans being out of the loop is what, for me, makes it a cybernetic system. When humans are out of the loop, when the machine is actually deciding on its own what it should do next to get more information, that makes it a cybernetic system. And we're just at the dawn of this, right? Everyone talking about ML and AI, that's great, but really the thing we should be talking about is when we truly enter the cybernetic era, because all of the questions of ethics and governance and correctness, those really are the most important questions.
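The OODA loop being described can be sketched schematically (my own hypothetical example, not from the conversation): the software observes, orients, decides, and acts, with no human anywhere in the cycle.

```python
import random

def observe():
    """Observe: read a sensor (simulated here as a random temperature)."""
    return random.uniform(15.0, 30.0)

def orient(reading, setpoint=22.0):
    """Orient: interpret the reading as an error relative to a target."""
    return reading - setpoint

def decide(error, deadband=1.0):
    """Decide: pick an action; small errors are left alone."""
    if abs(error) <= deadband:
        return "hold"
    return "cool" if error > 0 else "heat"

def act(command):
    """Act: in a real system this would drive hardware."""
    print(f"actuator: {command}")

# The loop itself is what closes the system: no human intervenes between cycles.
for _ in range(3):
    act(decide(orient(observe())))
```

Even in a toy like this, the designer only set criteria (setpoint, deadband); every individual decision is the machine's, which is the point being made about imprecise human intent.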
Okay. Can we just linger on this? What does it mean for the human to be out of the loop in a cybernetic system? Isn't the cybernetic system ultimately accomplishing some kind of purpose where, at the bottom, it's turtles all the way down, and the bottom turtle is a human?

Well, the human may have set some criteria, but the human wasn't precise. For instance, I just read the other day that earlier this year, or maybe at some point last year, the Libyan army, I think, sent out some automated killer drones with explosives. And there was no human in the loop at that point. They basically put them in a geofenced area and said, find any moving target, like a truck or vehicle, that looks like this, and boom. That's not a human in the loop, right? So increasingly, the less human there is in the loop, the more concerned you are about these kinds of systems, because there are unintended consequences that the original designer and engineer of the system is less able to predict.

Even one with good intent is less able to predict the consequences of such a system.
That's right. There are some software systems that run without humans in the loop that are quite complex, like the electronic markets, and we get flash crashes all the time. In the heyday of high frequency trading, there was a lot of market microstructure, people doing all sorts of weird stuff that the market designers had never really thought about, contemplated, or intended. So when we run these full-on systems with automated trading bots, and now they become automated killer drones and all sorts of other things, that's what I mean by: we're at the dawn of the cybernetic era and the end of the era of pure software.

Are you more concerned, when you're thinking about cybernetic systems, about self-replicating systems? Systems that aren't just doing a particular task, but are able to multiply and scale in some dimension in the digital or even the physical world.
Are you more concerned about the lobster being slowly boiled, a gradual collapse of civilization that we don't notice, or a big explosion, an oops kind of thing where everyone notices, but it's too late?

I think it will be a different experience for different people. I do share a common point of view with some of the people who are concerned about climate change and the big existential risks that we have. But unlike a lot of people who share my level of concern, I think the collapse will not be quite so dramatic as some of them think. What I mean is that for certain tiers of, let's say, economic class, or certain locations in the world, people will experience dramatic collapse scenarios. But for a lot of people, especially in the developed world, the realities of collapse will be managed. There will be narrative management around it, so that they're essentially insulated; the middle class will be used to insulate the upper class from the pitchforks and the flaming torches and everything.
It's interesting, because my specific question wasn't that general; my question was more about cybernetic systems or software.

Okay. It's interesting, but it would nevertheless perhaps be about class. So the effect of algorithms might affect certain classes more than others.

Absolutely.

I was more thinking about, whether it's social media algorithms or actual robots, is there going to be a gradual effect on us, where we wake up one day and don't recognize the humans we are? Or is it something truly dramatic, a meltdown of a nuclear reactor kind of thing, a Chernobyl, catastrophic events that are almost bugs in a program that scaled itself too quickly?

Yeah, I'm not as concerned about the visible stuff, and the reason is that big visible explosions get seen. I mean, this is something I've said about social media: at least with nuclear weapons, when a nuke goes off, you can see it, and you're like, well, that's really, wow, that's kind of bad. I mean, Oppenheimer was reciting the Bhagavad Gita when he saw one of those things go off. So we can see nukes are really bad.

He's not reciting anything about Twitter.

Well, right. But when you have social media, when you have all these different things that conspire to create a layer of virtual experience for people, one that alienates them from reality and from each other, that's very pernicious. It's impossible to see, right? And it slowly gets in there.

You've written about the idea of virtuality on this topic, which you define as the subjective phenomenon of knowingly engaging with virtual sensation and perception, and suspending or forgetting the context that it's a simulacrum. So let me ask, what is real? Is there
a hard line between reality and virtuality? Perception drifts from some kind of physical reality; we have to have some sense of where the line is, where we've gone too far.

Right, right. For me, it's not about any hard line around physical reality as much as a simple question: does the particular technology help people connect in a more integral way with other people, with their environment, with the full spectrum of things around them? So it's less about, oh, this is a virtual thing and this is a hard, real thing, and more about the fact that when we create virtual representations of real things, some things are always lost in translation. Usually many, many dimensions are lost in translation, right? We're now coming up on almost two years of COVID, with people on Zoom all the time. It's different when you meet somebody in person than when you see them on a screen; I've seen you on YouTube lots, right? But seeing you in person is very different. So when we engage in virtual experiences all the time, and we only do that, there is absolutely a level of embodied experience and participatory interaction that is lost. And it's very hard to put your finger on exactly what it is. It's hard to say, oh, we're going to spend $100 million building a new system that captures higher-fidelity human expression 5% better. No one's going to pay for that, right?
So when we rush madly into a world of simulacrum and virtuality,
link |
00:26:00.160
you know, the things that are lost are, it's difficult. Once everyone moves there, it can
link |
00:26:05.520
be hard to look back and see what we've lost. So is it irrecoverably lost? Or rather,
link |
00:26:11.360
when you put it all on the table, is it possible for more to be gained than is lost?
link |
00:26:16.960
If you look at video games, they create virtual experiences that are surreal and can bring joy
link |
00:26:23.600
to a lot of people, connect a lot of people and can get people to talk a lot of trash.
link |
00:26:29.760
So they can bring out the best and the worst in people. So is it possible to have a future world
link |
00:26:35.600
where the pros outweigh the cons? It is. I mean, it's possible to have that in the current world.
link |
00:26:41.520
But when literally trillions of dollars of capital are tied to using those things to
link |
00:26:49.920
groom the worst of our inclinations and to attack our weaknesses in the limbic system
link |
00:26:55.920
to turn these things into id machines versus connection machines,
link |
00:26:59.120
then those good things don't stand a chance.
link |
00:27:03.040
Can you make a lot of money by building connection machines? Is it possible,
link |
00:27:07.520
do you think, to bring out the best in human nature to create fulfilling connections and
link |
00:27:12.960
relationships in the digital world and make a shit ton of money?
link |
00:27:18.560
If I figure it out, I'll let you know.
link |
00:27:20.880
But what's your intuition without concretely knowing what's the solution?
link |
00:27:24.560
My intuition is that a lot of our digital technologies give us the ability to have
link |
00:27:29.680
synthetic connections or to experience virtuality. They have coevolved with
link |
00:27:36.080
sort of the human expectations. It's sort of like sugary drinks. As people have more sugary
link |
00:27:41.920
drinks, they need more sugary drinks to get that same hit. So with these virtual things
link |
00:27:47.760
and with TV and fast cuts and TikToks and all these different kinds of things,
link |
00:27:52.720
we're co-creating essentially humanity that sort of asks and needs those things.
link |
00:27:57.200
And now it becomes very difficult to get people to slow down. It gets difficult for people to
link |
00:28:01.600
hold their attention on slow things and actually feel that embodied experience.
link |
00:28:07.120
So mindfulness now more than ever is so important in schools and as a therapy technique
link |
00:28:12.960
for people because our environment has been accelerated. And McLuhan actually talks about
link |
00:28:16.960
this in the electric environment of the television. And that was before TikTok and before front
link |
00:28:21.280
facing cameras. So I think for me, the concern is that it's not like we can ever switch to
link |
00:28:27.040
doing something better, but more that humans and technology are not independent of each
link |
00:28:32.880
other. The technology that we use kind of molds what we need for the next generation of technology.
link |
00:28:38.960
Yeah, but humans are intelligent and they're introspective and they can reflect on the
link |
00:28:44.400
experiences of their life. So for example, there's been many years in my life where I ate an
link |
00:28:49.120
excessive amount of sugar. And then a certain moment I woke up and said,
link |
00:28:53.280
why do I keep doing this? This doesn't feel good. Like long term. And I think
link |
00:29:00.240
so going through the TikTok process of realizing, okay, when I shorten my attention span,
link |
00:29:06.240
actually that does not make me feel good longer term and realizing that and then going to platforms,
link |
00:29:13.120
going to places that are away from the sugar. And in so doing, you can create platforms that can
link |
00:29:21.120
make a lot of money by helping people wake up to what actually makes them feel good long term,
link |
00:29:26.160
to develop and grow as human beings. And it just feels like humans are more intelligent than
link |
00:29:32.560
mice looking for cheese. They're able to sort of think, I mean, we can think we can contemplate
link |
00:29:38.240
our mortality and contemplate things like long term love and we can have a long term fear of
link |
00:29:46.000
certain things like mortality. We can contemplate whether the experiences, sort of the drugs
link |
00:29:52.560
of daily life that we've been partaking in are making us happier, better people. And then
link |
00:29:58.880
once we contemplate that we can make financial decisions in using services and paying for services
link |
00:30:04.720
that are making us better people. So it just seems that we're in the very first stages of
link |
00:30:11.920
social networks that were just able to make a lot of money really quickly. But in bringing out
link |
00:30:18.480
sometimes the bad parts of human nature, they didn't destroy humans. They just fed everybody
link |
00:30:24.880
a lot of sugar. And now everyone's going to wake up and say, hey, we're going to start having
link |
00:30:29.920
like sugar free social media. Right. Well, there's a lot to unpack there. I think some people
link |
00:30:35.680
certainly have the capacity for that. And I certainly think, I mean, it's very interesting
link |
00:30:39.520
even the way you said it. You woke up one day and you thought, well, this doesn't feel very good.
link |
00:30:44.000
Well, that's still your limbic system saying this doesn't feel very good.
link |
00:30:47.120
Right. You have a cat brain's worth of neurons around your gut. Right. And so maybe that
link |
00:30:51.920
saturated and that was telling you, hey, this isn't good. Humans are more than just mice looking
link |
00:30:57.760
for cheese or monkeys looking for sex and power. Right. So let's slow down. Now,
link |
00:31:03.520
a lot of people would argue with you on that one. But we're more than just that,
link |
00:31:07.280
but we're at least that. And we're very, very seldom not that. So I don't actually disagree with
link |
00:31:14.800
you that we could be better and that we can, that better platforms exist. And people are
link |
00:31:18.560
voluntarily noping out of things like Facebook and noping out. That's an awesome verb.
link |
00:31:22.640
It's a great term. Yeah, I love it. I use it all the time. You're welcome. I'm going to have to
link |
00:31:26.240
nope out of that. I want to nope out of that. Right. It's going to be a hard pass. And that's,
link |
00:31:30.880
and that's, that's great. But that's again, to your point, that's the first generation
link |
00:31:35.840
of front facing cameras and social pressures, and you as a, you know, self starter, self aware
link |
00:31:42.560
adult have the capacity to say, yeah, I'm not going to do that. I'm going to go and spend time
link |
00:31:47.440
on long form reads. I'm going to spend time managing my attention. I'm going to do some yoga.
link |
00:31:52.000
If you're a 15 year old in high school and your entire social environment is everyone doing
link |
00:31:57.680
these things, guess what you're going to do? You're going to kind of have to do that because
link |
00:32:00.720
your limbic system says, hey, I need to get the guy or the girl or the whatever. And that's
link |
00:32:04.640
what I'm going to do. And so one of the things that we have to reason about here is the social
link |
00:32:08.640
media systems or, you know, social media, I think is a first, our first encounter with
link |
00:32:16.000
a technological system that runs a bit of a loop around our own cognition and attention.
link |
00:32:24.080
It's not the last. It's far from the last. And it gets to the heart of some of the
link |
00:32:30.000
philosophical Achilles heel of the Western philosophical system, which is each person
link |
00:32:34.960
gets to make their own determination. Each person is an individual that's sacrosanct
link |
00:32:39.600
and their agency and their sovereignty and all these things. The problem with these systems is
link |
00:32:44.160
they come down and they are able to manage everyone en masse. And so every person is making
link |
00:32:48.960
their own decision, but together the bigger system is causing them to act with a group
link |
00:32:54.560
dynamic that's very profitable for people. So this is the issue that we have is that our philosophies
link |
00:33:01.920
are actually not geared to understand what is it for a person to have a high trust connection
link |
00:33:10.160
as part of a collective and for that collective to have its right to coherency and agency.
link |
00:33:16.160
That's something like when a social media app causes a family to break apart,
link |
00:33:20.800
it's done harm to more than just individuals. So that concept is not something we really talk
link |
00:33:26.880
about or think about very much, but that's actually the problem is that we're vaporizing
link |
00:33:31.200
molecules into atomic units and then we're hitting all the atoms with certain things that's like,
link |
00:33:35.440
yeah, well, that person chose to look at my app. So our understanding of human nature is at the
link |
00:33:41.200
individual level. It emphasizes the individual too much because ultimately society operates at
link |
00:33:46.320
the collective level. And these apps do as well. And the apps do as well. So for us to understand
link |
00:33:51.360
the progression and development of this organism, we call human civilization,
link |
00:33:55.760
we have to think of the collective level too. I would say multi tiered. Multi tiered. So individual
link |
00:34:00.880
as well. Just individuals, family units, social collectives, and on the way up. Okay. So you've
link |
00:34:07.760
said that individual humans are multi layered, susceptible to signals and waves at multiple
link |
00:34:12.640
strata, the physical, the biological, social, cultural, intellectual. So sort of going along
link |
00:34:18.640
these lines, can you describe the layers of the cake that is a human being, and maybe the
link |
00:34:26.720
collective that is human society? So I'm just stealing wholesale here from Robert Pirsig, who is the
link |
00:34:33.200
author of Zen and the Art of Motorcycle Maintenance. And that book has a sequel called
link |
00:34:40.160
Lila. He goes into this in a little more detail. But it's a crude approach to thinking about people,
link |
00:34:46.800
but I think it's still an advancement over traditional subject object metaphysics,
link |
00:34:51.120
where we look at people as a dualist would say, well, is your mind, your consciousness,
link |
00:34:57.040
is that just merely the matter that's in your brain? Or is there something kind of more beyond
link |
00:35:03.040
that? And they would say, yes, there's a soul, sort of ineffable soul beyond just merely the
link |
00:35:07.840
physical body. And I'm not one of those people. I think that we don't have to draw a line between
link |
00:35:14.400
are things only this or only that, collectives of things can emerge structures and patterns
link |
00:35:19.600
that are just as real as the underlying pieces. But they're transcendent, but they're still of
link |
00:35:24.800
the underlying pieces. So your body is this way. I mean, we just know physically, you consist of
link |
00:35:30.480
atoms and whatnot. And then the atoms are arranged into molecules, which then arrange into certain
link |
00:35:36.240
kinds of structures that seem to have a homeostasis to them, we'll call them cells. And those cells
link |
00:35:41.200
form biological structures. Those biological structures give your body its physical ability
link |
00:35:48.000
and biological ability to consume energy and to maintain homeostasis. But humans are social
link |
00:35:53.440
animals. I mean, a human by themselves is not very long for the world. So part of our biology
link |
00:35:59.440
is wired to connect to other people, from the mirror neurons to our language centers and all
link |
00:36:04.960
these other things. So we are intrinsically, there's a layer, there's a part of us that
link |
00:36:10.640
wants to be part of a thing. If we're around other people, not saying a word, but they're just up
link |
00:36:15.200
and down jumping and dancing, laughing, we're going to feel better, right? And there was no
link |
00:36:19.920
exchange of physical anything. They didn't give us like five atoms of happiness, right? But there's
link |
00:36:25.120
an induction in our own sense of self that is at that social level. And then beyond that,
link |
00:36:30.640
Pirsig puts the intellectual level kind of one level higher than social. I think they're actually
link |
00:36:34.320
more intertwined than that. But the intellectual level is the level of pure ideas that you are a
link |
00:36:40.160
vessel for memes. You're a vessel for philosophies. You will conduct yourself in a particular way.
link |
00:36:46.320
I mean, I think part of this is if we think about it from a physics perspective, you're not, you
link |
00:36:50.400
know, there's the joke that physicists like to approximate things and we'll say, well, approximate
link |
00:36:55.120
a spherical cow, right? You're not a spherical cow. You're not a spherical human. You're a messy
link |
00:36:58.800
human. You're a messy human. And we can't even say what the dynamics of your emotion will be
link |
00:37:04.640
unless we analyze all four of these layers, right? If you're a Muslim at a certain time of day,
link |
00:37:11.200
guess what? You're going to be on the ground kneeling and praying, right? And that has nothing
link |
00:37:14.480
to do with your biological need to get on the ground or physics of gravity. It is an intellectual
link |
00:37:19.440
drive that you have. It's a cultural phenomenon and an intellectual belief that you carry. So
link |
00:37:24.000
that's what the four layered stack is all about. It's that a person is not only one of these things.
link |
00:37:30.160
They're all of these things at the same time. It's a superposition of dynamics that run through
link |
00:37:35.360
us that make us who we are. So no layer is special? Not so much that no layer is special. Each layer is
link |
00:37:42.080
just different. But we are... Each layer gets a participation trophy. Each layer is a part
link |
00:37:49.440
of what you are. You are a layer cake of all these things. And if we try to deny... So many
link |
00:37:54.640
philosophies do try to deny the reality of some of these things. Some people will say,
link |
00:37:59.520
well, we're only atoms. Well, we're not only atoms because there's a lot of other things that are
link |
00:38:03.360
only atoms. I can reduce a human being to a bunch of soup and they're not the same thing,
link |
00:38:08.320
even though it's the same atoms. So I think the order and the patterns that emerge within humans
link |
00:38:14.240
to understand, to really think about what a next generation of philosophy would look like,
link |
00:38:18.880
that would allow us to reason about extending humans into the digital realm or to interact
link |
00:38:23.600
with autonomous intelligences that are not of a biological nature. We really need to appreciate
link |
00:38:28.960
these... What human beings actually are is the superposition of these different layers.
link |
00:38:34.640
You mentioned consciousness. Are each of these layers of cake conscious? Is consciousness a
link |
00:38:40.560
particular quality of one of the layers? Is there like a spike if you have a consciousness
link |
00:38:45.920
detector at these layers? Or is it something that just permeates all of these layers and just takes
link |
00:38:50.800
different form? I believe what humans experience as consciousness is something that sits on a gradient
link |
00:38:56.480
scale of a general principle in the universe that seems to look for order and reach for order when
link |
00:39:05.040
there's an excess of energy. It would be odd to say a proton is alive. It'd be odd to say this
link |
00:39:10.160
particular atom or molecule of hydrogen gas is alive. But there's certainly something we can make
link |
00:39:19.920
assemblages of these things that have autopoietic aspects to them, that will create structures,
link |
00:39:25.360
that will... Crystalline solids will form very interesting and beautiful structures. This gets
link |
00:39:30.240
kind of into weird mathematical territories. You start to think about Penrose and Game of
link |
00:39:34.320
Life stuff about the generativity of math itself, like the hyperreal numbers, things like that.
link |
00:39:39.360
But without going down that rabbit hole, I would say that there seems to be a tendency
link |
00:39:45.520
in the world that when there is excess energy, things will structure and pattern themselves.
link |
00:39:51.200
And they will then actually furthermore try to create an environment that furthers their continued
link |
00:39:56.480
stability. It's the concept of externalized extended phenotype or niche construction. So
link |
00:40:03.520
this is ultimately what leads to certain kinds of amino acids forming certain kinds of structures
link |
00:40:09.040
and so on and so forth until you get the ladder of life. So what we experience as consciousness,
link |
00:40:12.800
no, I don't think cells are conscious at that level. But is there something beyond mere equilibrium
link |
00:40:18.400
state biology and chemistry and biochemistry that drives what makes things work? I think there is.
link |
00:40:27.440
So Adrian Bejan has his constructal law. There's other things you look at when you look at the
link |
00:40:31.680
life sciences and you look at any kind of statistical physics and statistical mechanics.
link |
00:40:36.800
When you look at things far out of equilibrium, when you have excess energy,
link |
00:40:42.080
what happens then? Life doesn't just make a hotter soup. It starts making structure.
link |
00:40:47.280
There's something there.
link |
00:40:48.240
With the poetry of reaches for order when there's an excess of energy.
link |
00:40:54.080
Because you brought up Game of Life.
link |
00:40:57.120
You did it. Not me. I love cellular automata, so I have to sort of linger on that for a little bit.
link |
00:41:03.920
So cellular automata, I guess, or Game of Life is a very simple example of reaching for order
link |
00:41:11.680
when there's an excess of energy or reaching for order and somehow creating complexity.
link |
00:41:18.480
It's an explosion of just turmoil, somehow trying to construct structures, and in so doing
link |
00:41:26.320
creates very elaborate organism looking type things. What intuition do you draw from the
link |
00:41:34.080
simplest mechanism? Well, I like to turn that on its head and look at it as what if every
link |
00:41:40.880
single one of the patterns created life or created, not life, but created interesting
link |
00:41:46.880
patterns? Because some of them don't and sometimes you make cool gliders. And other times,
link |
00:41:51.200
you start with certain things and you make gliders and other things that then construct
link |
00:41:54.800
like AND gates and NOT gates and you build computers on them. All of these rules that
link |
00:41:59.920
create these patterns that we can see, those are just the patterns we can see. What if our
link |
00:42:04.960
subjectivity is actually limiting our ability to perceive the order in all of it? What if some
link |
00:42:11.120
of the things that we think are random are actually not that random? We're simply not
link |
00:42:13.840
integrating at a final level across a broad enough time horizon. And this again, I said,
link |
00:42:19.520
we go down the rabbit holes and the Penrose stuff or like Wolfram's explorations on these things.
link |
00:42:25.840
There is something deep and beautiful in the mathematics of all this that is hopefully one
link |
00:42:29.600
day I'll have enough money to work and retire and just ponder those questions. But there's
link |
00:42:33.600
something there. But you're saying there's a ceiling to when you have enough money and you
link |
00:42:37.200
retire and you ponder it, there's a ceiling to how much you can truly ponder because there's
link |
00:42:41.200
cognitive limitations in what you're able to perceive as a pattern. Yeah. And maybe mathematics
link |
00:42:48.720
extends your perception capabilities, but it's still finite. It's just like...
link |
00:42:55.280
Yeah. The mathematics we use is the mathematics that can fit in our head.
link |
00:42:58.880
Yeah. Did God really create the integers or did God create all of it? And we just happened
link |
00:43:04.400
at this point in time to be able to perceive integers.
link |
00:43:07.040
Well, he just did the positive ints.
link |
00:43:09.360
I said, did she create all of it?
link |
00:43:14.080
She just created the natural numbers and then we screwed it all up with zero and then, I guess.
link |
00:43:17.600
Okay. But we did. We created mathematical operations so that we can have iterated
link |
00:43:22.880
steps to approach bigger problems. I mean, the entire point of the Arabic numeral system is that
link |
00:43:29.280
it's a rubric for mapping a certain set of operations, folding them into a simple little
link |
00:43:34.000
expression, but that's just the operations that we can fit in our heads. There are many other
link |
00:43:39.280
operations besides. The thing that worries me the most about aliens and humans is that
link |
00:43:46.640
there are aliens all around us and we're too dumb to see them.
link |
00:43:51.520
Oh, certainly. Yeah.
link |
00:43:52.720
Or life. Let's say just life. Life of all kinds of forms or organisms. You know what? Just even
link |
00:43:59.040
the intelligence of organisms is imperceptible to us because we're too dumb and self centered.
link |
00:44:06.880
Well, we're looking for a particular kind of thing. When I was at Cornell, I had a lovely
link |
00:44:11.840
professor of Asian religions, Jane-Marie Law, and she would tell this story about a musician,
link |
00:44:18.160
a Western musician, who went to Japan and he taught classical music and could play all sorts
link |
00:44:23.440
of instruments. He went to Japan and he would ask people, he would basically be looking for
link |
00:44:28.480
things in the style of a Western chromatic scale and these kinds of things. And then
link |
00:44:34.560
finding none of it, he would say, well, there's really no music in Japan, but they're using
link |
00:44:37.920
a different scale. They're playing different kinds of instruments. The same thing she was
link |
00:44:41.040
using as sort of a metaphor for religion as well. In the West, we center a lot of religion,
link |
00:44:45.680
certainly the religions of Abraham, we center them around belief. And in the East, it's more
link |
00:44:50.880
about practice, right? Spirituality and practice rather than belief. So anyway, the point is here,
link |
00:44:55.760
to your point, life. I think so many people are so fixated on certain aspects of self replication
link |
00:45:03.200
or homeostasis or whatever. But if we kind of broaden and generalize this thing of things
link |
00:45:09.200
reaching for order, under which conditions can they then create an environment that sustains that
link |
00:45:14.240
order that allows them, the invention of death is an interesting thing. There are some organisms
link |
00:45:20.960
on earth that are thousands of years old. And it's not like they're incredibly complex,
link |
00:45:25.360
they're actually simpler than the cells that comprise us, but they never die. So at some point,
link |
00:45:31.680
death was invented somewhere along the eukaryotic scale, I mean, even the protists, right? There's
link |
00:45:36.080
death. And why is that, along with sexual reproduction, right? There is something about
link |
00:45:43.840
the renewal process, something about the ability to respond to a changing environment,
link |
00:45:48.080
where, you know, just killing off the old generation and letting new generations
link |
00:45:53.280
try seems to be the best way to fit into the niche.
link |
00:45:56.560
You know, human historians seem to write about wheels and fire as the greatest inventions,
link |
00:46:01.520
but it seems like death and sex are pretty good. And they're kind of essential inventions
link |
00:46:06.400
at the very beginning. At the very beginning, yeah. Well, we didn't invent them, right?
link |
00:46:10.400
Well, the broad we. You didn't invent them, as in you, one particular Homo sapiens, did not
link |
00:46:17.200
invent them, but we together, it's a team project, just like you're saying.
link |
00:46:21.680
I think the greatest Homo sapiens invention is collaboration.
link |
00:46:25.440
So when you say collaboration, Peter, where do ideas come from? And how do they take hold in
link |
00:46:33.840
society? What's, is that the nature of collaboration? Is that the basic atom of collaboration is ideas?
link |
00:46:40.320
It's not not ideas, but it's not only ideas. There's a book I just started reading called
link |
00:46:44.560
Death from a Distance. Have you heard of this? No.
link |
00:46:46.880
It's a really fascinating thesis, which is that humans are the only conspecific,
link |
00:46:52.480
that is, the only species that can kill other members of the species from range.
link |
00:46:58.160
And maybe there's a few exceptions, but if you look in the animal world,
link |
00:47:00.880
you see like pronghorns, butting heads, right? You see the alpha lion and the beta lion,
link |
00:47:05.680
and they take each other down. Humans, we developed the ability to chuck rocks at each other and,
link |
00:47:09.920
well, at prey, but also at each other. And that means the beta male can chuck a rock
link |
00:47:14.800
at the alpha male and take them down. And he can throw a lot of rocks, actually,
link |
00:47:19.920
miss a bunch of times, just hit once, and be good. So this ability to actually kill members
link |
00:47:25.680
of our own species from range without a threat of harm to ourselves, created essentially mutually
link |
00:47:30.960
assured destruction where we had to evolve cooperation. If we didn't, then if we just
link |
00:47:36.240
continued to try to do the "I'm the biggest monkey in the tribe, and I'm going to, you know,
link |
00:47:40.960
own this tribe and you have to go" thing, if we did it that way, then those tribes basically failed.
link |
00:47:46.560
And the tribes that persisted and that have now given rise to the modern Homo sapiens
link |
00:47:51.360
are the ones where respecting the fact that we can kill each other from range
link |
00:47:56.080
without harm, like there's an asymmetric ability to snipe the leader from range.
link |
00:48:00.880
That meant that we sort of had to learn how to cooperate with each other, right? Come back here,
link |
00:48:05.680
don't throw that rock at me. Let's talk this out. So violence is also part of collaboration.
link |
00:48:10.080
The threat of violence, let's say. Well, the recognition, I would, maybe the better way to
link |
00:48:15.040
put it is the recognition that we have more to gain by working together than the prisoner's dilemma
link |
00:48:21.040
of both of us defecting. So mutually assured destruction in all its forms is part of this
link |
00:48:27.040
idea of collaboration. Well, and Eric Weinstein talks about our nuclear peace, right? I mean,
link |
00:48:31.600
it kind of sucks though with thousands of warheads aimed at each other, Russia and the US, but
link |
00:48:35.440
it's like, on the other hand, you know, we only fought proxy wars, right? We did not have another
link |
00:48:40.560
world war three of like hundreds of millions of people dying to machine gun fire
link |
00:48:45.120
and, you know, giant guided missiles. So the original nuclear weapon is a rock
link |
00:48:50.240
that we learned how to throw essentially. The original, yeah. Well, the original scope of
link |
00:48:53.760
the world for any human being was their little tribe. I would say it still is,
link |
00:48:59.680
for the most part. Eric Weinstein speaks very highly of you, which was very surprising to me
link |
00:49:07.440
at first because I didn't know there was this depth to you, because I knew you as an amazing
link |
00:49:14.320
leader of engineers and an engineer yourself and so on. So it's fascinating. Maybe just as a comment,
link |
00:49:20.320
a side tangent that we can take. What's your nature of your friendship with Eric Weinstein?
link |
00:49:27.040
How did the two, how did two such interesting paths cross? Is it your origins in physics?
link |
00:49:32.880
Is it your interest in philosophy and the ideas of how the world works? What is it?
link |
00:49:37.600
It's actually, it's very random. Eric found me. He actually found Travis and I.
link |
00:49:43.680
Travis Oliphant. Yeah, we were both working at a company called Enthought back in the mid 2000s,
link |
00:49:48.400
and we were doing a lot of consulting around scientific Python. And we'd made some,
link |
00:49:53.520
some tools, and Eric was trying to use some of these Python tools for visualization;
link |
00:49:58.480
he had a fiber bundle approach to modeling certain aspects of economics. He was doing this
link |
00:50:03.840
and that's how he kind of got in touch with us. And so this was in the early mid 2000s.
link |
00:50:12.000
Oh, seven time frame? Oh, six, oh, seven.
link |
00:50:13.680
Eric Weinstein trying to use Python to visualize fiber bundles using some of the tools that we
link |
00:50:19.760
had built in the open source. That's somehow entertaining to me, the thought of that.
link |
00:50:24.000
It's really funny. But then, you know, we've met with him a couple of times,
link |
00:50:27.040
really interesting guy. And then in the wake of the '07-'08 kind of financial collapse, he helped
link |
00:50:32.560
organize with Lee Smolin a symposium at the Perimeter Institute about, okay, well, clearly,
link |
00:50:39.520
you know, big finance can't be trusted, government's in its pocket with regulatory capture,
link |
00:50:43.440
what the F do we do? And all sorts of people, Nassim Taleb was there and Andrew Lo from MIT was
link |
00:50:49.040
there and, you know, Bill Janeway, I mean, just a lot of, you know, top billing people were there.
link |
00:50:54.640
And he invited me and Travis and another one of our coworkers, Robert Kern, and anyone in
link |
00:51:01.200
the SciPy, NumPy community knows Robert, really great guy. So the three of us also got invited
link |
00:51:05.840
to go to this thing. And that's where I met Bret Weinstein for the first time as well.
link |
00:51:09.280
Yeah, I knew him before he got all famous for unfortunate reasons, I guess. But anyway, we,
link |
00:51:16.320
so we met then and kind of had a friendship, you know, ever since then.
link |
00:51:21.360
You have a depth of thinking that kind of runs with Eric in terms of just thinking about the
link |
00:51:27.920
world deeply and thinking philosophically. And then there's Eric's interest in programming.
link |
00:51:33.360
Actually, you know, he'll bring up programming to me quite a bit as a metaphor
link |
00:51:39.760
for stuff. Right. But I never kind of pushed the point of like,
link |
00:51:44.320
what's the nature of your interest in programming? I think you saw it probably as a tool.
link |
00:51:48.720
Yeah, absolutely. To visualize, to explore mathematics and explore physics. But
link |
00:51:53.600
and I was wondering, like, what's his depth of interest and also his
link |
00:52:00.640
vision for what programming would look like in the future? Have you had interaction with him,
link |
00:52:07.200
like discussion in the space of Python programming? Well, in the sense of sometimes he asks me,
link |
00:52:11.840
why is this stuff still so hard? Yeah, you know, everybody's a critic. But actually, no, Eric.
link |
00:52:20.000
Programming, you mean like in jest? Yes. Yes. Well, not programming in general,
link |
00:52:23.200
but certain things in the Python ecosystem. But he actually, I think what I find in listening
link |
00:52:29.440
to some of his stuff is that he does use programming metaphors a lot, right? He'll
link |
00:52:33.600
talk about APIs or object oriented and things like that. So I think that's a useful
link |
00:52:37.360
set of frames for him to draw upon for discourse. I haven't pair programmed with him
link |
00:52:43.680
in a very long time. You've previously... Well, I mean, tried to help, like, put together some
link |
00:52:50.160
of the visualizations around these things. But it's been a very, not really pair program,
link |
00:52:53.840
but like, even looked at his code, right? I mean, how legendary would it be, like, a
link |
00:52:58.800
git repo with Peter Wang and Eric Weinstein? Well, honestly, Robert Kern did all the heavy
link |
00:53:04.960
lifting. So I have to give credit where credit is due. Robert is the silent, but incredibly deep,
link |
00:53:10.160
quiet, not silent, but quiet, but incredibly deep individual at the heart of a lot of those things
link |
00:53:14.560
that Eric was trying to do. But we did have, you know, in the... As Travis and I were starting
link |
00:53:19.440
our company in 2012 timeframe, we went to New York. Eric was still in New York at the time.
link |
00:53:26.000
He hadn't moved to... This is before he joined Thiel Capital. We just had like a steak dinner
link |
00:53:30.720
somewhere. Maybe it was Keens, I don't know, somewhere in New York. So it's me, Travis, Eric,
link |
00:53:35.120
and then Wes McKinney, the creator of pandas, and then Wes's then business partner, Adam.
link |
00:53:40.080
The five of us sat around having this just a hilarious time, amazing dinner. I forget what
link |
00:53:46.080
all we talked about, but it was one of those conversations which I wish as soon as COVID is
link |
00:53:51.040
over, maybe Eric and I can sit down. Recreate. Recreate in somewhere in LA or maybe he comes
link |
00:53:56.320
here because a lot of cool people are here in Austin, right? Exactly. Yeah, we're all here.
link |
00:53:59.120
He should come here. Eric, come here. Yeah. So he uses the metaphor source code sometimes to
link |
00:54:04.160
talk about physics. We figure out our own source code. So you were the physics background
link |
00:54:10.800
and somebody who's quite a bit of an expert in source code, do you think we'll ever figure out
link |
00:54:15.520
our own source code in the way that Eric means? Do you think we'll figure out the nature of what we are?
link |
00:54:20.000
Well, I think we're constantly working on that problem. I mean, I think we'll make more and more
link |
00:54:23.680
progress. For me, there's some things I don't really doubt too much. I don't really doubt
link |
00:54:29.280
that one day we will create a synthetic, maybe not fully in silicon, but a synthetic approach
link |
00:54:36.240
to cognition that rivals the biological 20 watt computers in our heads.
link |
00:54:44.640
What's cognition here? Cognition, perception, attention, memory, recall, asking better questions.
link |
00:54:50.960
That, for me, is a measure of intelligence. Doesn't the Roomba vacuum cleaner already do that?
link |
00:54:55.200
Or do you mean, oh, it doesn't ask questions? I mean, no. So I mean, I have a Roomba, but it's
link |
00:55:00.960
not even as smart as my cat, right? Yeah, but it asks questions about what is this wall.
link |
00:55:05.440
It now, a new feature asks, is this poop or not, apparently? Yes. A lot of our current
link |
00:55:10.480
cybernetic system, it's a cybernetic system. It will go and it will happily vacuum up some poop,
link |
00:55:14.480
right? The older generations would. A new one, just released, does not vacuum up the poop.
link |
00:55:19.280
Okay. I wonder if it still gets stuck under the first rung of my stairs. In any case, these
link |
00:55:25.200
cybernetic systems we have, they're designed to be sent off into a relatively static environment.
link |
00:55:33.920
And whatever dynamic things happen in the environment, they have a very limited capacity
link |
00:55:37.600
to respond to. A human baby, a human toddler of 18 months of age has more capacity to manage
link |
00:55:44.880
its own attention and its own capacity to make better sense of the world than the most advanced
link |
00:55:50.480
robots today. So again, my cat, I think, can do a better job of my two and they're both pretty
link |
00:55:55.600
clever. So I do think though, back to my kind of original point, I think that it's not, for me,
link |
00:56:01.040
it's not a question at all that we will be able to create synthetic systems that are able to do this
link |
00:56:07.920
better than the human, at an equal level or better than the human mind. It's also for me,
link |
00:56:12.480
not a question that we will be able to put them alongside humans so that they capture the full
link |
00:56:22.000
broad spectrum of what we are seeing as well. And also looking at our responses, listening
link |
00:56:28.160
to our responses, even maybe measuring certain vital signs about us. So in this kind of sidecar
link |
00:56:33.440
mode, a greater intelligence could use us and our whatever, 80 years of life to train itself up and
link |
00:56:42.160
then be a very good simulacrum of us moving forward. So who is in the sidecar in that picture
link |
00:56:49.200
of the future exactly? The baby version of our immortal selves. Okay. So once the baby grows up,
link |
00:56:56.080
is there any use for humans? I think so. I think that out of epistemic humility,
link |
00:57:03.120
we need to keep humans around for a long time. And I would hope that anyone making those systems
link |
00:57:07.920
would believe that to be true. Out of epistemic humility, what's the nature of the humility that...
link |
00:57:13.840
We don't know what we don't know. So we don't...
link |
00:57:18.880
Right? So we don't know... First we have to build systems that help us do the things that we do
link |
00:57:23.440
know about, that can then probe the unknowns that we know about. But the unknown unknowns,
link |
00:57:27.760
we don't know. Nature is the one thing that is infinitely able to surprise us. So we should
link |
00:57:34.560
keep biological humans around for a very, very, very long time. Even after our immortal selves
link |
00:57:39.600
have transcended and have gone off to explore other worlds, gone to go communicate with the
link |
00:57:44.000
lifeforms living in the sun or whatever else. So I think that's... For me, these seem like
link |
00:57:51.520
things that are going to happen. I don't really question that, that they're going to happen.
link |
00:57:55.600
Assuming we don't completely destroy ourselves.
link |
00:57:58.160
Is it possible to create an AI system that you fall in love with and it falls in love with you
link |
00:58:06.000
and you have a romantic relationship with it or a deep friendship, let's say?
link |
00:58:10.640
I would hope that that is the design criteria for any of these systems.
link |
00:58:14.400
If we cannot have a meaningful relationship with it, then it's still just a chunk of silicon.
link |
00:58:20.240
So then what is meaningful? Because back to sugar.
link |
00:58:23.600
Well, sugar doesn't love you back, right? So the computer has to love you back. And what does love
link |
00:58:27.360
mean? Well, in this context, for me, love... I'm going to take a page from Alain de Botton. Love
link |
00:58:32.320
means that it wants to help us become the best version of ourselves.
link |
00:58:37.760
That's beautiful. That's a beautiful definition of love. So what role does love play in the human
link |
00:58:44.560
condition at the individual level and at the group level? Because you were kind of saying that
link |
00:58:50.880
we should really consider humans both at the individual and the group and the societal level.
link |
00:58:55.120
What's the role of love in this whole thing? We talked about sex, we talked about death
link |
00:58:59.520
thanks to the bacteria, they invented it. At which point did we invent love, by the way?
link |
00:59:04.160
I mean, is that also... No, I think love is the start of it all and the feelings of... And this
link |
00:59:10.560
gets sort of beyond just romantic, sensual, whatever kind of things, but actually genuine
link |
00:59:18.000
love as we have for another person, love as it would be used in a religious text, right?
link |
00:59:22.560
I think that capacity to feel love more than consciousness, that is the universal thing.
link |
00:59:28.240
Our feeling of love is actually a sense of that generativity. When we can look at another person
link |
00:59:32.880
and see that they can be something more than they are and more than just what we...
link |
00:59:40.080
A pigeonhole, we might stick them in. I think in any religious text, you'll find
link |
00:59:46.160
voiced some concept of this, that you should see the grace of God and the other person,
link |
00:59:50.240
right? They're made in the spirit of the love that God feels for his creation or her creation.
link |
00:59:57.840
I think this thing is actually the root of it. I don't think molecules of water
link |
01:00:04.640
feel consciousness, have consciousness, but there is some proto micro quantum thing of love
link |
01:00:10.720
that's the generativity when there's more energy than what they need to maintain equilibrium.
link |
01:00:15.440
That, when you sum it all up, is something that leads to... I had my mind blown one day as an
link |
01:00:22.800
undergrad at the physics computer lab. I logged in and when you log in to Bash for a long time,
link |
01:00:28.080
there was a little fortune that would come out and it said, man was created by water to carry
link |
01:00:32.720
itself uphill. I was logging in to work on some problem set and I logged in and I saw that and
link |
01:00:39.600
I just said, son of a bitch, I logged out and I went to the coffee shop and I got a coffee and I sat
link |
01:00:45.360
there on the quad and I'm like, it's not wrong and yet WTF, right? So when you look at it that way,
link |
01:00:55.920
okay, non equilibrium physics is a thing. So when we think about love, when we think about
link |
01:01:01.440
these kinds of things, I would say that in the modern day human condition,
link |
01:01:07.040
there's a lot of talk about freedom and individual liberty and rights and all these things,
link |
01:01:14.560
but that's very Hegelian. It's very kind of following from the Western philosophy
link |
01:01:19.760
of the individual as sacrosanct, but it's not really couched, I think, the right way because
link |
01:01:26.480
it should be how do we maximize people's ability to love each other, to love themselves first,
link |
01:01:31.840
to love each other, their responsibilities to the previous generation, to the future generations.
link |
01:01:37.760
Those are the kinds of things that should be our design criteria, right? Those should be
link |
01:01:42.960
what we start with to then come up with the philosophies of self and of rights and responsibilities.
link |
01:01:50.000
But that love being at the center of it, I think when we design systems for cognition,
link |
01:01:56.480
it should absolutely be built that way. I think if we simply focus on efficiency
link |
01:02:00.640
and productivity, these kind of very industrial era, all the things that Marx had issues with,
link |
01:02:07.440
right? That's a way to go and really, I think, go off the deep end in the wrong way.
link |
01:02:12.960
So one of the interesting consequences of thinking of life in this hierarchical way
link |
01:02:20.880
of an individual human, and then there's groups and there are societies, is I believe that
link |
01:02:27.200
that you believe that corporations are people. So this is a kind of a politically dense idea
link |
01:02:36.240
and all those kinds of things. If we just throw politics aside, if we throw all of that aside,
link |
01:02:41.200
in which sense do you believe that corporations are people?
link |
01:02:46.080
And how does love connect to that? Right. So the belief is that groups of people
link |
01:02:51.200
have some kind of higher level, I would say, mesoscopic claim to agency. So where do I,
link |
01:02:59.520
let's start with this. Most people would say, okay, individuals have claims to agency and
link |
01:03:04.080
sovereignty. Nations, we certainly act as if nations, at a sort of very large scale, have
link |
01:03:10.480
rights to sovereignty and agency. Like everyone plays the game of modernity as if that's true.
link |
01:03:16.080
We believe France is a thing. We believe the United States is a thing. But to say that groups of
link |
01:03:20.800
people at a smaller level than that, like a family unit is a thing. Well, in our laws,
link |
01:03:27.840
we actually do encode this concept. I believe that in a relationship and a marriage, one partner can
link |
01:03:34.720
sue for loss of consortium if someone breaks up the marriage or whatever. So these are concepts
link |
01:03:40.720
that even in law, we do respect that there is something about the union and about the family.
link |
01:03:46.000
So for me, I don't think it's so weird to think that groups of people have a claim to rights
link |
01:03:52.720
and sovereignty of some degree. And we look at our clubs, we look at churches. We talk about
link |
01:04:00.320
these collectives of people as if they have a real agency to them. And then they do. But I think
link |
01:04:06.320
if we take that one step further and say, okay, they can accrue resources. Well, yes, check. By
link |
01:04:11.840
law, they can. They can own land. They can engage in contracts. They can do all these different
link |
01:04:17.680
kinds of things. So we, in legal terms, support this idea that groups of people have rights.
link |
01:04:26.080
Where we go wrong on this stuff is that the most popular version of this is the for profit
link |
01:04:32.480
absentee owner corporation that then is able to amass larger resources than anyone else in the
link |
01:04:39.520
landscape, anything else, any other entity of equivalent size. And they're able to essentially
link |
01:04:44.160
bully around individuals, whether it's laborers, whether it's people whose resources they want
link |
01:04:47.760
to capture. They're also able to bully around our system of representation, which is still tied
link |
01:04:53.440
to individuals. So I don't believe that's correct. I don't think it's good that they're people,
link |
01:05:01.440
but they're assholes. I don't think that corporations as people acting like assholes is a
link |
01:05:04.480
good thing. But the idea that collectives and collections of people that we should treat
link |
01:05:09.120
them philosophically as having some agency, some agency and some mass at a mesoscopic level,
link |
01:05:16.800
I think that's an important thing. Because one thing I do think we under appreciate sometimes
link |
01:05:22.320
is the fact that relationships have relationships. So it's not just individuals having
link |
01:05:27.360
relationships with each other. But if you have eight people seated around a table,
link |
01:05:31.760
right, each person has a relationship with each of the others. And that's obvious.
link |
01:05:35.440
But then if it's four couples, each couple also has a relationship with each of the other couples,
link |
01:05:41.280
right? The dyads do. And if it's couples, but one is the father, mother, older,
link |
01:05:46.960
and then one of their children and their spouse, that family unit of four has a relationship with
link |
01:05:53.920
the other family unit of four. So the idea that relationships have relationships is something
link |
01:05:57.920
that we intuitively know in navigating the social landscape. But it's not something I
link |
01:06:02.720
hear expressed like that. It's certainly not something that is, I think, taken into account
link |
01:06:07.520
very well when we design these kinds of things. So I think the reason why I care a lot about this
link |
01:06:13.920
is because I think the future of humanity requires us to form better collective
link |
01:06:20.000
sense making units at something around Dunbar number, half to 5x Dunbar. And that's very different
link |
01:06:29.600
than right now where we defer sense making to massive aging zombie institutions.
link |
01:06:36.880
Or we just do it ourselves. We go it alone, go to the dark forest of the internet by ourselves.
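(A hedged aside, not part of the conversation: the "relationships have relationships" point above is just combinatorics, and a few lines of Python make the counts at each level of grouping concrete. The scenario of eight people, four couples, and two families is taken from the example in the conversation; the code itself is purely illustrative.)

```python
from itertools import combinations

def pairwise_links(n):
    """Distinct pairwise relationships among n units: n choose 2."""
    return len(list(combinations(range(n), 2)))

# Eight individuals seated around a table:
print(pairwise_links(8))  # 28 person-to-person relationships

# The same eight as four couples -- the dyads also relate to each other:
print(pairwise_links(4))  # 6 couple-to-couple relationships

# Or as two family units of four:
print(pairwise_links(2))  # 1 family-to-family relationship
```

So beyond the 28 person-level links, the table also carries 6 couple-level and 1 family-level relationship, which is the layered structure we navigate intuitively but rarely design for.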
link |
01:06:41.040
So that's really interesting. So you've talked about agency, I think, maybe calling it a convenient
link |
01:06:47.280
fiction at all these different levels. So even at the human individual level, it's kind of a fiction.
link |
01:06:52.960
We all believe because we are, like you said, made of cells and cells are made of atoms.
link |
01:06:56.720
So that's a useful fiction. And then there's nations. That seems to be a useful fiction.
link |
01:07:02.800
But it seems like some fictions are better than others. There's a lot of people that argue the
link |
01:07:08.400
fiction of nation is a bad idea. One of them lives two doors down from me. Michael Malice,
link |
01:07:14.480
he's an anarchist. I'm sure there's a lot of people who are into meditation that believe
link |
01:07:20.240
the idea this useful fiction of agency of an individual is troublesome as well. We need to
link |
01:07:26.960
let go of that in order to truly like to transcend, I don't know, I don't know what words you want to
link |
01:07:33.440
use, but suffering or to elevate the experience of life. So you're kind of arguing that, okay,
link |
01:07:40.960
so we have some of these useful fictions of agency. We should add a stronger fiction that we tell
link |
01:07:48.320
ourselves about the agency of groups in the hundreds, from half Dunbar's number to five times
link |
01:07:56.640
Dunbar's number. Yeah, something in that order. And we call them fictions, but really they're
link |
01:08:00.080
rules of the game, right? Rules that we feel are fair or rules that we consent to.
link |
01:08:05.520
I always question the rules when I lose, like at Monopoly. That's when I usually question them.
link |
01:08:09.680
When I'm winning, I don't question the rules. We should play a game of Monopoly someday.
link |
01:08:12.720
There's a trippy version of it that we could do. Contract Monopoly was introduced by a friend of mine
link |
01:08:18.480
to me where you can write contracts on future earnings or landing on various things and you
link |
01:08:24.800
can hand out, like, you know, the first three times you land on Park Place it's free, or
link |
01:08:28.960
whatever, just and then you can start trading those contracts for money. And then you create
link |
01:08:34.240
human civilization and somehow Bitcoin comes into it. Okay, but some of these... Actually,
link |
01:08:40.720
I bet if me and you and Eric sat down to play a game of monopoly and we were to make NFTs out
link |
01:08:45.280
of the contracts we wrote, we could make a lot of money. Now, it's a terrible idea.
link |
01:08:48.960
Yes. I would never do it, but I bet we could actually sell the NFTs around.
link |
01:08:52.800
I have other ideas to make money that I could tell you and they're all terrible ideas,
link |
01:08:59.840
including cat videos on the internet. Okay, but some of these rules of the game, some of these
link |
01:09:04.880
fictions are, it seems like they're better than others. They have worked this far to
link |
01:09:11.200
cohere human, to organize human collective action. But you're saying something about,
link |
01:09:16.400
especially this technological age requires modified fictions, stories of agency. Why the
link |
01:09:24.160
Dunbar number and also, you know, how do you select the group of people? You know,
link |
01:09:28.320
Dunbar numbers, I think, I have this sense that it's overused as a kind of law that somehow
link |
01:09:37.920
we can have deep human connection at this scale. Like some of it feels like an interface problem
link |
01:09:44.480
too. It feels like if I have the right tools, I can deeply connect with a larger number of people.
link |
01:09:51.680
It just feels like there's a huge value to interacting just in person, getting to share
link |
01:09:59.360
traumatic experiences together or beautiful experiences together. There's other experiences
link |
01:10:05.600
that in the digital space that you can share. It just feels like Dunbar's number can be
link |
01:10:09.680
expanded significantly, perhaps not to the level of millions and billions, but it feels
link |
01:10:15.280
like it could be expanded. How do we find the right interface, you think, for having a little
link |
01:10:23.600
bit of a collective here that has agency? You're right, that there's many different ways that we
link |
01:10:28.320
can build trust with each other. My friend Joe Edelman talks about a few different ways that
link |
01:10:35.920
mutual appreciation, trustful conflict, just experiencing something. There's a variety of
link |
01:10:41.840
different things that we can do, but all those things take time and you have to be present.
link |
01:10:48.320
The less present you are, I mean, there's just, again, a no free lunch principle here. The less
link |
01:10:51.760
present you are, the more of them you can do, but then the less connection you build. I think
link |
01:10:57.040
there is a human capacity issue around some of these things. Now, that being said, if we can
link |
01:11:03.200
use certain technologies. For instance, if I write a little monograph on my view of the world,
link |
01:11:08.640
you read it asynchronously at some point and you're like, wow, Peter, this is great. Here's
link |
01:11:12.080
mine. I read it. I'm like, wow, Lex, this is awesome. We can be friends without having to spend 10
link |
01:11:17.920
years figuring all this stuff out together. We can just read each other's thing and be like, oh,
link |
01:11:22.160
yeah, this guy's exactly in my wheelhouse and vice versa. We can then connect just a few times a year
link |
01:11:30.320
and maintain a high trust relationship. It can be expanded a little bit, but it also requires,
link |
01:11:35.680
these things are not all technological in nature. It requires the individual themselves
link |
01:11:39.520
to have a certain level of capacity, to have a certain lack of neuroticism. If you want to
link |
01:11:44.960
use the OCEAN Big Five sort of model, people have to be pretty centered. The less centered you are,
link |
01:11:50.480
the fewer authentic connections you can really build for a particular unit of time. It just takes
link |
01:11:55.120
more time. Other people have to put up with your crap. There's just a lot of the stuff that you
link |
01:11:58.240
have to deal with if you are not so well balanced. Yes, we can help people get better to where they
link |
01:12:04.960
can develop more relationships faster. Then you can maybe expand Dunbar number by quite a bit,
link |
01:12:09.440
but you're not going to do it. I think it's going to be hard to get it beyond 10x,
link |
01:12:12.800
kind of the rough swag of what it is. Well, don't you think that AI systems could be in addition
link |
01:12:20.720
to the Dunbar's number? Do you count as one system or multiple AI systems?
link |
01:12:25.520
Multiple AI systems. I do believe that AI systems, for them to integrate into human
link |
01:12:30.000
society as it is now, have to have a sense of agency. There has to be an individual,
link |
01:12:35.120
because otherwise we wouldn't relate to them. We could engage certain kinds of individuals to make
link |
01:12:40.560
sense of them for us and be almost like, did you ever watch Star Trek? There's the Vorta,
link |
01:12:46.320
who are the interfaces, the ambassadors for the Dominion. We may have ambassadors that speak on
link |
01:12:53.120
behalf of these systems. They're like the Mentats of Dune, maybe, or something like this.
link |
01:12:57.040
I mean, we already have this to some extent. If you look at the biggest AI system, but the biggest
link |
01:13:02.800
cybernetic system in the world is the financial markets. It runs outside of any individual's
link |
01:13:06.800
control. You have an entire stack of people on Wall Street, Wall Street analysts, to CNBC,
link |
01:13:11.840
reporters, whatever. They're all helping to communicate, what does this mean? Jim Cramer,
link |
01:13:17.920
like, melting down, yelling and stuff. All of these people are part of that lowering of the
link |
01:13:22.960
complexity there to help do sense making for people at whatever capacity they're at. I don't
link |
01:13:29.920
see this changing with AI systems. I think you would have ringside commentators talking about
link |
01:13:33.600
all this stuff that this AI system is trying to do over here, over here, because it's actually a
link |
01:13:37.600
super intelligence. If you want to talk about humans interfacing, making first contact with
link |
01:13:41.440
the super intelligence, we're already there. We do it pretty poorly. If you look at the gradient
link |
01:13:45.520
of power and money, what happens is the people closest to it will absolutely exploit their
link |
01:13:49.760
distance for personal financial gain. We should look at that and be like, oh, well, that's probably
link |
01:13:56.720
what the future will look like as well. Nonetheless, we're already doing this kind of thing. In the
link |
01:14:01.520
future, we can have AI systems, but you're still going to have to trust people to bridge the sense
link |
01:14:06.400
making gap to them. I just feel like there could be millions of AI systems that have
link |
01:14:13.440
agencies. When you say one super intelligence, super intelligence in that context means
link |
01:14:22.160
it's able to solve particular problems extremely well, but there's some aspect of human-like
link |
01:14:28.480
intelligence that's necessary to be integrated into human society, so not financial markets,
link |
01:14:33.680
not weather prediction systems or logistics optimization. I'm more referring to things
link |
01:14:41.360
that you interact with on the intellectual level. I think there has to be a backstory,
link |
01:14:48.720
there has to be a personality. I believe it has to fear its own mortality in a genuine way.
link |
01:14:56.480
Many of the elements that we humans experience that are fundamental to the human condition,
link |
01:15:01.760
because otherwise, we would not have a deep connection with it.
link |
01:15:05.680
But I don't think having a deep connection with it is necessarily going to stop us from
link |
01:15:09.440
building a thing that has quite an alien intelligence aspect. The other kind of alien
link |
01:15:15.520
intelligence on this planet is octopuses or octopodes or whatever you want to call them.
link |
01:15:21.280
There's a little controversy as to what the plural is, I guess.
link |
01:15:23.600
I look forward to your letters.
link |
01:15:27.840
An octopus, it really acts as a collective intelligence of eight intelligent arms.
link |
01:15:33.280
Its arms have a tremendous amount of neural density to them. Let's go with what you're
link |
01:15:41.520
saying. If we build a singular intelligence that interfaces with humans that has a sense of agency
link |
01:15:47.920
so we can run the cybernetic loop and develop its own theory of mind as well as its theory of
link |
01:15:52.080
action, I agree with you that that's the necessary components to build a real intelligence.
link |
01:15:57.520
There's got to be something at stake, it's got to make a decision, it's got to then run the
link |
01:16:00.640
OODA loop. Okay, so we build one of those. Well, if we can build one of those, we can probably
link |
01:16:04.000
build five million of them. So we'll build five million of them and if their cognitive systems
link |
01:16:08.720
are already digitized and are already kind of there, we stick our antenna on each of them,
link |
01:16:13.520
bring it all back to a hive mind that maybe doesn't make all the individual decisions for them,
link |
01:16:17.920
but treats each one as almost like a neuronal input of a much higher bandwidth and fidelity,
link |
01:16:23.680
going back to a central system that is then able to perceive much broader dynamics that we
link |
01:16:30.320
can't see. In the same way that a phased array radar, you think about how a phased array radar
link |
01:16:33.680
works, it's just a set of small radars, and then it's hypersensitivity and really great
link |
01:16:39.600
timing between all of them and with a flat array, it's as good as a curved radar dish.
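(A hedged aside, not part of the conversation: the mechanism behind "a flat array with great timing is as good as a curved dish" is delay-and-sum beamforming. The sketch below is illustrative only; the element count, spacing, and the acoustic speed of sound are made-up convenience numbers, standing in for radar's electromagnetic case.)

```python
import numpy as np

def steer_delays(n_elems, spacing_m, angle_rad, c=343.0):
    """Per-element delays (seconds) that align a plane wave arriving from
    angle_rad onto a flat line of sensors -- the timing trick that
    substitutes for a curved dish's geometry."""
    positions = np.arange(n_elems) * spacing_m
    return positions * np.sin(angle_rad) / c

def delay_and_sum(signals, delays, fs):
    """Shift each element's signal by its steering delay, then average.
    Signals from the steered direction add coherently; others smear out."""
    shifts = np.round(np.asarray(delays) * fs).astype(int)
    out = np.zeros(signals.shape[1])
    for sig, s in zip(signals, shifts):
        out += np.roll(sig, -s)  # periodic test tones make the wrap harmless
    return out / len(signals)
```

Steered at the true arrival angle, the delayed copies line up and the summed tone keeps nearly full amplitude; summed with no delays, the same copies partially cancel. That direction-dependent gain from pure timing is the "phased array of cybernetic systems" analogy.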
link |
01:16:44.640
So with these things, it's a phased array of cybernetic systems that'll give the centralized
link |
01:16:49.280
intelligence much, much better, much higher fidelity understanding of what's actually
link |
01:16:55.520
happening in the environment. But the more power, the more understanding the central
link |
01:16:59.280
superintelligence has, the dumber the individual fingers of this intelligence are, I think.
link |
01:17:07.680
Not necessarily.
link |
01:17:08.800
My sense, in this argument, is that the individual agent has to have
link |
01:17:16.080
the full richness of the human-like experience. You have to be able to be driving the car in
link |
01:17:23.200
the rain, listening to Bruce Springsteen and all of a sudden break out in tears because
link |
01:17:28.480
remembering something that happened to you in high school.
link |
01:17:30.880
We can implant those memories if that's really needed.
link |
01:17:32.720
No, but the central agency, I guess I'm saying in my view, for intelligence to be born,
link |
01:17:39.520
you have to have a decentralization. Each one has to struggle and reach,
link |
01:17:47.200
so each one in excess of energy has to reach for order as opposed to a central place doing so.
link |
01:17:54.320
Have you ever read some sci-fi where there's hive minds? Vernor Vinge, I think, has one of these
link |
01:18:01.040
and then some of the stuff from the Commonwealth saga, the idea that you're an individual, but
link |
01:18:07.040
you're connected with a few other individuals telepathically as well. Together, you form a swarm.
link |
01:18:12.800
If I were to ask you, what do you think is the experience of, well, a Borg, right? If you are
link |
01:18:19.280
one, if you're part of this hive mind, outside of all the aesthetics, forget the aesthetics,
link |
01:18:25.360
internally, what is your experience like? Because I have a theory as to what that looks like.
link |
01:18:30.560
The one question I have for you about that experience is, how much is there feeling of freedom,
link |
01:18:36.800
of free will? Because I obviously, as a human, very biased, but also somebody who values freedom
link |
01:18:44.960
and bias, it feels like the experience of freedom is essential for trying stuff out, to being creative
link |
01:18:55.920
and doing something truly novel, which is at the core of... Yeah. Well, I don't think you have to
link |
01:19:00.160
lose any freedom when you're in that mode, because I think what happens is we think,
link |
01:19:04.480
we still think, and I mean, you're still thinking about this in a sense of a top down,
link |
01:19:08.160
command and control hierarchy, which is not what it has to be at all. I think the experience,
link |
01:19:13.360
so I'll just show my cards here. I think the experience of being a robot in that robot swarm,
link |
01:19:19.520
a robot who has agency over their own local environment that's doing sense making and
link |
01:19:23.840
reporting it back to the hive mind, I think that robot's experience would be when the hive mind
link |
01:19:30.560
is working well, it would be an experience of talking to God, that you essentially are reporting
link |
01:19:37.360
to, you're sort of saying, here's what I see. I think this is what's going to happen over here.
link |
01:19:41.040
I'm going to go do this thing, because I think if I want to do this, this will make this change
link |
01:19:44.800
happen in the environment. And then, God, she may tell you, that's great. And in fact, your
link |
01:19:52.160
brothers and sisters will join you to help make this go better, right? And then she can let your
link |
01:19:56.160
brothers and sisters know, hey, Peter's going to go do this thing. Would you like to help him?
link |
01:20:00.560
Because we think that this will make this thing go better. And they'll say, yes, we'll help him.
link |
01:20:03.680
So the whole thing could be actually very emergent, the sense of what does it feel like to be a cell
link |
01:20:09.920
in a network that is alive, that is generative. And I think actually the feeling is serendipity,
link |
01:20:16.080
that there's random order, not random disorder or chaos, but random order, just when you need it
link |
01:20:23.600
to hear Bruce Springsteen, you turn on the radio and bam, it's Bruce Springsteen, right?
link |
01:20:28.560
That feeling of serendipity, I feel like this is a bit of a flight of fancy, but think of every cell in your
link |
01:20:33.840
body. What does it feel like to be a cell in your body? When it needs sugar,
link |
01:20:39.040
there's sugar. When it needs oxygen, there's just oxygen. Now, when it needs to go and do its work
link |
01:20:43.680
and pull like as one of your muscle fibers, right? It does its work and it's great. It contributes
link |
01:20:49.200
to the cause, right? So this is all, again, a flight of fancy, but I think as we extrapolate up,
link |
01:20:54.400
what does it feel like to be an independent individual with some bounded sense of freedom?
link |
01:20:58.640
All sense of freedom is actually bounded, but it was a bounded sense of freedom that still lives
link |
01:21:02.480
within a network that has order to it. And I feel like it has to be a feeling of serendipity.
link |
01:21:07.040
So the cell, there's a feeling of serendipity, even though...
link |
01:21:11.120
It has no way of explaining why it's getting oxygen and sugar when it gets it.
link |
01:21:14.000
So you have to, each individual component has to be too dumb to understand the big picture.
link |
01:21:20.240
No, the big picture is bigger than what it can understand.
link |
01:21:22.800
But isn't that an essential characteristic of the individual is to be too dumb to understand
link |
01:21:28.720
the bigger picture? Like not dumb necessarily, but limited in its capacity to understand.
link |
01:21:33.920
Because the moment you understand, I feel like that leads to, if you tell me now
link |
01:21:40.960
that there are some bigger intelligence controlling everything I do,
link |
01:21:45.600
intelligence broadly defined, meaning like even the Sam Harris thing, there's no free will.
link |
01:21:51.440
If I'm smart enough to truly understand that that's the case, that's kind of, I don't know if I...
link |
01:21:58.880
Well, you have a philosophical breakdown, right? Because we're in the West and we're pumped full
link |
01:22:02.960
of this stuff of like you are a golden, fully free individual with all your freedoms and all
link |
01:22:07.760
your liberties and go grab a gun and shoot whatever you want to. No, it's actually...
link |
01:22:11.360
You don't actually have a lot of these... You're not unconstrained, but the areas where you can
link |
01:22:17.920
manifest agency, you're free to do those things. You can say whatever you want on this podcast.
link |
01:22:23.120
You can create a podcast, right? Yeah.
link |
01:22:24.800
You're not... I mean, you have a lot of this kind of freedom, but even as you're doing this,
link |
01:22:29.040
you are actually... I guess the denouement of this is that we are already intelligent agents
link |
01:22:35.760
in such a system, right? We're like one of these robots in a swarm of one to five million little
link |
01:22:41.520
swarm robots, or one of the Borg, and they're just posting on an internal bulletin board.
link |
01:22:45.280
I mean, maybe the Borg Cube is just a giant Facebook machine floating in space
link |
01:22:48.480
and everyone's just posting on there. They're just posting really fast and like...
link |
01:22:52.480
It's called the metaverse now.
link |
01:22:53.680
It's now called the metaverse. That's right. Here's the Enterprise. Maybe we should all
link |
01:22:55.920
go shoot it. Yeah, everyone upvotes and they're going to go shoot it, right?
link |
01:22:58.800
But we already are part of a human online collaborative environment and collaborative
link |
01:23:04.240
sense making system. It's not very good yet. It's got the overhangs of zombie sense making
link |
01:23:09.680
institutions all over it. But as that washes away and as we get better at this,
link |
01:23:15.440
we are going to see humanity improving at speeds that are unthinkable in the past.
link |
01:23:21.840
And it's not because anyone's freedoms were limited. In fact, the open source...
link |
01:23:24.560
And we started this with open source software, right?
link |
01:23:26.720
The collaboration, what the internet surfaced was the ability for people all over the world
link |
01:23:31.200
to collaborate and produce some of the most foundational software that's in use today,
link |
01:23:35.600
right? That entire ecosystem was created by collaborators all over the place.
link |
01:23:38.880
So these online kind of swarm kind of things are not novel. I'm just suggesting that
link |
01:23:45.920
future AI systems, if you can build one smart system, you have no reason not to build multiple.
link |
01:23:51.360
If you build multiple, there's no reason not to integrate them all into a collective sense making
link |
01:23:57.040
substrate. And that thing will certainly have emergent intelligence that none of the individuals
link |
01:24:01.760
and probably not any of the human designers will be able to really put a bow around and explain.
link |
01:24:06.720
But in some sense, would that AI system still be able to go to, like, rural Texas, buy a ranch,
link |
01:24:14.480
go off the grid, go full survivalist? Can you disconnect from the hive mind?
link |
01:24:22.400
You may not want to.
link |
01:24:25.120
So to be effective, to be intelligent...
link |
01:24:27.920
You have access to way more intelligence capability if you're plugged into five
link |
01:24:31.040
million other really, really smart cyborgs. Why would you leave?
link |
01:24:34.800
So, like, there's a word that comes to mind: control. So it doesn't feel like control,
link |
01:24:40.480
like overbearing control. It's just knowledge.
link |
01:24:45.200
I think systems, well, this is to your point. I mean, look at how uncomfortable you are with
link |
01:24:49.280
this concept, right? I think systems that feel like overbearing control will not evolutionarily
link |
01:24:54.800
win out. I think systems that give their individual elements the feeling of serendipity and the
link |
01:25:00.080
feeling of agency that those systems will win. But that's not to say that there will not be
link |
01:25:06.400
emergent higher level order on top of it. And that's the thing, that's the philosophical breakdown
link |
01:25:11.920
that we're staring right at, which is in the Western mind, I think there's a very sharp
link |
01:25:16.800
delineation between explicit control. Cartesian, like what is the vector? Where is the position?
link |
01:25:24.800
Where is it going? It's completely deterministic. And kind of this idea that things emerge,
link |
01:25:31.360
everything we see is the emergent patterns of other things. And there is agency when there's
link |
01:25:37.200
extra energy. So you have spoken about a kind of meaning crisis that we're going through.
link |
01:25:45.840
But it feels like since we invented sex and death, broadly speaking, we've been searching for a
link |
01:25:55.040
kind of meaning. So it feels like human civilization has been going through a meaning crisis of
link |
01:25:59.760
different flavors throughout its history. How is this particular meaning crisis different?
link |
01:26:07.360
Or is it really a crisis and it wasn't previously? What's your sense?
link |
01:26:11.520
For a lot of human history, there wasn't so much a meaning crisis. There was just a food and not
link |
01:26:16.000
getting eaten by bears crisis. Once you get to a point where you can make food, there was the
link |
01:26:21.040
not getting killed by other humans crisis. So sitting around wondering what is it all about,
link |
01:26:26.160
it's actually a relatively recent luxury. And to some extent, the meaning crisis coming out of
link |
01:26:32.640
that is precisely because... Well, it's not precisely because I believe that meaning is the
link |
01:26:38.160
consequence of when we make consequential decisions. It's tied to agency. When we make
link |
01:26:46.160
consequential decisions, that generates meaning. So if we make a lot of decisions,
link |
01:26:51.040
but we don't see the consequences of them, then it feels like what was the point?
link |
01:26:54.400
Right? But if there's all these big things happening and we're just along for the ride,
link |
01:26:58.240
then it also does not feel very meaningful. Meaning, as far as I can tell, this is my
link |
01:27:02.320
working definition circa 2021: meaning is generally the result of a person making a consequential
link |
01:27:09.200
decision, acting on it, and then seeing the consequences of it. So historically,
link |
01:27:14.640
just when humans are in survival mode, you're making consequential decisions all the time.
link |
01:27:19.360
So there's not a lack of meaning because you either got eaten or you didn't.
link |
01:27:22.560
Right? You got some food and that's great. You feel good. These are all consequential decisions.
link |
01:27:27.440
Only in the post fossil fuel and industrial revolution could we create a massive leisure
link |
01:27:35.600
class that could sit around not being threatened by bears, not starving to death, making decisions
link |
01:27:44.320
somewhat, but a lot of times not seeing the consequences of any decisions they make.
link |
01:27:48.480
The general sense of anomie, I think that's the French term for it, in the wake of the consumer
link |
01:27:54.880
society, in the wake of mass media telling everyone, hey, choosing between Hermes and Chanel is a
link |
01:28:02.240
meaningful decision. No, it's not. I don't know what either of those mean.
link |
01:28:05.600
Oh, they're high-end luxury purses and crap like that. But the point is that we give people the
link |
01:28:13.360
idea that consumption is meaning, that making a choice of this team versus that team spectating has
link |
01:28:19.280
meaning. So we produce all of these different things that are as if meaning, but really making
link |
01:28:26.480
a decision that has no consequences for us. And so that creates the meaning crisis.
link |
01:28:31.040
Well, you're saying choosing between Chanel and the other one has no consequence?
link |
01:28:36.400
Why is one more meaningful than the other? It's not that it's more meaningful than the other.
link |
01:28:39.520
It's that you make a decision between these two brands and you're told this brand will make me
link |
01:28:44.560
look better in front of other people. If I buy this brand of car, if I wear that brand of apparel,
link |
01:28:50.080
the idea, like a lot of decisions we make are around consumption, but consumption by itself
link |
01:28:55.360
doesn't actually yield meaning. Gaining social status does provide meaning. So that's why in this
link |
01:29:01.600
era of abundant production, so many things turn into status games. Then the NFT kind of explosion
link |
01:29:08.480
is a similar kind of thing. Everywhere there are status games because we just have so much
link |
01:29:14.400
excess production. But aren't those status games a source of meaning? Why do the games we play have
link |
01:29:21.440
to be grounded in physical reality like they are when you're trying to run away from lions?
link |
01:29:26.080
Why can't we in this virtuality world on social media, why can't we play the games on social
link |
01:29:31.840
media, even the dark ones? Right, we can. But you're saying that's creating a meaning crisis.
link |
01:29:37.680
Well, there's a meaning crisis in that there's two aspects of it. Number one, playing those kinds
link |
01:29:42.960
of status games oftentimes requires destroying the planet because it ties to consumption, consuming
link |
01:29:52.960
the latest and greatest version of a thing, buying the latest limited edition sneaker,
link |
01:29:57.440
and throwing out all the old ones. Maybe they keep the old ones, but the amount of sneakers we
link |
01:30:01.040
have to cut up and destroy every year to create artificial scarcity for the next generation.
link |
01:30:05.760
Right? This is kind of stuff that's not great. It's not great at all.
link |
01:30:10.560
So, conspicuous consumption fueling status games is really bad for the planet, not sustainable?
link |
01:30:17.040
The second thing is you can play these kinds of status games, but then what it does is it
link |
01:30:22.160
renders you captive to the virtual environment. The status games that really wealthy people are
link |
01:30:27.040
playing are all around the hard resources where they're going to build the factories,
link |
01:30:31.280
they're going to have the fuel in the rare earths to make the next generation of robots.
link |
01:30:34.080
They're then going to run circles around you and your children. So, that's another reason
link |
01:30:39.200
not to play those virtual status games. So, you're saying ultimately the big picture game is won
link |
01:30:45.840
by people who have access or control over actual hard resources. So, you can't,
link |
01:30:51.200
you don't see a society where most of the games are played in the virtual space.
link |
01:30:56.720
They'll be captured in the physical space. It all builds. It's just like the stack
link |
01:31:00.560
of the human being. If you only play the game at the cultural and intellectual level,
link |
01:31:07.360
then the people with the hard resources and access to layer zero physical are going to own you.
link |
01:31:12.800
But isn't money not connected to or less and less connected to hard resources and money still
link |
01:31:17.920
seems to work? It's a virtual technology. There's different kinds of money. Part of the reason that
link |
01:31:23.120
some of the stuff is able to go a little unhinged is because the big sovereignties where one spends
link |
01:31:32.880
money and uses money and plays money games and inflates money, their ability to adjudicate
link |
01:31:39.120
the physical resources and hard resources on land and things like that, those have not been
link |
01:31:42.960
challenged in a very long time. So, we went off the gold standard. Most money is not connected
link |
01:31:49.200
to physical resources. It's an idea. And that idea is very closely connected to status.
link |
01:31:59.920
But it's also tied to, it's actually tied to law. It is tied to some physical hard things. So,
link |
01:32:04.880
you have to pay your taxes. Yes. So, it's always at the end going to be connected to the blockchain
link |
01:32:11.440
of physical reality. So, in the case of law and taxes, it's connected to government and government
link |
01:32:18.640
is what? Violence. I'm playing a stack of devil's advocates here, and popping one devil
link |
01:32:29.040
off the stack at a time. Isn't ultimately, of course, it'll be connected to physical reality,
link |
01:32:33.120
but just because people control the physical reality, it doesn't mean the status,
link |
01:32:37.680
LeBron James, in theory, could make more money than the owners of the teams
link |
01:32:42.320
And to me, that's a virtual idea. So, somebody else constructed a game
link |
01:32:47.360
and now you're playing in the virtual space of the game. And so, it just feels like there could
link |
01:32:53.440
be games where status, we build realities that give us meaning in the virtual space. I can imagine
link |
01:33:01.280
such things being possible. Oh, yeah. Okay. So, I think I see what you're saying there. With
link |
01:33:06.240
the idea there, I mean, we'll take the LeBron James side and put in some YouTube influencer.
link |
01:33:10.720
Yes, sure. Right. So, the YouTube influencer, it is status games, but at a certain level,
link |
01:33:16.480
it precipitates into real dollars. Well, you look at Mr. Beast, right? He's setting off
link |
01:33:22.560
half a million dollars worth of fireworks or something on a YouTube video.
link |
01:33:25.680
And also, saving trees and so on. Sure, right. They're trying to plant a
link |
01:33:29.600
million trees with Mark Rober or whatever it was. Yeah. It's not that those kinds of games can't
link |
01:33:33.200
lead to real consequences. It's that for the vast majority of people in consumer culture,
link |
01:33:39.840
they are incented by the... I would say mostly I'm thinking about middle class consumers.
link |
01:33:47.680
They're incented by advertisements. They're incented by their memetic environment to treat
link |
01:33:53.040
the purchasing of certain things, the need to buy the latest model, whatever, the need to appear,
link |
01:33:58.240
however, the need to pursue status games as a driver of meaning. And my point would be that
link |
01:34:04.720
it's a very hollow driver of meaning. And that is what creates a meaning crisis because at the
link |
01:34:10.240
end of the day, it's like eating a lot of empty calories, right? Yeah, it tasted good going down.
link |
01:34:14.480
It's a lot of sugar, but man, it was not enough protein to help build your muscles.
link |
01:34:18.240
And you kind of feel that in your gut. And I think that's... I mean, all this stuff aside and
link |
01:34:22.640
setting aside our discussion on currency, which I hope we get back to, that's what I mean about
link |
01:34:27.360
the meaning crisis, part of it being created by the fact that we're not encouraged to have
link |
01:34:34.560
more and more direct relationships. We're actually alienated from relating to even our family members
link |
01:34:42.080
sometimes, right? We're encouraged to relate to brands. We're encouraged to relate to these kinds
link |
01:34:48.000
of things that then tell us to do things that are really of low consequence. And that's where
link |
01:34:54.160
the meaning crisis comes from. So the role of technology in this... So there's somebody you
link |
01:34:58.000
mentioned, Jacques Ellul. In his view of technology, he warns about the towering piles
link |
01:35:05.840
of technique, which I guess is a broad idea of technology. So I think, correct me if I'm wrong
link |
01:35:12.240
for him, technology is bad: it moves us away from human nature and ultimately is destructive.
link |
01:35:19.360
My question broadly speaking in this meaning crisis, what are the pros and cons of technology?
link |
01:35:24.160
Can it be good? Yeah, I think it can be. I certainly draw on some of Ellul's
link |
01:35:28.800
ideas and I think some of them are pretty good. But the way he defines technique is
link |
01:35:35.760
well, also Simondon as well. I mean, he speaks to the general mentality of efficiency,
link |
01:35:40.800
homogenized processes, homogenized production, homogenized labor to produce homogenized artifacts
link |
01:35:46.800
that then are not actually... They don't sit well in the environment. So it's essentially,
link |
01:35:53.440
you can think of it as the antonym of craft. Whereas a craftsman will come to a problem,
link |
01:36:01.760
maybe a piece of wood and they need to make into a chair. It may be a site to build a house or
link |
01:36:05.760
build a stable or build whatever. And they will consider how to bring various things in to build
link |
01:36:12.480
something well contextualized that's in right relationship with that environment.
link |
01:36:19.200
But the way we have driven technology over the last 150 years is not that at all. It is how can we
link |
01:36:27.520
make sure the input materials are homogenized, cut to the same size, diluted and doped to exactly
link |
01:36:34.720
the right alloy concentrations. How do we create machines that then consume exactly the right kind
link |
01:36:38.880
of energy to be able to run at this high speed to stamp out the same parts, which then go out the
link |
01:36:43.440
door. Everyone gets the same Tickle Me Elmo. And the reason why everyone wants it is because we
link |
01:36:47.120
have broadcasts that tell everyone, this is the cool thing. So, homogenized demand. And there are, like,
link |
01:36:52.880
Baudrillard and the other critiques of modernity coming from that direction, the Situationists
link |
01:36:58.320
as well. Their point is that at this point in time, consumption is the thing that drives
link |
01:37:04.480
a lot of the economic stuff, not the need, but the need to consume and build status games on top.
link |
01:37:09.360
So we have homogenized, when we discovered, I think this is really like Bernays and stuff,
link |
01:37:14.720
in the early 20th century, we discovered we can create demand. We can create desire
link |
01:37:20.800
in a way that was not possible before because of broadcast media. And not only do we create desire,
link |
01:37:27.520
we don't create a desire for each person to connect to some bespoke thing to build a relationship
link |
01:37:31.680
with their neighbor or their spouse. We are telling them, you need to consume this brand.
link |
01:37:35.920
You need to drive this vehicle. You got to listen to this music. Have you heard this?
link |
01:37:39.200
Have you seen this movie? So creating homogenized demand makes it really cheap to create homogenized
link |
01:37:45.680
product. And now you have economies of scale. So we make the same Tickle Me Elmo, give it to
link |
01:37:50.400
all the kids. And all the kids are like, hey, I got a Tickle Me Elmo. So this is ultimately where
link |
01:37:57.520
this ties in then to runaway hyper capitalism is that capitalism is always looking for growth.
link |
01:38:04.720
It's always looking for growth. And growth only happens at the margins. So you have to squeeze
link |
01:38:08.880
more and more demand out. You got to make it cheaper and cheaper to make the same thing.
link |
01:38:12.160
But tell everyone they're still getting meaning from it. This is still your Tickle Me Elmo.
link |
01:38:18.080
And we see little bits of this, critiques of this dripping in popular culture. You see it
link |
01:38:23.120
sometimes. It's when Buzz Lightyear walks into the toy store, and he's like, oh my God,
link |
01:38:28.960
I'm just a toy. There's millions of other, or there's hundreds of other Buzz Lightyears just
link |
01:38:33.520
like me, right? That is, I think, a fun Pixar critique on this homogenization dynamic.
link |
01:38:40.080
I agree with you on most of the things you're saying. So I'm playing devil's advocate here.
link |
01:38:44.560
But this homogenized machine of capitalism is also the thing that is able to fund if
link |
01:38:54.400
channeled correctly innovation, invention, and development of totally new things that
link |
01:39:01.200
in the best possible world create all kinds of new experiences that can enrich lives,
link |
01:39:06.560
the quality of lives for all kinds of people. So isn't this the machine that actually enables
link |
01:39:14.000
the experiences and more and more experiences that would then give meaning?
link |
01:39:18.560
It has done that to some extent. I mean, it's not all good or bad in my perspective.
link |
01:39:24.560
We can always look backwards and offer a critique of the path we've taken to get
link |
01:39:29.280
to this point in time. But that's somewhat different and informs the discussion,
link |
01:39:35.760
but it's somewhat different than the question of where do we go in the future, right?
link |
01:39:40.560
Is this still the same rocket we need to ride to get to the next point? Will it even get us to
link |
01:39:44.000
the next point? Well, how does this... so you're predicting the future, how does it go wrong in
link |
01:39:47.600
your view? We have the mechanisms we have now explored enough technologies to where we can
link |
01:39:54.960
actually, I think, sustainably produce what most people in the world need to live.
link |
01:40:03.360
We have also created the infrastructures to allow continued research and development
link |
01:40:10.320
of additional science and medicine and various other kinds of things.
link |
01:40:16.160
The organizing principles that we use to govern all these things today
link |
01:40:19.840
have been, a lot of them have been just inherited from, honestly, medieval times.
link |
01:40:28.400
Some of them have been refactored a little bit in the industrial era, but a lot of these modes
link |
01:40:33.840
of organizing people are deeply problematic. Furthermore, they're rooted in, I think, a
link |
01:40:43.360
very industrial mode perspective on human labor. This is one of those things I'm going to go back
link |
01:40:49.600
to the open source thing. There was a point in time when, well, let me ask you this. If you look
link |
01:40:55.280
at the core SciPy collection of libraries, that's SciPy, NumPy, Matplotlib. There's the IPython Notebook.
link |
01:41:01.600
Let's throw pandas in there, scikit-learn, a few of these things. How much value do you think,
link |
01:41:08.640
economic value, would you say they drive in the world today?
link |
01:41:13.040
That's one of the fascinating things about talking to you and Travis. It's immeasurable.
link |
01:41:19.280
At least $1 billion a day, maybe?
link |
01:41:22.320
$1 billion, sure. It's a similar question of how much value does Wikipedia create.
link |
01:41:31.520
All of it? I don't know.
link |
01:41:32.560
Well, I mean, if you look at our systems, when you do a Google search, some of that stuff runs
link |
01:41:36.720
through TensorFlow, but when you look at Siri, when you do a credit card transaction,
link |
01:41:42.480
just everything, every intelligence agency under the sun, they're using some aspect of
link |
01:41:46.480
these kinds of tools. I would say that these create billions of dollars of value.
link |
01:41:51.440
Oh, you mean like direct use of tools that leverage this?
link |
01:41:53.840
Yeah, even that's billions a day, yeah.
link |
01:41:56.560
Yeah, easily. If you think of the things they could not do if they didn't have these tools.
link |
01:42:02.720
That's billions of dollars a day. Great. I think that's about right. Now,
link |
01:42:05.840
if we take how many people did it take to make that? There was a point in time,
link |
01:42:10.880
not anymore, but there was a point in time when they could fit in a van. I could have
link |
01:42:13.680
fit them in my Mercedes Sprinter. If you look at that, it's like, holy crap,
link |
01:42:19.120
literally a van of maybe a dozen people could create value to the tune of billions of dollars
link |
01:42:27.200
a day. What lesson do you draw from that?
link |
01:42:29.920
Well, here's the thing. What can we do to do more of that? That's open source. The way I've
link |
01:42:36.560
talked about this in other environments is when we use generative participatory crowd
link |
01:42:42.960
sourced approaches, we unlock human potential at a level that is better than what capitalism can do.
link |
01:42:52.080
I would challenge anyone to go and try to hire the right 12 people in the world
link |
01:42:58.160
to build that entire stack the way those 12 people did that. They would be very,
link |
01:43:02.880
very hard-pressed to do that. If a hedge fund could just hire a dozen people
link |
01:43:06.560
and create something that is worth billions of dollars a day, every single one of them
link |
01:43:10.640
would be racing to do it. Finding the right people, fostering the right collaborations,
link |
01:43:15.040
getting it adopted by the right other people to then refine it, that is a thing that was
link |
01:43:19.600
organic in nature. That took crowdsourcing. That took a lot of the open source ethos and it took
link |
01:43:24.320
the right kinds of people. None of those people who started that said, I need to have a part of a
link |
01:43:29.120
multi billion dollar a day enterprise. They're like, I'm doing this cool thing to solve my
link |
01:43:33.840
problem for my friends. The point of telling the story is to say that our way of thinking about
link |
01:43:39.840
value, our way of thinking about allocation of resources, our ways of thinking about property
link |
01:43:44.480
rights and all these kinds of things, they come from finite game, scarcity mentality,
link |
01:43:49.920
medieval institutions. As we are now entering, to some extent, we're in a post scarcity era,
link |
01:43:56.960
although some people are hoarding a whole lot of stuff. We are at a point where,
link |
01:44:01.200
if not now soon, we'll be in a post scarcity era. The question of how we allocate resources
link |
01:44:06.400
has to be revisited at a fundamental level, because the kind of software these people built,
link |
01:44:10.880
the modalities of those human ecologies that built the software, treat software as un-property.
link |
01:44:17.840
Actually, sharing creates value. Restricting and forking reduces value. That's different than
link |
01:44:24.960
any other physical resource that we've ever dealt with. It's different than how most corporations
link |
01:44:28.640
treat software IP. If treating software in this way created this much value so efficiently,
link |
01:44:36.880
so cheaply, because feeding a dozen people for 10 years is really cheap. That's the reason I care
link |
01:44:43.120
about this right now is because, looking forward, when we can automate a lot of labor, where,
link |
01:44:48.320
in fact, the programming for your robot in your neck of the woods, in your part of the Amazon, to
link |
01:44:54.080
build something sustainable for you and your tribe to deliver the right medicines to take care of the
link |
01:44:58.960
kids, that's just software. That's just code. That could be totally open sourced. We can actually
link |
01:45:06.240
get to a mode where all of these additional generative things that humans are doing,
link |
01:45:12.320
they don't have to be wrapped up in a container where we then charge for all the exponential dynamics
link |
01:45:18.240
out of it. That's what Facebook did. That's what modern social media did, because the old internet
link |
01:45:23.040
was connecting people just fine. Facebook came along and said, well, anyone can post a picture,
link |
01:45:26.720
anyone can post some text, and we're going to amplify the crap out of it to everyone else,
link |
01:45:30.880
and it exploded this generative network of human interaction, and then they said,
link |
01:45:34.880
how do I make money off that? Oh yeah, I'm going to be a gatekeeper on everybody's attention,
link |
01:45:39.680
and that's how we make money. How do we create more than one van? How do we have millions of vans
link |
01:45:47.040
full of people that create NumPy, SciPy, that create Python? The story of those people is often
link |
01:45:54.720
they have some kind of job outside of this. This is what they're doing for fun. Don't you need to
link |
01:45:59.440
have a job? Don't you have to be connected, plugged in to the capitalist system? Isn't this consumerism,
link |
01:46:09.040
the engine that results in the individuals that take a break from it every once in a while to
link |
01:46:15.840
create something magical at the edges? The question of surplus, this is the question. If everyone
link |
01:46:22.560
were to go and run their own farm, no one would have time to go and write NumPy, SciPy,
link |
01:46:26.240
right? Maybe, but that's what I'm talking about when I say we're maybe at a post scarcity point
link |
01:46:32.720
for a lot of people. The question that we're never encouraged to ask in a Super Bowl ad is,
link |
01:46:38.800
how much do you need? How much is enough? Do you need to have a new car every two years,
link |
01:46:44.000
every five? If you have a reliable car, can you drive one for 10 years? Is that all right?
link |
01:46:48.400
I had a car for 10 years and it was fine. Your iPhone, do you have to upgrade every two years?
link |
01:46:52.720
I mean, you're using the same apps you did four years ago, right?
link |
01:46:56.560
This should be a Super Bowl ad.
link |
01:46:58.160
This should be a Super Bowl ad. That's great. Maybe somebody...
link |
01:46:59.840
Do you really need a new iPhone?
link |
01:47:01.280
Maybe one of our listeners will fund something like this of like, no, but just actually
link |
01:47:05.600
bring it back, bring it back to actually the question of what do you need? How do we create
link |
01:47:12.560
the infrastructure for collectives of people to live on the basis of providing what we need,
link |
01:47:19.760
meeting people's needs with a little bit of excess to handle emergencies and things like that,
link |
01:47:24.320
pooling our resources together to handle the really, really big emergencies, somebody with a
link |
01:47:29.280
really rare form of cancer or some massive fire sweeps through half the village or whatever,
link |
01:47:34.800
but can we actually unscale things and solve for people's needs and then give them the capacity
link |
01:47:44.080
to explore how to be the best version of themselves?
link |
01:47:47.120
And for Travis, that was throwing away his shot at tenure in order to write NumPy.
link |
01:47:52.720
For others, there is a saying in the SciPy community that SciPy advances one failed
link |
01:47:58.640
postdoc at a time. And we can do these things. We can actually do this kind of collaboration
link |
01:48:05.520
because code, software information organization, that's cheap. Those bits are very cheap to
link |
01:48:10.880
fling across the oceans.
link |
01:48:12.800
So you mentioned Travis. We've been talking and we'll continue to talk about open source.
link |
01:48:19.360
Maybe you can comment, how did you meet Travis? Who is Travis Oliphant?
link |
01:48:23.280
What's your relationship been like through the years? Where did you work together?
link |
01:48:30.000
How did you meet? What's the present and the future look like?
link |
01:48:34.960
Yeah. So the first time I met Travis was at a SciPy conference in Pasadena.
link |
01:48:39.120
Do you remember the year?
link |
01:48:40.720
2005. I was working at Enthought, working on scientific computing, consulting,
link |
01:48:46.400
and a couple of years later, he joined us at Enthought, I think 2007.
link |
01:48:55.120
And he came in as the president; one of the founders of Enthought was the CEO, Eric Jones.
link |
01:49:01.760
And we were all very excited that Travis was joining us and that was great fun.
link |
01:49:04.880
And so I worked with Travis on a number of consulting projects and we worked on
link |
01:49:10.880
some open source stuff. I mean, it was just a really, it was a good time there.
link |
01:49:14.480
It was primarily Python related?
link |
01:49:17.600
Oh yeah, it was all Python, NumPy, SciPy consulting kind of stuff.
link |
01:49:20.800
Towards the end of that time, we started getting called into more and more finance shops.
link |
01:49:27.520
They were adopting Python pretty heavily. I did some work at like a high frequency
link |
01:49:31.840
trading shop, and then we worked together on some projects at
link |
01:49:36.640
a couple of investment banks in Manhattan. And so we started seeing that there was a
link |
01:49:42.000
potential to take Python in the direction of business computing.
link |
01:49:45.600
More than just being this niche like MATLAB replacement for big vector computing,
link |
01:49:50.400
what we were seeing was, oh yeah, you could actually use Python as a Swiss army knife to do
link |
01:49:54.000
a lot of shadow data transformation kind of stuff. So that's when we realized the potential
link |
01:49:59.600
was much greater. And so we started Anaconda, I mean, it was called Continuum Analytics at the
link |
01:50:04.560
time, but we started in January of 2012 with the vision of shoring up the parts of Python
link |
01:50:10.640
that needed to get expanded to handle data at scale, to do web visualization, application
link |
01:50:15.200
development, et cetera. And that was that, yeah. So he was CEO and I was president for the first
link |
01:50:23.120
five years. And then we raised some money, and then the board sort of put in a new CEO.
link |
01:50:28.800
They hired a kind of professional CEO. And then, Travis will laugh at that, I took over the CTO
link |
01:50:34.880
role. Travis then left after a year to do his own thing, to do Quansight, which was more oriented
link |
01:50:40.960
around some of the bootstrap years that we did at Continuum, where it was open source and consulting.
link |
01:50:46.000
It wasn't sort of like gung ho product development. And it wasn't focused on,
link |
01:50:49.760
you know, we accidentally stumbled into the package management problem
link |
01:50:53.680
at Anaconda. But we had a lot of other visions of other technology that we built in the open
link |
01:50:58.400
source. And Travis was really trying to push, again, the frontiers of numerical computing,
link |
01:51:04.000
vector computing, handling things like auto differentiation and stuff intrinsically in
link |
01:51:08.400
the open ecosystem. So I think that's kind of the direction he's working on in some of his work.
link |
01:51:18.080
We remain great friends and colleagues and collaborators, even though he's no longer
link |
01:51:23.840
day to day working at Anaconda. But he gives me a lot of feedback about this and that and the other.
link |
01:51:28.320
What's a big lesson you've learned from Travis about life or about programming or about leadership?
link |
01:51:35.360
Wow, there's a lot. There's a lot. Travis is a really, really good guy.
link |
01:51:40.240
His heart is really in it. He cares a lot.
link |
01:51:44.480
I've gotten that sense having interacted with him. It's so interesting.
link |
01:51:48.480
He's a really good dude. And he and I, you know, we come from very different
link |
01:51:52.080
backgrounds. We're quite different as people. But I think we can not talk for a long time
link |
01:52:00.720
and then get on a conversation and be eye to eye on 90% of things. And so he's someone who I believe,
link |
01:52:08.160
no matter how much fog settles in over the ocean, his ship, my ship are pointed in the
link |
01:52:12.560
same direction to the same star. Wow, that's a beautiful way to phrase it. No matter how much
link |
01:52:17.440
fog there is or pointed at the same star. Yeah, and I hope he feels the same way. I mean, I hope
link |
01:52:22.160
he knows that over the years now. We both care a lot about the community. For someone who cares so
link |
01:52:27.760
deeply, I would say this about Travis, that's interesting. For someone who cares so deeply
link |
01:52:31.280
about the nerd details of like type system design and vector computing and efficiency of
link |
01:52:36.880
expressing this and that and the other, memory layouts and all that stuff, he cares even more
link |
01:52:41.840
about the people in the ecosystem, the community. And I have a similar kind of alignment. I care a
link |
01:52:50.240
lot about the tech. I really do. But for me, the beauty of what this human ecology has produced
link |
01:52:59.280
is, I think, a touchstone. It's an early version. We should look at it and say,
link |
01:53:03.600
how do we replicate this for humanity at scale? What this open source collaboration was able to
link |
01:53:07.680
produce? How can we be generative in human collaboration moving forward and create that as
link |
01:53:12.800
a civilizational kind of dynamic? Can we seize this moment to do that? Because a lot of the
link |
01:53:18.320
other open source movements, it's all nerds nerding out on code for nerds. And this,
link |
01:53:24.720
because it's scientists, because it's people working on data that all of it faces real human
link |
01:53:29.280
problems, I think we have an opportunity to actually make a bigger impact.
link |
01:53:33.760
Is there a way for this kind of open source vision to make money?
link |
01:53:39.040
Absolutely.
link |
01:53:40.080
To fund the people involved. Is that an essential part of it?
link |
01:53:43.040
It's hard. But we're trying to do that in our own way at Anaconda because we know that business
link |
01:53:49.440
users, as they use more of the stuff, they have needs that like business specific needs
link |
01:53:52.880
around security, provenance. They really can't tell their VPs and their investors, hey,
link |
01:54:00.480
our data scientists are installing random packages from who knows where and running them on
link |
01:54:04.960
customer data. So they have to have someone to talk to, and that's what Anaconda does.
link |
01:54:08.320
So we are a governed source of packages for them and that's great. That makes some money.
link |
01:54:13.040
We take some of that and we just take that as a dividend. We take a percentage of our revenues
link |
01:54:18.080
and write that as a dividend for the open source community. But beyond that, I really see the
link |
01:54:23.840
development of a marketplace for people to create notebooks, models, data sets, curation of these
link |
01:54:31.040
different kinds of things and to really have a long tail marketplace dynamic with that.
link |
01:54:37.920
Can you speak about this problem that you stumbled into of package management,
link |
01:54:42.400
Python package management? What is that? A lot of people speak very highly of Conda,
link |
01:54:48.800
which is part of Anaconda, which is a package manager. There's a ton of packages.
link |
01:54:52.480
So first, what are package managers? And second, what was there before? What is pip?
link |
01:54:58.560
And why is Conda more awesome? The package problem is this, which is that in order to do
link |
01:55:08.080
numerical computing efficiently with Python, there are a lot of low level libraries that need
link |
01:55:14.720
to be compiled, compiled with a C compiler or C++ compiler or Fortran compiler. They need to not
link |
01:55:20.320
just be compiled, but they need to be compiled with all of the right settings. And oftentimes,
link |
01:55:24.240
those settings are tuned for specific chip architectures. And when you add GPUs to the mix,
link |
01:55:29.200
when you look at different operating systems, you may be on the same chip. But if you're running Mac
link |
01:55:35.120
versus Linux versus Windows on the same X86 chip, you compile and link differently.
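A minimal sketch of the "same chip, different OS" point, using only Python's standard library: these are the kinds of platform fields a binary build is keyed on (real build systems like conda or wheel tags track considerably more, such as ABI and libc versions).

```python
# Read the platform fields that compiled-package builds must be keyed on.
# This only reports what the interpreter sees; build systems add ABI detail.
import platform
import sysconfig

def build_tag():
    """Return an (os, cpu, platform-tag) triple for this interpreter."""
    return (
        platform.system(),        # e.g. 'Linux', 'Darwin', 'Windows'
        platform.machine(),       # e.g. 'x86_64', 'arm64'
        sysconfig.get_platform(), # e.g. 'linux-x86_64', 'win-amd64'
    )

print(build_tag())
```

The same x86 machine yields different tags under Linux, macOS, and Windows, which is exactly why one source package needs several distinct builds.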
link |
01:55:39.920
All of this complexity is beyond the capability of most data scientists to reason about. And it's
link |
01:55:46.960
also beyond what most of the package developers want to deal with too. Because if you're a package
link |
01:55:52.320
developer, you're like, I code on Linux, this works for me, I'm good. It is not my problem to
link |
01:55:56.720
figure out how to build this on an ancient version of Windows, right? That's just simply not my
link |
01:56:00.720
problem. So what we end up with is a very creative crowdsourced
link |
01:56:07.360
environment where people want to use this stuff, but they can't. And so we ended up creating
link |
01:56:13.120
a new set of technologies like a build recipe system, a build system, and an installer system
link |
01:56:19.120
that is able to, well, to put it simply, it's able to build these packages correctly on each of
link |
01:56:27.680
these different kinds of platforms and operating systems and make it so when people want to install
link |
01:56:31.360
something, they can. It's just one command. They don't have to, you know, set up a big compiler
link |
01:56:35.920
system and do all these things. So when it works well, it works great. Now, the difficulty is
link |
01:56:40.800
we have literally thousands of people writing code in the ecosystem, building all sorts of stuff,
link |
01:56:45.520
and each person writing code, they may take a dependence on something else. And so all this
link |
01:56:50.000
web, incredibly complex web of dependencies. So installing the correct package for any given
link |
01:56:57.040
set of packages you want, getting that right subgraph is an incredibly hard problem. And
link |
01:57:02.960
again, most data scientists don't want to think about this. They're like, I want to install NumPy
link |
01:57:06.480
and Pandas. I want this version of some geospatial library. I want this other thing. Why is this
link |
01:57:13.600
hard? These exist, right? And it is hard because it's, well, you're installing this on a version
link |
01:57:19.360
of Windows, right? And half of these libraries are not built for Windows. Or the latest version
link |
01:57:24.560
isn't available, but the old version was. If you go to the old version of this library,
link |
01:57:27.440
that means you need to go to a different version of that library. And so the Python ecosystem,
link |
01:57:32.320
by virtue of being crowdsourced, we were able to fill 100,000 different niches. But then we also
link |
01:57:38.720
suffer this problem that, because it's crowdsourced, it's like a tragedy of the commons,
link |
01:57:44.160
right? No one really wants to support their thousands of other dependencies. So we end up sort
link |
01:57:50.160
of having to do a lot of this. And of course, the conda-forge community also steps up as an open
link |
01:57:54.400
source community that, you know, maintains some of these recipes. That's what Conda does. Now,
link |
01:57:58.960
Pip is a tool that came along after Conda to some extent. It came along as an easier way for the
link |
01:58:06.320
Python developers writing Python code that didn't have as much compiled, you know, stuff,
link |
01:58:12.800
they could then install different packages. And what ended up happening in the Python ecosystem
link |
01:58:17.680
was that a lot of the core Python and web Python developers, they never ran into any of this
link |
01:58:22.320
compilation stuff at all. So even we have, you know, on video, we have Guido van Rossum saying,
link |
01:58:29.280
you know what, the scientific community's packaging problems are just too exotic and
link |
01:58:32.400
different. I mean, you're talking about Fortran compilers, right? Like, you guys just need to
link |
01:58:36.720
build your own solution, perhaps, right? So the Python core Python community went and built its
link |
01:58:42.160
own sort of packaging technologies, not really contemplating the complexity of the stuff over
link |
01:58:48.400
here. And so now we have the challenge where you can't pip install some things. Some libraries,
link |
01:58:53.520
if you just want to get started with them, you can pip install TensorFlow and that works great.
link |
01:58:57.440
The instant you want to also install some other packages that use different versions of
link |
01:59:01.680
NumPy or some like graphics library or some OpenCV thing or some other thing,
link |
01:59:06.640
you now run into dependency hell. Because you cannot, you know, OpenCV can have a different
link |
01:59:10.880
version of libjpeg over here than PyTorch over here. Like they actually, they all have to use
link |
01:59:15.600
the, if you want to use GPU acceleration, they have to all use the same underlying drivers and
link |
01:59:18.800
same GPU CUDA things. So it's, it gets to be very gnarly. And it's a level of technology that
link |
01:59:24.560
both the makers and the users don't really want to think too much about.
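The clash described here, different libraries pinning incompatible versions of a shared dependency, can be sketched as a toy constraint problem. The version numbers and pins below are invented for illustration, and conda's real solver is a full SAT solver rather than this brute-force search:

```python
# Toy version of the "right subgraph" problem: pick one version per
# package so every dependency constraint is satisfied at once.
# All versions and constraints below are made up for illustration.
from itertools import product

AVAILABLE = {
    "numpy": ["1.19", "1.21"],
    "opencv": ["4.5"],
    "torch": ["1.10"],
}

# (package, version) -> {dependency: versions it will accept}
DEPENDS = {
    ("opencv", "4.5"): {"numpy": ["1.19"]},
    ("torch", "1.10"): {"numpy": ["1.21"]},
}

def solve(packages):
    """Brute-force search for a mutually consistent version assignment."""
    names = list(packages)
    for combo in product(*(AVAILABLE[n] for n in names)):
        pick = dict(zip(names, combo))
        ok = all(
            pick[dep] in allowed
            for pkg in pick.items()
            for dep, allowed in DEPENDS.get(pkg, {}).items()
            if dep in pick  # only check constraints among requested packages
        )
        if ok:
            return pick
    return None  # dependency hell: no consistent subgraph exists

# Each library works alone, but together they demand incompatible NumPys.
print(solve(["opencv", "numpy"]))
print(solve(["opencv", "torch", "numpy"]))
```

Each package alone resolves fine; requesting both at once returns `None`, which is the combinatorial wall a data scientist hits as "why won't this install?".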
link |
01:59:28.400
And that's where you step in and try to solve the subgraph problem. How much of that is it? You
link |
01:59:33.120
said that they don't want to think about it, but how much of it is a
link |
01:59:36.720
little bit on the developer, and providing them tools to be a little bit more clear about that
link |
01:59:42.720
subgraph of dependencies that's necessary? It is getting to a point where we do have to think
link |
01:59:47.040
about, look, can we pull some of the most popular packages together and get them to work on a
link |
01:59:52.240
coordinated release timeline, get them to build against the same test matrix, et cetera, et cetera.
link |
01:59:56.560
Right. And there is a little bit of dynamic around this, but again, it is a volunteer community.
link |
02:00:02.560
You know, people working on these different projects have their own timelines and their own
link |
02:00:06.160
things they're trying to meet. So we end up trying to pull these things together. And then it's,
link |
02:00:12.080
it's just incredibly hard. And I would recommend, just as a business tip, don't ever go into business
link |
02:00:16.480
where when your hard work works, you're invisible. And when it breaks, because of someone else's
link |
02:00:21.120
problem, you get flack for it. Because that's our situation, right? When something
link |
02:00:25.920
doesn't conda install properly, usually it's some upstream issue, but it looks like Conda is broken.
link |
02:00:30.080
It looks like, you know, Anaconda screwed something up. When things do work though, it's like, oh,
link |
02:00:34.000
yeah, cool, it worked. Assuming, naturally, of course, that it's very easy to make that work, right?
link |
02:00:38.080
So we end up in this kind of problematic scenario. But, but it's okay, because I think we're still,
link |
02:00:44.880
you know, our hearts in the right place. We're trying to move this forward as a community sort
link |
02:00:48.720
of affair. I think most of the people in the community also appreciate the work we've done
link |
02:00:51.760
over the years to try to move these things forward in a, in a collaborative fashion. So
link |
02:00:57.280
one of the subgraphs of dependencies that became super complicated is the move from
link |
02:01:04.160
Python two to Python three. So there's all these ways to mess with these kinds of
link |
02:01:09.200
ecosystems of packages and so on. So I just want to ask you about that particular one.
link |
02:01:13.600
What do you think about the move from Python two to three? Now, why did it take so long?
link |
02:01:19.280
What were, from your perspective, just seeing the packages all struggle in the community,
link |
02:01:24.320
all struggle through this process? What lessons do you take away from it? Why did it take so long?
link |
02:01:28.640
Looking back, some people perhaps underestimated how much adoption Python two had.
link |
02:01:38.000
I think some people also underestimated how much, or they overestimated how much value
link |
02:01:44.320
some of the new features in Python three really provided. Like the things they really loved
link |
02:01:48.880
about Python three just didn't matter to some of these people on Python two.
link |
02:01:51.440
Yeah. Because this change was happening as Python SciPy was starting to really take off, like
link |
02:01:58.240
past like a hockey stick of adoption in the early data science era in the early 2010s.
link |
02:02:02.800
A lot of people were learning and onboarding in whatever just worked. And the teachers were like,
link |
02:02:07.120
well, yeah, these libraries I need are not supporting Python three yet. I'm going to
link |
02:02:10.480
teach you Python two. It took a lot of advocacy to get people to move over to Python three.
link |
02:02:15.600
So I think it wasn't any particular single thing, but it was one of those death by,
link |
02:02:20.240
you know, a dozen cuts, which just really made it hard to move off of Python two.
link |
02:02:25.440
And also Python three itself, as they were kind of breaking things and changing these
link |
02:02:29.200
around and reorganizing the standard library, there's a lot of stuff that was happening there
link |
02:02:32.880
that kept giving people an excuse to say, I'll put off till the next version.
link |
02:02:37.600
Two is working fine enough for me right now. So I think that's essentially what happened there.
link |
02:02:41.440
And I will say this though, the strength of the Python data science movement,
link |
02:02:47.760
I think, is what kept Python alive in that transition. Because a lot of languages have died
link |
02:02:54.000
and left their user bases behind. If there wasn't the use of Python for data, there's a good chunk
link |
02:03:00.160
of Python users that during that transition would have just left for Go and Rust and stayed.
link |
02:03:04.800
In fact, some people did. They moved to Go and Rust and they just never looked back.
link |
02:03:08.720
The fact that we were able to grow by millions of users, the Python data community,
link |
02:03:15.120
that is what kept the momentum for Python going. And now the usage of Python for data
link |
02:03:19.760
is over 50% of the overall Python user base. So I'm happy to debate
link |
02:03:26.800
that on stage somewhere. But from where I sit, I think that's true.
link |
02:03:32.480
The statement there, the idea is that the switch from Python two to Python three
link |
02:03:37.840
would have probably destroyed Python if it didn't also coincide with Python for whatever
link |
02:03:44.560
reason, just overtaking the data science community, anything that processes data.
link |
02:03:51.680
So the timing was perfect that this maybe imperfect decision was coupled with a great
link |
02:03:58.560
timing on the value of data in our world.
link |
02:04:01.920
I would say the troubled execution of a good decision. It was a decision that was necessary.
link |
02:04:07.280
It's possible if we had more resources, we could have done it in a way that was a little bit
link |
02:04:10.240
smoother. But ultimately, the arguments for Python three, I bought them at the time and I buy them
link |
02:04:16.640
now. Having great text handling is like a nonnegotiable table stakes thing you need to have
link |
02:04:22.080
in a language. So that's great. But the execution, Python is the, it's volunteer driven.
link |
02:04:32.880
It's like now the most popular language on the planet, but it's all literally volunteers.
link |
02:04:36.560
So the lack of resources meant that they had to really, they had to do things in a very
link |
02:04:42.000
hamstrung way. And I think to carry the Python momentum in the language through that time,
link |
02:04:47.600
the data movement was a critical part of that.
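The "great text handling" point above is concrete in code: Python 3 separates Unicode text (`str`) from raw bytes (`bytes`) and refuses to mix them silently, where Python 2 would guess an encoding and sometimes blow up far from the real bug.

```python
# Python 3's text model: str is Unicode code points, bytes is raw data,
# and the boundary between them is an explicit encode/decode step.
text = "naïve café"            # a str: Unicode text
data = text.encode("utf-8")    # encoding to bytes is explicit

assert isinstance(data, bytes)
assert data.decode("utf-8") == text

# Mixing the two is an immediate TypeError, not a latent encoding bug.
try:
    text + data
except TypeError:
    print("mixing str and bytes fails fast")
```

That fail-fast behavior is the "nonnegotiable table stakes" feature: the error surfaces at the line where text and bytes meet, not somewhere downstream.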
link |
02:04:49.760
So some of it is carrot and stick. I actually have to shamefully admit that it took me a very
link |
02:04:56.480
long time to switch from Python two and Python three because I'm a machine learning person.
link |
02:05:00.160
It was just for the longest time, you could just do fine with Python two.
link |
02:05:03.600
Right. But I think the moment where I switched everybody I worked with and switched myself
link |
02:05:12.000
for small projects and big is when finally when NumPy announced that they're going to end support
link |
02:05:20.320
like in 2020 or something like that. Right. So like when I realized, oh, this isn't going,
link |
02:05:26.080
this is going to end. Right. So that's the stick. That's not a carrot. So for the longest
link |
02:05:30.800
time it was carrots. It was like all of these packages were saying, okay, we have Python three support
link |
02:05:35.840
now come join us with Python two and Python three. But when NumPy, one of the packages I
link |
02:05:41.680
sort of love and depend on said like, nope, it's over. That's when I decided to switch.
link |
02:05:50.240
I wonder if you think it was possible much earlier for somebody like,
link |
02:05:54.640
like NumPy or some major package to step into the cold. Well, it's a chicken and egg problem too.
link |
02:06:04.800
Right. You don't want to cut off a lot of users unless you see the user momentum going too. So
link |
02:06:09.440
the decisions for the scientific community, for each of the different projects, you know,
link |
02:06:14.080
there's not a monolith. Some projects are like, we'll only be releasing new features on Python
link |
02:06:18.000
three. And that was more of a sticky carrot or a firm carrot, if you will, a firm carrot, a
link |
02:06:26.320
stick shaped carrot. But then for others, yeah, NumPy in particular, because it's at the base
link |
02:06:31.600
of the dependency stack for so many things. That was the final stick. That was a stick shaped stick.
link |
02:06:37.440
People were saying, look, if I have to keep maintaining my releases for Python two,
link |
02:06:41.520
that's that much less energy that I can put into making things better for the Python three folks
link |
02:06:46.880
or in my new version, which is of course going to be Python three. So people were also getting
link |
02:06:51.360
kind of pulled by this tension. So the overall community sort of had a lot of input into when
link |
02:06:56.560
the NumPy core folks decided that they would end of life on Python two.
link |
02:07:01.280
So as these numbers are a little bit loose, but there are about 10 million Python programmers
link |
02:07:06.720
in the world, you could argue that number, but let's say 10 million. Actually, the source
link |
02:07:10.720
I was looking at said 27 million total programmers, developers in the world. You mentioned in a talk
link |
02:07:18.400
that changes need to be made for there to be 100 million programmers. So first of all,
link |
02:07:24.960
do you see a future where there's 100 million Python programmers? And second, what kind of
link |
02:07:29.440
changes need to be made? So Anaconda and Miniconda get downloaded about a million times a week.
link |
02:07:34.880
So I think the idea that there's only 10 million Python programmers in the world is a little bit
link |
02:07:40.240
undercounting. There are a lot of people who escape traditional counting that are using Python and
link |
02:07:46.640
data in their jobs. I do believe that the future world for it to, well, the world I would like to
link |
02:07:52.960
see is one where people are data literate. So they are able to use tools that let them express
link |
02:07:59.520
their questions and ideas fluidly. And the data variety and data complexity will not go down.
link |
02:08:06.320
It will only keep increasing. So I think some level of code or code like things will continue to
link |
02:08:14.080
be relevant. And so my hope is that we can build systems that allow people to more seamlessly
link |
02:08:21.760
integrate Python kinds of expressivity with data systems and operationalization methods
link |
02:08:28.080
that are much more seamless. And what I mean by that is, right now, you can't punch Python code
link |
02:08:34.000
into an Excel cell. I mean, there are some tools you can use to kind of do this. We did build
link |
02:08:38.320
a thing for doing this back in the day. But I feel like the total addressable market for Python
link |
02:08:44.560
users, if we do the things right, is on the order of the Excel users, which is a few hundred million.
link |
02:08:51.200
So I think Python has to get better at being embedded, being a smaller thing that pulls
link |
02:08:59.360
in just the right parts of the ecosystem to run numerics and do data exploration, meeting people
link |
02:09:06.240
where they're already at with their data and their data tools. And then I think also,
link |
02:09:11.840
it has to be easier to take some of those things they've written and flow those back into deployed
link |
02:09:16.960
systems or little apps or visualizations. I think if we don't do those things, then we will always be
link |
02:09:22.160
kept in a silo as sort of an expert user's tool and not a tool for the masses.
link |
02:09:27.040
You know, I work with a bunch of folks in the Adobe creative suite, and I'm kind of forcing them
link |
02:09:34.400
or inspiring them to learn Python to do a bunch of stuff that helps them. And it's interesting
link |
02:09:39.120
because they probably wouldn't call themselves a Python programmer, but they're all using Python.
link |
02:09:43.680
I would love it if the tools like Photoshop and Premiere and all those kinds of tools that are
link |
02:09:48.240
targeted towards creative people, I guess that's where Excel is targeted towards a certain kind
link |
02:09:53.600
of audience that works with data, financial people, all that kind of stuff. If there would be easy
link |
02:09:59.680
ways to leverage to use Python for quick scripting tasks. And you know, there's an exciting application
link |
02:10:06.640
of artificial intelligence in the space that I'm hopeful about, looking at OpenAI Codex with
link |
02:10:13.600
generating programs. So almost helping people bridge the gap from kind of visual interface
link |
02:10:23.440
to generating programs to something formal. And then they can modify it and so on. But kind of
link |
02:10:30.560
without having to read the manual, without having to do a Google search and stack overflow,
link |
02:10:34.720
which is essentially what a neural network does when it's doing code generation,
link |
02:10:38.000
is actually generating code and allowing a human to communicate with multiple programs. And then
link |
02:10:44.880
maybe even programs to communicate with each other via Python. So that to me is a really exciting
link |
02:10:50.480
possibility because I think there's a friction to kind of like, how do I learn how to use Python
link |
02:10:57.600
in my life? Oftentimes, you kind of start a class, you start learning about types,
link |
02:11:04.240
lists, I don't know, functions. Like this is, you know, Python is the first language with
link |
02:11:09.680
which you start to learn to program. But I feel like that's going to take a long time for you
link |
02:11:16.800
to understand why it's useful. You almost want to start with a script. Well, you do, in fact. I
link |
02:11:22.160
think starting with the theory behind programming languages and types and all that, I mean,
link |
02:11:26.000
types are there to make the compiler writer's jobs easier. Types are not, I mean,
link |
02:11:30.880
heck, do you have an ontology of types for just the objects on this table? No. So types are there
link |
02:11:36.960
because compiler writers are human and they're limited in what they can do. But I think that
link |
02:11:43.040
the beauty of scripting, like there's a Python book that's called Automate the Boring Stuff,
link |
02:11:49.040
which is exactly the right mentality. I grew up with computers in a time when Steve Jobs
link |
02:11:57.920
was still pitching these things as bicycles for the mind. They were supposed to not be just
link |
02:12:01.200
media consumption devices. But they were actually, you could write some code, you could write basic,
link |
02:12:06.400
you could write some stuff to do some things. And that feeling of a computer as a thing that
link |
02:12:12.560
we can use to extend ourselves has all but evaporated for a lot of people. So you see a
link |
02:12:18.560
little bit in parts of the current, the generation of youth around Minecraft or Roblox, right?
link |
02:12:23.520
And I think Python, CircuitPython, these things could be a renaissance of that, of people actually
link |
02:12:31.040
shaping and using their computers as computers, as an extension of their minds and their curiosity,
link |
02:12:36.480
their creativity. So you talk about scripting the Adobe suite with Python in the 3D graphics world.
link |
02:12:42.800
Python is a scripting language that some of these 3D graphics suites use. And I think it's great.
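A minimal sketch of the boring-stuff scripting he's describing, sorting a folder's files into subfolders by extension; the function name and folder layout are invented purely for illustration, not from any real tool:

```python
from pathlib import Path
import shutil

def organize_by_extension(folder: str) -> dict:
    """Move every file in `folder` into a subfolder named after its extension.

    Returns a mapping of extension -> moved file names, so the caller can
    see what happened.
    """
    root = Path(folder)
    moved = {}
    # Materialize the listing first so newly created subfolders aren't re-visited.
    for path in sorted(root.iterdir()):
        if not path.is_file():
            continue
        ext = path.suffix.lstrip(".").lower() or "no_extension"
        dest_dir = root / ext
        dest_dir.mkdir(exist_ok=True)
        shutil.move(str(path), str(dest_dir / path.name))
        moved.setdefault(ext, []).append(path.name)
    return moved
```

Pointing this at a downloads folder, say, is exactly the kind of repetitive chore that a few lines of script can take over.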
link |
02:12:49.280
We should better support those kinds of things. But ultimately, the idea that I should be able
link |
02:12:54.080
to have power over my computing environment, if I want these things to happen repeatedly all the
link |
02:12:58.800
time, I should be able to say that somehow to the computer, right? Now, whether the operating systems
link |
02:13:05.520
get there faster by having some, you know, Siri backed with OpenAI or whatever. So you can
link |
02:13:09.680
just say, Siri, make this do this and this and every other Friday, right? We probably will get
link |
02:13:13.600
there somewhere. And Apple's always had these ideas, you know, there's the Apple script in the menu
link |
02:13:17.680
that no one ever uses. But you can do these kinds of things. But when you start doing that kind
link |
02:13:22.640
of scripting, the challenge isn't learning the type system, or even the syntax of the language.
link |
02:13:27.040
The challenge is all of the dictionaries and all the objects of all their properties and attributes
link |
02:13:31.520
and parameters, like who's got time to learn all that stuff, right? So that's when programming
link |
02:13:37.440
by prototype, or by example, becomes the right way to get the user to express their desire.
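A toy illustration of what programming by example means: search a small space of string operations for one that reproduces every example the user gives. The tiny DSL here is made up for illustration and not any real system:

```python
# Candidate operations the "synthesizer" is allowed to choose from.
OPS = {
    "upper": str.upper,
    "lower": str.lower,
    "title": str.title,
    "strip": str.strip,
    "reverse": lambda s: s[::-1],
    "first_word": lambda s: s.split()[0] if s.split() else "",
}

def synthesize(examples):
    """Return the name of an operation consistent with all (input, output)
    example pairs, or None if nothing in the DSL fits."""
    for name, op in OPS.items():
        if all(op(inp) == out for inp, out in examples):
            return name
    return None
```

For instance, `synthesize([("peter wang", "Peter Wang")])` picks out `"title"`: the user expressed a desire by demonstration rather than by learning any API.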
link |
02:13:43.600
So there's a lot of these different ways that we can approach programming. But I do think
link |
02:13:46.640
when, as you were talking about the Adobe scripting thing, I was thinking about, you know,
link |
02:13:51.280
when we do use something like NumPy, when we use things in the Python data and scientific
link |
02:13:57.520
stack, that expression system, there's a reason we use that, which is that it gives us mathematical
link |
02:14:03.120
precision. It gives us actually quite a lot of precision over precisely what we mean about
link |
02:14:08.960
this data set, that data set. And it's the fact that we can have that precision
link |
02:14:12.960
that lets Python be powerful as a duct tape for data. You know, you give me a TSV or a CSV,
link |
02:14:21.120
and if you give me some massively expensive vendor tool for data transformation, I don't know
link |
02:14:26.640
if I'm going to be able to solve your problem. But if you give me a Python prompt, you can throw
link |
02:14:31.200
whatever data you want at me, I will be able to mash it into shape. So that ability to take it as
link |
02:14:36.480
sort of this machete out into the data jungle is really powerful. And I think that's why at some
link |
02:14:43.680
level, we're not going to get away from some of these expressions and APIs in libraries in Python
link |
02:14:50.240
for data transformation. You've been at the center of the Python community for many years.
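A minimal sketch of that duct-tape-for-data idea, using only the standard library on a messy, made-up CSV, inconsistent casing, stray whitespace, and a blank row:

```python
import csv
import io

# A messy, hypothetical CSV: ragged whitespace, mixed casing, a blank line.
RAW = """name, city ,score
Ada , london,91
 Grace,NEW YORK ,88

linus, Helsinki, 95
"""

def mash_into_shape(raw: str) -> list:
    """Normalize header names, trim whitespace, drop blank rows."""
    reader = csv.reader(io.StringIO(raw))
    header = [h.strip().lower() for h in next(reader)]
    rows = []
    for row in reader:
        if not any(cell.strip() for cell in row):
            continue  # skip blank lines
        rows.append({h: cell.strip().title() for h, cell in zip(header, row)})
    return rows

records = mash_into_shape(RAW)
```

Whatever shape the data arrives in, a few lines at the prompt will usually beat waiting on a vendor tool.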
link |
02:14:58.320
If you could change one thing about the community to help it grow, to help it improve,
link |
02:15:05.440
to help it flourish and prosper, what would it be? I mean, that doesn't have to be one thing,
link |
02:15:11.600
but what kind of comes to mind? What are the challenges?
link |
02:15:15.280
Humility is one of the values that we have at Anaconda, the company, but it's also one of
link |
02:15:18.400
the values in the community that it's been breached a little bit in the last few years,
link |
02:15:24.400
but in general, people are quite decent and reasonable and nice. And that humility prevents
link |
02:15:31.440
them from seeing the greatness that they could have. I don't know how many people in the core
link |
02:15:40.160
Python community really understand that they stand perched at the edge of an opportunity
link |
02:15:47.840
to transform how people use computers. And actually, PyCon, I think it was the last physical
link |
02:15:53.440
PyCon I went to, Russell Keith-Magee gave a great keynote very much along the lines of the
link |
02:16:00.880
challenges I have, which is Python, for a language that doesn't actually, that can't put an interface
link |
02:16:06.320
up on the most popular computing devices, it's done really well as a language, hasn't it?
link |
02:16:11.600
You can't write a web frontend with Python, really. I mean, everyone uses JavaScript.
link |
02:16:14.880
You certainly can't write native apps. So for a language that you can't actually write apps in
link |
02:16:20.640
any of the frontend runtime environments, Python's done exceedingly well. And so that wasn't to
link |
02:16:27.520
pat ourselves on the back. That was to challenge ourselves as a community to say, we, through
link |
02:16:31.200
our current volunteer dynamic, have gotten to this point, what comes next and how do we seize,
link |
02:16:36.480
you know, we've caught the tiger by the tail, how do we make sure we keep up with it as it goes
link |
02:16:40.320
forward? So that's one of the questions I have about sort of open source communities, is at
link |
02:16:45.600
its best, there's a kind of humility. Does that humility prevent you from having a vision for creating
link |
02:16:53.040
something like very new and powerful? And you've brought us back to consciousness again. The
link |
02:16:57.680
collaboration is a swarm emergent dynamic. Humility lets these people work together without
link |
02:17:02.720
anyone trouncing anyone else. How do they, you know, in consciousness, there's the question of
link |
02:17:07.760
the binding problem. How does a singular, our attention, how does that emerge from, you know,
link |
02:17:12.400
billions of neurons? So how can you have a swarm of people emerge a consensus that has a singular
link |
02:17:19.040
vision to say, we will do this. And most importantly, we're not going to do these things. Emerging a
link |
02:17:25.360
coherent, pointed, focused leadership dynamic from a collaboration, being able to do that kind of,
link |
02:17:31.920
and then dissolve it so people can still do the swarm thing. That's a problem. That's a question.
link |
02:17:37.040
So do you have to have a charismatic leader? For some reason, Linus Torvalds comes to mind,
link |
02:17:42.480
but you know, there are people who criticize that he rules with an iron fist, man. But there's
link |
02:17:46.880
still charisma. There's charisma, right? There's charisma to that iron fist. There's a,
link |
02:17:53.440
every leader is different, I would say, in their success. So he doesn't, I don't even know if you
link |
02:17:59.040
can say he doesn't have humility. There's such a meritocracy of ideas that like, this is a good
link |
02:18:08.480
idea and this is a bad idea. There's a step function to it. Once you clear a threshold,
link |
02:18:12.800
he's open to your ideas, I think. The interesting thing is, obviously,
link |
02:18:20.160
that will not stand in an open source community if that threshold that is defined by that one
link |
02:18:26.320
particular person is not actually that good. So you actually have to be really excellent at what
link |
02:18:32.320
you do. So he's very good at what he does. And so there's some aspect of leadership where
link |
02:18:39.680
you can get thrown out. People can just leave. That's how it works with the open source of the
link |
02:18:44.960
fork. But at the same time, you want to sometimes be a leader with a strong opinion because people,
link |
02:18:52.160
I mean, there's some kind of balance here for this hive mind to get behind.
link |
02:18:57.200
Leadership is a big topic. And I didn't, I'm not one of these guys that went to MBA school and said,
link |
02:19:00.960
I'm going to be an entrepreneur and I'm going to be a leader. And I'm going to read all these
link |
02:19:04.400
Harvard Business Review articles on leadership and all this other stuff. I was a physicist
link |
02:19:09.520
turned into a software nerd who then really nerded out on Python. And now I am entrepreneurial. I
link |
02:19:14.640
saw a business opportunity around the use of Python for data. But for me, what has been interesting
link |
02:19:19.840
over this journey with the last 10 years is how much I started really enjoying the understanding,
link |
02:19:27.840
thinking deeper about organizational dynamics and leadership. And leadership does come down to
link |
02:19:33.120
a few core things. Number one, a leader has to create belief or at least has to dispel disbelief.
link |
02:19:43.920
Leadership also, you have to have vision, loyalty and experience.
link |
02:19:49.200
So can you say belief in a singular vision? What is belief?
link |
02:19:53.600
Yeah, belief means a few things. Belief means here's what we need to do. And this is a valid thing
link |
02:19:57.760
to do. And we can do it that you have to be able to drive that belief. And every step of leadership
link |
02:20:07.600
along the way has to help you amplify that belief to more people. I mean, I think at a fundamental
link |
02:20:14.160
level, that's what it is. You have to have a vision. You have to be able to show people that,
link |
02:20:20.640
or you have to convince people to believe in the vision and to get behind you. And that's where
link |
02:20:25.360
the loyalty part comes in and the experience part comes in. There's all different flavors of
link |
02:20:29.520
leadership. So if we talk about Linus, we could talk about Elon Musk and Steve Jobs. There's
link |
02:20:36.560
Sundar Pichai. There's people that kind of put themselves at the center and are strongly opinionated.
link |
02:20:42.320
And some people are more like consensus builders. What works well for open source? What works well
link |
02:20:48.480
in the space of programmers? So you've been a programmer. You've led many programmers and now
link |
02:20:53.600
sort of at the center of this ecosystem, what works well in the programming world, would you say?
link |
02:20:58.800
It really depends on the people, what style of leadership is best. And it depends on the
link |
02:21:03.280
programming community. I think for the Python community, servant leadership is one of the
link |
02:21:07.680
values. At the end of the day, the leader has to also be the high priest of values. So any
link |
02:21:17.200
collection of people has values of their living. And if you want to maintain certain values and
link |
02:21:24.480
those values help you as an organization become more powerful, then the leader has to live those
link |
02:21:29.600
values unequivocally and has to hold the values. So in our case, in this collaborative community
link |
02:21:37.520
around Python, I think that the humility is one of those values. Servant leadership,
link |
02:21:43.760
you actually have to kind of do the stuff. You have to walk the walk, not just talk the talk.
link |
02:21:48.880
I don't feel like the Python community really demands that much from a vision standpoint.
link |
02:21:53.680
And they should. And I think they should. This is the interesting thing is that so
link |
02:22:00.160
many people use Python. From where comes the vision? You have an Elon Musk type character who
link |
02:22:07.760
has bold statements about the vision for particular companies he's involved with. And it's like,
link |
02:22:16.320
I think a lot of people that work at those companies kind of can only last if they believe
link |
02:22:23.120
that vision. And some of it is super bold. So my question is, and by the way, those companies
link |
02:22:29.120
often use Python, how do you establish a vision? Get to 100 million users, right?
link |
02:22:36.160
Get to where Python is at the center of the, what was it, data science,
link |
02:22:46.640
machine learning, deep learning, artificial intelligence revolution, right? In many ways,
link |
02:22:53.680
perhaps the Python community is not thinking of it that way, but it's leading the way on this.
link |
02:22:58.080
Like the tooling is like essential. Right. Well, for a while,
link |
02:23:03.120
PyCon, people in the scientific Python and the PyData community, they would submit talks.
link |
02:23:09.120
Those are early 2010s. They would submit talks to PyCon and the talks would all be rejected
link |
02:23:15.680
because there was the separate sort of PyData conferences. And they're like, well, these
link |
02:23:19.440
probably belong more to PyData. And instead, there'd be yet another talk about threads and
link |
02:23:24.720
whatever, some web framework. And it's like, that was an interesting dynamic to see that there was,
link |
02:23:30.800
I mean, at the time, it was a little annoying because we want to try to get more users and
link |
02:23:34.240
get more people talking about these things. And PyCon is a huge venue, right? It's
link |
02:23:38.000
thousands of Python programmers. But then also came to appreciate that, you know, parallel,
link |
02:23:42.960
having an ecosystem that allows parallel innovation is not bad, right? There are people
link |
02:23:47.680
doing embedded Python stuff. There's people doing web programming, people doing scripting,
link |
02:23:51.280
there's cyber uses of Python. I think ultimately at some point, if your slide mode
link |
02:23:56.320
covers so much stuff, you have to respect that different things are growing in different areas
link |
02:24:01.040
and different niches. Now, at some point, that has to come together and the central body has to
link |
02:24:06.080
provide resources. The principle here is subsidiarity. Give resources to the various groups
link |
02:24:11.600
to then allocate as they see fit in their niches. That would be a really helpful dynamic. But again,
link |
02:24:16.560
it's a volunteer community. It's not like they had that many resources to start with.
link |
02:24:21.040
What was or is your favorite programming setup? What operating system, what keyboard,
link |
02:24:25.600
how many screens are you listening to? What time of day? Are you drinking coffee? Tea?
link |
02:24:32.960
Tea. Sometimes coffee, depending on how well I slept. I used to have...
link |
02:24:37.120
How deep do you get? A night owl? I remember somebody asked you somewhere a question about
link |
02:24:42.080
work life balance. Not just work life balance, but like a family. You lead a company and your
link |
02:24:49.280
answer was basically like, I still haven't figured it out.
link |
02:24:53.440
Yeah. I think I've gotten to a little bit better balance. I have a really great leadership team
link |
02:24:57.840
now supporting me. That takes a lot of the day to day stuff off my plate. My kids are getting
link |
02:25:03.600
a little older, so that helps. Of course, I have a wonderful wife who takes care of a lot of the
link |
02:25:08.800
things that I'm not able to take care of and she's great. I try to get to sleep earlier now
link |
02:25:13.520
because I have to get up every morning at six to take my kid down to the bus stop.
link |
02:25:16.800
So there's a hard thing. For a while, I was doing polyphasic sleep, which is really interesting.
link |
02:25:21.920
Like I go to bed at nine, wake up at like 2 a.m., work till five, sleep three hours,
link |
02:25:26.560
wake up at eight. That was actually... It was interesting. It wasn't too bad.
link |
02:25:29.840
How did it feel? It was good. I didn't keep it up for years, but once I have travel,
link |
02:25:34.800
then it just... Everything goes out the window, right? Because then you're like time zones and
link |
02:25:38.480
all these things. Socially, was it accepted? Were you able to live outside of how you felt?
link |
02:25:43.360
Yes. Were you able to live in normal society?
link |
02:25:45.600
Oh yeah, because on the nights that I wasn't out hanging out with people or whatever,
link |
02:25:48.800
going to bed at nine, no one cares. I wake up at two, I'm still responding to their slacks, emails,
link |
02:25:53.120
whatever, and shitposting on Twitter or whatever at two in the morning is great.
link |
02:25:57.920
Yes, exactly.
link |
02:25:58.800
Right? And then you go to bed for a few hours and you wake up. It's like you had an extra day in
link |
02:26:03.280
the middle. And I'd read somewhere that humans naturally have biphasic sleep or something.
link |
02:26:07.520
I don't know. I read basically everything somewhere. So every option of everything.
link |
02:26:13.200
Every option of everything. I will say that that worked out for me for a while,
link |
02:26:16.640
although I don't do it anymore. In terms of programming setup, I had a 27 inch high DPI
link |
02:26:22.720
setup that I really liked. But then I moved to a curved monitor just because when I moved to the
link |
02:26:27.840
new house, I want to have a bit more screen for Zoom plus communications plus various kinds of things.
link |
02:26:33.840
It's like one large monitor.
link |
02:26:35.600
One large curved monitor.
link |
02:26:37.840
What operating system?
link |
02:26:39.680
Mac.
link |
02:26:40.480
Okay.
link |
02:26:41.040
Yeah.
link |
02:26:41.520
Is that what happens when you become important? You stop using Linux and Windows?
link |
02:26:46.080
No, I actually have a Windows box as well on the next table over.
link |
02:26:50.480
But I have three desks, right?
link |
02:26:54.000
Yes.
link |
02:26:54.240
So a main one is a standing desk so that I can whatever. I have a teleprompter set up and
link |
02:26:59.280
everything else. And then I've got my iMac and then eGPU and then Windows PC.
link |
02:27:06.320
The reason I moved to Mac was it's got a Linux prompt.
link |
02:27:10.400
Or sorry, it's got a Unix prompt so I can do all my stuff.
link |
02:27:14.080
But then I don't have to worry like when I'm presenting for clients or investors or whatever.
link |
02:27:21.600
I don't have to worry about any like ACPI related FSIQ things in the middle of a presentation,
link |
02:27:27.040
like none of that. It just, it will always wake from sleep.
link |
02:27:30.800
And it won't kernel panic on me.
link |
02:27:32.160
And this is not a dig against Linux except that I just, I feel really bad.
link |
02:27:38.240
I feel like a traitor to my community saying this, right?
link |
02:27:40.080
But in 2012, I was just like, okay, start my own company, what do I get?
link |
02:27:43.920
And Linux laptops were just not quite there.
link |
02:27:47.360
And so I've just stuck with Max.
link |
02:27:48.400
Can I just defend something that nobody respectable seems to do?
link |
02:27:52.240
Which is, I dual boot Linux and Windows, but in Windows, I have Windows Subsystem for Linux
link |
02:28:00.320
or whatever WSL.
link |
02:28:01.600
Yes, WSL.
link |
02:28:02.880
And I find myself being able to handle everything I need, or almost everything I need, in Linux
link |
02:28:08.240
for basic sort of tasks, scripting tasks within WSL and it creates a really nice environment.
link |
02:28:12.800
So I've been, but like whenever I hang out with like, especially important people,
link |
02:28:17.760
like they're all on iPhone and a Mac.
link |
02:28:20.560
And it's like, yeah, like what there, there is a messiness to Windows and a messiness to Linux
link |
02:28:27.680
that makes me feel like you're still in it.
link |
02:28:31.760
Well, the Linux stuff, Windows subsystem for Linux is very tempting,
link |
02:28:35.760
but there's still the Windows on the outside where I don't know where, and I've been,
link |
02:28:40.640
okay, I've been, I've used DOS since version 1.1.1 or 1.2.1 or something.
link |
02:28:45.440
So I've been a long time Microsoft user.
link |
02:28:48.240
And I will say that like, it's really hard for me to know where anything is,
link |
02:28:53.280
how to get to the details behind something when something screws up as an invariably does.
link |
02:28:57.440
And just things like changing group permissions on some shared folders and stuff,
link |
02:29:01.280
just everything seems a little bit more awkward, more clicks than it needs to be.
link |
02:29:06.320
Not to say that there aren't any weird things like hidden attributes and all this other happy
link |
02:29:09.440
stuff on Mac, but for the most part, and well, actually, especially now with the new hardware
link |
02:29:16.000
coming out on Mac, that'll be very interesting, with the new M1, there were some dark years
link |
02:29:20.880
in the last few years when I was like, I think maybe I have to move off of Mac as a platform.
link |
02:29:26.480
I mean, like my keyboard was just not working, like literally my keyboard just wasn't working,
link |
02:29:30.560
right? I had this touch bar, didn't have a physical escape button like I needed to
link |
02:29:34.000
because I used VIM, and now I think we're back.
link |
02:29:36.800
So you use VIM, and you have what kind of keyboard?
link |
02:29:40.160
So I use a RealForce 87U. It's a mechanical, it's a Topre key switch.
link |
02:29:45.040
It's a weird shape. There's a normal shape.
link |
02:29:48.240
Oh, no, because I say that because I use a Kinesis and I had, you said some dark, you said,
link |
02:29:53.040
yeah, dark moments. I've recently had a dark moment was like, what am I doing with my life?
link |
02:29:58.720
So I remember sort of flying in a very kind of tight space. And as I'm working,
link |
02:30:04.960
this is what I do on an airplane. I pull out a laptop and on top of the laptop,
link |
02:30:09.520
I'll put a Kinesis keyboard. That's hardcore, man.
link |
02:30:11.920
I was thinking, is this who I am? Is this what I'm becoming?
link |
02:30:14.960
Will I be this person? Because I'm on Emacs with this Kinesis keyboard sitting like
link |
02:30:20.560
with everybody around. Emacs on Windows.
link |
02:30:23.600
On the WSL, yeah. Yeah, Emacs on Linux on Windows.
link |
02:30:27.280
Yeah, on Windows. And like everybody around me is using their iPhone to look at TikTok.
link |
02:30:33.040
So I'm like in this land and I thought, you know what? Maybe I need to become an adult and
link |
02:30:38.400
put the 90s behind me and use like a normal keyboard. And then I did some soul searching
link |
02:30:44.880
and I decided like, this is who I am. This is me like coming out of the closet to saying,
link |
02:30:48.720
I'm Kinesis keyboard all the way. I'm going to use Emacs.
link |
02:30:52.800
You know, also the Kinesis fan, Wes McKinney, who created pandas.
link |
02:30:57.440
He just, he banged out pandas on a Kinesis keyboard, I believe. I don't know if he's still
link |
02:31:00.480
using one maybe, but certainly 10 years ago, like he was.
link |
02:31:03.920
If anyone's out there, maybe we need to have a Kinesis support group. Please reach out.
link |
02:31:08.240
Isn't there already one? Is there one?
link |
02:31:10.000
I don't know. There's got to be an IRC channel, man.
link |
02:31:14.320
Oh no. And you access it through Emacs. Okay. Do you still program these days?
link |
02:31:19.440
I do a little bit. Honestly, the last thing I did was I had written, I was working with my son
link |
02:31:26.720
to script some Minecraft stuff. So I was doing a little bit of that. That was the last, literally
link |
02:31:30.480
the last code I wrote. Oh, you know what? Also, I wrote some code to do some
link |
02:31:35.360
cap table evaluation, waterfall modeling kind of stuff.
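A minimal sketch of what that kind of cap-table waterfall code can look like, simplified to a single 1x non-participating preferred class splitting exit proceeds with common; the function and all numbers are invented for illustration, not his actual code:

```python
def waterfall(proceeds, preferred_invested, preferred_shares, common_shares):
    """Split exit proceeds between one 1x non-participating preferred class
    and common stock.

    Preferred holders take the better of (a) their 1x liquidation preference
    or (b) converting to common and sharing pro-rata.
    """
    total_shares = preferred_shares + common_shares
    as_converted = proceeds * preferred_shares / total_shares
    if as_converted > preferred_invested:
        # Big exit: preferred converts and everyone shares pro-rata.
        preferred_payout = as_converted
    else:
        # Small exit: preferred takes its 1x preference off the top,
        # capped at the actual proceeds.
        preferred_payout = min(preferred_invested, proceeds)
    common_payout = proceeds - preferred_payout
    return preferred_payout, common_payout
```

On a $10M exit with $2M invested for a quarter of the shares, preferred converts and takes $2.5M; on a $4M exit it takes its $2M preference instead.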
link |
02:31:39.120
What advice would you give to a young person? He said your son today in high school,
link |
02:31:44.000
maybe even college, about career, about life.
link |
02:31:48.400
This may be where I get into trouble a little bit.
link |
02:31:51.040
We are coming to the end. We're rapidly entering a time between worlds.
link |
02:31:55.840
So we have a world now that's starting to really crumble under the weight of aging institutions
link |
02:32:01.680
that no longer even pretend to serve the purposes they were created for. We are creating technologies
link |
02:32:07.280
that are hurtling billions of people headlong into philosophical crises, who they don't even know
link |
02:32:12.480
the philosophical operating systems in their firmware, and they're heading into a time when
link |
02:32:16.320
that gets vaporized. So for people in high school, and certainly I tell my son this as well, he's in
link |
02:32:21.760
middle school, people in college, you are going to have to find your own way. You're going to
link |
02:32:30.000
have to have a pioneer spirit, even if you live in the middle of the most dense urban environment.
link |
02:32:37.520
All of human reality around you is the result of the last few generations of humans
link |
02:32:45.520
agreeing to play certain kinds of games. A lot of those games no longer operate according to
link |
02:32:53.520
the rules they used to. Collapse is nonlinear, but it will be managed. And so if you are in a
link |
02:33:01.120
particular social caste or economic caste, and I think it's not kosher to say that about America,
link |
02:33:10.000
but America is a very stratified and classist society, there's some mobility, but it's really
link |
02:33:16.240
quite classist. And in America, unless you're in the upper middle class, you are headed into very
link |
02:33:22.000
choppy waters. So it is really, really good to think and understand the fundamentals of what you
link |
02:33:28.400
need to build a meaningful life for you, your loved ones with your family. And almost all of the
link |
02:33:36.560
technology being created that's consumer facing is designed to own people, to take the four stack
link |
02:33:44.240
of people, to delaminate them and to own certain portions of that stack. And so if you want to
link |
02:33:50.880
be an integral human being, if you want to have your agency and you want to find your own way in
link |
02:33:56.240
the world, when you're young, would be a great time to spend time looking at some of the classics
link |
02:34:02.880
around what it means to live a good life, what it means to build connection with people. And so
link |
02:34:08.880
much of the status games, so much of the stuff, one of the things that I talk about as we create
link |
02:34:16.640
more and more technology, there's a gradient of technology and a gradient of technology always
link |
02:34:20.960
leads to a gradient of power. And this is Jacques Ellul's point to some extent as well.
link |
02:34:25.040
That gradient of power is not going to go away. The technologies are going so fast
link |
02:34:29.120
that even people like me, who helped create some of this stuff, are being left behind by
link |
02:34:33.600
cutting-edge research. I don't know what's going on in it today. I'll go read some proceedings.
link |
02:34:38.640
So as the world gets more and more technological, it will create more and more gradients where people
link |
02:34:45.040
will seize power, economic fortunes. And the way they make the people who are left behind okay
link |
02:34:51.600
with their lot in life is they create lottery systems. They make you take part in the narrative
link |
02:34:59.840
of your own being trapped in your own economic zone. So avoiding those kinds of things is really
link |
02:35:07.200
important knowing when someone is running game on you basically. So these are things I would tell
link |
02:35:11.760
young people. It's a dark message, but it's realism. I mean, that's what I see. So after you
link |
02:35:16.400
gave some realism, you sit back with your son, you're looking out at the sunset. What to him
link |
02:35:24.080
can you give as words of hope and to you from where do you derive hope for the future of our
link |
02:35:31.360
world? So you said at the individual level, you have to have a pioneer mindset to go back to
link |
02:35:37.280
the classics to understand what is in human nature you can find meaning. But at the societal level,
link |
02:35:42.960
what trajectory, when you look at possible trajectories, what gives you hope?
link |
02:35:47.280
What gives me hope is that we have little tremors now shaking people out of the reverie
link |
02:35:54.560
of the fiction of modernity that they've been living in kind of a late 20th century style
link |
02:35:58.640
modernity. That's good, I think, because and also to your point earlier, people are burning
link |
02:36:06.880
out on some of the social media stuff. They're sort of seeing the ugly side, especially the
link |
02:36:09.680
latest news with Facebook and the whistleblower. It's quite clear these things are not all
link |
02:36:15.760
they're cracked up to be. I believe better social media can be built because they are burning out
link |
02:36:21.680
and incentivize other competitors to be built. Do you think that's possible?
link |
02:36:26.320
Well, the thing about it is that when you have extractive, returns-seeking capital coming in
link |
02:36:34.560
and saying, look, you own a network, give me some exponential dynamics out of this network.
link |
02:36:38.320
What are you going to do? You're going to just basically put a toll keeper at every single node
link |
02:36:42.560
and every single graph edge, every node, every vertex, every edge. But if you don't have that
link |
02:36:49.120
need for it, if no one's sitting there saying, hey, Wikipedia monetize every character, every
link |
02:36:53.360
byte, every phrase, then generative human dynamics will naturally sort of arise, assuming we respect
link |
02:36:59.840
a few principles around online communications. So the greatest and biggest social network
link |
02:37:05.200
in the world is still like email SMS. So we're fine there. The issue with the social media,
link |
02:37:12.000
as we call it now, is they're actually just new amplification systems. Now it's benefit
link |
02:37:17.040
to certain people like yourself who have interesting content to be amplified. So it's
link |
02:37:23.200
created a creator economy and that's cool. There's a lot of great content out there.
link |
02:37:26.720
But giving everyone a shot at the fame lottery saying, hey, you could also have your, if you
link |
02:37:31.520
wiggle your butt the right way on TikTok, you can have your 15 seconds of microfame. That's not
link |
02:37:36.320
healthy for society at large. So I think if we can create tools that help people be conscientious
link |
02:37:44.160
about their attention, spend time looking at the past and really retrieving memory and calling,
link |
02:37:49.520
not calling, but processing and thinking about that, I think that's certainly possible. And
link |
02:37:55.200
hopefully that's what we get. So the bigger picture, the bigger question that you're asking about,
link |
02:38:01.040
what gives me hope is that these early shocks of COVID lockdowns and remote work and all these
link |
02:38:10.800
different kinds of things, I think it's getting people to a point where they are looking, they're
link |
02:38:16.880
sort of no longer in the reverie. As my friend Jim Rutt says, there's more people with ears to hear
link |
02:38:22.640
now with the pandemic and education. Everyone's like, wait, wait, what have you guys been doing
link |
02:38:27.920
with my kids? How are you teaching them? What is this crap you're giving them as homework?
link |
02:38:32.240
So I think these are the kinds of things, along with the supply chain disruptions, that are
link |
02:38:36.720
getting more people to think about, how do we actually just make stuff? This is all good,
link |
02:38:42.800
but the concern is that it's still going to take a while for these things, for people to learn how
link |
02:38:49.200
to be agentic again and to be in right relationship with each other and with the world. So the
link |
02:38:56.320
message of hope is still people are resilient and we are building some really amazing technology.
link |
02:39:01.280
And I also, to me, I derive a lot of hope from individuals in that vein. The power of a single
link |
02:39:09.440
individual to transform the world, to do positive things to the world is quite incredible. You've
link |
02:39:15.120
been talking about it's nice to have as many of those individuals as possible, but even the power
link |
02:39:19.520
of one is kind of magical. It is. We're in a mode now where we can do that. I think also,
link |
02:39:25.120
you know, part of what I try to do is coming to podcasts like yours and spamming you with all
link |
02:39:30.080
this philosophical stuff that I've got going on, there are a lot of good people out there trying
link |
02:39:34.960
to put words around the current technological, social, economic crises that we're facing.
link |
02:39:43.120
In the space of a few short years, I think there has been a lot of great content produced
link |
02:39:46.560
around this stuff for people who want to see, want to find out more or think more about this.
link |
02:39:52.000
We're popularizing certain kinds of philosophical ideas that move people beyond just the, oh,
link |
02:39:56.560
you're a communist, oh, you're a capitalist kind of stuff. We're way past that now.
link |
02:40:01.120
So that also gives me hope that I feel like I myself am getting a handle on how to think about
link |
02:40:06.240
these things. It makes me feel like I can hopefully effect change for the better.
link |
02:40:12.400
We've been sneaking up on this question all over the place. Let me ask the big ridiculous question.
link |
02:40:17.520
What is the meaning of life? Wow. The meaning of life.
link |
02:40:28.720
Yeah, I don't know. I mean, I've not really understood that question.
link |
02:40:32.080
When you say meaning crisis, you're saying that there is a search for a kind of experience that
link |
02:40:42.480
could be described as fulfillment, like the aha moment of just pure joy. And maybe when you see
link |
02:40:51.120
something beautiful, or maybe you have created something beautiful, that experience that you get,
link |
02:40:57.280
it feels like it all makes sense. So some of that is just chemicals coming together in your mind
link |
02:41:03.920
and all kinds of things. But it seems like we're building a sophisticated collective intelligence
link |
02:41:11.600
that's providing meaning in all kinds of ways to its members. And there's a theme to that meaning.
link |
02:41:19.920
So for a lot of history, I think faith played an important role. Faith in God, in religion.
link |
02:41:28.800
I think technology in the modern era is kind of serving a little bit of a source of meaning
link |
02:41:34.800
for people, like innovation of different kinds. I think the old school things of love and the
link |
02:41:43.120
basics of just being good at stuff. But you were a physicist. So there's a desire to say, okay,
link |
02:41:51.600
yeah, but these seem to be like symptoms of something deeper. Right. Like, that's little m meaning.
link |
02:41:57.600
What's capital M meaning? Yeah, what's capital M meaning? Why are we reaching for order when
link |
02:42:02.800
there is excess of energy? I don't know if I can answer the why. Any why that I come up with,
link |
02:42:10.000
I think, is going to be, I'd have to think about that a little more, maybe get back to you on that.
link |
02:42:16.480
But I will say this. We do look at the world through a traditional, I think most people look
link |
02:42:22.720
at the world through what I would say is a subject object kind of metaphysical lens, that
link |
02:42:27.120
we have our own subjectivity. And then there's all of these object things that are not us.
link |
02:42:34.000
So I'm me and these things are not me. And I'm interacting with them. I'm doing things to them.
link |
02:42:39.760
But a different view of the world that looks at it as much more connected that realizes, oh,
link |
02:42:46.880
I'm really quite embedded in a soup of other things. And I'm simply almost like a standing
link |
02:42:53.040
wave pattern of different things. So when you look at the world in that kind of connected sense,
link |
02:42:59.040
I've recently taken a shine to this particular thought experiment, which is what if it was the
link |
02:43:06.480
case that everything that we touch with our hands, that we pay attention to, that we actually give
link |
02:43:14.560
intimacy to, what if there's actually all the mumbo jumbo people with the magnetic healing
link |
02:43:24.400
crystals and all this other kind of stuff and quantum energy stuff, what if that was a thing?
link |
02:43:30.160
What if when you literally when your hand touches an object, when you really look at
link |
02:43:34.880
something and you concentrate and you focus on it and you really give it attention,
link |
02:43:38.000
you actually give it, there is some physical residue of something, a part of you,
link |
02:43:45.440
a bit of your life force that goes into it. Now, this is, of course, completely mumbo jumbo stuff.
link |
02:43:51.280
This is not like, I don't actually think this is real, but let's do the thought experiment.
link |
02:43:55.600
What if it was? What if there actually was some quantum magnetic crystal and energy field thing
link |
02:44:03.440
that just by touching this can, this can has changed a little bit somehow, and it's not much
link |
02:44:10.400
unless you put a lot into it and you touch it all the time, like your phone, right?
link |
02:44:14.880
These things gain, they gain meaning to you a little bit, but what if there's something that
link |
02:44:23.360
technical objects, the phone is a technical object, it does not really receive attention or
link |
02:44:28.160
intimacy and then allow itself to be transformed by it. But if it's a piece of wood,
link |
02:44:33.280
if it's the handle of a knife that your mother used for 20 years to make dinner for you, right?
link |
02:44:40.320
What if it's a keyboard that you banged out your world transforming software library on?
link |
02:44:46.560
These are technical objects and these are physical objects, but somehow there's something to them.
link |
02:44:51.360
We feel an attraction to these objects as if we have imbued them with life energy.
link |
02:44:57.120
If you walk that thought experiment through, what happens when we touch another person,
link |
02:45:00.320
when we hug them, when we hold them? And the reason this ties into my answer for your question is
link |
02:45:08.160
that if there is such a thing, if we were to hypothesize, you know, hypothesize such a
link |
02:45:16.320
thing, it could be that the purpose of our lives is to imbue as many things with that love as possible.
link |
02:45:30.560
That's a beautiful answer and a beautiful way to end it. Peter, you're an incredible person.
link |
02:45:36.480
Thank you.
link |
02:45:36.880
Spanning so much in the space of engineering and in the space of philosophy. I'm really proud to
link |
02:45:46.560
be living in the same city as you. And I'm really grateful that you have spent your valuable time
link |
02:45:52.320
with me today. Well, thank you. I appreciate the opportunity to speak with you.
link |
02:45:56.400
Thanks for listening to this conversation with Peter Wang. To support this podcast,
link |
02:46:00.480
please check out our sponsors in the description. And now let me leave you with some words for
link |
02:46:05.680
Peter Wang himself. We tend to think of people as either malicious or incompetent. But in a world
link |
02:46:13.280
filled with corruptible and unchecked institutions, there exists a third thing, malicious incompetence.
link |
02:46:20.880
It's a social cancer and it only appears once human organizations scale beyond personal accountability.
link |
02:46:27.520
Thank you for listening and hope to see you next time.