
Jim Keller: The Future of Computing, AI, Life, and Consciousness | Lex Fridman Podcast #162



link |
00:00:00.000
The following is a conversation with Jim Keller,
link |
00:00:02.920
his second time in the podcast.
link |
00:00:04.960
Jim is a legendary microprocessor architect
link |
00:00:08.480
and is widely seen as one of the greatest
link |
00:00:11.080
engineering minds of the computing age.
link |
00:00:14.620
In a peculiar twist of space time in our simulation,
link |
00:00:18.840
Jim is also a brother-in-law of Jordan Peterson.
link |
00:00:22.200
We talk about this and about computing,
link |
00:00:25.320
artificial intelligence, consciousness, and life.
link |
00:00:29.200
Quick mention of our sponsors.
link |
00:00:31.280
Athletic Greens All In One Nutrition Drink,
link |
00:00:33.780
Brooklinen Sheets, ExpressVPN,
link |
00:00:36.600
and Belcampo Grass Fed Meat.
link |
00:00:39.600
Click the sponsor links to get a discount
link |
00:00:41.680
and to support this podcast.
link |
00:00:43.920
As a side note, let me say that Jim is someone who,
link |
00:00:46.540
on a personal level, inspired me to be myself.
link |
00:00:50.160
There was something in his words, on and off the mic,
link |
00:00:53.340
or perhaps that he even paid attention to me at all,
link |
00:00:56.200
that almost told me, you're all right, kid.
link |
00:00:59.160
A kind of pat on the back that can make the difference
link |
00:01:01.820
between a mind that flourishes
link |
00:01:03.640
and a mind that is broken down
link |
00:01:05.760
by the cynicism of the world.
link |
00:01:08.160
So I guess that's just my brief few words
link |
00:01:10.440
of thank you to Jim, and in general,
link |
00:01:12.800
gratitude for the people who have given me a chance
link |
00:01:15.440
on this podcast, in my work, and in life.
link |
00:01:19.000
If you enjoy this thing, subscribe on YouTube,
link |
00:01:21.200
review on Apple Podcast, follow on Spotify,
link |
00:01:24.240
support on Patreon, or connect with me
link |
00:01:26.360
on Twitter at Lex Fridman.
link |
00:01:28.560
And now, here's my conversation with Jim Keller.
link |
00:01:33.360
What's the value and effectiveness
link |
00:01:35.340
of theory versus engineering, this dichotomy,
link |
00:01:38.080
in building good software or hardware systems?
link |
00:01:43.400
Well, good design is both.
link |
00:01:46.440
I guess that's pretty obvious.
link |
00:01:48.680
By engineering, do you mean reduction to practice
link |
00:01:51.840
of known methods?
link |
00:01:53.260
And then science is the pursuit of discovering things
link |
00:01:55.960
that people don't understand.
link |
00:01:57.780
Or solving unknown problems.
link |
00:02:00.340
Definitions are interesting here,
link |
00:02:01.960
but I was thinking more in theory,
link |
00:02:04.120
constructing models that kind of generalize
link |
00:02:06.740
about how things work.
link |
00:02:08.540
And engineering is actually building stuff.
link |
00:02:12.760
The pragmatic, like, okay, we have these nice models,
link |
00:02:16.180
but how do we actually get things to work?
link |
00:02:17.920
Maybe economics is a nice example.
link |
00:02:20.740
Like, economists have all these models
link |
00:02:22.440
of how the economy works,
link |
00:02:23.640
and how different policies will have an effect,
link |
00:02:26.680
but then there's the actual, okay,
link |
00:02:29.240
let's call it engineering,
link |
00:02:30.480
of like, actually deploying the policies.
link |
00:02:33.240
So computer design is almost all engineering.
link |
00:02:36.380
And reduction to practice of known methods.
link |
00:02:38.200
Now, because of the complexity of the computers we built,
link |
00:02:43.560
you know, you could think you're,
link |
00:02:44.960
well, we'll just go write some code,
link |
00:02:46.600
and then we'll verify it, and then we'll put it together,
link |
00:02:49.160
and then you find out that the combination
link |
00:02:50.920
of all that stuff is complicated.
link |
00:02:53.200
And then you have to be inventive
link |
00:02:54.700
to figure out how to do it, right?
link |
00:02:56.920
So that definitely happens a lot.
link |
00:02:59.760
And then, every so often, some big idea happens.
link |
00:03:04.440
But it might be one person.
link |
00:03:06.360
And that idea is in the space of engineering,
link |
00:03:08.840
or is it in the space of...
link |
00:03:10.440
Well, I'll give you an example.
link |
00:03:11.380
So one of the limits of computer performance
link |
00:03:13.140
is branch prediction.
link |
00:03:14.880
So, and there's a whole bunch of ideas
link |
00:03:17.500
about how good you could predict a branch.
link |
00:03:19.440
And people said, there's a limit to it,
link |
00:03:21.640
it's an asymptotic curve.
link |
00:03:23.480
And somebody came up with a better way
link |
00:03:24.920
to do branch prediction, it was a lot better.
link |
00:03:28.280
And he published a paper on it,
link |
00:03:29.720
and every computer in the world now uses it.
link |
00:03:32.760
And it was one idea.
link |
00:03:34.600
So the engineers who build branch prediction hardware
link |
00:03:37.960
were happy to drop the one kind of training array
link |
00:03:40.520
and put it in another one.
link |
00:03:42.380
So it was a real idea.
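To make that concrete, here is a minimal sketch in C of a history-based branch predictor in the spirit of a gshare-style two-level scheme. This is an illustration only, not the specific published predictor Jim is referring to; the table size and the XOR indexing are arbitrary choices.

#include <stdbool.h>
#include <stdint.h>

#define TABLE_BITS 12
#define TABLE_SIZE (1u << TABLE_BITS)

/* Table of 2-bit saturating counters: 0,1 predict not-taken; 2,3 predict taken. */
static uint8_t counters[TABLE_SIZE];
/* Global history: outcomes of recent branches, one bit each. */
static uint32_t history;

static uint32_t index_of(uint32_t pc) {
    return (pc ^ history) & (TABLE_SIZE - 1);
}

bool predict(uint32_t pc) {
    return counters[index_of(pc)] >= 2;
}

void update(uint32_t pc, bool taken) {
    uint8_t *c = &counters[index_of(pc)];
    if (taken && *c < 3) (*c)++;        /* saturate instead of wrapping */
    if (!taken && *c > 0) (*c)--;
    history = (history << 1) | (taken ? 1 : 0);
}

The "drop one kind of training array and put in another" remark maps onto this shape: the counter table is the training state, and a better predictor mostly changes how history and address are combined to index and organize it.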
link |
00:03:44.840
And branch prediction is one of the key problems
link |
00:03:48.520
underlying all of sort of the lowest level of software.
link |
00:03:51.960
It boils down to branch prediction.
link |
00:03:53.800
Boils down to uncertainty.
link |
00:03:54.860
Computers are limited by...
link |
00:03:56.280
Single thread computer is limited by two things.
link |
00:03:58.640
The predictability of the path of the branches
link |
00:04:01.400
and the predictability of the locality of data.
link |
00:04:05.320
So we have predictors that now predict
link |
00:04:07.080
both of those pretty well.
link |
00:04:09.160
So memory is a couple hundred cycles away,
link |
00:04:11.880
local cache is a couple cycles away.
link |
00:04:14.540
When you're executing fast,
link |
00:04:15.720
virtually all the data has to be in the local cache.
link |
00:04:19.020
So a simple program says,
link |
00:04:21.320
add one to every element in an array,
link |
00:04:23.280
it's really easy to see what the stream of data will be.
link |
00:04:26.680
But you might have a more complicated program
link |
00:04:28.520
that says, get an element of this array,
link |
00:04:31.080
look at something, make a decision,
link |
00:04:32.800
go get another element, it's kind of random.
link |
00:04:35.200
And you can think, that's really unpredictable.
link |
00:04:37.760
And then you make this big predictor
link |
00:04:39.200
that looks at this kind of pattern and you realize,
link |
00:04:41.400
well, if you get this data and this data,
link |
00:04:43.000
then you probably want that one.
link |
00:04:44.560
And if you get this one and this one and this one,
link |
00:04:46.440
you probably want that one.
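As a concrete version of those two access patterns, here is a small C sketch (the names are made up). The first loop's address stream is a fixed stride, which a prefetcher sees immediately; in the second, each address comes from loaded data, which is the kind of pattern the big correlating predictors Jim describes try to learn.

#include <stddef.h>

/* Easy case: "add one to every element in an array".
 * The address stream is a, a+1, a+2, ... and trivially predictable. */
void add_one(int *a, size_t n) {
    for (size_t i = 0; i < n; i++)
        a[i] += 1;
}

/* Hard case: get an element, make a decision, go get another element.
 * The next address depends on data that was just loaded. */
struct node { int value; struct node *next; };

int sum_list(const struct node *p) {
    int s = 0;
    while (p) {
        s += p->value;     /* the pointer chased here came from memory */
        p = p->next;
    }
    return s;
}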
link |
00:04:47.960
And is that theory or is that engineering?
link |
00:04:49.920
Like the paper that was written,
link |
00:04:51.320
was it asymptotic kind of discussion
link |
00:04:54.640
or is it more like, here's a hack that works well?
link |
00:04:57.920
It's a little bit of both.
link |
00:04:59.100
Like there's information theory in it, I think somewhere.
link |
00:05:01.280
Okay, so it's actually trying to prove some kind of stuff.
link |
00:05:04.320
But once you know the method,
link |
00:05:06.360
implementing it is an engineering problem.
link |
00:05:09.560
Now there's a flip side of this,
link |
00:05:10.800
which is in a big design team,
link |
00:05:13.400
what percentage of people think
link |
00:05:14.960
their plan or their life's work is engineering
link |
00:05:20.800
versus inventing things?
link |
00:05:23.480
So lots of companies will reward you for filing patents.
link |
00:05:27.520
Some, many big companies get stuck
link |
00:05:29.280
because to get promoted,
link |
00:05:30.420
you have to come up with something new.
link |
00:05:32.940
And then what happens is everybody's trying
link |
00:05:34.740
to do some random new thing,
link |
00:05:36.480
99% of which doesn't matter.
link |
00:05:39.120
And the basics get neglected.
link |
00:05:41.140
Or there's a dichotomy, they think like the cell library
link |
00:05:47.700
and the basic CAD tools or basic software validation methods,
link |
00:05:53.260
that's simple stuff.
link |
00:05:54.740
They wanna work on the exciting stuff.
link |
00:05:56.900
And then they spend lots of time
link |
00:05:58.460
trying to figure out how to patent something.
link |
00:06:00.740
And that's mostly useless.
link |
00:06:02.240
But the breakthrough is on simple stuff.
link |
00:06:04.580
No, no, you have to do the simple stuff really well.
link |
00:06:08.940
If you're building a building out of bricks,
link |
00:06:11.460
you want great bricks.
link |
00:06:13.240
So you go to two places that sell bricks.
link |
00:06:14.900
So one guy says, yeah, they're over there in an ugly pile.
link |
00:06:17.980
And the other guy, like, lovingly tells you
link |
00:06:19.860
about the 50 kinds of bricks and how hard they are
link |
00:06:22.300
and how beautiful they are and how square they are.
link |
00:06:26.100
Which one are you gonna buy bricks from?
link |
00:06:28.220
Which is gonna make a better house?
link |
00:06:30.420
So you're talking about the craftsman,
link |
00:06:32.020
the person who understands bricks,
link |
00:06:33.500
who loves bricks, who loves the varieties.
link |
00:06:35.140
That's a good word.
link |
00:06:36.540
Good engineering is great craftsmanship.
link |
00:06:39.460
And when you start thinking engineering is about invention
link |
00:06:44.880
and you set up a system that rewards invention,
link |
00:06:47.940
the craftsmanship gets neglected.
link |
00:06:50.660
Okay, so maybe one perspective is the theory,
link |
00:06:53.500
the science overemphasizes invention
link |
00:06:57.660
and engineering emphasizes craftsmanship.
link |
00:07:00.420
And therefore, so it doesn't matter what you do,
link |
00:07:03.940
theory, engineering. Well, everybody does.
link |
00:07:05.060
Like if you read the tech rags, they're always talking
link |
00:07:06.740
about some breakthrough or innovation
link |
00:07:09.540
and everybody thinks that's the most important thing.
link |
00:07:12.460
But the number of innovative ideas
link |
00:07:13.900
is actually relatively low.
link |
00:07:15.980
We need them, right?
link |
00:07:17.260
And innovation creates a whole new opportunity.
link |
00:07:19.820
Like when some guy invented the internet, right?
link |
00:07:24.020
Like that was a big thing.
link |
00:07:25.940
The million people that wrote software against that
link |
00:07:28.240
were mostly doing engineering software writing.
link |
00:07:31.180
So the elaboration of that idea was huge.
link |
00:07:34.300
I don't know if you know Brendan Eich,
link |
00:07:35.580
he wrote JavaScript in 10 days.
link |
00:07:38.180
That's an interesting story.
link |
00:07:39.540
It makes me wonder, and it was famously for many years
link |
00:07:43.740
considered to be a pretty crappy programming language.
link |
00:07:47.660
Still is perhaps.
link |
00:07:48.780
It's been improving sort of consistently.
link |
00:07:51.140
But the interesting thing about that guy is,
link |
00:07:55.580
you know, he doesn't get any awards.
link |
00:07:58.540
You don't get a Nobel Prize or a Fields Medal or...
link |
00:08:01.140
For inventing a crappy piece of, you know, software code.
link |
00:08:06.820
That is currently the number one programming language
link |
00:08:08.700
in the world and runs,
link |
00:08:10.100
now is increasingly running the backend of the internet.
link |
00:08:13.740
Well, does he know why everybody uses it?
link |
00:08:17.640
Like that would be an interesting thing.
link |
00:08:19.300
Was it the right thing at the right time?
link |
00:08:22.340
Cause like when stuff like JavaScript came out,
link |
00:08:24.900
like there was a move from, you know,
link |
00:08:26.260
writing C programs and C++ to what they call
link |
00:08:30.620
managed code frameworks,
link |
00:08:32.340
where you write simple code, it might be interpreted,
link |
00:08:35.220
it has lots of libraries, productivity is high,
link |
00:08:37.780
and you don't have to be an expert.
link |
00:08:39.520
So, you know, Java was supposed to solve
link |
00:08:41.340
all the world's problems.
link |
00:08:42.180
It was complicated.
link |
00:08:43.780
JavaScript came out, you know,
link |
00:08:45.220
after a bunch of other scripting languages.
link |
00:08:47.660
I'm not an expert on it.
link |
00:08:49.220
But was it the right thing at the right time?
link |
00:08:51.420
Or was there something, you know, clever?
link |
00:08:54.260
Cause he wasn't the only one.
link |
00:08:56.300
There's a few elements.
link |
00:08:57.420
And maybe if he figured out what it was,
link |
00:08:59.500
then he'd get a prize.
link |
00:09:02.020
Like that.
link |
00:09:02.860
Yeah, you know, maybe his problem is he hasn't defined this.
link |
00:09:06.860
Or he just needs a good promoter.
link |
00:09:09.500
Well, I think there was a bunch of blog posts
link |
00:09:11.900
written about it, which is like,
link |
00:09:13.620
wrong is right, which is like doing the crappy thing fast.
link |
00:09:19.340
Just like hacking together the thing
link |
00:09:21.340
that answers some of the needs.
link |
00:09:23.260
And then iterating over time, listening to developers.
link |
00:09:26.100
Like listening to people who actually use the thing.
link |
00:09:28.220
This is something you can do more in software.
link |
00:09:31.540
But the right time, like you have to sense,
link |
00:09:33.760
you have to have a good instinct
link |
00:09:35.140
of when is the right time for the right tool.
link |
00:09:37.580
And make it super simple.
link |
00:09:40.260
And just get it out there.
link |
00:09:42.720
The problem is, this is true with hardware.
link |
00:09:45.200
This is less true with software.
link |
00:09:46.420
Is there's backward compatibility
link |
00:09:48.420
that just drags behind you as, you know,
link |
00:09:51.740
as you try to fix all the mistakes of the past.
link |
00:09:53.820
But the timing.
link |
00:09:55.820
It was good.
link |
00:09:56.640
There's something about that.
link |
00:09:57.480
And it wasn't accidental.
link |
00:09:58.820
You have to like give yourself over to the,
link |
00:10:02.580
you have to have this like broad sense
link |
00:10:05.380
of what's needed now.
link |
00:10:07.740
Both scientifically and like the community.
link |
00:10:10.860
And just like this, it was obvious that there was no,
link |
00:10:15.500
the interesting thing about JavaScript
link |
00:10:17.980
is everything that ran in the browser at the time,
link |
00:10:20.900
like Java and I think other like Scheme,
link |
00:10:24.460
other programming languages,
link |
00:10:25.940
they were all in a separate external container.
link |
00:10:30.500
And then JavaScript was literally
link |
00:10:32.500
just injected into the webpage.
link |
00:10:34.620
It was the dumbest possible thing
link |
00:10:36.380
running in the same thread as everything else.
link |
00:10:39.340
And like it was inserted as a comment.
link |
00:10:43.100
So JavaScript code is inserted as a comment in the HTML code.
link |
00:10:47.420
And it was, I mean, there's,
link |
00:10:50.260
it's either genius or super dumb, but it's like.
link |
00:10:53.100
Right, so it had no apparatus for like a virtual machine
link |
00:10:55.980
and container, it just executed in the framework
link |
00:10:58.460
of the program that's already running.
link |
00:10:59.780
Yeah, that's cool.
link |
00:11:00.940
And then because something about that accessibility,
link |
00:11:04.140
the ease of its use resulted in then developers innovating
link |
00:11:10.060
of how to actually use it.
link |
00:11:11.420
I mean, I don't even know what to make of that,
link |
00:11:13.660
but it does seem to echo across different software,
link |
00:11:18.340
like stories of different software.
link |
00:11:19.740
PHP has the same story, really crappy language.
link |
00:11:22.900
They just took over the world.
link |
00:11:25.380
I always have a joke that the random length instructions,
link |
00:11:28.340
variable length instructions, that's always won,
link |
00:11:30.660
even though they're obviously worse.
link |
00:11:33.060
Like nobody knows why.
link |
00:11:34.460
X86 is arguably the worst architecture on the planet.
link |
00:11:38.660
It's one of the most popular ones.
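One way to see why variable-length encodings are harder on the hardware, sketched in C rather than gates (a toy, not real x86 decode logic; inst_length is a made-up stand-in): with a fixed width, every instruction boundary is known up front, so many instructions can be decoded in parallel, while with variable length each boundary is known only after the previous instruction has been at least partially decoded.

#include <stddef.h>
#include <stdint.h>

/* Fixed width: instruction k starts at k*4, independent of the others,
 * so a wide decoder can work on many of them at once. */
void decode_fixed(const uint8_t *code, size_t n_insts) {
    for (size_t k = 0; k < n_insts; k++) {
        const uint8_t *inst = code + 4 * k;
        (void)inst; /* ... decode here ... */
    }
}

/* Stub: pretend every instruction is 1 byte. Real x86 lengths run
 * 1 to 15 bytes and depend on prefixes, opcodes, and operand forms. */
static size_t inst_length(const uint8_t *p) { (void)p; return 1; }

/* Variable width: finding the next boundary is inherently serial. */
void decode_variable(const uint8_t *code, const uint8_t *end) {
    while (code < end) {
        size_t len = inst_length(code); /* must decode to find the boundary */
        /* ... decode the instruction at 'code' ... */
        code += len;
    }
}

Real x86 decoders spend real hardware on finding boundaries in parallel anyway, which is part of the "obviously worse, but it won" joke.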
link |
00:11:40.500
Well, I mean, isn't that also the story of RISC versus,
link |
00:11:43.700
I mean, is that simplicity?
link |
00:11:46.220
There's something about simplicity that us
link |
00:11:49.420
in this evolutionary process is valued.
link |
00:11:53.500
If it's simple, it spreads faster, it seems like.
link |
00:11:58.820
Or is that not always true?
link |
00:11:59.980
Not always true.
link |
00:12:01.140
Yeah, it could be simple is good, but too simple is bad.
link |
00:12:04.260
So why did RISC win, you think, so far?
link |
00:12:06.460
Did RISC win?
link |
00:12:08.700
In the long arc of history.
link |
00:12:10.580
We don't know.
link |
00:12:11.420
So who's gonna win?
link |
00:12:12.700
What's RISC, what's CISC, and who's gonna win in that space
link |
00:12:15.900
in these instruction sets?
link |
00:12:17.580
AI software's gonna win, but there'll be little computers
link |
00:12:21.140
that run little programs like normal all over the place.
link |
00:12:24.980
But we're going through another transformation, so.
link |
00:12:28.580
But you think instruction sets underneath it all will change?
link |
00:12:32.420
Yeah, they evolve slowly.
link |
00:12:33.700
They don't matter very much.
link |
00:12:35.500
They don't matter very much, okay.
link |
00:12:36.820
I mean, the limits of performance are predictability
link |
00:12:40.420
of instructions and data.
link |
00:12:41.700
I mean, that's the big thing.
link |
00:12:43.420
And then the usability of it is some quality of design,
link |
00:12:49.180
quality of tools, availability.
link |
00:12:52.180
Like right now, x86 is proprietary with Intel and AMD,
link |
00:12:56.460
but they can change it any way they want independently.
link |
00:12:59.740
ARM is proprietary to ARM,
link |
00:13:01.660
and they won't let anybody else change it.
link |
00:13:03.700
So it's like a sole point.
link |
00:13:05.740
And RISC-V is open source, so anybody can change it,
link |
00:13:09.140
which is super cool.
link |
00:13:10.660
But that also might mean it gets changed
link |
00:13:12.500
too many random ways that there's no common subset of it
link |
00:13:16.340
that people can use.
link |
00:13:17.700
Do you like open or do you like closed?
link |
00:13:19.940
Like if you were to bet all your money on one
link |
00:13:21.780
or the other, RISC-V versus it?
link |
00:13:23.300
No idea.
link |
00:13:24.180
It's case dependent?
link |
00:13:25.020
Well, x86, oddly enough, when Intel first started
link |
00:13:27.660
developing it, they licensed like seven people.
link |
00:13:30.220
So it was the open architecture.
link |
00:13:33.060
And then they moved faster than others
link |
00:13:35.340
and also bought one or two of them.
link |
00:13:37.460
But there were seven different people making x86
link |
00:13:40.260
because at the time there was 6502 and Z80s and 8086.
link |
00:13:46.940
And you could argue everybody thought Z80
link |
00:13:49.060
was the better instruction set,
link |
00:13:50.940
but that was proprietary to one place.
link |
00:13:54.460
Oh, and the 6800.
link |
00:13:56.100
So there's like four or five different microprocessors.
link |
00:13:59.420
Intel went open, got the market share
link |
00:14:02.380
because people felt like they had multiple sources from it,
link |
00:14:04.700
and then over time it narrowed down to two players.
link |
00:14:07.620
So why, you as a historian, why did Intel win for so long
link |
00:14:14.420
with their processors?
link |
00:14:17.260
I mean, I mean.
link |
00:14:18.100
They were great.
link |
00:14:18.940
Their process development was great.
link |
00:14:21.020
Oh, so it's just looking back to JavaScript
link |
00:14:23.700
and what I like is Microsoft and Netscape
link |
00:14:26.540
and all these internet browsers.
link |
00:14:28.940
Microsoft won the browser game
link |
00:14:31.740
because they aggressively stole other people's ideas
link |
00:14:35.940
like right after they did it.
link |
00:14:37.820
You know, I don't know
link |
00:14:39.100
if Intel was stealing other people's ideas.
link |
00:14:41.180
They started making.
link |
00:14:42.020
In a good way, stealing in a good way just to clarify.
link |
00:14:43.780
They started making RAMs, random access memories.
link |
00:14:48.260
And then at the time
link |
00:14:50.300
when the Japanese manufacturers came up,
link |
00:14:52.940
you know, they were getting out competed on that
link |
00:14:54.860
and they pivoted the microprocessors
link |
00:14:56.580
and they made the first, you know,
link |
00:14:57.700
integrated microprocessor that ran programs.
link |
00:14:59.860
It was the 4004 or something.
link |
00:15:03.820
Who was behind that pivot?
link |
00:15:04.820
That's a hell of a pivot.
link |
00:15:05.860
Andy Grove and he was great.
link |
00:15:08.780
That's a hell of a pivot.
link |
00:15:10.140
And then they led semiconductor industry.
link |
00:15:13.860
Like they were just a little company, IBM,
link |
00:15:15.980
all kinds of big companies had boatloads of money
link |
00:15:18.980
and they out-innovated everybody.
link |
00:15:21.180
Out-innovated, okay.
link |
00:15:22.420
Yeah, yeah.
link |
00:15:23.260
So it's not like marketing, it's not any of that stuff.
link |
00:15:26.260
Their processor designs were pretty good.
link |
00:15:29.340
I think the, you know, Core 2 was probably the first one
link |
00:15:34.340
I thought was great.
link |
00:15:36.180
It was a really fast processor and then Haswell was great.
link |
00:15:40.180
What makes a great processor in that?
link |
00:15:42.220
Oh, if you just look at it,
link |
00:15:43.300
it's performance versus everybody else.
link |
00:15:45.580
It's, you know, the size of it, the usability of it.
link |
00:15:49.860
So it's not specific,
link |
00:15:50.940
some kind of element that makes you beautiful.
link |
00:15:52.620
It's just like literally just raw performance.
link |
00:15:55.100
Is that how you think about processors?
link |
00:15:57.140
It's just like raw performance?
link |
00:15:59.740
Of course.
link |
00:16:01.300
It's like a horse race.
link |
00:16:02.300
The fastest one wins.
link |
00:16:04.260
Now.
link |
00:16:05.100
You don't care how.
link |
00:16:05.940
Just as long as it wins.
link |
00:16:08.460
Well, there's the fastest in the environment.
link |
00:16:10.620
Like, you know, for years you made the fastest one you could
link |
00:16:13.060
and then people started to have power limits.
link |
00:16:14.940
So then you made the fastest one at the right power point.
link |
00:16:17.660
And then when we started doing multi processors,
link |
00:16:20.460
like if you could scale your processors
link |
00:16:23.580
more than the other guy,
link |
00:16:24.420
you could be 10% faster on like a single thread,
link |
00:16:26.980
but you have more threads.
link |
00:16:28.420
So there's lots of variability.
link |
00:16:30.020
And then ARM really explored,
link |
00:16:34.460
like, you know, they have the A series
link |
00:16:36.580
and the R series and the M series,
link |
00:16:38.900
like a family of processors
link |
00:16:40.340
for all these different design points
link |
00:16:41.980
from like unbelievably small and simple.
link |
00:16:44.580
And so then when you're doing the design,
link |
00:16:46.540
it's sort of like this big palette of CPUs.
link |
00:16:49.380
Like they're the only ones with a credible,
link |
00:16:51.500
you know, top to bottom palette.
link |
00:16:54.700
What do you mean a credible top to bottom?
link |
00:16:56.900
Well, there's people who make microcontrollers
link |
00:16:58.620
that are small, but they don't have a fast one.
link |
00:17:00.500
There's people who make fast processors,
link |
00:17:02.080
but don't have a medium one or a small one.
link |
00:17:04.900
Is that hard to do, that full palette?
link |
00:17:07.420
That seems like a...
link |
00:17:08.260
Yeah, it's a lot of different.
link |
00:17:09.380
So what's the difference in the ARM folks and Intel
link |
00:17:13.340
in terms of the way they're approaching this problem?
link |
00:17:15.620
Well, Intel, almost all their processor designs
link |
00:17:19.200
were, you know, very custom high end,
link |
00:17:21.740
you know, for the last 15, 20 years.
link |
00:17:23.460
So the fastest horse possible.
link |
00:17:24.900
Yeah.
link |
00:17:25.860
In one horse race.
link |
00:17:27.540
Yeah, and then architecturally they're really good,
link |
00:17:30.420
but the company itself was fairly insular
link |
00:17:33.380
to what's going on in the industry with CAD tools and stuff.
link |
00:17:36.300
And there's this debate about custom design
link |
00:17:38.200
versus synthesis and how do you approach that?
link |
00:17:41.340
I'd say Intel was slow on getting to synthesized processors.
link |
00:17:45.700
ARM came in from the bottom and they generated IP,
link |
00:17:49.100
which went to all kinds of customers.
link |
00:17:50.860
So they had very little say
link |
00:17:52.020
on how the customer implemented their IP.
link |
00:17:54.980
So ARM is super friendly to the synthesis IP environment.
link |
00:17:59.420
Whereas Intel said,
link |
00:18:00.260
we're gonna make this great client chip or server chip
link |
00:18:03.200
with our own CAD tools, with our own process,
link |
00:18:05.460
with our own, you know, other supporting IP
link |
00:18:08.140
and everything only works with our stuff.
link |
00:18:11.340
So is that, is ARM winning the mobile platform space
link |
00:18:16.440
in terms of processors?
link |
00:18:17.280
Yeah.
link |
00:18:18.120
And so in that, what you're describing
link |
00:18:21.780
is why they're winning.
link |
00:18:22.860
Well, they had lots of people doing lots
link |
00:18:24.940
of different experiments.
link |
00:18:26.420
So they controlled the processor architecture and IP,
link |
00:18:29.420
but they let people put in lots of different chips.
link |
00:18:32.060
And there was a lot of variability in what happened there.
link |
00:18:35.260
Whereas Intel, when they made their mobile,
link |
00:18:37.140
their foray into mobile,
link |
00:18:38.460
they had one team doing one part, right?
link |
00:18:41.700
So it wasn't 10 experiments.
link |
00:18:43.180
And then their mindset was PC mindset,
link |
00:18:45.980
Microsoft software mindset.
link |
00:18:48.060
And that brought a whole bunch of things along
link |
00:18:49.940
that the mobile world and the embedded world don't do.
link |
00:18:52.580
Do you think it was possible for Intel to pivot hard
link |
00:18:55.460
and win the mobile market?
link |
00:18:58.260
That's a hell of a difficult thing to do, right?
link |
00:19:00.060
For a huge company to just pivot.
link |
00:19:03.420
I mean, it's so interesting to,
link |
00:19:05.540
because we'll talk about your current work.
link |
00:19:07.420
It's like, it's clear that PCs were dominating
link |
00:19:11.100
for several decades, like desktop computers.
link |
00:19:14.180
And then mobile, it's unclear.
link |
00:19:17.940
It's a leadership question.
link |
00:19:19.380
Like Apple under Steve Jobs, when he came back,
link |
00:19:23.060
they pivoted multiple times.
link |
00:19:25.660
You know, they built iPods and iTunes and phones
link |
00:19:28.260
and tablets and great Macs.
link |
00:19:30.060
Like who knew computers should be made out of aluminum?
link |
00:19:33.380
Nobody knew that.
link |
00:19:35.300
But they're great.
link |
00:19:36.140
It's super fun.
link |
00:19:37.160
That was Steve?
link |
00:19:38.000
Yeah, Steve Jobs.
link |
00:19:38.820
Like they pivoted multiple times.
link |
00:19:41.400
And you know, the old Intel, they did that multiple times.
link |
00:19:45.860
They made DRAMs and processors and processes
link |
00:19:48.420
and I gotta ask this,
link |
00:19:50.900
what was it like working with Steve Jobs?
link |
00:19:53.060
I didn't work with him.
link |
00:19:54.420
Did you interact with him?
link |
00:19:55.700
Twice.
link |
00:19:57.420
I said hi to him twice in the cafeteria.
link |
00:19:59.860
What did he say?
link |
00:20:01.020
Hi?
link |
00:20:01.860
He said, hey fellas.
link |
00:20:04.340
He was friendly.
link |
00:20:05.940
He was wandering around and with somebody,
link |
00:20:08.260
he couldn't find a table because the cafeteria was packed
link |
00:20:12.300
and I gave him my table.
link |
00:20:13.700
But I worked for Mike Culbert, who talked to,
link |
00:20:16.060
like Mike was the unofficial CTO of Apple
link |
00:20:19.260
and a brilliant guy and he worked for Steve for 25 years,
link |
00:20:22.140
maybe more and he talked to Steve multiple times a day
link |
00:20:26.680
and he was one of the people who could put up with Steve's,
link |
00:20:29.380
let's say, brilliance and intensity
link |
00:20:31.740
and Steve really liked him and Steve trusted Mike
link |
00:20:35.700
to translate the shit he thought up
link |
00:20:39.060
into engineering products that work
link |
00:20:40.860
and then Mike ran a group called Platform Architecture
link |
00:20:43.140
and I was in that group.
link |
00:20:44.760
So many times I'd be sitting with Mike
link |
00:20:46.380
and the phone would ring and it'd be Steve
link |
00:20:48.680
and Mike would hold the phone like this
link |
00:20:50.420
because Steve would be yelling about something or other.
link |
00:20:53.060
And then he would translate.
link |
00:20:54.120
And he'd translate and then he would say,
link |
00:20:55.900
Steve wants us to do this.
link |
00:20:58.300
So.
link |
00:20:59.460
Was Steve a good engineer or no?
link |
00:21:01.100
I don't know.
link |
00:21:02.380
He was a great idea guy.
link |
00:21:03.780
Idea person.
link |
00:21:04.620
And he's a really good selector for talent.
link |
00:21:07.540
Yeah, that seems to be one of the key elements
link |
00:21:09.580
of leadership, right?
link |
00:21:10.740
And then he was a really good first principles guy.
link |
00:21:12.740
Like somebody would say something couldn't be done
link |
00:21:15.060
and he would just think, that's obviously wrong, right?
link |
00:21:20.300
But you know, maybe it's hard to do.
link |
00:21:23.020
Maybe it's expensive to do.
link |
00:21:24.420
Maybe we need different people.
link |
00:21:25.860
You know, there's like a whole bunch of,
link |
00:21:27.260
if you want to do something hard,
link |
00:21:29.420
you know, maybe it takes time.
link |
00:21:30.580
Maybe you have to iterate.
link |
00:21:31.580
There's a whole bunch of things you could think about
link |
00:21:33.700
but saying it can't be done is stupid.
link |
00:21:36.340
How would you compare?
link |
00:21:38.060
So it seems like Elon Musk is more engineering centric
link |
00:21:42.860
but is also, I think he considers himself a designer too.
link |
00:21:45.660
He has a design mind.
link |
00:21:46.980
Steve Jobs feels like he's much more idea space,
link |
00:21:50.540
design space versus engineering.
link |
00:21:52.740
Just make it happen.
link |
00:21:53.900
Like the world should be this way.
link |
00:21:55.820
Just figure it out.
link |
00:21:57.140
But he used computers.
link |
00:21:58.680
You know, he had computer people talk to him all the time.
link |
00:22:01.840
Like Mike was a really good computer guy.
link |
00:22:03.340
He knew computers could do.
link |
00:22:04.820
Computer meaning computer hardware?
link |
00:22:06.300
Like hardware, software, all the pieces.
link |
00:22:09.100
And then he would have an idea about
link |
00:22:12.100
what could we do with this next.
link |
00:22:14.540
That was grounded in reality.
link |
00:22:16.060
It wasn't like he was just finger painting on the wall
link |
00:22:19.220
and wishing somebody would interpret it.
link |
00:22:21.380
So he had this interesting connection
link |
00:22:23.420
because he wasn't a computer architect or designer
link |
00:22:28.320
but he had an intuition from the computers we had
link |
00:22:30.820
to what could happen.
link |
00:22:31.960
And it's interesting you say intuition
link |
00:22:35.280
because it seems like he was pissing off a lot of engineers
link |
00:22:39.980
in his intuition about what can and can't be done.
link |
00:22:43.660
Those, like the, what is all these stories
link |
00:22:46.840
about like floppy disks and all that kind of stuff.
link |
00:22:49.080
Yeah, so in Steve, the first round,
link |
00:22:52.080
like he'd go into a lab and look at what's going on
link |
00:22:55.420
and hate it and fire people or ask somebody
link |
00:22:59.920
in the elevator what they're doing for Apple.
link |
00:23:01.840
And not be happy.
link |
00:23:03.840
When he came back, my impression was
link |
00:23:06.520
is he surrounded himself
link |
00:23:08.000
with a relatively small group of people
link |
00:23:10.640
and didn't really interact outside of that as much.
link |
00:23:13.880
And then the joke was you'd see like somebody moving
link |
00:23:16.320
a prototype through the quad with a black blanket over it.
link |
00:23:20.800
And that was because it was secret, partly from Steve
link |
00:23:24.200
because they didn't want Steve to see it until it was ready.
link |
00:23:26.980
Yeah, the dynamic with Jony Ive and Steve is interesting.
link |
00:23:31.420
It's like you don't wanna,
link |
00:23:34.200
he ruins as many ideas as he generates.
link |
00:23:37.280
Yeah, yeah.
link |
00:23:38.800
It's a dangerous kind of line to walk.
link |
00:23:42.080
If you have a lot of ideas,
link |
00:23:43.480
like Gordon Bell was famous for ideas, right?
link |
00:23:47.260
And it wasn't that the percentage of good ideas
link |
00:23:49.120
was way higher than anybody else.
link |
00:23:51.420
It was, he had so many ideas
link |
00:23:53.160
and he was also good at talking to people about it
link |
00:23:55.840
and getting the filters right.
link |
00:23:58.120
And seeing through stuff.
link |
00:24:00.200
Whereas Elon was like, hey, I wanna build rockets.
link |
00:24:03.360
So Steve would hire a bunch of rocket guys
link |
00:24:05.980
and Elon would go read rocket manuals.
link |
00:24:08.520
So Elon is a better engineer, a sense like,
link |
00:24:11.440
or like more like a love and passion for the manuals.
link |
00:24:16.880
And the details.
link |
00:24:17.800
The details, the craftsmanship too, right?
link |
00:24:20.800
Well, I guess Steve had craftsmanship too,
link |
00:24:22.720
but of a different kind.
link |
00:24:24.240
What do you make of the,
link |
00:24:26.200
just to stay in there for just a little longer,
link |
00:24:27.920
what do you make of like the anger
link |
00:24:29.200
and the passion and all of that?
link |
00:24:30.640
The firing and the mood swings and the madness,
link |
00:24:35.080
the being emotional and all of that, that's Steve.
link |
00:24:39.360
And I guess Elon too.
link |
00:24:40.680
So what, is that a bug or a feature?
link |
00:24:43.680
It's a feature.
link |
00:24:45.020
So there's a graph, which is Y axis productivity,
link |
00:24:50.240
X axis at zero is chaos,
link |
00:24:52.920
and infinity is complete order, right?
link |
00:24:56.280
So as you go from the origin,
link |
00:25:00.920
as you improve order, you improve productivity.
link |
00:25:04.160
And at some point, productivity peaks,
link |
00:25:06.420
and then it goes back down again.
link |
00:25:08.340
Too much order, nothing can happen.
link |
00:25:09.800
Yes.
link |
00:25:10.640
But the question is, how close to the chaos is that?
link |
00:25:13.680
No, no, no, here's the thing,
link |
00:25:15.000
is once you start moving in the direction of order,
link |
00:25:16.920
the force vector to drive you towards order is unstoppable.
link |
00:25:21.000
Oh, so it's a slippery slope.
link |
00:25:22.240
And every organization will move to the place
link |
00:25:24.880
where their productivity is stymied by order.
link |
00:25:27.120
So you need a...
link |
00:25:28.160
So the question is, who's the counter force?
link |
00:25:31.880
Because it also feels really good.
link |
00:25:33.360
As you get more organized, the productivity goes up.
link |
00:25:36.240
The organization feels it, they orient towards it, right?
link |
00:25:39.720
They hired more people.
link |
00:25:41.080
They got more guys who could run process,
link |
00:25:42.880
you get bigger, right?
link |
00:25:44.740
And then inevitably, the organization gets captured
link |
00:25:49.120
by the bureaucracy that manages all the processes.
link |
00:25:51.820
Yeah.
link |
00:25:53.660
All right, and then humans really like that.
link |
00:25:55.540
And so if you just walk into a room and say,
link |
00:25:57.840
guys, love what you're doing,
link |
00:26:00.980
but I need you to have less order.
link |
00:26:04.980
If you don't have some force behind that,
link |
00:26:06.900
nothing will happen.
link |
00:26:09.080
I can't tell you on how many levels that's profound, so.
link |
00:26:12.500
So that's why I'd say it's a feature.
link |
00:26:14.080
Now, could you be nicer about it?
link |
00:26:17.220
I don't know, I don't know any good examples
link |
00:26:18.940
of being nicer about it.
link |
00:26:20.140
Well, the funny thing is to get stuff done,
link |
00:26:23.460
you need people who can manage stuff and manage people,
link |
00:26:25.940
because humans are complicated.
link |
00:26:26.900
They need lots of care and feeding. You need
link |
00:26:28.500
to tell them they look nice and they're doing good stuff
link |
00:26:30.780
and pat them on the back, right?
link |
00:26:33.060
I don't know, you tell me, is that needed?
link |
00:26:35.940
Oh yeah.
link |
00:26:36.780
Do humans need that?
link |
00:26:37.600
I had a friend, he started managing a group and he said,
link |
00:26:39.660
I figured it out.
link |
00:26:40.820
You have to praise them before they do anything.
link |
00:26:43.380
I was waiting until they were done.
link |
00:26:45.220
And they were always mad at me.
link |
00:26:46.520
Now I tell them what a great job they're doing
link |
00:26:48.140
while they're doing it.
link |
00:26:49.380
But then you get stuck in that trap,
link |
00:26:51.020
because then when they're not doing something,
link |
00:26:52.180
how do you confront these people?
link |
00:26:54.060
I think a lot of people that had trauma
link |
00:26:55.900
in their childhood would disagree with you,
link |
00:26:57.540
successful people, that you need to first do the rough stuff
link |
00:27:00.640
and then be nice later.
link |
00:27:02.320
I don't know.
link |
00:27:03.160
Okay, but engineering companies are full of adults
link |
00:27:05.820
who had all kinds of range of childhoods.
link |
00:27:08.100
You know, most people had okay childhoods.
link |
00:27:11.400
Well, I don't know if...
link |
00:27:12.900
Lots of people only work for praise, which is weird.
link |
00:27:15.620
You mean like everybody.
link |
00:27:16.820
I'm not that interested in it, but...
link |
00:27:21.140
Well, you're probably looking for somebody's approval.
link |
00:27:25.420
Even still.
link |
00:27:27.400
Yeah, maybe.
link |
00:27:28.240
I should think about that.
link |
00:27:29.540
Maybe somebody who's no longer with us kind of thing.
link |
00:27:33.160
I don't know.
link |
00:27:34.100
I used to call up my dad and tell him what I was doing.
link |
00:27:36.340
He was very excited about engineering and stuff.
link |
00:27:38.580
You got his approval?
link |
00:27:40.140
Uh, yeah, a lot.
link |
00:27:42.060
I was lucky.
link |
00:27:43.340
Like, he decided I was smart and unusual as a kid
link |
00:27:47.180
and that was okay when I was really young.
link |
00:27:50.180
So when I did poorly in school, I was dyslexic.
link |
00:27:52.520
I didn't read until I was third or fourth grade.
link |
00:27:55.220
They didn't care.
link |
00:27:56.060
My parents were like, oh, he'll be fine.
link |
00:27:59.760
So I was lucky.
link |
00:28:01.520
That was cool.
link |
00:28:02.480
Is he still with us?
link |
00:28:05.180
You miss him?
link |
00:28:07.500
Sure, yeah.
link |
00:28:08.340
He had Parkinson's and then cancer.
link |
00:28:10.740
His last 10 years were tough and it killed him.
link |
00:28:15.980
Killing a man like that's hard.
link |
00:28:18.280
The mind?
link |
00:28:19.420
Well, it's pretty good.
link |
00:28:21.460
Parkinson's causes slow dementia
link |
00:28:23.780
and the chemotherapy, I think, accelerated it.
link |
00:28:29.060
But it was like hallucinogenic dementia.
link |
00:28:31.020
So he was clever and funny and interesting
link |
00:28:34.180
and it was pretty unusual.
link |
00:28:37.920
Do you remember conversations?
link |
00:28:39.820
From that time?
link |
00:28:41.500
Like, do you have fond memories of the guy?
link |
00:28:43.940
Yeah, oh yeah.
link |
00:28:45.220
Anything come to mind?
link |
00:28:48.020
A friend told me one time I could draw a computer
link |
00:28:50.340
on the whiteboard faster than anybody he'd ever met.
link |
00:28:52.500
I said, you should meet my dad.
link |
00:28:54.920
Like, when I was a kid, he'd come home and say,
link |
00:28:56.860
I was driving by this bridge and I was thinking about it
link |
00:28:58.820
and he pulled out a piece of paper
link |
00:28:59.780
and he'd draw the whole bridge.
link |
00:29:01.500
He was a mechanical engineer.
link |
00:29:03.620
And he would just draw the whole thing
link |
00:29:05.000
and then he would tell me about it
link |
00:29:06.260
and then tell me how he would have changed it.
link |
00:29:08.700
And he had this idea that he could understand
link |
00:29:11.900
and conceive anything.
link |
00:29:13.380
And I just grew up with that, so that was natural.
link |
00:29:16.460
So when I interview people, I ask them to draw a picture
link |
00:29:19.780
of something they did on a whiteboard
link |
00:29:21.780
and it's really interesting.
link |
00:29:22.860
Like, some people draw a little box
link |
00:29:25.900
and then they'll say, and then this talks to this
link |
00:29:27.820
and I'll be like, oh, this is frustrating.
link |
00:29:30.220
I had this other guy come in one time, he says,
link |
00:29:32.620
well, I designed a floating point in this chip
link |
00:29:34.500
but I'd really like to tell you how the whole thing works
link |
00:29:36.320
and then tell you how the floating point works inside of it.
link |
00:29:38.180
Do you mind if I do that?
link |
00:29:39.080
And he covered two whiteboards in like 30 minutes
link |
00:29:42.060
and I hired him.
link |
00:29:42.900
Like, he was great.
link |
00:29:44.580
This is craftsman.
link |
00:29:45.420
I mean, that's the craftsmanship to that.
link |
00:29:47.060
Yeah, but also the mental agility
link |
00:29:49.500
to understand the whole thing,
link |
00:29:51.660
put the pieces in context,
link |
00:29:54.780
real view of the balance of how the design worked.
link |
00:29:58.640
Because if you don't understand it properly,
link |
00:30:01.020
when you start to draw it,
link |
00:30:02.220
you'll fill up half the whiteboard
link |
00:30:03.820
with like a little piece of it
link |
00:30:05.220
and like your ability to lay it out in an understandable way
link |
00:30:09.260
takes a lot of understanding, so.
link |
00:30:11.500
And be able to, so zoom into the detail
link |
00:30:13.460
and then zoom out to the big picture.
link |
00:30:14.980
Zoom out really fast.
link |
00:30:16.420
What about the impossible thing?
link |
00:30:17.620
You see, your dad believed that you can do anything.
link |
00:30:22.960
That's a weird feature for a craftsman.
link |
00:30:25.500
Yeah.
link |
00:30:26.700
It seems that that echoes in your own behavior.
link |
00:30:30.820
Like that's the...
link |
00:30:32.100
Well, it's not that anybody can do anything right now, right?
link |
00:30:36.500
It's that if you work at it, you can get better at it
link |
00:30:39.660
and there might not be a limit.
link |
00:30:43.100
And they did funny things like,
link |
00:30:44.620
like he always wanted to play piano.
link |
00:30:46.140
So at the end of his life, he started playing the piano
link |
00:30:48.460
when he had Parkinson's and he was terrible.
link |
00:30:51.580
But he thought if he really worked at it in this life,
link |
00:30:53.540
maybe the next life he'd be better at it.
link |
00:30:56.420
He might be onto something.
link |
00:30:57.620
Yeah, he enjoyed doing it.
link |
00:31:00.940
Yeah.
link |
00:31:01.780
It's pretty funny.
link |
00:31:02.620
Do you think the perfect is the enemy of the good
link |
00:31:06.180
in hardware and software engineering?
link |
00:31:08.180
It's like we were talking about JavaScript a little bit
link |
00:31:10.500
and the messiness of the 10 day building process.
link |
00:31:14.780
Yeah, you know, creative tension, right?
link |
00:31:19.060
So creative tension is you have two different ideas
link |
00:31:21.460
that you can't do both, right?
link |
00:31:24.380
And, but the fact that you wanna do both
link |
00:31:27.660
causes you to go try to solve that problem.
link |
00:31:29.980
That's the creative part.
link |
00:31:32.020
So if you're building computers,
link |
00:31:35.140
like some people say we have the schedule
link |
00:31:37.060
and anything that doesn't fit in the schedule we can't do.
link |
00:31:40.220
Right?
link |
00:31:41.060
And so they throw out the perfect
link |
00:31:42.100
because they have a schedule.
link |
00:31:44.300
I hate that.
link |
00:31:46.620
Then there's other people who say
link |
00:31:48.220
we need to get this perfectly right.
link |
00:31:50.540
And no matter what, you know, more people, more money,
link |
00:31:53.980
right?
link |
00:31:55.500
And there's a really clear idea about what you want.
link |
00:31:57.860
Some people are really good at articulating it, right?
link |
00:32:00.740
So let's call that the perfect, yeah.
link |
00:32:02.380
Yeah.
link |
00:32:03.300
All right, but that's also terrible
link |
00:32:04.780
because they never ship anything.
link |
00:32:06.180
You never hit any goals.
link |
00:32:07.420
So now you have your framework.
link |
00:32:09.980
Yes.
link |
00:32:10.820
You can't throw out stuff
link |
00:32:11.660
because you can't get it done today
link |
00:32:12.820
because maybe you'll get it done tomorrow
link |
00:32:14.020
or the next project, right?
link |
00:32:15.860
You can't, so you have to,
link |
00:32:18.340
I work with a guy that I really like working with,
link |
00:32:20.620
but he over filters his ideas.
link |
00:32:23.140
Over filters?
link |
00:32:24.780
He'd start thinking about something
link |
00:32:26.620
and as soon as he figured out what was wrong with it,
link |
00:32:28.020
he'd throw it out.
link |
00:32:29.820
And then I start thinking about it
link |
00:32:31.260
and you come up with an idea
link |
00:32:32.700
and then you find out what's wrong with it.
link |
00:32:34.980
And then you give it a little time to set
link |
00:32:36.780
because sometimes you figure out how to tweak it
link |
00:32:39.260
or maybe that idea helps some other idea.
link |
00:32:42.620
So idea generation is really funny.
link |
00:32:45.100
So you have to give your ideas space.
link |
00:32:46.940
Like spaciousness of mind is key.
link |
00:32:49.780
But you also have to execute programs and get shit done.
link |
00:32:53.420
And then it turns out computer engineering is fun
link |
00:32:55.540
because it takes 100 people to build a computer,
link |
00:32:58.300
200 or 300, whatever the number is.
link |
00:33:00.620
And people are so variable about temperament
link |
00:33:05.260
and skill sets and stuff.
link |
00:33:07.700
That in a big organization,
link |
00:33:09.460
you find the people who love the perfect ideas
link |
00:33:11.860
and the people that want to get stuff done yesterday
link |
00:33:13.780
and people like to come up with ideas
link |
00:33:16.500
and people like to, let's say shoot down ideas.
link |
00:33:19.300
And it takes the whole, it takes a large group of people.
link |
00:33:23.300
Some are good at generating ideas, some are good at filtering ideas.
link |
00:33:25.980
And then all in that giant mess, you're somehow,
link |
00:33:30.980
I guess the goal is for that giant mess of people
link |
00:33:33.820
to find the perfect path through the tension,
link |
00:33:37.260
the creative tension.
link |
00:33:38.460
But like, how do you know when you said
link |
00:33:41.340
there's some people good at articulating
link |
00:33:42.940
what perfect looks like, what a good design is?
link |
00:33:44.740
Like if you're sitting in a room
link |
00:33:48.060
and you have a set of ideas
link |
00:33:51.020
about like how to design a better processor,
link |
00:33:55.340
how do you know this is something special here?
link |
00:33:58.820
This is a good idea, let's try this.
link |
00:34:00.780
Have you ever brainstormed an idea
link |
00:34:02.220
with a couple of people that were really smart?
link |
00:34:04.540
And you kind of go into it and you don't quite understand it
link |
00:34:07.540
and you're working on it.
link |
00:34:09.700
And then you start talking about it,
link |
00:34:12.180
putting it on the whiteboard, maybe it takes days or weeks.
link |
00:34:16.140
And then your brain starts to kind of synchronize.
link |
00:34:18.620
It's really weird.
link |
00:34:19.540
Like you start to see what each other is thinking.
link |
00:34:25.980
And it starts to work.
link |
00:34:28.460
Like you can see work.
link |
00:34:29.380
Like my talent in computer design
link |
00:34:30.980
is I can see how computers work in my head, like really well.
link |
00:34:35.340
And I know other people can do that too.
link |
00:34:37.340
And when you're working with people that can do that,
link |
00:34:40.460
like it is kind of an amazing experience.
link |
00:34:45.380
And then every once in a while you get to that place
link |
00:34:48.180
and then you find the flaw, which is kind of funny
link |
00:34:50.220
because you can fool yourself.
link |
00:34:53.740
The two of you kind of drifted along
link |
00:34:55.900
in the direction that was useless.
link |
00:34:58.460
That happens too.
link |
00:34:59.420
Like you have to, because the nice thing
link |
00:35:03.500
about computer design is it's always reduced to practice.
link |
00:35:05.580
Like you come up with your good ideas
link |
00:35:08.100
and I know some architects who really love ideas
link |
00:35:10.980
and then they work on them and they put it on the shelf
link |
00:35:13.100
and they go work on the next idea and put it on the shelf
link |
00:35:14.820
and they never reduce it to practice.
link |
00:35:16.820
So they never find out what's good and bad.
link |
00:35:18.780
Because almost every time I've done something really new,
link |
00:35:22.500
by the time it's done, like the good parts are good,
link |
00:35:25.660
but I know all the flaws, like.
link |
00:35:27.620
Yeah.
link |
00:35:28.460
Would you say your career, just your own experience,
link |
00:35:31.580
is your career defined mostly by flaws or by successes?
link |
00:35:35.260
Like if...
link |
00:35:36.100
Again, there's great tension between those.
link |
00:35:38.020
If you haven't tried hard, right?
link |
00:35:42.580
And done something new, right?
link |
00:35:46.300
Then you're not gonna be facing the challenges
link |
00:35:48.500
when you build it.
link |
00:35:49.340
Then you find out all the problems with it.
link |
00:35:51.900
And...
link |
00:35:52.740
But when you look back, do you see problems?
link |
00:35:55.580
Okay.
link |
00:35:56.420
Oh, when I look back?
link |
00:35:58.060
What do you remember?
link |
00:35:58.900
I think earlier in my career,
link |
00:36:00.460
like EV5 was the second Alpha chip.
link |
00:36:04.100
I was so embarrassed about the mistakes,
link |
00:36:06.500
I could barely talk about it.
link |
00:36:08.580
And it was in the Guinness Book of World Records
link |
00:36:10.340
and it was the fastest processor on the planet.
link |
00:36:12.420
Yeah.
link |
00:36:13.740
So it was, and at some point I realized
link |
00:36:15.780
that was really a bad mental framework
link |
00:36:18.540
to deal with doing something new.
link |
00:36:20.020
We did a bunch of new things
link |
00:36:21.180
and some worked out great and some were bad.
link |
00:36:23.460
And we learned a lot from it.
link |
00:36:24.660
And then the next one, we learned a lot.
link |
00:36:28.020
The EV6 also had some really cool things in it.
link |
00:36:31.820
I think the proportion of good stuff went up,
link |
00:36:34.240
but it had a couple of fatal flaws in it that were painful.
link |
00:36:39.580
And then, yeah.
link |
00:36:41.500
You learned to channel the pain into like pride.
link |
00:36:44.660
Not pride, really.
link |
00:36:45.740
You know, just a realization about how the world works
link |
00:36:50.060
or how that kind of idea set works.
link |
00:36:52.300
Life is suffering.
link |
00:36:53.220
That's the reality.
link |
00:36:55.540
No, it's not.
link |
00:36:57.140
Well, I know the Buddha said that
link |
00:36:58.380
and a couple other people are stuck on it.
link |
00:37:00.480
No, it's, you know, there's this kind of weird combination
link |
00:37:03.820
of good and bad, you know, light and darkness
link |
00:37:06.940
that you have to tolerate and, you know, deal with.
link |
00:37:10.260
Yeah, there's definitely lots of suffering in the world.
link |
00:37:12.620
Depends on the perspective.
link |
00:37:13.780
It seems like there's way more darkness,
link |
00:37:15.420
but that makes the light part really nice.
link |
00:37:18.620
What computing hardware or just any kind,
link |
00:37:24.780
even software design, do you find beautiful
link |
00:37:28.760
from your own work, from other people's work?
link |
00:37:32.500
You're just, we were just talking about the battleground
link |
00:37:37.340
of flaws and mistakes and errors,
link |
00:37:39.260
but things that were just beautifully done.
link |
00:37:42.540
Is there something that pops to mind?
link |
00:37:44.500
Well, when things are beautifully done,
link |
00:37:47.900
usually there's a well thought out set of abstraction layers.
link |
00:37:53.660
So the whole thing works in unison nicely.
link |
00:37:56.420
Yes.
link |
00:37:57.380
And when I say abstraction layer,
link |
00:37:59.380
that means two different components
link |
00:38:01.180
when they work together, they work independently.
link |
00:38:04.940
They don't have to know what the other one is doing.
link |
00:38:07.740
So that decoupling.
link |
00:38:08.660
Yeah.
link |
00:38:09.500
So the famous one was the network stack.
link |
00:38:11.500
Like there's a seven layer network stack,
link |
00:38:13.100
you know, data transport and protocol and all the layers.
link |
00:38:16.380
And the innovation was,
link |
00:38:17.580
is when they really got that right.
link |
00:38:20.000
Cause networks before that didn't define those very well.
link |
00:38:22.940
The layers could innovate independently.
link |
00:38:26.220
And occasionally the layer boundary would,
link |
00:38:28.780
the interface would be upgraded.
link |
00:38:30.980
And that let the design space breathe.
link |
00:38:34.780
And you could do something new in layer seven
link |
00:38:37.860
without having to worry about how layer four worked.
link |
00:38:40.620
And so good design does that.
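A minimal sketch of that decoupling in C, with an opaque type as the layer boundary (all names here are hypothetical, just to illustrate the shape): the layer above can only use the agreed interface, so the layer below can be reimplemented freely, and neither has to know the other's internals.

/* transport.h: the interface is the only thing the layers share. */
typedef struct transport transport_t;   /* opaque: the layout is private */

transport_t *transport_open(const char *addr);
int          transport_send(transport_t *t, const void *buf, int len);
void         transport_close(transport_t *t);

/* A protocol layer built only on that interface. The transport under it
 * can change (TCP, serial, loopback...) without this code changing,
 * and vice versa, as long as the boundary holds. */
int protocol_hello(transport_t *t) {
    static const char msg[] = "HELLO";
    return transport_send(t, msg, (int)(sizeof msg - 1));
}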
link |
00:38:43.000
And you see it in processor designs.
link |
00:38:45.220
When we did the Zen design at AMD,
link |
00:38:48.580
we made several components very modular.
link |
00:38:51.940
And, you know, my insistence at the top was
link |
00:38:54.700
I wanted all the interfaces defined
link |
00:38:56.620
before we wrote the RTL for the pieces.
link |
00:38:59.320
One of the verification leads said,
link |
00:39:01.060
if we do this right,
link |
00:39:02.220
I can test the pieces so well independently
link |
00:39:04.900
when we put it together,
link |
00:39:06.440
we won't find all these interaction bugs
link |
00:39:08.140
cause the floating point knows how the cache works.
link |
00:39:10.700
And I was a little skeptical,
link |
00:39:12.020
but he was mostly right.
link |
00:39:14.220
That the modularity of the design
link |
00:39:16.700
greatly improved the quality.
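A toy illustration of that interfaces-first verification idea, against a made-up contract (this is not the actual Zen methodology): each module is tested against the agreed interface, never against the other module's internals.

```python
def contract_ok(request):
    # the agreed interface: a cache request is (address, size), aligned
    addr, size = request
    return size in (4, 8) and addr % size == 0

def test_producer(make_request):
    # the unit that issues requests only ever emits legal ones
    assert all(contract_ok(make_request(a)) for a in range(0, 64, 8))

def test_consumer(handle_request):
    # the unit that serves requests accepts every legal one
    assert all(handle_request((16, s)) is not None for s in (4, 8))

fpu = lambda addr: (addr, 8)            # hypothetical producer module
cache = lambda req: f"line@{req[0]}"    # hypothetical consumer module
test_producer(fpu)
test_consumer(cache)   # both pass in isolation, so integration mostly holds
```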
link |
00:39:18.960
Is that universally true in general?
link |
00:39:20.540
Would you say that good designs
link |
00:39:21.860
are usually modular like that?
link |
00:39:24.180
Well, we talked about this before.
link |
00:39:25.180
Humans are only so smart.
link |
00:39:26.420
Like, and we're not getting any smarter, right?
link |
00:39:29.460
But the complexity of things is going up.
link |
00:39:32.260
So, you know, a beautiful design can't be bigger
link |
00:39:36.200
than the person doing it.
link |
00:39:37.960
It's just, you know, their piece of it.
link |
00:39:40.020
Like the odds of you doing a really beautiful design
link |
00:39:42.420
of something that's way too hard for you is low, right?
link |
00:39:46.560
If it's way too simple for you,
link |
00:39:48.000
it's not that interesting.
link |
00:39:49.020
It's like, well, anybody could do that.
link |
00:39:50.600
But when you get the right match of your expertise
link |
00:39:54.720
and, you know, mental power to the right design size,
link |
00:39:58.680
that's cool, but that's not big enough
link |
00:40:00.400
to make a meaningful impact in the world.
link |
00:40:02.220
So now you have to have some framework
link |
00:40:04.900
to design the pieces so that the whole thing
link |
00:40:08.060
is big and harmonious.
link |
00:40:10.060
But, you know, when you put it together,
link |
00:40:13.520
it's, you know, sufficiently interesting to be used.
link |
00:40:18.900
And, you know, so that's what a beautiful design is.
link |
00:40:23.300
Matching the limits of that human cognitive capacity
link |
00:40:27.960
to the module that you can create
link |
00:40:30.300
and creating a nice interface between those modules.
link |
00:40:33.100
Given that, do you think there's a limit
link |
00:40:34.500
to the kind of beautiful complex systems
link |
00:40:37.080
we can build with this kind of modular design?
link |
00:40:40.980
It's like, you know, if we build increasingly
link |
00:40:45.900
more complicated systems, you can think of like the internet.
link |
00:40:49.500
Okay, let's scale it down.
link |
00:40:50.900
Or you can think of like social network,
link |
00:40:52.300
like Twitter as one computing system.
link |
00:40:57.740
But those are little modules, right?
link |
00:41:00.700
But it's built on so many components
link |
00:41:03.780
nobody at Twitter even understands.
link |
00:41:05.980
Right.
link |
00:41:06.820
So if an alien showed up and looked at Twitter,
link |
00:41:09.300
he wouldn't just see Twitter as a beautiful,
link |
00:41:11.180
simple thing that everybody uses, which is really big.
link |
00:41:14.420
It would see the network it runs on, the fiber optics,
link |
00:41:18.180
the data is transported to the computers.
link |
00:41:19.880
The whole thing is so bloody complicated,
link |
00:41:22.060
nobody at Twitter understands it.
link |
00:41:23.760
And so that's what the alien would see.
link |
00:41:25.760
So yeah, if an alien showed up and looked at Twitter
link |
00:41:28.820
or looked at the various different network systems
link |
00:41:32.060
that you could see on Earth.
link |
00:41:33.700
So imagine they were really smart
link |
00:41:34.980
and they could comprehend the whole thing.
link |
00:41:36.700
And then they sort of evaluated the human
link |
00:41:40.140
and thought, this is really interesting.
link |
00:41:41.540
No human on this planet comprehends the system they built.
link |
00:41:45.500
No individual, well, would they even see individual humans?
link |
00:41:48.900
Like we humans are very human centric, entity centric.
link |
00:41:52.720
And so we think of us as the central organism
link |
00:41:56.860
and the networks as just the connection of organisms.
link |
00:41:59.820
But from a perspective of an alien,
link |
00:42:02.500
from an outside perspective, it seems like.
link |
00:42:05.380
Yeah, I get it.
link |
00:42:06.980
We're the ants and they'd see the ant colony.
link |
00:42:08.940
The ant colony, yeah.
link |
00:42:10.500
Or the result of production of the ant colony,
link |
00:42:12.780
which is like cities and it's,
link |
00:42:18.100
in that sense, humans are pretty impressive.
link |
00:42:19.880
The modularity that we're able to,
link |
00:42:23.120
and how robust we are to noise and mutation
link |
00:42:25.940
and all that kind of stuff.
link |
00:42:26.780
Well, that's because it's stress tested all the time.
link |
00:42:28.540
Yeah.
link |
00:42:29.380
You know, you build all these cities with buildings
link |
00:42:31.060
and you get earthquakes occasionally
link |
00:42:32.420
and, you know, wars.
link |
00:42:35.540
Viruses every once in a while.
link |
00:42:37.620
You know, changes in business plans
link |
00:42:39.500
or, you know, like shipping or something.
link |
00:42:41.620
Like as long as it's all stress tested,
link |
00:42:44.740
then it keeps adapting to the situation.
link |
00:42:48.560
So that's a curious phenomenon.
link |
00:42:52.540
Well, let's go, let's talk about Moore's Law a little bit.
link |
00:42:55.060
So the broad view of Moore's Law
link |
00:43:00.060
was just exponential improvement of computing capability.
link |
00:43:05.260
Like OpenAI, for example, recently published
link |
00:43:08.380
this kind of paper looking at the exponential improvement
link |
00:43:14.060
in the training efficiency of neural networks
link |
00:43:17.020
for like ImageNet and all that kind of stuff.
link |
00:43:18.620
We just got better on this purely software side,
link |
00:43:22.300
just figuring out better tricks and algorithms
link |
00:43:25.620
for training neural networks.
link |
00:43:26.980
And that seems to be improving significantly faster
link |
00:43:30.620
than the Moore's Law prediction, you know.
link |
00:43:33.100
So that's in the software space.
link |
00:43:35.300
What do you think if Moore's Law continues
link |
00:43:39.140
or if the general version of Moore's Law continues,
link |
00:43:42.900
do you think that comes mostly from the hardware,
link |
00:43:45.320
from the software, some mix of the two,
link |
00:43:47.580
or something totally different?
link |
00:43:50.000
So not the reduction of the size of the transistor
link |
00:43:52.800
kind of thing, but more
link |
00:43:54.420
in the totally interesting kinds of innovations
link |
00:43:58.940
in the hardware space, all that kind of stuff.
link |
00:44:01.260
Well, there's like a half a dozen things
link |
00:44:04.060
going on in that graph.
link |
00:44:05.580
So one is there's initial innovations
link |
00:44:08.500
that had a lot of headroom to be exploited.
link |
00:44:11.660
So, you know, the efficiency of the networks
link |
00:44:13.980
has improved dramatically.
link |
00:44:15.900
And then the decomposability of those, and the usage growing:
link |
00:44:19.660
you know, they started running on one computer,
link |
00:44:21.380
then multiple computers, then multiple GPUs,
link |
00:44:23.740
and then arrays of GPUs, and they're up to thousands.
link |
00:44:27.100
And at some point, so it's sort of like
link |
00:44:30.620
they were going from
link |
00:44:32.300
like a single computer application
link |
00:44:33.860
to a thousand computer application.
link |
00:44:36.240
So that's not really a Moore's Law thing.
link |
00:44:38.200
That's an independent vector.
link |
00:44:39.520
How many computers can I put on this problem?
link |
00:44:42.340
Because the computers themselves are getting better
link |
00:44:44.220
on like a Moore's Law rate,
link |
00:44:45.980
but their ability to go from one to 10
link |
00:44:47.900
to 100 to a thousand, you know, was something.
link |
00:44:51.180
And then multiply by, you know, the amount of compute
link |
00:44:54.300
it took to resolve like AlexNet to ResNet to transformers.
link |
00:44:58.300
It's been quite, you know, steady improvements.
link |
00:45:01.700
But those are like S curves, aren't they?
link |
00:45:03.300
Those are exactly the kind of S curves
link |
00:45:04.940
that are underlying Moore's Law from the very beginning.
link |
00:45:07.620
So what's the biggest, what's the most productive,
link |
00:45:13.380
rich source of S curves in the future, do you think?
link |
00:45:16.740
Is it hardware, is it software, or is it?
link |
00:45:18.780
So hardware is going to move along relatively slowly.
link |
00:45:23.660
Like, you know, double performance every two years.
link |
00:45:26.660
There's still...
link |
00:45:28.380
I like how you call that slowly.
link |
00:45:29.620
Yeah, that's the slow version.
link |
00:45:31.460
The snail's pace of Moore's Law.
link |
00:45:33.220
Maybe we should trademark that one.
link |
00:45:39.100
Whereas the scaling by number of computers, you know,
link |
00:45:41.980
can go much faster, you know.
link |
00:45:44.020
I'm sure at some point Google had a, you know,
link |
00:45:46.380
their initial search engine was running on a laptop,
link |
00:45:48.900
you know, like.
link |
00:45:50.140
And at some point they really worked on scaling that.
link |
00:45:52.580
And then they factored the indexer from, you know,
link |
00:45:55.940
this piece and this piece and this piece,
link |
00:45:57.500
and they spread the data on more and more things.
link |
00:45:59.340
And, you know, they did a dozen innovations.
link |
00:46:02.820
But as they scaled up the number of computers on that,
link |
00:46:05.420
it kept breaking, finding new bottlenecks
link |
00:46:07.500
in their software and their schedulers,
link |
00:46:09.220
and made them rethink.
link |
00:46:11.780
Like, it seems insane to do a scheduler
link |
00:46:13.980
across 1,000 computers to schedule parts of it
link |
00:46:16.700
and then send the results to one computer.
link |
00:46:19.020
But if you want to schedule a million searches,
link |
00:46:21.380
that makes perfect sense.
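A minimal scatter/gather sketch of what such a scheduler does, with toy shards standing in for the thousand machines (the structure is assumed for illustration, not Google's actual design):

```python
from concurrent.futures import ThreadPoolExecutor

SHARDS = [{"cat": 3}, {"cat": 7}, {"dog": 2}]   # stand-ins for index machines

def search_shard(shard, term):
    return shard.get(term, 0)

def search(term):
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        partials = pool.map(lambda s: search_shard(s, term), SHARDS)  # scatter
    return sum(partials)                                              # gather

print(search("cat"))  # 10 -- partial results merged back on one node
```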
link |
00:46:23.180
So there's the scaling by just quantity
link |
00:46:26.860
is probably the richest thing.
link |
00:46:28.980
But then as you scale quantity,
link |
00:46:31.980
like a network that was great on 100 computers
link |
00:46:34.660
may be completely the wrong one.
link |
00:46:36.580
You may pick a network that's 10 times slower
link |
00:46:39.620
on 10,000 computers, like per computer.
link |
00:46:42.540
But if you go from 100 to 10,000, it's 100 times.
link |
00:46:45.820
So that's one of the things that happened
link |
00:46:47.220
when we did internet scaling.
link |
00:46:48.740
This efficiency went down, not up.
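Working that arithmetic out explicitly, with the illustrative numbers from above:

```python
small_cluster = 100 * 1.0      # 100 computers at full per-node speed
big_cluster = 10_000 * 0.1     # 10,000 computers, each 10x slower per node
print(big_cluster / small_cluster)   # 10.0 -- total throughput still wins
```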
link |
00:46:52.580
The future of computing is inefficiency, not efficiency.
link |
00:46:55.500
But scale, inefficient scale.
link |
00:46:57.620
It's scaling faster than inefficiency bites you.
link |
00:47:01.860
And as long as there's, you know, dollar value there,
link |
00:47:03.860
like scaling costs lots of money.
link |
00:47:05.980
But Google showed, Facebook showed, everybody showed
link |
00:47:08.220
that the scale was where the money was at.
link |
00:47:10.740
And so it was worth the financial cost.
link |
00:47:13.780
Do you think, is it possible that like basically
link |
00:47:17.780
the entirety of Earth will be like a computing surface?
link |
00:47:21.800
Like this table will be doing computing.
link |
00:47:24.460
This hedgehog will be doing computing.
link |
00:47:26.140
Like everything really inefficient,
link |
00:47:28.180
dumb computing will be leveraged.
link |
00:47:29.500
The science fiction books, they call it computronium.
link |
00:47:31.820
Computronium?
link |
00:47:32.660
We turn everything into computing.
link |
00:47:34.700
Well, most of the elements aren't very good for anything.
link |
00:47:37.980
Like you're not gonna make a computer out of iron.
link |
00:47:39.940
Like, you know, silicon and carbon have like nice structures.
link |
00:47:45.020
You know, we'll see what you can do with the rest of it.
link |
00:47:48.060
Like people talk about, well, maybe we can turn the sun
link |
00:47:50.380
into a computer, but it's hydrogen and a little bit of helium.
link |
00:47:54.980
So.
link |
00:47:55.820
What I mean is more like actually just adding computers
link |
00:47:59.060
to everything.
link |
00:47:59.940
Oh, okay.
link |
00:48:00.780
So you're just converting all the mass of the universe
link |
00:48:03.100
into a computer.
link |
00:48:04.260
No, no, no.
link |
00:48:05.100
So not using.
link |
00:48:05.920
It'd be ironic from the simulation point of view.
link |
00:48:07.580
It's like the simulator built the mass that it simulates.
link |
00:48:12.020
Yeah, I mean, yeah.
link |
00:48:12.860
So, I mean, ultimately this is all heading
link |
00:48:14.940
towards a simulation.
link |
00:48:15.780
Yeah, well, I think I might've told you this story.
link |
00:48:18.460
At Tesla, they were deciding,
link |
00:48:20.280
so they wanna measure the current coming out of the battery
link |
00:48:22.420
and they decided between putting the resistor in there
link |
00:48:25.900
and putting a computer with a sensor in there.
link |
00:48:29.460
And the computer was faster than the computer
link |
00:48:31.940
I worked on in 1982.
link |
00:48:34.140
And we chose the computer
link |
00:48:35.560
because it was cheaper than the resistor.
link |
00:48:38.660
So, sure, this hedgehog costs $13
link |
00:48:42.340
and we can put an AI that's as smart as you
link |
00:48:45.160
in there for five bucks.
link |
00:48:46.060
It'll have one.
link |
00:48:48.560
So computers will be everywhere.
link |
00:48:51.780
I was hoping it wouldn't be smarter than me because.
link |
00:48:54.620
Well, everything's gonna be smarter than you.
link |
00:48:56.660
But you were saying it's inefficient.
link |
00:48:58.060
I thought it was better to have a lot of dumb things.
link |
00:49:00.240
Well, Moore's law will slowly compact that stuff.
link |
00:49:02.740
So even the dumb things will be smarter than us.
link |
00:49:04.860
The dumb things are gonna be smart
link |
00:49:06.020
or they're gonna be smart enough to talk to something
link |
00:49:08.020
that's really smart.
link |
00:49:10.140
You know, it's like.
link |
00:49:12.580
Well, just remember, like a big computer chip.
link |
00:49:15.220
Yeah.
link |
00:49:16.060
You know, it's like an inch by an inch
link |
00:49:17.620
and, you know, 40 microns thick.
link |
00:49:20.980
It doesn't take very much, very many atoms
link |
00:49:23.340
to make a high power computer.
link |
00:49:25.020
Yeah.
link |
00:49:25.860
And 10,000 of them can fit in a shoebox.
link |
00:49:29.060
But, you know, you have the cooling and power problems,
link |
00:49:31.500
but, you know, people are working on that.
link |
00:49:33.540
But they still can't write compelling poetry or music
link |
00:49:37.660
or understand what love is or have a fear of mortality.
link |
00:49:41.740
So we're still winning.
link |
00:49:43.500
Neither can most of humanity, so.
link |
00:49:46.180
Well, they can write books about it.
link |
00:49:48.280
So, but speaking about this,
link |
00:49:53.900
this walk along the path of innovation
link |
00:49:56.860
towards the dumb things being smarter than humans,
link |
00:50:00.100
you are now the CTO of Tenstorrent as of two months ago.
link |
00:50:08.500
They build hardware for deep learning.
link |
00:50:13.780
How do you build scalable and efficient deep learning?
link |
00:50:16.140
This is such a fascinating space.
link |
00:50:17.460
Yeah, yeah, so it's interesting.
link |
00:50:18.740
So up until recently,
link |
00:50:20.780
I thought there were two kinds of computers.
link |
00:50:22.340
There are serial computers that run like C programs,
link |
00:50:25.380
and then there's parallel computers.
link |
00:50:27.100
So the way I think about it is, you know,
link |
00:50:29.340
parallel computers have given parallelism.
link |
00:50:31.900
Like, GPUs are great because you have a million pixels,
link |
00:50:34.780
and modern GPUs run a program on every pixel.
link |
00:50:37.500
They call it the shader program, right?
link |
00:50:39.340
So, or like finite element analysis.
link |
00:50:42.460
You build something, you know,
link |
00:50:43.900
you make this into little tiny chunks,
link |
00:50:45.540
you give each chunk to a computer,
link |
00:50:47.100
so you're given all these chunks,
link |
00:50:48.420
you have parallelism like that.
link |
00:50:50.160
But most C programs, you write this linear narrative,
link |
00:50:53.520
and you have to make it go fast.
link |
00:50:55.540
To make it go fast, you predict all the branches,
link |
00:50:57.680
all the data fetches, and you run that.
link |
00:50:59.300
That makes it more parallel, but that's found parallelism.
link |
00:51:04.260
AI is, I'm still trying to decide how fundamental this is.
link |
00:51:08.420
It's a given parallelism problem.
link |
00:51:10.900
But the way people describe the neural networks,
link |
00:51:14.800
and then how they write them in PyTorch, it makes graphs.
link |
00:51:17.900
Yeah, that might be fundamentally different
link |
00:51:19.980
than the GPU kind of.
link |
00:51:21.660
Parallelism, yeah, it might be.
link |
00:51:23.280
Because when you run the GPU program on all the pixels,
link |
00:51:27.300
you're running, you know, it depends,
link |
00:51:29.860
this group of pixels say it's background blue,
link |
00:51:32.540
and it runs a really simple program.
link |
00:51:34.020
This pixel is, you know, some patch of your face,
link |
00:51:36.900
so you have some really interesting shader program
link |
00:51:39.520
to give you the impression of translucency.
link |
00:51:41.740
But the pixels themselves don't talk to each other.
link |
00:51:43.940
There's no graph, right?
link |
00:51:46.620
So you do the image, and then you do the next image,
link |
00:51:49.540
and you do the next image,
link |
00:51:51.300
and you run eight million pixels,
link |
00:51:53.860
eight million programs every time,
link |
00:51:55.620
and modern GPUs have like 6,000 thread engines in them.
link |
00:51:59.580
So, you know, to get eight million pixels,
link |
00:52:02.100
each one runs a program on, you know, 10 or 20 pixels.
link |
00:52:06.140
And that's how they work, but there's no graph.
link |
00:52:09.380
But you think graph might be a totally new way
link |
00:52:13.680
to think about hardware.
link |
00:52:14.900
So Raja Koduri and I have been having this conversation
link |
00:52:18.140
about given versus found parallelism.
link |
00:52:20.580
And then the kind of walk,
link |
00:52:22.540
because we got more transistors,
link |
00:52:23.860
like, you know, computers way back when
link |
00:52:25.660
did stuff on scalar data.
link |
00:52:27.820
Now we did it on vector data, famous vector machines.
link |
00:52:30.740
Now we're making computers that operate on matrices, right?
link |
00:52:34.500
And then the category we said that was next was spatial.
link |
00:52:38.900
Like, imagine you have so much data
link |
00:52:40.580
that, you know, you want to do the compute on this data,
link |
00:52:43.420
and then when it's done, it says,
link |
00:52:45.920
send the result to this pile of data and run some software on that.
link |
00:52:49.260
And it's better to think about it spatially
link |
00:52:53.060
than to move all the data to a central processor
link |
00:52:56.140
and do all the work.
link |
00:52:57.580
So spatially, you mean moving in the space of data
link |
00:53:00.740
as opposed to moving the data.
link |
00:53:02.460
Yeah, you have a petabyte data space
link |
00:53:05.340
spread across some huge array of computers.
link |
00:53:08.620
And when you do a computation somewhere,
link |
00:53:10.560
you send the result of that computation
link |
00:53:12.300
or maybe a pointer to the next program
link |
00:53:14.380
to some other piece of data and do it.
link |
00:53:16.660
But I think a better word might be graph.
link |
00:53:18.800
And all the AI neural networks are graphs.
link |
00:53:21.700
Do some computations, send the result here,
link |
00:53:24.060
do another computation, do a data transformation,
link |
00:53:26.420
do a merging, do a pooling, do another computation.
link |
00:53:30.340
Is it possible to compress and say
link |
00:53:32.280
how we make this thing efficient,
link |
00:53:34.580
this whole process efficient, this different?
link |
00:53:37.300
So first, the fundamental elements in the graphs
link |
00:53:40.920
are things like matrix multiplies, convolutions,
link |
00:53:43.220
data manipulations, and data movements.
link |
00:53:46.140
So GPUs emulate those things with their little engines,
link |
00:53:49.660
you know, basically running a single threaded program.
link |
00:53:53.100
And then there's, you know, and NVIDIA calls it a warp
link |
00:53:55.580
where they group a bunch of programs
link |
00:53:56.900
that are similar together.
link |
00:53:58.420
So for efficiency and instruction use.
link |
00:54:01.580
And then at a higher level, you kind of,
link |
00:54:04.020
you take this graph and you say this part of the graph
link |
00:54:06.100
is a matrix multiplier, which runs on these 32 threads.
link |
00:54:09.860
But the model at the bottom was built
link |
00:54:12.660
for running programs on pixels, not executing graphs.
link |
00:54:17.180
So it's emulation, ultimately.
link |
00:54:19.440
So is it possible to build something
link |
00:54:21.120
that natively runs graphs?
link |
00:54:23.060
Yes, so that's what Tenstorrent did.
link |
00:54:26.260
So.
link |
00:54:27.100
Where are we on that?
link |
00:54:28.220
How, like, in the history of that effort,
link |
00:54:30.920
are we in the early days?
link |
00:54:32.100
Yeah, I think so.
link |
00:54:33.420
Tenstorrent was started by a friend of mine,
link |
00:54:35.740
Ljubisa Bajic, and I was his first investor.
link |
00:54:39.020
So I've been, you know, kind of following him
link |
00:54:41.660
and talking to him about it for years.
link |
00:54:43.740
And in the fall when I was considering things to do,
link |
00:54:47.000
I decided, you know, we held a conference last year
link |
00:54:51.620
with a friend who organized it,
link |
00:54:53.020
and we wanted to bring in thinkers.
link |
00:54:56.180
And two of the people were Andrej Karpathy and Chris Lattner.
link |
00:55:00.520
And Andrej gave this talk, it's on YouTube,
link |
00:55:03.440
called Software 2.0, which I think is great.
link |
00:55:06.860
Which is, we went from programmed computers,
link |
00:55:10.200
where you write programs, to data programmed computers.
link |
00:55:13.820
You know, like the future of software is data programs,
link |
00:55:18.180
the networks.
link |
00:55:19.380
And I think that's true.
link |
00:55:21.380
And then Chris has been working,
link |
00:55:23.980
he worked on LLVM, the low level virtual machine,
link |
00:55:26.620
which became the intermediate representation
link |
00:55:29.100
for all compilers.
link |
00:55:31.380
And now he's working on another project called MLIR,
link |
00:55:33.660
which is mid level intermediate representation,
link |
00:55:36.460
which sits essentially underneath the graph,
link |
00:55:39.860
about how do you represent that kind of computation
link |
00:55:42.820
and then coordinate large numbers
link |
00:55:44.360
of potentially heterogeneous computers.
link |
00:55:47.880
And I would say technically, Tenstorrent rests on,
link |
00:55:51.500
you know, two pillars of those two ideas,
link |
00:55:54.900
software 2.0 and mid level representation.
link |
00:55:58.300
But it's in service of executing graph programs.
link |
00:56:01.900
The hardware is designed to do that.
link |
00:56:03.820
So it's including the hardware piece.
link |
00:56:05.580
Yeah.
link |
00:56:06.480
And then the other cool thing is,
link |
00:56:08.500
for a relatively small amount of money,
link |
00:56:10.100
they did a test chip and two production chips.
link |
00:56:13.340
So it's like a super effective team.
link |
00:56:15.380
And unlike some AI startups,
link |
00:56:18.180
where if you don't build the hardware
link |
00:56:20.180
to run the software that they really want to do,
link |
00:56:22.900
then you have to fix it by writing lots more software.
link |
00:56:26.060
So the hardware naturally does matrix multiply,
link |
00:56:29.100
convolution, the data manipulations,
link |
00:56:31.820
and the data movement between processing elements
link |
00:56:35.340
that you can see in the graph,
link |
00:56:37.600
which I think is all pretty clever.
link |
00:56:40.340
And that's what I'm working on now.
link |
00:56:45.060
So, I think it's called the Grayskull processor.
link |
00:56:49.660
It was introduced last year.
link |
00:56:51.260
It's, you know, there's a bunch of measures of performance.
link |
00:56:53.780
We're talking about horses.
link |
00:56:55.480
It does 368 trillion operations per second.
link |
00:56:59.820
It seems to outperform NVIDIA's Tesla T4 system.
link |
00:57:03.180
So these are just numbers.
link |
00:57:04.620
What do they actually mean in real world performance?
link |
00:57:07.540
Like what are the metrics for you
link |
00:57:10.140
that you're chasing in your horse race?
link |
00:57:12.380
Like what do you care about?
link |
00:57:13.820
Well, first, so the native language of,
link |
00:57:17.700
you know, people who write AI network programs
link |
00:57:20.340
is PyTorch now, PyTorch, TensorFlow.
link |
00:57:22.500
There's a couple others.
link |
00:57:24.020
Do you think PyTorch has won over TensorFlow?
link |
00:57:25.820
Or is it just?
link |
00:57:26.640
I'm not an expert on that.
link |
00:57:27.980
I know many people who have switched
link |
00:57:29.780
from TensorFlow to PyTorch.
link |
00:57:31.660
And there's technical reasons for it.
link |
00:57:33.820
I use both.
link |
00:57:34.740
Both are still awesome.
link |
00:57:35.900
Both are still awesome.
link |
00:57:37.160
But the deepest love is for PyTorch currently.
link |
00:57:39.860
Yeah, there's more love for that.
link |
00:57:41.360
And that may change.
link |
00:57:42.620
So the first thing is when they write their programs,
link |
00:57:46.680
can the hardware execute it pretty much as it was written?
link |
00:57:50.460
Right, so PyTorch turns into a graph.
link |
00:57:53.340
We have a graph compiler that makes that graph.
link |
00:57:55.580
Then it fractures the graph down.
link |
00:57:57.480
So if you have big matrix multiply,
link |
00:57:58.820
we turn it into right size chunks
link |
00:58:00.140
to run on the processing elements.
link |
00:58:02.180
It hooks all the graph up.
link |
00:58:03.300
It lays out all the data.
link |
00:58:05.140
There's a couple of mid level representations of it
link |
00:58:08.020
that are also simulatable.
link |
00:58:09.420
So that if you're writing the code,
link |
00:58:12.140
you can see how it's gonna go through the machine,
link |
00:58:15.100
which is pretty cool.
link |
00:58:15.940
And then at the bottom, it schedules kernels,
link |
00:58:17.700
like math, data manipulation, data movement kernels,
link |
00:58:21.780
which do this stuff.
link |
00:58:22.860
So we don't have to write a little program
link |
00:58:26.180
to do matrix multiply,
link |
00:58:27.300
because we have a big matrix multiplier.
link |
00:58:29.140
There's no SIMD program for that.
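To make the "PyTorch turns into a graph" step concrete, here is a small sketch using torch.fx's tracer (a standard PyTorch facility, not Tenstorrent's own graph compiler) to recover the operator graph from ordinary code:

```python
import torch
import torch.nn as nn

class Tiny(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 4)

    def forward(self, x):
        return torch.relu(self.fc(x))

# Trace the module into an explicit graph of operator nodes.
gm = torch.fx.symbolic_trace(Tiny())
for node in gm.graph.nodes:
    print(node.op, node.target)   # placeholder -> linear -> relu -> output

# A backend can then split each node (say, a big matmul) into chunks sized
# for its processing elements and schedule the data movement between them.
```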
link |
00:58:31.240
But there is scheduling for that, right?
link |
00:58:36.000
So one of the goals is,
link |
00:58:37.640
if you write a piece of PyTorch code
link |
00:58:40.200
that looks pretty reasonable,
link |
00:58:41.240
you should be able to compile it, run it on the hardware
link |
00:58:43.480
without having to tweak it
link |
00:58:44.760
and do all kinds of crazy things to get performance.
link |
00:58:48.100
There's not a lot of intermediate steps.
link |
00:58:50.120
It's running directly as written.
link |
00:58:51.320
Like on a GPU, if you write a large matrix multiply naively,
link |
00:58:54.640
you'll get five to 10% of the peak performance of the GPU.
link |
00:58:58.680
Right, and then there's a bunch of people
link |
00:59:00.520
who've published papers on this,
link |
00:59:01.600
and I read them to see what steps you have to do.
link |
00:59:04.080
And it goes from pretty reasonable,
link |
00:59:06.760
well, transpose one of the matrices.
link |
00:59:08.480
So you do row ordered, not column ordered,
link |
00:59:11.680
block it so that you can put a block of the matrix
link |
00:59:14.520
on different SMs, groups of threads.
link |
00:59:19.340
But some of it gets into little details,
link |
00:59:21.160
like you have to schedule it just so,
link |
00:59:23.000
so you don't have register conflicts.
link |
00:59:25.040
So they call them CUDA ninjas.
link |
00:59:28.240
CUDA ninjas, I love it.
link |
00:59:31.080
To get to the optimal point,
link |
00:59:32.320
you either use a prewritten library,
link |
00:59:36.080
which is a good strategy for some things,
link |
00:59:37.880
or you have to be an expert
link |
00:59:39.600
in micro architecture to program it.
link |
00:59:42.200
Right, so the optimization step
link |
00:59:43.480
is way more complicated with the GPU.
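The "blocking" step mentioned above, sketched in NumPy for readability (a real GPU kernel would also manage shared memory and registers, which is where the ninjas earn the name):

```python
import numpy as np

def tiled_matmul(a, b, tile=32):
    n, k = a.shape
    _, m = b.shape
    c = np.zeros((n, m))
    for i in range(0, n, tile):
        for j in range(0, m, tile):
            for p in range(0, k, tile):
                # one tile of work, sized to fit in fast local memory
                c[i:i+tile, j:j+tile] += a[i:i+tile, p:p+tile] @ b[p:p+tile, j:j+tile]
    return c

a, b = np.random.rand(128, 96), np.random.rand(96, 64)
assert np.allclose(tiled_matmul(a, b), a @ b)
```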
link |
00:59:44.960
So our goal is if you write PyTorch,
link |
00:59:47.880
that's good PyTorch, you can do it.
link |
00:59:49.560
Now there's, as the networks are evolving,
link |
00:59:53.080
they've changed from convolutional to matrix multiply.
link |
00:59:56.440
People are talking about conditional graphs,
link |
00:59:58.040
they're talking about very large matrices,
link |
00:59:59.800
they're talking about sparsity,
link |
01:00:01.680
they're talking about problems
link |
01:00:03.360
that scale across many, many chips.
link |
01:00:06.120
So the native data item is a packet.
link |
01:00:11.720
So you send a packet to a processor, it gets processed,
link |
01:00:14.560
it does a bunch of work,
link |
01:00:15.400
and then it may send packets to other processors,
link |
01:00:17.640
and they execute in like a data flow graph
link |
01:00:20.520
kind of methodology.
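A toy sketch of that packet-driven execution model (the semantics here are assumed for illustration, not Tenstorrent's actual runtime): each node consumes a packet, computes, and emits packets to its successors until the graph drains.

```python
from collections import deque

graph = {"scale": ["add"], "add": ["sink"], "sink": []}   # node -> successors
ops = {"scale": lambda x: x * 2, "add": lambda x: x + 1, "sink": print}

def run(entry, value):
    queue = deque([(entry, value)])
    while queue:
        name, v = queue.popleft()
        out = ops[name](v)               # process this packet
        for succ in graph[name]:         # forward the result as new packets
            queue.append((succ, out))

run("scale", 20)   # prints 41
```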
link |
01:00:22.080
Got it.
link |
01:00:22.920
We have a big network on chip,
link |
01:00:24.400
and then the second chip has 16 ethernet ports
link |
01:00:27.760
to hook lots of them together,
link |
01:00:29.560
and it's the same graph compiler across multiple chips.
link |
01:00:32.400
So that's where the scale comes in.
link |
01:00:33.600
So it's built to scale naturally.
link |
01:00:35.120
Now, my experience with scaling is as you scale,
link |
01:00:38.180
you run into lots of interesting problems.
link |
01:00:40.760
So scaling is the mountain to climb.
link |
01:00:43.200
Yeah.
link |
01:00:44.040
So the hardware is built to do this,
link |
01:00:44.980
and then we're in the process of.
link |
01:00:47.700
Is there a software part to this
link |
01:00:49.160
with ethernet and all that?
link |
01:00:51.640
Well, the protocol at the bottom,
link |
01:00:54.760
it's an ethernet PHY,
link |
01:00:57.640
but the protocol basically says,
link |
01:00:59.760
send the packet from here to there.
link |
01:01:01.440
It's all point to point.
link |
01:01:03.120
The header bit says which processor to send it to,
link |
01:01:05.840
and we basically take a packet off our on chip network,
link |
01:01:09.560
put an ethernet header on it,
link |
01:01:11.200
send it to the other end, strip the header off,
link |
01:01:13.920
and send it to the local thing.
link |
01:01:14.880
It's pretty straightforward.
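In code form, that wrap/strip step really is tiny; here is a sketch with an assumed two-byte header (the actual wire format isn't described in the conversation):

```python
import struct

HEADER = struct.Struct("!H")          # 2-byte destination processor id

def wrap(dest_proc: int, payload: bytes) -> bytes:
    return HEADER.pack(dest_proc) + payload        # add the ethernet-side header

def unwrap(frame: bytes) -> tuple[int, bytes]:
    (dest,) = HEADER.unpack_from(frame)            # strip it at the far end
    return dest, frame[HEADER.size:]

dest, payload = unwrap(wrap(7, b"tensor-chunk"))
assert (dest, payload) == (7, b"tensor-chunk")
```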
link |
01:01:16.120
Human to human interaction is pretty straightforward too,
link |
01:01:18.160
but when you get a million of us,
link |
01:01:19.360
we could do some crazy stuff together.
link |
01:01:21.440
Yeah, it's gonna be fun.
link |
01:01:23.380
So is that the goal is scale?
link |
01:01:25.860
So like, for example, I've been recently
link |
01:01:28.360
doing a bunch of robots at home
link |
01:01:30.100
for my own personal pleasure.
link |
01:01:32.360
Am I ever going to use Tenstorrent, or is this more for?
link |
01:01:35.780
There's all kinds of problems.
link |
01:01:37.200
Like, there's small inference problems,
link |
01:01:38.720
or small training problems, or big training problems.
link |
01:01:41.440
What's the big goal?
link |
01:01:42.680
Is it the big training problems,
link |
01:01:45.080
or the small training problems?
link |
01:01:46.320
Well, one of the goals is to scale
link |
01:01:48.060
from 100 milliwatts to a megawatt, you know?
link |
01:01:51.720
So like, really have some range on the problems,
link |
01:01:54.840
and the same kind of AI programs
link |
01:01:57.120
work at all different levels.
link |
01:01:59.320
So that's the goal.
link |
01:02:00.600
Since the natural data item
link |
01:02:02.960
is a packet that we can move around,
link |
01:02:05.320
it's built to scale, but so many people have small problems.
link |
01:02:11.560
Right, right.
link |
01:02:12.400
But the, you know.
link |
01:02:13.240
Like, inside that phone is a small problem to solve.
link |
01:02:16.400
So do you see Tenstorrent potentially being inside a phone?
link |
01:02:19.960
Well, the power efficiency of local memory,
link |
01:02:22.600
local computation, and the way we built it is pretty good.
link |
01:02:26.360
And then there's a lot of efficiency
link |
01:02:28.520
on being able to do conditional graphs and sparsity.
link |
01:02:31.500
I think it's, for complicated networks
link |
01:02:34.540
that wanna go in a small form factor, it's gonna be quite good.
link |
01:02:38.180
But we have to prove that, that's all.
link |
01:02:40.200
It's a fun problem.
link |
01:02:41.040
And that's the early days of the company, right?
link |
01:02:42.280
It's a couple years, you said?
link |
01:02:44.600
But you think, you invested, you think they're legit.
link |
01:02:47.560
Yeah.
link |
01:02:48.400
And so you joined.
link |
01:02:49.220
Yeah, I joined.
link |
01:02:50.060
Well, that's.
link |
01:02:50.900
That's a really interesting place to be.
link |
01:02:53.240
Like, the AI world is exploding, you know.
link |
01:02:55.720
And I looked at some other opportunities
link |
01:02:58.520
like build a faster processor, which people want.
link |
01:03:01.520
But that's more on an incremental path
link |
01:03:03.760
than what's gonna happen in AI in the next 10 years.
link |
01:03:07.860
Yeah.
link |
01:03:08.700
So this is kind of, you know,
link |
01:03:10.080
an exciting place to be part of.
link |
01:03:12.240
Yeah, the revolutions will be happening
link |
01:03:14.080
in the very space that Tenstorrent is.
link |
01:03:15.280
And then lots of people are working on it,
link |
01:03:16.680
but there's lots of technical reasons why some of them,
link |
01:03:18.900
you know, aren't gonna work out that well.
link |
01:03:20.320
And, you know, that's interesting.
link |
01:03:23.640
And there's also the same problem
link |
01:03:25.860
about getting the basics right.
link |
01:03:27.540
Like, we've talked to customers about exciting features.
link |
01:03:30.000
And at some point we realized that,
link |
01:03:32.080
Ljubisa and I were realizing they want to hear first
link |
01:03:34.720
about memory bandwidth, local bandwidth,
link |
01:03:36.700
compute intensity, programmability.
link |
01:03:39.240
They want to know the basics, power management,
link |
01:03:42.000
how the network ports work, what are the basics,
link |
01:03:44.140
do all the basics work.
link |
01:03:46.120
Because it's easy to say, we've got this great idea,
link |
01:03:48.000
you know, we can crack GPT-3, but the people we talked to
link |
01:03:53.260
want to say, if I buy the, so we have a PCI Express card
link |
01:03:57.520
with our chip on it, if you buy the card,
link |
01:03:59.680
you plug it in your machine to download the driver,
link |
01:04:01.960
how long does it take me to get my network to run?
link |
01:04:05.080
Right, right.
link |
01:04:05.920
You know, that's a real question.
link |
01:04:06.760
It's a very basic question.
link |
01:04:08.360
So, yeah.
link |
01:04:09.360
Is there an answer to that yet,
link |
01:04:10.520
or is it trying to get to that?
link |
01:04:11.360
Our goal is like an hour.
link |
01:04:13.400
Okay.
link |
01:04:14.240
When can I buy a Tenstorrent?
link |
01:04:16.800
Pretty soon.
link |
01:04:17.640
Or my, for the small case training.
link |
01:04:19.640
Yeah, pretty soon.
link |
01:04:21.120
Months.
link |
01:04:21.960
Good.
link |
01:04:22.800
I love the idea of you inside the room
link |
01:04:24.740
with Andrej Karpathy and Chris Lattner.
link |
01:04:31.440
Very, very interesting, very brilliant people,
link |
01:04:35.980
very out of the box thinkers,
link |
01:04:37.560
but also like first principles thinkers.
link |
01:04:39.960
Well, they both get stuff done.
link |
01:04:42.640
They not only get their own projects done.
link |
01:04:44.920
They talk about it clearly.
link |
01:04:47.000
They educate large numbers of people,
link |
01:04:48.720
and they've created platforms for other people
link |
01:04:50.520
to go do their stuff on.
link |
01:04:52.000
Yeah, the clear thinking that's able to be communicated
link |
01:04:55.520
is kind of impressive.
link |
01:04:57.200
It's kind of remarkable to, yeah, I'm a fan.
link |
01:05:00.760
Well, let me ask,
link |
01:05:02.000
because I talk to Chris actually a lot these days.
link |
01:05:05.000
He's been one of the, just to give him a shout out,
link |
01:05:08.880
he's been so supportive as a human being.
link |
01:05:13.700
So everybody's quite different.
link |
01:05:16.280
Like great engineers are different,
link |
01:05:17.640
but he's been like sensitive to the human element
link |
01:05:20.760
in a way that's been fascinating.
link |
01:05:22.240
Like he was one of the early people
link |
01:05:23.960
on this stupid podcast that I do to say like,
link |
01:05:27.880
don't quit this thing,
link |
01:05:29.640
and also talk to whoever the hell you want to talk to.
link |
01:05:34.120
That kind of from a legit engineer to get like props
link |
01:05:38.040
and be like, you can do this.
link |
01:05:39.960
That was, I mean, that's what a good leader does, right?
link |
01:05:42.240
To just kind of let a little kid do his thing,
link |
01:05:45.100
like go do it, let's see what turns out.
link |
01:05:48.700
That's a pretty powerful thing.
link |
01:05:50.500
But what do you, what's your sense about,
link |
01:05:54.440
he used to be, no, I think stepped away from Google, right?
link |
01:05:58.800
He's at SiFive, I think.
link |
01:06:02.400
What's really impressive to you
link |
01:06:03.820
about the things that Chris has worked on?
link |
01:06:05.720
Because we mentioned the optimization,
link |
01:06:08.300
the compiler design stuff, the LLVM,
link |
01:06:10.840
then there's, he's also at Google worked at the TPU stuff.
link |
01:06:16.400
He's obviously worked on Swift,
link |
01:06:19.360
so the programming language side.
link |
01:06:21.360
Talking about people that work in the entirety of the stack.
link |
01:06:24.280
What, from your time interacting with Chris
link |
01:06:27.920
and knowing the guy, what's really impressive to you
link |
01:06:30.760
that just inspires you?
link |
01:06:32.120
Well, like LLVM became the defacto platform
link |
01:06:37.120
for compilers.
link |
01:06:42.180
It's amazing.
link |
01:06:43.840
And it was good code quality, good design choices.
link |
01:06:46.380
He hit the right level of abstraction.
link |
01:06:48.860
There's a little bit of the right time, the right place.
link |
01:06:52.060
And then he built a new programming language called Swift,
link |
01:06:55.460
which after, let's say some adoption resistance
link |
01:06:59.100
became very successful.
link |
01:07:01.180
I don't know that much about his work at Google,
link |
01:07:03.380
although I know that was typical,
link |
01:07:07.140
they started TensorFlow stuff and it was new.
link |
01:07:11.580
They wrote a lot of code and then at some point
link |
01:07:13.620
it needed to be refactored,
link |
01:07:17.220
because its development slowed down,
link |
01:07:19.100
which is why PyTorch started a little later and then passed it.
link |
01:07:22.340
So he did a lot of work on that.
link |
01:07:23.940
And then his idea about MLIR,
link |
01:07:25.980
which is what people started to realize
link |
01:07:28.260
is the complexity of the software stack above
link |
01:07:30.580
the low level IR was getting so high
link |
01:07:33.540
that forcing the features of that into that one level
link |
01:07:36.580
was putting too much of a burden on it.
link |
01:07:38.740
So he's splitting that into multiple pieces.
link |
01:07:41.580
And that was one of the inspirations for our software stack
link |
01:07:43.820
where we have several intermediate representations
link |
01:07:46.700
that are all executable and you can look at them
link |
01:07:49.700
and do transformations on them before you lower the level.
link |
01:07:53.940
So that was, I think we started before MLIR
link |
01:07:58.160
really got far enough along to use,
link |
01:08:01.700
but we're interested in that.
link |
01:08:02.820
He's really excited about MLIR.
link |
01:08:04.660
That's his like little baby.
link |
01:08:06.660
So he, and there seems to be some profound ideas on that
link |
01:08:10.900
that are really useful.
link |
01:08:11.820
So each one of those things has been,
link |
01:08:14.960
as the world of software gets more and more complicated,
link |
01:08:17.780
how do we create the right abstraction levels
link |
01:08:20.060
to simplify it in a way that people can now work independently
link |
01:08:23.340
on different levels of it?
link |
01:08:25.140
So I would say all three of those projects,
link |
01:08:27.200
LLVM, Swift, and MLIR did that successfully.
link |
01:08:31.620
So I'm interested in what he's gonna do next
link |
01:08:33.700
in the same kind of way.
link |
01:08:34.820
Yes.
link |
01:08:36.220
On either the TPU or maybe the Nvidia GPU side,
link |
01:08:41.820
how does Tenstorrent, or the ideas underlying it,
link |
01:08:45.860
does it have to be Tenstorrent?
link |
01:08:47.020
Just this kind of graph focused,
link |
01:08:51.580
graph centric hardware, deep learning centric hardware,
link |
01:08:56.580
beat NVIDIA's? Do you think it's possible
link |
01:09:00.180
for it to basically overtake NVIDIA?
link |
01:09:02.280
Sure.
link |
01:09:03.500
What's that process look like?
link |
01:09:05.600
What's that journey look like, you think?
link |
01:09:08.060
Well, GPUs were built to run shader programs
link |
01:09:11.060
on millions of pixels, not to run graphs.
link |
01:09:13.860
Yes.
link |
01:09:14.700
So there's a hypothesis that says
link |
01:09:17.380
the way the graphs are built
link |
01:09:20.300
means GPUs are going
link |
01:09:21.540
to be inefficient at computing them.
link |
01:09:24.080
And then the primitive is not a SIMD program,
link |
01:09:27.520
it's matrix multiply and convolution.
link |
01:09:30.080
And then the data manipulations are fairly extensive about,
link |
01:09:33.780
like, how do you do a fast transpose with a program?
link |
01:09:36.380
I don't know if you've ever written a transpose program.
link |
01:09:38.780
They're ugly and slow, but in hardware,
link |
01:09:40.420
you can do really well.
link |
01:09:42.140
Like, I'll give you an example.
link |
01:09:43.300
So when GPU accelerators first started doing triangles,
link |
01:09:47.800
like, so you have a triangle
link |
01:09:49.020
which maps onto a set of pixels.
link |
01:09:51.180
So you build, it's very easy,
link |
01:09:52.580
straightforward to build a hardware engine
link |
01:09:54.220
that'll find all those pixels.
link |
01:09:55.860
And it's kind of weird
link |
01:09:56.700
because you walk along the triangle to get to the edge,
link |
01:09:59.260
and then you have to go back down to the next row
link |
01:10:01.300
and walk along, and then you have to decide on the edge
link |
01:10:04.080
if the line of the triangle is like half on the pixel,
link |
01:10:08.060
what's the pixel color?
link |
01:10:09.140
Because it's half of this pixel and half the next one.
link |
01:10:11.100
That's called rasterization.
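A crude software rasterizer sketch (this brute-forces an inside test at every pixel rather than the edge-walking described, but it shows why the software version is branchy and slow next to dedicated hardware):

```python
def raster(tri, width, height):
    (x0, y0), (x1, y1), (x2, y2) = tri

    def edge(ax, ay, bx, by, px, py):
        # signed area: which side of edge (a -> b) the point falls on
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

    covered = []
    for y in range(height):
        for x in range(width):
            cx, cy = x + 0.5, y + 0.5      # sample at the pixel center
            if (edge(x0, y0, x1, y1, cx, cy) >= 0 and
                edge(x1, y1, x2, y2, cx, cy) >= 0 and
                edge(x2, y2, x0, y0, cx, cy) >= 0):
                covered.append((x, y))
    return covered

print(len(raster([(0, 0), (7, 0), (0, 7)], 8, 8)))   # pixels the triangle hits
```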
link |
01:10:12.980
And you're saying that could be done in hardware?
link |
01:10:15.900
No, that's an example of that operation
link |
01:10:19.340
as a software program is really bad.
link |
01:10:22.100
I've written a program that did rasterization.
link |
01:10:24.420
The hardware that does it has actually less code
link |
01:10:26.860
than the software program that does it,
link |
01:10:28.980
and it's way faster.
link |
01:10:31.640
Right, so there are certain times
link |
01:10:33.440
when the abstraction you have, rasterize a triangle,
link |
01:10:37.780
you know, execute a graph, you know, components of a graph.
link |
01:10:41.300
But the right thing to do in the hardware software boundary
link |
01:10:43.860
is for the hardware to naturally do it.
link |
01:10:45.780
And so the GPU is really optimized
link |
01:10:47.940
for the rasterization of triangles.
link |
01:10:50.100
Well,
link |
01:10:52.860
you know, that's a small piece of modern GPUs.
link |
01:10:56.980
What they did is that they still rasterize triangles
link |
01:10:59.940
when you're running in a game, but for the most part,
link |
01:11:02.460
most of the computation in the area of the GPU
link |
01:11:04.420
is running shader programs.
link |
01:11:05.900
But they're single threaded programs on pixels, not graphs.
link |
01:11:09.580
I have to be honest, I'd say I don't actually know
link |
01:11:11.820
the math behind shader, shading and lighting
link |
01:11:15.060
and all that kind of stuff.
link |
01:11:16.180
I don't know what.
link |
01:11:17.780
They look like little simple floating point programs
link |
01:11:20.100
or complicated ones.
link |
01:11:21.220
You can have 8,000 instructions in a shader program.
link |
01:11:23.740
But I don't have a good intuition
link |
01:11:25.580
why it could be parallelized so easily.
link |
01:11:27.980
No, it's because you have 8 million pixels in every single frame.
link |
01:11:30.660
So when you have a light, right, that comes down,
link |
01:11:34.660
the angle, you know, the amount of light,
link |
01:11:36.780
like say this is a line of pixels across this table, right?
link |
01:11:40.740
The amount of light on each pixel is subtly different.
link |
01:11:43.620
And each pixel is responsible for figuring out what.
link |
01:11:45.980
Figuring it out.
link |
01:11:46.820
So that pixel says, I'm this pixel.
link |
01:11:48.580
I know the angle of the light.
link |
01:11:49.940
I know the occlusion.
link |
01:11:50.900
I know the color I am.
link |
01:11:52.420
Like every single pixel here is a different color.
link |
01:11:54.420
Every single pixel gets a different amount of light.
link |
01:11:57.160
Every single pixel has a subtly different translucency.
link |
01:12:00.580
So to make it look realistic,
link |
01:12:02.140
the solution was you run a separate program on every pixel.
link |
01:12:05.140
See, but I thought there's like reflection
link |
01:12:06.720
from all over the place.
link |
01:12:08.060
Every pixel. Yeah, but there is.
link |
01:12:09.620
So you build a reflection map,
link |
01:12:11.060
which also has some pixelated thing.
link |
01:12:14.180
And then when the pixel is looking at the reflection map,
link |
01:12:16.340
it has to calculate what the normal of the surface is.
link |
01:12:19.220
And it does it per pixel.
link |
01:12:20.900
By the way, there's boatloads of hacks on that.
link |
01:12:22.780
You know, like you may have a lower resolution light map,
link |
01:12:25.660
or reflection map.
link |
01:12:26.660
There's all these, you know, hacks they do.
link |
01:12:29.220
But at the end of the day, it's per pixel computation.
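A toy per-pixel lighting computation (illustrative shading math, not any particular engine's): every pixel does its own dot product against the light direction and reads no other pixel, which is what makes millions of them trivially parallel.

```python
import numpy as np

def shade(normals, light_dir):
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    # one independent dot product per pixel
    return np.clip(normals @ light, 0.0, 1.0)

normals = np.random.rand(1080, 1920, 3)            # per-pixel surface normals
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)
image = shade(normals, (0.0, 0.5, 1.0))            # per-pixel brightness
print(image.shape)                                 # (1080, 1920)
```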
link |
01:12:32.940
And it just so happens that you can map
link |
01:12:35.540
graph like computation onto this pixel centric computation.
link |
01:12:39.340
You can do floating point programs
link |
01:12:41.360
on convolutions and the matrices.
link |
01:12:43.460
And Nvidia invested for years in CUDA.
link |
01:12:46.220
First for HPC, and then they got lucky with the AI trend.
link |
01:12:50.140
But do you think they're going to essentially
link |
01:12:52.300
not be able to hardcore pivot out of there?
link |
01:12:55.440
We'll see.
link |
01:12:57.420
That's always interesting.
link |
01:12:59.460
How often do big companies hardcore pivot?
link |
01:13:01.260
Occasionally.
link |
01:13:03.820
How much do you know about Nvidia, folks?
link |
01:13:06.340
Some. Some?
link |
01:13:08.140
Well, I'm curious as well.
link |
01:13:10.020
Who's ultimately, as a...
link |
01:13:11.460
Well, they've innovated several times.
link |
01:13:13.380
But they've also worked really hard on mobile.
link |
01:13:15.220
They've worked really hard on radios.
link |
01:13:17.340
You know, they're fundamentally a GPU company.
link |
01:13:20.680
Well, they tried to pivot.
link |
01:13:21.860
There's an interesting little game at play
link |
01:13:26.160
in autonomous vehicles, right?
link |
01:13:27.660
With, or semi autonomous, like playing with Tesla
link |
01:13:30.660
and so on and seeing that's dipping a toe
link |
01:13:34.020
into that kind of pivot.
link |
01:13:35.700
They came out with this platform,
link |
01:13:37.100
which is interesting technically.
link |
01:13:39.140
But it was like a, you know,
link |
01:13:42.700
3000 watt, $3,000 GPU platform.
link |
01:13:46.220
I don't know if it's interesting technically.
link |
01:13:47.540
It's interesting philosophically.
link |
01:13:49.920
Technically, I don't know if it's the execution
link |
01:13:51.900
of the craftsmanship is there.
link |
01:13:53.440
I'm not sure.
link |
01:13:54.580
But I didn't get a sense.
link |
01:13:55.420
I think they were repurposing GPUs
link |
01:13:57.780
for an automotive solution.
link |
01:13:59.140
Right, it's not a real pivot.
link |
01:14:00.340
They didn't build a ground up solution.
link |
01:14:03.140
Right.
link |
01:14:03.980
Like the chips inside Tesla are pretty cheap.
link |
01:14:06.360
Like Mobileye has been doing this.
link |
01:14:08.080
They're doing the classic thing, working up from the simplest thing.
link |
01:14:10.840
Yeah.
link |
01:14:11.680
I mean, 40 square millimeter chips.
link |
01:14:14.260
And Nvidia, their solution had 800 square millimeter chips
link |
01:14:17.500
and two 200 square millimeter chips.
link |
01:14:19.180
And, you know, like boatloads of really expensive DRAMs.
link |
01:14:22.540
And, you know, it's a really different approach.
link |
01:14:27.020
And Mobileye fit the, let's say,
link |
01:14:28.900
automotive cost and form factor.
link |
01:14:31.300
And then they added features as it was economically viable.
link |
01:14:34.140
And Nvidia said, take the biggest thing
link |
01:14:36.300
and we're gonna go make it work.
link |
01:14:38.780
You know, and that's also influenced like Waymo.
link |
01:14:41.420
There's a whole bunch of autonomous startups
link |
01:14:43.660
where they have a 5,000 watt server in their trunk.
link |
01:14:46.820
Right.
link |
01:14:47.860
But that's because they think, well, 5,000 watts
link |
01:14:50.580
and, you know, $10,000 is okay
link |
01:14:52.300
because it's replacing a driver.
link |
01:14:54.740
Elon's approach was that board has to be cheap enough
link |
01:14:58.100
to put it in every single Tesla,
link |
01:14:59.540
whether they turn on autonomous driving or not.
link |
01:15:02.300
Which, and Mobileye was like,
link |
01:15:04.740
we need to fit in the BOM and, you know,
link |
01:15:06.820
cost structure that car companies have.
link |
01:15:09.460
So they may sell you a GPS for 1500 bucks,
link |
01:15:12.460
but the BOM for that is like $25.
link |
01:15:16.460
Well, and for Mobileye, it seems like neural networks
link |
01:15:20.140
were not first class citizens, like the computation.
link |
01:15:22.980
They didn't start out as a...
link |
01:15:24.660
Yeah, it was a CV problem.
link |
01:15:26.100
Yeah.
link |
01:15:27.100
And did classic CV and found stoplights and lines.
link |
01:15:29.940
And they were really good at it.
link |
01:15:31.220
Yeah, and they never, I mean,
link |
01:15:33.060
I don't know what's happening now,
link |
01:15:34.140
but they never fully pivoted.
link |
01:15:35.820
I mean, it's like, it's the Nvidia thing.
link |
01:15:37.980
And then as opposed to,
link |
01:15:39.740
so if you look at the new Tesla work,
link |
01:15:41.980
it's like neural networks from the ground up, right?
link |
01:15:45.540
Yeah, and even Tesla started with a lot of CV stuff in it
link |
01:15:48.100
and Andrej's basically been eliminating it.
link |
01:15:51.740
Move everything into the network.
link |
01:15:54.340
So without, this isn't like confidential stuff,
link |
01:15:57.940
but you sitting on a porch, looking over the world,
link |
01:16:01.620
looking at the work that Andrej's doing,
link |
01:16:03.740
that Elon's doing with Tesla Autopilot,
link |
01:16:06.420
do you like the trajectory of where things are going
link |
01:16:08.780
on the hardware side?
link |
01:16:09.620
Well, they're making serious progress.
link |
01:16:10.900
I like the videos of people driving the beta stuff.
link |
01:16:14.100
I guess taking some pretty complicated intersections
link |
01:16:16.500
and all that, but it's still an intervention per drive.
link |
01:16:20.780
I mean, I have autopilot, the current autopilot,
link |
01:16:23.020
in my Tesla, I use it every day.
link |
01:16:24.540
Do you have full self driving beta or no?
link |
01:16:26.340
No.
link |
01:16:27.180
So you like where this is going?
link |
01:16:28.700
They're making progress.
link |
01:16:29.540
It's taking longer than anybody thought.
link |
01:16:32.220
You know, my wonder is, you know, hardware three,
link |
01:16:37.380
is it enough compute? Off by two, off by five,
link |
01:16:40.620
off by 10, off by a hundred?
link |
01:16:42.380
Yeah.
link |
01:16:43.220
And I thought it probably wasn't enough,
link |
01:16:47.180
but they're doing pretty well with it now.
link |
01:16:49.820
Yeah.
link |
01:16:50.660
And one thing is, as the data set gets bigger,
link |
01:16:53.380
the training gets better.
link |
01:16:55.060
And then there's this interesting thing: you sort of train
link |
01:16:58.420
and build an arbitrary size network that solves the problem.
link |
01:17:01.380
And then you refactor the network down to the thing
link |
01:17:03.720
that you can afford to ship, right?
link |
01:17:06.780
So the goal isn't to build a network that fits in the phone.
link |
01:17:10.740
It's to build something that actually works.
link |
01:17:14.860
And then how do you make that most effective
link |
01:17:17.700
on the hardware you have?
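One common way to do that refactoring, sketched here as knowledge distillation (an assumption on my part; the conversation doesn't name the method): train a small student to mimic a big trained teacher.

```python
import torch
import torch.nn as nn

teacher = nn.Sequential(nn.Linear(256, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

x = torch.randn(32, 256)                 # stand-in for unlabeled frames
with torch.no_grad():
    target = teacher(x)                  # teacher outputs become the labels
loss = nn.functional.mse_loss(student(x), target)
loss.backward()
opt.step()                               # ship-sized net mimics the big one
```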
link |
01:17:19.860
And they seem to be doing that much better
link |
01:17:21.700
than a couple of years ago.
link |
01:17:23.580
Well, the one really important thing is also
link |
01:17:25.820
what they're doing well is how to iterate that quickly,
link |
01:17:28.700
which means like it's not just about one time deployment,
link |
01:17:31.780
one building, it's constantly iterating the network
link |
01:17:34.220
and trying to automate as many steps as possible, right?
link |
01:17:37.540
And that's actually the principles of the Software 2.0,
link |
01:17:41.700
like you mentioned with Andrej, is it's not just,
link |
01:17:46.980
I mean, I don't know what the actual,
link |
01:17:48.300
his description of Software 2.0 is.
link |
01:17:50.900
whether it's just high level philosophical or there are specifics,
link |
01:17:53.520
but the interesting thing about what that actually looks like
link |
01:17:57.100
in the real world is what I think Andrej calls
link |
01:18:01.860
the data engine, it's like it's the iterative improvement
link |
01:18:05.740
of the thing.
link |
01:18:06.580
You have a neural network that does stuff,
link |
01:18:10.500
fails on a bunch of things and learns from it
link |
01:18:12.740
over and over and over.
link |
01:18:13.620
So you're constantly discovering edge cases.
link |
01:18:15.900
So it's very much about like data engineering,
link |
01:18:19.920
like figuring out, it's kind of what you were talking about
link |
01:18:23.060
with Tenstorrent is you have the data landscape.
link |
01:18:25.740
And you have to walk along that data landscape
link |
01:18:27.580
in a way that is constantly improving the neural network.
link |
01:18:32.600
And that feels like that's the central piece of it.
link |
01:18:35.820
And there's two pieces of it.
link |
01:18:37.140
Like you find edge cases that don't work
link |
01:18:40.900
and then you define something that goes,
link |
01:18:42.340
get your data for that.
link |
01:18:44.220
But then the other constraint is whether you have
link |
01:18:45.820
to label it or not.
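In toy form, that loop (deploy, find the failures, pull in more data like them, retrain) looks something like this. It's a deliberately simple numpy illustration of edge-case mining, not Tesla's data engine; in the real fleet, the labeling step is the human-limited part discussed next:

```python
# Toy "data engine": train, find where the model fails, concentrate new
# training data on those failures, retrain, repeat. Illustration only.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + 0.2 * X[:, 1] ** 3 > 0).astype(int)  # curved boundary = edge cases
    return X, y

def fit_linear(X, y):
    # least-squares linear classifier (deliberately too simple)
    w, *_ = np.linalg.lstsq(X, 2.0 * y - 1.0, rcond=None)
    return w

X, y = make_data(200)
w = fit_linear(X, y)
for round_ in range(5):
    X_new, y_new = make_data(2000)            # "the fleet" streaming in data
    pred = (X_new @ w > 0).astype(int)
    hard = pred != y_new                      # discovered failure cases
    X = np.vstack([X, X_new[hard]])           # in a real fleet, labeling these
    y = np.concatenate([y, y_new[hard]])      # is the human-limited step
    w = fit_linear(X, y)
    print(f"round {round_}: error {hard.mean():.3f}")
```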
link |
01:18:46.940
Like the amazing thing about like the GPT3 stuff
link |
01:18:49.860
is it's unsupervised.
link |
01:18:51.540
So there's an essentially infinite amount of data.
link |
01:18:53.300
Now there's obviously an infinite amount of data available
link |
01:18:56.260
from cars of people successfully driving.
link |
01:18:59.220
But the current pipelines are mostly running
link |
01:19:02.060
on labeled data, which is human limited.
link |
01:19:04.660
So when that becomes unsupervised,
link |
01:19:09.040
it'll create an unlimited amount of data,
link |
01:19:12.620
which then they'll scale.
link |
01:19:14.240
Now the networks that may use that data
link |
01:19:16.220
might be way too big for cars,
link |
01:19:18.260
but then there'll be the transformation from now
link |
01:19:20.020
we have unlimited data, I know exactly what I want.
link |
01:19:22.360
Now can I turn that into something that fits in the car?
link |
01:19:25.820
And that process is gonna happen all over the place.
link |
01:19:29.220
Every time you get to the place where you have
link |
01:19:30.700
unlimited data, and that's what Software 2.0 is about,
link |
01:19:34.100
unlimited data training networks to do stuff
link |
01:19:37.980
without humans writing code to do it.
link |
01:19:40.700
And ultimately also trying to discover,
link |
01:19:42.980
like you're saying, the self supervised formulation
link |
01:19:46.540
of the problem.
link |
01:19:47.380
So the unsupervised formulation of the problem.
link |
01:19:49.660
Like in driving, there's this really interesting thing,
link |
01:19:53.540
which is you look at a scene that's before you,
link |
01:19:58.140
and you have data about what a successful human driver did
link |
01:20:01.900
in that scene one second later.
link |
01:20:04.460
It's a little piece of data that you can use
link |
01:20:06.620
just like with GPT3 as training.
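A minimal sketch of that self-supervised signal: the scene at time t is the input, and the logged human control one second later is the free label, with no human annotation needed. The toy policy network here is a made-up stand-in, not Tesla's or Comma's actual model:

```python
# Self-supervised driving signal: the scene at time t is the input, and
# what the human actually did one second later is the (free) target.
# The policy net is a hypothetical stand-in, not any company's model.
import torch
import torch.nn as nn

policy = nn.Sequential(                 # toy stand-in for a vision policy net
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 256), nn.ReLU(),
    nn.Linear(256, 2),                  # predicted [steering, acceleration]
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)

def training_step(frames_t, human_controls_t1):
    # frames_t: (B, 3, 64, 64) camera frames; human_controls_t1: (B, 2)
    pred = policy(frames_t)
    loss = nn.functional.mse_loss(pred, human_controls_t1)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# every second of logged human driving yields a (frame, control) pair
loss = training_step(torch.randn(8, 3, 64, 64), torch.randn(8, 2))
```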
link |
01:20:09.380
Currently, even though Tesla says they're using that,
link |
01:20:12.380
it's an open question to me, how far can you,
link |
01:20:15.980
can you solve all of the driving
link |
01:20:17.420
with just that self supervised piece of data?
link |
01:20:20.940
And like, I think.
link |
01:20:23.380
Well, that's what Comma AI is doing.
link |
01:20:25.540
That's what Comma AI is doing,
link |
01:20:26.860
but the question is how much data.
link |
01:20:29.980
So what Comma AI doesn't have is as good
link |
01:20:33.580
of a data engine, for example, as Tesla does.
link |
01:20:35.940
That's where the, like the organization of the data.
link |
01:20:39.820
I mean, as far as I know, I haven't talked to George,
link |
01:20:41.900
but they do have the data.
link |
01:20:44.580
The question is how much data is needed,
link |
01:20:47.860
because we say infinite very loosely here.
link |
01:20:51.420
And then the other question, which you said,
link |
01:20:54.380
I don't know if you think it's still an open question is,
link |
01:20:57.700
are we on the right order of magnitude
link |
01:20:59.420
for the compute necessary?
link |
01:21:02.020
That is, is it like what Elon said,
link |
01:21:04.940
this chip that's in there now is enough
link |
01:21:07.140
to do full self driving,
link |
01:21:08.620
or do we need another order of magnitude?
link |
01:21:10.820
I think nobody actually knows the answer to that question.
link |
01:21:13.300
I like the confidence that Elon has, but.
link |
01:21:16.260
Yeah, we'll see.
link |
01:21:17.820
There's another funny thing is you don't learn to drive
link |
01:21:20.180
with infinite amounts of data.
link |
01:21:22.260
You learn to drive with an intellectual framework
link |
01:21:24.300
that understands physics and color and horizontal surfaces
link |
01:21:28.060
and laws and roads and all your experience
link |
01:21:33.980
from manipulating your environment.
link |
01:21:36.700
Like, look, there's so many factors go into that.
link |
01:21:39.020
So then when you learn to drive,
link |
01:21:40.660
like driving is a subset of this conceptual framework
link |
01:21:44.380
that you have, right?
link |
01:21:46.300
And so with self driving cars right now,
link |
01:21:48.580
we're teaching them to drive with driving data.
link |
01:21:51.540
You never teach a human to do that.
link |
01:21:53.580
You teach a human all kinds of interesting things,
link |
01:21:55.780
like language, like don't do that, watch out.
link |
01:21:59.340
There's all kinds of stuff going on.
link |
01:22:01.020
Well, this is where, I think the previous time
link |
01:22:02.900
we talked, you poetically disagreed
link |
01:22:07.300
with my naive notion about humans.
link |
01:22:10.300
I just think that humans will make
link |
01:22:13.700
this whole driving thing really difficult.
link |
01:22:15.700
Yeah, all right.
link |
01:22:17.180
I said, humans don't move that slow.
link |
01:22:19.460
It's a ballistics problem.
link |
01:22:20.820
It's a ballistics, humans are a ballistics problem,
link |
01:22:22.700
which is like poetry to me.
link |
01:22:24.060
It's very possible that in driving
link |
01:22:26.180
they're indeed purely a ballistics problem.
link |
01:22:28.460
And I think that's probably the right way to think about it.
link |
01:22:30.860
But I still, they still continue to surprise me,
link |
01:22:34.420
those damn pedestrians, the cyclists,
link |
01:22:36.940
other humans in other cars and.
link |
01:22:39.340
Yeah, but it's gonna be one of these compensating things.
link |
01:22:41.180
So like when you're driving,
link |
01:22:43.980
you have an intuition about what humans are going to do,
link |
01:22:46.860
but you don't have 360 cameras and radars
link |
01:22:49.660
and you have an attention problem.
link |
01:22:51.140
So the self driving car comes in with no attention problem,
link |
01:22:55.100
360 cameras right now, a bunch of other features.
link |
01:22:58.780
So they'll wipe out a whole class of accidents, right?
link |
01:23:01.980
And emergency braking with radar
link |
01:23:05.780
and especially as it gets AI enhanced
link |
01:23:07.980
will eliminate collisions, right?
link |
01:23:10.940
But then you have the other problems
link |
01:23:12.060
of these unexpected things where
link |
01:23:13.860
you think your human intuition is helping,
link |
01:23:15.600
but then the cars also have a set of hardware features
link |
01:23:19.580
that you're not even close to.
link |
01:23:21.500
And the key thing of course is if you wipe out
link |
01:23:25.380
a huge number of kinds of accidents,
link |
01:23:27.020
then it might be just way safer than a human driver,
link |
01:23:30.240
even though, even if humans are still a problem,
link |
01:23:32.980
that's hard to figure out.
link |
01:23:34.740
Yeah, that's probably what will happen.
link |
01:23:36.180
Those autonomous cars will have a small number of accidents
link |
01:23:38.820
humans would have avoided, but they'll wipe,
link |
01:23:41.060
they'll get rid of the bulk of them.
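As a toy illustration of that argument, here's the arithmetic with loudly made-up rates: if the machine removes a large share of accident classes while adding a few novel failures, the net can still come out well ahead. None of these numbers are measured statistics:

```python
# Illustrative arithmetic only; every rate below is an assumption,
# not a measured statistic. The point: wiping out whole accident
# classes can dominate even if the machine adds novel failures.
human_rate = 1.0      # accidents per million miles (assumed baseline)
share_wiped = 0.8     # share of accidents the sensors/attention remove (assumed)
novel_rate = 0.05     # new machine-only failures per million miles (assumed)

machine_rate = human_rate * (1 - share_wiped) + novel_rate
print(f"human:   {human_rate:.2f} per million miles")
print(f"machine: {machine_rate:.2f} per million miles "
      f"({human_rate / machine_rate:.1f}x safer under these assumptions)")
```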
link |
01:23:43.840
What do you think about, like, Tesla's Dojo efforts,
link |
01:23:48.660
or, it could be bigger than just Tesla, in general.
link |
01:23:51.140
It's kind of like Tenstorrent trying to innovate,
link |
01:23:55.140
like this is the dichotomy, like should a company
link |
01:23:58.160
try to build, from scratch, its own
link |
01:24:00.380
neural network training hardware?
link |
01:24:03.180
Well, first of all, I think it's great.
link |
01:24:04.260
So we need lots of experiments, right?
link |
01:24:06.840
And there's lots of startups working on this
link |
01:24:09.460
and they're pursuing different things.
link |
01:24:11.580
I was there when we started Dojo, and it was sort of like,
link |
01:24:14.580
what's the unconstrained computer solution
link |
01:24:17.980
to go do very large training problems?
link |
01:24:21.760
And then there's fun stuff like, we said,
link |
01:24:24.520
well, we have this 10,000 watt board to cool.
link |
01:24:27.220
Well, you go talk to guys at SpaceX
link |
01:24:29.140
and they think 10,000 watts is a really small number,
link |
01:24:31.200
not a big number.
link |
01:24:32.740
And there's brilliant people working on it.
link |
01:24:35.300
I'm curious to see how it'll come out.
link |
01:24:37.300
I couldn't tell you, I know it pivoted
link |
01:24:39.840
a few times since I left, so.
link |
01:24:41.660
So the cooling does seem to be a big problem.
link |
01:24:44.540
I do like what Elon said about it, which is like,
link |
01:24:47.640
we don't wanna do the thing unless it's way better
link |
01:24:50.380
than the alternative, whatever the alternative is.
link |
01:24:52.980
So it has to be way better than like racks or GPUs.
link |
01:24:57.620
Yeah, and the other thing is just like,
link |
01:25:00.100
you know, the Tesla autonomous driving hardware,
link |
01:25:03.900
it was only serving one software stack.
link |
01:25:06.620
And the hardware team and the software team
link |
01:25:08.040
were tightly coupled.
link |
01:25:09.880
You know, if you're building a general purpose AI solution,
link |
01:25:12.160
then you know, there's so many different customers
link |
01:25:14.280
with so many different needs.
link |
01:25:16.420
Now, something Andrej said, I think this is amazing.
link |
01:25:19.780
10 years ago, like vision, recommendation, language,
link |
01:25:24.660
were completely different disciplines.
link |
01:25:27.140
He said, the people literally couldn't talk to each other.
link |
01:25:29.740
And three years ago, it was all neural networks,
link |
01:25:32.580
but they were very different neural networks.
link |
01:25:34.860
And recently, it's converging on one set of networks.
link |
01:25:37.740
They vary a lot in size, obviously, they vary in data,
link |
01:25:40.460
vary in outputs, but the technology has converged
link |
01:25:43.820
a good bit.
link |
01:25:44.780
Yeah, these transformers behind GPT3,
link |
01:25:47.420
it seems like they could be applied to video,
link |
01:25:48.980
they could be applied to a lot of, and it's like,
link |
01:25:51.020
and they're all really simple.
link |
01:25:52.500
And it was like they literally replace letters with pixels.
link |
01:25:56.380
It does vision, it's amazing.
link |
01:25:58.780
And then size actually improves the thing.
link |
01:26:02.100
So the bigger it gets, the more compute you throw at it,
link |
01:26:04.420
the better it gets.
link |
01:26:05.660
And the more data you have, the better it gets.
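That "replace letters with pixels" line describes, roughly, how Vision-Transformer-style models tokenize images: cut the image into patches and embed each patch the way a word token would be. A small illustrative sketch, not any production model:

```python
# "Replace letters with pixels": a Vision-Transformer-style tokenizer.
# An image is cut into patches, and each patch is linearly embedded
# exactly the way a word/letter token would be. Illustrative only.
import torch
import torch.nn as nn

patch, dim = 16, 128
img = torch.randn(1, 3, 224, 224)                 # one RGB image

# unfold into (batch, num_patches, patch_pixels): 14x14 = 196 "tokens"
patches = img.unfold(2, patch, patch).unfold(3, patch, patch)
patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, 196, 3 * patch * patch)

embed = nn.Linear(3 * patch * patch, dim)         # same role as a word embedding
tokens = embed(patches)                           # (1, 196, 128)

# from here, a stock transformer processes the image like a sentence
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True),
    num_layers=2,
)
out = encoder(tokens)                             # (1, 196, 128)
```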
link |
01:26:08.320
So then you start to wonder, well,
link |
01:26:11.220
is that a fundamental thing?
link |
01:26:12.540
Or is this just another step to some fundamental understanding
link |
01:26:16.580
about this kind of computation?
link |
01:26:18.820
Which is really interesting.
link |
01:26:20.300
Us humans don't want to believe that that kind of thing
link |
01:26:22.260
will achieve conceptual understandings, you were saying,
link |
01:26:24.420
like you'll figure out physics, but maybe it will.
link |
01:26:27.000
Maybe.
link |
01:26:27.840
Maybe it will.
link |
01:26:29.360
Well, it's worse than that.
link |
01:26:31.060
It'll understand physics in ways that we can't understand.
link |
01:26:33.780
I like your Stephen Wolfram talk where he said,
link |
01:26:36.340
you know, there's three generations of physics.
link |
01:26:38.020
There was physics by reasoning.
link |
01:26:40.100
Well, big things should fall faster than small things,
link |
01:26:42.620
right?
link |
01:26:43.460
That's reasoning.
link |
01:26:44.280
And then there's physics by equations.
link |
01:26:46.940
Like, you know, but the number of problems in the world
link |
01:26:49.620
that are solved with a single equation is relatively low.
link |
01:26:51.980
Almost all programs have, you know,
link |
01:26:53.660
more than one line of code, maybe 100 million lines of code.
link |
01:26:56.860
So he said, now we're going to physics by computation,
link |
01:26:59.980
which is his project, which is cool.
link |
01:27:02.580
I might point out there was two generations of physics
link |
01:27:07.260
before reasoning happened.
link |
01:27:10.240
Like all animals, you know, know things fall
link |
01:27:12.360
and, you know, birds fly and, you know, predators know
link |
01:27:15.300
how to, you know, solve a differential equation
link |
01:27:17.360
to cut off an accelerating, you know, curving animal path.
link |
01:27:22.360
And then there was, you know, the gods did it, right?
link |
01:27:28.400
So, right.
link |
01:27:29.560
So there was, you know, there's five generations.
link |
01:27:31.620
Now, Software 2.0 says programming things
link |
01:27:35.960
is not the last step.
link |
01:27:38.320
Data.
link |
01:27:39.160
So there's going to be a physics past Stephen Wolfram's computation.
link |
01:27:44.060
That's not explainable to us humans.
link |
01:27:47.520
And actually there's no reason that I can see
link |
01:27:51.060
well that even that's the limit.
link |
01:27:53.280
Like, there's something beyond that.
link |
01:27:55.600
I mean, usually, like, when you have
link |
01:27:57.080
this hierarchy, it's not like, well, if you have this step
link |
01:27:59.620
and this step and this step and they're all qualitatively
link |
01:28:01.840
different and conceptually different, it's not obvious why,
link |
01:28:05.100
you know, six is the right number of hierarchy steps
link |
01:28:07.360
and not seven or eight or.
link |
01:28:09.200
Well, then it's probably impossible for us
link |
01:28:12.120
to comprehend something that's beyond the thing
link |
01:28:15.920
that's not explainable.
link |
01:28:18.280
Yeah.
link |
01:28:19.800
But the thing that, you know, understands the thing
link |
01:28:21.760
that's not explainable to us will conceive the next one.
link |
01:28:25.120
And like, I'm not sure why there's a limit to it.
link |
01:28:30.920
I think my brain hurts.
link |
01:28:31.760
That's a sad story.
link |
01:28:34.840
If we look at our own brain, which is an interesting
link |
01:28:38.560
illustrative example, in your work with Tenstorrent
link |
01:28:42.600
and trying to design deep learning architectures,
link |
01:28:46.160
do you think about the brain at all?
link |
01:28:50.080
Maybe from a hardware designer perspective,
link |
01:28:53.500
if you could change something about the brain,
link |
01:28:56.240
what would you change or do?
link |
01:28:58.200
Funny question.
link |
01:29:00.120
Like, how would you do it?
link |
01:29:00.960
So your brain is really weird.
link |
01:29:02.380
Like, you know, your cerebral cortex where we think
link |
01:29:04.440
we do most of our thinking is what,
link |
01:29:06.400
like six or seven neurons thick?
link |
01:29:08.660
Yeah.
link |
01:29:09.500
Like, that's weird.
link |
01:29:10.320
Like all the big networks are way bigger than that.
link |
01:29:13.240
Like way deeper.
link |
01:29:14.360
So that seems odd.
link |
01:29:16.200
And then, you know, when you're thinking if it's,
link |
01:29:19.200
if the input generates a result you can use,
link |
01:29:21.840
it goes really fast.
link |
01:29:22.840
But if it can't, that generates an output
link |
01:29:25.280
that's interesting, which turns into an input
link |
01:29:27.120
and then your brain, to the point where you mull things
link |
01:29:29.840
over for days and how many trips
link |
01:29:31.560
through your brain is that, right?
link |
01:29:33.440
Like it's, you know, 300 milliseconds or something
link |
01:29:36.120
to get through seven levels of neurons.
link |
01:29:37.880
I forget the number exactly.
link |
01:29:39.880
But then it does it over and over and over as it searches.
link |
01:29:43.320
And the brain clearly looks like some kind of graph
link |
01:29:46.160
because you have a neuron with connections
link |
01:29:48.200
and it talks to other ones
link |
01:29:49.240
and it's locally very computationally intense,
link |
01:29:52.400
but it also does sparse computations
link |
01:29:55.520
across a pretty big area.
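One toy way to picture "only seven neurons thick, but run over and over": a shallow network whose output is fed back in as the next input, so effective depth comes from iterating in time. A pure illustration with arbitrary sizes and thresholds, not a neuroscience model:

```python
# Toy version of "seven layers deep, but run over and over": a shallow
# network whose output becomes its next input, so effective depth comes
# from iteration in time. Pure illustration, not a brain model.
import torch
import torch.nn as nn

shallow = nn.Sequential(*[
    layer for _ in range(7) for layer in (nn.Linear(64, 64), nn.Tanh())
])

def think(x, max_passes=20, threshold=0.05):
    for n in range(1, max_passes + 1):
        new_x = shallow(x)
        if (new_x - x).abs().mean() < threshold:  # answer "settled": stop early
            break
        x = new_x                                 # interesting output becomes new input
    return x, n

x0 = torch.randn(1, 64)
answer, passes = think(x0)
print(f"settled after {passes} passes through the 7-layer stack")
```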
link |
01:29:57.840
There's a lot of messy biological type of things
link |
01:30:00.680
and by messy I mean, like, first of all,
link |
01:30:03.760
there's mechanical, chemical and electrical signals.
link |
01:30:06.040
All of that's going on.
link |
01:30:07.480
Then there's the asynchronicity of signals.
link |
01:30:12.400
And there's like, there's just a lot of variability
link |
01:30:14.720
that seems continuous and messy
link |
01:30:16.520
and just the mess of biology.
link |
01:30:18.600
And it's unclear whether that's a good thing
link |
01:30:22.640
or it's a bad thing, because if it's a good thing
link |
01:30:26.320
then we need to run the entirety of evolution,
link |
01:30:29.240
well, we're gonna have to start with basic bacteria
link |
01:30:31.560
to create something.
link |
01:30:32.400
So imagine we could control,
link |
01:30:34.000
you could build a brain with 10 layers.
link |
01:30:35.640
Would that be better or worse?
link |
01:30:37.360
Or more connections or less connections,
link |
01:30:39.800
or we don't know to what level our brains are optimized.
link |
01:30:44.240
But if I was changing things,
link |
01:30:45.480
like you can only hold like seven numbers in your head.
link |
01:30:49.360
Like why not a hundred or a million?
link |
01:30:51.840
Never thought of that.
link |
01:30:53.680
And why can't we have like a floating point processor
link |
01:30:56.800
that can compute anything we want
link |
01:30:59.560
and see it all properly?
link |
01:31:01.240
Like that would be kind of fun.
link |
01:31:03.120
And why can't we see in four or eight dimensions?
link |
01:31:05.760
Because 3D is kind of a drag.
link |
01:31:10.040
Like all the hard math transforms
link |
01:31:11.600
are up in multiple dimensions.
link |
01:31:13.960
So you could imagine a brain architecture
link |
01:31:16.560
that you could enhance with a whole bunch of features
link |
01:31:21.120
that would be really useful for thinking about things.
link |
01:31:24.440
It's possible that the limitations you're describing
link |
01:31:26.880
are actually essential for like the constraints
link |
01:31:29.880
are essential for creating like the depth of intelligence.
link |
01:31:34.000
Like that, the ability to reason.
link |
01:31:38.360
It's hard to say
link |
01:31:39.200
because like your brain is clearly a parallel processor.
link |
01:31:44.360
10 billion neurons talking to each other
link |
01:31:46.200
at a relatively low clock rate.
link |
01:31:48.440
But it produces something
link |
01:31:50.480
that looks like a serial thought process.
link |
01:31:52.640
It's a serial narrative in your head.
link |
01:31:54.720
That's true.
link |
01:31:55.560
But then there are people famously who are visual thinkers.
link |
01:31:59.040
Like I think I'm a relatively visual thinker.
link |
01:32:02.320
I can imagine any object and rotate it in my head
link |
01:32:05.120
and look at it.
link |
01:32:06.440
And there are people who say
link |
01:32:07.360
they don't think that way at all.
link |
01:32:09.640
And recently I read an article about people
link |
01:32:12.440
who say they don't have a voice in their head.
link |
01:32:16.240
They can talk.
link |
01:32:18.520
But when they, you know, it's like,
link |
01:32:19.880
well, what are you thinking?
link |
01:32:21.040
No, they'll describe something that's visual.
link |
01:32:24.400
So that's curious.
link |
01:32:26.480
Now, if you're saying,
link |
01:32:31.760
if we dedicated more hardware to holding information,
link |
01:32:34.960
like, you know, 10 numbers or a million numbers,
link |
01:32:37.960
like would that distract us from our ability
link |
01:32:41.680
to form this kind of singular identity?
link |
01:32:44.760
Like it dissipates somehow.
link |
01:32:46.960
But maybe, you know, future humans
link |
01:32:49.400
will have many identities
link |
01:32:50.720
that have some higher level organization
link |
01:32:53.120
but can actually do lots more things in parallel.
link |
01:32:55.620
Yeah, there's no reason, if we're thinking modularly,
link |
01:32:57.880
there's no reason we can't have multiple consciousnesses
link |
01:33:00.280
in one brain.
link |
01:33:01.520
Yeah, and maybe there's some way to make it faster
link |
01:33:03.720
so that the, you know, the area of the computation
link |
01:33:07.920
could still have a unified feel to it
link |
01:33:13.240
while still having way more ability
link |
01:33:15.720
to do parallel stuff at the same time.
link |
01:33:17.600
Could definitely be improved.
link |
01:33:19.040
Could be improved?
link |
01:33:20.040
Yeah.
link |
01:33:20.860
Okay, well, it's pretty good right now.
link |
01:33:22.920
Actually, people don't give it enough credit.
link |
01:33:24.680
The thing is pretty nice.
link |
01:33:25.880
The, you know, the fact that the rough edges
link |
01:33:29.240
seem to give a nice, like,
link |
01:33:32.920
spark of beauty to the whole experience.
link |
01:33:37.920
I don't know.
link |
01:33:38.760
I don't know if it can be improved easily.
link |
01:33:40.280
It could be more beautiful.
link |
01:33:42.480
I don't know how, I, what?
link |
01:33:44.320
What do you mean, what do you mean how?
link |
01:33:46.280
All the ways you can't imagine.
link |
01:33:48.280
No, but that's the whole point.
link |
01:33:49.500
I wouldn't be able to,
link |
01:33:51.080
the fact that I can imagine ways
link |
01:33:53.200
in which it could be more beautiful means.
link |
01:33:55.880
So do you know, you know, Iain Banks's stories?
link |
01:33:59.400
So the super smart AIs there live,
link |
01:34:03.600
mostly live in the world of what they call infinite fun
link |
01:34:07.540
because they can create arbitrary worlds.
link |
01:34:12.200
So they interact in, you know, the story has it.
link |
01:34:14.480
They interact in the normal world and they're very smart
link |
01:34:16.720
and they can do all kinds of stuff.
link |
01:34:18.560
And, you know, a given mind can, you know,
link |
01:34:20.420
talk to a million humans at the same time
link |
01:34:22.040
because we're very slow and for reasons,
link |
01:34:24.680
you know, artificial to the story,
link |
01:34:26.280
they're interested in people and doing stuff,
link |
01:34:28.240
but they mostly live in this other land of thinking.
link |
01:34:33.000
My inclination is to think that the ability
link |
01:34:36.520
to create infinite fun will not be so fun.
link |
01:34:41.200
That's sad.
link |
01:34:42.400
Well, there are so many things to do.
link |
01:34:43.800
Imagine being able to make a star, move planets around.
link |
01:34:47.600
Yeah, yeah, but because we can imagine that
link |
01:34:50.080
is why life is fun, if we actually were able to do it,
link |
01:34:53.360
it would be a slippery slope
link |
01:34:55.040
where fun wouldn't even have a meaning
link |
01:34:56.720
because we just consistently desensitize ourselves
link |
01:35:00.320
by the infinite amounts of fun we're having.
link |
01:35:04.120
And the sadness, the dark stuff is what makes it fun.
link |
01:35:07.480
I think that could be the Russian in you.
link |
01:35:10.440
It could be the fun makes it fun
link |
01:35:12.400
and the sadness makes it bittersweet.
link |
01:35:16.560
Yeah, that's true.
link |
01:35:17.400
Fun could be the thing that makes it fun.
link |
01:35:20.560
So what do you think about the expansion,
link |
01:35:22.560
not through the biology side,
link |
01:35:23.920
but through the BCI, the brain computer interfaces?
link |
01:35:27.220
Yeah, you got a chance to check out the Neuralink stuff.
link |
01:35:30.120
It's super interesting.
link |
01:35:31.520
Like humans like our thoughts to manifest as action.
link |
01:35:37.600
You know, like as a kid, you know,
link |
01:35:39.560
like shooting a rifle was super fun,
link |
01:35:41.720
driving a mini bike, doing things.
link |
01:35:44.320
And then computer games, I think,
link |
01:35:46.160
for a lot of kids became the thing
link |
01:35:47.920
where they can do what they want.
link |
01:35:50.360
They can fly a plane, they can do this, they can do this.
link |
01:35:53.600
But you have to have this physical interaction.
link |
01:35:55.860
Now imagine, you could just imagine stuff and it happens.
link |
01:36:03.280
Like really richly and interestingly.
link |
01:36:06.620
Like we kind of do that when we dream.
link |
01:36:08.080
Like dreams are funny because like if you have some control
link |
01:36:12.040
or awareness in your dreams,
link |
01:36:13.520
like it's very realistic looking,
link |
01:36:16.380
or not realistic looking, it depends on the dream.
link |
01:36:19.420
But you can also manipulate that.
link |
01:36:22.500
And you know, what's possible there is odd.
link |
01:36:26.220
And the fact that nobody understands it, it's hilarious, but.
link |
01:36:29.860
Do you think it's possible to expand
link |
01:36:31.780
that capability through computing?
link |
01:36:34.060
Sure.
link |
01:36:35.340
Is there some interesting,
link |
01:36:36.500
so from a hardware designer perspective,
link |
01:36:38.420
is there, do you think it'll present totally new challenges
link |
01:36:41.660
in the kind of hardware required that like,
link |
01:36:44.100
so this hardware isn't standalone computing.
link |
01:36:47.740
Well, it's, it's working with the brain.
link |
01:36:49.540
So today, computer games are rendered by GPUs.
link |
01:36:52.860
Right.
link |
01:36:53.700
Right, so, but you've seen the GAN stuff, right?
link |
01:36:56.840
Where trained neural networks render realistic images,
link |
01:37:00.900
but there's no pixels, no triangles, no shaders,
link |
01:37:03.740
no light maps, no nothing.
link |
01:37:05.400
So the future of graphics is probably AI, right?
link |
01:37:09.540
Yes.
link |
01:37:10.380
AI is heavily trained by lots of real data, right?
link |
01:37:14.820
So if you have an interface with an AI renderer, right?
link |
01:37:20.340
So if you say render a cat, it won't say,
link |
01:37:23.420
well, how tall's the cat and how big is it,
link |
01:37:25.060
you know, it'll render a cat.
link |
01:37:26.260
And you might say, oh, a little bigger, a little smaller,
link |
01:37:28.220
you know, make it a tabby, shorter hair.
link |
01:37:31.060
You know, like you could tweak it.
link |
01:37:32.900
Like the amount of data you'll have to send
link |
01:37:36.500
to interact with a very powerful AI renderer
link |
01:37:40.120
could be low.
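That bandwidth claim is easy to sanity-check with back-of-the-envelope arithmetic: a high-level command to a generative renderer is tens of bytes, while the uncompressed frames it replaces are megabytes each. The numbers below are illustrative assumptions (1080p, 60 fps):

```python
# Back-of-the-envelope: bytes sent to an AI renderer vs. raw pixels.
command = "render a tabby cat, a little bigger, shorter hair"
command_bytes = len(command.encode("utf-8"))   # roughly 50 bytes

frame_bytes = 1920 * 1080 * 3                  # one uncompressed 1080p RGB frame
per_second = frame_bytes * 60                  # at 60 frames per second

print(f"command:    {command_bytes} bytes")
print(f"one frame:  {frame_bytes / 1e6:.1f} MB")
print(f"one second: {per_second / 1e9:.2f} GB/s uncompressed")
print(f"ratio:      ~{frame_bytes // command_bytes:,}x")
```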
link |
01:37:41.420
But the question is brain computer interfaces
link |
01:37:44.780
would need to render not onto a screen,
link |
01:37:47.860
but render onto the brain, like directly,
link |
01:37:51.980
so there's a bandwidth question.
link |
01:37:52.820
Well, it could do it both ways.
link |
01:37:53.880
I mean, our eyes are really good sensors.
link |
01:37:56.020
They could render onto a screen
link |
01:37:58.580
and we could feel like we're participating in it.
link |
01:38:01.100
You know, they're gonna have, you know,
link |
01:38:03.360
like the Oculus kind of stuff.
link |
01:38:04.860
It's gonna be so good that when it projects to your eyes,
link |
01:38:07.020
you think it's real.
link |
01:38:08.040
You know, they're slowly solving those problems.
link |
01:38:12.520
And I suspect when the renderer of that information
link |
01:38:17.240
into your head is also AI mediated,
link |
01:38:19.760
they'll be able to give you the cues that, you know,
link |
01:38:23.520
you really want for depth and all kinds of stuff.
link |
01:38:27.280
Like your brain is partly faking your visual field, right?
link |
01:38:30.920
Like your eyes are twitching around,
link |
01:38:32.680
but you don't notice that.
link |
01:38:33.800
Occasionally they blank, you don't notice that.
link |
01:38:36.520
You know, there's all kinds of things.
link |
01:38:37.800
Like you think you see over here,
link |
01:38:39.160
but you don't really see there.
link |
01:38:40.840
It's all fabricated.
link |
01:38:42.200
Yeah, peripheral vision is fascinating.
link |
01:38:45.520
So if you have an AI renderer that's trained
link |
01:38:48.560
to understand exactly how you see
link |
01:38:51.700
and the kind of things that enhance the realism
link |
01:38:54.760
of the experience, it could be super real actually.
link |
01:39:01.160
So I don't know what the limits to that are,
link |
01:39:03.520
but obviously if we have a brain interface
link |
01:39:06.960
that goes inside your visual cortex
link |
01:39:10.480
in a better way than your eyes do, which is possible,
link |
01:39:13.480
it's a lot of neurons, maybe that'll be even cooler.
link |
01:39:19.800
Well, the really cool thing is that it has to do
link |
01:39:21.600
with the infinite fun that you were referring to,
link |
01:39:24.240
which is our brains seem to be very limited.
link |
01:39:26.640
And like you said, computations.
link |
01:39:28.360
It's also very plastic.
link |
01:39:29.920
Very plastic, yeah.
link |
01:39:30.920
Yeah, so it's an interesting combination.
link |
01:39:33.640
The interesting open question is the limits
link |
01:39:37.480
of that neuroplasticity, like how flexible is that thing?
link |
01:39:42.320
Because we haven't really tested it.
link |
01:39:44.880
We know about that from the experiments
link |
01:39:46.240
where they put like a pressure pad on somebody's head
link |
01:39:49.120
and had a visual transducer pressurize it
link |
01:39:51.520
and somebody slowly learned to see.
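That experiment is sensory substitution: video gets downsampled to a grid of pressure actuators and the brain learns to interpret the pattern. The signal-processing half is simple; the grid size here is an arbitrary assumption:

```python
# Sensory substitution, the signal-processing half: downsample a camera
# frame to a coarse grid of pressure-actuator intensities. The brain does
# the hard half (learning to "see" it). 20x20 grid is an arbitrary choice.
import numpy as np

def frame_to_tactile(frame, grid=(20, 20)):
    # frame: (H, W) grayscale in [0, 1]; returns per-actuator pressures
    h, w = frame.shape
    gh, gw = grid
    # average-pool each cell of the image into one actuator
    trimmed = frame[: h - h % gh, : w - w % gw]
    cells = trimmed.reshape(gh, h // gh, gw, w // gw)
    return cells.mean(axis=(1, 3))

frame = np.random.rand(480, 640)          # stand-in for one camera frame
pressures = frame_to_tactile(frame)       # (20, 20) actuator drive levels
print(pressures.shape, pressures.min(), pressures.max())
```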
link |
01:39:53.440
Yep.
link |
01:39:55.880
Especially at a young age, if you throw a lot at it,
link |
01:39:58.720
like what can it, so can you like arbitrarily expand it
link |
01:40:05.920
with computing power?
link |
01:40:06.880
So connected to the internet directly somehow?
link |
01:40:09.880
Yeah, the answer's probably yes.
link |
01:40:11.960
So the problem with biology and ethics
link |
01:40:13.840
is like there's a mess there.
link |
01:40:15.560
Like us humans are perhaps unwilling to take risks
link |
01:40:21.840
into directions that are full of uncertainty.
link |
01:40:25.600
So it's like. No, no.
link |
01:40:26.440
90% of the population's unwilling to take risks.
link |
01:40:28.880
The other 10% is rushing into the risks
link |
01:40:31.360
unaided by any infrastructure whatsoever.
link |
01:40:34.400
And that's where all the fun happens in society.
link |
01:40:38.960
There's been huge transformations
link |
01:40:41.160
in the last couple thousand years.
link |
01:40:43.600
Yeah, it's funny.
link |
01:40:44.560
I got a chance to interact with this Matthew Johnson
link |
01:40:48.200
from Johns Hopkins.
link |
01:40:49.360
He's doing this large scale study of psychedelics.
link |
01:40:52.520
It's becoming more and more,
link |
01:40:54.240
I've gotten a chance to interact
link |
01:40:55.240
with that community of scientists working on psychedelics.
link |
01:40:57.760
But because of that, that opened the door to me
link |
01:41:00.080
to all these, what do they call it?
link |
01:41:02.740
Psychonauts, the people who, like you said,
link |
01:41:05.340
the 10% who are like, I don't care.
link |
01:41:08.000
I don't know if there's a science behind this.
link |
01:41:09.840
I'm taking this spaceship to,
link |
01:41:12.040
if I'm the first one on Mars, so be it.
link |
01:41:15.760
Psychedelics are interesting in the sense
link |
01:41:17.440
that in another dimension, like you said,
link |
01:41:21.400
it's a way to explore the limits of the human mind.
link |
01:41:25.440
Like, what is this thing capable of doing?
link |
01:41:28.240
Because you kind of, like when you dream, you detach it.
link |
01:41:31.440
I don't know exactly the neuroscience of it,
link |
01:41:33.080
but you detach your reality from what your mind,
link |
01:41:39.000
the images your mind is able to conjure up
link |
01:41:40.800
and your mind goes into weird places and entities appear.
link |
01:41:44.960
Somehow Freudian type of trauma
link |
01:41:48.800
is probably connected in there somehow,
link |
01:41:50.320
but you start to have these weird, vivid worlds that like.
link |
01:41:54.040
So do you actively dream?
link |
01:41:56.400
Do you, why not?
link |
01:41:59.060
I have like six hours of dreams a night.
link |
01:42:01.360
It's like really useful time.
link |
01:42:03.140
I know, I haven't, I don't for some reason.
link |
01:42:06.160
I just knock out and I have sometimes anxiety inducing
link |
01:42:11.040
kind of like very pragmatic nightmare type of dreams,
link |
01:42:16.680
but nothing fun, nothing.
link |
01:42:18.480
Nothing fun?
link |
01:42:19.320
Nothing fun.
link |
01:42:20.640
I try. I unfortunately mostly have fun
link |
01:42:24.640
in the waking world, which is very limited
link |
01:42:27.760
in the amount of fun you can have.
link |
01:42:30.040
It's not that limited either.
link |
01:42:31.240
Yeah, that's why.
link |
01:42:32.600
We'll have to talk.
link |
01:42:33.440
Yeah, I need instructions.
link |
01:42:36.840
Yeah.
link |
01:42:37.680
There's like a manual for that.
link |
01:42:38.680
You might wanna.
link |
01:42:41.040
I'll look it up.
link |
01:42:41.860
I'll ask Elon.
link |
01:42:42.700
What would you dream?
link |
01:42:44.720
You know, years ago when I read about, you know,
link |
01:42:47.120
like, you know, a book about how to, you know,
link |
01:42:51.360
become aware of your dreams.
link |
01:42:53.080
I worked on it for a while.
link |
01:42:54.320
Like there's this trick about, you know,
link |
01:42:55.980
imagine you can see your hands and look out
link |
01:42:58.280
and I got somewhat good at it.
link |
01:43:00.640
But mostly, when I'm thinking about things
link |
01:43:04.400
or working on problems, I prep myself before I go to sleep.
link |
01:43:09.040
It's like, I pull into my mind all the things
link |
01:43:13.160
I wanna work on or think about.
link |
01:43:15.440
And then that, let's say, greatly improves the chances
link |
01:43:19.840
that I'll work on that while I'm sleeping.
link |
01:43:23.400
And then I also, you know, basically ask to remember it.
link |
01:43:30.320
And I often remember very detailed.
link |
01:43:33.180
Within the dream.
link |
01:43:34.120
Yeah.
link |
01:43:34.960
Or outside the dream.
link |
01:43:35.780
Well, to bring it up in my dreaming
link |
01:43:37.840
and then to remember it when I wake up.
link |
01:43:41.020
It's just, it's more of a meditative practice.
link |
01:43:43.360
You say, you know, to prepare yourself to do that.
link |
01:43:48.920
Like if you go to, you know, to sleep,
link |
01:43:50.600
still gnashing your teeth about some random thing
link |
01:43:52.960
that happened that you're not that really interested in,
link |
01:43:55.800
you'll dream about it.
link |
01:43:57.960
That's really interesting.
link |
01:43:58.840
Maybe.
link |
01:43:59.680
But you can direct your dreams somewhat by prepping.
link |
01:44:04.440
Yeah, I'm gonna have to try that.
link |
01:44:05.480
It's really interesting.
link |
01:44:06.400
Like the most important, the interesting,
link |
01:44:08.480
not like what did this guy send in an email
link |
01:44:12.240
kind of like stupid worry stuff,
link |
01:44:14.080
but like fundamental problems
link |
01:44:15.240
you're actually concerned about.
link |
01:44:16.320
Yeah.
link |
01:44:17.160
And interesting things you're worried about.
link |
01:44:18.200
Or books you're reading or, you know,
link |
01:44:20.040
some great conversation you had
link |
01:44:21.360
or some adventure you want to have.
link |
01:44:23.480
Like there's a lot of space there.
link |
01:44:28.880
And it seems to work that, you know,
link |
01:44:32.520
my percentage of interesting dreams and memories went up.
link |
01:44:36.440
Is there, is that the source of,
link |
01:44:40.440
if you were able to deconstruct like
link |
01:44:42.280
where some of your best ideas came from,
link |
01:44:45.720
is there a process that's at the core of that?
link |
01:44:49.400
Like, so some people, you know, walk and think,
link |
01:44:52.420
some people like in the shower, the best ideas hit them.
link |
01:44:55.160
If you talk about, like, Newton, the apple hitting him on the head.
link |
01:44:58.560
No, I found out a long time ago,
link |
01:45:01.080
I process things somewhat slowly.
link |
01:45:03.200
So like in college, I had friends who could study
link |
01:45:05.680
at the last minute, get an A the next day.
link |
01:45:07.520
I can't do that at all.
link |
01:45:09.060
So I always front loaded all the work.
link |
01:45:10.920
Like I do all the problems early, you know,
link |
01:45:14.160
for finals, like the last three days,
link |
01:45:15.800
I wouldn't look at a book because I want, you know,
link |
01:45:18.840
cause like a new fact the day before finals may screw up
link |
01:45:22.200
my understanding of what I thought I knew.
link |
01:45:23.880
So my goal was to always get it in and give it time to soak.
link |
01:45:29.880
And I used to, you know,
link |
01:45:32.060
I remember when we were doing like 3D calculus,
link |
01:45:33.780
I would have these amazing dreams of 3D surfaces
link |
01:45:36.280
with normals, you know, calculating the gradient.
link |
01:45:38.560
And it's just like all come up.
link |
01:45:40.160
So it was like really fun, like very visual.
link |
01:45:43.920
And if I got cycles of that, that was useful.
link |
01:45:48.520
And the other is, don't over filter your ideas.
link |
01:45:50.960
Like I like that process of brainstorming
link |
01:45:54.520
where lots of ideas can happen.
link |
01:45:55.640
I like people who have lots of ideas.
link |
01:45:57.360
But then there's a, yeah, I'll let them sit
link |
01:46:00.240
and let it breathe a little bit
link |
01:46:02.560
and then reduce it to practice.
link |
01:46:04.960
Like at some point you really have to, does it really work?
link |
01:46:09.920
Like, you know, is this real or not, right?
link |
01:46:13.360
But you have to do both.
link |
01:46:15.020
There's creative tension there.
link |
01:46:16.160
Like how do you be both open and, you know, precise?
link |
01:46:20.480
Have you had ideas that you just,
link |
01:46:22.280
that sit in your mind for like years before the?
link |
01:46:26.120
Sure.
link |
01:46:27.640
It's an interesting way to just generate ideas
link |
01:46:31.760
and just let them sit, let them sit there for a while.
link |
01:46:35.080
I think I have a few of those ideas.
link |
01:46:38.480
You know, that was so funny.
link |
01:46:40.160
Yeah, I think that's, you know,
link |
01:46:42.440
creativity this one or something.
link |
01:46:45.740
For the slow thinkers in the room, I suppose.
link |
01:46:49.380
As I, some people, like you said, are just like, like the.
link |
01:46:53.300
Yeah, it's really interesting.
link |
01:46:54.840
There's so much diversity in how people think.
link |
01:46:57.680
You know, how fast or slow they are,
link |
01:46:59.320
how well they remember or don't.
link |
01:47:01.660
Like, you know, I'm not super good at remembering facts,
link |
01:47:04.040
but processes and methods.
link |
01:47:06.440
Like in our engineering, I went to Penn State
link |
01:47:08.040
and almost all our engineering tests were open book.
link |
01:47:11.860
I could remember the page and not the formula.
link |
01:47:14.800
But as soon as I saw the formula,
link |
01:47:15.920
I could remember the whole method if I'd learned it.
link |
01:47:19.720
Yeah.
link |
01:47:20.560
So it's just a funny, where some people could, you know,
link |
01:47:23.480
I'd watch friends like flipping through the book,
link |
01:47:25.580
trying to find the formula,
link |
01:47:27.440
even knowing that they'd done just as much work.
link |
01:47:30.080
And I would just open the book
link |
01:47:31.240
and I was on page 27, about halfway down,
link |
01:47:33.680
I could see the whole thing visually.
link |
01:47:35.960
Yeah.
link |
01:47:36.800
And, you know.
link |
01:47:37.640
And you have to learn that about yourself
link |
01:47:39.040
and figure out how you function optimally.
link |
01:47:41.480
I had a friend who was always concerned
link |
01:47:43.320
he didn't know how he came up with ideas.
link |
01:47:45.760
He had lots of ideas, but he said they just sort of popped up.
link |
01:47:49.160
Like, you'd be working on something, you have this idea,
link |
01:47:51.080
like, where does it come from?
link |
01:47:53.360
But you can have more awareness of it.
link |
01:47:54.840
Like, how your brain works is a little murky
link |
01:47:59.760
as you go down from the voice in your head
link |
01:48:01.600
or the obvious visualizations.
link |
01:48:03.920
Like, when you visualize something, how does that happen?
link |
01:48:06.580
Yeah, that's right.
link |
01:48:07.420
You know, if I say, you know, visualize a volcano,
link |
01:48:09.080
it's easy to do, right?
link |
01:48:10.320
And what does it actually look like when you visualize it?
link |
01:48:12.560
I can visualize to the point where I don't see very much
link |
01:48:14.880
out of my eyes and I see the colors
link |
01:48:16.280
of the thing I'm visualizing.
link |
01:48:18.280
Yeah, but there's a shape, there's a texture,
link |
01:48:20.600
there's a color, but there's also conceptual visualization.
link |
01:48:23.160
Like, what are you actually visualizing
link |
01:48:25.720
when you're visualizing a volcano?
link |
01:48:27.240
Just like with peripheral vision,
link |
01:48:28.480
you think you see the whole thing.
link |
01:48:29.720
Yeah, yeah, yeah, that's a good way to say it.
link |
01:48:31.840
You know, you have this kind of almost peripheral vision
link |
01:48:34.860
of your visualizations, they're like these ghosts.
link |
01:48:38.440
But if, you know, if you work on it,
link |
01:48:40.200
you can get a pretty high level of detail.
link |
01:48:42.320
And somehow you can walk along those visualizations
link |
01:48:44.400
and come up with an idea, which is weird.
link |
01:48:47.240
But when you're thinking about solving problems,
link |
01:48:50.940
like, you're putting information in,
link |
01:48:53.000
you're exercising the stuff you do know,
link |
01:48:55.760
you're sort of teasing the area that you don't understand
link |
01:48:59.400
and don't know, but you can almost, you know,
link |
01:49:02.240
feel, you know, that process happening.
link |
01:49:06.600
You know, that's how I, like,
link |
01:49:10.080
like, I know sometimes when I'm working really hard
link |
01:49:12.040
on something, like, I get really hot when I'm sleeping.
link |
01:49:14.920
And, you know, it's like, we got the blankets thrown,
link |
01:49:17.320
I wake up, all the blankets are on the floor.
link |
01:49:20.080
And, you know, every time it's, well,
link |
01:49:21.920
I wake up and think, wow, that was great.
link |
01:49:24.880
You know?
link |
01:49:25.720
Are you able to reverse engineer
link |
01:49:27.600
what the hell happened there?
link |
01:49:28.960
Well, sometimes it's vivid dreams
link |
01:49:30.360
and sometimes it's just kind of, like you say,
link |
01:49:32.500
like shadow thinking that you sort of have this feeling
link |
01:49:35.120
you're going through this stuff, but it's not that obvious.
link |
01:49:38.720
Isn't that so amazing that the mind
link |
01:49:40.320
just does all these little experiments?
link |
01:49:42.880
I never, you know, I always thought it's like a river
link |
01:49:46.040
that you can't, you're just there for the ride,
link |
01:49:48.160
but you're right, if you prep it.
link |
01:49:50.360
No, it's all understandable.
link |
01:49:52.400
Meditation really helps.
link |
01:49:53.720
You gotta start figuring out,
link |
01:49:55.160
you need to learn language of your own mind.
link |
01:49:59.320
And there's multiple levels of it, but.
link |
01:50:02.600
The abstractions again, right?
link |
01:50:04.040
It's somewhat comprehensible and observable
link |
01:50:06.700
and feelable or whatever the right word is.
link |
01:50:11.960
You know, you're not along for the ride.
link |
01:50:13.680
You are the ride.
link |
01:50:15.600
I have to ask you, hardware engineer,
link |
01:50:17.960
working on neural networks now, what's consciousness?
link |
01:50:21.420
What the hell is that thing?
link |
01:50:22.840
Is that just some little weird quirk
link |
01:50:25.960
of our particular computing device?
link |
01:50:29.280
Or is it something fundamental
link |
01:50:30.560
that we really need to crack open
link |
01:50:32.040
if we're to build good computers?
link |
01:50:36.560
Do you ever think about consciousness?
link |
01:50:37.940
Like why it feels like something to be?
link |
01:50:39.960
I know, it's really weird.
link |
01:50:42.640
So.
link |
01:50:43.680
Yeah.
link |
01:50:45.560
I mean, everything about it's weird.
link |
01:50:48.000
First, it's a half a second behind reality, right?
link |
01:50:51.340
It's a post hoc narrative about what happened.
link |
01:50:53.780
You've already done stuff
link |
01:50:56.520
by the time you're conscious of it.
link |
01:50:58.880
And your consciousness generally
link |
01:51:00.160
is a single threaded thing,
link |
01:51:01.240
but we know your brain is 10 billion neurons
link |
01:51:03.680
running some crazy parallel thing.
link |
01:51:07.980
And there's a really big sorting thing going on there.
link |
01:51:11.200
It also seems to be really reflective
link |
01:51:13.040
in the sense that you create a space in your head.
link |
01:51:18.000
Like we don't really see anything, right?
link |
01:51:19.640
Like photons hit your eyes,
link |
01:51:21.600
it gets turned into signals,
link |
01:51:22.840
it goes through multiple layers of neurons.
link |
01:51:26.600
I'm so curious that that looks glassy
link |
01:51:29.160
and that looks not glassy.
link |
01:51:30.480
Like how the resolution of your vision is so high
link |
01:51:33.520
you have to go through all this processing.
link |
01:51:36.080
Where for most of it, it looks nothing like vision.
link |
01:51:39.680
Like there's no theater in your mind, right?
link |
01:51:43.640
So we have a world in our heads.
link |
01:51:46.820
We're literally just isolated behind our sensors.
link |
01:51:51.740
But we can look at it, speculate about it,
link |
01:51:55.580
speculate about alternatives, problem solve, what if.
link |
01:52:00.240
There's so many things going on
link |
01:52:02.880
and that process is lagging reality.
link |
01:52:06.200
And it's single threaded
link |
01:52:07.580
even though the underlying thing is like massively parallel.
link |
01:52:10.460
So it's so curious.
link |
01:52:12.780
So imagine you're building an AI computer.
link |
01:52:14.520
If you wanted to replicate humans,
link |
01:52:16.380
well, you'd have huge arrays of neural networks
link |
01:52:18.380
and apparently only six or seven deep, which is hilarious.
link |
01:52:22.420
They don't even remember seven numbers,
link |
01:52:23.780
but I think we can upgrade that a lot, right?
link |
01:52:26.220
And then somewhere in there,
link |
01:52:28.240
you would train the network to create
link |
01:52:30.020
basically the world that you live in, right?
link |
01:52:32.860
So like tell stories to itself
link |
01:52:34.860
about the world that it's perceiving.
link |
01:52:36.800
Well, create the world, tell stories in the world
link |
01:52:40.820
and then have many dimensions of like side shows to it.
link |
01:52:47.660
Like we have an emotional structure,
link |
01:52:49.340
like we have a biological structure.
link |
01:52:51.500
And that seems hierarchical too.
link |
01:52:52.740
Like if you're hungry, it dominates your thinking.
link |
01:52:55.620
If you're mad, it dominates your thinking.
link |
01:52:59.220
And we don't know if that's important
link |
01:53:00.380
to consciousness or not,
link |
01:53:01.300
but it certainly disrupts, intrudes in the consciousness.
link |
01:53:05.740
Like so there's lots of structure to that.
link |
01:53:08.160
And we like to dwell on the past.
link |
01:53:09.880
We like to think about the future.
link |
01:53:11.280
We like to imagine, we like to fantasize, right?
link |
01:53:14.740
And the somewhat circular observation of that
link |
01:53:18.580
is the thing we call consciousness.
link |
01:53:21.760
Now, if you created a computer system
link |
01:53:23.340
and it did all those things, created worldviews,
link |
01:53:24.900
created the future, alternate histories,
link |
01:53:27.620
dwelled on past events, accurately or semi accurately.
link |
01:53:33.020
Would consciousness just spring up, like, naturally?
link |
01:53:35.380
Well, would that look and feel conscious to you?
link |
01:53:38.100
Like you seem conscious to me, but I don't know.
link |
01:53:39.940
From the external observer's sense.
link |
01:53:41.780
Do you think a thing that looks conscious is conscious?
link |
01:53:44.940
Like do you, again, this is like an engineering
link |
01:53:48.220
kind of question, I think, because like.
link |
01:53:53.900
I don't know.
link |
01:53:54.860
If we want to engineer consciousness,
link |
01:53:56.840
is it okay to engineer something
link |
01:53:58.300
that just looks conscious?
link |
01:54:00.740
Or is there a difference between something that is?
link |
01:54:02.660
Well, we evolved consciousness
link |
01:54:04.060
because it's a super effective way to manage our affairs.
link |
01:54:07.140
Yeah, there's a social element, yeah.
link |
01:54:09.020
Well, it gives us a planning system.
link |
01:54:11.540
We have a huge amount of stuff.
link |
01:54:13.280
Like when we're talking, like the reason
link |
01:54:15.220
we can talk really fast is we're modeling each other
link |
01:54:17.260
at a really high level of detail.
link |
01:54:19.100
And consciousness is required for that.
link |
01:54:21.340
Well, all those components together
link |
01:54:23.740
manifest consciousness, right?
link |
01:54:26.740
So if we make intelligent beings
link |
01:54:28.460
that we want to interact with that we're like
link |
01:54:30.820
wondering what they're thinking,
link |
01:54:32.860
looking forward to seeing them,
link |
01:54:35.140
when they interact with them, they're interesting,
link |
01:54:37.280
surprising, you know, fascinating, you know,
link |
01:54:41.460
they will probably feel conscious like we do
link |
01:54:43.500
and we'll perceive them as conscious.
link |
01:54:47.180
I don't know why not, but you never know.
link |
01:54:49.980
Another fun question on this,
link |
01:54:51.460
because from a computing perspective,
link |
01:54:55.020
we're trying to create something
link |
01:54:55.980
that's humanlike or superhumanlike.
link |
01:54:59.740
Let me ask you about aliens.
link |
01:55:01.280
Aliens.
link |
01:55:02.120
Do you think there's intelligent alien civilizations
link |
01:55:08.440
out there and do you think their technology,
link |
01:55:13.160
their computing, their AI bots,
link |
01:55:16.480
their chips are of the same nature as ours?
link |
01:55:21.280
Yeah, I've got no idea.
link |
01:55:23.120
I mean, if there are lots of aliens out there,
link |
01:55:25.000
they've been awfully quiet,
link |
01:55:27.320
you know, there's speculation about why.
link |
01:55:29.620
There seems to be more than enough planets out there.
link |
01:55:34.940
There's a lot.
link |
01:55:37.460
There's intelligent life on this planet
link |
01:55:38.980
that seems quite different, you know,
link |
01:55:40.500
like dolphins seem like plausibly understandable,
link |
01:55:44.580
octopuses don't seem understandable at all.
link |
01:55:47.620
If they lived longer than a year,
link |
01:55:48.820
maybe they would be running the planet.
link |
01:55:50.980
They seem really smart.
link |
01:55:52.700
And their neural architecture
link |
01:55:54.260
is completely different than ours.
link |
01:55:56.540
Now, who knows how they perceive things.
link |
01:55:58.700
I mean, that's the question is for us intelligent beings,
link |
01:56:01.180
we might not be able to perceive other kinds of intelligence
link |
01:56:03.620
if they become sufficiently different than us.
link |
01:56:05.580
Yeah, like we live in the current constrained world,
link |
01:56:08.940
you know, it's three dimensional geometry
link |
01:56:10.660
and the geometry defines a certain amount of physics.
link |
01:56:14.500
And, you know, there's how time seems to work.
link |
01:56:18.560
There's so many things that seem like
link |
01:56:21.100
a whole bunch of the input parameters to the, you know,
link |
01:56:23.500
another conscious being are the same.
link |
01:56:25.540
Yes, like if it's biological,
link |
01:56:28.180
biological things seem to be
link |
01:56:30.020
in a relatively narrow temperature range, right?
link |
01:56:32.940
Because, you know, organics aren't stable,
link |
01:56:35.620
too cold or too hot.
link |
01:56:37.740
Now, so if you specify the list of things that input to that,
link |
01:56:45.260
but as soon as we make really smart, you know, beings
link |
01:56:49.620
and they go figure out how to think
link |
01:56:51.140
about a billion numbers at the same time
link |
01:56:52.940
and how to think in N dimensions.
link |
01:56:56.060
There's a funny science fiction book
link |
01:56:57.340
where all the society had uploaded into this matrix.
link |
01:57:01.620
And at some point, some of the beings in the matrix thought,
link |
01:57:05.340
I wonder if there's intelligent life out there.
link |
01:57:07.900
So they had to do a whole bunch of work to figure out
link |
01:57:09.940
like how to make a physical thing
link |
01:57:12.380
because their matrix was self sustaining
link |
01:57:15.000
and they made a little spaceship
link |
01:57:16.140
and they traveled to another planet when they got there,
link |
01:57:18.540
there was like life running around,
link |
01:57:20.660
but there was no intelligent life.
link |
01:57:22.700
And then they figured out that there was this huge,
link |
01:57:26.260
you know, organic matrix all over the planet
link |
01:57:28.780
and inside it, intelligent beings
link |
01:57:30.540
had uploaded themselves into that matrix.
link |
01:57:34.960
So everywhere intelligent life was,
link |
01:57:38.220
soon as it got smart, it upleveled itself
link |
01:57:42.180
into something way more interesting than 3D geometry.
link |
01:57:45.180
Yeah, it escaped whatever this,
link |
01:57:47.100
not escaped, uplevel is better.
link |
01:57:49.780
The essence of what we think of as an intelligent being,
link |
01:57:53.180
I tend to like the thought experiment of the organism,
link |
01:57:58.100
like humans aren't the organisms.
link |
01:58:00.340
I like the notion of like Richard Dawkins and memes
link |
01:58:03.700
that ideas themselves are the organisms,
link |
01:58:07.980
like that are just using our minds to evolve.
link |
01:58:11.460
So like we're just like meat receptacles
link |
01:58:15.180
for ideas to breed and multiply and so on.
link |
01:58:18.140
And maybe those are the aliens.
link |
01:58:20.980
Yeah, so Jordan Peterson has a line that says,
link |
01:58:26.300
you know, you think you have ideas, but ideas have you.
link |
01:58:29.180
Yeah, good line.
link |
01:58:30.620
Which, and then we know about the phenomenon of groupthink
link |
01:58:34.220
and there's so many things that constrain us.
link |
01:58:37.940
But I think you can examine all that
link |
01:58:39.920
and not be completely owned by the ideas
link |
01:58:43.300
and completely sucked into groupthink.
link |
01:58:46.120
And part of your responsibility as a human
link |
01:58:49.820
is to escape that kind of phenomenon,
link |
01:58:51.740
which isn't, it's one of the creative tension things again,
link |
01:58:55.940
you're constructed by it, but you can still observe it
link |
01:58:59.500
and you can think about it and you can make choices
link |
01:59:01.820
about to some level, how constrained you are by it.
link |
01:59:06.940
And it's useful to do that.
link |
01:59:09.780
And, but at the same time, and it could be by doing that,
link |
01:59:17.380
you know, the group and society you're part of
link |
01:59:21.460
becomes collectively even more interesting.
link |
01:59:24.140
So, you know, so the outside observer will think,
link |
01:59:27.020
wow, you know, all these Lexes running around
link |
01:59:30.060
with all these really independent ideas
link |
01:59:31.540
have created something even more interesting
link |
01:59:33.700
in the aggregate.
link |
01:59:35.700
So, I don't know, those are lenses to look at the situation
link |
01:59:41.860
that'll give you some inspiration,
link |
01:59:43.500
but I don't think they're constraints.
link |
01:59:45.460
Right.
link |
01:59:46.660
As a small little quirk of history,
link |
01:59:49.340
it seems like you're related to Jordan Peterson,
link |
01:59:53.540
like you mentioned.
link |
01:59:54.740
He's going through some rough stuff now.
link |
01:59:57.620
Is there some comment you can make
link |
01:59:59.180
about the roughness of the human journey, the ups and downs?
link |
02:00:04.180
Well, I became an expert in benzo withdrawal,
link |
02:00:10.700
like, which is, you take benzodiazepines,
link |
02:00:13.540
and at some point they interact with GABA circuits,
link |
02:00:18.940
you know, to reduce anxiety and do a hundred other things.
link |
02:00:21.860
Like there's actually no known list of everything they do
link |
02:00:25.100
because they interact with so many parts of your body.
link |
02:00:28.180
And then once you're on them, you habituate to them
link |
02:00:30.460
and you have a dependency.
link |
02:00:32.580
It's not like a drug dependency
link |
02:00:34.180
where you're trying to get high.
link |
02:00:35.020
It's a metabolic dependency.
link |
02:00:38.820
And then if you discontinue them,
link |
02:00:42.580
there's a funny thing called kindling,
link |
02:00:45.340
which is if you stop them and then go,
link |
02:00:47.540
you know, you'll have horrible withdrawal symptoms.
link |
02:00:49.900
And if you go back on them at the same level,
link |
02:00:51.460
you won't be stable.
link |
02:00:53.260
And that unfortunately happened to him.
link |
02:00:55.820
Because it's so deeply integrated
link |
02:00:57.140
into all the kinds of systems in the body.
link |
02:00:58.860
It literally changes the size and numbers
link |
02:01:00.780
of neurotransmitter sites in your brain.
link |
02:01:03.820
So there's a process called the Ashton protocol
link |
02:01:07.340
where you taper it down slowly over two years
link |
02:01:10.300
and people who go through that go through unbelievable hell.
link |
02:01:13.660
And what Jordan went through seemed to be worse
link |
02:01:15.620
because, on the advice of doctors, it was, you know,
link |
02:01:18.460
we'll stop taking these and take this.
link |
02:01:20.260
It was a disaster.
link |
02:01:21.340
And he got some, yeah, it was pretty tough.
link |
02:01:26.620
He seems to be doing quite a bit better intellectually.
link |
02:01:29.180
You can see his brain clicking back together.
link |
02:01:32.020
I spent a lot of time with him.
link |
02:01:32.940
I've never seen anybody suffer so much.
link |
02:01:34.940
Well, his brain is also like this powerhouse, right?
link |
02:01:37.740
So I wonder, does a brain that's able to think deeply
link |
02:01:42.500
about the world suffer more through these kinds
link |
02:01:44.740
of withdrawals, like?
link |
02:01:46.220
I don't know.
link |
02:01:47.060
I've watched videos of people going through withdrawal.
link |
02:01:49.260
They all seem to suffer unbelievably.
link |
02:01:54.060
And, you know, my heart goes out to everybody.
link |
02:01:57.580
And there's some funny math about this.
link |
02:01:59.300
Some doctor said, as best he can tell, you know,
link |
02:02:01.980
there's the standard recommendations.
link |
02:02:03.620
Don't take them for more than a month
link |
02:02:04.820
and then taper over a couple of weeks.
link |
02:02:07.220
Many doctors prescribe them endlessly,
link |
02:02:09.380
which is against the protocol, but it's common, right?
link |
02:02:13.180
And then, when people taper,
link |
02:02:17.500
it's, you know, half the people have difficulty,
link |
02:02:19.900
but 75% get off okay.
link |
02:02:22.140
20% have severe difficulty
link |
02:02:24.020
and 5% have life threatening difficulty.
link |
02:02:27.300
And if you're one of those, it's really bad.
link |
02:02:29.580
And the stories that people have on this
link |
02:02:31.580
are heartbreaking and tough.
link |
02:02:34.980
So you put some of the fault on the doctors.
link |
02:02:36.860
Do they just not know what the hell they're doing?
link |
02:02:38.660
No, no, it's hard to say.
link |
02:02:40.580
It's one of those commonly prescribed things.
link |
02:02:43.140
Like one doctor said, what happens is,
link |
02:02:46.140
if you're prescribed them for a reason
link |
02:02:47.820
and then you have a hard time getting off,
link |
02:02:49.900
the protocol basically says you're either crazy
link |
02:02:52.420
or dependent and you get kind of pushed
link |
02:02:55.500
into a different treatment regime.
link |
02:02:58.380
You're a drug addict or a psychiatric patient.
link |
02:03:01.820
And so like one doctor said, you know,
link |
02:03:04.100
I prescribed them for 10 years thinking
link |
02:03:05.500
I was helping my patients
link |
02:03:06.580
and I realized I was really harming them.
link |
02:03:09.620
And you know, the awareness of that is slowly coming up.
link |
02:03:14.420
The fact that they're casually prescribed to people
link |
02:03:18.180
is horrible and it's bloody scary.
link |
02:03:23.780
And some people are stable on them,
link |
02:03:25.020
but they're on them for life.
link |
02:03:26.260
Like once you, you know, it's another one of those drugs.
link |
02:03:29.260
But benzos, long range, have real impacts on your personality.
link |
02:03:32.540
People talk about the benzo bubble
link |
02:03:34.140
where you get disassociated from reality
link |
02:03:36.300
and your friends a little bit.
link |
02:03:38.180
It's really terrible.
link |
02:03:40.340
The mind is terrifying.
link |
02:03:41.700
We were talking about the infinite possibility of fun,
link |
02:03:45.460
but like it's the infinite possibility of suffering too,
link |
02:03:48.660
which is one of the dangers of like expansion
link |
02:03:52.340
of the human mind.
link |
02:03:53.500
It's like, I wonder if all the possible experiences
link |
02:03:58.260
that an intelligent computer can have,
link |
02:04:01.740
is it mostly fun or is it mostly suffering?
link |
02:04:05.860
So like if you brute force expand the set of possibilities,
link |
02:04:11.380
like are you going to run into some trouble
link |
02:04:13.980
in terms of like torture and suffering and so on?
link |
02:04:16.580
Maybe our human brain is just protecting us
link |
02:04:18.900
from much more possible pain and suffering.
link |
02:04:22.300
Maybe the space of pain is like much larger
link |
02:04:25.980
than we could possibly imagine.
link |
02:04:27.540
And that.
link |
02:04:28.380
The world's in a balance.
link |
02:04:30.780
You know, all the literature on religion and stuff is,
link |
02:04:34.260
you know, the struggle between good and evil
link |
02:04:36.340
is balanced, very finely tuned,
link |
02:04:39.420
for reasons that are complicated.
link |
02:04:41.660
But that's a long philosophical conversation.
link |
02:04:44.900
Speaking of balance that's complicated,
link |
02:04:46.700
I wonder because we're living through
link |
02:04:48.460
one of the more important moments in human history
link |
02:04:51.620
with this particular virus.
link |
02:04:53.780
It seems like pandemics have at least the ability
link |
02:04:56.980
to kill off most of the human population at their worst.
link |
02:05:03.060
And it's just fascinating
link |
02:05:04.300
because there's so many viruses in this world.
link |
02:05:06.180
There's so many, I mean, viruses basically run the world
link |
02:05:08.620
in the sense that they've been around a very long time.
link |
02:05:12.260
They're everywhere.
link |
02:05:13.700
They seem to be extremely powerful
link |
02:05:15.340
in a distributed kind of way.
link |
02:05:17.300
But at the same time, they're not intelligent
link |
02:05:19.620
and they're not even living.
link |
02:05:21.260
Do you have like high level thoughts about this virus
link |
02:05:23.820
that like in terms of you being fascinated or terrified
link |
02:05:28.260
or somewhere in between?
link |
02:05:30.420
So I believe in frameworks, right?
link |
02:05:32.500
So like one of them is evolution.
link |
02:05:36.300
Like we're evolved creatures, right?
link |
02:05:37.900
Yes.
link |
02:05:38.980
And one of the things about evolution
link |
02:05:40.900
is it's hyper competitive.
link |
02:05:42.740
And it's not competitive out of a sense of evil.
link |
02:05:44.900
It's competitive in the sense that there's endless variation
link |
02:05:47.820
and variations that work better win.
link |
02:05:50.380
And then over time, there's so many levels
link |
02:05:52.980
of that competition.
link |
02:05:55.260
Like multicellular life partly exists
link |
02:05:57.740
because of the competition
link |
02:06:01.140
between different kinds of life forms.
link |
02:06:04.260
And we know sex partly exists to scramble our genes
link |
02:06:06.900
so that we have genetic variation
link |
02:06:09.900
against the invasion of the bacteria and the viruses.
link |
02:06:14.220
And it's endless.
link |
02:06:16.020
Like I read some funny statistic,
link |
02:06:18.020
like the density of viruses and bacteria in the ocean
link |
02:06:20.780
is really high.
link |
02:06:22.020
And one third of the bacteria die every day
link |
02:06:23.900
because a virus is invading them.
link |
02:06:26.220
Like one third of them.
link |
02:06:27.940
Wow.
link |
02:06:29.020
Like I don't know if that number is true,
link |
02:06:31.020
but it was like the amount of competition
link |
02:06:34.900
and what's going on is stunning.
link |
02:06:37.380
And there's a theory as we age,
link |
02:06:38.660
we slowly accumulate bacteria and viruses
link |
02:06:41.780
and as our immune system kind of goes down,
link |
02:06:45.620
that's what slowly kills us.
link |
02:06:47.740
It just feels so peaceful from a human perspective
link |
02:06:50.220
when we sit back and are able
link |
02:06:51.420
to have a relaxed conversation.
link |
02:06:54.220
And there's wars going on out there.
link |
02:06:56.780
Like right now, you're harboring how many bacteria?
link |
02:07:00.900
And the ones, many of them are parasites on you
link |
02:07:04.860
and some of them are helpful
link |
02:07:06.060
and some of them are modifying your behavior
link |
02:07:07.780
and some of them are, it's just really wild.
link |
02:07:12.220
But this particular manifestation is unusual
link |
02:07:16.460
in the demographic, how it hit
link |
02:07:18.420
and the political response that it engendered
link |
02:07:21.380
and the healthcare response it engendered
link |
02:07:23.860
and the technology it engendered, it's kind of wild.
link |
02:07:27.100
Yeah, the communication on Twitter that it led to,
link |
02:07:30.500
all that kind of stuff, at every single level, yeah.
link |
02:07:32.980
But what usually kills life,
link |
02:07:34.620
the big extinctions are caused by meteors and volcanoes.
link |
02:07:39.460
That's the one you're worried about
link |
02:07:40.820
as opposed to human created bombs that we launch.
link |
02:07:44.500
Solar flares are another good one.
link |
02:07:46.100
Occasionally, solar flares hit the planet.
link |
02:07:48.580
So it's nature.
link |
02:07:51.100
Yeah, it's all pretty wild.
link |
02:07:53.540
On another historic moment, this is perhaps outside
link |
02:07:57.500
but perhaps within your space of frameworks
link |
02:08:02.460
that you think about that just happened,
link |
02:08:04.540
I guess a couple of weeks ago is,
link |
02:08:06.620
I don't know if you're paying attention at all,
link |
02:08:08.020
is the GameStop and WallStreetBets thing.
link |
02:08:12.540
It's super fun.
link |
02:08:14.100
So it's really fascinating.
link |
02:08:16.580
There's kind of a theme to this conversation today
link |
02:08:19.180
because it's like neural networks,
link |
02:08:21.980
it's cool how a large number of people
link |
02:08:25.020
in a distributed way, almost having a kind of fun,
link |
02:08:30.340
were able to take on the powerful elites,
link |
02:08:34.620
elite hedge funds, centralized powers and overpower them.
link |
02:08:39.980
Do you have thoughts on this whole saga?
link |
02:08:43.340
I don't know enough about finance,
link |
02:08:45.020
but it was like when Elon and the Robinhood guy talked.
link |
02:08:49.260
Yeah, what'd you think about that?
link |
02:08:51.580
Well, Robinhood guy didn't know
link |
02:08:52.660
how the finance system worked.
link |
02:08:54.300
That was clear, right?
link |
02:08:55.540
He was treating like the people
link |
02:08:57.340
who settled the transactions as a black box.
link |
02:09:00.020
And suddenly somebody called him up and said,
link |
02:09:01.620
hey, black box calling you, your transaction volume
link |
02:09:04.740
means you need to put out $3 billion right now.
link |
02:09:06.940
And he's like, I don't have $3 billion.
link |
02:09:08.940
Like I don't even make any money on these trades.
link |
02:09:10.540
Why do I owe $3 billion while you're sponsoring the trade?
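A rough way to see why a clearinghouse call can explode like that: the required deposit scales with unsettled exposure times volatility. The sketch below is a toy model under assumed numbers, not the actual NSCC/DTCC charge; the function name, the two-day coverage window, and all the figures are illustrative assumptions.

```python
# Toy model of a clearinghouse collateral call -- illustrative only,
# NOT the real NSCC/DTCC formula. All names and numbers are assumptions.

def collateral_call(unsettled_shares: float, price: float,
                    daily_move: float, coverage_days: float = 2.0) -> float:
    """Deposit sized to cover an adverse price move on unsettled
    positions during the settlement window (T+2 at the time)."""
    exposure = unsettled_shares * price
    return exposure * daily_move * coverage_days

# A quiet day: 1M unsettled shares at $20, ~2% daily moves.
print(f"${collateral_call(1e6, 20, 0.02):,.0f}")   # $800,000

# A meme-stock day: 10M unsettled shares at $300, 50% daily moves.
print(f"${collateral_call(10e6, 300, 0.5):,.0f}")  # $3,000,000,000
```

Under those toy assumptions the deposit jumps by almost four orders of magnitude, which is roughly the shape of the surprise being described here.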
link |
02:09:13.220
So there was a set of abstractions
link |
02:09:15.620
that I don't think either side understood, like, now we understand it.
link |
02:09:19.540
Like this happens in chip design.
link |
02:09:21.100
Like you buy wafers from TSMC or Samsung or Intel,
link |
02:09:25.660
and they say it works like this
link |
02:09:27.460
and you do your design based on that.
link |
02:09:29.020
And then chip comes back and doesn't work.
link |
02:09:31.260
And then suddenly you started having to open the black boxes.
link |
02:09:34.260
Did the transistors really work like they said?
link |
02:09:36.380
what's the real issue?
link |
02:09:37.620
So there's a whole set of things
link |
02:09:43.260
that created this opportunity and somebody spotted it.
link |
02:09:46.220
Now, people spot these kinds of opportunities all the time.
link |
02:09:49.900
So there's been flash crashes,
link |
02:09:51.380
and short squeezes are fairly regular.
link |
02:09:55.340
Every CEO I know hates the shorts
link |
02:09:58.500
because they're trying to manipulate their stock
link |
02:10:01.860
in a way that they make money
link |
02:10:03.860
and deprive value from both the company
link |
02:10:07.420
and the investors.
link |
02:10:08.900
So the fact that some of these stocks were so heavily shorted,
link |
02:10:13.700
it's hilarious that this hasn't happened before.
link |
02:10:17.340
I don't know why, and I don't actually know why
link |
02:10:19.900
some serious hedge funds didn't do it to other hedge funds.
link |
02:10:23.460
And some of the hedge funds
link |
02:10:24.380
actually made a lot of money on this.
link |
02:10:26.580
So my guess is we know 5% of what really happened
link |
02:10:32.140
and that a lot of the players don't know what happened.
link |
02:10:34.420
And the people who probably made the most money
link |
02:10:37.420
aren't the people that they're talking about.
link |
02:10:39.500
That's.
link |
02:10:41.060
Do you think there was something,
link |
02:10:42.660
I mean, this is the cool thing about Elon,
link |
02:10:47.940
you're the same kind of conversationalist,
link |
02:10:50.660
which is like first principles questions of like,
link |
02:10:53.860
what the hell happened?
link |
02:10:56.260
Just very basic questions of like,
link |
02:10:57.900
was there something shady going on?
link |
02:11:00.780
What, who are the parties involved?
link |
02:11:03.660
It's the basic questions everybody wants to know about.
link |
02:11:06.340
Yeah, so like we're in a very hyper competitive world,
link |
02:11:10.340
but transactions like buying and selling stock
link |
02:11:12.180
is a trust event.
link |
02:11:13.780
I trust the company represented itself properly.
link |
02:11:16.980
I bought the stock because I think it's gonna go up.
link |
02:11:19.660
I trust that the regulations are solid.
link |
02:11:22.660
Now, inside of that, there's all kinds of places
link |
02:11:26.140
where humans overtrust, and this exposed,
link |
02:11:31.140
let's say some weak points in the system.
link |
02:11:34.580
I don't know if it's gonna get corrected.
link |
02:11:37.340
I don't know if we have close to the real story.
link |
02:11:41.740
Yeah, my suspicion is we don't.
link |
02:11:44.460
And listening to that guy, he was like a little wide-eyed
link |
02:11:47.300
about, and then he did this and then he did that.
link |
02:11:49.060
And I was like, I think you should know more
link |
02:11:51.820
about your business than that.
link |
02:11:54.180
But again, there's many businesses
link |
02:11:56.140
where, like, this layer is really stable,
link |
02:11:58.780
you stop paying attention to it.
link |
02:12:00.700
You pay attention to the stuff that's bugging you or new.
link |
02:12:04.500
You don't pay attention to the stuff
link |
02:12:05.780
that just seems to work all the time.
link |
02:12:07.060
You just, the sky's blue every day in California.
link |
02:12:11.100
And every once in a while it rains
link |
02:12:12.740
and everybody's like, what do we do?
link |
02:12:15.300
Somebody go bring in the lawn furniture.
link |
02:12:17.940
It's getting wet.
link |
02:12:18.780
You don't know why it's getting wet.
link |
02:12:19.980
Yeah, it doesn't always work.
link |
02:12:20.820
It was blue for like a hundred days and now it's, so.
link |
02:12:24.580
But part of the problem here with Vlad,
link |
02:12:27.020
the CEO of Robinhood is the scaling
link |
02:12:29.540
that we've been talking about is there's a lot
link |
02:12:32.540
of unexpected things that happen with the scaling
link |
02:12:36.020
and you have to be, I think the scaling forces you
link |
02:12:39.660
to then return to the fundamentals.
link |
02:12:41.780
Well, it's interesting because when you buy and sell stocks,
link |
02:12:44.460
the scaling is, the stocks only move
link |
02:12:46.460
in a certain range, and if you buy a stock,
link |
02:12:48.180
you can only lose that amount of money.
link |
02:12:50.020
On the short market, you can lose a lot more
link |
02:12:52.420
than you can benefit.
link |
02:12:53.860
Like it has a weird cost function
link |
02:12:57.220
or whatever the right word for that is.
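A minimal sketch of that asymmetry, with hypothetical numbers: a long position's loss is capped at what you paid, while a short position's loss grows without bound as the price rises.

```python
# Payoff asymmetry between a long and a short position -- a minimal
# sketch of the "weird cost function" described above; the entry
# price and the price path are hypothetical.

def long_pnl(entry: float, price: float) -> float:
    # Buy at `entry`: worst case the stock goes to 0 and you lose `entry`.
    return price - entry

def short_pnl(entry: float, price: float) -> float:
    # Short at `entry`: gain is capped at `entry` (price -> 0),
    # but loss is unbounded as `price` rises.
    return entry - price

for price in [0, 20, 100, 500]:
    print(f"price={price:>3}: long={long_pnl(20, price):>4}, "
          f"short={short_pnl(20, price):>4}")
# price=  0: long= -20, short=  20
# price= 20: long=   0, short=   0
# price=100: long=  80, short= -80
# price=500: long= 480, short=-480
```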
link |
02:12:59.260
So he was trading in a market
link |
02:13:01.140
where he wasn't actually capitalized for the downside.
link |
02:13:04.220
If it got outside a certain range.
link |
02:13:07.380
Now, whether something nefarious has happened,
link |
02:13:09.780
I have no idea, but at some point,
link |
02:13:12.580
the financial risk to both him and his customers
link |
02:13:16.540
was way outside of his financial capacity
link |
02:13:19.140
and his understanding of how the system worked was clearly weak,
link |
02:13:23.380
or he didn't represent himself well.
link |
02:13:25.140
I don't know the person and when I listened to him,
link |
02:13:28.780
it could have been the surprise, the question was like,
link |
02:13:30.500
and then these guys called and it sounded like
link |
02:13:34.020
he was treating stuff as a black box.
link |
02:13:36.260
Maybe he shouldn't have, but maybe he has a whole pile
link |
02:13:38.540
of experts somewhere else and it was going on.
link |
02:13:40.060
I don't know.
link |
02:13:41.220
Yeah, I mean, this is one of the qualities
link |
02:13:45.180
of a good leader is under fire, you have to perform.
link |
02:13:49.060
And that means to think clearly and to speak clearly.
link |
02:13:53.020
And he dropped the ball on those things,
link |
02:13:55.260
because you have to understand the problem quickly,
link |
02:13:58.060
learn and understand the problem at this basic level.
link |
02:14:03.380
What the hell happened?
link |
02:14:05.100
And my guess is, at some level it was amateurs trading
link |
02:14:09.820
against experts slash insiders slash people
link |
02:14:12.940
with special information.
link |
02:14:14.900
Outsiders versus insiders.
link |
02:14:16.900
Yeah, and the insiders, my guess is the next time
link |
02:14:20.700
this happens, we'll make money on it.
link |
02:14:22.980
The insiders always win?
link |
02:14:25.100
Well, they have more tools and more incentive.
link |
02:14:27.140
I mean, this always happens.
link |
02:14:28.460
Like the outsiders are doing this for fun.
link |
02:14:30.820
The insiders are doing this 24/7.
link |
02:14:33.340
But there's numbers in the outsiders.
link |
02:14:35.740
This is the interesting thing: it could be
link |
02:14:37.540
a new chapter. There's numbers
link |
02:14:38.380
on the insiders too.
link |
02:14:41.100
Different kind of numbers, yeah.
link |
02:14:44.020
But this could be a new era because, I don't know,
link |
02:14:46.100
at least I didn't expect that a bunch of Redditors could,
link |
02:14:49.460
there's millions of people who can get together.
link |
02:14:51.580
It was a surprise attack.
link |
02:14:52.420
The next one will be a surprise.
link |
02:14:54.220
But don't you think the crowd, the people are planning
link |
02:14:57.540
the next attack?
link |
02:14:59.260
We'll see.
link |
02:15:00.500
But it has to be a surprise.
link |
02:15:01.420
It can't be the same game.
link |
02:15:04.620
And so the insiders.
link |
02:15:05.460
It's like, it could be there's a very large number
link |
02:15:07.980
of games to play and they can be agile about it.
link |
02:15:10.540
I don't know.
link |
02:15:11.380
I'm not an expert.
link |
02:15:12.220
Right, that's a good question.
link |
02:15:13.780
The space of games, how restricted is it?
link |
02:15:18.020
Yeah, and the system is so complicated
link |
02:15:20.220
it could be relatively unrestricted.
link |
02:15:22.740
And also during the last couple of financial crashes,
link |
02:15:27.180
what set it off was sets of derivative events
link |
02:15:30.180
where Nassim Taleb's thing is they're trying
link |
02:15:35.980
to lower volatility in the short run
link |
02:15:39.420
at the cost of creating tail events.
link |
02:15:41.660
And the system's always evolved towards that
link |
02:15:43.700
and then they always crash.
link |
02:15:45.620
The S curve is the start low, ramp, plateau, crash.
link |
02:15:50.620
It's 100% effective.
link |
02:15:54.540
In the long run.
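The ramp-and-plateau part of that S curve is the standard logistic form; the crash at the end is outside the logistic itself. A common parameterization, as a point of reference:

```latex
% Standard logistic S curve: starts low, ramps, then plateaus at L.
% (The crash phase mentioned above is not captured by the logistic.)
f(t) = \frac{L}{1 + e^{-k(t - t_0)}}
% L   : the plateau level (carrying capacity)
% k   : steepness of the ramp
% t_0 : midpoint of the ramp, where growth is fastest
```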
link |
02:15:55.860
Let me ask you some advice to put on your profound hat.
link |
02:16:01.660
There's a bunch of young folks who listen to this thing
link |
02:16:04.620
for no good reason whatsoever.
link |
02:16:07.460
Undergraduate students, maybe high school students,
link |
02:16:10.620
maybe just young folks, or the young at heart
link |
02:16:13.020
looking for the next steps to take in life.
link |
02:16:16.860
What advice would you give to a young person today
link |
02:16:19.300
about life, maybe career, but also life in general?
link |
02:16:23.860
Get good at some stuff.
link |
02:16:26.100
Well, get to know yourself, right?
link |
02:16:28.220
Get good at something that you're actually interested in.
link |
02:16:30.660
You have to love what you're doing to get good at it.
link |
02:16:33.500
You really gotta find that.
link |
02:16:34.420
Don't waste all your time doing stuff
link |
02:16:35.800
that's just boring or bland or numbing, right?
link |
02:16:40.140
Don't let old people screw you.
link |
02:16:42.380
Well, people get talked into doing all kinds of shit
link |
02:16:46.740
and racking up huge student debts
link |
02:16:49.300
and there's so much crap going on.
link |
02:16:52.580
And then it drains your time and drains your energy.
link |
02:16:54.700
The Eric Weinstein thesis that the older generation
link |
02:16:58.100
won't let go and they're trapping all the young people.
link |
02:17:01.100
Do you think there's some truth to that?
link |
02:17:02.460
Yeah, sure.
link |
02:17:04.940
Just because you're old doesn't mean you stop thinking.
link |
02:17:06.940
I know lots of really original old people.
link |
02:17:10.380
I'm an old person.
link |
02:17:14.260
But you have to be conscious about it.
link |
02:17:15.660
You can fall into the ruts and then do that.
link |
02:17:18.940
I mean, when I hear young people spouting opinions
link |
02:17:22.060
that sounds like they come from Fox News or CNN,
link |
02:17:24.380
I think they've been captured by groupthink and memes.
link |
02:17:27.980
They're supposed to think on their own.
link |
02:17:29.780
So if you find yourself repeating
link |
02:17:31.420
what everybody else is saying,
link |
02:17:33.420
you're not gonna have a good life.
link |
02:17:36.260
Like, that's not how the world works.
link |
02:17:38.460
It seems safe, but it puts you in great jeopardy
link |
02:17:41.040
of being boring or unhappy.
link |
02:17:45.900
How long did it take you to find the thing
link |
02:17:47.780
that you have fun with?
link |
02:17:50.620
Oh, I don't know.
link |
02:17:52.140
I've been a fun person since I was pretty little.
link |
02:17:54.300
So everything.
link |
02:17:55.140
I've gone through a couple periods of depression in my life.
link |
02:17:58.100
For a good reason or for a reason
link |
02:18:00.180
that doesn't make any sense?
link |
02:18:02.620
Yeah, like some things are hard.
link |
02:18:05.980
Like you go through mental transitions in high school.
link |
02:18:08.900
I was really depressed for a year
link |
02:18:10.700
and I think I had my first midlife crisis at 26.
link |
02:18:15.140
I kind of thought, is this all there is?
link |
02:18:16.660
Like I was working at a job that I loved,
link |
02:18:20.500
but I was going to work and all my time was consumed.
link |
02:18:23.420
What's the escape out of that depression?
link |
02:18:25.820
What's the answer to is this all there is?
link |
02:18:29.220
Well, a friend of mine, I asked him,
link |
02:18:31.820
because he was working his ass off,
link |
02:18:32.900
I said, what's your work life balance?
link |
02:18:34.540
Like there's work, friends, family, personal time.
link |
02:18:39.540
Are you balancing any of that?
link |
02:18:41.380
And he said, work 80%, family 20%.
link |
02:18:43.580
And I tried to find some time to sleep.
link |
02:18:47.540
Like there's no personal time.
link |
02:18:49.220
There's no passionate time.
link |
02:18:51.820
Like the young people are often passionate about work.
link |
02:18:54.580
So I was certainly like that.
link |
02:18:56.980
But you need to have some space in your life
link |
02:18:59.940
for different things.
link |
02:19:01.820
And that creates, that makes you resistant
link |
02:19:05.860
to the whole, the deep dips into depression kind of thing.
link |
02:19:11.260
Yeah, well, you have to get to know yourself too.
link |
02:19:13.060
Meditation helps.
link |
02:19:14.460
Some physical, something physically intense helps.
link |
02:19:18.540
Like the weird places your mind goes kind of thing.
link |
02:19:21.940
Like, and why does it happen?
link |
02:19:23.780
Why do you do what you do?
link |
02:19:24.860
Like triggers, like the things that cause your mind
link |
02:19:27.660
to go to different places kind of thing,
link |
02:19:29.460
or like events like.
link |
02:19:32.180
Your upbringing for better or worse,
link |
02:19:33.740
whether your parents are great people or not,
link |
02:19:35.700
you come into adulthood with all kinds of emotional burdens.
link |
02:19:42.780
And you can see some people are so bloody stiff
link |
02:19:45.060
and restrained, and they think the world's
link |
02:19:47.180
fundamentally negative, like you maybe.
link |
02:19:50.660
You have unexplored territory.
link |
02:19:53.020
Yeah.
link |
02:19:53.980
Or you're afraid of something.
link |
02:19:56.300
Definitely afraid of quite a few things.
link |
02:19:58.780
Then you gotta go face them.
link |
02:20:00.180
Like what's the worst thing that can happen?
link |
02:20:03.460
You're gonna die, right?
link |
02:20:05.180
Like that's inevitable.
link |
02:20:06.340
You might as well get over that.
link |
02:20:07.380
Like 100%, that's right.
link |
02:20:09.780
Like people are worried about the virus,
link |
02:20:11.060
but you know, the human condition is pretty deadly.
link |
02:20:14.460
There's something about embarrassment
link |
02:20:16.300
that's, I've competed a lot in my life,
link |
02:20:18.220
and I think the, if I'm to introspect it,
link |
02:20:21.980
the thing I'm most afraid of is being like humiliated,
link |
02:20:26.100
I think.
link |
02:20:26.940
Yeah, nobody cares about that.
link |
02:20:28.020
Like you're the only person on the planet
link |
02:20:29.980
that cares about you being humiliated.
link |
02:20:31.620
Exactly.
link |
02:20:32.460
It's like a really useless thought.
link |
02:20:34.740
It is.
link |
02:20:35.580
It's like, you're all humiliated.
link |
02:20:39.540
Something happened in a room full of people,
link |
02:20:41.140
and they walk out, and they didn't think about it
link |
02:20:42.660
one more second.
link |
02:20:43.780
Or maybe somebody told a funny story to somebody else.
link |
02:20:45.900
And then it dissipates it throughout, yeah.
link |
02:20:48.580
Yeah.
link |
02:20:49.420
No, I know it too.
link |
02:20:50.260
I mean, I've been really embarrassed about shit
link |
02:20:53.340
that nobody cared about but myself.
link |
02:20:55.500
Yeah.
link |
02:20:56.340
It's a funny thing.
link |
02:20:57.180
So the worst thing ultimately is just.
link |
02:20:59.620
Yeah, but that's a cage,
link |
02:21:01.020
and then you have to get out of it.
link |
02:21:02.060
Yeah.
link |
02:21:02.900
Like once you, here's the thing.
link |
02:21:03.860
Once you find something like that,
link |
02:21:05.740
you have to be determined to break it.
link |
02:21:09.060
Because otherwise you'll just,
link |
02:21:10.260
so you accumulate that kind of junk,
link |
02:21:11.740
and then you die as a mess.
link |
02:21:15.420
So the goal, I guess it's like a cage within a cage.
link |
02:21:18.420
I guess the goal is to die in the biggest possible cage.
link |
02:21:21.980
Well, ideally you'd have no cage.
link |
02:21:25.460
People do get enlightened.
link |
02:21:26.500
I've met a few.
link |
02:21:27.460
It's great.
link |
02:21:28.500
You've found a few?
link |
02:21:29.340
There's a few out there?
link |
02:21:30.460
I don't know.
link |
02:21:31.280
Of course there are.
link |
02:21:32.120
I don't know.
link |
02:21:33.360
Either that or it's a great sales pitch.
link |
02:21:35.520
There's enlightened people writing books
link |
02:21:37.080
and doing all kinds of stuff.
link |
02:21:38.280
It's a good way to sell a book.
link |
02:21:39.520
I'll give you that.
link |
02:21:40.840
You've never met somebody you just thought,
link |
02:21:42.880
they just kill me.
link |
02:21:43.840
Like they just, like mental clarity, humor.
link |
02:21:47.880
No, 100%, but I just feel like
link |
02:21:49.560
they're living in a bigger cage.
link |
02:21:50.960
They have their own.
link |
02:21:52.040
You still think there's a cage?
link |
02:21:53.360
There's still a cage.
link |
02:21:54.400
You secretly suspect there's always a cage.
link |
02:21:57.560
There's nothing outside the universe.
link |
02:21:59.880
There's nothing outside the cage.
link |
02:22:02.280
You work in a bunch of companies,
link |
02:22:10.160
you lead a lot of amazing teams.
link |
02:22:15.320
I'm not sure if you've ever been
link |
02:22:16.580
like in the early stages of a startup,
link |
02:22:19.440
but do you have advice for somebody
link |
02:22:24.560
that wants to do a startup or build a company,
link |
02:22:28.320
like build a strong team of engineers that are passionate
link |
02:22:31.160
and just want to solve a big problem?
link |
02:22:35.000
Like, is there more specific advice on that point?
link |
02:22:39.080
Well, you have to be really good at stuff.
link |
02:22:41.360
If you're going to lead and build a team,
link |
02:22:43.040
you better be really interested
link |
02:22:44.520
in how people work and think.
link |
02:22:46.960
The people or the solution to the problem.
link |
02:22:49.040
So there's two things, right?
link |
02:22:50.160
One is how people work and the other is the...
link |
02:22:52.920
Well, actually there's quite a few successful startups.
link |
02:22:55.640
It's pretty clear the founders
link |
02:22:56.800
don't know anything about people.
link |
02:22:58.360
Like the idea was so powerful that it propelled them.
link |
02:23:01.480
But I suspect somewhere early,
link |
02:23:03.760
they hired some people who understood people
link |
02:23:06.980
because people really need a lot of care and feeding
link |
02:23:08.960
to collaborate and work together
link |
02:23:10.480
and feel engaged and work hard.
link |
02:23:13.800
Like startups are all about outproducing other people.
link |
02:23:17.000
Like you're nimble because you don't have any legacy.
link |
02:23:19.820
You don't have a bunch of people
link |
02:23:22.320
who are depressed about life just showing up.
link |
02:23:24.720
So startups have a lot of advantages that way.
link |
02:23:29.720
Do you like the, Steve Jobs talked about this idea
link |
02:23:32.960
of A players and B players.
link |
02:23:34.940
I don't know if you know this formulation.
link |
02:23:37.240
Yeah, no.
link |
02:23:39.940
Organizations that get taken over by B player leaders
link |
02:23:44.680
often really underperform their C players.
link |
02:23:48.120
That said, in big organizations,
link |
02:23:50.720
there's so much work to do.
link |
02:23:52.600
And there's so many people who are happy
link |
02:23:54.040
to do what the leadership or the big idea people
link |
02:23:57.480
would consider menial jobs.
link |
02:24:00.320
And you need a place for them,
link |
02:24:01.880
but you need an organization that both values and rewards
link |
02:24:05.680
them but doesn't let them take over the leadership of it.
link |
02:24:08.460
Got it.
link |
02:24:09.300
So you need to have an organization
link |
02:24:11.040
that's resistant to that.
link |
02:24:11.960
But in the early days, the notion with Steve
link |
02:24:16.720
was that like one B player in a room of A players
link |
02:24:20.680
will be like destructive to the whole.
link |
02:24:23.040
I've seen that happen.
link |
02:24:24.360
I don't know if it's like always true.
link |
02:24:28.200
You run into people who are clearly B players
link |
02:24:30.320
but they think they're A players
link |
02:24:31.520
and so they have a loud voice at the table
link |
02:24:33.200
and they make lots of demands for that.
link |
02:24:35.160
But there's other people who are like, I know who I am.
link |
02:24:37.520
I just wanna work with cool people on cool shit
link |
02:24:39.720
and just tell me what to do and I'll go get it done.
link |
02:24:42.560
So you have to, again, this is like people skills.
link |
02:24:45.840
What kind of person is it?
link |
02:24:47.960
I've met some really great people I love working with
link |
02:24:51.040
that weren't the biggest idea people or the most productive
link |
02:24:53.600
ever but they show up, they get it done.
link |
02:24:56.200
They create connection and community that people value.
link |
02:24:59.880
It's pretty diverse so I don't think
link |
02:25:02.360
there's a recipe for that.
link |
02:25:05.120
I gotta ask you about love.
link |
02:25:07.000
I heard you're into this now.
link |
02:25:08.700
Into this love thing?
link |
02:25:09.560
Yeah, is this, do you think this is your solution
link |
02:25:11.720
to your depression?
link |
02:25:13.320
No, I'm just trying to, like you said,
link |
02:25:14.880
delight people and occasionally try to sell a book.
link |
02:25:16.960
I'm writing a book about love.
link |
02:25:18.120
You're writing a book about love?
link |
02:25:18.960
No, I'm not, I'm not.
link |
02:25:21.080
I have a friend of mine, he's gonna,
link |
02:25:25.080
he said you should really write a book
link |
02:25:27.240
about your management philosophy.
link |
02:25:29.080
He said it'd be a short book.
link |
02:25:35.000
Well, that one was thought pretty well.
link |
02:25:37.800
What role do you think love, family, friendship,
link |
02:25:40.440
all that kind of human stuff play in a successful life?
link |
02:25:44.400
You've been exceptionally successful in the space
link |
02:25:46.360
of running teams, building cool shit in this world,
link |
02:25:51.160
creating some amazing things.
link |
02:25:53.160
What, did love get in the way, did love help?
link |
02:25:54.720
Did family get in the way, did family help?
link |
02:25:57.720
Friendship?
link |
02:25:59.760
You want the engineer's answer?
link |
02:26:02.120
Please.
link |
02:26:03.120
But first, love is functional, right?
link |
02:26:05.800
It's functional in what way?
link |
02:26:07.280
So we habituate ourselves to the environment.
link |
02:26:11.000
And actually, Jordan Peterson told me this line.
link |
02:26:13.920
So you go through life and you just get used to everything,
link |
02:26:16.440
except for the things you love.
link |
02:26:17.800
They remain new.
link |
02:26:20.080
Like, this is really useful for, you know,
link |
02:26:22.440
like other people's children and dogs and trees.
link |
02:26:26.080
You just don't pay that much attention to them.
link |
02:26:27.700
Your own kids, you monitor them really closely.
link |
02:26:31.000
Like, and if they go off a little bit,
link |
02:26:32.720
because you love them, if you're smart,
link |
02:26:35.280
if you're gonna be a successful parent,
link |
02:26:37.480
you notice it right away.
link |
02:26:38.920
You just don't habituate to the things you love.
link |
02:26:44.320
And if you want to be successful at work,
link |
02:26:46.160
if you don't love it,
link |
02:26:47.560
you're not gonna put in the time that somebody else,
link |
02:26:50.400
somebody else that loves it, will.
link |
02:26:51.600
Like, because it's new and interesting,
link |
02:26:53.760
and that lets you go to the next level.
link |
02:26:57.560
So it's the thing, it's just a function
link |
02:26:59.120
that generates newness and novelty
link |
02:27:01.680
and surprises, you know, all those kinds of things.
link |
02:27:04.680
It's really interesting.
link |
02:27:06.360
There's people who figured out lots of frameworks for this.
link |
02:27:09.840
Like, humans seem to,
link |
02:27:11.600
in partnership, go through interests.
link |
02:27:13.880
Like, suddenly somebody's interesting,
link |
02:27:16.640
and then you're infatuated with them,
link |
02:27:18.200
and then you're in love with them.
link |
02:27:20.080
And then you, you know, different people have ideas
link |
02:27:22.600
about parental love or mature love.
link |
02:27:24.520
Like, you go through a cycle of that,
link |
02:27:26.600
which keeps us together,
link |
02:27:27.840
and it's super functional for creating families
link |
02:27:30.600
and creating communities and making you support somebody
link |
02:27:34.560
despite the fact that you don't love them.
link |
02:27:36.960
Like, and it can be really enriching.
link |
02:27:44.260
You know, now, in the work life balance scheme,
link |
02:27:47.480
if all you do is work,
link |
02:27:49.760
you think you may be optimizing your work potential,
link |
02:27:52.320
but if you don't love your work
link |
02:27:53.840
or you don't have family and friends
link |
02:27:56.960
and things you care about,
link |
02:27:59.280
your brain isn't well balanced.
link |
02:28:02.000
Like, everybody knows the experience of,
link |
02:28:03.440
you work on something all week,
link |
02:28:04.680
you go home, take two days off, and you come back in.
link |
02:28:07.720
The odds of you, working on the thing,
link |
02:28:09.360
picking up right where you left off are zero.
link |
02:28:12.760
Your brain refactored it.
link |
02:28:17.040
But being in love is great.
link |
02:28:19.200
It's like changes the color of the light in the room.
link |
02:28:22.440
It creates a spaciousness that's different.
link |
02:28:25.600
It helps you think.
link |
02:28:27.900
It makes you strong.
link |
02:28:29.560
Bukowski had this line about love being a fog
link |
02:28:32.520
that dissipates with the first light of reality
link |
02:28:36.240
in the morning.
link |
02:28:37.080
That's depressing.
link |
02:28:38.000
I think it's the other way around.
link |
02:28:39.560
It lasts.
link |
02:28:40.400
Well, like you said, it's a function.
link |
02:28:42.100
It's a thing that generates.
link |
02:28:42.940
It can be the light that actually enlivens your world
link |
02:28:45.640
and creates the interest and the power and the strength
link |
02:28:49.320
to go do something.
link |
02:28:51.720
Well, it's like, that sounds like,
link |
02:28:54.360
you know, there's like physical love, emotional love,
link |
02:28:56.200
intellectual love, spiritual love, right?
link |
02:28:58.240
Isn't it all the same thing, kind of?
link |
02:28:59.840
Nope.
link |
02:29:01.080
You should differentiate that.
link |
02:29:02.160
Maybe that's your problem.
link |
02:29:04.040
In your book, you should refine that a little bit.
link |
02:29:06.080
Is it different chapters?
link |
02:29:07.280
Yeah, there's different chapters.
link |
02:29:08.560
What's these, aren't these just different layers
link |
02:29:11.600
of the same thing, the stack of physical?
link |
02:29:14.360
People, some people are addicted to physical love
link |
02:29:17.400
and they have no idea about emotional or intellectual love.
link |
02:29:21.880
I don't know if they're the same things.
link |
02:29:22.960
I think they're different.
link |
02:29:23.920
That's true.
link |
02:29:24.760
They could be different.
link |
02:29:25.580
I guess the ultimate goal is for it to be the same.
link |
02:29:28.200
Well, if you want something to be bigger and interesting,
link |
02:29:30.200
you should find all its components and differentiate them,
link |
02:29:32.560
not clump it together.
link |
02:29:34.520
Like, people do this all the time.
link |
02:29:36.360
Yeah, the modularity.
link |
02:29:38.120
Get your abstraction layers right
link |
02:29:39.440
and then you have room to breathe.
link |
02:29:41.600
Well, maybe you can write the foreword to my book
link |
02:29:43.480
about love.
link |
02:29:44.320
Or the afterword.
link |
02:29:45.960
And the after.
link |
02:29:46.800
You really tried.
link |
02:29:49.320
I feel like Lex has made a lot of progress in this book.
link |
02:29:53.920
Well, you have things in your life that you love.
link |
02:29:55.880
Yeah, yeah.
link |
02:29:57.680
And they are, you're right, they're modular.
link |
02:29:59.800
It's quality.
link |
02:30:01.280
And you can have multiple things with the same person
link |
02:30:04.560
or the same thing.
link |
02:30:06.320
But, yeah.
link |
02:30:08.520
Depending on the moment of the day.
link |
02:30:09.720
Yeah, there's, like what Bukowski described
link |
02:30:13.160
is that moment when you go from being in love
link |
02:30:15.420
to having a different kind of love.
link |
02:30:17.320
Yeah.
link |
02:30:18.360
And that's a transition.
link |
02:30:19.480
But when it happens, if you read the owner's manual
link |
02:30:21.720
and you believed it, you would have said,
link |
02:30:23.620
oh, this happened.
link |
02:30:25.200
It doesn't mean it's not love.
link |
02:30:26.460
It's a different kind of love.
link |
02:30:27.920
But maybe there's something better about that.
link |
02:30:32.320
As you grow old, all you do is regret how you used to be.
link |
02:30:36.760
It's sad.
link |
02:30:38.560
Right?
link |
02:30:39.400
You should have learned a lot of things
link |
02:30:40.720
because like who you can be in your future self
link |
02:30:43.280
is actually more interesting and possibly delightful
link |
02:30:46.720
than being a mad kid in love with the next person.
link |
02:30:52.000
Like, that's super fun when it happens.
link |
02:30:54.440
But that's, you know, 5% of the possibility.
link |
02:30:59.840
Yeah, that's right.
link |
02:31:02.280
There's a lot more fun to be had in the long lasting stuff.
link |
02:31:05.320
Yeah, or meaning, you know, if that's your thing.
link |
02:31:07.640
Which is a kind of fun.
link |
02:31:09.280
It's a deeper kind of fun.
link |
02:31:10.640
And it's surprising.
link |
02:31:11.560
You know, that's, like the thing I like is surprises.
link |
02:31:15.920
You know, and you just never know what's gonna happen.
link |
02:31:19.440
But you have to look carefully and you have to work at it
link |
02:31:21.400
and you have to think about it and you know, it's.
link |
02:31:24.000
Yeah, you have to see the surprises when they happen, right?
link |
02:31:26.480
You have to be looking for it.
link |
02:31:28.320
From the branching perspective, you mentioned regrets.
link |
02:31:33.360
Do you have regrets about your own trajectory?
link |
02:31:36.200
Oh yeah, of course.
link |
02:31:38.200
Yeah, some of it's painful,
link |
02:31:39.440
but you wanna hear the painful stuff?
link |
02:31:41.320
No.
link |
02:31:42.160
I would say, like in terms of working with people,
link |
02:31:46.960
when people did stuff I didn't like,
link |
02:31:48.760
especially if it was a bit nefarious,
link |
02:31:50.760
I took it personally and I also felt it was personal
link |
02:31:54.520
about them.
link |
02:31:56.000
But a lot of times, like humans are,
link |
02:31:57.760
you know, most humans are a mess, right?
link |
02:31:59.840
And then they act out and they do stuff.
link |
02:32:02.120
And the psychologist I heard a long time ago said,
link |
02:32:06.000
you tend to think somebody does something to you.
link |
02:32:09.240
But really what they're doing is they're doing
link |
02:32:10.880
what they're doing while they're in front of you.
link |
02:32:13.360
It's not that much about you, right?
link |
02:32:16.240
And as I got more interested in,
link |
02:32:20.400
you know, when I work with people,
link |
02:32:21.720
I think about them and probably analyze them
link |
02:32:25.080
and understand them a little bit.
link |
02:32:26.600
And then when they do stuff, I'm way less surprised.
link |
02:32:29.080
And if it's bad, I'm way less hurt.
link |
02:32:32.320
And I react way less.
link |
02:32:34.160
Like I sort of expect everybody's got their shit.
link |
02:32:37.080
Yeah, and it's not about you as much.
link |
02:32:38.920
It's not about me that much.
link |
02:32:41.000
It's like, you know, you do something
link |
02:32:42.760
and you think you're embarrassed, but nobody cares.
link |
02:32:45.280
Like, and if somebody's really mad at you,
link |
02:32:46.920
the odds of it being about you.
link |
02:32:49.680
No, they're getting mad the way they're doing that
link |
02:32:51.360
because of some pattern they learned.
link |
02:32:53.160
And you know, and maybe you can help them
link |
02:32:55.560
if you care enough about it.
link |
02:32:56.840
But, or you could see it coming and step out of the way.
link |
02:33:00.560
Like, I wish I was way better at that.
link |
02:33:02.860
I'm a bit of a hothead.
link |
02:33:04.740
And in support of that.
link |
02:33:06.000
You said with Steve, that was a feature, not a bug.
link |
02:33:08.880
Yeah, well, he was using it as the counter force
link |
02:33:11.640
to orderliness that would crush his work.
link |
02:33:13.480
Well, you were doing the same.
link |
02:33:15.080
Yeah, maybe.
link |
02:33:15.920
I don't think I, I don't think my vision was big enough.
link |
02:33:18.960
It was more like I just got pissed off and did stuff.
link |
02:33:22.560
I'm sure that's the, yeah, you're telling me.
link |
02:33:27.280
I don't know if it had the,
link |
02:33:29.080
it didn't have the amazing effect
link |
02:33:30.920
of creating the trillion dollar company.
link |
02:33:32.440
It was more like I just got pissed off and left
link |
02:33:35.320
and, or made enemies that I shouldn't have.
link |
02:33:38.400
And yeah, it's hard.
link |
02:33:40.520
Like, I didn't really understand politics
link |
02:33:42.080
until I worked at Apple where, you know,
link |
02:33:44.320
Steve was a master player of politics
link |
02:33:46.120
and his staff had to be, or they wouldn't survive him.
link |
02:33:48.840
And it was definitely part of the culture.
link |
02:33:51.400
And then I've been in companies where they say
link |
02:33:52.640
it's political, but it's all, you know,
link |
02:33:54.880
fun and games compared to Apple.
link |
02:33:56.920
And it's not that the people at Apple are bad people.
link |
02:34:00.320
It's just, they operate politically at a higher level.
link |
02:34:04.680
You know, it's not like, oh, somebody said something bad
link |
02:34:06.920
about somebody, somebody else, which is most politics.
link |
02:34:10.840
It's, you know, they had strategies
link |
02:34:13.520
about accomplishing their goals.
link |
02:34:15.680
Sometimes, you know, over the dead bodies of their enemies.
link |
02:34:19.920
You know, with sophistication, yeah,
link |
02:34:23.080
more Game of Thrones than sophistication
link |
02:34:25.440
and like a big time factor rather than a, you know.
link |
02:34:29.000
Wow, that requires a lot of control over your emotions,
link |
02:34:31.280
I think, to have a bigger strategy in the way you behave.
link |
02:34:35.600
Yeah, and it's effective in the sense
link |
02:34:38.800
that coordinating thousands of people
link |
02:34:40.760
to do really hard things where many of the people
link |
02:34:44.280
in there don't understand themselves,
link |
02:34:45.920
much less how they're participating,
link |
02:34:47.960
creates all kinds of, you know, drama and problems
link |
02:34:52.600
where, you know, the solution is political in nature.
link |
02:34:55.800
Like how do you convince people?
link |
02:34:57.040
How do you leverage them?
link |
02:34:57.880
How do you motivate them?
link |
02:34:59.040
How do you get rid of them?
link |
02:35:00.040
How do you, you know, like there's so many layers
link |
02:35:02.400
of that that are interesting.
link |
02:35:04.440
And even though some of it, let's say, may be tough,
link |
02:35:08.480
it's not evil unless, you know, you use that skill
link |
02:35:13.480
for evil purposes, which some people obviously do.
link |
02:35:16.240
But it's a skill set that operates, you know.
link |
02:35:19.480
And I wish I'd, you know, I was interested in it,
link |
02:35:22.320
but I, you know, it was sort of like,
link |
02:35:24.080
I'm an engineer, I do my thing.
link |
02:35:26.640
And, you know, there's times
link |
02:35:28.360
when I could have had a way bigger impact
link |
02:35:31.320
if I, you know, knew how to,
link |
02:35:33.160
if I paid more attention and knew more about that.
link |
02:35:36.640
Yeah, about the human layer of the stack.
link |
02:35:38.800
Yeah, that human political power, you know,
link |
02:35:41.560
expression layer of the stack.
link |
02:35:43.240
Just complicated.
link |
02:35:44.720
And there's lots to know about it.
link |
02:35:45.960
I mean, people who are good at it are just amazing.
link |
02:35:49.440
And when they're good at it,
link |
02:35:50.480
and let's say, relatively kind and oriented
link |
02:35:55.360
in a good direction, you can really feel,
link |
02:35:58.640
you can get lots of stuff done and coordinate things
link |
02:36:00.520
that you never thought possible.
link |
02:36:03.560
But all people like that also have some pretty hard edges
link |
02:36:06.680
because, you know, it's a heavy lift.
link |
02:36:09.600
And I wish I'd spent more time like that when I was younger.
link |
02:36:13.160
But maybe I wasn't ready.
link |
02:36:14.120
You know, I was a wide eyed kid for 30 years.
link |
02:36:17.720
Still a bit of a kid.
link |
02:36:18.680
Yeah, I know.
link |
02:36:19.960
What do you hope your legacy is
link |
02:36:23.480
when there's a book like Hitchhiker's Guide to the Galaxy,
link |
02:36:28.000
and this is like a one sentence entry about Jim Keller
link |
02:36:31.120
from like that guy lived at some point.
link |
02:36:34.200
There's not many, you know,
link |
02:36:35.600
not many people would be remembered.
link |
02:36:37.720
You're one of the sparkling little human creatures
link |
02:36:42.360
that had a big impact on the world.
link |
02:36:44.760
How do you hope you'll be remembered?
link |
02:36:46.360
My daughter was trying to get,
link |
02:36:48.520
she edited my Wikipedia page
link |
02:36:49.960
to say that I was a legend and a guru.
link |
02:36:53.840
But they took it out, so she put it back in.
link |
02:36:55.600
She's 15.
link |
02:36:58.720
I think that was probably the best part of my legacy.
link |
02:37:02.720
She got her sister, and they were all excited.
link |
02:37:04.560
They were like trying to put it in the references
link |
02:37:06.600
because there's articles with that in the title.
link |
02:37:09.360
So in the eyes of your kids, you're a legend.
link |
02:37:13.080
Well, they're pretty skeptical
link |
02:37:14.320
because they know better than that.
link |
02:37:15.960
They're like dad.
link |
02:37:18.400
So yeah, that kind of stuff is super fun.
link |
02:37:21.600
In terms of the big legends stuff, I don't care.
link |
02:37:24.360
You don't care.
link |
02:37:25.200
I don't really care.
link |
02:37:26.680
You're just an engineer.
link |
02:37:28.560
Yeah, I've been thinking about building a big pyramid.
link |
02:37:32.080
So I had a debate with a friend
link |
02:37:33.560
about whether pyramids or craters are cooler.
link |
02:37:36.840
And he realized that there's craters everywhere,
link |
02:37:39.240
but they built a couple of pyramids 5,000 years ago.
link |
02:37:42.040
And they remember you for a while.
link |
02:37:43.240
We're still talking about it.
link |
02:37:45.080
So I think that would be cool.
link |
02:37:47.280
Those aren't easy to build.
link |
02:37:48.680
Oh, I know.
link |
02:37:50.360
And they don't actually know how they built them,
link |
02:37:51.960
which is great.
link |
02:37:54.400
It's either AGI or aliens could be involved.
link |
02:37:58.480
So I think you're gonna have to figure out
link |
02:38:01.680
quite a few more things than just
link |
02:38:03.640
the basics of civil engineering.
link |
02:38:05.400
So I guess you hope your legacy is pyramids.
link |
02:38:10.000
That would be cool.
link |
02:38:12.400
And my Wikipedia page, you know,
link |
02:38:13.880
getting updated by my daughter periodically.
link |
02:38:16.240
Like those two things would pretty much make it.
link |
02:38:18.640
Jim, it's a huge honor talking to you again.
link |
02:38:20.600
I hope we talk many more times in the future.
link |
02:38:22.720
I can't wait to see what you do with Tenstorrent.
link |
02:38:26.160
I can't wait to use it.
link |
02:38:27.800
I can't wait for you to revolutionize
link |
02:38:30.040
yet another space in computing.
link |
02:38:33.400
It's a huge honor to talk to you.
link |
02:38:34.760
Thanks for talking to me.
link |
02:38:35.600
This was fun.
link |
02:39:05.600
See you next time.