
Jim Keller: The Future of Computing, AI, Life, and Consciousness | Lex Fridman Podcast #162



link |
00:00:00.000
The following is a conversation with Jim Keller,
link |
00:00:02.920
his second time on the podcast.
link |
00:00:04.960
Jim is a legendary microprocessor architect
link |
00:00:08.480
and is widely seen as one of the greatest
link |
00:00:11.080
engineering minds of the computing age.
link |
00:00:14.640
In a peculiar twist of space time in our simulation,
link |
00:00:18.840
Jim is also a brother-in-law of Jordan Peterson.
link |
00:00:22.200
We talk about this and about computing,
link |
00:00:25.320
artificial intelligence, consciousness, and life.
link |
00:00:29.200
Quick mention of our sponsors.
link |
00:00:31.280
Athletic Greens, all-in-one nutrition drink,
link |
00:00:33.800
Brooklinen sheets, ExpressVPN,
link |
00:00:36.640
and Belcampo grass-fed meat.
link |
00:00:39.640
Click the sponsor links to get a discount
link |
00:00:41.720
and to support this podcast.
link |
00:00:43.960
As a side note, let me say that Jim is someone
link |
00:00:46.240
who on a personal level inspired me to be myself.
link |
00:00:50.200
There was something in his words on and off the mic
link |
00:00:53.360
or perhaps that he even paid attention to me at all
link |
00:00:56.240
that almost told me, you're all right kid.
link |
00:00:59.160
A kind of pat on the back that can make the difference
link |
00:01:01.840
between the mind that flourishes
link |
00:01:03.640
and a mind that is broken down
link |
00:01:05.760
by the cynicism of the world.
link |
00:01:08.160
So I guess that's just my brief few words
link |
00:01:10.440
of thank you to Jim and in general,
link |
00:01:12.800
gratitude for the people who have given me a chance
link |
00:01:15.480
on this podcast and my work and in life.
link |
00:01:19.000
If you enjoy this thing, subscribe on YouTube,
link |
00:01:21.200
review it on Apple Podcasts, follow on Spotify,
link |
00:01:24.280
support on Patreon or connect with me
link |
00:01:26.400
on Twitter @lexfridman.
link |
00:01:28.600
And now here's my conversation with Jim Keller.
link |
00:01:33.400
What's the value and effectiveness of theory
link |
00:01:35.920
versus engineering, this dichotomy
link |
00:01:38.120
in building good software or hardware systems?
link |
00:01:43.440
Well, good design is both.
link |
00:01:46.480
I guess that's pretty obvious.
link |
00:01:48.720
But engineering, do you mean, you know,
link |
00:01:50.840
reduction to practice of known methods?
link |
00:01:53.280
And then science is the pursuit of discovering things
link |
00:01:55.960
that people don't understand
link |
00:01:57.800
or solving unknown problems.
link |
00:02:00.360
Definitions are interesting here,
link |
00:02:02.000
but I was thinking more of theory,
link |
00:02:04.160
constructing models that kind of generalize
link |
00:02:06.760
about how things work.
link |
00:02:08.480
And engineering is actually building stuff,
link |
00:02:12.800
the pragmatic like, okay, we have these nice models,
link |
00:02:16.220
but how do we actually get things to work?
link |
00:02:17.960
Maybe economics is a nice example.
link |
00:02:20.800
Like economists have all these models
link |
00:02:22.480
of how the economy works
link |
00:02:23.680
and how different policies will have an effect.
link |
00:02:26.720
But then there's the actual, okay,
link |
00:02:29.280
let's call it engineering
link |
00:02:30.520
of like actually deploying the policies.
link |
00:02:33.280
So computer design is almost all engineering
link |
00:02:36.420
and reduction to practice of known methods.
link |
00:02:38.240
Now, because of the complexity of the computers we build,
link |
00:02:43.600
you know, you could think you're,
link |
00:02:45.000
well, we'll just go write some code
link |
00:02:46.640
and then we'll verify it and then we'll put it together.
link |
00:02:49.200
And then you find out that the combination
link |
00:02:50.960
of all that stuff is complicated.
link |
00:02:53.240
And then you have to be inventive
link |
00:02:54.720
to figure out how to do it, right?
link |
00:02:56.960
So that definitely happens a lot.
link |
00:02:59.800
And then every so often some big idea happens,
link |
00:03:04.480
but it might be one person.
link |
00:03:06.400
And that idea is in what,
link |
00:03:07.720
in the space of engineering, or is it in the space of theory?
link |
00:03:10.440
Well, I'll give you an example.
link |
00:03:11.420
So one of the limits of computer performance
link |
00:03:13.160
is branch prediction.
link |
00:03:14.920
So, and there's a whole bunch of ideas
link |
00:03:17.540
about how well you can predict a branch.
link |
00:03:19.480
And people said, there's a limit to it.
link |
00:03:21.640
It's an asymptotic curve.
link |
00:03:23.520
And somebody came up with a better way
link |
00:03:24.960
to do branch prediction.
link |
00:03:26.480
That was a lot better.
link |
00:03:28.280
And he published a paper on it
link |
00:03:29.760
and every computer in the world now uses it.
link |
00:03:32.800
And it was one idea.
link |
00:03:34.640
So the engineers who build branch prediction hardware
link |
00:03:38.000
were happy to drop the one kind of training array
link |
00:03:40.520
and put in another one.
link |
00:03:42.420
So it was, it was a real idea.
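(For readers: the flavor of such predictors can be seen in the classic two-bit saturating-counter scheme below, a minimal C sketch of the textbook baseline that published predictors, including the unnamed paper Jim mentions, improved on. The table size and indexing are illustrative assumptions, not any particular shipping design.)

```c
#include <stdint.h>

/* Minimal sketch of a classic two-bit saturating-counter branch predictor. */
#define TABLE_SIZE 4096
static uint8_t counters[TABLE_SIZE]; /* each entry 0..3; 0 = strongly not taken */

int predict_taken(uint64_t pc) {
    return counters[pc % TABLE_SIZE] >= 2; /* 2 or 3 => predict taken */
}

void train(uint64_t pc, int was_taken) {
    uint8_t *c = &counters[pc % TABLE_SIZE];
    if (was_taken && *c < 3) (*c)++;       /* saturate at strongly taken */
    else if (!was_taken && *c > 0) (*c)--; /* saturate at strongly not taken */
}
```

(The "training array" being swapped is literally this: replace the counter table and its update rule with a better one, and the rest of the pipeline is unchanged.)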
link |
00:03:44.880
And branch prediction is,
link |
00:03:46.480
is one of the key problems underlying all of sort of
link |
00:03:50.600
the lowest level of software. It boils down to branch prediction.
link |
00:03:53.840
It boils down to uncertainty.
link |
00:03:54.880
Computers are limited by, you know,
link |
00:03:56.320
single-thread computers are limited by two things.
link |
00:03:58.680
The predictability of the path of the branches
link |
00:04:01.440
and the predictability of the locality of data.
link |
00:04:05.360
So we have predictors that now predict
link |
00:04:07.120
both of those pretty well.
link |
00:04:09.080
So memory is, you know, a couple hundred cycles away.
link |
00:04:11.920
Local cache is a couple of cycles away.
link |
00:04:14.580
When you're executing fast,
link |
00:04:15.760
virtually all the data has to be in the local cache.
link |
00:04:19.060
So a simple program says, you know,
link |
00:04:21.360
add one to every element in an array.
link |
00:04:23.280
It's really easy to see what the stream of data will be.
link |
00:04:26.680
But you might have a more complicated program that's, you know,
link |
00:04:29.120
says get an element of this array,
link |
00:04:31.080
look at something, make a decision,
link |
00:04:32.800
go get another element, it's kind of random.
link |
00:04:35.200
And you can think that's really unpredictable.
link |
00:04:37.760
And then you make this big predictor
link |
00:04:39.200
that looks at this kind of pattern.
link |
00:04:40.720
And you realize, well, if you get this data
link |
00:04:42.440
and this data, then you probably want that one.
link |
00:04:44.520
And if you get this one and this one
link |
00:04:45.960
and this one, you probably want that one.
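(The data-side analog is a prefetcher. Below is a minimal C sketch, with illustrative names, of the two cases Jim describes: a trivially predictable stride, and a single-entry stride detector of the kind hardware generalizes with per-instruction tables and confidence counters.)

```c
#include <stdint.h>

/* The predictable case: the address stream is a, a+4, a+8, ...
 * so hardware can fetch ahead of the loop. */
void add_one(int *a, int n) {
    for (int i = 0; i < n; i++)
        a[i] += 1;
}

/* Toy single-entry stride detector: if the last two addresses differed by
 * the same stride, guess the next address. Real prefetchers keep a table
 * of these per load instruction, plus confidence bits. */
typedef struct { uint64_t last_addr; int64_t stride; } StridePredictor;

uint64_t predict_next(StridePredictor *p, uint64_t addr) {
    int64_t stride = (int64_t)(addr - p->last_addr);
    uint64_t guess = (stride == p->stride) ? addr + stride : 0; /* 0 = no guess */
    p->last_addr = addr;
    p->stride = stride;
    return guess;
}
```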
link |
00:04:47.960
And is that theory or is that engineering?
link |
00:04:49.920
Like the paper that was written, was it asymptotic
link |
00:04:53.200
kind of discussion, or is it more like,
link |
00:04:55.480
here's a hack that works well?
link |
00:04:57.920
It's a little bit of both.
link |
00:04:59.120
Like there's information theory in it, I think somewhere.
link |
00:05:01.320
Okay.
link |
00:05:02.160
So it's actually trying to prove some kind of stuff.
link |
00:05:03.920
But once you know the method,
link |
00:05:06.400
implementing it is an engineering problem.
link |
00:05:09.600
Now there's a flip side of this,
link |
00:05:10.840
which is in a big design team,
link |
00:05:13.440
what percentage of people think their plan
link |
00:05:19.000
or their life's work is engineering
link |
00:05:20.840
versus inventing things.
link |
00:05:23.520
So lots of companies will reward you for filing patents.
link |
00:05:27.560
So many big companies get stuck
link |
00:05:29.320
because to get promoted, you have to come up
link |
00:05:31.160
with something new.
link |
00:05:33.000
And then what happens is everybody's trying
link |
00:05:34.760
to do some random new thing, 99% of which doesn't matter.
link |
00:05:39.160
And the basics get neglected.
link |
00:05:41.160
Or there's a dichotomy. They think,
link |
00:05:46.200
like the cell library and the basic CAD tools,
link |
00:05:49.480
or basic software validation methods.
link |
00:05:53.200
That's simple stuff.
link |
00:05:54.680
They wanna work on the exciting stuff.
link |
00:05:56.880
And then they spend lots of time trying to figure out
link |
00:05:58.880
how to patent something.
link |
00:06:00.680
And that's mostly useless.
link |
00:06:02.200
But the breakthroughs are on simple stuff?
link |
00:06:04.520
No, no, you have to do the simple stuff really well.
link |
00:06:08.920
If you're building a building out of bricks,
link |
00:06:11.440
you want great bricks.
link |
00:06:13.240
So you go to two places that sell bricks.
link |
00:06:14.920
So one guy says, yeah, they're over there in an ugly pile.
link |
00:06:17.960
And the other guy, like, lovingly tells you
link |
00:06:19.840
about the 50 kinds of bricks and how hard they are
link |
00:06:22.280
and how beautiful they are and how square they are.
link |
00:06:25.040
And, you know, which one are you gonna buy bricks from?
link |
00:06:28.200
Which is gonna make a better house.
link |
00:06:30.400
So you're talking about the craftsman,
link |
00:06:31.880
the person who understands bricks,
link |
00:06:33.480
who loves bricks, who loves the variety.
link |
00:06:35.160
That's a good word.
link |
00:06:36.080
You know, good engineering is great craftsmanship.
link |
00:06:39.360
And when you start thinking engineering is about invention
link |
00:06:44.840
and you set up a system that rewards invention,
link |
00:06:47.920
the craftsmanship gets neglected.
link |
00:06:50.640
Okay, so maybe one perspective is that theory,
link |
00:06:53.480
the science, overemphasizes invention
link |
00:06:57.640
and engineering emphasizes craftsmanship.
link |
00:07:00.400
And therefore, like, so if you,
link |
00:07:02.840
it doesn't matter what you do, theory.
link |
00:07:04.320
But everybody does. Like, read the tech rags.
link |
00:07:06.200
They're always talking about some breakthrough
link |
00:07:07.880
or innovation and everybody thinks
link |
00:07:10.640
that's the most important thing.
link |
00:07:12.480
But the number of innovative ideas
link |
00:07:13.920
is actually relatively low.
link |
00:07:16.000
We need them, right?
link |
00:07:17.240
And innovation creates a whole new opportunity.
link |
00:07:19.840
Like when some guy invented the internet, right?
link |
00:07:24.040
Like that was a big thing.
link |
00:07:25.920
The million people that wrote software against that
link |
00:07:28.240
were mostly doing engineering and software writing.
link |
00:07:31.200
So the elaboration of that idea was huge.
link |
00:07:34.280
I don't know if you know Brendan Eich,
link |
00:07:35.560
he wrote JavaScript in 10 days.
link |
00:07:38.200
That's an interesting story.
link |
00:07:39.520
It makes me wonder, and it was, you know,
link |
00:07:42.400
famously for many years considered
link |
00:07:44.200
to be a pretty crappy programming language.
link |
00:07:47.640
Still is perhaps.
link |
00:07:48.760
It's been improving sort of consistently.
link |
00:07:51.120
But the interesting thing about that guy is,
link |
00:07:55.600
you know, he doesn't get any awards.
link |
00:07:58.520
You don't get a Nobel Prize or a Fields Medal
link |
00:08:00.720
for a crappy piece of, you know, software code.
link |
00:08:06.640
That is currently the number one programming language
link |
00:08:08.720
in the world and is now increasingly running
link |
00:08:12.680
the back end of the internet.
link |
00:08:13.760
Well, does he know why everybody uses it?
link |
00:08:17.680
Like that would be an interesting thing.
link |
00:08:19.360
Was it the right thing at the right time?
link |
00:08:22.400
Cause like when stuff like JavaScript came out,
link |
00:08:24.960
like there was a move from, you know,
link |
00:08:26.320
writing C programs and C++ to,
link |
00:08:29.640
let's call what they call managed code frameworks
link |
00:08:32.400
where you write simple code, it might be interpreted,
link |
00:08:35.240
it has lots of libraries, productivity is high,
link |
00:08:37.800
and you don't have to be an expert.
link |
00:08:39.560
So, you know, Java was supposed to solve
link |
00:08:41.360
all the world's problems, it was complicated.
link |
00:08:43.800
JavaScript came out, you know,
link |
00:08:45.240
after a bunch of other scripting languages.
link |
00:08:47.680
I'm not an expert on it, but was it the right thing
link |
00:08:50.400
at the right time or was there something, you know,
link |
00:08:53.880
clever cause he wasn't the only one.
link |
00:08:56.320
There's a few elements.
link |
00:08:57.440
And maybe if he figured out what it was,
link |
00:08:59.800
then he'd get a prize.
link |
00:09:02.040
Like that.
link |
00:09:03.360
Yeah, you know, maybe his problem is he hasn't defined this.
link |
00:09:06.880
Or he just needs a good promoter.
link |
00:09:09.520
Well, I think there was a bunch of blog posts
link |
00:09:11.920
written about it, which is like wrong is right,
link |
00:09:14.840
which is like doing the crappy thing fast,
link |
00:09:19.320
just like hacking together the thing
link |
00:09:21.360
that answers some of the needs
link |
00:09:23.240
and then iterating over time, listening to developers,
link |
00:09:26.080
like listening to people who actually use the thing.
link |
00:09:28.320
This is something you can do more in software,
link |
00:09:31.520
but the right time, like you have to sense,
link |
00:09:33.760
you have to have a good instinct
link |
00:09:35.120
of when is the right time for the right tool
link |
00:09:37.560
and make it super simple and just get it out there.
link |
00:09:42.720
The problem is this is true with hardware,
link |
00:09:45.200
this is less true with software,
link |
00:09:46.400
is there's a backward compatibility
link |
00:09:48.440
that just drags behind you as, you know,
link |
00:09:51.720
as you try to fix all the mistakes of the past.
link |
00:09:53.800
But the timing was good.
link |
00:09:56.480
There's something about that.
link |
00:09:57.480
It wasn't accidental.
link |
00:09:58.840
You have to like give yourself over to the,
link |
00:10:02.600
you have to have this like broad sense
link |
00:10:05.360
of what's needed now, both scientifically
link |
00:10:09.280
and like the community and just like this.
link |
00:10:12.280
It was obvious that there was no,
link |
00:10:15.680
the interesting thing about JavaScript
link |
00:10:17.960
is everything that ran in the browser at the time,
link |
00:10:20.880
like Java and, I think, others like Scheme,
link |
00:10:24.440
other programming languages,
link |
00:10:25.920
they were all in a separate external container.
link |
00:10:30.520
And then JavaScript was literally
link |
00:10:32.520
just injected into the webpage.
link |
00:10:34.600
It was the dumbest possible thing
link |
00:10:36.400
running in the same thread as everything else.
link |
00:10:39.360
And like it was inserted as a comment.
link |
00:10:43.120
So JavaScript code is inserted as a comment in the HTML code.
link |
00:10:47.520
And it was, I mean, there's, it's either genius
link |
00:10:51.320
or super dumb, but it's like.
link |
00:10:53.080
Right, so it had no apparatus for like a virtual machine
link |
00:10:55.720
and container.
link |
00:10:56.720
It just executed in the framework of the program
link |
00:10:58.960
that's already running.
link |
00:10:59.800
And it was, and then because something
link |
00:11:02.760
about that accessibility, the ease of its use
link |
00:11:07.280
resulted in developers then innovating
link |
00:11:10.080
on how to actually use it.
link |
00:11:11.400
I mean, I don't even know what to make of that,
link |
00:11:13.680
but it does seem to echo across different software,
link |
00:11:18.360
like stories of different software.
link |
00:11:19.760
PHP has the same story, really crappy language.
link |
00:11:22.920
They just took over the world.
link |
00:11:25.440
Well, we always had a joke that random-length instructions,
link |
00:11:28.360
variable-length instructions, that always wins,
link |
00:11:30.680
even though they're obviously worse.
link |
00:11:33.080
Like nobody knows why x86,
link |
00:11:35.400
arguably the worst architecture, you know,
link |
00:11:38.000
on the planet, is one of the most popular ones.
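(The technical core of the joke: with fixed-length instructions a decoder knows where every instruction starts, while variable-length encodings like x86's 1-to-15-byte instructions make even finding the boundaries serial. A toy C illustration, with a made-up length encoding standing in for x86's real rules:)

```c
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Toy encoding (an assumption for illustration, not real x86): the low
 * 4 bits of an instruction's first byte give its total length in bytes. */
static size_t insn_length(uint8_t first_byte) {
    size_t len = first_byte & 0xF;
    return len ? len : 1;
}

/* Finding boundaries is inherently serial for variable-length encodings:
 * you can't know where instruction i+1 starts until you've length-decoded
 * instruction i. With fixed 4-byte instructions the boundaries are just
 * 0, 4, 8, ... and N decoders can work in parallel. */
static size_t find_boundaries(const uint8_t *code, size_t n,
                              size_t *starts, size_t max_starts) {
    size_t count = 0, offset = 0;
    while (offset < n && count < max_starts) {
        starts[count++] = offset;
        offset += insn_length(code[offset]); /* serial dependence */
    }
    return count;
}

int main(void) {
    uint8_t code[] = { 0x12, 0x00, 0x33, 0x00, 0x00, 0x11 }; /* lengths 2, 3, 1 */
    size_t starts[8];
    size_t n = find_boundaries(code, sizeof code, starts, 8);
    for (size_t i = 0; i < n; i++)
        printf("instruction %zu starts at byte %zu\n", i, starts[i]);
    return 0;
}
```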
link |
00:11:40.520
Well, I mean, isn't that also the story of RISC
link |
00:11:42.880
versus CISC? I mean, is that simplicity?
link |
00:11:46.240
There's something about simplicity
link |
00:11:47.440
that, in this evolutionary process, is valued.
link |
00:11:53.480
If it's simple, it spreads faster, it seems like.
link |
00:11:58.800
Or is that not always true?
link |
00:11:59.960
That's not always true.
link |
00:12:01.120
Yeah, it could be simple is good, but too simple is bad.
link |
00:12:04.280
So why did RISC win, you think, so far?
link |
00:12:06.440
Did RISC win?
link |
00:12:08.680
In the long arc of history.
link |
00:12:10.560
We don't know.
link |
00:12:11.400
So who's gonna win?
link |
00:12:12.720
What's RISC, what's CISC,
link |
00:12:14.200
and who's gonna win in that space in these instruction sets?
link |
00:12:17.560
AI software's gonna win,
link |
00:12:18.920
but there'll be little computers that run little programs
link |
00:12:22.160
like normal all over the place.
link |
00:12:24.920
But we're going through another transformation, so.
link |
00:12:28.520
But you think instruction sets underneath it all will change?
link |
00:12:32.400
Yeah, they evolve slowly.
link |
00:12:33.640
They don't matter very much.
link |
00:12:35.480
They don't matter very much, okay.
link |
00:12:36.800
I mean, the limits of performance are, you know,
link |
00:12:39.760
predictability of instructions and data.
link |
00:12:41.640
I mean, that's the big thing.
link |
00:12:43.360
And then the usability of it is some, you know,
link |
00:12:47.960
quality of design, quality of tools, availability.
link |
00:12:52.160
Like right now, X86 is proprietary with Intel and AMD,
link |
00:12:56.440
but they can change it any way they want independently.
link |
00:12:59.480
Right, ARM is proprietary to ARM,
link |
00:13:01.640
and they won't let anybody else change it.
link |
00:13:03.680
So it's like a sole point.
link |
00:13:05.680
And RISC-V is open source, so anybody can change it,
link |
00:13:09.080
which is super cool.
link |
00:13:10.640
But that also might mean it gets changed
link |
00:13:12.480
in too many random ways,
link |
00:13:13.640
that there's no common subset of it that people can use.
link |
00:13:17.720
Do you like open or do you like closed?
link |
00:13:19.920
Like if you were to bet all your money
link |
00:13:21.520
on one or the other, RISC-V versus the others?
link |
00:13:23.320
No idea.
link |
00:13:24.240
It's case dependent?
link |
00:13:25.080
Well, X86 oddly enough,
link |
00:13:26.360
when Intel first started developing it,
link |
00:13:28.320
they licensed it to like seven people.
link |
00:13:30.280
So it was the open architecture.
link |
00:13:33.080
And then they moved faster than others
link |
00:13:35.360
and also bought one or two of them.
link |
00:13:37.480
But there was seven different people making X86,
link |
00:13:40.280
because at the time there was 6502 and Z80s and 8086.
link |
00:13:46.960
And you could argue everybody thought
link |
00:13:48.680
Z80 was the better instruction set,
link |
00:13:50.960
but that was proprietary to one place.
link |
00:13:54.480
Oh, and the 6800.
link |
00:13:56.160
So there's like four or five different microprocessors.
link |
00:13:59.480
Intel went open, got the market share
link |
00:14:02.400
because people felt like they had multiple sources from it,
link |
00:14:04.720
and then over time it narrowed down to two players.
link |
00:14:07.680
So why, you as a historian, why did Intel win
link |
00:14:12.920
for so long with their processors?
link |
00:14:17.280
I mean, I mean.
link |
00:14:18.120
They were great.
link |
00:14:18.960
Their process development was great.
link |
00:14:21.040
So, just looking back to JavaScript and Brendan Eich, there
link |
00:14:24.320
is Microsoft and Netscape and all these internet browsers.
link |
00:14:28.960
Microsoft won the browser game
link |
00:14:31.760
because they aggressively stole other people's ideas.
link |
00:14:36.000
Like right after they did it.
link |
00:14:37.840
You know, I don't know
link |
00:14:39.120
if Intel was stealing other people's ideas.
link |
00:14:41.200
They started making.
link |
00:14:42.040
In a good way, stealing them,
link |
00:14:42.880
just to clarify.
link |
00:14:43.800
They started making RAMs, random access memories.
link |
00:14:48.280
And then at the time when the Japanese manufacturers came up,
link |
00:14:53.000
you know, they were getting out competed on that
link |
00:14:54.920
and they pivoted to microprocessors
link |
00:14:56.600
and they made the first, you know,
link |
00:14:57.760
integrated microprocessor that ran programs.
link |
00:14:59.920
It was the 4004 or something.
link |
00:15:03.880
Who was behind that pivot?
link |
00:15:04.880
That's a hell of a pivot.
link |
00:15:05.880
Andy Grove.
link |
00:15:06.880
And he was great.
link |
00:15:08.800
That's a hell of a pivot.
link |
00:15:10.160
And then they led the semiconductor industry.
link |
00:15:13.880
Like they were just a little company, IBM,
link |
00:15:16.000
all kinds of big companies had boatloads of money
link |
00:15:19.040
and they out innovated everybody.
link |
00:15:21.200
Out-innovated.
link |
00:15:22.200
Okay.
link |
00:15:23.040
Yeah.
link |
00:15:23.880
So it's not like marketing.
link |
00:15:24.720
It's not, you know, stuff.
link |
00:15:26.280
Their processor designs were pretty good.
link |
00:15:29.360
I think the, you know, Core 2 was probably the first one
link |
00:15:34.360
I thought was great.
link |
00:15:36.200
It was a really fast processor
link |
00:15:37.600
and then Haswell was great.
link |
00:15:40.200
What makes a great processor?
link |
00:15:42.240
Oh, if you just look at its performance
link |
00:15:43.880
versus everybody else, it's, you know,
link |
00:15:46.440
the size of it, you know, usability of it.
link |
00:15:49.880
So it's not specific, some kind of element
link |
00:15:51.840
that makes it beautiful.
link |
00:15:52.680
It's just like literally just raw performance.
link |
00:15:55.160
Is that how you think of microprocessors?
link |
00:15:57.160
It's just like raw performance.
link |
00:15:59.760
Of course.
link |
00:16:01.320
It's like a horse race.
link |
00:16:02.360
The fastest one wins.
link |
00:16:04.280
Now.
link |
00:16:05.120
You don't care how.
link |
00:16:05.960
Well, there's the fastest in the environment.
link |
00:16:10.600
Like for years, you made the fastest one you could
link |
00:16:13.040
and then people started to have power limits.
link |
00:16:14.960
So then you made the fastest one at the right power point.
link |
00:16:17.640
And then, and then when we started doing multiprocessors,
link |
00:16:20.480
like if you could scale your processors more
link |
00:16:23.800
than the other guy, you could be 10% faster
link |
00:16:25.680
on like a single thread, but you have more threads.
link |
00:16:28.440
So there's lots of variability.
link |
00:16:30.000
And then ARM really explored, like, you know,
link |
00:16:35.560
they have the A series and the R series
link |
00:16:37.440
and the M series, like a family of processors
link |
00:16:40.320
for all these different design points
link |
00:16:41.960
from like unbelievably small and simple.
link |
00:16:44.600
And so then when you're doing the design,
link |
00:16:46.520
it's sort of like this big palette of CPUs.
link |
00:16:49.360
Like they're the only ones with a credible,
link |
00:16:51.480
you know, top to bottom palette.
link |
00:16:54.660
What do you mean a credible top to bottom?
link |
00:16:56.880
Well, there's people that make microcontrollers
link |
00:16:58.600
that are small, but they don't have a fast one.
link |
00:17:00.480
There's people who make fast processors,
link |
00:17:02.080
but don't have a little, a medium one or a small one.
link |
00:17:04.880
Is that hard to do that full palette?
link |
00:17:07.080
That seems like a, it's a lot of different.
link |
00:17:09.360
So what's the difference in the ARM folks and Intel,
link |
00:17:13.360
in terms of the way they're approaching this problem?
link |
00:17:15.620
Well, Intel, almost all their processor designs
link |
00:17:19.200
were, you know, very custom high end,
link |
00:17:21.720
you know, for the last 15, 20 years.
link |
00:17:23.400
It's the fastest horse possible.
link |
00:17:24.880
Yeah.
link |
00:17:25.840
In one horse race.
link |
00:17:27.520
Yeah. And the architects there are really good,
link |
00:17:30.400
but the company itself was fairly insular
link |
00:17:33.360
to what's going on in the industry with CAD tools and stuff.
link |
00:17:36.280
And there's this debate about custom design versus synthesis
link |
00:17:39.840
and how do you approach that?
link |
00:17:41.320
I'd say Intel was slow on the cutover to synthesized processors.
link |
00:17:45.680
ARM came in from the bottom and they generated IP,
link |
00:17:49.100
which went to all kinds of customers.
link |
00:17:50.840
So they had very little say in how the customer
link |
00:17:52.760
implemented their IP.
link |
00:17:54.960
So ARM is super friendly to the synthesis IP environment.
link |
00:17:59.440
Whereas Intel said,
link |
00:18:00.280
we're going to make this great client chip or server chip
link |
00:18:03.200
with our own CAD tools, with our own process,
link |
00:18:05.440
with our own, you know, other supporting IP
link |
00:18:08.140
and everything only works with our stuff.
link |
00:18:11.340
So is that, is ARM winning the mobile platform space
link |
00:18:16.440
in terms of processors?
link |
00:18:17.680
And is the way you're describing why they're winning?
link |
00:18:22.680
Well, they had lots of people doing lots
link |
00:18:24.920
of different experiments.
link |
00:18:26.440
So they controlled the processor architecture and IP,
link |
00:18:29.440
but they let people put in lots of different chips.
link |
00:18:32.040
And there was a lot of variability in what happened there.
link |
00:18:35.240
Whereas Intel, when they made their mobile,
link |
00:18:37.160
their foray into mobile,
link |
00:18:38.440
they had one team doing one part, right?
link |
00:18:41.720
So it wasn't 10 experiments.
link |
00:18:43.160
And then their mindset was PC mindset,
link |
00:18:46.000
Microsoft software mindset,
link |
00:18:48.040
and that brought a whole bunch of things along
link |
00:18:49.920
that the mobile world, the embedded world don't do.
link |
00:18:52.560
You think it was possible for Intel to pivot hard
link |
00:18:55.480
and win the mobile market?
link |
00:18:58.280
That's a hell of a difficult thing to do, right?
link |
00:19:00.120
For a huge company to just pivot.
link |
00:19:03.440
I mean, it's so interesting to,
link |
00:19:05.560
because we'll talk about your current work.
link |
00:19:07.440
It's like, it's clear that PCs were dominating
link |
00:19:11.160
for several decades, like desktop computers.
link |
00:19:14.240
And then mobile, it's unclear.
link |
00:19:17.800
It's a leadership question.
link |
00:19:19.360
Like Apple under Steve Jobs, when he came back,
link |
00:19:23.080
they pivoted multiple times.
link |
00:19:25.880
They built iPads and iTunes and phones and tablets
link |
00:19:29.160
and great Macs, like who knew computers
link |
00:19:32.000
should be made out of aluminum?
link |
00:19:33.440
Nobody knew that.
link |
00:19:35.320
That they're great, it's super fun.
link |
00:19:37.200
That was Steve?
link |
00:19:38.040
Yeah, Steve Jobs, like they pivoted multiple times.
link |
00:19:41.440
And the old Intel, they did that multiple times.
link |
00:19:45.880
They made DRAMs and processors and processes
link |
00:19:48.440
and I got to ask this,
link |
00:19:50.920
what was it like working with Steve Jobs?
link |
00:19:53.080
I didn't work with him.
link |
00:19:54.440
Did you interact with him?
link |
00:19:55.720
Twice.
link |
00:19:57.480
I said hi to him twice in the cafeteria.
link |
00:19:59.880
What did you say?
link |
00:20:01.080
Hi.
link |
00:20:01.920
He said, hey fellas. He was friendly.
link |
00:20:06.000
He was wandering around with somebody,
link |
00:20:08.280
he couldn't find a table
link |
00:20:09.240
because the cafeteria was packed, and I gave him my table.
link |
00:20:13.720
But I worked for Mike Culbert, who talked to,
link |
00:20:16.080
like Mike was the unofficial CTO of Apple
link |
00:20:19.280
and a brilliant guy and he worked for Steve for 25 years,
link |
00:20:22.200
maybe more and he talked to Steve multiple times a day.
link |
00:20:26.720
And he was one of the people that could put up with Steve's,
link |
00:20:29.400
let's say brilliance and intensity.
link |
00:20:31.760
And Steve really liked him and Steve trusted Mike
link |
00:20:35.720
to translate the shit he thought up
link |
00:20:39.080
into engineering products that work
link |
00:20:40.920
and then Mike ran a group called Platform Architecture
link |
00:20:43.160
and I was in that group.
link |
00:20:44.800
So many times I'd be sitting with Mike
link |
00:20:46.400
and the phone would ring and it'd be Steve
link |
00:20:48.720
and Mike would hold the phone like this
link |
00:20:50.440
because Steve would be yelling about something or other.
link |
00:20:52.360
Yeah.
link |
00:20:53.200
And then he would translate.
link |
00:20:54.160
And he translated and then he would say,
link |
00:20:55.920
Steve wants us to do this.
link |
00:20:58.320
So.
link |
00:20:59.520
Was Steve a good engineer or no?
link |
00:21:01.120
I don't know.
link |
00:21:02.440
He was a great idea guy.
link |
00:21:03.800
Idea person.
link |
00:21:04.640
And he's a really good selector for talent.
link |
00:21:07.600
Yeah, that seems to be one of the key elements
link |
00:21:09.600
of leadership, right?
link |
00:21:10.800
And then he was a really good first principles guy.
link |
00:21:12.800
Like somebody say something couldn't be done
link |
00:21:15.080
and he would just think that's obviously wrong, right?
link |
00:21:20.320
But maybe it's hard to do, maybe it's expensive to do,
link |
00:21:24.440
maybe we need different people.
link |
00:21:26.200
There's like a whole bunch of,
link |
00:21:27.320
if you want to do something hard,
link |
00:21:29.640
maybe it takes time, maybe you have to iterate.
link |
00:21:31.600
There's a whole bunch of things that you could think about
link |
00:21:33.760
but saying it can't be done is stupid.
link |
00:21:36.400
How would you compare?
link |
00:21:38.120
So it seems like Elon Musk is more engineering centric
link |
00:21:42.880
but it's also,
link |
00:21:43.720
I think he considered himself a designer too.
link |
00:21:45.680
He has a design mind.
link |
00:21:47.040
Steve Jobs feels like he's much more idea space,
link |
00:21:50.560
design space versus engineering.
link |
00:21:52.760
Just make it happen.
link |
00:21:53.960
Like the world should be this way, just figure it out.
link |
00:21:57.200
But he used computers.
link |
00:21:58.720
You know, he had computer people talk to him all the time.
link |
00:22:01.880
Like Mike was a really good computer guy.
link |
00:22:03.400
He knew what computers could do.
link |
00:22:04.840
Computer meaning computer hardware,
link |
00:22:06.320
like hardware software, all the pieces and then he would,
link |
00:22:10.960
you know, have an idea about what could we do with this next
link |
00:22:14.520
that was grounded in reality.
link |
00:22:16.040
It wasn't like he was, you know,
link |
00:22:17.120
just finger painting on the wall
link |
00:22:19.200
and wishing somebody would interpret it.
link |
00:22:20.960
Like, so he had this interesting connection
link |
00:22:23.400
because, you know, he wasn't a computer architecture designer,
link |
00:22:28.320
but he had an intuition from the computers we had
link |
00:22:30.840
to what could happen and.
link |
00:22:33.880
It's interesting to say intuition because it seems
link |
00:22:36.760
like he was pissing off a lot of engineers with his intuition
link |
00:22:41.760
about what can and can't be done.
link |
00:22:43.680
Like, what are all these stories
link |
00:22:46.840
about like floppy disks and all that kind of stuff.
link |
00:22:48.960
Like.
link |
00:22:49.800
Yeah.
link |
00:22:50.640
So Steve, in the first round, like, he'd go into a lab
link |
00:22:54.320
and look at what's going on and hate it
link |
00:22:56.200
and fire people, or ask somebody in the elevator
link |
00:23:00.520
what they're doing for Apple and, you know, not be happy.
link |
00:23:03.800
When he came back, my impression was is he surrounded himself
link |
00:23:07.960
with this relatively small group of people.
link |
00:23:10.160
Yes.
link |
00:23:11.000
And didn't really interact outside of that as much.
link |
00:23:13.840
And then the joke was, you'd see like somebody moving
link |
00:23:16.280
a prototype through the quad with a black blanket over it.
link |
00:23:20.760
And that was because it was secret, you know,
link |
00:23:23.200
partly from Steve because they didn't want Steve
link |
00:23:25.120
to see it until it was ready.
link |
00:23:26.960
Yeah, the dynamic with Jony Ive and Steve is interesting.
link |
00:23:31.400
It's like you don't want to, he ruins as many ideas
link |
00:23:35.960
as he generates.
link |
00:23:37.280
Yeah.
link |
00:23:38.280
Yeah.
link |
00:23:39.120
It's a dangerous kind of line to walk.
link |
00:23:42.080
If you have a lot of ideas, like,
link |
00:23:44.520
like Gordon Bell was famous for ideas, right?
link |
00:23:47.240
And it wasn't that the percentage of good ideas
link |
00:23:49.120
was way higher than anybody else.
link |
00:23:51.440
It was, he had so many ideas and he was also good
link |
00:23:54.360
at talking to people about it and getting the filters, right?
link |
00:23:58.120
And, you know, seeing through stuff.
link |
00:24:00.200
Whereas Elon was like, hey, I want to build rockets.
link |
00:24:03.360
So Steve would have hired a bunch of rocket guys
link |
00:24:06.000
and Elon would go read rocket manuals.
link |
00:24:08.480
So Elon is a better engineer, in a sense, like,
link |
00:24:11.480
or has more like a love and passion for the manuals.
link |
00:24:16.480
Yeah.
link |
00:24:17.320
And the details.
link |
00:24:18.160
The details.
link |
00:24:19.000
The data and the understanding.
link |
00:24:19.840
The craftsmanship too, right?
link |
00:24:20.800
Well, I guess Steve had craftsmanship too,
link |
00:24:22.720
but of a different kind.
link |
00:24:24.240
What do you make of, just to stay on Steve
link |
00:24:26.960
for just a little longer, what do you make of like
link |
00:24:28.600
the anger and the passion and all that,
link |
00:24:30.640
the firing and the mood swings and the madness,
link |
00:24:35.080
the, you know, being emotional and all that.
link |
00:24:38.360
That's Steve and I guess Elon too.
link |
00:24:40.680
So what, is that a bug or a feature?
link |
00:24:43.720
It's a feature.
link |
00:24:45.040
So there's a graph, which is Y axis productivity.
link |
00:24:49.600
Yeah.
link |
00:24:50.440
X axis at zero is chaos and infinity is complete order.
link |
00:24:55.080
Yeah.
link |
00:24:55.920
So as you go from the, you know, the origin,
link |
00:25:00.920
as you improve order, you improve productivity.
link |
00:25:03.560
Yeah.
link |
00:25:04.400
And at some point productivity peaks
link |
00:25:06.440
and then it goes back down again.
link |
00:25:08.360
Too much order, nothing can happen.
link |
00:25:09.800
Yes.
link |
00:25:10.640
But the question is,
link |
00:25:12.040
how close to the chaos is that?
link |
00:25:13.720
No, no, no.
link |
00:25:14.560
Here's the thing.
link |
00:25:15.400
Once you start moving in the direction of order,
link |
00:25:16.920
the force factor to drive you towards order is unstoppable.
link |
00:25:21.000
Oh.
link |
00:25:21.840
And every organization will move to the place
link |
00:25:24.880
where their productivity is stymied by order.
link |
00:25:27.120
So you need to...
link |
00:25:28.200
So the question is, who's the counter force?
link |
00:25:31.240
Like, and cause it also feels really good.
link |
00:25:33.360
As you get more organized and productivity goes up,
link |
00:25:36.240
the organization feels it, they orient towards it, right?
link |
00:25:39.720
To hire more people.
link |
00:25:41.080
They get more guys who can run process,
link |
00:25:42.880
you get bigger, right?
link |
00:25:44.760
And then inevitably, the organization gets captured
link |
00:25:49.120
by the bureaucracy that manages all the processes.
link |
00:25:52.800
Yeah.
link |
00:25:53.640
All right. And then humans really like that.
link |
00:25:55.520
And so if you just walk into a room and say,
link |
00:25:57.920
guys, love what you're doing,
link |
00:26:00.960
but I need you to have less order.
link |
00:26:04.960
If you don't have some force behind that,
link |
00:26:06.920
nothing will happen.
link |
00:26:09.080
I can't tell you on how many levels that's profound.
link |
00:26:11.760
So...
link |
00:26:12.600
So that's why I'd say it's a feature.
link |
00:26:14.080
Now, could you be nicer about it?
link |
00:26:17.200
I don't know.
link |
00:26:18.040
I don't know any good examples of being nicer about it.
link |
00:26:21.440
Well, the funny thing is, to get stuff done,
link |
00:26:23.520
you need people who can manage stuff
link |
00:26:25.200
and manage people because humans are complicated.
link |
00:26:26.920
They need lots of care and feeding
link |
00:26:28.160
and you need to tell them they look nice
link |
00:26:29.880
and they're doing good stuff and pat them on the back, right?
link |
00:26:33.120
I don't know.
link |
00:26:33.960
Do you tell me, is that needed?
link |
00:26:36.040
Do humans need that?
link |
00:26:37.080
I had a friend, he started a manager group and he said,
link |
00:26:39.640
I figured it out.
link |
00:26:40.840
You have to praise them before they do anything.
link |
00:26:43.400
I was waiting till they were done
link |
00:26:45.240
and they were always mad at me.
link |
00:26:46.560
Now we tell them what a great job they're doing
link |
00:26:48.200
while they're doing it.
link |
00:26:49.400
But then you get stuck in that trap
link |
00:26:51.080
because then when they're not doing something,
link |
00:26:52.240
how do you confront these people?
link |
00:26:54.080
I think a lot of people that had trauma
link |
00:26:55.920
in their childhood would disagree with you.
link |
00:26:57.560
Successful people that you just first do the rough stuff
link |
00:27:00.680
and then be nice later.
link |
00:27:02.360
I don't know.
link |
00:27:03.200
Okay, but you know, engineering companies are full
link |
00:27:05.440
of adults who had all kinds of range of childhoods.
link |
00:27:08.120
You know, most people had okay childhoods.
link |
00:27:11.440
Well, I don't know if...
link |
00:27:12.760
I know lots of people only work for praise, which is weird.
link |
00:27:15.640
You mean like everybody.
link |
00:27:18.720
I'm not that interested in it, but...
link |
00:27:21.160
Well, you're probably looking
link |
00:27:22.760
for somebody's approval, even still.
link |
00:27:27.440
Yeah, maybe.
link |
00:27:28.280
I should think about that.
link |
00:27:29.560
Maybe somebody who's no longer with us kind of thing.
link |
00:27:33.200
I don't know.
link |
00:27:34.120
I used to call up my dad and tell him what I was doing.
link |
00:27:36.080
He was very excited about engineering and stuff.
link |
00:27:38.640
You got his approval?
link |
00:27:40.360
Yeah, a lot.
link |
00:27:42.080
I was lucky.
link |
00:27:43.360
Like he decided I was smart and unusual as a kid
link |
00:27:47.200
and that was okay when I was really young.
link |
00:27:50.200
So when I did poorly in school, I was dyslexic.
link |
00:27:52.520
I didn't read until I was third or fourth grade.
link |
00:27:55.240
They didn't care.
link |
00:27:56.080
My parents were like, oh, he'll be fine.
link |
00:27:59.760
So I was lucky.
link |
00:28:01.520
That was cool.
link |
00:28:02.480
Is he still with us?
link |
00:28:05.160
You miss him?
link |
00:28:07.560
Sure, yeah, he had Parkinson's and then cancer.
link |
00:28:11.760
His last 10 years were tough.
link |
00:28:15.040
And it killed him.
link |
00:28:15.960
Killing a man like that's hard.
link |
00:28:18.240
The mind?
link |
00:28:19.360
Well, it was pretty good.
link |
00:28:21.440
Parkinson's causes slow dementia
link |
00:28:23.720
and the chemotherapy, I think, accelerated it.
link |
00:28:29.040
But it was like hallucinogenic dementia.
link |
00:28:30.960
So he was clever and funny and interesting
link |
00:28:34.120
and it was pretty unusual.
link |
00:28:37.880
Do you remember conversations from that time?
link |
00:28:41.520
Like what, do you have fond memories of the guy?
link |
00:28:43.920
Yeah, oh yeah.
link |
00:28:45.200
Anything come to mind?
link |
00:28:46.320
A friend told me one time I could draw a computer
link |
00:28:50.360
on the whiteboard faster than anybody he'd ever met.
link |
00:28:52.520
And I said, you should meet my dad.
link |
00:28:54.960
Like when I was a kid, he'd come home and say,
link |
00:28:56.920
I was driving by this bridge and I was thinking about it.
link |
00:28:58.840
And he pulled out a piece of paper
link |
00:28:59.840
and he'd draw the whole bridge.
link |
00:29:01.560
He was a mechanical engineer.
link |
00:29:03.640
And he would just draw the whole thing
link |
00:29:05.040
and then he would tell me about it
link |
00:29:06.320
and then tell me how he would have changed it.
link |
00:29:08.720
And he had this idea that he could understand
link |
00:29:11.920
and conceive anything.
link |
00:29:13.440
And I just grew up with that, so that was natural.
link |
00:29:16.480
So if, you know, like when I interview people,
link |
00:29:18.800
I ask them to draw a picture of something
link |
00:29:20.280
they did on the whiteboard.
link |
00:29:21.840
And it's really interesting.
link |
00:29:22.920
Like some people will draw a little box, you know,
link |
00:29:25.960
and then they'll say, and then this talks to this
link |
00:29:27.840
and I'll be like, that's just frustrating.
link |
00:29:30.120
And then I had this other guy come in one time.
link |
00:29:31.760
He says, well, I designed a floating point in this chip
link |
00:29:34.520
but I'd really like to tell you how the whole thing works
link |
00:29:36.360
and then tell you how the floating point works inside of it.
link |
00:29:38.240
Do you mind if I do that?
link |
00:29:39.080
He covered two whiteboards in like 30 minutes.
link |
00:29:42.040
And I hired him.
link |
00:29:42.880
Like, he was great.
link |
00:29:44.560
There's craftsmen.
link |
00:29:45.400
I mean, there's craftsmanship to that.
link |
00:29:47.040
Yeah, but also the mental agility
link |
00:29:49.480
to understand the whole thing.
link |
00:29:51.360
Right.
link |
00:29:52.200
Put the pieces in context, you know,
link |
00:29:54.760
real view of the balance of how the design worked.
link |
00:29:58.640
Because if you don't understand it properly,
link |
00:30:01.040
when you start to draw it,
link |
00:30:02.240
you'll fill up half the whiteboard
link |
00:30:03.800
with like a little piece of it and, you know,
link |
00:30:06.040
like your ability to lay it out in an understandable way
link |
00:30:09.280
takes a lot of understanding.
link |
00:30:10.600
So. And be able to zoom in to the detail
link |
00:30:13.480
and then zoom out to the picture.
link |
00:30:15.000
Zoom out really fast.
link |
00:30:16.400
What about the impossible thing?
link |
00:30:17.600
You see, your dad believed that you can do anything.
link |
00:30:22.960
That's a weird feature for a craftsman.
link |
00:30:25.520
Yeah.
link |
00:30:26.680
It seems that that echoes in your own behavior.
link |
00:30:30.800
Like, that's the...
link |
00:30:32.120
Well, it's not that anybody can do anything right now.
link |
00:30:36.200
Right.
link |
00:30:37.040
It's that if you work at it, you can get better at it.
link |
00:30:39.680
And there might not be a limit.
link |
00:30:43.080
And they did funny things like,
link |
00:30:44.600
like he always wanted to play piano.
link |
00:30:46.120
So at the end of his life, he started playing the piano.
link |
00:30:48.440
When he had Parkinson's, I mean, he was terrible.
link |
00:30:51.560
But he thought if he really worked at it in this life,
link |
00:30:53.520
maybe the next life, he'd be better at it.
link |
00:30:56.360
He might be onto something.
link |
00:30:57.600
Yeah.
link |
00:30:58.440
Good thing.
link |
00:30:59.760
He enjoyed doing it.
link |
00:31:00.920
Yeah.
link |
00:31:01.760
So that's pretty funny.
link |
00:31:04.120
Do you think the perfect is the enemy of the good
link |
00:31:06.160
in hardware and software engineering?
link |
00:31:08.160
It's like we were talking about JavaScript a little bit
link |
00:31:10.480
and the messiness of the 10 day building process.
link |
00:31:14.760
Yeah.
link |
00:31:15.600
It's, you know, creative tension, right?
link |
00:31:19.080
So creative tension is you have two different ideas
link |
00:31:21.440
that you can't do both.
link |
00:31:23.880
Right.
link |
00:31:24.720
And the, but the fact that you want to do both causes you
link |
00:31:28.360
to go try to solve that problem.
link |
00:31:29.960
That's the creative part.
link |
00:31:32.040
So if you're building computers, like some people say,
link |
00:31:35.960
we have the schedule and anything that doesn't fit
link |
00:31:38.280
in the schedule, we can't do, right?
link |
00:31:40.440
And so they throw out the perfect cause I have a schedule.
link |
00:31:44.280
I hate that.
link |
00:31:46.600
Then there's other people that say,
link |
00:31:48.200
we need to get this perfectly right.
link |
00:31:50.560
And no matter what, you know, more people, more money, right?
link |
00:31:55.480
And there's a really clear idea about what you want.
link |
00:31:57.840
And some people are really good at articulating it, right?
link |
00:32:00.720
So let's call that the perfect.
link |
00:32:02.000
Yeah.
link |
00:32:02.840
Yeah.
link |
00:32:03.680
All right.
link |
00:32:04.520
But that's also terrible cause they never ship anything.
link |
00:32:05.840
They never hit any goals.
link |
00:32:07.360
So now you have the, now you have your framework.
link |
00:32:09.960
Yes.
link |
00:32:10.800
You can't throw out stuff cause you can't get it done today.
link |
00:32:12.760
Cause maybe you get it done tomorrow with the next project.
link |
00:32:15.360
Right.
link |
00:32:16.200
You can't, so you have to, I work with a guy that I really
link |
00:32:19.520
like working with, but he over filters his ideas.
link |
00:32:23.120
Over filters.
link |
00:32:24.760
He'd start thinking about something.
link |
00:32:26.600
And as soon as he figured out what's wrong with it,
link |
00:32:28.000
he'd throw it out.
link |
00:32:29.840
And then I start thinking about it.
link |
00:32:31.240
And I, you know, you come up with an idea and then you find
link |
00:32:33.200
out what's wrong with it.
link |
00:32:34.960
And then you give it a little time to set.
link |
00:32:36.720
Cause sometimes, you know, you figure out how to tweak it
link |
00:32:39.200
or maybe that idea helps some other idea.
link |
00:32:42.560
So idea generation is really funny.
link |
00:32:45.040
So you have to give your idea space.
link |
00:32:46.880
Like spaciousness of mind is key,
link |
00:32:49.720
but you also have to execute programs and get shit done.
link |
00:32:53.360
And then it turns out computer engineering is fun
link |
00:32:55.480
because it takes, you know, a hundred people to build a
link |
00:32:57.200
computer, 200 to 300, whatever the number is.
link |
00:33:00.560
And people are so variable about, you know, temperament and,
link |
00:33:05.480
you know, skill sets and stuff that in a big organization,
link |
00:33:09.400
you find that the people who love the perfect ideas and the
link |
00:33:11.960
people that want to get stuff done yesterday and people like
link |
00:33:14.920
that come up with ideas and people like the, let's say,
link |
00:33:18.280
shoot down ideas.
link |
00:33:19.240
And it takes the whole, it takes a large group of people.
link |
00:33:22.920
So some are good at generating ideas.
link |
00:33:24.600
Some are good at filtering ideas.
link |
00:33:25.960
And in that giant mess, you're somehow, I guess the goal is
link |
00:33:32.120
for that giant mess of people to find the perfect path
link |
00:33:36.080
through the tension, the creative tension.
link |
00:33:38.480
But like, how do you know when you said there's some people
link |
00:33:42.000
good at articulating what perfect looks like,
link |
00:33:43.760
what a good design is?
link |
00:33:44.760
Like if you're sitting in a room and you have a set of ideas
link |
00:33:51.040
about like how to design a better processor,
link |
00:33:55.360
how do you know this is something special here?
link |
00:33:58.840
This is a good idea.
link |
00:33:59.920
Let's try this.
link |
00:34:00.760
So if you ever brainstormed idea with a couple of people
link |
00:34:03.080
that were really smart and you kind of go into it
link |
00:34:05.640
and you don't quite understand it and you're working on it.
link |
00:34:09.720
And then you start, you know, talking about it,
link |
00:34:12.200
putting it on the whiteboard, maybe it takes days or weeks.
link |
00:34:16.160
And then your brain starts to kind of synchronize.
link |
00:34:18.640
It's really weird.
link |
00:34:19.480
Like you start to see what each other is thinking and it starts to work.
link |
00:34:28.440
Like you can see it work. Like my talent in computer design
link |
00:34:30.960
is I can see how computers work in my head like really well.
link |
00:34:35.320
And I know other people can do that too.
link |
00:34:37.360
And when you're working with people that can do that,
link |
00:34:40.440
like it is kind of an amazing experience.
link |
00:34:45.360
And then every once in a while you get to that place
link |
00:34:48.200
and then you find the flaw and it was just kind of funny
link |
00:34:50.200
because you can fool yourself.
link |
00:34:53.760
The two of you kind of drifted along in the direction that was useless.
link |
00:34:58.080
Yeah, that happens too.
link |
00:34:59.440
Like you have to, because, you know, the nice thing about computer design
link |
00:35:04.120
is there's always reduction to practice.
link |
00:35:05.600
Like you come up with your good ideas.
link |
00:35:08.120
And I've noticed some architects who really love ideas
link |
00:35:11.160
and then they work on them and they put it on the shelf
link |
00:35:13.080
and they go work on the next idea and put it on the shelf
link |
00:35:14.800
and they never reduce it to practice.
link |
00:35:16.840
So they never find out what's good and bad
link |
00:35:18.760
because almost every time I've done something really new,
link |
00:35:22.480
by the time it's done, like the good parts are good,
link |
00:35:25.640
but I know all the flaws, like...
link |
00:35:27.600
Yeah, would you say your career, just your own experience?
link |
00:35:31.560
Is your career defined mostly by flaws or by successes?
link |
00:35:35.240
Like if...
link |
00:35:36.080
Again, there's a great tension between those.
link |
00:35:38.000
If you haven't tried hard, right, and done something new,
link |
00:35:43.000
right, then you're not gonna be facing the challenges
link |
00:35:45.760
when you build it, then you find out all the problems with it.
link |
00:35:49.120
And...
link |
00:35:49.960
But when you look back, do you see problems or...
link |
00:35:52.600
Oh, when I look back, I think earlier in my career,
link |
00:35:57.480
like EV5 was the second Alpha chip,
link |
00:36:00.320
I was so embarrassed about the mistakes,
link |
00:36:03.720
I could barely talk about it.
link |
00:36:05.720
And it was in the Guinness Book of World Records
link |
00:36:07.520
and it was the fastest processor on the planet.
link |
00:36:09.840
Yeah.
link |
00:36:10.680
So it was, and at some point I realized
link |
00:36:13.120
that was really a bad mental framework to deal with,
link |
00:36:16.400
like doing something new, we did a bunch of new things
link |
00:36:18.320
and some of them worked out great and some were bad.
link |
00:36:20.560
And we learned a lot from it and then the next one,
link |
00:36:24.120
we learned a lot.
link |
00:36:24.960
That also, EV6 also had some really cool things in it.
link |
00:36:28.960
I think the proportion of good stuff went up,
link |
00:36:31.360
but it had a couple of fatal flaws in it
link |
00:36:33.840
that were painful.
link |
00:36:35.800
And then...
link |
00:36:37.360
You learned to channel the pain into like pride.
link |
00:36:40.880
Not pride really, just realization
link |
00:36:44.680
about how the world works or how that kind of idea set works.
link |
00:36:48.560
Life is suffering, that's the reality.
link |
00:36:50.640
What...
link |
00:36:51.480
No, it's not.
link |
00:36:52.800
Well...
link |
00:36:53.640
I know the Buddhists have that and a couple of other people
link |
00:36:55.640
are stuck on it.
link |
00:36:56.480
No, it's, you know, there's just kind of weird combination
link |
00:36:59.840
of good and bad and light and darkness
link |
00:37:03.080
that you have to deal with.
link |
00:37:04.800
Yeah, there's definitely lots of suffering in the world.
link |
00:37:07.600
Depends on the perspective.
link |
00:37:08.840
It seems like there's way more darkness,
link |
00:37:10.640
but that makes the light part really nice.
link |
00:37:13.760
What...
link |
00:37:14.600
in computing hardware or just any kind of, even software design,
link |
00:37:21.200
do you find beautiful? From your own work,
link |
00:37:24.800
from other people's work?
link |
00:37:27.600
We were just talking about the battleground
link |
00:37:37.240
of flaws and mistakes and errors,
link |
00:37:39.160
but things that were just beautifully done.
link |
00:37:42.440
Is there something that pops to mind?
link |
00:37:44.400
Well, when things are beautifully done,
link |
00:37:47.840
usually there's a well thought out set of abstraction layers.
link |
00:37:53.640
So the whole thing works in unison nicely.
link |
00:37:56.360
Yes.
link |
00:37:57.280
And when I say abstraction layer,
link |
00:37:59.280
that means two different components
link |
00:38:01.080
when they work together, they work independently.
link |
00:38:04.840
They don't have to know what the other one is doing.
link |
00:38:07.640
So that decoupling.
link |
00:38:08.600
Yeah, so the famous one was the network stack.
link |
00:38:11.400
Like there's a seven layer network stack,
link |
00:38:13.000
you know, data transport and protocol and all the layers.
link |
00:38:16.280
And the innovation was when they really got that right.
link |
00:38:19.880
Because networks before that didn't define those very well.
link |
00:38:22.840
The layers could innovate independently
link |
00:38:26.120
and occasionally the layer boundary would...
link |
00:38:28.640
The interface would be upgraded.
link |
00:38:30.920
And that let, you know, the design space breathe.
link |
00:38:35.640
You could do something new in layer seven
link |
00:38:37.760
without having to worry about how layer four worked.
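(A miniature of that layering in C, with hypothetical names and two layers standing in for seven: each layer wraps the payload from above and hands it down through a narrow interface, so a layer can change without the ones below noticing.)

```c
#include <stdio.h>

/* "Layer 3": only ever sees an opaque frame. */
static void network_send(const char *frame) {
    printf("wire: %s\n", frame);
}

/* "Layer 4": adds its own header; never looks inside the payload. */
static void transport_send(const char *payload) {
    char frame[256];
    snprintf(frame, sizeof frame, "TCP|%s", payload);
    network_send(frame);
}

/* "Layer 7": can be rewritten freely without touching the layers below. */
static void app_send(const char *msg) {
    transport_send(msg);
}

int main(void) {
    app_send("GET /index.html");
    return 0;
}
```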
link |
00:38:40.520
And so good design does that.
link |
00:38:42.920
And you see it in processor designs.
link |
00:38:45.160
When we did the Zen design at AMD,
link |
00:38:48.520
we made several components very modular.
link |
00:38:51.880
And, you know, my insistence at the top was
link |
00:38:54.640
I wanted all the interfaces defined
link |
00:38:56.560
before we wrote the RTL for the pieces.
link |
00:38:59.280
One of the verification leads said,
link |
00:39:01.000
if we do this right, I can test the pieces
link |
00:39:03.520
so well independently.
link |
00:39:04.840
When we put it together,
link |
00:39:06.400
we won't find all these interaction bugs
link |
00:39:08.080
because the floating point knows how the cache works.
link |
00:39:10.680
And I was a little skeptical, but he was mostly right.
link |
00:39:14.160
The modular design greatly improved the quality.
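A toy illustration of that methodology, not AMD's actual flow: pin the interface contract down first, then verify each module against it in isolation.

    CACHE_LINE_BYTES = 64             # the agreed-upon interface contract

    def cache_read(addr: int) -> bytes:
        # stub cache model: always returns one aligned line
        base = addr - (addr % CACHE_LINE_BYTES)
        return bytes((base + i) % 256 for i in range(CACHE_LINE_BYTES))

    def fpu_consume(line: bytes) -> float:
        # stub floating point unit: assumes only the contract, not cache internals
        assert len(line) == CACHE_LINE_BYTES
        return sum(line) / len(line)

    # each side can be tested exhaustively on its own...
    assert len(cache_read(0x1234)) == CACHE_LINE_BYTES
    assert fpu_consume(bytes(CACHE_LINE_BYTES)) == 0.0
    # ...so integration mostly just works: the FPU never "knows how the cache works"
    print(fpu_consume(cache_read(0x40)))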
link |
00:39:18.920
Is that universally true in general?
link |
00:39:20.480
Would you say about good designs,
link |
00:39:21.800
that modularity is usually the answer?
link |
00:39:24.080
Well, we talked about this before.
link |
00:39:25.120
Humans are only so smart.
link |
00:39:26.360
And we're not getting any smarter, right?
link |
00:39:29.440
But the complexity of things is going up.
link |
00:39:32.240
So, you know, a beautiful design
link |
00:39:35.520
can't be bigger than the person doing it.
link |
00:39:37.960
It's just, you know, their piece of it.
link |
00:39:40.000
Like the odds of you doing a really beautiful design
link |
00:39:42.440
of something that's way too hard for you is low, right?
link |
00:39:46.560
If it's way too simple for you,
link |
00:39:48.000
it's not that interesting.
link |
00:39:49.000
It's like, well, anybody could do that.
link |
00:39:50.600
But when you get the right match of your expertise
link |
00:39:54.720
and, you know, mental power to the right design size,
link |
00:39:58.680
that's cool, but that's not big enough
link |
00:40:00.400
to make a meaningful impact in the world.
link |
00:40:02.240
So now you have to have some framework
link |
00:40:04.880
to design the pieces so that the whole thing
link |
00:40:08.080
is big and harmonious.
link |
00:40:10.080
But, you know, when you put it together,
link |
00:40:13.520
it's, you know, sufficiently interesting to be used.
link |
00:40:18.880
And, you know, so that's what a beautiful design is.
link |
00:40:21.480
Matching the limits of that human cognitive capacity
link |
00:40:28.000
to the module you can create
link |
00:40:30.360
and creating a nice interface between those modules.
link |
00:40:33.160
And thereby, do you think there's a limit
link |
00:40:34.560
to the kind of beautiful complex systems
link |
00:40:37.120
we can build with this kind of modular design?
link |
00:40:41.000
It's like, you know, if we build increasingly more complicated,
link |
00:40:46.520
you can think of like the internet, okay, let's scale it down.
link |
00:40:50.920
No, you can think of like social network,
link |
00:40:52.320
like Twitter as one computing system.
link |
00:40:57.320
And, but those are the little modules, right?
link |
00:41:00.840
Well, it's built on, it's built on so many components
link |
00:41:03.800
nobody at Twitter even understands.
link |
00:41:06.000
Right.
link |
00:41:06.840
So if an alien showed up and looked at Twitter,
link |
00:41:09.280
he wouldn't just see Twitter as a beautiful,
link |
00:41:11.160
simple thing that everybody uses, which is really big.
link |
00:41:14.400
You would see the network, it runs on the fiber optics,
link |
00:41:18.160
the data is transported to computers.
link |
00:41:19.840
The whole thing is so bloody complicated,
link |
00:41:22.040
nobody at Twitter understands it.
link |
00:41:23.720
And so that's what the alien would see.
link |
00:41:25.760
So yeah, if an alien showed up and looked at Twitter
link |
00:41:28.800
or looked at the various different network systems
link |
00:41:32.040
that you can see on earth.
link |
00:41:33.680
So imagine they were really smart
link |
00:41:35.000
that could comprehend the whole thing.
link |
00:41:36.720
And then they sort of, you know, evaluated the human
link |
00:41:40.160
and thought, this is really interesting.
link |
00:41:41.600
No human on this planet comprehends the system they built.
link |
00:41:45.560
No individual, well, would they even see individual humans?
link |
00:41:48.200
That's interesting, like we humans are very human centric,
link |
00:41:51.120
entity centric.
link |
00:41:52.760
And so we think of us as the central organism
link |
00:41:56.880
and the networks as just the connection of organisms,
link |
00:41:59.840
but from a perspective of an alien,
link |
00:42:02.520
from an outside perspective, it seems like.
link |
00:42:05.400
Yeah, I get it, we're the ants to the ant colony.
link |
00:42:08.960
The ant colony, yeah.
link |
00:42:10.480
Or the result of production of the ant colony,
link |
00:42:12.760
which is like cities.
link |
00:42:16.280
In that sense, humans are pretty impressive.
link |
00:42:19.840
The modularity that we're able to achieve,
link |
00:42:23.080
and how robust we are to noise and mutation,
link |
00:42:25.920
all that kind of stuff.
link |
00:42:26.760
Well, that's cause it's stress tested all the time.
link |
00:42:28.480
Yeah.
link |
00:42:29.320
You know, you build all these cities with buildings
link |
00:42:31.040
and you get earthquakes occasionally and.
link |
00:42:33.000
Wars.
link |
00:42:33.840
You know, wars, earthquakes.
link |
00:42:35.520
Viruses every once in a while.
link |
00:42:37.960
Changes in business plans for, you know,
link |
00:42:40.320
like shipping or something.
link |
00:42:41.600
Like, as long as it's all stress tested,
link |
00:42:44.720
then it keeps adapting to the situation.
link |
00:42:48.520
So that's a curious phenomena.
link |
00:42:52.480
Well, let's go, let's talk about Moore's Law a little bit.
link |
00:42:54.840
At the broad view of Moore's Law,
link |
00:43:00.160
where it's just exponential improvement of computing
link |
00:43:03.760
capability, like OpenAI, for example,
link |
00:43:06.760
recently published these kinds of papers looking
link |
00:43:11.760
at the exponential improvement in the training efficiency
link |
00:43:15.400
of neural networks.
link |
00:43:16.760
For like ImageNet and all that kind of stuff,
link |
00:43:18.560
we just got better on this,
link |
00:43:19.920
this is purely software side,
link |
00:43:22.280
just figuring out better tricks and algorithms
link |
00:43:25.600
for training neural networks.
link |
00:43:26.920
And that seems to be improving significantly faster
link |
00:43:30.600
than the Moore's Law prediction, you know?
link |
00:43:33.080
So that's in the software space.
link |
00:43:34.920
Like, what do you think if Moore's Law continues
link |
00:43:39.120
or if the general version of Moore's Law continues,
link |
00:43:42.880
do you think that comes mostly from the hardware,
link |
00:43:45.320
from the software, some mix of the two,
link |
00:43:47.560
some interesting totally,
link |
00:43:50.000
so not the reduction of the size of the transistor
link |
00:43:52.800
kind of thing, but more in the totally interesting
link |
00:43:57.800
kinds of innovations in the hardware space,
link |
00:43:59.840
all that kind of stuff.
link |
00:44:01.240
Well, there's like half a dozen things going on
link |
00:44:04.480
in that graph.
link |
00:44:05.560
So one is there's initial innovations
link |
00:44:08.480
that had a lot of headroom to be exploited.
link |
00:44:11.680
So, you know, the efficiency of the networks
link |
00:44:13.960
has improved dramatically.
link |
00:44:15.920
And then the decomposability of those and the use,
link |
00:44:19.600
you know, they started running on one computer,
link |
00:44:21.400
then multiple computers and then multiple GPUs
link |
00:44:23.720
and then arrays of GPUs and they're up to thousands.
link |
00:44:27.080
And at some point, so it's sort of like,
link |
00:44:30.640
they were consumed, they were going from
link |
00:44:32.280
like a single computer application
link |
00:44:33.880
to a thousand computer application.
link |
00:44:36.240
So that's not really a Moore's Law thing.
link |
00:44:38.200
That's an independent vector.
link |
00:44:39.560
How many computers can I put on this problem?
link |
00:44:42.360
Because the computers themselves are getting better
link |
00:44:44.240
on like a Moore's Law rate,
link |
00:44:46.000
but their ability to go from one to 10 to 100 to a thousand
link |
00:44:49.440
you know, was something.
link |
00:44:51.200
And then multiplied by, you know,
link |
00:44:53.320
the amount of computers it took to resolve
link |
00:44:55.320
like AlexNet, to ResNet, to transformers.
link |
00:44:58.320
It's been quite, you know, steady improvements.
link |
00:45:01.720
But those are like S curves, aren't they?
link |
00:45:03.320
That's exactly the kind of S curves
link |
00:45:04.960
that are underlying Moore's Law from the very beginning.
link |
00:45:07.640
So what's the biggest, what's the most productive,
link |
00:45:13.400
rich source of S curves in the future, do you think?
link |
00:45:16.760
Is it hardware or is it software?
link |
00:45:18.720
So hardware is going to move along relatively slowly.
link |
00:45:23.600
Like, you know, double performance every two years.
link |
00:45:27.040
There's still, I like how you call that slow.
link |
00:45:29.600
You know, it's the slow version.
link |
00:45:31.400
The snail's pace of Moore's Law.
link |
00:45:33.160
Maybe we should trademark that one.
link |
00:45:38.080
Whereas the scaling by number of computers,
link |
00:45:41.480
you know, can go much faster.
link |
00:45:43.400
You know, I'm sure at some point, Google had a,
link |
00:45:46.080
you know, their initial search engine
link |
00:45:47.520
was running on a laptop, you know, like,
link |
00:45:50.080
and at some point they really worked on scaling that.
link |
00:45:52.520
And then they factored the indexer from, you know,
link |
00:45:55.880
this piece and this piece and this piece
link |
00:45:57.440
and they spread the data on more and more things.
link |
00:45:59.280
And, you know, they did a dozen innovations.
link |
00:46:02.760
But as they scaled up the number of computers on that,
link |
00:46:05.360
it kept breaking, finding new bottlenecks in their software
link |
00:46:08.280
and their schedulers and made them rethink.
link |
00:46:11.720
Like, it seems insane to do a scheduler
link |
00:46:13.920
across a thousand computers who schedule parts of it
link |
00:46:16.720
and then send the results to one computer.
link |
00:46:19.000
But if you want to schedule a million searches,
link |
00:46:21.400
that makes perfect sense.
link |
00:46:23.200
So there's, the scaling by just quantity
link |
00:46:26.880
is probably the richest thing.
link |
00:46:28.960
But then as you scale quantity, like a network
link |
00:46:32.880
that was great on a hundred computers,
link |
00:46:34.680
maybe completely the wrong one,
link |
00:46:36.560
you may pick a network that's 10 times slower
link |
00:46:39.640
on 10,000 computers, like per computer.
link |
00:46:42.520
But if you go from a hundred to 10,000, that's a hundred times.
link |
00:46:45.800
So that's one of the things that happened
link |
00:46:47.240
when we did internet scaling.
link |
00:46:48.760
This efficiency went down, not up.
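The arithmetic behind that tradeoff, as a quick sketch with made-up units:

    # a network that is 10x slower per computer can still win 10x
    # overall when you run it at 100x the scale
    per_node_speed_small = 1.0        # arbitrary units, the 100-computer design
    per_node_speed_big = 0.1          # 10x slower per computer at 10,000 nodes

    total_small = 100 * per_node_speed_small      # 100 units of work per second
    total_big = 10_000 * per_node_speed_big       # 1,000 units of work per second

    print(total_big / total_small)    # -> 10.0: inefficient per node, but it scales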
link |
00:46:52.560
The future of computing is inefficiency, not efficiency.
link |
00:46:55.520
But scale, inefficient scale.
link |
00:46:57.600
It's scaling faster than the inefficiency bites you.
link |
00:47:01.840
And as long as there's, you know, dollar value there,
link |
00:47:03.840
like scaling costs lots of money.
link |
00:47:06.000
But Google showed, Facebook showed, everybody showed
link |
00:47:08.200
that the scale was where the money was at.
link |
00:47:10.720
It was worth the financial cost.
link |
00:47:13.800
Do you think, is it possible
link |
00:47:16.440
that like basically the entirety of Earth
link |
00:47:19.640
will be like a computing surface?
link |
00:47:21.800
Like this table will be doing computing.
link |
00:47:24.440
This hedgehog will be doing computing.
link |
00:47:26.120
Like everything, really inefficient,
link |
00:47:28.160
dumb computing will be all over.
link |
00:47:29.560
Science fiction books, they call it computronium.
link |
00:47:31.840
Computronium?
link |
00:47:32.680
We turn everything into computing.
link |
00:47:34.720
Well, most of the elements aren't very good for anything.
link |
00:47:38.000
Like you're not gonna make a computer out of iron.
link |
00:47:39.960
Like, you know, silicon and carbon
link |
00:47:42.560
have like nice structures.
link |
00:47:45.080
You know, we'll see what you can do with the rest of it.
link |
00:47:48.680
People talk about, well, maybe you can turn the sun
link |
00:47:50.440
into a computer, but it's hydrogen.
link |
00:47:53.440
And a little bit of helium, so.
link |
00:47:55.800
What I mean is more like actually
link |
00:47:57.960
just adding computers to everything.
link |
00:47:59.920
Oh, okay.
link |
00:48:00.760
I thought you were just converting all the mass
link |
00:48:02.560
of the universe into computer.
link |
00:48:04.240
No, no, no.
link |
00:48:05.080
So not using.
link |
00:48:05.920
To be ironic, from the simulation point of view,
link |
00:48:07.600
it's like the simulator built mass to simulate.
link |
00:48:12.000
Yeah, I mean, yeah.
link |
00:48:12.840
So, I mean, ultimately this is all heading
link |
00:48:14.960
towards the simulation.
link |
00:48:15.800
Yeah, well, I think I might have told you this story.
link |
00:48:18.440
At Tesla, they were deciding,
link |
00:48:20.280
so they wanna measure the current coming out of the battery
link |
00:48:22.400
and they decided between putting a resistor in there
link |
00:48:25.880
and putting a computer with a sensor in there.
link |
00:48:29.360
And the computer was faster than the computer
link |
00:48:31.480
I worked on in 1982.
link |
00:48:34.120
And we chose the computer
link |
00:48:35.520
because it was cheaper than the resistor.
link |
00:48:38.640
So, sure, this hedgehog, you know, costs $13
link |
00:48:42.320
and we can put an AI that's as smart as you
link |
00:48:45.120
in there for five bucks.
link |
00:48:46.040
It'll have one.
link |
00:48:48.240
You know, so computers will be everywhere.
link |
00:48:51.760
I was hoping it wouldn't be smarter than me
link |
00:48:53.720
because...
link |
00:48:54.600
Well, everything's gonna be smarter than you.
link |
00:48:56.640
But you were saying it's inefficient.
link |
00:48:58.000
I thought it was better to have a lot of dumb things.
link |
00:49:00.200
Well, Moore's Law will slowly compact that stuff.
link |
00:49:02.720
So even the dumb things will be smarter than us.
link |
00:49:04.840
The dumb things are gonna be smart
link |
00:49:06.000
or they're gonna be smart enough to talk to something
link |
00:49:08.000
that's really smart.
link |
00:49:10.160
You know, it's like, well, just remember,
link |
00:49:13.600
like a big computer chip, you know,
link |
00:49:16.120
it's like an inch by an inch and, you know,
link |
00:49:18.680
40 microns thick, it doesn't take very much,
link |
00:49:22.440
very many atoms to make a high power computer.
link |
00:49:25.520
And 10,000 of them can fit in a shoebox.
link |
00:49:29.040
But, you know, you have the cooling and power problems,
link |
00:49:31.440
but, you know, people are working on that.
link |
00:49:33.480
But they still can't write compelling poetry or music
link |
00:49:37.640
or understand what love is
link |
00:49:40.360
or have a fear of mortality.
link |
00:49:41.680
So we're still winning.
link |
00:49:43.480
Neither can most of humanity, so...
link |
00:49:46.160
Well, they can write books about it.
link |
00:49:48.240
So, but speaking about this walk along the path
link |
00:49:56.080
of innovation towards the dumb things
link |
00:49:58.760
being smarter than humans,
link |
00:50:00.120
you are now the CTO of Tenstorrent as of two months ago.
link |
00:50:08.560
They built hardware for deep learning.
link |
00:50:13.840
How do you build scalable and efficient deep learning?
link |
00:50:16.160
This is such a fascinating space.
link |
00:50:17.520
Yeah, yeah, so it's interesting.
link |
00:50:18.760
So up until recently,
link |
00:50:20.800
I thought there was two kinds of computers.
link |
00:50:22.360
There are serial computers that run like C programs,
link |
00:50:25.400
and then there's parallel computers.
link |
00:50:27.120
So the way I think about it is, you know,
link |
00:50:29.360
parallel computers have given parallelism.
link |
00:50:31.920
Like GPUs are great cause you have a million pixels.
link |
00:50:34.800
And modern GPUs run a program on every pixel.
link |
00:50:37.520
They call it the shader program, right?
link |
00:50:39.400
So, or like finite element analysis.
link |
00:50:42.480
You built something, you know,
link |
00:50:43.960
you make this into little tiny chunks.
link |
00:50:45.560
You give each chunk to a computer.
link |
00:50:47.120
So you're given all these chunks of parallelism like that.
link |
00:50:50.200
But most C programs, you write this linear narrative
link |
00:50:53.560
and you have to make it go fast.
link |
00:50:55.600
To make it go fast, you predict all the branches,
link |
00:50:57.720
all the data fetches and you run that more in parallel,
link |
00:51:00.280
but that's found parallelism.
link |
00:51:04.240
AI is, I'm still trying to decide how fundamental this is.
link |
00:51:08.400
It's a given parallelism problem.
link |
00:51:10.920
But the way people describe the neural networks
link |
00:51:14.800
and then how they write them in PyTorch, it makes graphs.
link |
00:51:17.920
Yeah.
link |
00:51:18.760
That might be fundamentally different than the GPU kind of.
link |
00:51:21.680
Parallelism, yeah, it might be.
link |
00:51:23.280
Because when you run the GPU program on all the pixels,
link |
00:51:27.320
you're running, you know, depends, you know,
link |
00:51:29.880
this group of pixels say it's background blue
link |
00:51:32.520
and it runs a really simple program.
link |
00:51:34.000
This pixel is, you know, some patch of your face.
link |
00:51:36.920
So you have some really interesting shader program
link |
00:51:39.520
to give you the impression of translucency.
link |
00:51:41.720
But the pixels themselves don't talk to each other.
link |
00:51:43.960
There's no graph, right?
link |
00:51:46.600
So you do the image
link |
00:51:48.200
and then you do the next image and you do the next image
link |
00:51:51.280
and you run 8 million pixels, 8 million programs every time
link |
00:51:55.600
and modern GPUs have like 6,000 thread engines in them.
link |
00:51:59.600
So, you know, to get 8 million pixels,
link |
00:52:02.080
each one runs a program on, you know, 10 or 20 pixels.
link |
00:52:06.160
And that's how they work, there's no graph.
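A minimal Python sketch of that given parallelism; the shader here is a made-up toy, but the point is that every pixel's program is independent:

    def shade(x, y):
        # a toy "shader": this pixel's brightness depends only on its own coordinates
        return (x * 0.013 + y * 0.029) % 1.0

    width, height = 64, 64            # tiny frame so the sketch runs instantly
    frame = [[shade(x, y) for x in range(width)] for y in range(height)]
    # every call above is independent of every other one, so thousands of
    # thread engines can each take a handful of pixels and run them at once
    print(frame[0][0], frame[32][32])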
link |
00:52:09.360
But you think graph might be a totally new way
link |
00:52:13.680
to think about hardware.
link |
00:52:14.840
So, Raja Koduri and I have been having this good conversation
link |
00:52:18.080
about given versus found parallelism.
link |
00:52:20.560
And then the kind of walk,
link |
00:52:22.480
as we got more transistors, like, you know,
link |
00:52:24.640
computers way back when did stuff on scalar data.
link |
00:52:27.800
Then we did on vector data, famous vector machines.
link |
00:52:30.720
Now we're making computers that operate on matrices, right?
link |
00:52:34.480
And then the category we said that was next was spatial.
link |
00:52:38.840
Like imagine you have so much data that, you know,
link |
00:52:41.640
you want to do the compute on this data.
link |
00:52:43.360
And then when it's done,
link |
00:52:44.920
it says send the result to this pile of data
link |
00:52:47.520
and run some software on that.
link |
00:52:49.240
And it's better to think about it spatially
link |
00:52:53.040
than to move all the data to a central processor
link |
00:52:56.080
and do all the work.
link |
00:52:57.560
So, spatially, I mean, moving in the space of data
link |
00:53:00.720
as opposed to moving the data.
link |
00:53:02.440
Yeah, you have a petabyte data space
link |
00:53:05.320
spread across some huge array of computers.
link |
00:53:08.600
And when you do a computation somewhere,
link |
00:53:10.520
you send the result of that computation
link |
00:53:12.240
or maybe a pointer to the next program
link |
00:53:14.320
to some other piece of data and do it.
link |
00:53:16.600
But I think a better word might be graph
link |
00:53:18.760
and all the AI neural networks are graphs.
link |
00:53:21.640
Do some computations and the result here,
link |
00:53:24.000
do another computation, do a data transformation,
link |
00:53:26.360
do a merging, do a pooling, do another computation.
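Here is a tiny sketch of that graph execution style in Python; the three-node graph and the scheduler are illustrative, not Tenstorrent's design:

    import numpy as np

    # each node does its op, then its result flows along edges to the next nodes
    graph = {
        "matmul": (lambda a, b: a @ b, ["in_a", "in_b"]),
        "relu": (lambda x: np.maximum(x, 0), ["matmul"]),
        "pool": (lambda x: x.mean(axis=0), ["relu"]),
    }

    def run(graph, feeds):
        done = dict(feeds)
        while len(done) < len(feeds) + len(graph):
            for name, (op, deps) in graph.items():
                if name not in done and all(d in done for d in deps):
                    # "send the result onward" to whoever depends on it
                    done[name] = op(*[done[d] for d in deps])
        return done

    out = run(graph, {"in_a": np.ones((4, 4)), "in_b": np.ones((4, 4))})
    print(out["pool"])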
link |
00:53:30.320
Is it possible to compress and say
link |
00:53:32.240
how we make this thing efficient,
link |
00:53:34.520
this whole process efficient, this different?
link |
00:53:37.240
So first, the fundamental elements in the graphs
link |
00:53:40.880
are things like matrix multiplies, convolutions,
link |
00:53:43.160
data manipulations, and data movements.
link |
00:53:46.120
So GPUs emulate those things with their little thread engines,
link |
00:53:50.200
basically running a single threaded program.
link |
00:53:53.080
And then there's what NVIDIA calls a warp,
link |
00:53:55.560
where they group a bunch of programs
link |
00:53:56.880
that are similar together for efficiency
link |
00:53:59.680
and instruction use.
link |
00:54:01.520
And then at a higher level, you take this graph
link |
00:54:04.920
and you say this part of the graph is a matrix multiplier
link |
00:54:07.200
which runs on these GPU threads.
link |
00:54:09.800
But the model at the bottom was built
link |
00:54:12.600
for running programs on pixels, not executing graphs.
link |
00:54:17.120
So it's emulation, ultimately.
link |
00:54:19.400
So is it possible to build something
link |
00:54:21.080
that natively runs graphs?
link |
00:54:23.040
Yes, so it's what Tenstorrent did.
link |
00:54:26.240
So where are we on that?
link |
00:54:28.200
How, like in the history of that effort,
link |
00:54:30.920
are we in the early days?
link |
00:54:32.040
Yeah, I think so.
link |
00:54:33.360
Tenstorrent was started by a friend of mine,
link |
00:54:35.720
Ljubisa Bajic, and I was his first investor.
link |
00:54:39.000
So I've been kind of following him
link |
00:54:41.600
and talking to him about it for years
link |
00:54:43.560
and in the fall when I was considering things to do.
link |
00:54:47.840
I decided... You know, we held a conference last year
link |
00:54:51.560
that a friend helped organize.
link |
00:54:53.720
And we wanted to bring in thinkers
link |
00:54:56.120
and two of the people were Andrej Karpathy and Chris Lattner.
link |
00:55:00.480
And Andrej gave this talk, it's on YouTube,
link |
00:55:03.400
called software 2.0, which I think is great.
link |
00:55:06.840
Which is, we went from programmed computers,
link |
00:55:10.160
where you write programs, to data programmed computers.
link |
00:55:13.760
You know, like the futures of software
link |
00:55:16.760
is data programs, the networks.
link |
00:55:19.360
And I think that's true.
link |
00:55:21.360
And then Chris has been working,
link |
00:55:23.960
he worked on LLVM, the low level virtual machine,
link |
00:55:26.600
which became the intermediate representation
link |
00:55:29.080
for all compilers.
link |
00:55:31.320
And now he's working on another project called MLIR,
link |
00:55:33.640
which is multi-level intermediate representation,
link |
00:55:36.400
which is essentially about the graph,
link |
00:55:39.800
how do you represent that kind of computation
link |
00:55:42.800
and then coordinate large numbers
link |
00:55:44.320
of potentially heterogeneous computers.
link |
00:55:47.840
And I would say technically Tenstorrent is,
link |
00:55:51.480
you know, built on the two pillars of those two ideas,
link |
00:55:54.880
software 2.0 and multi-level representation.
link |
00:55:58.280
But it's in service of executing graph programs.
link |
00:56:01.840
The hardware is designed to do that.
link |
00:56:03.760
So that's including the hardware piece.
link |
00:56:06.440
And then the other cool thing is,
link |
00:56:08.440
for a relatively small amount of money,
link |
00:56:10.040
they did a test chip and two production chips.
link |
00:56:13.320
So it's like a super effective team.
link |
00:56:15.320
And unlike some AI startups,
link |
00:56:18.160
where if you don't build the hardware
link |
00:56:20.120
to run the software that they really want to do,
link |
00:56:22.840
then you have to fix it by writing lots more software.
link |
00:56:26.040
So the hardware naturally does,
link |
00:56:28.200
matrix multiply, convolution, the data manipulations,
link |
00:56:31.800
and the data movement between processing elements
link |
00:56:35.320
that you can see in the graph,
link |
00:56:38.760
which I think is all pretty clever.
link |
00:56:40.320
And that's what I'm working on now.
link |
00:56:45.040
So I think it's called the Grayskull processor,
link |
00:56:49.760
introduced last year.
link |
00:56:51.240
It's, you know, there's a bunch of measures
link |
00:56:53.160
of performance we're talking about, horses.
link |
00:56:55.560
It seems to perform at 368 trillion operations per second.
link |
00:56:59.840
It seems to outperform NVIDIA's Tesla T4 system.
link |
00:57:03.240
So these are just numbers.
link |
00:57:04.720
What do they actually mean in real world performance?
link |
00:57:07.600
Like what are the metrics for you that you're chasing
link |
00:57:11.160
in your horse racing?
link |
00:57:12.520
What do you care about?
link |
00:57:13.840
Well, first, so the native language of,
link |
00:57:17.680
you know, people who write AI network programs
link |
00:57:20.360
is PyTorch now, PyTorch TensorFlow.
link |
00:57:22.520
There's a couple others.
link |
00:57:24.000
PyTorch is winning over TensorFlow,
link |
00:57:25.800
this is just, I'm not an expert on that.
link |
00:57:27.960
I know many people have switched from TensorFlow
link |
00:57:30.440
to PyTorch.
link |
00:57:31.280
Yeah.
link |
00:57:32.120
And there's technical reasons for it.
link |
00:57:33.800
I use both, both are still awesome.
link |
00:57:35.880
Both are still awesome.
link |
00:57:37.120
But the deepest love is for PyTorch currently.
link |
00:57:39.880
Yeah, there's more love for that.
link |
00:57:41.320
And that may change.
link |
00:57:42.560
So the first thing is when they write their programs
link |
00:57:46.640
can the hardware execute it pretty much as it was written.
link |
00:57:50.400
Right.
link |
00:57:51.240
So PyTorch turns into a graph.
link |
00:57:53.280
We have a graph compiler that makes that graph.
link |
00:57:55.520
Then it fractions the graph down.
link |
00:57:57.440
So if you have big matrix multiply,
link |
00:57:58.800
we turn it into right size chunks
link |
00:58:00.120
to run on the processing elements.
link |
00:58:02.160
It hooks all the graph up, it lays out all the data.
link |
00:58:05.120
There's a couple of mid level representations of it
link |
00:58:08.000
that are also simulatable.
link |
00:58:09.400
So that if you're writing the code,
link |
00:58:12.120
you can see how it's gonna go through the machine,
link |
00:58:15.080
which is pretty cool.
link |
00:58:15.920
And then at the bottom it's scheduled kernels
link |
00:58:17.680
like math, data manipulation, data movement kernels,
link |
00:58:21.760
which do this stuff.
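To see the "PyTorch turns into a graph" step concretely, here is a sketch using PyTorch's own torch.fx tracer; the downstream Tenstorrent compiler steps described above are not shown:

    import torch
    import torch.fx

    class Tiny(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.lin = torch.nn.Linear(64, 64)

        def forward(self, x):
            return torch.relu(self.lin(x))

    traced = torch.fx.symbolic_trace(Tiny())
    print(traced.graph)   # nodes: placeholder -> linear -> relu -> output
    # a graph compiler walks nodes like these, splits big matmuls into
    # right-sized chunks, and schedules them onto processing elements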
link |
00:58:22.840
So we don't have to write a little program
link |
00:58:26.200
to do matrix multiply.
link |
00:58:27.320
Because we have a big matrix multiplier.
link |
00:58:29.000
Like there's no SIMT program for that.
link |
00:58:32.360
But there is scheduling for that, right?
link |
00:58:36.000
So one of the goals is if you write a piece of PyTorch code
link |
00:58:40.200
that looks pretty reasonable,
link |
00:58:41.240
you should be able to compile it, run it on the hardware
link |
00:58:43.480
without having to tweak it
link |
00:58:44.760
and do all kinds of crazy things to get performance.
link |
00:58:48.120
There's not a lot of intermediate steps.
link |
00:58:50.040
It's running directly as written.
link |
00:58:51.320
Like on a GPU, if you write a large matrix multiply
link |
00:58:53.960
naively, you'll get 5% to 10%
link |
00:58:56.000
of the peak performance of the GPU, right?
link |
00:58:58.920
And then there's a bunch of people published papers on this
link |
00:59:01.600
and I read them about what steps do you have to do?
link |
00:59:04.080
And it goes from pretty reasonable,
link |
00:59:06.760
well transpose one of the matrices.
link |
00:59:08.480
So you do row order, not column ordered, you know, block it
link |
00:59:12.440
so that you can put a block of the matrix
link |
00:59:14.520
on different SMs, you know, groups of threads.
link |
00:59:19.320
But some of it gets into little details
link |
00:59:21.160
like you have to schedule it just so
link |
00:59:23.000
so you don't have register conflicts.
link |
00:59:25.040
So the, they call them CUDA ninjas.
link |
00:59:29.080
CUDA ninjas, I love it.
link |
00:59:31.120
To get to the optimal point,
link |
00:59:32.360
you either use a prewritten library,
link |
00:59:36.080
which is a good strategy for some things
link |
00:59:37.920
or you have to be an expert
link |
00:59:39.640
in micro architecture to program it.
link |
00:59:42.240
Right, so the optimization step
link |
00:59:43.520
is way more complicated with the GPU.
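A miniature of the blocking trick from those papers, in Python with NumPy; real CUDA kernels tile into shared memory and registers, which this sketch only gestures at:

    import numpy as np

    def matmul_blocked(a, b, block=32):
        # naive triple loop, restructured to work on right-sized tiles
        n = a.shape[0]
        c = np.zeros((n, n))
        for i in range(0, n, block):
            for j in range(0, n, block):
                for k in range(0, n, block):
                    # on real hardware, each tile fits in fast local memory
                    c[i:i + block, j:j + block] += (
                        a[i:i + block, k:k + block] @ b[k:k + block, j:j + block]
                    )
        return c

    a, b = np.random.rand(128, 128), np.random.rand(128, 128)
    assert np.allclose(matmul_blocked(a, b), a @ b)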
link |
00:59:45.000
So our goal is, if you write PyTorch
link |
00:59:47.920
that's good PyTorch, you can do it.
link |
00:59:49.600
Now there's, as the networks are evolving, you know
link |
00:59:53.120
they've changed from convolutional to matrix multiply.
link |
00:59:56.360
The people are talking about conditional graphs,
link |
00:59:58.080
they're talking about very large matrices,
link |
00:59:59.800
they're talking about sparsity.
link |
01:00:01.760
They're talking about problems
link |
01:00:03.400
that scale across many, many chips.
link |
01:00:06.160
So the native, you know, data item is a packet.
link |
01:00:11.520
Like so you send the packet to a processor,
link |
01:00:13.360
it gets processed, it does a bunch of work
link |
01:00:15.440
and then it may send packets to other processors
link |
01:00:17.680
and they execute like a data flow graph kind of methodology.
link |
01:00:22.120
Got it. We have a big network on chip
link |
01:00:24.400
and then that second chip has 16 ethernet ports
link |
01:00:27.800
to hook lots of them together
link |
01:00:29.600
and it's the same graph compiler across multiple chips.
link |
01:00:32.440
So that's where the scale comes in.
link |
01:00:33.600
So it's built to scale naturally.
link |
01:00:35.160
Now, my experience with scaling is as you scale
link |
01:00:38.200
you run into lots of interesting problems.
link |
01:00:40.800
So scaling is the mountain to climb.
link |
01:00:43.240
Yeah. So the hardware is built to do this
link |
01:00:45.040
and then we're in the process of...
link |
01:00:47.720
Is there a software part to this?
link |
01:00:49.200
With ethernet and all that?
link |
01:00:51.680
Well, the protocol at the bottom, you know, we send,
link |
01:00:55.680
it's an Ethernet PHY, but the protocol basically says
link |
01:00:59.800
send the packet from here to there.
link |
01:01:01.480
It's all point to point.
link |
01:01:03.160
The header bit says which processor to send it to
link |
01:01:05.880
and we basically take a packet off our on chip network,
link |
01:01:09.600
put an ethernet header on it, send it to the other end,
link |
01:01:13.040
strip the header off and send it to the local thing.
link |
01:01:14.920
It's pretty straightforward.
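A sketch of that framing in Python; the header fields here are made up for illustration, not Tenstorrent's actual packet format:

    import struct

    HDR = struct.Struct("!HH")        # (dest_processor, payload_length), big-endian

    def wrap(dest: int, payload: bytes) -> bytes:
        # put a header on a packet coming off the on-chip network
        return HDR.pack(dest, len(payload)) + payload

    def unwrap(frame: bytes):
        # strip the header off at the other end and deliver locally
        dest, length = HDR.unpack_from(frame)
        return dest, frame[HDR.size:HDR.size + length]

    frame = wrap(dest=7, payload=b"tensor-tile")
    print(unwrap(frame))              # -> (7, b'tensor-tile')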
link |
01:01:16.160
Human to human interaction is pretty straightforward too
link |
01:01:18.200
but when you get a million of us
link |
01:01:19.400
we do some crazy stuff together.
link |
01:01:21.680
Yeah, it can be fun.
link |
01:01:23.400
So is that the goal is scale?
link |
01:01:25.880
So like, for example, I've been recently doing a bunch
link |
01:01:28.960
of robots at home for my own personal pleasure.
link |
01:01:32.400
Am I ever going to use Tenstorrent or is this more for...
link |
01:01:35.840
There's all kinds of problems.
link |
01:01:37.240
Like there's small inference problems
link |
01:01:38.760
or small training problems or big training problems.
link |
01:01:41.480
What's the big goal?
link |
01:01:42.720
Is it the big training problems
link |
01:01:45.120
or the small training problems?
link |
01:01:46.320
There's one of the goals is to scale
link |
01:01:48.080
from 100 milliwatts to a megawatt.
link |
01:01:51.720
So like really have some range on the problems
link |
01:01:54.840
and the same kind of AI programs work
link |
01:01:57.360
at all different levels.
link |
01:01:59.320
So that's cool.
link |
01:02:00.600
The natural, since the natural data item is a packet
link |
01:02:03.600
that we can move around, it's built to scale
link |
01:02:07.640
but so many people have small problems.
link |
01:02:11.560
Right, right.
link |
01:02:12.400
But like I said, that phone is a small problem to solve.
link |
01:02:16.400
So do you see Tenstorrent potentially being inside a phone?
link |
01:02:19.960
Well, the power efficiency of local memory,
link |
01:02:22.600
local computation and the way we built it is pretty good.
link |
01:02:26.320
And then there's a lot of efficiency
link |
01:02:28.480
on being able to do conditional graphs and sparsity.
link |
01:02:31.480
I think it's for complicated networks
link |
01:02:34.480
that want to go in a small form factor, it's going to be quite good.
link |
01:02:38.120
But we have to prove that. That's a fun problem.
link |
01:02:40.720
And that's the early days of the company, right?
link |
01:02:42.240
It's a couple of years, you said.
link |
01:02:44.560
But you think, you invested, you think they're legit
link |
01:02:47.560
as you join, well, that's...
link |
01:02:50.000
Well, it's also, it's a really interesting place to be.
link |
01:02:53.240
Like the AI world is exploding, you know?
link |
01:02:55.720
And I looked at some other opportunities
link |
01:02:58.480
like build a faster processor, which people want.
link |
01:03:01.520
But that's more on an incremental path
link |
01:03:03.760
than what's going to happen in AI in the next 10 years.
link |
01:03:07.920
So this is kind of an exciting place to be part of.
link |
01:03:12.280
The revolutions will be happening in the very space that's...
link |
01:03:15.200
And then lots of people are working on it,
link |
01:03:16.600
but there's lots of technical reasons why some of them,
link |
01:03:18.840
you know, aren't going to work out that well.
link |
01:03:21.440
And that's interesting.
link |
01:03:23.600
And there's also the same problem about getting the basics right.
link |
01:03:27.520
Like we've talked to customers about exciting features.
link |
01:03:30.160
And at some point, we realized that each of the customers,
link |
01:03:32.960
really, they want to hear first about memory bandwidth,
link |
01:03:35.880
local bandwidth, compute intensity, programmability.
link |
01:03:39.240
They want to know the basics, power management,
link |
01:03:42.000
how the network ports work.
link |
01:03:43.320
What are the basics?
link |
01:03:44.160
Do all the basics work?
link |
01:03:46.120
Because it's easy to say we've got this great idea
link |
01:03:47.800
of, you know, cracking GPT3.
link |
01:03:51.040
But the people we talk to want to say,
link |
01:03:54.000
if I buy the...
link |
01:03:55.360
So we have a PCI Express card with our chip on it.
link |
01:03:58.640
If you buy the card, you plug it in your machine
link |
01:04:00.840
you download the driver.
link |
01:04:01.920
How long does it take me to get my network to run?
link |
01:04:05.040
Right.
link |
01:04:05.880
You know, that's a real question.
link |
01:04:06.920
It's a very basic question.
link |
01:04:08.320
So, yeah.
link |
01:04:09.320
Is there an answer to that yet?
link |
01:04:10.480
Or is it's trying to get to it?
link |
01:04:12.120
Our goal is like an hour.
link |
01:04:13.400
Okay.
link |
01:04:14.240
When can I buy one to test?
link |
01:04:16.800
Pretty soon.
link |
01:04:17.640
For my, for the small scale training.
link |
01:04:19.760
Yeah, pretty soon.
link |
01:04:21.160
Months.
link |
01:04:22.000
Good.
link |
01:04:22.820
I love the idea of you inside a room
link |
01:04:24.800
with Andrej Karpathy and Chris Lattner.
link |
01:04:32.000
Very, very interesting, very brilliant people,
link |
01:04:36.000
very out of the box thinkers,
link |
01:04:37.600
but also like first principles thinkers.
link |
01:04:40.000
Well, they both get stuff done.
link |
01:04:42.680
They not only get their own projects done,
link |
01:04:44.960
They talk about it clearly.
link |
01:04:47.040
They educate large numbers of people
link |
01:04:48.760
and they've created platforms for other people
link |
01:04:50.560
to go do their stuff on.
link |
01:04:52.040
Yeah.
link |
01:04:52.880
The clear thinking that's able to be communicated
link |
01:04:55.560
is kind of impressive.
link |
01:04:57.240
It's kind of remarkable to, yeah, I'm a fan.
link |
01:05:00.800
Well, let me ask,
link |
01:05:02.040
because I talked to Chris actually a lot these days.
link |
01:05:05.040
He's been a, one of the, just to give him a shout out
link |
01:05:08.120
and he's been so supportive as a human being.
link |
01:05:13.720
So everybody's quite different.
link |
01:05:16.320
Like great engineers are different,
link |
01:05:17.680
but he's been like sensitive to the human element
link |
01:05:20.800
in a way that's been fascinating.
link |
01:05:22.280
Like he was one of the early people
link |
01:05:23.800
on this stupid podcast that I do to say like,
link |
01:05:27.920
don't quit this thing and also talk to whoever
link |
01:05:32.120
the hell you want to talk to.
link |
01:05:34.160
That kind of from a legit engineer
link |
01:05:36.360
to get like props and be like, you can do this.
link |
01:05:39.960
That was, I mean, that's what a good leader does, right?
link |
01:05:42.280
It's just kind of let a little kid do his thing.
link |
01:05:45.120
Like go do it, let's see what turns out.
link |
01:05:48.720
That's a pretty powerful thing.
link |
01:05:50.520
But what do you, what's your sense about,
link |
01:05:54.480
he used to be, no, I think he stepped away from Google, right?
link |
01:05:58.840
He's at SiFive, I think.
link |
01:06:01.800
What's really impressive to you
link |
01:06:03.840
about the things that Chris has worked on?
link |
01:06:05.760
As we mentioned, the optimization,
link |
01:06:08.320
the compiler design stuff, the LLVM,
link |
01:06:11.960
then there's, he also worked at Google on the TPU stuff.
link |
01:06:16.440
He's obviously worked on Swift,
link |
01:06:19.400
so the programming language side,
link |
01:06:21.400
talking about people that work in the entirety of the stack.
link |
01:06:24.120
Yeah, yeah.
link |
01:06:25.320
What, from your time interacting with Chris
link |
01:06:27.960
and knowing the guy, what's really impressive to you?
link |
01:06:30.800
It just inspires you.
link |
01:06:32.160
Well, like LLVM became the de facto platform
link |
01:06:39.800
for compilers, it's amazing.
link |
01:06:43.840
And it was good code quality, good design choices.
link |
01:06:46.360
He hit the right level of abstraction.
link |
01:06:48.840
There's a little bit of the right time and the right place.
link |
01:06:52.040
And then he built a new programming language called Swift,
link |
01:06:55.440
which after, let's say some adoption resistance
link |
01:06:59.080
became very successful.
link |
01:07:01.160
I don't know that much about his work at Google,
link |
01:07:03.360
although I know that was typical,
link |
01:07:07.120
they started TensorFlow stuff and they,
link |
01:07:09.960
it was new, they wrote a lot of code
link |
01:07:12.800
and then at some point it needed to be refactored,
link |
01:07:17.200
because its development slowed down,
link |
01:07:19.120
which is why PyTorch started a little later and then passed it.
link |
01:07:22.320
So he did a lot of work on that.
link |
01:07:23.960
And then his idea about MLIR,
link |
01:07:26.000
which is what people started to realize
link |
01:07:28.240
is the complexity of the software stack
link |
01:07:29.960
above the low level IR was getting so high
link |
01:07:33.560
that forcing the features of that into low level
link |
01:07:36.240
was putting too much of a burden on it.
link |
01:07:38.760
So he's splitting that into multiple pieces.
link |
01:07:41.640
And that was one of the inspirations for our software stack
link |
01:07:43.880
where we have several intermediate representations
link |
01:07:46.720
that are all executable.
link |
01:07:48.800
And you can look at them and do transformations on them
link |
01:07:51.360
before you lower the level.
link |
01:07:54.000
So that was, I think we started before MLIR
link |
01:07:58.200
really got far enough along to use,
link |
01:08:01.720
but we're interested in that.
link |
01:08:02.840
He's really excited about MLIR.
link |
01:08:04.880
That's just like his little baby.
link |
01:08:06.680
And there seem to be some profound ideas in that
link |
01:08:10.960
that are really useful.
link |
01:08:11.840
So each one of those things has been,
link |
01:08:15.000
as the world of software gets more and more complicated,
link |
01:08:17.840
how do we create the right abstraction levels
link |
01:08:20.080
to simplify it in a way that people can now work independently
link |
01:08:23.360
on different levels of it?
link |
01:08:25.200
So I would say all three of those projects, LLVM, Swift,
link |
01:08:29.040
and MLIR did that successfully.
link |
01:08:31.640
So I'm interested in what he's going to do next
link |
01:08:33.680
in the same kind of way.
link |
01:08:34.840
Yes.
link |
01:08:36.200
On either the TPU or maybe the NVIDIA GPU side,
link |
01:08:41.840
how does TensorFlow, you think, or the ideas
link |
01:08:44.920
underlying it doesn't have to be TensorFlow.
link |
01:08:46.920
Just this kind of graph focused, graph centric hardware
link |
01:08:54.840
deep learning centric hardware, beat NVIDIA's?
link |
01:08:59.080
Do you think it's possible for it to basically overtake NVIDIA?
link |
01:09:02.280
Sure.
link |
01:09:03.520
What's that process look like?
link |
01:09:05.640
What's that journey look like, do you think?
link |
01:09:08.120
Well, GPUs were built around shader programs
link |
01:09:11.080
on millions of pixels, not to run graphs.
link |
01:09:14.440
So there's a hypothesis that says,
link |
01:09:17.400
the way the graphs are built is going
link |
01:09:20.680
to be really interesting to be efficient on computing this.
link |
01:09:24.120
And then the primitive is not a SIMD program.
link |
01:09:27.560
It's a matrix multiply, a convolution.
link |
01:09:30.120
And then the data manipulations are fairly extensive
link |
01:09:33.000
about how do you do a fast transpose with a program.
link |
01:09:36.400
I don't know if you've ever written a transpose program.
link |
01:09:38.840
They're ugly and slow, but in hardware you can do really well.
link |
01:09:42.200
I've got to give you an example.
link |
01:09:43.440
So when GPU accelerators started doing triangles,
link |
01:09:48.160
so you have a triangle which maps on a set of pixels.
link |
01:09:51.200
So it's very easy, straightforward
link |
01:09:53.200
to build a hardware engine that will find all those pixels.
link |
01:09:55.720
And it's kind of weird because you walk along the triangle
link |
01:09:57.680
to get to the edge.
link |
01:09:59.200
And then you have to go back down to the next row
link |
01:10:01.280
and walk along.
link |
01:10:02.080
And then you have to decide on the edge
link |
01:10:04.040
if the line of the triangle is like half on the pixel.
link |
01:10:08.000
What's the pixel color?
link |
01:10:09.120
Because it's half of this pixel and half the next one.
link |
01:10:11.240
That's called rasterization.
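For the curious, here is what that pixel walk looks like as a small software sketch, using edge functions rather than any particular hardware's method:

    def edge(ax, ay, bx, by, px, py):
        # signed area test: which side of edge (a -> b) is point p on?
        return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

    def rasterize(tri):
        (x0, y0), (x1, y1), (x2, y2) = tri
        xmin, xmax = min(x0, x1, x2), max(x0, x1, x2)
        ymin, ymax = min(y0, y1, y2), max(y0, y1, y2)
        pixels = []
        for y in range(ymin, ymax + 1):           # walk row by row...
            for x in range(xmin, xmax + 1):       # ...along each row
                w0 = edge(x1, y1, x2, y2, x, y)
                w1 = edge(x2, y2, x0, y0, x, y)
                w2 = edge(x0, y0, x1, y1, x, y)
                if w0 >= 0 and w1 >= 0 and w2 >= 0:   # inside all three edges
                    pixels.append((x, y))         # (partial-coverage blending omitted)
        return pixels

    print(len(rasterize([(0, 0), (0, 8), (8, 0)])))   # pixels the triangle covers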
link |
01:10:13.000
And you're saying that could be done in hardware?
link |
01:10:15.920
No, that's an example of that operation
link |
01:10:19.320
as a software program is really bad.
link |
01:10:22.040
I've written a program that did rasterization.
link |
01:10:24.360
The hardware that does it has actually less code
link |
01:10:26.840
than the software program that does it.
link |
01:10:28.960
And it's way faster.
link |
01:10:31.840
So there are certain times when the abstraction you have
link |
01:10:35.440
rasterize a triangle, execute a graph,
link |
01:10:39.640
components of a graph, the right thing
link |
01:10:42.160
to do in the hardware software boundary
link |
01:10:43.800
is for the hardware to naturally do it.
link |
01:10:45.800
And so the GPU is really optimized
link |
01:10:47.880
for the rasterization of triangles.
link |
01:10:50.040
Well, like in a modern, that's a small piece of modern GPUs.
link |
01:10:56.960
What they did is that they still rasterized triangles
link |
01:10:59.920
when you're running a game.
link |
01:11:00.920
But for the most part, most of the computation
link |
01:11:03.520
in the area of the GPU is running shader programs.
link |
01:11:05.840
But those are single threaded programs on pixels, not graphs.
link |
01:11:09.560
To be honest, let's say I don't actually
link |
01:11:11.200
know the math behind shading and lighting
link |
01:11:15.000
and all that kind of stuff.
link |
01:11:16.160
I don't know what.
link |
01:11:17.720
They look like little simple floating point programs
link |
01:11:20.080
or complicated ones.
link |
01:11:21.200
You can have 8,000 instructions in a shader program.
link |
01:11:23.680
But I don't have a good intuition
link |
01:11:25.560
why it could be parallelized so easily.
link |
01:11:27.920
No, it's because you have 8 million pixels and every single one is independent.
link |
01:11:30.640
So when you have a light that comes down, the amount of light,
link |
01:11:37.280
like say this is a line of pixels across this table,
link |
01:11:40.720
the amount of light on each pixel is subtly different.
link |
01:11:43.560
And each pixel is responsible for figuring out what it is.
link |
01:11:46.000
Figuring it out. So that pixel says, I'm this pixel.
link |
01:11:48.560
I know the angle of the light.
link |
01:11:49.920
I know the occlusion, I know the color I am.
link |
01:11:52.360
Like every single pixel here is a different color.
link |
01:11:54.400
Every single pixel gets a different amount of light.
link |
01:11:57.160
Every single pixel has a subtly different translucency.
link |
01:12:00.560
So to make it look realistic, the solution
link |
01:12:02.720
was you run a separate program on every pixel.
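A minimal sketch of that per-pixel independence, with toy Lambertian shading; real shaders are far more involved:

    import math

    def shade_pixel(normal, light_dir):
        # Lambertian shading: brightness is max(0, N dot L), computed per pixel
        dot = sum(n * l for n, l in zip(normal, light_dir))
        return max(0.0, dot)

    light = (0.0, 0.0, 1.0)
    # each pixel along the row has a subtly different surface normal,
    # so each one independently computes a subtly different brightness
    row = [shade_pixel((0.0, math.sin(x / 50.0), math.cos(x / 50.0)), light)
           for x in range(640)]
    print(round(row[0], 3), round(row[320], 3), round(row[639], 3))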
link |
01:12:05.120
See, but I thought there's a reflection from all over the place.
link |
01:12:07.760
Is it every pixel?
link |
01:12:08.520
Yeah, but there is.
link |
01:12:09.600
So you build a reflection map, which also
link |
01:12:12.240
has some pixelated thing.
link |
01:12:14.160
And then when the pixel is looking at the reflection map,
link |
01:12:16.320
it has to calculate what the normal of the surface is.
link |
01:12:19.240
And it does it per pixel.
link |
01:12:20.920
By the way, there's boatloads of hacks on that.
link |
01:12:23.040
You may have a lower resolution light map, reflection map.
link |
01:12:26.640
There's all these hacks they do.
link |
01:12:29.200
But at the end of the day, it's per pixel computation.
link |
01:12:32.960
And it so happens that you can map graph-like computation
link |
01:12:37.160
onto this pixel-centric computation.
link |
01:12:39.320
You can do floating point programs
link |
01:12:41.320
on convolution and matrices.
link |
01:12:43.480
And NVIDIA invested for years in CUDA, first for HPC.
link |
01:12:47.240
And then they got lucky with the AI trend.
link |
01:12:50.120
But do you think they're going to essentially not
link |
01:12:52.600
be able to hardcore pivot out of their hole?
link |
01:12:55.440
We'll see.
link |
01:12:57.440
That's always interesting.
link |
01:12:59.480
How often do big companies hardcore pivot? Occasionally.
link |
01:13:03.840
How much do you know about NVIDIA, folks?
link |
01:13:06.320
Some.
link |
01:13:06.960
Some.
link |
01:13:08.160
Well, I'm curious as well who they ultimately are as a company.
link |
01:13:11.480
Well, they've innovated several times.
link |
01:13:13.360
But they've also worked really hard on mobile.
link |
01:13:15.200
They've worked really hard on radios.
link |
01:13:18.400
They're fundamentally a GPU company.
link |
01:13:20.680
Well, they tried to pivot.
link |
01:13:21.800
It's an interesting little game they're playing in autonomous vehicles,
link |
01:13:27.640
full or semi autonomous, like playing with Tesla and so on,
link |
01:13:31.080
and seeing that's dipping a toe into that kind of pivot.
link |
01:13:35.680
They came out with this platform, which
link |
01:13:37.240
is interesting technically.
link |
01:13:39.120
But it was like a $3,000 GPU platform.
link |
01:13:46.040
I don't know if it's interesting technically.
link |
01:13:47.480
It's interesting philosophically.
link |
01:13:48.880
Technically, I don't know if
link |
01:13:51.040
the execution, the craftsmanship, is there.
link |
01:13:53.440
I'm not sure.
link |
01:13:54.600
I didn't get a sense.
link |
01:13:55.480
I think they were repurposing GPUs for an automotive solution.
link |
01:13:59.120
Right.
link |
01:13:59.360
It's not a real pivot.
link |
01:14:00.320
They didn't build a ground up solution.
link |
01:14:03.800
Like the chips inside Tesla are pretty cheap.
link |
01:14:06.360
Like Mobileye has been doing this.
link |
01:14:08.080
They're doing the classic thing, working up from the simplest thing.
link |
01:14:11.240
They were building 40 square millimeter chips.
link |
01:14:14.240
And NVIDIA, their solution, had 800 square millimeter chips
link |
01:14:17.480
and 200 square millimeter chips.
link |
01:14:19.160
And like boatloads of really expensive DRAMs.
link |
01:14:23.800
And it's a really different approach.
link |
01:14:27.000
So Mobileye fit the, let's say, automotive cost and form
link |
01:14:30.320
factor.
link |
01:14:31.280
And then they added features as it was economically viable.
link |
01:14:34.680
NVIDIA said, take the biggest thing
link |
01:14:36.280
and we're going to go make it work.
link |
01:14:39.080
And that's also influenced like Waymo.
link |
01:14:41.400
There's a whole bunch of autonomous startups
link |
01:14:43.640
where they have a 5,000 watt server in their trunk.
link |
01:14:47.240
But that's because they think, well, 5,000 watts
link |
01:14:50.560
and $10,000 is OK because it's replacing a driver.
link |
01:14:54.720
Elon's approach was that board has
link |
01:14:56.280
to be cheap enough to put it in every single Tesla,
link |
01:14:59.520
whether they turn on autonomous driving or not.
link |
01:15:02.960
And Mobileye was like, we need to fit in the BOM
link |
01:15:06.200
and cost structure that car companies do.
link |
01:15:09.440
So they may sell you a GPS for 1,500 bucks.
link |
01:15:12.440
But the BOM for that is like $25.
link |
01:15:16.440
Well, and for Mobileye, it seems like neural networks
link |
01:15:20.160
were not first class citizens, like the computation.
link |
01:15:22.960
They didn't start out as a.
link |
01:15:24.640
Yeah, it was a CV problem.
link |
01:15:26.120
Yeah.
link |
01:15:27.120
And they did classic CV and found stop lights and lines.
link |
01:15:29.880
And they were really good at it.
link |
01:15:31.200
Yeah, and they never, I mean, I don't know what's happening
link |
01:15:33.920
now, but they never fully pivoted.
link |
01:15:35.800
I mean, it's like, it's the NVIDIA thing.
link |
01:15:37.960
And then as opposed to, so if you look at the new Tesla work,
link |
01:15:42.000
it's like neural networks from the ground up, right?
link |
01:15:45.560
Yeah, and even Tesla started with a lot of CV stuff in it.
link |
01:15:48.080
And Andrej's basically been eliminating it.
link |
01:15:51.840
Move everything into the network.
link |
01:15:54.360
So without, this isn't like confidential stuff,
link |
01:15:57.920
but you sitting on a porch looking over the world,
link |
01:16:01.640
looking at the work that Andrej is doing,
link |
01:16:03.720
that Elon's doing with Tesla autopilot.
link |
01:16:06.440
Do you like the trajectory of where things are going
link |
01:16:08.800
on the hardware side?
link |
01:16:09.640
Well, they're making serious progress.
link |
01:16:10.960
I like the videos of people driving the beta stuff.
link |
01:16:14.160
Like it's taking some pretty complicated intersections
link |
01:16:16.520
and all that, but it's still an intervention per drive.
link |
01:16:20.800
I mean, I have autopilot, the current autopilot, my Tesla.
link |
01:16:23.640
I use it every day.
link |
01:16:24.560
Do you have full self driving beta or no?
link |
01:16:26.840
So you like where this is going?
link |
01:16:28.720
They're making progress.
link |
01:16:29.520
It's taking longer than anybody thought.
link |
01:16:32.240
You know, my wonder was, you know, hardware three,
link |
01:16:37.360
is it enough computing?
link |
01:16:39.040
Off by two, off by five, off by 10, off by 100.
link |
01:16:42.320
Yeah.
link |
01:16:43.160
And I thought it probably wasn't enough,
link |
01:16:47.160
but they're doing pretty well with it now.
link |
01:16:49.760
Yeah.
link |
01:16:50.600
And one thing is, the data set gets bigger,
link |
01:16:53.320
the training gets better.
link |
01:16:55.040
And then there's this interesting thing is,
link |
01:16:57.360
you sort of train and build an arbitrary size network
link |
01:16:59.960
that solves the problem.
link |
01:17:01.360
And then you refactor the network down to the thing
link |
01:17:03.680
that you can afford to ship, right?
link |
01:17:06.760
So the goal isn't to build a network that fits in the phone.
link |
01:17:10.720
It's to build something that actually works.
link |
01:17:14.800
And then how do you make that most effective
link |
01:17:17.680
on the hardware you have?
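One common way to do that refactoring step is distillation; here is a hedged sketch in PyTorch (the layer sizes are arbitrary, and this is not Tesla's method):

    import torch

    # "arbitrary size" network that solves the problem
    teacher = torch.nn.Sequential(torch.nn.Linear(32, 256), torch.nn.ReLU(),
                                  torch.nn.Linear(256, 4))
    # the thing you can afford to ship
    student = torch.nn.Sequential(torch.nn.Linear(32, 16), torch.nn.ReLU(),
                                  torch.nn.Linear(16, 4))

    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(200):
        x = torch.randn(64, 32)
        with torch.no_grad():
            target = teacher(x)           # the big network provides the targets
        loss = torch.nn.functional.mse_loss(student(x), target)
        opt.zero_grad()
        loss.backward()
        opt.step()
    print(loss.item())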
link |
01:17:19.840
And they seem to be doing that much better
link |
01:17:21.640
than a couple of years ago.
link |
01:17:23.520
Well, the one really important thing is also
link |
01:17:25.760
what they're doing well is how to iterate that quickly,
link |
01:17:28.640
which means like it's not just about one time
link |
01:17:31.160
deployment, one building, it's constantly
link |
01:17:32.800
iterating the network and trying to automate
link |
01:17:35.080
as many steps as possible, right?
link |
01:17:37.520
And that's actually the principles of the software 2.0
link |
01:17:41.680
I mentioned with Andrej, is it's not just,
link |
01:17:46.920
I mean, I don't know what the actual,
link |
01:17:48.280
his description of software 2.0 is.
link |
01:17:50.880
If it's just high level, full software, there's specifics.
link |
01:17:53.480
But the interesting thing about what that actually looks like
link |
01:17:57.040
in the real world is what I think Andrej
link |
01:18:01.600
calls the data engine.
link |
01:18:02.640
It's like, it's the iterative improvement of the thing.
link |
01:18:06.600
You have a neural network that does stuff,
link |
01:18:10.480
fails on a bunch of things and learns from it
link |
01:18:12.760
over and over and over.
link |
01:18:13.600
So you constantly discovering edge cases.
link |
01:18:15.920
So it's very much about like data engineering,
link |
01:18:19.920
like figuring out, it's kind of what you were talking about
link |
01:18:23.040
with TensorFlow is you have the data landscape.
link |
01:18:25.720
You have to walk along that data landscape in a way
link |
01:18:27.920
that it's constantly improving the neural network.
link |
01:18:32.600
And that feels like that's the central piece
link |
01:18:34.920
of it.
link |
01:18:35.760
And there's two pieces of it, like you find edge cases
link |
01:18:40.360
that don't work and then you define something
link |
01:18:42.000
that goes and gets you data for that.
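A runnable toy of that loop; the "model" here is just a threshold and the retraining rule is made up, but the mine-failures-then-retrain shape is the point:

    import random
    random.seed(0)

    threshold = 0.5                               # our "model": predict 1 if x > threshold
    def predict(x): return int(x > threshold)
    def truth(x): return int(x > 0.7)             # the world the model must match

    pool = [random.random() for _ in range(10_000)]   # the fleet's raw data
    for iteration in range(5):
        # find the edge cases the current model fails on
        failures = [x for x in pool if predict(x) != truth(x)]
        if not failures:
            break
        # "retrain" on the mined edge cases: move the threshold toward them
        threshold = sum(failures) / len(failures)
        print(iteration, round(threshold, 3), len(failures))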
link |
01:18:44.200
But then the other constraint is whether you have
link |
01:18:45.840
to label it or not.
link |
01:18:46.920
Like the amazing thing about like the GPT3 stuff
link |
01:18:49.840
is it's unsupervised.
link |
01:18:51.560
So there's essentially infinite amount of data.
link |
01:18:53.280
Now there's obviously infinite amount of data available
link |
01:18:56.280
from cars of people who are successfully driving.
link |
01:18:59.200
But the current pipelines are mostly running on labeled data,
link |
01:19:03.040
which is human limited.
link |
01:19:04.680
So when that becomes unsupervised, it'll
link |
01:19:09.800
create an unlimited amount of data, which is on a whole other scale.
link |
01:19:14.240
Now the networks that may use that data
link |
01:19:16.200
might be way too big for cars, but then there'll
link |
01:19:18.720
be the transformation from now we have unlimited data.
link |
01:19:20.880
I know exactly what I want.
link |
01:19:22.360
Now can I turn that into something that fits in the car?
link |
01:19:25.800
And that process is going to happen all over the place.
link |
01:19:29.200
Every time you get to the place where you have unlimited data,
link |
01:19:32.280
that's what software 2.0 is about, unlimited data training
link |
01:19:35.360
networks to do stuff without humans writing code to do it.
link |
01:19:40.680
And ultimately also trying to discover
link |
01:19:42.960
like you're saying the self supervised formulation
link |
01:19:46.560
of the problem, so the unsupervised formulation
link |
01:19:48.800
of the problem.
link |
01:19:49.680
Like in driving, there's this really interesting thing,
link |
01:19:53.560
which is you look at a scene that's before you,
link |
01:19:58.200
and you have data about what a successful human driver did
link |
01:20:01.920
in that scene one second later.
link |
01:20:04.440
It's a little piece of data that you can use just
link |
01:20:06.840
like with GPT3 as training.
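That formulation is essentially next-action prediction: the scene at time t is the input, and what the human driver actually did one second later is the label, so no manual annotation is needed. A toy sketch, with an assumed stand-in policy network and dummy tensor shapes:

```python
import torch
import torch.nn as nn

# Stand-in for a real perception stack; shapes are dummy values.
policy_net = nn.Sequential(
    nn.Flatten(),
    nn.Linear(3 * 64 * 64, 256), nn.ReLU(),
    nn.Linear(256, 2),                    # [steering, acceleration]
)
optimizer = torch.optim.Adam(policy_net.parameters(), lr=1e-4)

frames_t = torch.randn(8, 3, 64, 64)      # camera frames at time t
human_action_t1 = torch.randn(8, 2)       # what the driver did at t + 1s

# The logged human action is the label: self-supervised, no annotators.
loss = nn.functional.mse_loss(policy_net(frames_t), human_action_t1)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```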
link |
01:20:09.400
Currently, even though Tesla says they're using that,
link |
01:20:12.400
it's an open question to me.
link |
01:20:14.480
How far can you, can you solve all of the driving
link |
01:20:17.440
with just that self supervised piece of data?
link |
01:20:21.120
And like I think.
link |
01:20:23.400
Well, that's what Comma AI is doing.
link |
01:20:25.560
That's what Comma AI is doing, but the question
link |
01:20:27.560
is how much data, so what Comma AI doesn't have
link |
01:20:32.280
is as good of a data engine, for example, as Tesla does.
link |
01:20:35.960
That's where the organization of the data matters.
link |
01:20:39.840
I mean, as far as I know, I haven't talked to George,
link |
01:20:41.960
but they do have the data.
link |
01:20:44.640
The question is how much data is needed?
link |
01:20:47.880
Because we say infinite very loosely here.
link |
01:20:51.440
And then the other question, which you said,
link |
01:20:54.400
I don't know if you think it's still an open question,
link |
01:20:56.520
is are we on the right order of magnitude
link |
01:20:59.400
for the compute necessary?
link |
01:21:02.040
That is this, is it like what Elon said,
link |
01:21:04.960
this chip that's in there now is enough
link |
01:21:07.120
to do full self driving, or do we need
link |
01:21:09.080
another order of magnitude?
link |
01:21:10.840
I think nobody actually knows the answer to that question.
link |
01:21:13.320
I like the confidence that Elon has, but.
link |
01:21:16.280
Yeah, we'll see.
link |
01:21:17.840
There's another funny thing is you don't learn to drive
link |
01:21:20.200
with infinite amounts of data.
link |
01:21:22.280
You learn to drive with an intellectual framework
link |
01:21:24.320
that understands physics and color and horizontal surfaces
link |
01:21:28.080
and laws and roads and all your experience
link |
01:21:34.000
from manipulating your environment.
link |
01:21:37.240
There's so many factors that go into that.
link |
01:21:39.120
And then when you learn to drive,
link |
01:21:41.680
driving is a subset of this conceptual framework
link |
01:21:44.320
that you have.
link |
01:21:46.240
And so with self driving cars right now,
link |
01:21:48.120
we're teaching them to drive with driving data.
link |
01:21:51.640
You never teach a human to do that.
link |
01:21:53.520
You teach a human all kinds of interesting things,
link |
01:21:55.720
like language, like don't do that, watch out.
link |
01:21:59.280
There's all kinds of stuff going on.
link |
01:22:01.000
This is where, I think, the previous time
link |
01:22:02.880
we talked, you poetically disagreed
link |
01:22:07.240
with my naive notion about humans.
link |
01:22:10.280
I just think that humans will make
link |
01:22:13.680
this whole driving thing really difficult.
link |
01:22:15.760
Yeah, all right.
link |
01:22:17.080
I said, humans don't move that slow.
link |
01:22:19.480
It's a ballistics problem.
link |
01:22:20.800
It's a ballistics, humans are a ballistics problem,
link |
01:22:22.720
which is like poetry to me.
link |
01:22:24.080
It's very possible that in driving,
link |
01:22:26.200
they're indeed purely a ballistics problem.
link |
01:22:28.480
And I think that's probably the right way to think about it,
link |
01:22:30.880
but I still, they still continue to surprise me
link |
01:22:34.400
with those damn pedestrians, the cyclists,
link |
01:22:36.920
other humans in other cars and.
link |
01:22:39.360
Yeah, but it's going to be one of these compensating things.
link |
01:22:41.200
So like when you're driving, you have an intuition
link |
01:22:45.480
about what humans are going to do,
link |
01:22:46.880
but you don't have 360 cameras and radars
link |
01:22:49.680
and you have an attention problem.
link |
01:22:51.160
So the self driving car comes in with no attention problems,
link |
01:22:55.120
360 cameras, a bunch of other features.
link |
01:22:58.800
So they'll wipe out a whole class of accidents, right?
link |
01:23:02.000
And emergency braking with radar,
link |
01:23:05.760
and especially as it gets AI enhanced,
link |
01:23:08.000
will eliminate collisions, right?
link |
01:23:10.920
But then you have the other problems
link |
01:23:12.040
of these unexpected things where,
link |
01:23:13.880
you think your human intuition is helping,
link |
01:23:15.600
but then the cars also have a set of hardware features
link |
01:23:19.560
that you're not even close to.
link |
01:23:21.520
And the key thing of course is,
link |
01:23:23.920
if you wipe out a huge number of kinds of accidents,
link |
01:23:27.040
then it might be just way safer than a human driver,
link |
01:23:30.240
even though, even if humans are still a problem,
link |
01:23:33.000
that's hard to figure out.
link |
01:23:34.720
Yeah, that's probably what will happen.
link |
01:23:36.160
So autonomous cars will have a small number of accidents,
link |
01:23:38.800
that humans would have avoided,
link |
01:23:40.000
but they'll wipe, they'll get rid of the bulk of them.
link |
01:23:43.800
What do you think about like Tesla's dojo efforts,
link |
01:23:48.640
or it can be bigger than Tesla in general?
link |
01:23:51.120
It's kind of like Tenstorrent,
link |
01:23:54.280
trying to innovate. Like, this is the economics question,
link |
01:23:56.640
like should a company try to from scratch
link |
01:23:59.200
build its own neural network training hardware?
link |
01:24:03.160
Well, first I think it's great.
link |
01:24:04.240
So we need lots of experiments, right?
link |
01:24:06.800
And there's lots of startups working on this
link |
01:24:09.440
and they're pursuing different things.
link |
01:24:11.520
Now I was there when we started Dojo,
link |
01:24:13.360
and it was sort of like,
link |
01:24:14.560
what's the unconstrained computer solution
link |
01:24:17.960
to go do very large training problems.
link |
01:24:21.720
And then there's fun stuff like,
link |
01:24:23.680
we said, well, we have this 10,000 watt board to cool.
link |
01:24:27.200
Well, you go talk to guys at SpaceX,
link |
01:24:29.120
and they think 10,000 watts is a really small number,
link |
01:24:31.160
not a big number.
link |
01:24:32.720
And there's brilliant people working on it.
link |
01:24:35.280
I'm curious to see how it'll come out.
link |
01:24:37.280
I couldn't tell you,
link |
01:24:39.000
I know it pivoted a few times since I left, so.
link |
01:24:41.640
So the cooling does seem to be a big problem.
link |
01:24:44.520
I do like what Elon said about it,
link |
01:24:46.840
which is like, we don't want to do the thing
link |
01:24:49.280
unless it's way better than the alternative,
link |
01:24:51.280
whatever the alternative is.
link |
01:24:52.960
So it has to be way better than like racks of GPUs.
link |
01:24:57.600
Yeah, and the other thing is just like,
link |
01:24:59.960
you know, the Tesla autonomous driving hardware,
link |
01:25:03.840
it was only serving one software stack.
link |
01:25:06.520
And the hardware team and the software team
link |
01:25:08.000
were tightly coupled.
link |
01:25:09.840
You know, if you're building a general purpose AI solution,
link |
01:25:12.120
then you know, there's so many different customers
link |
01:25:14.240
with so many different needs.
link |
01:25:16.360
Now, something Andrej said is,
link |
01:25:18.360
I think this is amazing, 10 years ago,
link |
01:25:21.400
like vision, recommendation, language
link |
01:25:24.600
were completely different disciplines.
link |
01:25:27.080
He said the people literally couldn't talk to each other.
link |
01:25:29.680
And three years ago, it was all neural networks,
link |
01:25:32.520
but very different neural networks.
link |
01:25:34.800
And recently it's converging on one set of networks.
link |
01:25:37.680
They vary a lot in size, obviously,
link |
01:25:39.320
they vary in data, vary in outputs,
link |
01:25:42.040
but the technology has converged a good bit.
link |
01:25:44.760
Yeah, these transformers behind GPT3,
link |
01:25:47.360
it seems like they could be applied to video,
link |
01:25:48.960
they could be applied to a lot of,
link |
01:25:50.560
and it's like, and they're all really simple.
link |
01:25:52.520
And it was like, literally, replace letters with pixels.
link |
01:25:56.320
It does vision, it's amazing.
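The "replace letters with pixels" move is roughly what vision transformers do: cut the image into patches, embed each patch as a token, and feed the sequence to the same transformer machinery that would otherwise consume word tokens. A minimal sketch with PyTorch's built-in encoder; the patch size and dimensions are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

patch, dim = 16, 256
# "Replace letters with pixels": each 16x16 patch becomes one token.
to_tokens = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True),
    num_layers=6,
)

image = torch.randn(1, 3, 224, 224)
tokens = to_tokens(image).flatten(2).transpose(1, 2)   # (1, 196, 256)
out = encoder(tokens)   # same machinery GPT-style models apply to text
```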
link |
01:25:58.760
So, and then size actually improves the thing.
link |
01:26:02.080
So the bigger it gets,
link |
01:26:03.200
the more compute you throw at it, the better it gets.
link |
01:26:05.680
And the more data you have, the better it gets.
link |
01:26:08.320
So, so then you start to wonder,
link |
01:26:11.040
well, is that a fundamental thing,
link |
01:26:12.560
or is this just another step
link |
01:26:15.080
to some fundamental understanding
link |
01:26:16.560
about this kind of computation,
link |
01:26:18.840
which is really interesting.
link |
01:26:20.280
Us humans don't want to believe
link |
01:26:21.520
that that kind of thing will achieve
link |
01:26:22.680
conceptual understanding, like you were saying,
link |
01:26:24.400
like you'll figure out physics, but maybe it will.
link |
01:26:27.000
Maybe. Probably will.
link |
01:26:29.400
Well, it's worse than that.
link |
01:26:31.040
It'll understand physics in ways that we can't understand.
link |
01:26:33.760
I liked your Stephen Wolfram talk,
link |
01:26:35.880
where he said, you know,
link |
01:26:36.720
there's three generations of physics.
link |
01:26:38.000
There was physics by reasoning,
link |
01:26:40.120
well, big things should fall faster than small things,
link |
01:26:42.640
right, that's reasoning.
link |
01:26:44.080
And then there's physics by equations, like, you know,
link |
01:26:48.240
but the number of problems in the world that are solved
link |
01:26:50.160
with a single equation is relatively low,
link |
01:26:52.000
almost all programs have, you know,
link |
01:26:53.640
more than one line of code,
link |
01:26:54.960
maybe a hundred million lines of code.
link |
01:26:56.840
So he said, now we're going to physics by computation,
link |
01:26:59.960
which is his project, which is cool.
link |
01:27:02.560
I might point out that there was two generations of physics
link |
01:27:07.280
before reasoning: habit.
link |
01:27:10.240
Like all animals, you know,
link |
01:27:11.440
know things fall and, you know, birds fly and, you know,
link |
01:27:14.520
predators know how to, you know,
link |
01:27:16.080
solve a differential equation to cut off an accelerating,
link |
01:27:20.000
you know, curving animal path.
link |
01:27:23.480
And then there was, you know, the gods did it, right?
link |
01:27:28.400
So, right, so there was, you know, there's five generations.
link |
01:27:31.640
Now, software 2.0 says programming things
link |
01:27:35.960
is not the last step; data is.
link |
01:27:38.880
So there's going to be a physics
link |
01:27:41.280
past Stephen Wolfram's computational one.
link |
01:27:44.080
That's not explainable to us humans.
link |
01:27:47.520
And actually, there's no reason that I can see
link |
01:27:51.040
why that, why even that's the limit, like,
link |
01:27:53.720
there's something beyond that.
link |
01:27:55.600
I mean, they're usually, like,
link |
01:27:56.440
usually when you have this hierarchy, it's not like,
link |
01:27:58.800
well, if you have this step and this step and this step
link |
01:28:00.600
and they're all qualitatively different,
link |
01:28:03.160
conceptually different, it's not obvious why, you know,
link |
01:28:05.400
six is the right number of hierarchy steps
link |
01:28:07.360
and not seven or eight or...
link |
01:28:09.200
Well, then it's probably impossible for us
link |
01:28:12.120
to comprehend something that's beyond
link |
01:28:15.240
the thing that's not explainable.
link |
01:28:17.080
Yeah, but the thing that, you know,
link |
01:28:20.800
understands the thing that's not explainable to us,
link |
01:28:23.480
conceives the next one and, like,
link |
01:28:26.640
I'm not sure why there's a limit to it.
link |
01:28:30.880
Looks like your brain hurts.
link |
01:28:31.720
That's the sad story.
link |
01:28:33.960
If we look at our own brain,
link |
01:28:36.520
which is an interesting, illustrative example,
link |
01:28:41.160
in your work with Tenstorrent
link |
01:28:42.560
and trying to design deep learning architectures,
link |
01:28:46.120
do you think about the brain at all?
link |
01:28:50.040
Maybe from a hardware designer perspective,
link |
01:28:53.440
if you could change something about the brain,
link |
01:28:56.200
what would you change?
link |
01:28:57.520
Or do you...
link |
01:28:58.360
Funny question.
link |
01:28:59.200
Like, how would you do that?
link |
01:29:01.040
So your brain is really weird.
link |
01:29:02.360
Like, you know, your cerebral cortex,
link |
01:29:03.960
where we think we do most of our thinking,
link |
01:29:06.040
is what, like six or seven neurons thick?
link |
01:29:08.640
Yeah.
link |
01:29:09.480
Like, that's weird.
link |
01:29:10.320
Like, all the big networks are way bigger than that.
link |
01:29:13.200
Like, way deeper.
link |
01:29:14.320
So that seems odd.
link |
01:29:16.160
And then, you know, when you're thinking,
link |
01:29:18.000
if the input generates a result you can use,
link |
01:29:21.800
it goes really fast.
link |
01:29:22.800
But if it can't, that generates an output
link |
01:29:25.240
that's interesting, which turns into an input,
link |
01:29:27.080
and then it goes back through your brain,
link |
01:29:28.360
to the point where you mull things over for days,
link |
01:29:30.680
and how many trips through your brain is that, right?
link |
01:29:33.400
Like, it's, you know, 300 milliseconds or something,
link |
01:29:36.080
and you get through seven levels of neurons.
link |
01:29:37.840
I forget the number exactly.
link |
01:29:39.840
But then it does it over and over and over as it searches.
link |
01:29:43.400
And the brain clearly looks like some kind of graph
link |
01:29:46.120
because you have a neuron with, you know, connections,
link |
01:29:48.160
and it talks to other ones.
link |
01:29:49.200
And it's locally very computationally intense,
link |
01:29:52.360
but it also does sparse computations
link |
01:29:55.480
across a pretty big area.
link |
01:29:57.800
There's a lot of messy biological type of things,
link |
01:30:00.640
and by messy I mean, like, first of all,
link |
01:30:03.720
there's mechanical, chemical, and electrical signals.
link |
01:30:06.000
It's all that's going on.
link |
01:30:07.480
Then there's the asynchronicity of signals.
link |
01:30:12.480
And there's like, there's just a lot of variability
link |
01:30:14.720
that seems continuous and messy
link |
01:30:16.520
and just a mess of biology.
link |
01:30:18.640
And it's unclear whether that's a good thing
link |
01:30:22.640
or it's a bad thing.
link |
01:30:24.040
Because if it's a good thing,
link |
01:30:26.320
then we need to run the entirety of the evolution.
link |
01:30:29.240
Well, we're gonna have to start with basic bacteria
link |
01:30:31.560
to create something.
link |
01:30:32.400
Imagine you could build a brain with 10 layers.
link |
01:30:35.640
Would that be better or worse?
link |
01:30:37.360
Or more connections or less connections,
link |
01:30:39.800
or we don't know to what level our brains are optimized.
link |
01:30:44.240
But if I was changing things,
link |
01:30:45.480
like you can only hold like seven numbers in your head.
link |
01:30:49.320
Like why not a hundred or a million?
link |
01:30:51.840
There was a lot of that.
link |
01:30:53.680
And why can't we have like a floating point processor
link |
01:30:56.800
that can compute anything we want
link |
01:30:59.520
and see it all properly?
link |
01:31:01.240
Like that would be kind of fun.
link |
01:31:03.120
And why can't we see in four or eight dimensions?
link |
01:31:05.760
Like 3D's kind of a drag.
link |
01:31:10.080
Like all the hard math transforms
link |
01:31:11.600
are up in multiple dimensions.
link |
01:31:13.960
So there's, you could imagine a brain architecture
link |
01:31:16.560
that you could enhance with a whole bunch of features
link |
01:31:21.120
that would be really useful for thinking about things.
link |
01:31:24.440
It's possible that the limitations you're describing
link |
01:31:26.880
are actually essential for like the constraints
link |
01:31:29.880
are essential for creating like the depth of intelligence.
link |
01:31:34.000
Like that, the ability to reason, you know.
link |
01:31:38.240
Yeah, it's hard to say
link |
01:31:39.080
because like your brain is clearly a parallel processor.
link |
01:31:43.040
You know, 10 billion neurons talking to each other
link |
01:31:46.200
at a relatively low clock rate.
link |
01:31:48.440
But it produces something that looks like
link |
01:31:51.080
a serial thought process.
link |
01:31:52.640
It's a serial narrative in your head.
link |
01:31:54.720
That's true.
link |
01:31:55.560
But then there are people famously who are visual thinkers.
link |
01:31:59.040
Like I think I'm a relatively visual thinker.
link |
01:32:02.320
I can imagine any object and rotate it in my head
link |
01:32:05.080
and look at it.
link |
01:32:06.440
And there are people who say they don't think that way at all.
link |
01:32:09.640
And recently I read an article about people
link |
01:32:12.440
who say they don't have a voice in their head.
link |
01:32:16.760
They can talk, but when they, you know, it's like,
link |
01:32:19.880
well, what are you thinking though?
link |
01:32:21.520
They'll describe something that's visual.
link |
01:32:24.400
So that's curious.
link |
01:32:26.480
Now, if you're saying, if we dedicated more hardware
link |
01:32:33.800
to holding information like, you know, 10 numbers
link |
01:32:36.360
or a million numbers, like would that distract us
link |
01:32:40.960
from our ability to form this kind of singular identity?
link |
01:32:44.760
Like it dissipates somehow.
link |
01:32:46.280
Right.
link |
01:32:47.120
But maybe, you know, future humans will have many identities
link |
01:32:50.720
that have some higher level organization,
link |
01:32:53.120
but can actually do lots more things in parallel.
link |
01:32:55.640
Yeah, there's no reason, if we're thinking modularly,
link |
01:32:57.880
there's no reason we can't have multiple consciousnesses
link |
01:33:00.280
in one brain.
link |
01:33:01.480
Yeah.
link |
01:33:02.320
And maybe there's some way to make it faster
link |
01:33:03.680
so that the, you know, the area of the computation
link |
01:33:07.880
could still have a unified feel to it
link |
01:33:13.200
while still having way more ability
link |
01:33:15.680
to do parallel stuff at the same time.
link |
01:33:17.560
Could definitely be improved.
link |
01:33:19.040
Could be improved?
link |
01:33:20.000
Yeah.
link |
01:33:20.840
Well, it's pretty good right now.
link |
01:33:22.880
Actually, people don't give it enough credit.
link |
01:33:24.640
The thing is pretty nice that, you know,
link |
01:33:27.840
the fact that the ride ends seems to
link |
01:33:31.840
give a nice like spark of beauty to the whole experience.
link |
01:33:37.880
So I don't know, I don't know if it can be improved easily.
link |
01:33:40.280
It could be more beautiful.
link |
01:33:42.480
I don't know how, what do you mean?
link |
01:33:45.240
What do you mean how?
link |
01:33:46.280
All the ways you can't imagine.
link |
01:33:48.280
No, but that's the whole point.
link |
01:33:49.480
I wouldn't be able to imagine,
link |
01:33:51.080
the fact that I can imagine ways
link |
01:33:52.720
in which it could be more beautiful means.
link |
01:33:55.880
So do you know, you know, Iain Banks's stories?
link |
01:33:59.360
So the super smart AIs, they live
link |
01:34:03.560
mostly in the world of what they call infinite fun
link |
01:34:07.480
because they can create arbitrary worlds.
link |
01:34:12.120
So they interact and, you know, the story has it.
link |
01:34:14.440
They interact in the normal world
link |
01:34:15.800
and they're very smart and they can do all kinds of stuff.
link |
01:34:18.520
And, you know, a given mind can, you know,
link |
01:34:20.360
talk to a million humans at the same time
link |
01:34:21.960
because we're very slow and for reasons,
link |
01:34:24.640
you know, artificial to the story,
link |
01:34:26.240
they're interested in people and doing stuff,
link |
01:34:28.200
but they mostly live in this other land of thinking.
link |
01:34:32.960
My inclination is to think that the ability
link |
01:34:36.440
to create infinite fun will not be so fun.
link |
01:34:41.120
That's sad.
link |
01:34:42.440
Well.
link |
01:34:43.280
There's so many things to do.
link |
01:34:44.120
Imagine being able to make a star, move planets around.
link |
01:34:47.560
Yeah, yeah.
link |
01:34:48.560
But because we can imagine that as, like, wildly fun,
link |
01:34:51.320
if we actually were able to do it, it'd be a slippery slope
link |
01:34:55.000
where fun wouldn't even have a meaning
link |
01:34:56.720
because we just consistently desensitize ourselves
link |
01:35:00.320
by the infinite amounts of fun we're having.
link |
01:35:04.120
And the sadness, the dark stuff is what makes it fun, I think.
link |
01:35:09.520
That could be the Russian in you.
link |
01:35:10.440
It could be the fun makes it fun
link |
01:35:12.400
and the sadness makes it bittersweet.
link |
01:35:16.560
Yeah, that's true.
link |
01:35:17.400
Fun could be the thing that makes it fun.
link |
01:35:20.520
So what do you think about the expansion,
link |
01:35:22.520
not through the biology side,
link |
01:35:23.880
but through the BCI, the brain computer interfaces?
link |
01:35:27.200
Yeah, you got a chance to check out the Neuralink stuff.
link |
01:35:30.080
It's super interesting.
link |
01:35:31.480
Like humans, like our thoughts manifest as action.
link |
01:35:38.720
Like as a kid, shooting a rifle was super fun,
link |
01:35:41.680
driving a mini bike, doing things.
link |
01:35:44.240
And then computer games, I think,
link |
01:35:46.120
for a lot of kids became the thing where they can do
link |
01:35:49.680
what they want, they can fly a plane,
link |
01:35:51.440
they can do this, they can do this, right?
link |
01:35:53.560
But you have to have this physical interaction.
link |
01:35:55.840
Now imagine, you know, you could just imagine stuff
link |
01:36:00.280
and it happens, right?
link |
01:36:03.240
Like really richly and interestingly.
link |
01:36:06.600
Like we kind of do that when we dream.
link |
01:36:08.040
Like dreams are funny because like,
link |
01:36:10.480
if you have some control or awareness in your dreams,
link |
01:36:14.360
like it's very realistic looking
link |
01:36:16.360
or not realistic, it depends on the dream.
link |
01:36:19.400
But you can also manipulate that.
link |
01:36:22.440
And you know, what's possible there is odd
link |
01:36:26.200
and the fact that nobody understands it is hilarious, but.
link |
01:36:29.840
Do you think it's possible to expand
link |
01:36:31.720
that capability through computing?
link |
01:36:34.000
Sure.
link |
01:36:35.320
Is there some interesting,
link |
01:36:36.480
so from a hardware designer perspective,
link |
01:36:38.400
is there, do you think it'll present totally new challenges
link |
01:36:41.600
in the kind of hardware that's required? Like,
link |
01:36:44.080
so this hardware isn't standalone computing.
link |
01:36:47.720
So just take it from this, so today,
link |
01:36:50.240
computer games are rendered by GPUs, right?
link |
01:36:53.640
So, but you've seen the GAN stuff, right?
link |
01:36:56.840
Where trained neural networks render realistic images,
link |
01:37:00.880
but there's no pixels, no triangles, no shaders,
link |
01:37:03.760
no light maps, no nothing.
link |
01:37:05.400
So the future of graphics is probably AI, right?
link |
01:37:09.960
Now that AI is heavily trained by lots of real data, right?
link |
01:37:14.840
So if you have an interface with an AI renderer, right?
link |
01:37:20.360
So if you say render a cat,
link |
01:37:22.760
it won't say, well, how tall is the cat
link |
01:37:24.520
and how big, you know, it'll render a cat.
link |
01:37:26.280
You might say, oh, a little bigger, a little smaller,
link |
01:37:28.200
you know, make it a tabby, shorter hair, you know,
link |
01:37:31.320
like you could tweak it.
link |
01:37:32.880
Like the amount of data you'll have to send
link |
01:37:36.520
to interact with a very powerful AI renderer could be low.
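The bandwidth point is that the interface to a learned renderer can be a handful of numbers rather than megabytes of geometry, because everything the renderer learned from real images lives in its weights. A toy sketch of such a conditional renderer; the conditioning dimensions ("bigger", "tabby", "shorter hair") and the architecture are purely illustrative:

```python
import torch
import torch.nn as nn

cond_dim = 8            # e.g. size, breed, hair length, ... (illustrative)
# Toy "AI renderer": a few conditioning numbers in, a full image out.
renderer = nn.Sequential(
    nn.Linear(cond_dim, 512), nn.ReLU(),
    nn.Linear(512, 64 * 64 * 3), nn.Tanh(),
)

cat_spec = torch.zeros(1, cond_dim)
cat_spec[0, 0] = 0.7    # "a little bigger"
image = renderer(cat_spec).view(1, 3, 64, 64)   # no triangles, no shaders
```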
link |
01:37:41.400
But the question is, for brain computer interfaces,
link |
01:37:44.800
we need to render not onto a screen,
link |
01:37:47.840
but render onto the brain.
link |
01:37:50.360
And like directly, so there's a bandwidth.
link |
01:37:53.040
Well, we could do it both ways.
link |
01:37:53.880
I mean, our eyes are really good sensors.
link |
01:37:56.000
They could render onto a screen
link |
01:37:58.560
and we could feel like we're participating in it.
link |
01:38:01.120
You know, they're gonna have, you know,
link |
01:38:03.360
like the Oculus kind of stuff.
link |
01:38:04.880
It's gonna be so good when it's projecting to your eyes,
link |
01:38:07.040
you think it's real.
link |
01:38:08.040
You know, they're slowly solving those problems.
link |
01:38:12.520
And I suspect when the renderer of that information
link |
01:38:17.240
into your head is also AI mediated,
link |
01:38:20.600
they'll be able to give you the cues that, you know,
link |
01:38:23.480
you really want for depth and all kinds of stuff.
link |
01:38:27.280
Like your brain is partly faking your visual field, right?
link |
01:38:30.920
Like your eyes are twitching around,
link |
01:38:32.680
but you don't notice that.
link |
01:38:33.800
Occasionally they blank, you don't notice that.
link |
01:38:36.520
You know, there's all kinds of things.
link |
01:38:37.760
Like you think you see over here,
link |
01:38:39.120
but you don't really see there.
link |
01:38:40.800
It's all fabricated.
link |
01:38:42.160
Yeah, peripheral vision is fascinating.
link |
01:38:45.480
So if you have an AI renderer that's trained
link |
01:38:48.520
to understand exactly how you see
link |
01:38:51.640
and the kind of things that enhance the realism
link |
01:38:54.760
of the experience, it could be super real, actually.
link |
01:39:01.120
So I don't know what the limits of that are,
link |
01:39:03.480
but obviously if we have a brain interface
link |
01:39:06.920
that goes inside your, you know, visual cortex
link |
01:39:10.440
in a better way than your eyes do, which is possible.
link |
01:39:13.480
It's a lot of neurons.
link |
01:39:17.040
Maybe that'll be even cooler.
link |
01:39:19.760
Well, the really cool thing is it has to do
link |
01:39:21.560
with the infinite fun that you were referring to,
link |
01:39:24.200
which is our brains have to be very limited.
link |
01:39:26.600
And like you said, computations.
link |
01:39:28.160
It's also very plastic.
link |
01:39:29.880
Very plastic, yeah.
link |
01:39:31.040
So it's an interesting combination.
link |
01:39:33.600
The interesting open question is the limits
link |
01:39:37.480
of that neuroplasticity.
link |
01:39:38.800
Like how flexible is that thing?
link |
01:39:42.360
Because we haven't really tested it.
link |
01:39:44.920
We know about that experiment where they put
link |
01:39:47.000
like a pressure pad on somebody's head
link |
01:39:49.160
and had a visual transducer pressurize it
link |
01:39:51.560
and somebody slowly learned to see.
link |
01:39:53.520
Yep.
link |
01:39:55.920
Especially at a young age, if you throw a lot at it,
link |
01:39:58.760
like what can it completely,
link |
01:40:03.960
so can you like arbitrarily expand it with computing power?
link |
01:40:06.880
So connected to the internet directly somehow.
link |
01:40:09.880
Yeah, the answer is probably yes.
link |
01:40:11.960
So the problem with biology and ethics is like,
link |
01:40:14.400
there's a mess there.
link |
01:40:15.560
Like us humans are perhaps unwilling to take risks
link |
01:40:21.840
into directions that are full of uncertainty.
link |
01:40:25.600
So it's like.
link |
01:40:26.440
90% of the population's unwilling to take risks.
link |
01:40:28.880
The other 10% is rushing into the risks
link |
01:40:31.320
unaided by any infrastructure whatsoever.
link |
01:40:34.160
And that's where all the fun happens in this society.
link |
01:40:38.920
There's been huge transformations
link |
01:40:41.120
in the last couple of thousand years.
link |
01:40:43.600
Yeah, it's funny.
link |
01:40:44.560
I got the chance to interact with this Matthew Johnson
link |
01:40:48.200
from Johns Hopkins.
link |
01:40:49.360
He's doing this large scale study of psychedelics.
link |
01:40:52.520
It's becoming more and more.
link |
01:40:54.240
I've gotten a chance to interact
link |
01:40:55.240
with that community of scientists working on psychedelics.
link |
01:40:57.800
But because of that, that opened the door to me
link |
01:41:00.120
to all these, what are they called?
link |
01:41:02.760
Psychonauts, the people who, like you said, the 10%.
link |
01:41:06.480
Like I don't care.
link |
01:41:08.040
I don't know if there's a science behind this.
link |
01:41:09.840
I'm taking the spaceship to, if I can be the first on Mars,
link |
01:41:13.640
I'll be, you know. Psychedelics are interesting in the sense
link |
01:41:17.480
that in another dimension, like you said,
link |
01:41:21.440
it's a way to explore the limits of the human mind.
link |
01:41:25.440
Like what is this thing capable of doing?
link |
01:41:28.240
Cause you kind of, like when you dream,
link |
01:41:30.560
you detach it, I don't know exactly
link |
01:41:32.240
the neuroscience of it,
link |
01:41:33.080
but you detach your like reality from what your mind,
link |
01:41:39.000
the images your mind is able to conjure up
link |
01:41:40.800
and your mind goes into weird places.
link |
01:41:43.160
And like entities appear,
link |
01:41:44.960
somehow Freudian type of like trauma
link |
01:41:48.760
is probably connected in there somehow.
link |
01:41:50.320
You start to have like these weird, vivid worlds that like.
link |
01:41:54.000
So do you actively dream?
link |
01:41:56.360
Do you? Why not?
link |
01:41:59.240
I have like six hours of dreams
link |
01:42:00.960
and it's like real useful time.
link |
01:42:03.080
I know.
link |
01:42:03.920
I don't, I don't for some reason, I just knock out
link |
01:42:07.320
and I have sometimes like anxiety inducing kind of like
link |
01:42:12.120
a very pragmatic like nightmare type of dreams,
link |
01:42:16.640
but nothing fun, nothing.
link |
01:42:18.440
Nothing fun?
link |
01:42:19.280
Nothing fun.
link |
01:42:20.600
I try. I unfortunately mostly have fun
link |
01:42:24.640
in the waking world,
link |
01:42:26.520
which is very limited in the amount of fun you can have.
link |
01:42:30.000
It's not that limited either.
link |
01:42:31.200
Yeah, that's why we'll have to talk.
link |
01:42:35.040
Yeah, and your instructions.
link |
01:42:36.840
Yeah.
link |
01:42:37.680
There's like a manual for that.
link |
01:42:38.640
You might wanna.
link |
01:42:41.000
I looked it up.
link |
01:42:41.840
I'll ask you, what did you dream?
link |
01:42:44.680
You know, years ago when I read about, you know,
link |
01:42:47.040
like, you know, a book about how to, you know,
link |
01:42:51.360
become aware in your dreams.
link |
01:42:53.080
I worked on it for a while.
link |
01:42:54.320
Like there's this trick about, you know,
link |
01:42:55.960
imagine you can see your hands and look out
link |
01:42:58.240
and I got somewhat good at it.
link |
01:43:00.640
But mostly, when I'm thinking about things
link |
01:43:04.360
or working on problems,
link |
01:43:05.440
I prep myself before I go to sleep.
link |
01:43:09.040
It's like, I pull into my mind all the things
link |
01:43:13.160
I wanna work on or think about.
link |
01:43:15.400
And then that, let's say, greatly improves the chances
link |
01:43:19.840
that I'll work on that while I'm sleeping.
link |
01:43:23.400
And then I also, you know, basically ask myself to remember it.
link |
01:43:30.320
And I often remember it in detail.
link |
01:43:33.200
Within the dream.
link |
01:43:34.120
Yeah.
link |
01:43:34.960
Or outside the dream.
link |
01:43:35.800
Well, to bring it up in my dreaming
link |
01:43:37.800
and then to remember it when I wake up.
link |
01:43:41.040
It's just, it's more of a meditative practice,
link |
01:43:43.360
you could say, to prepare yourself to do that.
link |
01:43:48.920
Like if you go to, you know, to sleep,
link |
01:43:50.560
still gnashing your teeth about some random thing
link |
01:43:52.960
that happened that you're not really that interested in,
link |
01:43:55.520
you'll dream about it.
link |
01:43:57.960
That's really interesting.
link |
01:43:58.800
Maybe.
link |
01:43:59.640
But you can direct your dreams somewhat by prepping.
link |
01:44:04.440
Yeah, I'm gonna have to try that.
link |
01:44:05.440
It's really interesting.
link |
01:44:06.400
Like the most important, the interesting,
link |
01:44:08.440
not like, why did this guy send an email,
link |
01:44:12.240
kind of like stupid worry stuff,
link |
01:44:14.080
but like fundamental problems you're actually concerned about.
link |
01:44:16.320
Yeah.
link |
01:44:17.160
Prepping.
link |
01:44:18.000
And interesting things you're worried about.
link |
01:44:18.840
Interesting.
link |
01:44:19.680
Or most of your reading or, you know,
link |
01:44:20.520
some great conversation you had
link |
01:44:21.360
or some adventure you want to have.
link |
01:44:23.480
Like there's a lot of space there and it seems to work
link |
01:44:31.080
that, you know, my percentage of interesting dreams
link |
01:44:34.400
and memories went up.
link |
01:44:36.440
Is there, is that the source of,
link |
01:44:40.440
if you were able to deconstruct like where
link |
01:44:42.760
some of your best ideas came from,
link |
01:44:45.720
is there a process that's at the core of that?
link |
01:44:49.440
Like, so some people, you know, walk and think,
link |
01:44:52.440
some people like in the shower, the best ideas hit them.
link |
01:44:55.200
If you talk about like Newton,
link |
01:44:56.560
the apple hitting him on the head.
link |
01:44:58.600
No, I found out a long time ago,
link |
01:45:01.120
I process things somewhat slowly.
link |
01:45:03.240
So like in college, I had friends that could study at
link |
01:45:05.760
the last minute and get an A the next day.
link |
01:45:07.560
I can't do that at all.
link |
01:45:09.080
So I always front loaded all the work.
link |
01:45:10.960
Like I do all the problems early, you know,
link |
01:45:14.200
for finals, like the last three days,
link |
01:45:15.840
I wouldn't look at a book because I want, you know,
link |
01:45:18.840
cause like a new fact the day before finals may screw up
link |
01:45:22.240
my understanding of what I thought I knew.
link |
01:45:23.920
So my goal was to always get it in
link |
01:45:27.240
and give it time to soak.
link |
01:45:29.920
And I used to, you know,
link |
01:45:32.080
I remember when we were doing like 3D calculus,
link |
01:45:33.840
I would have these amazing dreams of 3D surfaces
link |
01:45:36.320
with normals, you know, calculating the gradient.
link |
01:45:38.600
And it would just, like, all come up.
link |
01:45:40.160
So it was like really fun, like very visual.
link |
01:45:43.920
And if I got cycles of that, that was useful.
link |
01:45:48.480
And the other is just don't over filter your ideas.
link |
01:45:50.960
Like I like that process of brainstorming
link |
01:45:54.520
where lots of ideas can happen.
link |
01:45:55.640
I like people who have lots of ideas.
link |
01:45:57.400
And then you just let them sit.
link |
01:45:58.800
Then there's a, yeah, I'll let them sit
link |
01:46:00.240
and let it breathe a little bit.
link |
01:46:02.560
And then reduce it to practice.
link |
01:46:05.000
Like at some point you really have to ask, does it really work?
link |
01:46:09.920
Like, you know, is this real or not?
link |
01:46:13.000
Right, but you have to do both.
link |
01:46:15.040
There's creative tension there.
link |
01:46:16.160
Like how do you be both open and, you know, precise?
link |
01:46:20.480
Have you had ideas that you just,
link |
01:46:22.280
that sit in your mind for like years before the?
link |
01:46:26.120
Sure.
link |
01:46:27.360
That's it.
link |
01:46:28.200
It's an interesting way to just generate ideas
link |
01:46:31.760
and just let them sit.
link |
01:46:33.120
Let them sit there for a while.
link |
01:46:35.160
I think I have a few of those ideas.
link |
01:46:38.480
You know, it was so funny.
link |
01:46:40.160
Yeah, I think that's, you know, Creativity
link |
01:46:43.720
101 or something.
link |
01:46:45.760
For the slow thinkers in the room, I suppose.
link |
01:46:49.400
Whereas some people, like you said, are just, like, the...
link |
01:46:53.320
Yeah, it's really interesting.
link |
01:46:54.880
There's so much diversity in how people think, you know,
link |
01:46:58.080
how fast or slow they are, how well they remember,
link |
01:47:00.400
or don't. Like, you know, I'm not super good at remembering facts,
link |
01:47:04.080
but processes and methods.
link |
01:47:06.480
Like in our engineering, I went to Penn State
link |
01:47:08.080
and almost all our engineering tests were open book.
link |
01:47:11.880
I could remember the page and not the formula.
link |
01:47:14.840
As soon as I saw the formula,
link |
01:47:15.920
I could remember the whole method if I, if I'd learned it.
link |
01:47:19.760
Yeah.
link |
01:47:20.600
So it's a funny, where some people could, you know,
link |
01:47:23.480
I just watched friends like flipping through the book,
link |
01:47:25.600
trying to find the formula,
link |
01:47:27.480
even knowing that they'd done just as much work.
link |
01:47:30.120
Now, we'd just open the book and I was on page 27,
link |
01:47:33.080
about halfway down, and I could see the whole thing visually.
link |
01:47:35.960
Yeah.
link |
01:47:36.800
And, you know.
link |
01:47:37.640
And you have to learn that about yourself
link |
01:47:39.040
and figure out how to, how to function optimally.
link |
01:47:41.480
I had a friend who, he was always concerned.
link |
01:47:43.320
He didn't know how he came up with ideas.
link |
01:47:45.760
He had lots of ideas, but he said they just sort of popped up.
link |
01:47:49.120
Like he'd be working on something, he had this idea.
link |
01:47:51.000
Like, where does it come from?
link |
01:47:53.320
But you can have more awareness of it.
link |
01:47:54.840
Like, like how you,
link |
01:47:58.040
how your brain works is a little murky as you go down
link |
01:48:00.440
from the voice in your head or the obvious visualizations.
link |
01:48:03.880
Like when you visualize something, how does that happen?
link |
01:48:06.560
Yeah, that's right.
link |
01:48:07.400
You know, if I say, you know, visualize a volcano,
link |
01:48:09.040
it's easy to do, right?
link |
01:48:09.880
And what does it actually look like when you visualize it?
link |
01:48:12.520
I can visualize to the point where I don't see
link |
01:48:14.400
very much out of my eyes and I see the colors
link |
01:48:16.240
of the thing I'm visualizing.
link |
01:48:18.240
Yeah, but there's like a, there's a shape, there's a texture,
link |
01:48:20.560
there's a color, but there's also conceptual visualization.
link |
01:48:23.120
Like, what are you actually visualizing
link |
01:48:25.680
when you're visualizing a volcano?
link |
01:48:27.200
Just like with peripheral vision,
link |
01:48:28.480
you think you see the whole thing.
link |
01:48:29.680
Yeah, yeah, yeah.
link |
01:48:30.520
That's a good way to say it.
link |
01:48:31.800
You know, you have this kind of almost peripheral vision
link |
01:48:34.840
of your visualizations, they're like these ghosts.
link |
01:48:38.400
But if, you know, if you, if you work on it,
link |
01:48:40.160
you can get a pretty high level of detail.
link |
01:48:42.280
And somehow you can walk along those visualizations
link |
01:48:44.360
to come up with an idea, which is weird.
link |
01:48:47.200
But when you're thinking about solving problems,
link |
01:48:50.920
like you're putting information in,
link |
01:48:52.960
you're exercising the stuff you do know,
link |
01:48:55.720
you're sort of teasing the area that
link |
01:48:58.120
you don't understand and don't know,
link |
01:49:00.680
but you can almost, you know, feel,
link |
01:49:04.600
you know, that process happening.
link |
01:49:06.560
You know, that's, that's how I, like,
link |
01:49:10.040
like, I know sometimes when I'm working really hard
link |
01:49:12.000
on something, like, I get really hot when I'm sleeping.
link |
01:49:14.880
And, you know, it's like, we've got the blanket throw,
link |
01:49:17.280
I wake up with the blanket throw on the floor.
link |
01:49:20.040
And, you know, every time, it's like, I wake up
link |
01:49:22.400
and think, wow, that was great, you know.
link |
01:49:25.360
Oh, you're able to reverse engineer
link |
01:49:27.600
what the hell happened there?
link |
01:49:29.000
Oh, sometimes it's vivid dreams.
link |
01:49:30.360
And sometimes it's just kind of like you say,
link |
01:49:32.520
like shadow thinking that you sort of have this feeling
link |
01:49:35.160
you're going through this stuff,
link |
01:49:36.960
but it's not that obvious.
link |
01:49:38.760
Isn't that so amazing that the mind just does
link |
01:49:40.960
all these little experiments?
link |
01:49:42.880
I never, you know, I thought, I always thought,
link |
01:49:45.240
it's like a river that you can't,
link |
01:49:46.760
you're just there for the ride, but you're right.
link |
01:49:48.920
If you prep it.
link |
01:49:50.360
No, it's all understandable.
link |
01:49:52.400
Meditation really helps.
link |
01:49:53.720
You gotta start figuring out, you need to learn the language
link |
01:49:56.240
of your own mind.
link |
01:49:59.280
And there's multiple levels of it, but.
link |
01:50:02.600
The abstractions again, right?
link |
01:50:04.000
It's somewhat comprehensible and observable and feelable
link |
01:50:08.400
or whatever the right word is.
link |
01:50:11.920
You know, you're not along for the ride.
link |
01:50:13.640
You are the ride.
link |
01:50:15.560
I have to ask you, hardware engineer working
link |
01:50:18.320
on neural networks now, what's consciousness?
link |
01:50:21.400
What the hell is that thing?
link |
01:50:22.800
Is that, is that just some little weird quirk
link |
01:50:25.920
of our particular computing device?
link |
01:50:29.240
Or is it something fundamental that we really need
link |
01:50:31.240
to crack open it for, to build like good computers?
link |
01:50:36.520
Do you ever think about consciousness?
link |
01:50:37.920
Like why it feels like something to be?
link |
01:50:40.000
I know, it's really weird.
link |
01:50:42.600
So.
link |
01:50:43.640
Yeah.
link |
01:50:45.520
I mean, everything about it's weird.
link |
01:50:47.960
First, it's like half a second behind reality, right?
link |
01:50:51.320
It's a post hoc narrative about what happened.
link |
01:50:53.760
You've already done stuff by the time you're conscious of it.
link |
01:50:58.840
And your consciousness generally is a single threaded thing,
link |
01:51:01.200
but we know your brain is 10 billion neurons running
link |
01:51:04.120
some crazy parallel thing.
link |
01:51:07.960
And there's a really big sorting thing going on there.
link |
01:51:11.160
It also seems to be really reflective in the sense
link |
01:51:13.560
that you create a space in your head, right?
link |
01:51:17.960
Like we don't really see anything, right?
link |
01:51:19.600
Like photons hit your eyes, it gets turned into signals,
link |
01:51:22.800
it goes through multiple layers of neurons.
link |
01:51:25.560
You know, like I'm so curious that, you know,
link |
01:51:28.160
that looks glassy and that looks not glassy.
link |
01:51:30.440
And like, like how the resolution of your vision is so high,
link |
01:51:33.480
you had to go through all this processing.
link |
01:51:36.040
Where for most of it, it looks nothing like vision, right?
link |
01:51:39.640
Like, like there's no theater in your mind, right?
link |
01:51:43.600
So we, we have a world in our heads.
link |
01:51:46.800
We're literally isolated behind our sensors,
link |
01:51:51.080
but we can look at it, speculate about it,
link |
01:51:55.160
speculate about alternatives, problem solve, what if,
link |
01:51:59.280
you know, there's so many things going on,
link |
01:52:02.240
and that process is lagging reality.
link |
01:52:05.640
And it's single threaded,
link |
01:52:07.040
even though the underlying thing is like massively parallel.
link |
01:52:09.880
Yeah, so it's, it's so curious.
link |
01:52:12.240
So imagine you're building an AI computer,
link |
01:52:14.000
if you wanted to replicate humans,
link |
01:52:15.840
well, you'd have huge arrays of neural networks,
link |
01:52:17.800
and apparently only six or seven deep, which is hilarious.
link |
01:52:21.840
They don't even remember seven numbers,
link |
01:52:23.160
but I think we can upgrade that a lot, right?
link |
01:52:25.640
And then somewhere in there,
link |
01:52:27.680
you would train the network to create basically
link |
01:52:30.040
the world that you live in, right?
link |
01:52:32.360
So like tell stories to itself about the world
link |
01:52:35.280
that it's perceiving.
link |
01:52:36.240
Well, create the world, tell stories in the world,
link |
01:52:40.240
and then have many dimensions of, you know,
link |
01:52:45.080
like side jokes to it.
link |
01:52:47.640
Like we have an emotional structure,
link |
01:52:49.280
like we have a biological structure,
link |
01:52:51.480
and that seems hierarchical too,
link |
01:52:52.720
like if you're hungry, it dominates your thinking.
link |
01:52:55.600
If you're mad, it dominates your thinking.
link |
01:52:57.880
Like, and we don't know if that's important
link |
01:53:00.320
to consciousness or not, but it certainly disrupts,
link |
01:53:03.040
you know, intrudes in the consciousness.
link |
01:53:05.720
Like so there's lots of structure to that,
link |
01:53:08.120
and we like to dwell on the past,
link |
01:53:09.840
we like to think about the future,
link |
01:53:11.240
we like to imagine, we like to fantasize, right?
link |
01:53:14.680
And the somewhat circular observation of that
link |
01:53:18.520
is the thing we call consciousness.
link |
01:53:21.720
Now, if you created a computer system
link |
01:53:23.320
that did all things, created worldviews,
link |
01:53:24.880
created future alternate histories,
link |
01:53:27.320
you know, dwelled on past events,
link |
01:53:29.040
you know, accurately or semi accurately, you know, it's...
link |
01:53:33.000
Would consciousness just spring up, like, naturally?
link |
01:53:35.320
Well, would that feel, look and feel conscious to you?
link |
01:53:38.080
Like you seem conscious to me, but I don't know.
link |
01:53:39.920
Like in an external observer sense.
link |
01:53:41.760
Do you think a thing that looks conscious is conscious?
link |
01:53:44.920
Like, do you, again,
link |
01:53:47.080
this is like an engineering kind of question, I think,
link |
01:53:49.240
because like, if we want to engineer consciousness,
link |
01:53:56.800
is it okay to engineer something
link |
01:53:58.280
that just looks conscious?
link |
01:54:00.720
Or is there a difference between something that is...
link |
01:54:02.680
Well, we all have consciousness
link |
01:54:04.040
because it's a super effective way to manage our affairs.
link |
01:54:07.120
Yeah, yeah, this is a social element, yeah.
link |
01:54:09.000
Well, it gives us a planning system, you know,
link |
01:54:11.520
we have a huge amount of stuff.
link |
01:54:13.280
Like when we're talking,
link |
01:54:14.720
like the reason we can talk really fast is we're modeling
link |
01:54:16.880
each other at a really high level of detail.
link |
01:54:19.080
And consciousness is required for that.
link |
01:54:21.360
Well, all those components together manifest consciousness.
link |
01:54:26.160
Right?
link |
01:54:27.000
So if we make intelligent beings that we want to interact with
link |
01:54:29.600
that we're like, you know, wondering what they're thinking,
link |
01:54:32.040
you know, you know, looking forward to seeing them,
link |
01:54:34.920
you know, when we interact with them,
link |
01:54:36.480
they're interesting, surprising, you know, fascinating,
link |
01:54:40.800
you know, they will probably feel conscious like we do
link |
01:54:43.480
and we'll perceive them as conscious.
link |
01:54:47.160
I don't know why not, but you never know.
link |
01:54:49.960
Another fun question on this,
link |
01:54:51.440
because in, from a computing perspective,
link |
01:54:55.040
we're trying to create something that's human like
link |
01:54:56.680
or super human like.
link |
01:54:59.720
Let me ask you about aliens.
link |
01:55:01.280
Aliens.
link |
01:55:04.400
Do you think there's intelligent alien civilizations
link |
01:55:08.440
out there and do you think their technology,
link |
01:55:13.160
their computing, their AI bots,
link |
01:55:16.480
their chips are of the same nature as ours?
link |
01:55:21.280
Yeah, I got no idea.
link |
01:55:23.120
I mean, if there's lots of aliens out there,
link |
01:55:24.960
they've been awfully quiet.
link |
01:55:27.320
You know, there's all this speculation about why.
link |
01:55:30.680
there seems to be more than enough planets out there.
link |
01:55:34.960
There's a lot.
link |
01:55:35.800
Yeah.
link |
01:55:37.480
There's intelligent life on this planet
link |
01:55:38.960
that seems quite different, you know, like, you know,
link |
01:55:41.720
dolphins seem like plausibly understandable.
link |
01:55:44.600
Octopuses don't seem understandable at all.
link |
01:55:47.640
If they live longer than a year,
link |
01:55:48.800
maybe they would be running the planet.
link |
01:55:51.000
They seem really smart.
link |
01:55:52.720
And their neuro architecture is completely different than ours.
link |
01:55:56.560
Now, who knows how they perceive things.
link |
01:55:58.680
I mean, that's the question is for us intelligent beings,
link |
01:56:01.200
we might not be able to perceive other kinds of intelligence
link |
01:56:03.600
if they become sufficiently different than us.
link |
01:56:05.600
Yeah, like we live in the current constrained world,
link |
01:56:08.760
you know, it's three dimensional geometry
link |
01:56:10.600
and the geometry defines a certain amount of physics.
link |
01:56:14.440
And, you know, there's, like, how time seems to work.
link |
01:56:18.200
Like there's so many things that seem like a whole bunch
link |
01:56:21.680
of the input parameters to the, you know,
link |
01:56:23.480
another conscious being are the same.
link |
01:56:26.440
Yes.
link |
01:56:27.280
Like if it's biological, biological things seem to be
link |
01:56:29.960
in a relatively narrow temperature range, right?
link |
01:56:32.920
Because, you know, organics aren't stable,
link |
01:56:35.600
too cold or too hot, you know, so, so there's,
link |
01:56:39.200
if you specify the list of things that input to that,
link |
01:56:45.240
but as soon as we make really smart, you know, beings
link |
01:56:49.560
and they go figure out how to think about a billion numbers
link |
01:56:52.040
at the same time and then how to think in n dimensions.
link |
01:56:56.040
There's a funny science fiction book
link |
01:56:57.320
where all of society had uploaded into this matrix.
link |
01:57:01.560
And at some point, some of the beings in the matrix thought,
link |
01:57:05.320
I wonder if there's intelligent life out there.
link |
01:57:07.880
So they had to do a whole bunch of work to figure out
link |
01:57:09.920
like how to make a physical thing
link |
01:57:12.360
because their matrix was self sustaining
link |
01:57:14.960
and they made a little spaceship
link |
01:57:16.120
and they traveled to another planet.
link |
01:57:17.720
When they got there, there was like life running around,
link |
01:57:20.600
but there was no intelligent life.
link |
01:57:22.640
And then they figured out that there were these huge,
link |
01:57:26.200
you know, organic matrices all over the planet inside there
link |
01:57:29.560
where intelligent beings had uploaded themselves
link |
01:57:31.760
into that matrix.
link |
01:57:34.960
So everywhere intelligent life was, as soon as it got smart,
link |
01:57:40.480
it upleveled itself into something way more interesting
link |
01:57:43.640
than 3D geometry and...
link |
01:57:45.160
Yeah, it escaped whatever the...
link |
01:57:47.080
It's not escaped, it's...
link |
01:57:48.400
Upload was better.
link |
01:57:49.800
The essence of what we think of as an intelligent being,
link |
01:57:53.240
I tend to like the thought experiment of the organism,
link |
01:57:58.120
like humans aren't the organisms.
link |
01:58:00.400
I like the notion of like Richard Dawkins and memes
link |
01:58:03.760
that ideas themselves are the organisms,
link |
01:58:08.040
like they're just using our minds to evolve.
link |
01:58:11.520
So like we're just like meat receptacles
link |
01:58:15.240
for ideas to breed and multiply and so on.
link |
01:58:18.200
And maybe those are the aliens.
link |
01:58:20.920
Yeah.
link |
01:58:22.240
So Jordan Peterson has a line that says, you know,
link |
01:58:26.720
you think you have ideas, but ideas have you.
link |
01:58:29.200
Yeah.
link |
01:58:30.040
Right?
link |
01:58:30.880
Good line.
link |
01:58:31.720
And then we know about the phenomenon of groupthink
link |
01:58:34.280
and there's so many things that constrain us.
link |
01:58:37.960
But I think you can examine all that
link |
01:58:39.960
and not be completely owned by the ideas
link |
01:58:43.320
and completely sucked into groupthink.
link |
01:58:46.160
And part of your responsibility as a human
link |
01:58:49.840
is to escape that kind of phenomena, which isn't...
link |
01:58:52.800
You know, it's one of the creative tension things again.
link |
01:58:55.920
You're constructed by it, but you can still observe it
link |
01:58:59.520
and you can think about it
link |
01:59:00.760
and you can make choices about to some level
link |
01:59:04.040
how constrained you are by it.
link |
01:59:06.960
And, you know, it's useful to do that.
link |
01:59:12.000
And...
link |
01:59:13.480
But at the same time, and it could be by doing that,
link |
01:59:17.400
you know, the group and society you're part of
link |
01:59:21.480
becomes collectively even more interesting.
link |
01:59:24.200
So, you know, so that the outside observer will think,
link |
01:59:27.040
wow, you know, all these Lexes running around
link |
01:59:30.080
with all these really independent ideas
link |
01:59:31.560
have created something even more interesting
link |
01:59:33.720
in aggregate.
link |
01:59:35.720
So, so I don't know.
link |
01:59:39.760
Those are lenses to look at the situation.
link |
01:59:41.880
But it's all...
link |
01:59:42.720
That'll give you some inspiration,
link |
01:59:43.560
but I don't think they're constraints.
link |
01:59:45.480
Right, you know.
link |
01:59:46.720
As a small little quirk of history,
link |
01:59:49.360
it seems like you're related to Jordan Peterson,
link |
01:59:53.680
like you mentioned.
link |
01:59:54.920
He's going through some rough stuff now.
link |
01:59:57.680
Is there some comment you can make
link |
01:59:59.200
about the roughness of the human journey,
link |
02:00:02.640
ups and downs?
link |
02:00:04.320
Well, I became an expert in benzo withdrawal.
link |
02:00:10.800
Like, which is, you take benzodiazepines
link |
02:00:13.640
and at some point they interact with GABA circuits,
link |
02:00:19.040
you know, to reduce anxiety and do a hundred other things.
link |
02:00:21.960
Like, there's actually no known list of everything they do
link |
02:00:25.120
because they interact with so many parts of your body.
link |
02:00:28.240
And then once you're on them, you habituate to them
link |
02:00:30.520
and you have a dependency.
link |
02:00:32.640
It's not like a drug dependency
link |
02:00:34.200
where you're trying to get high.
link |
02:00:35.080
It's a metabolic dependency.
link |
02:00:38.880
And then if you discontinue them,
link |
02:00:42.640
there's a funny thing called kindling,
link |
02:00:45.400
which is if you stop them and then go,
link |
02:00:47.600
you know, you'll have horrible withdrawal symptoms.
link |
02:00:49.960
If you go back on them at the same level,
link |
02:00:51.480
you won't be stable.
link |
02:00:53.280
And that unfortunately happened to him.
link |
02:00:55.840
Because it's so deeply integrated
link |
02:00:57.280
into all the kinds of systems in the body?
link |
02:00:58.880
It literally changes the size and numbers
link |
02:01:00.840
of neurotransmitter sites in your brain.
link |
02:01:03.880
So there's a process called the Ashton protocol
link |
02:01:07.400
where you taper it down slowly over two years.
link |
02:01:10.360
People who go through that go through unbelievable hell.
link |
02:01:13.720
And what Jordan went through seemed to be worse
link |
02:01:15.680
because on advice of doctors, you know,
link |
02:01:18.520
we'll stop taking these and take this.
link |
02:01:20.320
It was a disaster and he got worse.
link |
02:01:23.920
Yeah, it was pretty tough.
link |
02:01:26.680
He seems to be doing quite a bit better intellectually.
link |
02:01:29.240
You can see his brain clicking back together.
link |
02:01:32.040
I spent a lot of time with him.
link |
02:01:32.960
I've never seen anybody suffer so much.
link |
02:01:34.960
Well, his brain is also like this powerhouse, right?
link |
02:01:37.720
So I wonder, does a brain that's able to think deeply
link |
02:01:42.480
about the world suffer more to these kinds of withdrawals?
link |
02:01:45.280
Like, I don't know.
link |
02:01:46.640
I've watched videos of people going through withdrawal.
link |
02:01:49.440
They all seem to suffer unbelievably.
link |
02:01:54.000
And, you know, my heart goes out to everybody.
link |
02:01:57.520
And there's some funny math about this.
link |
02:01:59.240
Some doctors said as best you can tell, you know,
link |
02:02:01.920
there's the standard recommendation:
link |
02:02:03.520
don't take them for more than a month
link |
02:02:04.720
and then taper over a couple of weeks.
link |
02:02:07.120
Many doctors prescribe them endlessly,
link |
02:02:09.320
which is against the protocol, but it's common, right?
link |
02:02:13.080
And then something like 75% of people,
link |
02:02:16.600
when they taper, you know,
link |
02:02:18.520
half the people have difficulty,
link |
02:02:19.840
but 75% get off okay.
link |
02:02:22.080
20% have severe difficulty
link |
02:02:24.000
and 5% have life threatening difficulty.
link |
02:02:27.280
And if you're one of those, it's really bad.
link |
02:02:29.520
And the stories that people have on this
link |
02:02:31.520
is heartbreaking and tough.
link |
02:02:34.960
So you put some of the fault at the doctors.
link |
02:02:36.800
Do they just not know what the hell they're doing?
link |
02:02:38.600
Oh, that's hard to say.
link |
02:02:40.520
It's one of those commonly prescribed things.
link |
02:02:43.080
Like one doctor said, what happens is
link |
02:02:46.080
if you're prescribed them for a reason
link |
02:02:47.800
and then you have a hard time getting off,
link |
02:02:49.880
the protocol basically says you're either crazy
link |
02:02:52.440
or dependent and you get kind of pushed
link |
02:02:55.480
into a different treatment regime.
link |
02:02:58.360
You're a drug addict or a psychiatric patient.
link |
02:03:01.800
And so like one doctor said, you know,
link |
02:03:04.080
I prescribed them for 10 years thinking
link |
02:03:05.520
I was helping my patients
link |
02:03:06.560
and I realized I was really harming them.
link |
02:03:09.600
And, you know, the awareness of that is slowly coming up.
link |
02:03:12.880
The fact that they're casually prescribed to people
link |
02:03:18.160
is horrible and it's bloody scary.
link |
02:03:23.800
And some people are stable on them,
link |
02:03:25.040
but they're on them for life.
link |
02:03:26.240
Like once you, you know,
link |
02:03:27.080
it's another one of those drugs that,
link |
02:03:29.240
but benzos long term have real impacts
link |
02:03:31.360
on your personality.
link |
02:03:32.560
People talk about the Benzo bubble
link |
02:03:34.120
where you get disassociated from reality
link |
02:03:36.320
and your friends a little bit.
link |
02:03:38.200
It's really terrible.
link |
02:03:40.360
The mind is terrifying.
link |
02:03:41.720
We were talking about the infinite possibility of fun,
link |
02:03:45.480
but like it's the infinite possibility of suffering too,
link |
02:03:48.640
which is one of the dangers of like expansion
link |
02:03:52.320
of the human mind.
link |
02:03:53.480
It's like, I wonder, of all the possible human experiences
link |
02:03:58.200
that an intelligent computer can have,
link |
02:04:01.680
is it mostly fun or is it mostly suffering?
link |
02:04:05.840
So like if you brute force expand the set of possibilities
link |
02:04:10.840
like are you going to run into some trouble
link |
02:04:13.960
in terms of like torture and suffering and so on?
link |
02:04:16.560
Maybe our human brain is just protecting us
link |
02:04:18.840
from much more possible pain and suffering.
link |
02:04:22.280
Maybe the space of pain is like much larger
link |
02:04:25.960
than we could possibly imagine.
link |
02:04:28.120
The world's in a balance.
link |
02:04:30.720
You know, all the literature on religion and stuff is,
link |
02:04:34.200
you know, the struggle between good and evil
link |
02:04:36.280
is balanced, very finely tuned,
link |
02:04:39.360
for reasons that are complicated.
link |
02:04:41.640
But that's a long philosophical conversation.
link |
02:04:44.840
Speaking of balance that's complicated,
link |
02:04:46.680
I wonder because we're living through one
link |
02:04:48.640
of the more important moments in human history
link |
02:04:51.600
with this particular virus,
link |
02:04:53.760
it seems like pandemics have at least the ability
link |
02:04:56.960
to kill off most of the human population at their worst.
link |
02:05:03.040
And it's just fascinating
link |
02:05:04.240
because there's so many viruses in this world.
link |
02:05:06.120
There's so many, I mean viruses basically run the world
link |
02:05:08.560
in the sense that they've been around for a very long time.
link |
02:05:12.240
They're everywhere.
link |
02:05:13.640
They seem to be extremely powerful
link |
02:05:15.320
in a distributed kind of way,
link |
02:05:17.240
but at the same time they're not intelligent
link |
02:05:19.560
and they're not even living.
link |
02:05:21.240
Do you have like high level thoughts about this virus
link |
02:05:23.800
like in terms of you being fascinated
link |
02:05:27.280
or terrified or somewhere in between?
link |
02:05:30.360
So I believe in frameworks, right?
link |
02:05:32.480
So like one of them is evolution.
link |
02:05:36.240
Like we're evolved creatures, right?
link |
02:05:37.840
Yes.
link |
02:05:38.920
And one of the things about evolution
link |
02:05:40.840
is it's hyper competitive.
link |
02:05:42.720
And it's not competitive out of a sense of evil.
link |
02:05:44.840
It's competitive in a sense of there's endless variation
link |
02:05:47.760
and variations that work better win.
link |
02:05:50.320
And then over time, there's so many levels
link |
02:05:52.920
of that competition, like multicellular life
link |
02:05:56.760
partly exists because of the competition
link |
02:06:01.120
between different kinds of life forms.
link |
02:06:04.200
And we know sex partly exists to scramble our genes
link |
02:06:06.840
so that we have genetic variation
link |
02:06:09.880
against the invasion of the bacteria and the viruses.
link |
02:06:14.200
And it's endless.
link |
02:06:16.040
Like I read some funny statistic,
link |
02:06:18.000
like the density of viruses and bacteria in the ocean
link |
02:06:20.760
is really high.
link |
02:06:22.040
And one third of the bacteria die every day
link |
02:06:23.880
because viruses are invading them.
link |
02:06:26.200
Like one third of them.
link |
02:06:27.960
Wow.
link |
02:06:29.040
Like I don't know if that number is true,
link |
02:06:31.000
but it was like there's like the amount of competition
link |
02:06:34.880
and what's going on is stunning.
link |
02:06:37.320
And there's a theory as we age,
link |
02:06:38.600
we slowly accumulate bacteria and viruses
link |
02:06:41.720
and as our immune system kind of goes down,
link |
02:06:45.520
that's what slowly kills us.
link |
02:06:47.680
It just feels so peaceful from a human perspective
link |
02:06:50.160
when we sit back and are able to have a relaxed
link |
02:06:52.200
conversation and there's wars going on out there.
link |
02:06:56.720
Like right now, you're harboring how many bacteria?
link |
02:07:00.840
And many of them are parasites on you.
link |
02:07:04.800
And some of them are helpful.
link |
02:07:06.000
And some of them are modifying your behavior.
link |
02:07:07.720
And some of them are, you know, it's just really wild.
link |
02:07:12.160
But you know, this particular manifestation is unusual.
link |
02:07:16.160
You know, in the demographics of how it hit
link |
02:07:18.360
and the political response that it engendered
link |
02:07:21.280
and the healthcare response it engendered
link |
02:07:23.760
and the technology it engendered, it's kind of wild.
link |
02:07:27.040
Yeah, the communication on Twitter
link |
02:07:28.520
that it led to all that kind of stuff,
link |
02:07:31.160
at every single level, yeah.
link |
02:07:32.920
But what usually kills is life.
link |
02:07:34.520
The big extinctions are caused by meteors and volcanoes.
link |
02:07:39.360
That's the one you're worried about,
link |
02:07:40.720
as opposed to human created bombs that we launch.
link |
02:07:44.400
Solar flares are another good one.
link |
02:07:46.040
You know, occasionally solar flares hit the planet.
link |
02:07:48.520
So it's nature.
link |
02:07:51.080
Yeah, it's all pretty wild.
link |
02:07:53.480
On another historic moment, this is perhaps outside,
link |
02:07:57.440
but perhaps within your space of frameworks
link |
02:08:02.440
that you think about, that just happened,
link |
02:08:04.600
I guess a couple of weeks ago is,
link |
02:08:06.680
I don't know if you're paying attention at all,
link |
02:08:08.040
it's the GameStop and WallStreetBets thing.
link |
02:08:12.440
It's a lot of fun.
link |
02:08:14.160
So it's really fascinating.
link |
02:08:16.600
There's kind of a theme to this conversation today
link |
02:08:19.200
because it's like neural networks,
link |
02:08:22.040
it's cool how a large number of people
link |
02:08:25.040
in a distributed way, almost having a kind of fun,
link |
02:08:30.040
were able to take on the powerful elite hedge funds,
link |
02:08:35.840
centralized powers and overpower them.
link |
02:08:40.000
Do you have thoughts on this whole saga?
link |
02:08:43.360
I don't know enough about finance,
link |
02:08:45.000
but it was like the Elon, you know,
link |
02:08:47.800
Robinhood guy when they talked.
link |
02:08:49.280
Yeah, what'd you think about that?
link |
02:08:51.560
Well, the Robinhood guy didn't know
link |
02:08:52.680
how the finance system worked.
link |
02:08:54.280
That was clear, right?
link |
02:08:55.560
He was treating like the people who settled
link |
02:08:57.760
the transactions as a black box.
link |
02:09:00.000
And suddenly somebody called him up
link |
02:09:01.360
and said, hey, black box calling you,
link |
02:09:03.600
your transaction volume means you need
link |
02:09:05.240
to put out $3 billion right now.
link |
02:09:06.920
And he's like, I don't have $3 billion.
link |
02:09:08.960
Like I don't even make any money on these trades.
link |
02:09:10.520
Why do I need $3 billion while you're sponsoring a trade?
link |
02:09:13.200
So there was a set of abstractions that,
link |
02:09:16.960
I don't think either side fully understood. Like now we understand it.
link |
02:09:19.520
Like this happens in chip design.
link |
02:09:21.120
Like you buy wafers from TSMC or Samsung or Intel.
link |
02:09:25.640
And they say it works like this
link |
02:09:27.440
and you do your design based on that.
link |
02:09:29.000
And then the chip comes back and it doesn't work.
link |
02:09:31.280
And then suddenly you started having to open the black boxes.
link |
02:09:34.280
Do the transistors really work like they said?
link |
02:09:36.400
What's the real issue?
link |
02:09:38.080
So there's a whole set of things
link |
02:09:43.240
that created this opportunity and somebody spotted it.
link |
02:09:46.240
Now, people spot these kinds of opportunities all the time.
link |
02:09:49.880
So there's been flash crashes, there's been,
link |
02:09:52.720
there's always short squeezes that are fairly regular.
link |
02:09:55.360
Every CEO I know hates the shorts
link |
02:09:58.480
because they're manipulating,
link |
02:10:00.320
they're trying to manipulate their stock
link |
02:10:01.840
in a way that they make money and deprive value
link |
02:10:05.920
from both the company and the investors.
link |
02:10:08.880
So the fact that some of these stocks were so heavily shorted,
link |
02:10:13.680
it's hilarious that this hasn't happened before.
link |
02:10:17.320
I don't know why.
link |
02:10:18.160
And I don't actually know why some serious hedge funds
link |
02:10:21.120
didn't do it to other hedge funds.
link |
02:10:23.440
And some of the hedge funds actually made a lot of money
link |
02:10:25.360
on this.
link |
02:10:26.200
Yes, so my guess is we know 5% of what really happened
link |
02:10:32.160
and a lot of the players don't know what happened.
link |
02:10:34.440
And the people who probably made the most money
link |
02:10:37.440
aren't the people that they're talking about.
link |
02:10:39.560
Yeah, that's...
link |
02:10:41.120
Do you think there was something...
link |
02:10:42.720
I mean, this is the cool thing about Elon.
link |
02:10:47.960
You're the same kind of conversationalist,
link |
02:10:50.720
which is like first principles,
link |
02:10:52.360
questions of like what the hell happened.
link |
02:10:56.280
Just very basic questions of like,
link |
02:10:57.920
was there something shady going on?
link |
02:11:00.840
What, you know, who are the parties involved?
link |
02:11:03.680
It's the basic questions that everybody wants to know about.
link |
02:11:06.320
Yeah, so like we're in a very
link |
02:11:08.480
hyper competitive world, right?
link |
02:11:10.320
But transactions like buying and selling stock
link |
02:11:12.160
are a trust event.
link |
02:11:13.560
You know, I trust the company
link |
02:19:14.520
is representing itself properly.
link |
02:11:16.240
You know, I bought the stock
link |
02:11:17.800
because I think it's gonna go up.
link |
02:11:19.680
I trust that the regulations are solid.
link |
02:11:22.680
Now, inside of that, there's all kinds of places
link |
02:11:26.120
where, you know, humans over trust.
link |
02:11:28.600
And, you know, this, this exposed,
link |
02:11:31.520
let's say some weak points in the system.
link |
02:11:34.600
I don't know if it's gonna get corrected.
link |
02:11:37.320
I don't know if we have close to the real story.
link |
02:11:41.760
You know, my suspicion is we don't.
link |
02:11:44.480
And listening to that guy, he was like a little wide eyed
link |
02:11:47.280
about, and then he did this and then he did that.
link |
02:11:49.080
And I was like, I think you should know more
link |
02:11:51.800
about your business than that.
link |
02:11:54.160
But again, there's many businesses where,
link |
02:11:56.480
like this layer is really stable.
link |
02:11:58.720
You stop paying attention to it.
link |
02:12:00.640
You pay attention to the stuff that's bugging you or new.
link |
02:12:04.320
You don't pay attention to the stuff
link |
02:12:05.760
that just seems to work all the time.
link |
02:12:07.040
You just, you know, the sky's blue every day in California.
link |
02:12:11.040
And every once in a while, you know, it rains there.
link |
02:12:12.840
It was like, what do we do?
link |
02:12:15.240
Somebody go bring in the lawn furniture.
link |
02:12:17.240
You know, like it's getting wet.
link |
02:12:18.680
You don't know why it's getting wet.
link |
02:12:19.960
Yeah, it doesn't.
link |
02:12:20.800
It was blue for like 100 days and now it's, you know, so.
link |
02:12:24.560
But part of the problem here with Vlad,
link |
02:12:27.000
the CEO of Robinhood, is the scaling
link |
02:12:29.520
is that what we've been talking about
link |
02:12:30.880
is there's a lot of unexpected things
link |
02:12:34.720
that happen with the scaling.
link |
02:12:36.040
And you have to be, I think the scaling forces you
link |
02:12:39.680
to then return to the fundamentals.
link |
02:12:41.920
Well, it's interesting because when you buy
link |
02:12:43.640
and sell stocks, the scaling is, you know,
link |
02:12:45.600
the stocks only move in a certain range.
link |
02:12:47.280
And if you buy a stock, you can only lose that amount of money.
link |
02:12:50.000
On the short market, you can lose a lot more
link |
02:12:52.400
than you can benefit.
link |
02:12:53.800
Like it has a, it has a weird, you know,
link |
02:12:56.440
cost function or whatever the right word for that is.
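A tiny sketch of the asymmetry Jim is pointing at here, a minimal illustration of my own rather than anything from the conversation (the $20 entry price and the run-up prices are made up): buying a stock caps your loss at what you paid, while shorting caps your gain and leaves your loss unbounded as the price climbs.

```python
def long_pnl(entry: float, price: float, shares: int = 1) -> float:
    # Buy at `entry`; the worst case is the stock going to 0,
    # losing entry * shares and no more.
    return (price - entry) * shares

def short_pnl(entry: float, price: float, shares: int = 1) -> float:
    # Sell at `entry`, buy back at `price`; the gain is capped at
    # entry * shares, but the loss grows without bound as `price` rises.
    return (entry - price) * shares

# A GameStop-style run-up against a hypothetical $20 entry.
for price in [0, 20, 100, 400]:
    print(f"price={price:>3}  long={long_pnl(20, price):>6.0f}  short={short_pnl(20, price):>6.0f}")
```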
link |
02:12:59.240
So he was trading in a market
link |
02:13:01.080
where he wasn't actually capitalized for the downside,
link |
02:13:04.160
if it got outside a certain range.
link |
02:13:07.320
Now, whether something nefarious has happened,
link |
02:13:09.720
I have no idea, but at some point the financial risk,
link |
02:13:14.720
to both him and his customers was way outside
link |
02:13:16.920
of his financial capacity.
link |
02:13:18.640
And his understanding of how the system worked was clearly weak
link |
02:13:22.840
or he didn't represent himself well.
link |
02:13:24.640
I don't know the person.
link |
02:13:26.360
When I listened to him,
link |
02:13:28.240
it could have been the surprise. The question was,
link |
02:13:29.840
like, how many of these guys called him?
link |
02:13:31.920
You know, it sounded like he was treating stuff
link |
02:13:34.280
as a black box, maybe he shouldn't have,
link |
02:13:37.080
but maybe he has a whole pile of experts
link |
02:13:38.360
somewhere else who knew what was going on.
link |
02:13:39.600
I don't, I don't know.
link |
02:13:40.720
Yeah, I mean, this is, this is one of the qualities
link |
02:13:45.200
of a good leader is under fire, you have to perform.
link |
02:13:49.080
And that means to think clearly and to speak clearly.
link |
02:13:53.040
And he dropped the ball on those things
link |
02:13:55.280
and on understanding the problem quickly,
link |
02:13:58.040
learning and understanding the problem at, like,
link |
02:14:00.080
this basic level of, like, what the hell happened.
link |
02:14:05.080
And my guess is, you know, at some level it was amateurs
link |
02:14:09.360
trading against, you know, experts slash insiders
link |
02:14:12.320
slash people with, you know, special information.
link |
02:14:14.920
Outsiders versus insiders.
link |
02:14:16.880
Yeah. And the insiders, you know,
link |
02:14:19.320
my guess is the next time this happens,
link |
02:14:21.160
they'll make money on it.
link |
02:14:23.000
The insiders always win.
link |
02:14:25.080
Well, they have more tools and more incentive.
link |
02:14:27.160
I mean, this always happens.
link |
02:14:28.480
Like the outsiders are doing this for fun.
link |
02:14:30.800
The insiders are doing this 24/7.
link |
02:14:33.320
But there's numbers on the outsiders' side.
link |
02:14:35.720
This is the interesting thing.
link |
02:14:36.800
Well, there's numbers on the insiders too.
link |
02:14:40.360
Like, that's a different kind of numbers.
link |
02:14:44.040
But this could be a new era because, I don't know,
link |
02:14:46.080
at least I didn't expect that a bunch of Redditors could,
link |
02:14:49.120
you know, there's, you know, millions of people
link |
02:14:51.280
can get together. The next one will be a surprise.
link |
02:14:54.200
But don't you think the crowd,
link |
02:14:56.480
the people are planning the next attack?
link |
02:14:59.240
We'll see.
link |
02:15:00.480
But it has to be a surprise.
link |
02:15:01.440
Can't be the same game.
link |
02:15:02.600
And so the insiders, like it could be,
link |
02:15:06.520
there's a very large number of games to play
link |
02:15:08.840
and they can be agile about it.
link |
02:15:10.520
I don't know, I'm not an expert.
link |
02:15:12.160
Right. That's a good question.
link |
02:15:13.720
The space of games, how restricted is it?
link |
02:15:18.000
Yeah. And the system is so complicated,
link |
02:15:20.200
it could be relatively unrestricted.
link |
02:15:22.720
And also, like, you know,
link |
02:15:24.200
during the last couple of financial crashes,
link |
02:15:26.640
you know, what set it off was, you know,
link |
02:15:28.800
sets of derivative events where, you know,
link |
02:15:31.320
Nassim Taleb's, you know, saying is,
link |
02:15:34.560
they're trying to lower volatility in the short run
link |
02:15:39.400
by creating tail events.
link |
02:15:41.600
And systems always evolve towards that.
link |
02:15:43.680
And then they always crash.
link |
02:15:45.560
Like the S curve is, you know,
link |
02:15:47.840
start low, ramp, plateau, crash.
link |
02:15:51.560
It's 100% effective.
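As a side note, the "start low, ramp, plateau" shape Jim names is the classic logistic curve; here is a minimal sketch of my own (the parameters are arbitrary, and the final "crash" is exactly the part the smooth curve does not model):

```python
import math

def logistic(t: float, ceiling: float = 1.0, rate: float = 1.0, midpoint: float = 0.0) -> float:
    # Slow start, steep ramp around `midpoint`, plateau near `ceiling`.
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

for t in range(-6, 7, 2):
    print(f"t={t:>3}  value={logistic(t):.3f}")
```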
link |
02:15:54.520
In the long run. Let me ask you for some advice.
link |
02:15:58.120
Put on your profound hat.
link |
02:15:59.760
Mm hmm.
link |
02:16:01.640
There's a bunch of young folks who listen to this thing
link |
02:16:04.640
for no good reason whatsoever.
link |
02:16:07.480
Undergraduate students, maybe high school students,
link |
02:16:10.600
maybe just young folks, young at heart,
link |
02:16:13.040
looking for the next steps to taking life.
link |
02:16:16.840
What advice would you give to a young person today
link |
02:16:19.320
about life, maybe career, but also life in general?
link |
02:16:23.840
Get good at some stuff.
link |
02:16:26.080
Well, get to know yourself, right?
link |
02:16:28.200
To get good at something that you're actually interested in.
link |
02:16:30.640
You have to love what you're doing to get good at it.
link |
02:16:33.480
You really got to find that.
link |
02:16:34.440
Don't waste all your time doing stuff
link |
02:16:35.800
that's just boring or bland or numbing, right?
link |
02:16:40.160
Don't let old people screw you.
link |
02:16:44.720
Well, people get talked into doing all kinds of shit
link |
02:16:46.760
and racking up huge student, you know, student debts.
link |
02:16:49.320
And like, there's so much crap going on, you know?
link |
02:16:52.600
And it drains your time and drains your energy.
link |
02:16:54.760
Yeah, there is quite a, you know, thesis that,
link |
02:16:56.840
you know, the older generation won't let go.
link |
02:16:59.560
They're trapping all the young people.
link |
02:17:01.200
I think there's some truth to that.
link |
02:17:02.480
Yeah, sure.
link |
02:17:05.000
Just because you're old doesn't mean you stop thinking.
link |
02:17:06.960
I know lots of really original old people.
link |
02:17:10.400
I'm an old person.
link |
02:17:11.960
So, but you have to be conscious about it.
link |
02:17:15.680
You can fall into the ruts and then do that.
link |
02:17:19.000
I mean, when I hear young people spouting opinions,
link |
02:17:22.080
it sounds like they come from Fox News or CNN.
link |
02:17:24.400
I think they've been captured by groupthink, memes,
link |
02:17:27.280
and stuff.
link |
02:17:28.120
As opposed to thinking on their own.
link |
02:17:29.520
You know, so if you find yourself repeating
link |
02:17:31.440
what everybody else is saying,
link |
02:17:33.440
you're not gonna have a good life.
link |
02:17:35.800
Like, that's not how the world works.
link |
02:17:37.720
It may be, it seems safe,
link |
02:17:39.320
but it puts you at great jeopardy for
link |
02:17:42.680
well, being boring or unhappy.
link |
02:17:45.960
How long did it take you to find the thing
link |
02:17:47.800
that you have fun with?
link |
02:17:50.640
I don't know.
link |
02:17:52.160
I've been a fun person since I was pretty little.
link |
02:17:54.320
So, everything.
link |
02:17:55.160
I've gone through a couple of periods
link |
02:17:56.120
of depression in my life.
link |
02:17:58.080
For good reason or for a reason
link |
02:18:00.160
that doesn't make any sense.
link |
02:18:02.600
Yeah.
link |
02:18:04.480
Like, some things are hard.
link |
02:18:05.960
Like, you go through mental transitions in high school.
link |
02:18:08.880
I was really depressed for a year.
link |
02:18:10.680
And I think I had my first midlife crisis at 26.
link |
02:18:15.120
I kind of thought, is this all there is?
link |
02:18:16.600
Like, I was working at a job that I loved
link |
02:18:20.000
but I was going to work and all my time was consumed.
link |
02:18:23.360
What's the escape out of that depression?
link |
02:18:25.760
What's the answer to is, is this all there is?
link |
02:18:29.200
Well, a friend of mine, I asked him
link |
02:18:31.720
because he was working his ass off.
link |
02:18:32.840
I said, what's your work life balance?
link |
02:18:34.520
Like, there's work, friends, family, personal time.
link |
02:18:40.280
Are you balanced in that?
link |
02:18:41.360
And he said, work 80%, family 20%.
link |
02:18:43.560
And I tried to find some time to sleep.
link |
02:18:47.520
Like, there's no personal time.
link |
02:18:49.160
There's no passionate time.
link |
02:18:51.800
Like, you know, the young people are often passionate
link |
02:18:53.760
about work.
link |
02:18:54.600
So I was sort of like that.
link |
02:18:56.920
But you need to have some space in your life
link |
02:18:59.920
for different things.
link |
02:19:01.800
And that's, that creates, that makes you resistant
link |
02:19:05.840
to the whole, the deep dips into depression kind of thing.
link |
02:19:11.200
Yeah. Well, you have to get to know yourself too.
link |
02:19:13.040
Meditation helps.
link |
02:19:14.440
Some physical, something physically intense helps.
link |
02:19:18.480
Like the weird places your mind goes kind of thing.
link |
02:19:21.920
Like, and why does it happen?
link |
02:19:23.760
Why do you do what you do?
link |
02:19:24.800
Like triggers, like the things that cause your mind
link |
02:19:27.640
to go to different places kind of thing.
link |
02:19:29.440
Or like events, like.
link |
02:19:32.200
Your upbringing, for better or worse,
link |
02:19:33.680
whether your parents are great people or not,
link |
02:19:35.640
you come into adulthood with all kinds of emotional burdens.
link |
02:19:42.320
Yeah.
link |
02:19:43.160
And you can see some people are so bloody stiff
link |
02:19:45.040
and restrained and they think, you know,
link |
02:19:46.800
the world's fundamentally negative.
link |
02:19:49.040
Like you maybe, you have unexplored territory.
link |
02:19:53.000
Yeah.
link |
02:19:53.960
Or you're afraid of something.
link |
02:19:56.280
Definitely afraid of quite a few things.
link |
02:19:58.720
Then you got to go face them.
link |
02:20:00.280
Like what's the worst thing that can happen?
link |
02:20:03.480
You're going to die, right?
link |
02:20:05.160
Like that's inevitable.
link |
02:20:06.360
You might as well get over that, like a hundred percent.
link |
02:20:08.280
That's right.
link |
02:20:09.800
Like people are worried about the virus,
link |
02:20:11.120
but you know, the human condition is pretty deadly.
link |
02:20:14.520
There's something about embarrassment.
link |
02:20:16.360
That's, I've competed a lot in my life.
link |
02:20:18.200
And I think, if I'm introspective,
link |
02:20:21.960
the thing I'm most afraid of is being like humiliated.
link |
02:20:26.120
I think nobody cares about that.
link |
02:20:28.040
Like you're the only person on the planet
link |
02:20:30.080
that cares about you being humiliated.
link |
02:20:32.320
So it's a really useless thought.
link |
02:20:34.720
It is.
link |
02:20:35.560
It's like, you feel all humiliated.
link |
02:20:39.520
Something happened in a room full of people
link |
02:20:41.080
and they walk out and they didn't think about it
link |
02:20:42.640
one more second.
link |
02:20:43.760
Or maybe somebody told a funny story
link |
02:20:45.360
to somebody else and then it dissipated throughout.
link |
02:20:47.520
Yeah.
link |
02:20:48.600
Yeah.
link |
02:20:49.440
No, I know it too.
link |
02:20:50.280
I mean, I've been really embarrassed about shit
link |
02:20:53.400
that nobody cared about but myself.
link |
02:20:55.520
Yeah.
link |
02:20:56.360
It's a funny thing.
link |
02:20:57.200
So the worst thing ultimately is just, yeah.
link |
02:20:59.920
But that's a cage and then you have to get out of it.
link |
02:21:02.600
Like once you, here's the thing.
link |
02:21:03.880
Once you find something like that,
link |
02:21:05.720
you have to be determined to break it.
link |
02:21:09.040
Cause otherwise you'll just, you know,
link |
02:21:10.240
accumulate that kind of junk
link |
02:21:11.760
and then you die as a, you know, a mess.
link |
02:21:15.440
So the goal, I guess it's like a cage within a cage.
link |
02:21:18.440
I guess the goal is to die in the biggest possible cage.
link |
02:21:21.960
Well, ideally you'd have no cage.
link |
02:21:25.120
You know, people do get enlightened.
link |
02:21:26.480
I've met a few.
link |
02:21:27.440
It's great.
link |
02:21:28.520
You found a few?
link |
02:21:29.360
There's a few out there?
link |
02:21:30.480
I don't know.
link |
02:21:31.320
Of course there are.
link |
02:21:33.360
Either that or they have, you know,
link |
02:21:34.560
it's a great sales pitch.
link |
02:21:35.520
There's like enlightened people,
link |
02:21:36.480
who write books and do all kinds of stuff.
link |
02:21:38.280
It's a good way to sell a book.
link |
02:21:39.520
I'll give you that.
link |
02:21:40.840
You've never met somebody you just thought,
link |
02:21:42.880
they just kill me.
link |
02:21:43.840
Like they just, like mental clarity, humor.
link |
02:21:47.880
No, 100%, but I just feel like they're living
link |
02:21:50.000
in a bigger cage.
link |
02:21:50.960
They have their own.
link |
02:21:52.000
You still think there's a cage?
link |
02:21:53.320
There's still a cage.
link |
02:21:54.360
You secretly suspect there's always a cage.
link |
02:21:57.560
There's no, there's nothing outside the universe.
link |
02:21:59.920
There's nothing outside the cage.
link |
02:22:01.520
You've worked at a bunch of companies.
link |
02:22:10.200
You led a lot of amazing teams.
link |
02:22:14.080
I don't, I'm not sure if you've ever been like
link |
02:22:16.880
at the early stages of a startup,
link |
02:22:19.480
but do you have advice for somebody that wants to
link |
02:22:25.880
do a startup or build a company,
link |
02:22:28.360
like build a strong team of engineers that are passionate
link |
02:22:31.240
and just want to solve a big problem.
link |
02:22:35.040
Like, is there more specific advice on that point?
link |
02:22:39.360
You have to be really good at stuff.
link |
02:22:41.400
If you're going to lead and build a team,
link |
02:22:43.040
you better be really interested in how people work and think.
link |
02:22:47.000
The people, or the solution to the problem?
link |
02:22:49.080
So there's two things, right?
link |
02:22:50.200
One is how people work and the other is the idea.
link |
02:22:53.040
Actually, there's quite a few successful startups
link |
02:22:55.680
where it's really clear
link |
02:22:56.520
the founders don't know anything about people.
link |
02:22:58.400
Like the idea was so powerful that it propelled them.
link |
02:23:01.480
But I suspect somewhere early,
link |
02:23:03.760
they hired some people who understood people
link |
02:23:06.960
because people really need a lot of care
link |
02:23:08.480
and feeding to collaborate and work together
link |
02:23:10.440
and feel engaged and work hard.
link |
02:23:13.800
Like startups are all about outproducing other people.
link |
02:23:17.000
Like you're nimble because you don't have any legacy.
link |
02:23:19.800
You don't have a bunch of people who are depressed
link |
02:23:22.880
about life just showing up.
link |
02:23:25.760
So startups have a lot of advantages that way.
link |
02:23:28.120
Do you like the, Steve Jobs talked about this idea of A players
link |
02:23:33.000
and B players?
link |
02:23:33.840
I don't know if you know this formulation.
link |
02:23:36.120
Yeah, no.
link |
02:23:38.320
Organizations that get taken over by B player leaders
link |
02:23:43.520
often really underperform. They hire C players.
link |
02:23:46.960
That said, in big organizations,
link |
02:23:49.240
there's so much work to do.
link |
02:23:51.200
And there's so many people who are happy to do what,
link |
02:23:53.720
like the leadership or the big idea people
link |
02:23:56.240
would consider menial jobs.
link |
02:23:58.640
And you need a place for them,
link |
02:24:00.480
but you need an organization that both values and rewards them,
link |
02:24:04.440
but doesn't let them take over the leadership of it.
link |
02:24:07.040
Got it.
link |
02:24:07.880
So you need to have an organization that's resistant to that.
link |
02:24:10.560
But in the early days,
link |
02:24:12.800
the notion with Steve was that one B player in a room
link |
02:24:18.440
of A players will be destructive to the whole.
link |
02:24:21.600
I've seen that happen.
link |
02:24:22.960
I don't know if it's always true,
link |
02:24:25.080
like you run into people who are clearly B players,
link |
02:24:27.840
but they think they're A players.
link |
02:24:28.920
And so they have a loud voice at the table
link |
02:24:30.560
and they make lots of demands for that.
link |
02:24:32.680
But there's other people who are like, I know what I am.
link |
02:24:34.960
I just want to work with cool people on cool shit
link |
02:24:37.160
and just tell me what to do and I'll go get it done.
link |
02:24:39.560
So you have to, again, this is like people skills,
link |
02:24:42.440
like what kind of person is it?
link |
02:24:45.080
I've met some really great people I love working with.
link |
02:24:48.600
That weren't the biggest idea people,
link |
02:24:50.280
the most productive ever, but they show up,
link |
02:24:52.280
they get it done.
link |
02:24:53.120
You know, they create connection and community
link |
02:24:55.680
that people value.
link |
02:24:57.000
It's pretty diverse.
link |
02:24:58.880
I don't think there's a recipe for that.
link |
02:25:01.880
I got to ask you about love.
link |
02:25:03.720
I heard you're into this now.
link |
02:25:05.520
Into this love thing?
link |
02:25:06.360
Yeah.
link |
02:25:07.200
Do you think this is your solution to your depression?
link |
02:25:10.120
No, I'm just trying to, like you said,
link |
02:25:11.840
to delight in people. On occasion, trying to sell a book.
link |
02:25:13.880
I'm writing a book about love.
link |
02:25:14.960
You're writing a book about love.
link |
02:25:15.800
No, I'm not.
link |
02:25:16.640
I'm not.
link |
02:25:23.280
A friend of mine.
link |
02:25:24.640
He's gonna.
link |
02:25:25.760
Somebody said, you should really write a book
link |
02:25:27.240
about your management philosophy.
link |
02:25:29.120
He said, it'd be a short book.
link |
02:25:35.000
Well, that one went over pretty well.
link |
02:25:37.760
What role do you think love, family, friendship,
link |
02:25:40.440
all that kind of human stuff play in a successful life?
link |
02:25:44.400
You've been exceptionally successful in the space
link |
02:25:46.360
of like running teams, building cool shit in this world,
link |
02:25:51.160
creating some amazing things.
link |
02:25:53.160
What, did love get in the way?
link |
02:25:54.720
Did love help? Did family get in the way?
link |
02:25:57.680
Did family help? Friendship?
link |
02:25:59.720
You want the engineer's answer?
link |
02:26:02.120
Please.
link |
02:26:02.960
So, but first, love is functional, right?
link |
02:26:05.800
It's functional in what way?
link |
02:26:07.280
So we habituate ourselves to the environment.
link |
02:26:10.960
And actually Jordan told me,
link |
02:26:12.040
Jordan Peterson told me this line.
link |
02:26:13.920
So you go through life and you just get used to everything,
link |
02:26:16.440
except for the things you love.
link |
02:26:17.800
They remain new.
link |
02:26:20.120
Like this is really useful for, you know,
link |
02:26:22.440
like other people's children and dogs and trees.
link |
02:26:26.080
You just don't pay that much attention to them.
link |
02:26:27.720
Your own kids, you're monitoring them really closely.
link |
02:26:31.000
Like, and if they go off a little bit,
link |
02:26:32.720
because you love them, if you're smart,
link |
02:26:35.280
if you're gonna be a successful parent,
link |
02:26:37.480
you notice it right away.
link |
02:26:38.920
You don't habituate to the things you love.
link |
02:26:44.280
And if you wanna be successful at work,
link |
02:26:46.120
if you don't love it,
link |
02:26:47.560
you're not gonna put in the time that somebody else,
link |
02:26:50.400
somebody else that loves it, will.
link |
02:26:51.640
Like, cause it's new and interesting
link |
02:26:53.760
and that lets you go to the next level.
link |
02:26:57.560
So it's a thing, it's just a function
link |
02:26:59.120
that generates newness and novelty
link |
02:27:01.680
and surprises, you know, all those kinds of things.
link |
02:27:04.680
It's really interesting.
link |
02:27:05.800
Like, and people have figured out lots of, you know,
link |
02:27:08.600
frameworks for this, you know, like,
link |
02:27:10.440
like humans seem, in partnerships,
link |
02:27:12.440
to go through, you know, interest.
link |
02:27:13.880
Like somebody, suddenly somebody's interesting
link |
02:27:16.680
and then you're infatuated with them
link |
02:27:18.200
and then you're in love with them.
link |
02:27:20.080
And then you, you know, different people have ideas
link |
02:27:22.640
about parental love or mature love.
link |
02:27:24.520
Like you go through a cycle of that,
link |
02:27:26.600
which keeps us together and it's, you know,
link |
02:27:28.720
super functional for creating families
link |
02:27:30.600
and creating communities
link |
02:27:32.560
and making you support somebody
link |
02:27:34.560
despite the fact that you don't always love them.
link |
02:27:36.960
Like, and, and it can be really enriching.
link |
02:27:41.960
You know, no, no, in the work life balance scheme,
link |
02:27:45.040
if all you do is work,
link |
02:27:47.360
you think you may be optimizing your work potential,
link |
02:27:50.040
but if you don't love your work
link |
02:27:51.600
or you don't have family and friends
link |
02:27:54.680
and things you care about,
link |
02:27:56.560
your brain isn't well balanced.
link |
02:27:59.560
Like everybody knows the experience
link |
02:28:01.040
of working on something all week.
link |
02:28:02.360
You went home and took two days off and you came back in.
link |
02:28:05.360
The odds of you working on the thing,
link |
02:28:07.360
picking up right where you left off, are zero.
link |
02:28:09.840
Your brain refactored it, but being in love is great.
link |
02:28:16.240
It's like it changes the color of the light in the room.
link |
02:28:18.840
It creates a spaciousness that's, that's different.
link |
02:28:22.840
It helps you think, it makes you strong.
link |
02:28:26.840
Bukowski had this line about love being a fog
link |
02:28:29.840
that dissipates with the first light of reality
link |
02:28:32.840
in the morning.
link |
02:28:33.840
That's depressing.
link |
02:28:34.840
I think it's the other way around.
link |
02:28:36.840
It lasts, well, like you said, it's just a function.
link |
02:28:39.840
It's a thing that generates.
link |
02:28:40.840
It can be the light that actually enlivens your world
link |
02:28:43.840
and creates the interest and the power and the strengths
link |
02:28:46.840
to go do something.
link |
02:28:48.840
It's like, that sounds like, you know,
link |
02:28:51.840
there's like physical love, emotional love,
link |
02:28:53.840
intellectual love, spiritual love, right?
link |
02:28:55.840
Isn't it all the same thing?
link |
02:28:56.840
Nope.
link |
02:28:57.840
You should differentiate that.
link |
02:28:59.840
Maybe that's your problem.
link |
02:29:01.840
In your book, you should refine that a little bit.
link |
02:29:03.840
The different chapters?
link |
02:29:06.840
Yeah, there's different chapters.
link |
02:29:08.840
Aren't these just different
link |
02:29:10.840
layers of the same thing or the stack?
link |
02:29:12.840
Physical.
link |
02:29:13.840
People, people, some people are addicted to physical love
link |
02:29:16.840
and they have no idea about emotional or intellectual love.
link |
02:29:20.840
I don't know if they're the same things.
link |
02:29:22.840
I think they're different.
link |
02:29:23.840
That's true.
link |
02:29:24.840
They could be different.
link |
02:29:25.840
I guess the ultimate goal is for it to be the same.
link |
02:29:27.840
Well, if you want something to be bigger and interesting,
link |
02:29:29.840
you should find all its components and differentiate them,
link |
02:29:31.840
not clump it together.
link |
02:29:33.840
People do this all the time.
link |
02:29:35.840
Yeah.
link |
02:29:36.840
Modularity.
link |
02:29:37.840
Get your abstraction layers right and then you can,
link |
02:29:39.840
you have room to breathe.
link |
02:29:40.840
Well, maybe you can write the foreword to my book about love.
link |
02:29:43.840
Or the afterword.
link |
02:29:45.840
He really tried.
link |
02:29:48.840
I feel like Lex has made a lot of progress with this book.
link |
02:29:52.840
Well, you have things in your life that you love.
link |
02:29:55.840
Yeah.
link |
02:29:56.840
Yeah.
link |
02:29:57.840
And they are, you're right.
link |
02:29:58.840
They're modular.
link |
02:29:59.840
And you can have multiple things with the same person or the same thing.
link |
02:30:04.840
Yeah.
link |
02:30:05.840
But, yeah.
link |
02:30:07.840
Depending on the moment of the day.
link |
02:30:09.840
Yeah.
link |
02:30:10.840
Like what Bukowski described is that moment you go from being in love
link |
02:30:14.840
to having a different kind of love.
link |
02:30:16.840
Yeah.
link |
02:30:17.840
Right.
link |
02:30:18.840
And that's a transition.
link |
02:30:19.840
But when it happens, if you'd read the owner's manual and you believed it,
link |
02:30:22.840
you would have said, oh, this happened.
link |
02:30:24.840
It doesn't mean it's not love.
link |
02:30:25.840
It's a different kind of love.
link |
02:30:27.840
But, but maybe there's something better about that as you grow old.
link |
02:30:32.840
If all you do is regret how you used to be,
link |
02:30:36.840
It's sad.
link |
02:30:37.840
Right.
link |
02:30:38.840
You should have learned a lot of things because like who you can be in your future
link |
02:30:42.840
self is actually more interesting and possibly delightful than, you know,
link |
02:30:47.840
being a mad kid in love with the next person.
link |
02:30:51.840
Like that's super fun when it happens.
link |
02:30:54.840
That's, that's, you know, 5% of the possibility.
link |
02:30:59.840
Yeah.
link |
02:31:00.840
That's right.
link |
02:31:01.840
That there's a lot more fun to be had in the long lasting stuff.
link |
02:31:04.840
Yeah.
link |
02:31:05.840
Or meaning, you know, if that's your thing.
link |
02:31:07.840
Meaning, which is a kind of fun.
link |
02:31:08.840
It's a deeper kind of fun.
link |
02:31:10.840
And it's surprising, you know, that's like, like the thing I like is surprises,
link |
02:31:14.840
you know, and you just never know what's going to happen.
link |
02:31:18.840
But you have to look carefully and you have to work at it.
link |
02:31:21.840
You have to think about it.
link |
02:31:23.840
Yeah.
link |
02:31:24.840
You have to see the surprises when they happen, right?
link |
02:31:26.840
You have to be looking for it from the branching perspective.
link |
02:31:29.840
You mentioned regrets.
link |
02:31:32.840
Do you have regrets about your own trajectory?
link |
02:31:35.840
Oh yeah.
link |
02:31:36.840
Of course.
link |
02:31:37.840
Yeah.
link |
02:31:38.840
Some of it's painful, but you want to hear the painful stuff.
link |
02:31:42.840
I'd say like in terms of working with people, when people did say stuff I didn't like,
link |
02:31:48.840
especially if it was a bit nefarious, I took it personally.
link |
02:31:51.840
I also felt it was personal about them.
link |
02:31:55.840
But a lot of times, like humans are, you know, most humans are a mess, right?
link |
02:31:59.840
And then they act out and they do stuff.
link |
02:32:01.840
And this psychologist I heard a long time ago said, you tend to think somebody does something to you.
link |
02:32:08.840
But really what they're doing is they're doing what they're doing while they're in front of you.
link |
02:32:12.840
It's not that much about you.
link |
02:32:14.840
Yeah.
link |
02:32:15.840
Right.
link |
02:32:16.840
And as I got more interested in, you know, when I work with people,
link |
02:32:20.840
I think about them and probably analyze them and understand them a little bit.
link |
02:32:25.840
And then when they do stuff, I'm way less surprised.
link |
02:32:28.840
And I'm way, you know, and if it's bad, I'm way less hurt.
link |
02:32:31.840
And I react way less.
link |
02:32:33.840
Like I sort of expect everybody's got their shit.
link |
02:32:36.840
Yeah.
link |
02:32:37.840
And it's not about you.
link |
02:32:38.840
It's not about me that much.
link |
02:32:40.840
It's like, you know, you do something and you think you're embarrassed, but nobody cares.
link |
02:32:44.840
Like if somebody's really mad at you,
link |
02:32:46.840
the odds of it being about you are low.
link |
02:32:48.840
Yeah.
link |
02:32:49.840
Because they're getting mad the way they do because of some pattern they learned.
link |
02:32:52.840
And, you know, and maybe you can help them if you care enough about it.
link |
02:32:56.840
Or you could see it coming and step out of the way.
link |
02:32:59.840
Like, I wish I was way better at that.
link |
02:33:02.840
I'm a bit of a hothead.
link |
02:33:04.840
You regret that?
link |
02:33:05.840
You said with Steve, that was a feature, not a bug.
link |
02:33:08.840
Yeah.
link |
02:33:09.840
Well, he was using it as the counter to the orderliness that would crush his work.
link |
02:33:13.840
Well, you were doing the same.
link |
02:33:14.840
Yeah.
link |
02:33:15.840
Maybe.
link |
02:33:16.840
I don't think my vision was big enough.
link |
02:33:18.840
It was more like I just got pissed off and did stuff.
link |
02:33:22.840
I'm sure that's the...
link |
02:33:24.840
Yeah.
link |
02:33:25.840
You're telling...
link |
02:33:26.840
I don't know if it had the...
link |
02:33:28.840
It didn't have the amazing effect of creating a trillion dollar company.
link |
02:33:31.840
It was more like I just got pissed off and left.
link |
02:33:34.840
And or made enemies that he shouldn't have.
link |
02:33:38.840
Yeah.
link |
02:33:39.840
It's hard.
link |
02:33:40.840
Like, I didn't really understand politics until I worked at Apple where, you know, Steve
link |
02:33:44.840
was a master player of politics and his staff had to be or they wouldn't survive them.
link |
02:33:47.840
And it was definitely part of the culture.
link |
02:33:50.840
And then I've been in companies where they say it's political, but it's all, you know,
link |
02:33:54.840
fun and games compared to Apple.
link |
02:33:56.840
And it's not that the people at Apple are bad people.
link |
02:33:59.840
It's just they operate politically at a higher level.
link |
02:34:03.840
You know, it's not like, oh, somebody said something bad about somebody to somebody else,
link |
02:34:08.840
which is most politics.
link |
02:34:10.840
It's, you know, they had strategies about accomplishing their goals.
link |
02:34:15.840
Sometimes, you know, over the dead bodies of their enemies, you know, with sophistication.
link |
02:34:21.840
Yeah, more Game of Thrones in sophistication and like a big time factor rather than a,
link |
02:34:27.840
you know.
link |
02:34:28.840
Well, that requires a lot of control over your emotions, I think, to have a bigger strategy
link |
02:34:34.840
in the way you behave.
link |
02:34:35.840
Yeah.
link |
02:34:36.840
And it's effective in the sense that coordinating thousands of people to do really hard things
link |
02:34:42.840
where many of the people in there don't understand themselves much less how they're participating
link |
02:34:47.840
creates all kinds of, you know, drama and problems whose, you know, solution is political
link |
02:34:54.840
in nature.
link |
02:34:55.840
Like, how do you convince people?
link |
02:34:56.840
How do you leverage them?
link |
02:34:57.840
How do you motivate them?
link |
02:34:58.840
How do you get rid of them?
link |
02:34:59.840
How, you know, like, there's, there's so many layers of that that are interesting.
link |
02:35:03.840
And even though some, some of it, let's say, may be tough, it's not evil.
link |
02:35:11.840
Unless, you know, you use that skill to evil purposes, which some people obviously do.
link |
02:35:16.840
But it's a skill set that operates.
link |
02:35:18.840
You know, and I wish I'd, you know, I was interested in it, but I, you know, it was sort of like,
link |
02:35:23.840
I'm an engineer, I do my thing.
link |
02:35:26.840
And, you know, there's, there's times when I could have had a way bigger impact if I,
link |
02:35:31.840
you know, knew how to, if I paid more attention and knew more about that.
link |
02:35:35.840
Yeah.
link |
02:35:36.840
About the human layer of the stack.
link |
02:35:38.840
Yeah, that, that human political power, you know, expression layer of the stack.
link |
02:35:42.840
It's just complicated.
link |
02:35:43.840
And there's lots to know about it.
link |
02:35:45.840
I mean, people who are good at it are just amazing.
link |
02:35:47.840
And when they're good at it and, let's say, relatively kind and oriented in a good direction,
link |
02:35:55.840
you can really feel, you can get lots of stuff done and coordinate things that you never
link |
02:36:00.840
thought possible.
link |
02:36:02.840
But all people like that also have some pretty hard edges because, you know, it's,
link |
02:36:07.840
it's a heavy lift.
link |
02:36:08.840
And I wish I'd spent more time with that when I was younger, but maybe I wasn't ready.
link |
02:36:13.840
You know, I was a wide eyed kid for 30 years.
link |
02:36:16.840
Still a bit of a kid.
link |
02:36:18.840
Yeah, I know.
link |
02:36:19.840
What do you hope your legacy is when there's a, when there's a book like the Hitchhiker's Guide
link |
02:36:26.840
to the Galaxy.
link |
02:36:27.840
And there's like a one sentence entry about Jim Keller, like, that guy lived at some
link |
02:36:32.840
point.
link |
02:36:33.840
There's not many, you know, not many people will be remembered.
link |
02:36:37.840
You're one of the sparkling little human creatures that had a big impact on the world.
link |
02:36:43.840
How do you hope you'll be remembered?
link |
02:36:45.840
My daughter was trying to get, she edited my Wikipedia page to say that I was a legend
link |
02:36:51.840
and a guru, but they took it out.
link |
02:36:54.840
So she put it back and she's 15.
link |
02:36:56.840
I think, I think that was probably the best part of my legacy.
link |
02:37:01.840
She got her sister and they were all excited.
link |
02:37:04.840
They were like trying to put it in the references because there's articles on that at the top.
link |
02:37:08.840
So in the eyes of your kids, you're a legend.
link |
02:37:11.840
Well, they're pretty skeptical because they know better than that.
link |
02:37:15.840
They're like, dad.
link |
02:37:17.840
So yeah, that's, that's super, that kind of stuff is super fun in terms of the big legend
link |
02:37:22.840
stuff.
link |
02:37:23.840
I don't care.
link |
02:37:24.840
I don't really care.
link |
02:37:26.840
You're just an engineer.
link |
02:37:28.840
I've been thinking about building a big pyramid.
link |
02:37:31.840
So I had a debate with a friend about whether pyramids or craters are cooler.
link |
02:37:36.840
And he realized that there's craters everywhere, but you know, they built a couple pyramids
link |
02:37:40.840
5,000 years ago.
link |
02:37:41.840
And they remember you for a while.
link |
02:37:42.840
We're still talking about it.
link |
02:37:44.840
I think that would be cool.
link |
02:37:46.840
Those aren't easy to build.
link |
02:37:48.840
Oh, I know.
link |
02:37:49.840
And they don't actually know how they built them, which is great.
link |
02:37:53.840
Either AGI or aliens could be involved.
link |
02:37:58.840
So I think, I think you're going to have to figure out quite a few more things than just
link |
02:38:03.840
the basics of civil engineering.
link |
02:38:06.840
So I guess you hope your legacy is pyramids.
link |
02:38:09.840
That would, that would be cool.
link |
02:38:11.840
And my Wikipedia page, you know, getting updated by my daughter periodically.
link |
02:38:15.840
Like those two things would pretty much make it.
link |
02:38:18.840
Jim, it's a huge honor talking to you again.
link |
02:38:20.840
I hope we talk many more times in the future.
link |
02:38:22.840
I can't wait to see what you do with Tenstorrent.
link |
02:38:25.840
I can't wait to use it.
link |
02:38:27.840
I can't wait for you to revolutionize yet another space in computing.
link |
02:38:32.840
It's a huge honor to talk to you.
link |
02:38:34.840
Thanks for talking today.
link |
02:38:35.840
This was fun.
link |
02:38:36.840
Thanks for listening to this conversation with Jim Keller.
link |
02:38:39.840
And thank you to our sponsors: Athletic Greens, all in one nutrition drink, Brooklinen
link |
02:38:44.840
sheets, ExpressVPN, and Belcampo grass fed meat.
link |
02:38:49.840
Click the sponsor links to get a discount and to support this podcast.
link |
02:38:53.840
And now let me leave you with some words from Alan Turing.
link |
02:38:57.840
Those who can imagine anything can create the impossible.
link |
02:39:02.840
Thank you for listening and hope to see you next time.