
George Hotz: Hacking the Simulation & Learning to Drive with Neural Nets | Lex Fridman Podcast #132



link |
00:00:00.000
The following is a conversation with George Hotz, a.k.a. Geohot, his second time on the
link |
00:00:05.440
podcast. He's the founder of Comma AI, an autonomous and semi-autonomous vehicle technology company
link |
00:00:12.800
that seeks to be to Tesla Autopilot what Android is to iOS. They sell the Comma
link |
00:00:20.400
Two device for $1,000 that, when installed in many of their supported cars, can keep the vehicle
link |
00:00:26.480
centered in the lane even when there are no lane markings. It includes driver sensing that ensures
link |
00:00:33.120
that the driver's eyes are on the road. As you may know, I'm a big fan of driver sensing. I do
link |
00:00:38.640
believe Tesla autopilot and others should definitely include it in their sensor suite. Also, I'm a fan
link |
00:00:44.640
of Android and a big fan of George, for many reasons, including his nonlinear out of the box
link |
00:00:50.640
brilliance and the fact that he's a superstar programmer of a very different style than myself.
link |
00:00:57.360
Styles make fights and styles make conversations. So I really enjoyed this chat. I'm sure we'll
link |
00:01:03.360
talk many more times on this podcast. Quick mention of a sponsor followed by some thoughts
link |
00:01:08.560
related to the episode. First is Four Sigmatic, the maker of delicious mushroom coffee. Second
link |
00:01:15.920
is Decoding Digital, a podcast on tech and entrepreneurship that I listen to and enjoy.
link |
00:01:22.000
And finally, ExpressVPN, the VPN I've used for many years to protect my privacy on the internet.
link |
00:01:29.200
Please check out the sponsors in the description to get a discount and to support this podcast.
link |
00:01:34.640
As a side note, let me say that my work at MIT on autonomous and semi autonomous vehicles
link |
00:01:40.400
led me to study the human side of autonomy enough to understand that it's a beautifully complicated
link |
00:01:46.480
and interesting problem space much richer than what can be studied in the lab. In that sense,
link |
00:01:52.320
the data that Comma AI, Tesla Autopilot, and perhaps others like Cadillac Super Cruise are collecting
link |
00:01:58.400
gives us a chance to understand how we can design safe semi autonomous vehicles for real human beings
link |
00:02:05.360
in real world conditions. I think this requires bold innovation and a serious exploration of the
link |
00:02:11.600
first principles of the driving task itself. If you enjoyed this thing, subscribe on YouTube,
link |
00:02:17.760
review it with five stars on Apple Podcast, follow on Spotify, support on Patreon, or connect with me
link |
00:02:23.360
on Twitter at Lex Fridman. And now, here's my conversation with George Hotz. So last time
link |
00:02:31.920
we started talking about the simulation, this time let me ask you, do you think there's intelligent
link |
00:02:36.640
life out there in the universe? I always maintained my answer to the Fermi paradox. I think there
link |
00:02:42.720
has been intelligent life elsewhere in the universe. So intelligent civilizations existed,
link |
00:02:47.760
but they've blown themselves up. So your general intuition is that intelligent
link |
00:02:52.080
civilizations quickly, like there's that parameter in the Drake equation, your sense is they don't
link |
00:02:58.560
last very long. How are we doing on that? Have we lasted pretty good? How do we do?
link |
00:03:05.360
Yeah. I mean, not quite yet. As Yudkowsky says, the IQ required to destroy the world
link |
00:03:13.360
falls by one point every year. Okay. So technology democratizes the destruction of the world.
link |
00:03:20.880
When can a meme destroy the world?
link |
00:03:22.240
It kind of is already, right?
link |
00:03:27.120
Somewhat. I don't think we've seen anywhere near the worst of it yet. World's going to get weird.
link |
00:03:33.760
Well, maybe a meme can save the world. You thought about that, the meme Lord Elon Musk fighting on
link |
00:03:39.360
the side of good versus the meme Lord of the darkness, which is not saying anything bad about
link |
00:03:46.800
Donald Trump, but he is the Lord of the meme on the dark side. He's a Darth Vader of memes.
link |
00:03:53.600
I think in every fairy tale, they always end it with, and they lived happily ever after. And
link |
00:03:59.840
I'm like, please tell me more about this happily ever after. I've heard 50% of marriages end in
link |
00:04:04.560
divorce. Why doesn't your marriage end up there? You can't just say happily ever after. So
link |
00:04:10.880
the thing about destruction is it's over after the destruction. We have to do everything right
link |
00:04:16.320
in order to avoid it, and only one thing wrong to cause it. Actually, this is what I really like about
link |
00:04:21.920
cryptography. Cryptography, it seems like we live in a world where the defense wins
link |
00:04:27.680
versus nuclear weapons. The opposite is true. It is much easier to build a warhead that splits
link |
00:04:33.200
into 100 little warheads than to build something that can take out 100 little warheads.
link |
00:04:38.720
The offense has the advantage there. So maybe our future is in crypto.
link |
00:04:43.040
So cryptography, right. The Goliath is the defense. And then all the different hackers
link |
00:04:51.440
are the Davids. And that equation is flipped for nuclear war. Because there's so many,
link |
00:04:58.640
like one nuclear weapon destroys everything, essentially.
link |
00:05:01.920
Yeah. And it is much easier to attack with a nuclear weapon than it is to like,
link |
00:05:07.680
the technology required to intercept and destroy a rocket is much more complicated than the
link |
00:05:12.080
technology required to just plot an orbital trajectory and send a rocket to somebody.
link |
00:05:17.280
So okay, your intuition that there were intelligent civilizations out there,
link |
00:05:22.800
but it's very possible that they're no longer there. It's kind of a sad picture.
link |
00:05:27.520
They enter some steady state. They all wirehead themselves.
link |
00:05:31.280
What's wirehead?
link |
00:05:33.840
Stimulate their pleasure centers and just live forever in this kind of stasis.
link |
00:05:38.880
Oh. Well, I mean, I think the reason I believe
link |
00:05:43.600
this is because where are they? If there's some reason they stopped expanding,
link |
00:05:50.400
because otherwise they would have taken over the universe. The universe isn't that big.
link |
00:05:53.200
Or at least, you know, let's just talk about the galaxy, right?
link |
00:05:55.840
About 70,000 light years across. I took that number from Star Trek Voyager. I don't know how
link |
00:06:00.080
true it is. But yeah, that's not big, right? 70,000 light years is nothing.
link |
00:06:06.320
For some possible technology that you can imagine that can leverage like wormholes or
link |
00:06:11.440
something like that. You don't even need wormholes. Just a von Neumann probe is enough.
link |
00:06:14.880
A von Neumann probe and a million years of sub light travel,
link |
00:06:18.240
and you'd have taken over the whole universe. That clearly didn't happen. So something stopped it.
link |
00:06:23.760
So you mean for like a few million years, if you sent out probes that travel close,
link |
00:06:29.680
what's sub light? You mean close to the speed of light?
link |
00:06:32.080
Let's say 0.1c.
link |
00:06:33.520
And it just spreads. Interesting. Actually, that's an interesting calculation.
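(As an illustrative aside, here is the rough arithmetic being gestured at, using the figures quoted in the conversation rather than precise astronomy.)

```python
# Back-of-the-envelope: crossing a ~70,000 light-year galaxy at 0.1c.
# Both numbers are the ones quoted in the conversation, not precise astronomy.
GALAXY_DIAMETER_LY = 70_000   # "about 70,000 light years across"
PROBE_SPEED_C = 0.1           # "let's say 0.1c"

crossing_time_years = GALAXY_DIAMETER_LY / PROBE_SPEED_C
print(f"~{crossing_time_years:,.0f} years")  # ~700,000 years, so sub-light travel
# on the order of a million years is enough to spread across the galaxy.
```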
link |
00:06:38.640
So what makes you think that we'd be able to communicate with them?
link |
00:06:45.280
Why do you think we would be able to comprehend
link |
00:06:48.880
intelligent life that's out there? Even if they were among us, this kind of thing,
link |
00:06:55.440
or even just flying around?
link |
00:06:56.880
Well, I mean, that's possible. It's possible that there is some sort of prime directive.
link |
00:07:04.480
That'd be a really cool universe to live in. And there's some reason they're not making
link |
00:07:08.480
themselves visible to us. But it makes sense that they would use the same,
link |
00:07:15.040
well, at least the same entropy.
link |
00:07:16.880
Well, you're implying the same laws of physics. I don't know what you mean by entropy in this case.
link |
00:07:20.560
Oh, yeah. I mean, if entropy is the scarce resource in the universe.
link |
00:07:23.760
So what do you think about like Steven Wolfram and everything is a computation?
link |
00:07:28.720
And then what if they are traveling through this world of computation?
link |
00:07:32.560
So if you think of the universe as just information processing,
link |
00:07:36.480
then what you're referring to with entropy, and then these pockets of interesting complex
link |
00:07:43.440
computation swimming around, how do we know they're not already here?
link |
00:07:46.720
How do we know that all the different amazing things that are full of mystery
link |
00:07:54.240
on Earth are just like little footprints of intelligence from light years away?
link |
00:08:01.520
Maybe. I mean, I tend to think that as civilizations expand, they use more and more energy.
link |
00:08:07.680
And you can never overcome the problem of waste heat. So where is their waste heat?
link |
00:08:11.680
So we'd be able to, with our crude methods, be able to see like there's a whole lot of energy here.
link |
00:08:18.800
But it could be something we're not, I mean, we don't understand dark energy, right? Dark matter.
link |
00:08:23.280
It could be just stuff we don't understand at all.
link |
00:08:25.920
Or they could have a fundamentally different physics, you know, like that we just don't even
link |
00:08:31.600
comprehend. Well, I think, okay, I mean, it depends how far out you want to go.
link |
00:08:34.960
I don't think physics is very different on the other side of the galaxy.
link |
00:08:38.000
I would suspect that they have, I mean, if they're in our universe, they have the same physics.
link |
00:08:45.520
Well, yeah, that's the assumption we have. But there could be like super trippy things like
link |
00:08:51.760
like our cognition only gets to a slice and all the possible instruments that we can design
link |
00:08:59.280
only get to a particular slice of the universe. And there's something much like weirder.
link |
00:09:03.920
Maybe we can try a thought experiment. Would people from the past be able to detect
link |
00:09:11.760
their remnants of our, be able to detect our modern civilization? I think the answer is
link |
00:09:17.200
obviously yes. You mean past from a hundred years ago?
link |
00:09:20.480
Well, let's even go back further. Let's go to a million years ago.
link |
00:09:24.320
The humans who were lying around in the desert probably didn't even have,
link |
00:09:27.520
maybe they just barely had fire. They would understand if a 747 flew overhead.
link |
00:09:35.040
Oh, in this vicinity, but not if a 747 flew on Mars. Because they wouldn't be able to see far,
link |
00:09:44.880
because we're not actually communicating that well with the rest of the universe.
link |
00:09:48.720
We're doing okay, just sending out random like 50s tracks of music.
link |
00:09:53.120
True. And yeah, I mean, they'd have to do, you know, the, we've only been broadcasting radio waves
link |
00:09:59.520
for 150 years and well, there's your light cone. So yeah, okay. What do you make about all the,
link |
00:10:08.800
I recently came across this, having talked to David Fravor. I don't know if you caught
link |
00:10:15.680
what the videos that Pentagon released and the New York Times reporting of the UFO sightings. So I
link |
00:10:23.840
kind of looked into it quote unquote. And there's actually been like hundreds of thousands of UFO
link |
00:10:32.160
sightings, right? And a lot of it you can explain away in different kinds of ways. So one is it
link |
00:10:38.000
could be interesting physical phenomena. Two, it could be people wanting to believe.
link |
00:10:43.520
And therefore they conjure up a lot of different things that just, you know, when you see different
link |
00:10:46.880
kinds of lights, some basic physics phenomena, and then you just conjure up ideas of possible
link |
00:10:53.200
out there mysterious worlds. But, you know, it's also possible like you have a case of David Fravor,
link |
00:11:00.880
who is a Navy pilot, who's, you know, as legit as it gets in terms of humans who are able to
link |
00:11:08.400
perceive things in the environment and make conclusions, whether those things are a threat
link |
00:11:14.800
or not. And he and several other pilots saw a thing, I don't know if you followed this,
link |
00:11:21.440
but they saw a thing that they've since then called the Tic Tac that moved in all kinds of weird
link |
00:11:26.320
ways. They don't know what it is. It could be technology developed by the United States,
link |
00:11:34.720
and they're just not aware of it at the surface level of the Navy, right? It could be different
link |
00:11:40.800
kind of lighting technology or drone technology, all that kind of stuff. It could be the Russians
link |
00:11:45.600
and the Chinese, all that kind of stuff. And of course their mind, our mind can also venture
link |
00:11:52.960
into the possibility that it's from another world. Have you looked into this at all? What do you
link |
00:11:58.320
think about it? I think all of this news is a psyop. I think that the most plausible... Nothing is real.
link |
00:12:06.320
Yeah, I listened to the, I think it was Bob Lazar on Joe Rogan. And like, I believe everything
link |
00:12:14.240
this guy is saying. And then I think that it's probably just some like MK Ultra kind of thing,
link |
00:12:18.880
you know? What do you mean? Like, they made some weird thing and they called it an alien spaceship.
link |
00:12:26.000
You know, maybe it was just to like stimulate young physicists' minds and tell them it's alien
link |
00:12:30.400
technology and we'll see what they come up with, right? Do you find any conspiracy theories
link |
00:12:35.200
compelling? Like, have you pulled at the string of the rich complex world of conspiracy theories
link |
00:12:42.240
that's out there? I think that I've heard a conspiracy theory that conspiracy theories
link |
00:12:47.280
were invented by the CIA in the 60s to discredit true things. Yeah. So, you know, you can go to
link |
00:12:55.520
ridiculous conspiracy theories like Flat Earth and Pizza Gate and, you know, these things are
link |
00:13:04.480
almost to hide like conspiracy theories that like, you know, remember when the Chinese like locked
link |
00:13:09.360
up the doctors who discovered coronavirus? Like, I tell people this and I'm like, no, no, no,
link |
00:13:12.960
that's not a conspiracy theory. That actually happened. Do you remember the time that the
link |
00:13:16.880
money used to be backed by gold and now it's backed by nothing? This is not a conspiracy theory. This
link |
00:13:21.840
actually happened. Well, that's one of my worries today with the idea of fake news is that
link |
00:13:29.440
when nothing is real, then like, you dilute the possibility of anything being true by conjuring
link |
00:13:38.560
up all kinds of conspiracy theories. And then you don't know what to believe. And then like,
link |
00:13:43.440
the idea of truth of objectivity is lost completely. Everybody has their own truth.
link |
00:13:48.960
So, you used to control information by censoring it. Then the internet happened and governments
link |
00:13:55.360
were like, oh, shit, we can't censor things anymore. I know what we'll do. You know, it's
link |
00:14:01.200
the old story of, like, tying a flag where a leprechaun tells you his gold is buried,
link |
00:14:06.880
and you tie one flag and you make the leprechaun swear to not remove the flag and you come back
link |
00:14:10.400
to the field later with a shovel and this flag is everywhere. That's one way to maintain privacy,
link |
00:14:15.920
right? In order to protect the contents of this conversation, for example, we could just generate
link |
00:14:23.680
like millions of deep fake conversations where you and I talk and say random things.
link |
00:14:28.960
So, this is just one of them and nobody knows which one is the real one. This could be fake
link |
00:14:33.760
right now. Classic steganography technique. Okay, another absurd question about intelligent life
link |
00:14:39.840
because you're an incredible programmer outside of everything else we'll talk about just as a
link |
00:14:45.840
programmer. Do you think intelligent beings out there, the civilizations that were out there
link |
00:14:54.400
had computers and programming? Did they naturally have to develop something where we engineer
link |
00:15:02.320
machines and are able to encode both knowledge into those machines and instructions that process
link |
00:15:10.800
that knowledge, process that information to make decisions and actions and so on? And would those
link |
00:15:16.560
programming languages, if you think they exist, be at all similar to anything we've developed?
link |
00:15:24.080
So, I don't see that much of a difference between quote unquote natural languages and
link |
00:15:29.440
programming languages. I think there are so many similarities. So, when asked the question
link |
00:15:38.880
what do alien languages look like, I imagine they're not all that dissimilar from ours,
link |
00:15:46.320
and I think translating in and out of them wouldn't be that crazy.
link |
00:15:52.720
Well, it's difficult to compile DNA to Python and then to C. There is a little bit of a gap
link |
00:16:00.480
in the kind of languages we use for Turing machines and the kind of languages nature seems
link |
00:16:08.720
to use a little bit. Maybe that's just we just haven't understood the kind of language that
link |
00:16:14.560
nature uses as well yet. DNA is a CAD model. It's not quite a programming language. It has no sort
link |
00:16:21.840
of serial execution. It's not quite a CAD model. So, I think in that sense we actually
link |
00:16:31.280
completely understand it. The problem is simulating on these CAD models. I played with it a bit this
link |
00:16:38.080
year; it's super computationally intensive. If you want to go down to, like, the molecular level
link |
00:16:43.520
where you need to go to see a lot of these phenomena, like protein folding. So, yeah,
link |
00:16:49.360
it's not that we don't understand it. It just requires a whole lot of compute to kind of compile
link |
00:16:54.240
it. For human minds it's inefficient both for the data representation and for the programming.
link |
00:17:00.560
It runs well on raw nature. It runs well on raw nature and when we try to build
link |
00:17:04.880
emulators or simulators for that, well, they're mad slow. I've kind of tried it.
link |
00:17:10.480
It runs in that, yeah. You've commented elsewhere, I don't remember where, that one of the problems
link |
00:17:18.160
is simulating nature is tough. And if you want to sort of deploy a prototype, I forgot how you
link |
00:17:26.400
put it but it made me laugh, but animals or humans would need to be involved in order to
link |
00:17:32.400
you know, to try to run some prototype code on like if we're talking about COVID and viruses and so
link |
00:17:40.240
on. Yeah. If you were to try to engineer some kind of defense mechanisms like a vaccine
link |
00:17:47.280
against COVID or all that kind of stuff that doing any kind of experimentation like you can
link |
00:17:52.480
with, like, autonomous vehicles, would be very... technically and ethically
link |
00:17:58.640
costly. I'm not sure about that. I think you can do tons of crazy biology in test tubes. I think
link |
00:18:05.440
my bigger complaint is more all the tools are so bad. Like literally you mean like like
link |
00:18:13.360
libraries and... I don't know. I'm not pipetting shit. Like, you hand it to me, I gotta... no, no, no,
link |
00:18:19.920
there has to be some like automating stuff and like the human biology is messy. Like it seems like
link |
00:18:29.680
look at those Theranos videos. They were a joke. It's like a little gantry. It's like a little
link |
00:18:33.680
XY gantry, high school science project with a pipette. I'm like, really? There's gotta be something better.
link |
00:18:39.040
Can't you build, like, nice microfluidics where I can program the, you know, computation-to-
link |
00:18:44.480
bio interface? I mean, this is going to happen. But, like, right now, if you're asking me to pipette
link |
00:18:50.880
50 milliliters of solution, I'm out. This is so crude. Yeah. Okay, let's get all the crazy out of
link |
00:18:59.120
the way. So a bunch of people asked me, since we talked about the simulation last time,
link |
00:19:04.880
we talked about hacking the simulation. Do you have any updates, any insights about how we might
link |
00:19:10.640
be able to go about hacking simulation if we indeed do live in a simulation?
link |
00:19:17.040
I think a lot of people misinterpreted the point of that South by talk. The point of the
link |
00:19:22.800
South by talk was not literally to hack the simulation. I think that this
link |
00:19:31.600
idea is literally just theoretical physics. I think that's the whole,
link |
00:19:35.440
you know, the whole goal, right? You want your grand unified theory, but then, okay,
link |
00:19:42.160
build a grand unified theory, search for exploits, right? I think we're nowhere near actually there
link |
00:19:46.800
yet. My hope with that was just more to like, like, are you people kidding me with the things
link |
00:19:53.120
you spend time thinking about? Do you understand like kind of how small you are? You are, you are
link |
00:19:58.960
bytes in God's computer. Really? And the things that people get worked up about and, you know.
link |
00:20:06.560
So basically, it was more a message of we should humble ourselves that we get to
link |
00:20:15.040
like, what are we humans in this byte code? Yeah. And not just humble ourselves, but like,
link |
00:20:23.120
I'm not trying to like make people guilty or anything like that. I'm trying to say like,
link |
00:20:26.320
literally, look at what you are spending time on, right? What are you referring to? You're
link |
00:20:31.200
referring to the Kardashians? What are we talking about? No, I'm not referring to the Kardashians.
link |
00:20:35.920
Everyone knows that's kind of fun. I'm referring more to, like, the economy, you know, this idea that
link |
00:20:46.640
we've got to up our stock price. Or what is the goal function of humanity?
link |
00:20:54.720
You don't like the game of capitalism? Like, you don't like the games we've constructed for
link |
00:20:59.360
ourselves as humans? I'm a big fan of capitalism. I don't think that's really the game we're playing
link |
00:21:04.640
right now. I think we're playing a different game where the rules are rigged. Okay, which games
link |
00:21:11.360
are interesting to you that we humans have constructed and which aren't? Which are productive
link |
00:21:16.640
and which are not? Actually, maybe that's the real point of the talk. It's like,
link |
00:21:22.080
stop playing these fake human games. There's a real game here. We can play the real game.
link |
00:21:28.480
The real game is, you know, nature wrote the rules. This is a real game. There still is
link |
00:21:33.520
a game to play. But if you look at, sorry to interrupt, I don't know if you've seen the
link |
00:21:37.520
Instagram account, nature is metal. The game that nature seems to be playing is a lot more cruel
link |
00:21:45.280
than we humans want to put up with. Or at least we see it as cruel. It's like, the bigger thing
link |
00:21:51.120
eats the smaller thing and does it to impress another big thing so it can mate with that thing.
link |
00:22:00.320
And that's it. That seems to be the entirety of it. Well, there's no art. There's no music.
link |
00:22:07.120
There's no comma AI. There's no comma one, no comma two, no George
link |
00:22:13.280
Hotz with his brilliant talks at South by Southwest. See, I disagree though. I disagree
link |
00:22:18.320
that this is what nature is. I think nature just provided basically an open world MMORPG.
link |
00:22:26.720
And you know, here it's open world. I mean, if that's the game you want to play, you can play
link |
00:22:31.600
that game. Isn't that beautiful? I don't know if you play Diablo, they used to have, I think, the cow level,
link |
00:22:38.000
where it's, so everybody will go just, they figured out this, like the best way to gain
link |
00:22:46.160
experience points is to just slaughter cows over and over and over. And so they figured out this
link |
00:22:54.480
little sub game within the bigger game that this is the most efficient way to get experience points.
link |
00:22:59.760
And everybody somehow agreed they're getting experience points in RPG context where you always
link |
00:23:04.720
want to be getting more stuff, more skills, more levels, keep advancing. That seems to be good.
link |
00:23:10.320
So might as well sacrifice actual enjoyment of playing a game, exploring a world,
link |
00:23:17.520
and spending, like, hundreds of hours a year of time in cow level. I mean, the number of hours I spent
link |
00:23:24.400
in cow level, I'm not like the most impressive person because people have probably thousands
link |
00:23:29.760
of hours there, but it's ridiculous. So that's a little absurd game that brought me joy in some
link |
00:23:35.360
weird dopamine drug kind of way. So you don't like those games. You don't think that's us humans
link |
00:23:43.280
feeling the nature. And that was the point of the talk. So how do we hack it then?
link |
00:23:51.280
Well, I want to live forever. And this is the goal. Well, that's a game against nature.
link |
00:23:59.120
Yeah. Immortality is the good objective function to you.
link |
00:24:02.320
I mean, start there and then you can do whatever else you want because you've got a long time.
link |
00:24:07.200
What if immortality makes the game just totally not fun? I mean, why do you assume immortality
link |
00:24:13.760
is somehow a good objective function? It's not immortality that I want. A true immortality
link |
00:24:20.960
where I could not die, I would prefer what we have right now. But I want to choose my own death,
link |
00:24:26.160
of course. I don't want nature to decide when I die. I'm going to win. I'm going to beat it.
link |
00:24:32.720
And then at some point, if you choose to commit suicide, how long do you think you'd live?
link |
00:24:41.440
Until I get bored.
link |
00:24:42.960
See, I don't think people, like brilliant people like you that really ponder
link |
00:24:49.120
living a long time, are really considering how meaningless life becomes.
link |
00:24:58.400
Well, I want to know everything and then I'm ready to die.
link |
00:25:03.360
As long as there's...
link |
00:25:03.920
But why do you want, isn't it possible that you want to know everything because it's finite?
link |
00:25:09.600
Like the reason you want to know quote unquote, everything is because you don't have enough time
link |
00:25:14.320
to know everything. And once you have unlimited time, then you realize like, why do anything?
link |
00:25:22.080
Like why learn anything?
link |
00:25:24.800
I want to know everything and then I'm ready to die.
link |
00:25:26.880
So you have... Yeah.
link |
00:25:27.760
Well, it's not a... It's a terminal value. It's not in service of anything else.
link |
00:25:34.560
I'm conscious of the possibility, this is not a certainty, but the possibility of that
link |
00:25:40.400
engine of curiosity that you're speaking to is actually a symptom of the finiteness of life.
link |
00:25:49.680
Like without that finiteness, your curiosity would vanish like a morning fog.
link |
00:25:56.800
All right, cool.
link |
00:25:57.520
Bukowski talked about love like that.
link |
00:25:59.200
Then let me solve immortality. Let me change the thing in my brain that reminds me of the
link |
00:26:03.600
fact that I'm immortal, tell me that life is finite. Shit, maybe I'll have it tell me that
link |
00:26:07.120
life ends next week. All right? I'm okay with some self manipulation like that. I'm okay with
link |
00:26:13.360
deceiving myself. Oh, oh, Rika. Changing the code.
link |
00:26:17.040
If that's the problem, right? If the problem is that I will no longer have that curiosity,
link |
00:26:21.680
I'd like to have backup copies of myself. Revert, yeah.
link |
00:26:25.520
Well, which I check in with occasionally to make sure they're okay with the trajectory
link |
00:26:29.120
and they can kind of override it. Maybe a nice like, I think of like those wave nets,
link |
00:26:33.040
those like logarithmic go back to the copies.
link |
00:26:35.120
But sometimes it's not reversible. Like I've done this with video games.
link |
00:26:39.840
Once you figure out the cheat code or like you look up how to cheat old school,
link |
00:26:43.840
like single player, it ruins the game for you.
link |
00:26:46.800
Absolutely. I know that feeling. But again, that just means our brain manipulation
link |
00:26:51.760
technology is not good enough yet. Remove that cheat code from your brain.
link |
00:26:55.680
So it's also possible that if we figure out immortality that all of us will kill ourselves
link |
00:27:02.720
before we advance far enough to be able to revert the change.
link |
00:27:08.080
I'm not killing myself till I know everything. So...
link |
00:27:11.680
That's what you say now because your life is finite.
link |
00:27:15.760
You know, I think self-modifying systems come up with all these hairy complexities.
link |
00:27:20.880
And can I promise that I'll do it perfectly? No.
link |
00:27:23.120
But I think I can put good safety structures in place.
link |
00:27:25.440
So that talk and your thinking here is not literally referring to a simulation in that our
link |
00:27:38.000
universe is a kind of computer program running on a computer.
link |
00:27:42.080
That's more of a thought experiment.
link |
00:27:45.760
Do you also think of the potential of the sort of
link |
00:27:48.960
Bostrom, Elon Musk, and others that talk about an actual program that simulates our universe?
link |
00:27:59.520
Oh, I don't doubt that we're in a simulation. I just think that it's not quite that important.
link |
00:28:05.120
I mean, I'm interested only in simulation theory as far as like it gives me power over nature.
link |
00:28:09.600
If it's totally unfalsifiable, then who cares?
link |
00:28:12.960
I mean, what do you think that experiment would look like?
link |
00:28:15.120
Like somebody on Twitter asked George what signs we would look for
link |
00:28:20.640
to know whether or not we're in the simulation, which is exactly what you're asking is like
link |
00:28:26.080
the step that precedes the step of knowing how to get more power from this knowledge
link |
00:28:32.080
is to get an indication that there's some power to be gained.
link |
00:28:35.120
So get an indication that you can discover and exploit cracks in the simulation
link |
00:28:42.000
or it doesn't have to be in the physics of the universe.
link |
00:28:45.280
Yeah. Show me. I mean, like a memory leak would be cool.
link |
00:28:51.600
Some scrying technology, you know?
link |
00:28:53.920
What kind of technology?
link |
00:28:55.280
Scrying.
link |
00:28:56.080
What's that?
link |
00:28:56.560
Oh, that's a weird one. Scrying is the paranormal ability to, like, remote view,
link |
00:29:04.320
like being able to see somewhere where you're not.
link |
00:29:06.720
So, you know, I don't think you can do it by chanting in a room,
link |
00:29:11.120
but if we could find, it's a memory leak, basically.
link |
00:29:16.080
It's a memory leak. Yeah.
link |
00:29:17.360
You're able to access parts you're not supposed to.
link |
00:29:19.840
Yeah, yeah, yeah.
link |
00:29:20.480
And thereby discover shortcuts.
link |
00:29:22.000
Yeah. Maybe a memory leak means the other thing as well,
link |
00:29:24.560
but I mean like, yeah, like an ability to read arbitrary memory.
link |
00:29:26.960
Yeah.
link |
00:29:27.440
Right. And that one's not that horrifying.
link |
00:29:29.520
All right. The write ones start to be horrifying.
link |
00:29:31.280
Reading, right. So the reading is not the problem.
link |
00:29:34.720
Yeah. It's like Heartbleed for the universe.
link |
00:29:37.360
Oh, boy, the writing is a big, big problem.
link |
00:29:40.560
It's a big problem.
link |
00:29:42.880
It's the moment you can write anything,
link |
00:29:44.480
even if it's just random noise.
link |
00:29:47.520
That's terrifying.
link |
00:29:49.040
I mean, even without, even without that,
link |
00:29:51.360
like even some of the, you know, the nanotech stuff that's coming, I think.
link |
00:29:56.880
I don't know if you're paying attention, but actually,
link |
00:29:59.120
Eric Weinstein came out with a theory of everything.
link |
00:30:02.000
I mean, that came out.
link |
00:30:03.360
He's been working on a theory of everything in the physics world
link |
00:30:06.240
called geometric unity.
link |
00:30:07.920
And then for me, from computer science person,
link |
00:30:10.400
like you, Steven Wolfram's theory of everything,
link |
00:30:14.000
of like, hypergraphs is super interesting and beautiful.
link |
00:30:17.440
But not from a physics perspective,
link |
00:30:19.200
but from a computational perspective.
link |
00:30:20.720
I don't know. Have you paid attention to any of that?
link |
00:30:23.040
So again, like what would make me pay attention
link |
00:30:26.240
and, like, why I hate string theory is,
link |
00:30:29.360
okay, make a testable prediction.
link |
00:30:31.120
Right. I'm only interested in,
link |
00:30:33.520
I'm not interested in theories for their intrinsic beauty.
link |
00:30:35.840
I'm interested in theories that give me power over the universe.
link |
00:30:39.760
So if these theories do, I'm very interested.
link |
00:30:42.960
Can I just say how beautiful that is?
link |
00:30:44.960
Because a lot of physicists say,
link |
00:30:46.960
I'm interested in experimental validation,
link |
00:30:49.840
and they skip out the part where they say,
link |
00:30:52.720
to give me more power in the universe.
link |
00:30:55.360
I just love the,
link |
00:30:57.280
yo, I want, I want, I want the clarity of that.
link |
00:30:59.600
I want a hundred gigahertz processors.
link |
00:31:01.840
I want transistors that are smaller than atoms.
link |
00:31:04.000
I want like power.
link |
00:31:07.920
That's, that's true.
link |
00:31:10.400
And that's where people from aliens to this kind of technology
link |
00:31:13.360
where people are worried that governments,
link |
00:31:16.720
like who owns that power?
link |
00:31:19.200
Is it George Hotz?
link |
00:31:20.640
Is it thousands of distributed hackers across the world?
link |
00:31:24.880
Is it governments?
link |
00:31:26.560
You know, is it Mark Zuckerberg?
link |
00:31:28.560
There's a lot of people that,
link |
00:31:32.320
I don't know if anyone trusts anyone individual with power,
link |
00:31:35.440
so they're always worried.
link |
00:31:37.120
It's the beauty of blockchains.
link |
00:31:39.200
That's the beauty of blockchains, which we'll talk about.
link |
00:31:43.040
On Twitter, somebody pointed me to a story,
link |
00:31:46.080
a bunch of people pointed me to a story a few months ago
link |
00:31:49.200
where you went into a restaurant in New York,
link |
00:31:51.440
and you can correct me if any of this is wrong,
link |
00:31:53.040
and ran into a bunch of folks from a company, a crypto company,
link |
00:31:58.640
who are trying to scale up Ethereum,
link |
00:32:01.680
and they had a technical deadline
link |
00:32:03.200
related to a Solidity-to-OVM compiler.
link |
00:32:07.200
So these are all Ethereum technologies.
link |
00:32:09.680
So you stepped in, they recognized you,
link |
00:32:14.560
pulled you aside, explained their problem,
link |
00:32:16.080
and you stepped in and helped them solve the problem,
link |
00:32:18.320
thereby creating legend status story.
link |
00:32:24.640
Can you tell me the story a little more detail?
link |
00:32:28.800
It seems kind of incredible.
link |
00:32:31.360
Did this happen?
link |
00:32:32.160
Yeah, yeah, it's a true story.
link |
00:32:33.200
It's a true story.
link |
00:32:33.920
I mean, they wrote a very flattering account of it.
link |
00:32:39.280
So Optimism is the spin-off, the company's called Optimism.
link |
00:32:43.600
It's a spin off of Plasma.
link |
00:32:45.120
They're trying to build L2 solutions on Ethereum.
link |
00:32:47.680
So right now, every Ethereum node
link |
00:32:52.480
has to run every transaction on the Ethereum network.
link |
00:32:56.560
And this kind of doesn't scale, right?
link |
00:32:58.400
Because if you have n computers,
link |
00:32:59.920
well, if that becomes two n computers,
link |
00:33:02.160
you actually still get the same amount of compute.
link |
00:33:05.200
This is like O of 1 scaling, because they all have to run it.
link |
00:33:10.000
Okay, fine, you get more blockchain security,
link |
00:33:12.720
but the blockchain's already so secure.
link |
00:33:15.120
Can we trade some of that off for speed?
link |
00:33:17.840
That's kind of what these L2 solutions are.
link |
00:33:20.400
They built this thing, which is kind of a sandbox for Ethereum contracts,
link |
00:33:26.160
so they can run it in this L2 world,
link |
00:33:28.080
and it can't do certain things in L1.
link |
00:33:30.800
Can I ask you for some definitions?
link |
00:33:32.240
What's L2?
link |
00:33:33.200
Oh, L2 is Layer 2.
link |
00:33:34.720
So L1 is like the base Ethereum chain,
link |
00:33:37.040
and then Layer 2 is like a computational layer that runs elsewhere,
link |
00:33:44.240
but still is kind of secured by Layer 1.
link |
00:33:47.600
And I'm sure a lot of people know,
link |
00:33:49.600
but Ethereum is a cryptocurrency,
link |
00:33:51.840
probably one of the most popular cryptocurrencies, second to Bitcoin.
link |
00:33:55.200
And a lot of interesting technological innovations there.
link |
00:33:58.640
Maybe you could also slip in whenever you talk about this,
link |
00:34:03.120
any things that are exciting to you in the Ethereum space.
link |
00:34:06.240
And why Ethereum?
link |
00:34:07.680
Well, I mean, Bitcoin is not Turing complete.
link |
00:34:10.800
Ethereum is not technically Turing complete with a gas limit, but close enough.
link |
00:34:15.840
With a gas limit?
link |
00:34:16.720
What's the gas limit?
link |
00:34:17.840
Resources?
link |
00:34:18.960
Yeah, I mean, no computer is actually Turing complete.
link |
00:34:22.960
They just have finite RAM.
link |
00:34:23.840
You know, can it actually solve the halting problem?
link |
00:34:25.680
What's the word gas limit?
link |
00:34:26.640
You just have so many brilliant words.
link |
00:34:28.480
I'm not even going to ask.
link |
00:34:29.200
Well, that's not my word.
link |
00:34:30.800
That's Ethereum's word.
link |
00:34:31.680
Gas limit.
link |
00:34:32.480
Ethereum, you have to spend gas per instruction.
link |
00:34:35.120
So like different op codes, you use different amounts of gas,
link |
00:34:37.680
and you buy gas with Ether to prevent people from basically DDoSing the network.
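(As an illustrative aside: a minimal sketch of what per-opcode gas metering looks like. The opcode names are real EVM opcodes, but the gas costs and the accounting here are made-up illustrative numbers, not Ethereum's actual gas schedule.)

```python
# Toy gas metering: every opcode consumes gas and execution halts when the
# transaction's gas limit is exhausted. Costs below are illustrative only,
# not Ethereum's real gas schedule.
ILLUSTRATIVE_GAS_COST = {"ADD": 3, "MUL": 5, "SLOAD": 800, "SSTORE": 20_000}

def run(opcodes, gas_limit):
    gas_left = gas_limit
    for op in opcodes:
        cost = ILLUSTRATIVE_GAS_COST[op]
        if cost > gas_left:
            raise RuntimeError(f"out of gas at {op}")
        gas_left -= cost
        # ... the opcode itself would execute here ...
    return gas_limit - gas_left   # gas used, paid for in Ether by the sender

print(run(["SLOAD", "ADD", "MUL", "SSTORE"], gas_limit=25_000))   # 20808
```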
link |
00:34:42.640
So Bitcoin is proof of work.
link |
00:34:45.360
And then what's Ethereum?
link |
00:34:46.960
It's also proof of work.
link |
00:34:48.160
They're working on some proof of stake Ethereum 2.0 stuff.
link |
00:34:50.880
But right now it's proof of work.
link |
00:34:52.560
It uses a different hash function from Bitcoin.
link |
00:34:54.560
That's more ASIC resistant because you need RAM.
link |
00:34:57.200
So we're all talking about Ethereum 1.0.
link |
00:34:59.840
So what were they trying to do to scale this whole process?
link |
00:35:03.760
So they were like, well, if we could run contracts elsewhere,
link |
00:35:06.880
and then only save the results of that computation,
link |
00:35:12.960
well, we don't actually have to do the compute on the chain.
link |
00:35:14.560
We can do the compute off chain and just post what the results are.
link |
00:35:17.280
Now, the problem with that is, well, somebody could lie about what the results are.
link |
00:35:21.120
So you need a resolution mechanism.
link |
00:35:23.200
And the resolution mechanism can be really expensive because you just have to make sure
link |
00:35:28.880
that the person who is saying, look, I swear that this is the real computation.
link |
00:35:33.680
I'm staking $10,000 on that fact.
link |
00:35:36.000
And if you prove it wrong, yeah,
link |
00:35:39.120
it might cost you $3,000 in gas fees to prove wrong,
link |
00:35:42.640
but you'll get the $10,000 bounty.
link |
00:35:44.560
So you can secure using those kind of systems.
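(As an illustrative aside: a toy sketch of the stake-and-bounty resolution mechanism described here, where a posted result is only re-executed if someone challenges it. This is a simplification of the general idea, not Optimism's actual protocol; the dollar figures are the ones from the conversation.)

```python
# Toy "optimistic" dispute game: results are posted with a stake and only
# re-executed if someone challenges them. Illustrative, not the real protocol.
def post_result(claimed_result, stake):
    return {"result": claimed_result, "stake": stake, "slashed": False}

def challenge(claim, recompute, gas_cost_to_prove):
    """Challenger re-runs the computation; proving a false claim wins the stake."""
    if recompute() != claim["result"]:
        claim["slashed"] = True
        return claim["stake"] - gas_cost_to_prove   # e.g. $10,000 - $3,000 bounty
    return -gas_cost_to_prove                       # honest claims cost the challenger

claim = post_result(claimed_result=41, stake=10_000)
print(challenge(claim, recompute=lambda: 42, gas_cost_to_prove=3_000))  # 7000
```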
link |
00:35:48.720
So it's effectively a sandbox which runs contracts.
link |
00:35:52.800
And like just like any kind of normal sandbox,
link |
00:35:55.440
you have to replace syscalls with calls into the hypervisor.
link |
00:36:00.320
Uh, sandbox, syscalls, hypervisor, what do these things mean?
link |
00:36:07.520
As long as it's interesting to talk about.
link |
00:36:09.040
Yeah, I mean, you can take like the Chrome sandbox is maybe the one to think about, right?
link |
00:36:12.640
So the Chrome process that's doing the rendering
link |
00:36:15.920
can't, for example, read a file from the file system.
link |
00:36:18.160
Yeah.
link |
00:36:18.640
If it tries to make an open syscall in Linux,
link |
00:36:21.600
it can't make an open syscall.
link |
00:36:23.600
No, no, no.
link |
00:36:24.640
It has to request it from kind of a hypervisor process
link |
00:36:29.040
or like, I don't know what it's called in Chrome.
link |
00:36:31.600
But it's like, hey, could you open this file for me?
link |
00:36:36.240
And then it does all these checks and then it passes the file
link |
00:36:38.320
handle back in if it's approved.
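(As an illustrative aside: a minimal sketch of the brokered-syscall pattern being described, where the sandboxed process never opens files itself but asks a privileged broker that checks policy and hands a handle back. The names are hypothetical, not Chrome's actual IPC interface.)

```python
# Toy brokered file open: the sandboxed renderer may not call open() directly;
# it asks a privileged broker, which enforces policy and passes a handle back.
# (Illustrative names, not Chrome's actual IPC interface.)
ALLOWED_PATHS = {"/tmp/render_cache"}

def broker_open(path, mode="r"):
    """Runs in the privileged process; the only place the real syscall happens."""
    if path not in ALLOWED_PATHS:
        raise PermissionError(f"sandbox policy denies {path}")
    return open(path, mode)

def sandboxed_renderer():
    """Runs in the sandbox; it can only ask the broker, never open() itself."""
    try:
        handle = broker_open("/etc/passwd")   # will be refused by policy
        return handle.read()
    except PermissionError as err:
        return str(err)

print(sandboxed_renderer())   # sandbox policy denies /etc/passwd
```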
link |
00:36:39.920
Got it.
link |
00:36:41.040
So that's, yeah.
link |
00:36:42.400
So what's the, in the context of Ethereum,
link |
00:36:45.200
what are the boundaries of the sandbox that we're talking about?
link |
00:36:48.320
Well, like, one of the calls is actually reading and writing any state
link |
00:36:53.840
to the Ethereum contract, to the Ethereum blockchain.
link |
00:36:56.720
Writing state is one of those calls that you're going to have to sandbox
link |
00:37:03.200
in layer two, because if you let layer two just arbitrarily write to the Ethereum blockchain.
link |
00:37:10.320
So layer two is really sitting on top of layer one.
link |
00:37:15.040
So you're going to have a lot of different kinds of ideas that you can play with.
link |
00:37:18.160
Yeah.
link |
00:37:18.560
And they're all, they're not fundamentally changing the source code level of Ethereum.
link |
00:37:24.480
Well, you have to replace a bunch of calls with calls into the hypervisor.
link |
00:37:30.240
So instead of doing the syscall directly, you, you replace it with a call to the hypervisor.
link |
00:37:36.640
So originally they were doing this by first running the... so Solidity is the language
link |
00:37:42.800
that most Ethereum contracts are written in, it compiles to a bytecode.
link |
00:37:46.880
And then they wrote this thing they called the transpiler.
link |
00:37:49.360
And the transpiler took the bytecode and it transpiled it into
link |
00:37:53.600
OVM safe bytecode, basically bytecode that didn't make any of those restricted syscalls
link |
00:37:58.560
and added the calls to the hypervisor.
link |
00:38:01.520
This transpiler was a 3000 line mess.
link |
00:38:05.440
And it's hard to do.
link |
00:38:06.880
It's hard to do if you're trying to do it like that, because you have to kind of like deconstruct
link |
00:38:10.480
the bytecode, change things about it, and then reconstruct it.
link |
00:38:15.200
And I mean, as soon as I hear this, I'm like, well, why don't you just change the compiler?
link |
00:38:19.760
All right.
link |
00:38:20.240
Why not the first place you build the bytecode, just do it in the compiler?
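(As an illustrative aside: the sketch below contrasts the two approaches, rewriting already-compiled bytecode versus emitting the hypervisor call at code-generation time. Opcode names and the hypervisor call are illustrative stand-ins, not the actual Solidity/OVM internals.)

```python
# Two ways to keep restricted operations out of L2-safe bytecode.
# Opcode names and the hypervisor call are illustrative stand-ins.
RESTRICTED = {"SSTORE", "SLOAD"}   # e.g. ops that touch chain state directly

# (1) Transpiler approach: take finished bytecode apart, patch it, rebuild it.
def transpile(bytecode):
    out = []
    for op in bytecode:
        out.extend(["PUSH_HYPERVISOR", "CALL"] if op in RESTRICTED else [op])
    return out

# (2) Compiler approach: never emit the restricted opcode in the first place.
def codegen(op):
    return ["PUSH_HYPERVISOR", "CALL"] if op in RESTRICTED else [op]

source_ops = ["PUSH1", "SSTORE", "ADD"]
print(transpile(source_ops))                               # patched after the fact
print([out for op in source_ops for out in codegen(op)])   # same result, emitted directly
```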
link |
00:38:25.440
So, yeah, you know, I asked them how much they wanted it.
link |
00:38:28.960
Of course, measured in dollars and I'm like, well, okay.
link |
00:38:32.960
And yeah.
link |
00:38:34.480
You wrote the compiler.
link |
00:38:35.840
Yeah, I modified.
link |
00:38:36.800
I wrote a 300 line diff to the compiler.
link |
00:38:39.040
It's open source.
link |
00:38:39.600
You can look at it.
link |
00:38:40.800
Yeah, I looked at the code last night.
link |
00:38:42.880
Yeah, it's cute.
link |
00:38:45.440
Yeah, exactly.
link |
00:38:46.000
Oh, too massive.
link |
00:38:46.560
Cute is a good word for it.
link |
00:38:48.400
And it's C++.
link |
00:38:51.680
C++, yeah.
link |
00:38:54.240
So, when asked how you were able to do it, you said you just got to think and then do it right.
link |
00:39:02.960
So, can you break that apart a little bit?
link |
00:39:04.560
What's your process of one, thinking, and two, doing it right?
link |
00:39:09.760
You know, the people who I was working for were amused that I said that.
link |
00:39:13.360
It doesn't really mean anything.
link |
00:39:14.720
Okay.
link |
00:39:14.960
Yeah, I mean, is there some deep profound insights to draw from like how you problem solve from that?
link |
00:39:23.440
This is always what I say.
link |
00:39:24.480
I'm like, do you want to be a good programmer?
link |
00:39:25.920
Do it for 20 years?
link |
00:39:27.760
Yeah, there's no shortcuts.
link |
00:39:29.520
Yeah.
link |
00:39:31.520
What are your thoughts on crypto in general?
link |
00:39:33.360
So, what parts, technically or philosophically, do you find especially beautiful, maybe?
link |
00:39:39.840
Oh, I'm extremely bullish on crypto long term.
link |
00:39:42.720
Not any specific crypto project, but this idea of, well, two ideas.
link |
00:39:50.320
One, the Nakamoto consensus algorithm is I think one of the greatest innovations of the 21st century.
link |
00:39:58.480
This idea that people can reach consensus.
link |
00:40:01.120
That you can reach a group consensus using a relatively straightforward algorithm is wild.
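(As an illustrative aside: a drastically simplified sketch of the idea being praised here, where nodes do proof of work and everyone adopts the longest valid chain, so a group converges on one history without a coordinator. This is a toy illustration of the concept, not Bitcoin's actual implementation.)

```python
import hashlib

TARGET = 2 ** 248  # toy difficulty: roughly 1 in 256 hashes qualifies

def mine(prev_hash, data):
    """Proof of work: find a nonce whose block hash falls below the target."""
    nonce = 0
    while True:
        h = hashlib.sha256(f"{prev_hash}{data}{nonce}".encode()).hexdigest()
        if int(h, 16) < TARGET:
            return {"prev": prev_hash, "data": data, "nonce": nonce, "hash": h}
        nonce += 1

def choose(chain_a, chain_b):
    """Simplified Nakamoto rule: everyone adopts the longest valid chain."""
    return chain_a if len(chain_a) >= len(chain_b) else chain_b

genesis = mine("0", "genesis")
chain = [genesis, mine(genesis["hash"], "tx batch 1")]
print(len(chain), chain[-1]["hash"][:16])
print(choose(chain, chain[:1]) is chain)  # True: the longer chain wins
```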
link |
00:40:07.440
And like Satoshi Nakamoto, people always ask me who I look up to.
link |
00:40:15.440
It's like whoever that is.
link |
00:40:17.600
Who do you think it is?
link |
00:40:19.280
Elon Musk?
link |
00:40:20.880
Is it you?
link |
00:40:22.160
It is definitely not me and I do not think it's Elon Musk.
link |
00:40:26.080
But yeah, this idea of groups reaching consensus in a decentralized yet formulaic way
link |
00:40:34.240
is one extremely powerful idea from crypto.
link |
00:40:40.400
Maybe the second idea is this idea of smart contracts.
link |
00:40:45.680
When you write a contract between two parties, any contract,
link |
00:40:50.800
this contract, if there are disputes, it's interpreted by lawyers.
link |
00:40:56.080
Lawyers are just really shitty overpaid interpreters.
link |
00:40:59.840
Imagine you had, let's talk about them in terms of like, let's compare a lawyer to Python.
link |
00:41:04.080
Right?
link |
00:41:06.240
Well, okay.
link |
00:41:06.880
That's brilliant.
link |
00:41:07.920
Oh, I never thought of it that way.
link |
00:41:09.840
It's hilarious.
link |
00:41:11.520
So Python, I'm paying even 10 cents an hour.
link |
00:41:15.760
I'll use the nice Azure machine.
link |
00:41:17.040
I can run Python for 10 cents an hour.
link |
00:41:19.520
Lawyers cost $1,000 an hour.
link |
00:41:21.280
So Python is 10,000x better on that axis.
link |
00:41:26.720
Lawyers don't always return the same answer.
link |
00:41:29.200
Python almost always does.
link |
00:41:36.560
Cost.
link |
00:41:37.680
Yeah.
link |
00:41:37.920
I mean, just cost, reliability, everything about Python is so much better than lawyers.
link |
00:41:43.680
So if you can make smart contracts, this whole concept of code is law.
link |
00:41:50.240
I love and I would love to live in a world where everybody accepted that fact.
link |
00:41:54.640
So maybe you can talk about what smart contracts are.
link |
00:42:01.120
So let's say we have even something as simple as a safety deposit box.
link |
00:42:11.520
A safety deposit box that holds a million dollars.
link |
00:42:14.560
I have a contract with the bank that says two out of these three parties
link |
00:42:18.560
must be present to open the safety deposit box and get the money out.
link |
00:42:25.040
So that's a contract with the bank and it's only as good as the bank and the lawyers.
link |
00:42:28.880
Let's say somebody dies and now we're going to go through a big legal dispute about whether,
link |
00:42:34.880
oh, was it in the will?
link |
00:42:35.920
Was it not in the will?
link |
00:42:38.080
It's just so messy and the cost to determine truth is so expensive.
link |
00:42:44.400
Versus a smart contract, which just uses cryptography to check if two out of three
link |
00:42:48.080
keys are present.
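(As an illustrative aside: a minimal sketch of the two-out-of-three check George describes, written in Python rather than Solidity, with hypothetical key names. A real smart contract would verify cryptographic signatures on-chain; the decision logic is just this.)

```python
# Toy 2-of-3 release check: funds move only if at least two of the three
# registered parties have signed. Illustrative; a real contract would verify
# actual cryptographic signatures on-chain, but the decision logic is this.
AUTHORIZED_KEYS = {"party_a_key", "party_b_key", "bank_key"}   # hypothetical names
REQUIRED_SIGNERS = 2

def can_release(amount, presented_keys):
    valid = AUTHORIZED_KEYS & set(presented_keys)
    return f"release ${amount:,}" if len(valid) >= REQUIRED_SIGNERS else "locked"

print(can_release(1_000_000, ["party_a_key", "bank_key"]))   # release $1,000,000
print(can_release(1_000_000, ["party_a_key"]))               # locked
```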
link |
00:42:49.920
Well, I can look at that and I can have certainty in the answer that it's going to return.
link |
00:42:55.680
And that's what all businesses want is certainty.
link |
00:42:58.000
You know, they say businesses don't care, like YouTube.
link |
00:43:00.640
YouTube's like, look, we don't care which way this lawsuit goes.
link |
00:43:04.000
Just please tell us so we can have certainty.
link |
00:43:06.960
And I wonder how many agreements in this, because we're talking about financial transactions
link |
00:43:11.680
only in this case, correct?
link |
00:43:13.680
The smart contracts.
link |
00:43:15.440
Oh, you can go to anything.
link |
00:43:17.120
You can put a prenup in the Ethereum blockchain.
link |
00:43:21.760
Marriage smart contract.
link |
00:43:22.880
Sorry, divorce lawyer.
link |
00:43:24.080
Sorry, you're going to be replaced by Python.
link |
00:43:29.440
Okay, so that's another beautiful idea.
link |
00:43:34.800
Do you think there's something that's appealing to you about any one specific implementation?
link |
00:43:40.160
So if you look 10, 20, 50 years down the line, do you see any like Bitcoin, Ethereum,
link |
00:43:48.000
any of the other hundreds of cryptocurrencies winning out?
link |
00:43:50.960
Is there like, what's your intuition about the space?
link |
00:43:53.120
Are you just sitting back and watching the chaos and look who cares what emerges?
link |
00:43:57.200
Oh, I don't.
link |
00:43:58.000
I don't speculate.
link |
00:43:58.800
I don't really care.
link |
00:43:59.760
I don't really care which one of these projects wins.
link |
00:44:02.560
I'm kind of in the Bitcoin as a meme coin camp.
link |
00:44:05.280
I mean, why does Bitcoin have value?
link |
00:44:06.880
It's technically kind of not great, like the block size debate.
link |
00:44:14.160
When I found out what the block size debate was, I'm like, are you guys kidding?
link |
00:44:17.760
What's the block size debate?
link |
00:44:21.280
You know what?
link |
00:44:21.760
It's really, it's too stupid to even talk about.
link |
00:44:24.800
People can look it up, but I'm like, wow.
link |
00:44:27.360
Ethereum seems, the governance of Ethereum seems much better.
link |
00:44:31.120
I've come around a bit on proof of stake ideas.
link |
00:44:33.840
Because very smart people thinking about some things.
link |
00:44:37.680
Yeah, governance is interesting.
link |
00:44:40.320
It does feel like Vitalik, it does feel like it opens,
link |
00:44:46.000
even in these distributed systems, leaders are helpful
link |
00:44:51.120
because they kind of help you drive the mission and the vision
link |
00:44:55.680
and they put a face to a project.
link |
00:44:58.240
It's a weird thing about us humans.
link |
00:45:00.000
Geniuses are helpful, like Vitalik.
link |
00:45:02.000
Right.
link |
00:45:02.720
Yeah, brilliant.
link |
00:45:06.480
Leaders, I mean.
link |
00:45:07.680
Are not necessarily, yeah.
link |
00:45:10.160
So you think the reason he's the face of Ethereum is because he's a genius.
link |
00:45:16.960
That's interesting.
link |
00:45:21.360
It's interesting to think about that we need to create systems
link |
00:45:24.560
in which the quote unquote leaders that emerge are the geniuses in the system.
link |
00:45:33.040
I mean, that's arguably why the current state of democracy is broken
link |
00:45:36.880
is the people who are emerging as the leaders are not the most competent,
link |
00:45:40.880
are not the superstars of the system.
link |
00:45:43.200
And it seems like at least for now in the crypto world,
link |
00:45:46.000
oftentimes the leaders are the superstars.
link |
00:45:49.280
Imagine at the debate, they asked, what's the Sixth Amendment?
link |
00:45:52.800
What are the four fundamental forces in the universe?
link |
00:45:55.440
Right.
link |
00:45:56.080
What's the integral of two to the X?
link |
00:45:58.720
Yeah, I'd love to see those questions asked.
link |
00:46:01.520
And that's what I want as our leader.
link |
00:46:03.760
What's Bayes rule?
link |
00:46:07.040
Yeah, I mean, even, oh, wow, you're hurting my brain.
link |
00:46:13.360
My standard was even lower, but I would have loved to see
link |
00:46:17.520
just this basic brilliance.
link |
00:46:20.400
Like I've talked to historians.
link |
00:46:22.000
There's just these, they're not even like, they don't have a PhD
link |
00:46:25.200
or even an education in history.
link |
00:46:26.560
They're just like a Dan Carlin type character who's just like, holy shit,
link |
00:46:32.640
how did all this information get into your head?
link |
00:46:35.200
They're able to just connect Genghis Khan to the entirety of the history of the 20th century.
link |
00:46:41.760
They know everything about every single battle that happened.
link |
00:46:46.000
And they know the game of thrones of the different power plays and all that happened there.
link |
00:46:54.960
And they know the individuals and all the documents involved.
link |
00:46:58.400
And they integrate that into their regular life.
link |
00:47:01.920
It's not like they're ultra history nerds.
link |
00:47:03.840
They're just, they know this information.
link |
00:47:06.240
That's what competence looks like.
link |
00:47:07.840
Yeah.
link |
00:47:08.880
Because I've seen that with programmers too, right?
link |
00:47:10.480
That's what great programmers do.
link |
00:47:11.920
But yeah, it would be, it's really unfortunate that those kinds of people
link |
00:47:16.560
aren't emerging as our leaders.
link |
00:47:19.200
But for now, at least in the crypto world, that seems to be the case.
link |
00:47:23.120
I don't know if that always, you could imagine that in 100 years, it's not the case, right?
link |
00:47:28.640
Crypto world has one very powerful idea going for it.
link |
00:47:31.760
And that's the idea of forks, right?
link |
00:47:34.160
I mean, imagine, we'll use a less controversial example, this was actually in my joke app in
link |
00:47:46.480
2012, I was like, Barack Obama, Mitt Romney, let's let them both be president, right?
link |
00:47:50.880
Like imagine we could fork America and just let them both be president.
link |
00:47:54.320
And then the Americas could compete and people could invest in one,
link |
00:47:58.000
pull their liquidity out of one, put it in the other.
link |
00:48:00.320
You have this in the crypto world.
link |
00:48:02.240
Ethereum forks into Ethereum and Ethereum Classic.
link |
00:48:05.360
And you can pull your liquidity out of one and put it in another.
link |
00:48:08.800
And people vote with their dollars for which fork. Companies should be able to fork.
link |
00:48:16.240
I'd love to fork Nvidia.
link |
00:48:20.320
Yeah, like different business strategies and then try them out and see what works.
link |
00:48:25.440
Like even take, yeah, take comma AI that closes its source and then take one that's open source
link |
00:48:36.480
and see what works, take one that's purchased by GM and one that remains Android Renegade
link |
00:48:43.440
in all these different versions and see.
link |
00:48:45.200
The beauty of comma AI is someone can actually do that.
link |
00:48:47.840
Please take comma AI and fork it.
link |
00:48:50.480
That's right.
link |
00:48:51.040
That's the beauty of open source.
link |
00:48:52.160
So you're, I mean, we'll talk about autonomous vehicle space, but it does seem that you're
link |
00:49:01.120
really knowledgeable about a lot of different topics.
link |
00:49:03.840
So the natural question a bunch of people ask this, which is,
link |
00:49:07.680
how do you keep learning new things?
link |
00:49:09.840
Do you have like practical advice?
link |
00:49:12.320
If you were to introspect, like taking notes, allocate time,
link |
00:49:17.600
or do you just mess around and just allow your curiosity to drive?
link |
00:49:21.200
I'll write these people a self help book and I'll charge $67 for it.
link |
00:49:26.240
And I will write on the cover of the self help book.
link |
00:49:30.240
All of this advice is completely meaningless.
link |
00:49:32.480
You're going to be a sucker and buy this book anyway.
link |
00:49:34.640
And the one lesson that I hope they take away from the book
link |
00:49:38.720
is that I can't give you a meaningful answer to that.
link |
00:49:42.400
That's interesting.
link |
00:49:43.200
And let me translate that is you haven't really thought about what it is you do systematically
link |
00:49:52.560
because you could reduce it.
link |
00:49:53.760
And there's some people, I mean, I've met brilliant people that this is really
link |
00:49:58.480
clear with athletes.
link |
00:50:00.080
Some are just, you know, the best in the world at something.
link |
00:50:04.960
And they have zero interest in writing, like, a self help book,
link |
00:50:08.800
or how to master this game.
link |
00:50:11.440
And then there's some athletes who become great coaches
link |
00:50:15.440
and they love the analysis, perhaps the overall analysis.
link |
00:50:18.640
And you right now, at least at your age, which is an interesting,
link |
00:50:21.600
you're in the middle of the battle.
link |
00:50:23.040
You're like the warriors that have zero interest in writing books.
link |
00:50:27.600
So you're in the middle of the battle.
link |
00:50:28.960
So you have, yeah.
link |
00:50:30.400
This is, this is a fair point.
link |
00:50:31.600
I do think I have a certain aversion to this kind of deliberate, intentional way of living life.
link |
00:50:39.520
You eventually, the hilarity of this, especially since this is recorded,
link |
00:50:47.040
it will reveal beautifully the absurdity when you finally do publish this book.
link |
00:50:51.120
I guarantee you will.
link |
00:50:53.440
The story of comma AI, maybe it'll be a biography written about you.
link |
00:50:58.640
That'll be, that'll be better, I guess.
link |
00:51:00.240
And you might be able to learn some cute lessons if you're starting a company
link |
00:51:03.360
like comma AI from that book.
link |
00:51:05.200
But if you're asking generic questions like, how do I be good at things?
link |
00:51:10.000
Dude, I don't know.
link |
00:51:11.600
Well, learn, I mean, the interesting,
link |
00:51:13.040
Do them a lot.
link |
00:51:14.080
Do them a lot.
link |
00:51:14.720
But the interesting thing here is learning things outside of your current trajectory,
link |
00:51:21.680
which is what it feels like from an outsider's perspective.
link |
00:51:27.840
I don't know if there's advice on that, but it is an interesting curiosity.
link |
00:51:33.040
When you become really busy, you're running a company.
link |
00:51:37.680
Hard time.
link |
00:51:40.160
Yeah.
link |
00:51:41.360
But like, there's a natural inclination and trend, like just the momentum of life
link |
00:51:48.240
carries you into a particular direction of wanting to focus.
link |
00:51:50.880
And this kind of dispersion that curiosity can lead to gets harder and harder with time.
link |
00:51:57.920
Because you get really good at certain things and it sucks trying things that you're not good at,
link |
00:52:03.600
like trying to figure them out.
link |
00:52:05.040
When you do this with your live streams, you're on the fly figuring stuff out.
link |
00:52:09.840
You don't mind looking dumb.
link |
00:52:13.840
You just figured out, figured out pretty quickly.
link |
00:52:16.480
Sometimes I try things and I don't figure them out.
link |
00:52:18.320
You fail.
link |
00:52:18.960
My chess rating is like a 1400 despite putting like a couple of hundred hours in.
link |
00:52:23.120
It's pathetic.
link |
00:52:24.080
I mean, to be fair, I know that I could do it better.
link |
00:52:26.640
If I did it better, like, don't play, you know, don't play five minute games,
link |
00:52:29.680
play 15 minute games at least.
link |
00:52:31.200
Like, I know these things, but it just doesn't.
link |
00:52:34.080
It doesn't stick nicely in my knowledge stream.
link |
00:52:37.040
All right.
link |
00:52:37.360
Let's talk about comma AI.
link |
00:52:39.200
What's the mission of the company?
link |
00:52:41.920
Let's like look at the biggest picture.
link |
00:52:44.560
Oh, I have an exact statement.
link |
00:52:46.800
Solve self driving cars while delivering shippable intermediaries.
link |
00:52:50.640
So long term vision is have fully autonomous vehicles and make sure you're making money along the way.
link |
00:52:58.160
I think it doesn't really speak to money, but I can talk.
link |
00:52:59.920
I can talk about what solve self driving cars means.
link |
00:53:02.320
Solve self driving cars, of course, means you're not building a new car.
link |
00:53:07.120
You're building a person replacement that person can sit in the driver's seat and drive you anywhere a person can drive with a human or better level of safety, speed, quality, comfort.
link |
00:53:18.640
And what's the second part of that?
link |
00:53:20.240
Delivering shippable intermediaries is, well, it's a way to fund the company.
link |
00:53:24.560
That's true.
link |
00:53:25.040
But it's also a way to keep us honest.
link |
00:53:29.040
If you don't have that, it is very easy with this technology to think you are making progress when you're not.
link |
00:53:36.880
I've heard it best described on Hacker News as you can set any arbitrary milestone,
link |
00:53:46.880
meet that milestone, and still be infinitely far away from solving self driving cars.
link |
00:53:51.600
So it's hard to have, like, real deadlines when you're, like, Cruise or Waymo and you don't have revenue.
link |
00:54:02.000
Is that, I mean, is revenue essentially the thing we're talking about here?
link |
00:54:07.840
Revenue is, capitalism is based around consent.
link |
00:54:11.360
Capitalism, the way that you get revenue is a kind of real capitalism.
link |
00:54:14.240
Comma is in the real capitalism camp.
link |
00:54:16.480
There's definitely scams out there, but real capitalism is based around consent.
link |
00:54:19.520
It's based around this idea that like, if we're getting revenue, it's because we're providing at least that much value to another person.
link |
00:54:24.640
When someone buys a $1,000 comma two from us, we're providing them at least $1,000 of value, or they wouldn't buy it.
link |
00:54:29.920
Brilliant.
link |
00:54:30.640
So can you give a whirlwind overview of the products that comma AI provides throughout its history and today?
link |
00:54:37.920
I mean, yeah, the past ones aren't really that interesting.
link |
00:54:40.800
It's kind of just been refinement of the same idea.
link |
00:54:45.120
The real only product we sell today is the comma two.
link |
00:54:48.160
Which is a piece of hardware with cameras.
link |
00:54:52.320
So the comma two, I mean, you can think about it kind of like a person.
link |
00:54:56.560
You know, in future hardware will probably be even more and more person like.
link |
00:55:00.160
So it has, you know, eyes, ears, a mouth, a brain, and a way to interface with the car.
link |
00:55:08.320
Does it have consciousness?
link |
00:55:09.920
Just kidding.
link |
00:55:10.720
That was a trick question.
link |
00:55:11.920
I don't have consciousness either.
link |
00:55:13.920
Me and the comma two are the same.
link |
00:55:15.520
You're the same?
link |
00:55:16.240
I have a little more compute than it.
link |
00:55:17.600
It only has like the same compute as a bee.
link |
00:55:19.200
How interesting.
link |
00:55:19.760
A bee, you know.
link |
00:55:21.920
You're more efficient energy wise for the compute you're doing.
link |
00:55:25.520
Far more efficient energy wise.
link |
00:55:27.680
20 petaflops, 20 watts, crazy.
link |
00:55:29.600
Do you lack consciousness?
link |
00:55:30.960
Sure.
link |
00:55:31.600
Do you fear death?
link |
00:55:32.560
You do.
link |
00:55:33.360
You want immortality?
link |
00:55:34.400
Of course I fear death.
link |
00:55:35.200
Does comma AI fear death?
link |
00:55:36.480
I don't think so.
link |
00:55:37.760
I don't think so.
link |
00:55:39.360
Of course it does.
link |
00:55:40.320
It very much fears, well, fears negative loss.
link |
00:55:42.720
Oh, yeah.
link |
00:55:45.600
Okay.
link |
00:55:46.080
So comma two, when did that come out?
link |
00:55:49.120
That was a year ago?
link |
00:55:50.240
No, two.
link |
00:55:51.920
Early this year.
link |
00:55:53.440
Wow.
link |
00:55:53.920
Time, it feels, yeah.
link |
00:55:56.240
2020 feels like it's taken 10 years to get to the end.
link |
00:56:00.560
It's a long year.
link |
00:56:01.680
It's a long year.
link |
00:56:03.040
So what's the sexiest thing about comma two, feature wise?
link |
00:56:09.840
So, I mean, maybe you can also linger on, like, what is it?
link |
00:56:14.160
Like, what's its purpose?
link |
00:56:15.600
Because there's a hardware, there's a software component.
link |
00:56:18.400
You've mentioned the sensors, but also like,
link |
00:56:20.960
what is it, its features and capabilities?
link |
00:56:22.960
I think our slogan summarizes it well.
link |
00:56:25.200
Comma slogan is make driving chill.
link |
00:56:28.720
I love it.
link |
00:56:29.200
Yeah, I mean, it is, you know, if you like cruise control,
link |
00:56:34.240
imagine cruise control, but much, much more.
link |
00:56:37.840
So it can do adaptive cruise control things,
link |
00:56:40.880
which is like slow down for cars in front of it,
link |
00:56:42.880
maintain a certain speed.
link |
00:56:44.080
And it can also do lane keeping, so stay in the lane
link |
00:56:47.840
and do it better and better and better over time.
link |
00:56:49.840
That's very much machine learning based.
link |
00:56:51.680
So there's cameras, there's a driver facing camera too.
link |
00:56:58.400
What else is there?
link |
00:57:01.840
What am I thinking?
link |
00:57:02.640
So the hardware versus software.
link |
00:57:04.320
So open pilot versus the actual hardware, the device.
link |
00:57:08.880
What's, can you draw that distinction?
link |
00:57:10.400
What's one, what's the other?
link |
00:57:11.440
I mean, the hardware is pretty much a cell phone
link |
00:57:13.680
with a few additions, a cell phone with a cooling system
link |
00:57:16.880
and with a car interface connected to it.
link |
00:57:20.080
And by cell phone, you mean like Qualcomm Snapdragon?
link |
00:57:24.240
Yeah, the current hardware is a Snapdragon 821.
link |
00:57:29.120
It has a Wi Fi radio, it has an LTE radio, it has a screen.
link |
00:57:33.760
We use every part of the cell phone.
link |
00:57:35.840
And then the interface of the car is specific to the car,
link |
00:57:38.320
so you keep supporting more and more cars.
link |
00:57:41.280
Yeah, through the interface of the car, I mean,
link |
00:57:42.720
the device itself just has four CAN buses,
link |
00:57:45.120
has four CAN interfaces on it,
link |
00:57:46.640
they're connected through the USB port to the phone.
link |
00:57:48.320
And then, yeah, on those four can buses,
link |
00:57:53.200
you connect it to the car.
link |
00:57:54.320
And there's a little harness to do this.
link |
00:57:56.240
Cars are actually surprisingly similar.
link |
00:57:58.320
So CAN is the protocol by which cars communicate.
link |
00:58:01.440
And then you're able to read stuff and write stuff
link |
00:58:04.160
to be able to control the car, depending on the car.
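As a rough illustration of what reading and writing CAN looks like at the software level, here is a minimal sketch using Linux SocketCAN and the python-can library; the arbitration ID and byte layout are hypothetical, not any real car's, and this is not comma's code.

```python
import can  # pip install python-can

# Open a CAN interface (assumes a Linux SocketCAN device named can0,
# e.g. a USB car harness exposed to the host).
bus = can.interface.Bus(channel="can0", bustype="socketcan")

# Read: each frame is an arbitration ID plus up to 8 data bytes.
msg = bus.recv(timeout=1.0)
if msg is not None:
    print(f"id=0x{msg.arbitration_id:X} data={msg.data.hex()}")

# Write: the ID and byte layout are specific to each car model, which
# is why per-car definitions and a per-car harness are needed.
steer_cmd = can.Message(
    arbitration_id=0x2E4,                  # hypothetical steering command ID
    data=bytes([0x00, 0x10, 0x00, 0x00]),  # hypothetical payload
    is_extended_id=False,
)
bus.send(steer_cmd)
```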
link |
00:58:06.800
So what's the software side?
link |
00:58:08.160
What's open pilot?
link |
00:58:10.240
So I mean, open pilot is, the hardware is pretty simple
link |
00:58:12.720
compared to open pilot.
link |
00:58:13.680
Open pilot is, well, so you have a machine learning model,
link |
00:58:20.880
which, in open pilot, is a blob, it's just a blob of weights.
link |
00:58:25.440
It's not like people are like, oh, it's closed source.
link |
00:58:27.280
I'm like, what's a blob of weights?
link |
00:58:28.720
What do you expect?
link |
00:58:31.200
It's primarily neural network based.
link |
00:58:33.600
Well, open pilot is all the software kind of around
link |
00:58:36.480
that neural network, that if you have a neural network
link |
00:58:38.480
that says here's where you want to send the car,
link |
00:58:40.640
open pilot actually goes and executes all of that.
link |
00:58:44.560
It cleans up the input to the neural network,
link |
00:58:46.720
it cleans up the output and executes on it.
link |
00:58:48.880
So connects, it's the glue that connects everything together.
link |
00:58:51.680
Runs the sensors, does a bunch of calibration
link |
00:58:54.240
for the neural network, does, deals with like,
link |
00:58:57.840
if the car is on a banked road, you have to counter steer
link |
00:59:01.280
against that.
link |
00:59:01.840
And the neural network can't necessarily know that
link |
00:59:03.600
by looking at the picture.
link |
00:59:06.160
So you can do that with other sensors, and fusion,
link |
00:59:08.400
and the localizer. Open pilot also is responsible for sending
link |
00:59:12.960
the data up to our servers, so we can learn from it,
link |
00:59:16.560
logging it, recording it, running the cameras,
link |
00:59:18.960
thermally managing the device, managing the disk space
link |
00:59:22.480
on the device, managing all the resources on the device.
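A rough sketch of the shape of that glue, with all names hypothetical; this is not the actual open pilot code, just the loop being described:

```python
# Hypothetical skeleton of the loop described above. The neural network
# is one line; everything else is the "glue": calibration, fusion,
# control, logging, and housekeeping.

def control_loop(camera, car, model, calibrator, localizer, uploader):
    while True:
        frame = camera.get_frame()

        # Clean up the input: apply camera calibration so the network
        # always sees a consistently aligned view of the road.
        net_input = calibrator.transform(frame)

        # The network proposes where to send the car (a desired path).
        desired_path = model.run(net_input)

        # Clean up the output: fuse with other sensors and the localizer
        # for things the picture alone can't tell you, e.g. counter
        # steering on a banked road.
        plan = localizer.fuse(desired_path, car.read_sensors())

        # Execute: turn the plan into steering / speed commands on the car.
        car.send_controls(plan.steer, plan.accel)

        # Housekeeping: log and upload data for training, manage
        # thermals and disk space on the device.
        uploader.log(frame, desired_path, plan)
```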
link |
00:59:24.560
So what, since we last spoke, I don't remember when,
link |
00:59:27.440
maybe a year ago, maybe a little bit longer,
link |
00:59:30.000
how has open pilot improved?
link |
00:59:32.880
We did exactly what I promised you.
link |
00:59:34.640
I promised you that by the end of the year,
link |
00:59:36.560
we would be able to remove the lanes.
link |
00:59:40.320
The lateral policy is now almost completely end to end.
link |
00:59:45.840
You can turn the lanes off and it will drive,
link |
00:59:48.320
drive slightly worse on the highway if you turn the lanes off,
link |
00:59:50.880
but you can turn the lanes off and it will drive well
link |
00:59:54.160
trained completely end to end on user data.
link |
00:59:57.280
And this year, we hope to do the same for the
link |
00:59:58.720
longitudinal policy.
link |
00:59:59.920
So that's the interesting thing is you're not doing,
link |
01:00:03.440
you don't appear to be, you can correct me,
link |
01:00:05.200
you don't appear to be doing lane detection
link |
01:00:08.640
or lane marking detection or kind of the segmentation task
link |
01:00:12.320
or any kind of object detection task.
link |
01:00:15.040
You're doing what's traditionally more called
link |
01:00:17.440
like end to end learning.
link |
01:00:19.360
So it's trained on actual behavior of drivers
link |
01:00:24.080
when they're driving the car manually.
link |
01:00:27.600
And this is hard to do.
link |
01:00:29.280
You know, it's not supervised learning.
link |
01:00:32.080
Yeah, but so the nice thing is there's a lot of data,
link |
01:00:34.640
so it's hard and easy, right?
link |
01:00:37.760
We have a lot of high quality data, right?
link |
01:00:39.840
Like more than you need, in a sense.
link |
01:00:41.600
Well, we have way more than we need.
link |
01:00:43.280
We've way more data than we need.
link |
01:00:44.800
I mean, it's an interesting question, actually,
link |
01:00:46.960
because in terms of amount, you have more than you need.
link |
01:00:51.440
But the, you know, driving is full of edge cases.
link |
01:00:54.160
So how do you select the data you train on?
link |
01:00:58.160
I think this is an interesting open question.
link |
01:01:00.480
Like, what's the cleverest way to select data?
link |
01:01:04.080
That's the question Tesla is probably working on.
link |
01:01:07.520
That's, I mean, the entirety of machine learning can be,
link |
01:01:09.760
they don't seem to really care.
link |
01:01:10.880
They just kind of select data.
link |
01:01:12.160
But I feel like that if you want to solve,
link |
01:01:14.720
if you want to create intelligence systems,
link |
01:01:16.080
you have to pick data well, right?
link |
01:01:18.720
And so would you have any hints, ideas of how to do it well?
link |
01:01:22.800
So in some ways, that is the definition I like
link |
01:01:26.160
of reinforcement learning versus supervised learning.
link |
01:01:28.320
In supervised learning, the weights depend on the data, right?
link |
01:01:34.160
And this is obviously true, but in reinforcement learning,
link |
01:01:38.160
the data depends on the weights.
link |
01:01:40.320
Yeah.
link |
01:01:40.800
Right?
link |
01:01:41.280
And actually, both ways.
link |
01:01:42.800
That's poetry.
link |
01:01:43.920
So how does it know what data to train on?
link |
01:01:46.160
Well, let it pick.
link |
01:01:47.360
We're not there yet, but that's the eventual goal.
link |
01:01:49.360
So you're thinking this almost like a reinforcement learning
link |
01:01:52.160
framework.
link |
01:01:53.120
We're going to do RL on the world.
link |
01:01:55.280
Every time a car makes a mistake, user disengages,
link |
01:01:58.000
we train on that and do RL on the world.
link |
01:02:00.000
Ship out a new model.
link |
01:02:00.800
That's an epoch, right?
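A toy sketch of that outer loop, under the framing given here (disengagements as the signal, each shipped model as an "epoch"); the names are made up, not comma's:

```python
# Toy version of the loop described above: the data the fleet produces
# depends on the current weights, the weights are then updated on that
# data, and one shipped model is one "epoch" of the outer loop.

def fleet_training_loop(model, fleet):
    while True:
        # Drive with the current model; users disengage when it errs.
        logs = fleet.collect_drives(model)

        # The most informative samples are where the model was wrong:
        # segments around disengagements, plus normal driving for balance.
        mistakes = [seg for seg in logs if seg.disengaged]
        batch = mistakes + fleet.sample_normal_driving(logs)

        # Supervised update toward what the human driver actually did.
        model.train(batch)

        # Ship the new model over the air; the next round of data now
        # depends on the new weights. One pass here = one epoch.
        fleet.ship(model)
```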
link |
01:02:03.200
And for now, you're not doing the Elon style promising
link |
01:02:08.320
that it's going to be fully autonomous.
link |
01:02:09.600
You really are sticking to level two.
link |
01:02:12.320
And it's supposed to be supervised.
link |
01:02:15.360
It is definitely supposed to be supervised.
link |
01:02:16.720
And we enforce the fact that it's supervised.
link |
01:02:19.680
We look at our rate of improvement in disengagement.
link |
01:02:23.440
OpenPilot now has an unplanned disengagement
link |
01:02:25.520
about every 100 miles.
link |
01:02:26.560
This is up from 10 miles, like maybe a year ago.
link |
01:02:36.080
Yeah.
link |
01:02:36.560
So maybe we've seen 10x improvement in a year,
link |
01:02:38.320
but 100 miles is still a far cry from the 100,000 you're
link |
01:02:42.560
going to need.
link |
01:02:43.680
So you're going to somehow need to get three more 10xs in there.
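Spelled out, the arithmetic behind "three more 10xs":

```latex
% From one unplanned disengagement per 100 miles to one per 100,000:
\frac{100{,}000 \text{ miles}}{100 \text{ miles}} = 1000 = 10 \times 10 \times 10
% i.e. three more factor-of-ten improvements on top of the one already seen.
```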
link |
01:02:49.680
And what's your intuition?
link |
01:02:52.160
You're basically hoping that there's exponential
link |
01:02:54.320
improvement baked into the cake
link |
01:02:56.240
somewhere.
link |
01:02:56.720
Well, that's even, I mean, 10x improvement,
link |
01:02:58.240
that's already assuming exponential, right?
link |
01:03:00.320
There's definitely exponential improvement.
link |
01:03:02.400
And I think when Elon talks about exponential,
link |
01:03:04.160
like these things, these systems are going to exponentially
link |
01:03:06.880
improve.
link |
01:03:07.760
Just exponential doesn't mean you're getting 100 gigahertz
link |
01:03:10.880
processors tomorrow, right?
link |
01:03:13.120
Like it's going to still take a while because the gap
link |
01:03:16.000
between even our best system and humans is still large.
link |
01:03:20.080
So that's an interesting distinction to draw.
link |
01:03:22.160
So if you look at the way Tesla is approaching the problem,
link |
01:03:24.720
and the way you're approaching the problem, which is very
link |
01:03:29.040
different than the rest of the self driving car world.
link |
01:03:32.560
So let's put them aside. You're treating most of the driving
link |
01:03:35.760
tasks as a machine learning problem.
link |
01:03:37.440
And the way Tesla is approaching it is with the
link |
01:03:39.520
multitask learning, where you break the task of driving
link |
01:03:43.120
into hundreds of different tasks.
link |
01:03:45.280
And you have this multi headed neural network that's very
link |
01:03:48.880
good at performing each task.
link |
01:03:51.520
And there's presumably something on top that's stitching
link |
01:03:55.120
stuff together in order to make control decisions, policy
link |
01:04:00.080
decisions about how you move the car.
link |
01:04:02.080
But what that allows you, there's a brilliance to this
link |
01:04:04.320
because it allows you to master each task, like lane
link |
01:04:09.040
detection, stop sign detection, the traffic light
link |
01:04:13.920
detection, drivable area segmentation,
link |
01:04:17.120
you know, vehicle, bicycle pedestrian detection.
link |
01:04:21.280
There's some localization tasks in there.
link |
01:04:24.800
Also predicting, like, yeah, predicting how the entities
link |
01:04:32.240
in the scene are going to move.
link |
01:04:33.840
Like everything is basically a machine learning task
link |
01:04:36.240
where there's a classification, segmentation, prediction.
link |
01:04:39.520
And it's nice because you can have this entire engine,
link |
01:04:44.240
data engine that's mining for edge cases for each one
link |
01:04:48.480
of these tasks.
link |
01:04:49.280
And you can have people, like engineers, that are
link |
01:04:51.600
basically masters of that task.
link |
01:04:53.600
They become the best person in the world at, as you talk
link |
01:04:56.880
about the cone guy for Waymo.
link |
01:04:59.520
Yeah, the good old cone guy.
link |
01:05:00.960
They become the best person in the world at cone
link |
01:05:05.440
detection.
link |
01:05:07.120
So that's a compelling notion from a supervised
link |
01:05:09.680
learning perspective, automating much of the process of
link |
01:05:14.720
edge case discovery and retraining neural network
link |
01:05:17.200
for each of the individual perception tasks.
link |
01:05:19.760
And then you're looking at the machine learning in a
link |
01:05:22.080
more holistic way, basically doing end to end
link |
01:05:26.240
learning on the driving tasks, supervised, trained
link |
01:05:30.080
on the data of the actual driving of people that
link |
01:05:34.480
use comma AI, like actual human drivers do manual
link |
01:05:37.920
control, plus the moments of disengagement that maybe
link |
01:05:43.920
with some labeling could indicate the failure of
link |
01:05:46.640
the system.
link |
01:05:47.200
So you have a huge amount of data for positive
link |
01:05:51.680
control of the vehicle, like successful control of
link |
01:05:54.000
the vehicle, both maintaining the lane as I think
link |
01:05:58.800
you're also working on longitudinal control of
link |
01:06:00.960
the vehicle, and then failure cases where the
link |
01:06:04.000
vehicle does something wrong that needs disengagement.
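To make the contrast concrete, here is a schematic, PyTorch-style sketch of the two shapes being compared; it is illustrative only and not either company's architecture:

```python
import torch.nn as nn

# Multitask style (as described above): one shared backbone, many task
# heads, each trained on its own labels, with hand-written logic
# downstream that stitches the outputs into a driving decision.
class MultiTaskPerception(nn.Module):
    def __init__(self, backbone, feat_dim=512):
        super().__init__()
        self.backbone = backbone
        self.lane_head = nn.Linear(feat_dim, 64)     # lane geometry
        self.light_head = nn.Linear(feat_dim, 4)     # traffic light state
        self.object_head = nn.Linear(feat_dim, 128)  # vehicles, pedestrians, cones

    def forward(self, image):
        f = self.backbone(image)
        return self.lane_head(f), self.light_head(f), self.object_head(f)


# End-to-end style (as described above): one network from pixels
# straight to a desired path, trained on logged human driving and on
# the moments around disengagements.
class EndToEndPolicy(nn.Module):
    def __init__(self, backbone, feat_dim=512):
        super().__init__()
        self.backbone = backbone
        self.path_head = nn.Linear(feat_dim, 33 * 2)  # predicted path points

    def forward(self, image):
        return self.path_head(self.backbone(image))
```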
link |
01:06:08.240
So like what, why do you think you're right and
link |
01:06:11.440
Tesla is wrong on this?
link |
01:06:14.240
And do you think you'll come around the Tesla way?
link |
01:06:17.440
Do you think Tesla will come around to your way?
link |
01:06:21.280
If you were to start a chess engine company, would
link |
01:06:23.920
you hire a bishop guy?
link |
01:06:26.000
See, this is Monday morning quarterbacking,
link |
01:06:29.760
but, yes, probably.
link |
01:06:35.840
Oh, our Rook guy.
link |
01:06:37.280
Oh, we stole the Rook guy from that company.
link |
01:06:39.280
Oh, we're going to have real good Rooks.
link |
01:06:40.800
Well, there's not many pieces, right?
link |
01:06:43.840
You can, there's not many guys and gals to hire.
link |
01:06:48.640
You just have a few that work in the bishop, a few
link |
01:06:51.200
that work in the Rook.
link |
01:06:52.480
But is that not ludicrous today to think about in
link |
01:06:55.280
the world of AlphaZero?
link |
01:06:56.560
But AlphaZero is a chess game, so the fundamental
link |
01:07:00.160
question is how hard is driving compared to chess?
link |
01:07:04.240
Because so long term, end to end will be the right solution.
link |
01:07:10.480
The question is how many years away is that?
link |
01:07:13.280
End to end is going to be the only solution for level five.
link |
01:07:15.680
It's the only way we get there.
link |
01:07:17.040
Of course, and of course Tesla is going to come
link |
01:07:18.720
around to my way.
link |
01:07:19.520
And if you're a Rook guy out there, I'm sorry.
link |
01:07:22.800
The cone guy.
link |
01:07:24.800
I don't know.
link |
01:07:25.280
We're going to specialize each task.
link |
01:07:26.720
We're going to really understand Rook placement.
link |
01:07:29.040
Yeah, I understand the intuition you have.
link |
01:07:31.920
I mean, that is a very compelling notion that we
link |
01:07:36.880
can learn the task end to end.
link |
01:07:38.960
Like the same compelling notion you might have for
link |
01:07:40.800
natural language conversation.
link |
01:07:42.400
But I'm not sure because one thing you sneaked in there
link |
01:07:48.800
is the assertion that it's impossible to get to level
link |
01:07:52.400
five without this kind of approach. I don't know if
link |
01:07:55.760
that's obvious.
link |
01:07:56.960
I don't know if that's obvious either.
link |
01:07:58.160
I don't actually mean that.
link |
01:08:01.200
I think that it is much easier to get to level five
link |
01:08:04.240
with an end to end approach.
link |
01:08:05.520
I think that the other approach is doable.
link |
01:08:08.800
But the magnitude of the engineering challenge
link |
01:08:11.280
may exceed what humanity is capable of.
link |
01:08:13.600
So, but what do you think of the Tesla data engine
link |
01:08:17.760
approach, which to me is an active learning task.
link |
01:08:20.960
It's kind of fascinating.
link |
01:08:22.320
It's breaking it down into these multiple tasks
link |
01:08:25.520
and mining their data constantly for like edge cases
link |
01:08:29.360
for these different tasks.
link |
01:08:30.240
Yeah, but the tasks themselves are not being learned.
link |
01:08:32.240
This is feature engineering.
link |
01:08:35.760
I mean, it's a higher abstraction level of feature
link |
01:08:41.040
engineering for the different tasks.
link |
01:08:43.120
It's task engineering in a sense.
link |
01:08:44.720
It's slightly better feature engineering,
link |
01:08:46.720
but it's still fundamentally is feature engineering.
link |
01:08:49.200
And if anything about the history of AI has taught us
link |
01:08:51.760
anything, it's that feature engineering approaches
link |
01:08:54.560
will always be replaced and lose to end to end.
link |
01:08:57.600
Now, to be fair, I cannot really make promises on timelines,
link |
01:09:02.000
but I can say that when you look at the code for Stockfish
link |
01:09:05.680
and the code for AlphaZero,
link |
01:09:06.960
one is a lot shorter than the other.
link |
01:09:09.040
A lot more elegant and required a lot less
link |
01:09:10.640
programmer hours to write.
link |
01:09:11.680
Yeah, but there was a lot more murder of bad agents
link |
01:09:21.680
on the AlphaZero side.
link |
01:09:24.800
By murder, I mean agents that played a game
link |
01:09:29.280
and failed miserably.
link |
01:09:30.480
Yeah.
link |
01:09:31.520
Oh, oh.
link |
01:09:32.240
In simulation, that failure is less costly.
link |
01:09:34.800
Yeah.
link |
01:09:35.520
In real world, it's...
link |
01:09:37.280
Do you mean in practice,
link |
01:09:38.240
like AlphaZero has lost games miserably?
link |
01:09:40.160
No.
link |
01:09:41.200
Oh, I haven't seen that.
link |
01:09:43.200
No, but I know, but the requirement for AlphaZero
link |
01:09:46.800
is to be able to like evolution, human evolution,
link |
01:09:51.360
not human evolution,
link |
01:09:52.160
biological evolution of life on earth from the origin of life
link |
01:09:56.160
has murdered trillions upon trillions of organisms
link |
01:10:00.640
on the path to us humans.
link |
01:10:02.960
So the question is, can we stitch together a human like object
link |
01:10:06.880
without having to go through the entirety process of evolution?
link |
01:10:09.760
Well, no, but do the evolution in simulation?
link |
01:10:11.840
Yeah, that's the question.
link |
01:10:12.720
Can we simulate?
link |
01:10:13.360
So do you ever sense that it's possible to simulate some aspect of it?
link |
01:10:16.080
MuZero is exactly this.
link |
01:10:18.080
MuZero is the solution to this.
link |
01:10:21.200
MuZero, I think, is going to be looked back
link |
01:10:23.840
as the canonical paper.
link |
01:10:25.040
And I don't think deep learning is everything.
link |
01:10:26.720
I think that there's still a bunch of things missing to get there.
link |
01:10:29.280
But MuZero, I think, is going to be looked back
link |
01:10:31.360
as the kind of cornerstone paper of this whole deep learning era.
link |
01:10:36.960
And MuZero is the solution to self driving cars.
link |
01:10:39.440
You have to make a few tweaks to it.
link |
01:10:41.040
But MuZero does effectively that.
link |
01:10:42.640
It does those rollouts and those murdering in a learned simulator
link |
01:10:47.440
in a learned dynamics model.
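A compressed, toy paraphrase of the MuZero idea being referenced: planning happens by rolling out a learned dynamics model in latent space rather than a hand-built simulator. The stubs below stand in for what are neural networks in the actual paper.

```python
import random

# MuZero, roughly: three learned functions replace a hand-built simulator.
#   h(observation)      -> latent state            (representation)
#   g(latent, action)   -> next latent, reward     (learned dynamics)
#   f(latent)           -> policy, value           (prediction)
# Here they are trivial stubs, just to show the shape of the rollout.

def h(observation):
    return tuple(observation)

def g(latent, action):
    return latent + (action,), 0.0

def f(latent):
    return [0.25, 0.25, 0.25, 0.25], 0.0

def imagined_rollout_value(observation, depth=5):
    """Plan by 'imagining' futures inside the learned model: the failed
    rollouts happen in latent space, not on a real road."""
    latent = h(observation)
    total = 0.0
    for _ in range(depth):
        policy, _ = f(latent)
        action = random.choices(range(len(policy)), weights=policy)[0]
        latent, reward = g(latent, action)
        total += reward
    _, terminal_value = f(latent)
    return total + terminal_value
```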
link |
01:10:50.000
It's interesting.
link |
01:10:50.560
It doesn't get enough love.
link |
01:10:51.440
I was blown away when I read that paper.
link |
01:10:54.080
I'm like, you know, OK, I've always said at comma,
link |
01:10:56.880
I'm going to sit and I'm going to wait for the solution
link |
01:10:58.320
to self driving cars to come along.
link |
01:11:00.160
This year I saw it.
link |
01:11:00.880
It's MuZero.
link |
01:11:01.600
No.
link |
01:11:04.960
So sit back and let the winnings roll in.
link |
01:11:08.080
So your sense, just to elaborate a little bit the link on the topic,
link |
01:11:12.560
your sense is neural networks will solve driving.
link |
01:11:15.280
Yes.
link |
01:11:15.760
Like we don't need anything else.
link |
01:11:17.360
I think the same way chess was, maybe chess and maybe Google
link |
01:11:21.040
are the pinnacle of like search algorithms
link |
01:11:24.000
and things that look kind of like A star.
link |
01:11:27.600
The pinnacle of this era is going to be self driving cars.
link |
01:11:33.680
But on the path that you have to deliver products,
link |
01:11:37.120
and it's possible that the path to full self driving cars will take decades.
link |
01:11:43.360
I doubt it.
link |
01:11:44.000
So how long would you put on it?
link |
01:11:46.800
Like what?
link |
01:11:47.440
What are we?
link |
01:11:48.480
You're chasing it.
link |
01:11:49.680
Tesla is chasing it.
link |
01:11:52.400
What are we talking about?
link |
01:11:53.360
Five years, 10 years, 50 years?
link |
01:11:54.800
Let's say in the 2020s.
link |
01:11:56.960
In the 2020s.
link |
01:11:58.080
The later part of the 2020s.
link |
01:12:02.480
With the neural network.
link |
01:12:04.480
That would be nice to see.
link |
01:12:05.600
And on the path to that, you're delivering your products,
link |
01:12:08.400
which is a nice L2 system.
link |
01:12:09.840
That's what Tesla is doing, a nice L2 system.
link |
01:12:12.160
It just gets better every time.
link |
01:12:13.360
The only difference between L2 and the other levels
link |
01:12:15.920
is who takes liability.
link |
01:12:16.880
And I'm not a liability guy.
link |
01:12:18.160
I don't want to take liability.
link |
01:12:19.200
I'm going to level two forever.
link |
01:12:21.920
Now, on that little transition, I mean,
link |
01:12:26.320
how do you make the transition work?
link |
01:12:28.240
Is this where driver sensing comes in?
link |
01:12:31.440
Like how do you make the, because you said 100 miles,
link |
01:12:34.960
like, is there some sort of human factor psychology thing
link |
01:12:40.400
where people start to over trust the system?
link |
01:12:42.320
All those kinds of effects.
link |
01:12:43.920
Once it gets better and better and better and better,
link |
01:12:45.920
they get lazier and lazier and lazier.
link |
01:12:48.560
Is that, like, how do you get that transition right?
link |
01:12:51.600
First off, our monitoring is already adaptive.
link |
01:12:53.520
Our monitoring is already scene adaptive.
link |
01:12:55.760
Driver monitoring.
link |
01:12:56.960
Is this the camera that's looking at the driver?
link |
01:12:59.040
You have an infrared camera in the...
link |
01:13:01.200
Our policy for how we enforce the driver monitoring
link |
01:13:04.800
is scene adaptive.
link |
01:13:05.920
What's that mean?
link |
01:13:06.800
Well, for example, in one of the extreme cases,
link |
01:13:11.120
if the car is not moving,
link |
01:13:13.120
we do not actively enforce driver monitoring.
link |
01:13:16.080
If you are going through like a 45 mile an hour road
link |
01:13:22.160
with lights and stop signs and potentially pedestrians,
link |
01:13:26.320
we enforce a very tight driver monitoring policy.
link |
01:13:28.960
If you are alone on a perfectly straight highway,
link |
01:13:32.320
and it's all machine learning, none of that is hand coded.
link |
01:13:34.960
Actually, the stopped case is hand coded, but...
link |
01:13:36.960
So there's some kind of machine learning estimation of risk?
link |
01:13:40.400
Yes.
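A toy illustration of what a scene-adaptive policy could look like, with made-up thresholds (this is not comma's actual policy): the allowed eyes-off-road time shrinks as the estimated scene risk grows, and the nagging disappears entirely when the car is stopped.

```python
# Toy scene-adaptive driver monitoring policy; all numbers are made up.

def allowed_eyes_off_road_seconds(speed_mph, scene_risk):
    """scene_risk in [0, 1], e.g. a model's estimate that pedestrians,
    intersections, or cross traffic are nearby."""
    if speed_mph < 1.0:
        return float("inf")   # car stopped: don't enforce at all
    relaxed = 6.0             # empty, straight highway
    strict = 1.5              # busy 45 mph road with lights and pedestrians
    return strict + (relaxed - strict) * (1.0 - scene_risk)

def should_alert(eyes_off_road_s, speed_mph, scene_risk):
    return eyes_off_road_s > allowed_eyes_off_road_seconds(speed_mph, scene_risk)
```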
link |
01:13:40.960
Yeah, I mean, I've always been a huge fan of that.
link |
01:13:43.920
That's difficult to do. Every step in that direction
link |
01:13:51.360
is a worthwhile step to take.
link |
01:13:52.960
It might be difficult to do really well.
link |
01:13:54.560
Like, us humans are able to estimate risk pretty damn well.
link |
01:13:57.360
Whatever the hell that is,
link |
01:13:58.960
that feels like one of the nice features of us humans.
link |
01:14:02.800
Because we humans are really good drivers
link |
01:14:06.160
when we're really tuned in.
link |
01:14:08.320
And we're good at estimating risk,
link |
01:14:10.160
like when are we supposed to be tuned in?
link |
01:14:12.160
Yeah.
link |
01:14:13.200
And people are like,
link |
01:14:14.800
oh, well, why would you ever make the driver monitoring policy
link |
01:14:17.600
less aggressive?
link |
01:14:18.400
Why would you always not keep it at its most aggressive?
link |
01:14:21.200
Because then people are just going to get fatigued from it.
link |
01:14:23.200
Yeah, well, they get annoyed.
link |
01:14:24.800
You know,
link |
01:14:26.000
when they get annoyed, you want the experience to be pleasant.
link |
01:14:30.800
Obviously, I want the experience to be pleasant,
link |
01:14:32.400
but even just from a straight up safety perspective,
link |
01:14:35.920
if you alert people when they look around
link |
01:14:38.640
and they're like, why is this thing alerting me?
link |
01:14:40.880
There's nothing I could possibly hit right now.
link |
01:14:42.880
People will just learn to tune it out.
link |
01:14:45.040
People will just learn to tune it out,
link |
01:14:46.720
to put weights on the steering wheel,
link |
01:14:47.920
to do whatever, to overcome it.
link |
01:14:49.760
And remember that you're always part of this adaptive system.
link |
01:14:53.440
So all I can really say about how this scales going forward
link |
01:14:56.800
is, yeah, something we have to monitor for.
link |
01:14:59.200
Ooh, we don't know.
link |
01:15:00.000
This is a great psychology experiment at scale.
link |
01:15:01.920
Like, we'll see.
link |
01:15:03.040
Yeah, it's fascinating.
link |
01:15:03.840
Track it.
link |
01:15:04.480
And making sure you have a good understanding of attention
link |
01:15:08.960
is a very key part of that psychology problem.
link |
01:15:11.280
Yeah, I think you and I probably have a different,
link |
01:15:14.000
come to it differently.
link |
01:15:15.040
But to me, it's a fascinating psychology problem
link |
01:15:19.520
to explore something much deeper than just driving.
link |
01:15:21.920
It's such a nice way to explore human attention
link |
01:15:26.480
and human behavior, which is why, again,
link |
01:15:30.080
we've probably both criticized Mr. Elon Musk
link |
01:15:33.680
on this one topic from different avenues.
link |
01:15:38.000
So both offline and online, I had little chats with Elon.
link |
01:15:44.000
Like, I love human beings.
link |
01:15:45.840
As a computer vision problem, as an AI problem,
link |
01:15:49.360
it's fascinating.
link |
01:15:50.160
He wasn't so much interested in that problem.
link |
01:15:54.400
It's like, in order to solve driving, the whole point
link |
01:15:57.360
is you want to remove the human from the picture.
link |
01:16:01.120
And it seems like you can't do that quite yet.
link |
01:16:04.000
Eventually, yes.
link |
01:16:05.040
But you can't quite do that yet.
link |
01:16:07.840
So this is the moment where, and you can't yet say,
link |
01:16:12.160
I told you so, to Tesla, but it's getting there
link |
01:16:17.600
because I don't know if you've seen this.
link |
01:16:19.120
There's some reporting that they're, in fact,
link |
01:16:21.040
starting to do driver monitoring.
link |
01:16:23.040
Yeah, they ship the model in shadow mode.
link |
01:16:26.160
With, though, I believe only a visible light camera.
link |
01:16:29.040
It might even be fisheye.
link |
01:16:31.680
It's like a low resolution.
link |
01:16:33.120
Low resolution visible light.
link |
01:16:34.640
I mean, to be fair, that's what we have in the Eon as well.
link |
01:16:37.040
Our last generation product.
link |
01:16:38.720
This is the one area where I can say our hardware's ahead of Tesla.
link |
01:16:42.000
The rest of our hardware is way, way behind,
link |
01:16:43.680
but our driver monitoring camera.
link |
01:16:45.120
So you think, I think on the Third Row Tesla podcast,
link |
01:16:50.080
or somewhere else, I've heard you say that obviously,
link |
01:16:53.760
eventually, they're going to have driver monitoring.
link |
01:16:56.320
I think what I've said is Elon will definitely ship
link |
01:16:58.880
driver monitoring before he ships level five.
link |
01:17:00.880
Level three, four, level five.
link |
01:17:01.840
And I'm willing to bet 10 grand on that.
link |
01:17:03.280
And you bet 10 grand on that.
link |
01:17:06.160
I mean, now I know not to take the bet,
link |
01:17:07.440
but before, maybe someone would have thought,
link |
01:17:08.800
I should have got my money in.
link |
01:17:09.840
Yeah.
link |
01:17:10.640
It's an interesting bet.
link |
01:17:12.080
I think you're right.
link |
01:17:15.520
It's actually interesting on a human level, because he's made the decision.
link |
01:17:23.520
Like he said that driver monitoring is the wrong way to go.
link |
01:17:26.800
But you have to think of as a human, as a CEO,
link |
01:17:30.240
I think that's the right thing to say when...
link |
01:17:37.040
Sometimes you have to say things publicly
link |
01:17:39.280
that are different than what you actually believe,
link |
01:17:40.960
because when you're producing a large number of vehicles,
link |
01:17:44.080
and the decision was made not to include the camera,
link |
01:17:46.960
like what are you supposed to say?
link |
01:17:48.240
Yeah.
link |
01:17:48.960
Like, our cars don't have the thing that I think is right to have.
link |
01:17:53.040
It's an interesting thing, but like on the other side,
link |
01:17:56.640
as a CEO, I mean, something you could probably speak to
link |
01:17:59.040
as a leader. I think about, as a human,
link |
01:18:04.080
to publicly change your mind on something.
link |
01:18:06.160
How hard is that?
link |
01:18:07.120
Well, especially when assholes like George Hotz say,
link |
01:18:09.840
I told you so.
link |
01:18:12.160
All I will say is I am not a leader,
link |
01:18:14.480
and I am happy to change my mind.
link |
01:18:17.040
You think Elon will?
link |
01:18:20.480
Yeah, I do.
link |
01:18:22.320
I think he'll come up with a good way
link |
01:18:24.160
to make it psychologically okay for him.
link |
01:18:27.200
Well, it's such an important thing, man,
link |
01:18:29.520
especially for a first principles thinker,
link |
01:18:31.280
because he made a decision that driver monitoring
link |
01:18:34.640
is not the right way to go.
link |
01:18:35.600
And I could see that decision,
link |
01:18:37.440
and I could even make that decision.
link |
01:18:39.040
Like, I was on the fence too.
link |
01:18:43.600
Driver monitoring is such an obvious,
link |
01:18:47.200
simple solution to the problem of attention.
link |
01:18:49.840
It's not obvious to me that just by putting a camera there,
link |
01:18:52.720
you solve things.
link |
01:18:54.080
You have to create an incredible, compelling experience,
link |
01:18:58.880
just like you're talking about.
link |
01:19:00.880
I don't know if it's easy to do that.
link |
01:19:03.120
It's not at all easy to do that, in fact, I think.
link |
01:19:05.280
So, as a creator of a car that's trying to create a product
link |
01:19:10.960
that people love, which is what Tesla tries to do.
link |
01:19:13.280
Right?
link |
01:19:14.240
It's not obvious to me that as a design decision,
link |
01:19:18.320
whether adding a camera is a good idea.
link |
01:19:20.960
From a safety perspective either,
link |
01:19:23.200
in the human factors community,
link |
01:19:25.200
everybody says that you should obviously have
link |
01:19:27.920
driver sensing, driver monitoring.
link |
01:19:29.920
But that's like saying it's obvious as parents
link |
01:19:36.960
you shouldn't let your kids go out at night.
link |
01:19:39.920
But okay.
link |
01:19:42.000
But they're still going to find ways to do drugs.
link |
01:19:47.600
Yeah.
link |
01:19:48.320
You have to also be good parents.
link |
01:19:49.920
So, it's much more complicated than just you need
link |
01:19:53.120
to have driver monitoring.
link |
01:19:54.320
I totally disagree. Okay, if you have a camera there,
link |
01:19:58.720
and the camera is watching the person,
link |
01:20:00.320
but never throws an alert, they'll never think about it.
link |
01:20:03.600
Right?
link |
01:20:04.480
The driver monitoring policy that you choose to,
link |
01:20:08.480
how you choose to communicate with the user,
link |
01:20:10.320
is entirely separate from the data collection perspective.
link |
01:20:14.320
Right.
link |
01:20:15.120
Right.
link |
01:20:15.920
So, there's one thing to say,
link |
01:20:22.480
tell your teenager they can't do something.
link |
01:20:24.480
There's another thing to gather the data.
link |
01:20:27.040
So, you can make informed decisions, that's really interesting.
link |
01:20:29.280
But you have to make that, that's the interesting thing
link |
01:20:32.000
about cars.
link |
01:20:33.520
But it's even true with comma AI, where you don't have to manufacture
link |
01:20:37.600
the thing into the car: you have to make a decision
link |
01:20:40.080
that anticipates the right strategy long term.
link |
01:20:44.160
So, you have to start collecting the data
link |
01:20:46.640
and start making decisions.
link |
01:20:47.760
It started three years ago.
link |
01:20:50.000
I believe that we have the best driver monitoring
link |
01:20:52.400
solution in the world.
link |
01:20:53.200
I think that when you compare it to Supercruise,
link |
01:20:56.080
it's the only other one that I really know that shipped,
link |
01:20:57.840
and ours is better.
link |
01:20:59.360
What do you like and not like about Supercruise?
link |
01:21:04.160
I mean, I had a few.
link |
01:21:06.080
Supercruise, the sun would be shining through the window,
link |
01:21:10.240
would blind the camera, and it would say I wasn't paying
link |
01:21:12.560
attention when I was looking completely straight.
link |
01:21:14.560
I couldn't reset the attention with a steering wheel,
link |
01:21:17.120
touch, and Supercruise would disengage.
link |
01:21:19.120
Like, I was communicating to the car, I'm like,
link |
01:21:21.120
look, I'm here, I'm paying attention.
link |
01:21:23.280
Why are you really going to force me to disengage?
link |
01:21:25.360
And it did.
link |
01:21:27.520
So, it's a constant conversation with the user,
link |
01:21:30.640
and yeah, there's no way to ship a system like this
link |
01:21:32.800
if you can't OTA.
link |
01:21:34.400
We're shipping a new one every month.
link |
01:21:35.840
Sometimes we balance it with our users on Discord.
link |
01:21:38.800
Sometimes we make the driver monitoring a little more
link |
01:21:41.040
aggressive and people complain.
link |
01:21:42.400
Sometimes they don't.
link |
01:21:43.600
We want it to be as aggressive as possible
link |
01:21:45.760
where people don't complain and it doesn't feel intrusive.
link |
01:21:47.840
So, being able to update the system over the air
link |
01:21:49.920
is an essential component.
link |
01:21:51.520
I mean, that's probably, to me, you mentioned,
link |
01:21:55.600
I mean, to me, that is the biggest innovation of Tesla,
link |
01:21:59.840
that it made it, people realize that over the air updates
link |
01:22:03.840
is essential.
link |
01:22:04.800
Yeah.
link |
01:22:06.640
I mean, was that not obvious from the iPhone?
link |
01:22:09.040
The iPhone was the first real product that OTA'd, I think.
link |
01:22:11.760
Was it actually, that's brilliant, you're right.
link |
01:22:14.000
I mean, the game consoles used to not, right?
link |
01:22:15.840
The game consoles were maybe the second thing that did.
link |
01:22:17.840
Well, I didn't really think about it.
link |
01:22:20.080
One of the amazing features of a smartphone
link |
01:22:23.200
isn't just, like, the touchscreen isn't the thing.
link |
01:22:26.880
It's the ability to constantly update.
link |
01:22:29.840
Yeah, it gets better.
link |
01:22:31.120
It gets better.
link |
01:22:34.320
I love my iOS 14.
link |
01:22:35.600
Yeah.
link |
01:22:37.440
Well, one thing that I probably disagree with you
link |
01:22:40.800
on, on driver monitoring is you said that it's easy.
link |
01:22:44.720
I mean, you tend to say stuff is easy.
link |
01:22:48.640
I'm sure, I guess you said it's easy
link |
01:22:52.640
relative to the external perception problem there.
link |
01:22:58.160
Can you elaborate why you think it's easy?
link |
01:23:00.880
Feature engineering works for driver monitoring.
link |
01:23:03.440
Feature engineering does not work for the external.
link |
01:23:05.840
So human faces are not, human faces
link |
01:23:09.760
and the movement of human faces and head and body
link |
01:23:13.520
is not as variable as the external environment
link |
01:23:16.080
is your intuition.
link |
01:23:17.200
Yes, and there's another big difference as well.
link |
01:23:20.080
Your reliability of a driver monitoring system
link |
01:23:22.400
doesn't actually need to be that high.
link |
01:23:24.240
The uncertainty, if you have something that's detecting
link |
01:23:27.120
whether the human's paying attention and it only works 92%
link |
01:23:29.680
of the time, you're still getting almost all the benefit of that
link |
01:23:32.640
because the human, like, you're training the human, right?
link |
01:23:35.360
You're dealing with a system that's really helping you out.
link |
01:23:38.960
It's a conversation.
link |
01:23:40.160
It's not like the external thing where, guess what?
link |
01:23:43.360
If you swerve into a tree, you swerve into a tree, right?
link |
01:23:46.000
Like, you get no margin for error.
link |
01:23:48.160
Yeah, I think that's really well put.
link |
01:23:49.520
I think that's exactly the right place where we're comparing
link |
01:23:57.200
to the external perception and the control problem.
link |
01:24:00.160
Driver monitoring is easier because you don't,
link |
01:24:02.080
the bar for success is much lower.
link |
01:24:05.200
Yeah, but I still think like the human face
link |
01:24:09.040
is more complicated actually than the external environment.
link |
01:24:12.000
But for driving, you don't give a damn.
link |
01:24:14.160
I don't need something that complicated to have to communicate
link |
01:24:20.080
the idea to the human that I want to communicate,
link |
01:24:23.120
which is, yo, system might mess up here.
link |
01:24:25.600
You got to pay attention.
link |
01:24:27.680
Yeah. See, that's my love and fascination is the human face.
link |
01:24:32.480
And it feels like this is a nice place to create products
link |
01:24:38.240
that create an experience in the car.
link |
01:24:39.920
So like, it feels like there should be more richer experiences
link |
01:24:45.280
in the car.
link |
01:24:47.040
You know, like that's an opportunity for like something
link |
01:24:50.560
like Kama AI or just any kind of system like a Tesla
link |
01:24:53.680
or any of the autonomous vehicle companies
link |
01:24:56.080
is because software is, there's much more sensors
link |
01:24:59.120
and so much is running on software and you're
link |
01:25:00.800
doing machine learning anyway.
link |
01:25:02.720
There's an opportunity to create totally new experiences
link |
01:25:06.080
that we're not even anticipating.
link |
01:25:08.080
You don't think so? No.
link |
01:25:10.480
You think it's a box that gets you from A to B
link |
01:25:12.720
and you want to do it chill?
link |
01:25:14.960
Yeah. I mean, I think as soon as we get to level three on highways,
link |
01:25:17.360
okay, enjoy your Candy Crush.
link |
01:25:19.120
Enjoy your Hulu.
link |
01:25:20.160
Enjoy your, you know, whatever, whatever.
link |
01:25:23.520
Sure, you get this.
link |
01:25:24.160
You can look at screens basically versus right now,
link |
01:25:26.800
what do you have?
link |
01:25:27.520
Music and audiobooks.
link |
01:25:28.480
So level three is where you can kind of disengage
link |
01:25:30.560
in stretches of time.
link |
01:25:34.800
Well, you think level three is possible.
link |
01:25:37.200
Like on the highway going for 100 miles
link |
01:25:38.960
and you can just go to sleep?
link |
01:25:40.320
Oh yeah, sleep.
link |
01:25:43.440
So again, I think it's really all on a spectrum.
link |
01:25:47.120
I think that being able to use your phone
link |
01:25:49.920
while you're on the highway and like this all being okay
link |
01:25:53.280
and being aware that the car might alert you
link |
01:25:55.200
when you have five seconds to basically.
link |
01:25:56.960
So the five second thing that you think is possible?
link |
01:25:58.800
Yeah, I think it is.
link |
01:25:59.520
Oh yeah, not in all scenarios.
link |
01:26:01.920
Some scenarios it's not.
link |
01:26:03.200
It's the whole risk thing that you mentioned is nice
link |
01:26:06.080
is to be able to estimate, like, how risky is this situation.
link |
01:26:10.480
That's really important to understand.
link |
01:26:12.800
One other thing you mentioned comparing comma and autopilot
link |
01:26:17.280
is that something about the haptic feel
link |
01:26:22.320
of the way comma controls the car when things are uncertain.
link |
01:26:25.760
Like it behaves a little bit more uncertain
link |
01:26:27.600
when things are uncertain.
link |
01:26:29.440
That's kind of an interesting point
link |
01:26:30.960
and then autopilot is much more confident always
link |
01:26:33.920
even when it's uncertain
link |
01:26:35.440
until it runs into trouble.
link |
01:26:39.120
That's a funny thing.
link |
01:26:40.800
I actually mentioned that to Elon I think
link |
01:26:42.560
and then the first time we talked, this idea of,
link |
01:26:46.160
like, communicating uncertainty.
link |
01:26:48.720
I guess comma doesn't really communicate uncertainty explicitly.
link |
01:26:52.480
It communicates it through haptic feel.
link |
01:26:54.960
Like what's the role of communicating uncertainty do you think?
link |
01:26:58.000
We do some stuff explicitly.
link |
01:26:59.680
Like we do detect the lanes when you're on the highway
link |
01:27:01.680
and we'll show you how many lanes we're using to drive with.
link |
01:27:04.240
You can look at where it thinks the lanes are.
link |
01:27:06.080
You can look at the path.
link |
01:27:08.320
And we want to be better about this
link |
01:27:10.240
and we're actually hiring.
link |
01:27:11.360
We want to hire some new UI people.
link |
01:27:12.800
UI people, you mentioned this.
link |
01:27:14.160
Because it's a UI problem too, right?
link |
01:27:16.560
We have a great designer now
link |
01:27:19.200
but we need people who are just going to build this
link |
01:27:21.040
and debug these UIs.
link |
01:27:22.240
Qt people, and Qt.
link |
01:27:24.480
Is that what the UI is done with, this Qt?
link |
01:27:26.320
We're moving the new UIs to Qt.
link |
01:27:29.120
C++ Qt.
link |
01:27:31.840
Tesla uses it too.
link |
01:27:33.200
Yeah.
link |
01:27:33.440
Yeah.
link |
01:27:34.000
We had some React stuff in there.
link |
01:27:37.760
React.js or just React.
link |
01:27:39.280
React has its own language, right?
link |
01:27:40.960
React Native.
link |
01:27:41.760
React Native.
link |
01:27:42.560
React is a JavaScript framework.
link |
01:27:45.040
It's all based on JavaScript.
link |
01:27:48.720
I like C++.
link |
01:27:51.280
What do you think about Dojo with Tesla
link |
01:27:54.960
and their foray into what appears to be
link |
01:27:58.160
specialized hardware for training on that?
link |
01:28:04.960
I guess it's something maybe you can correct me
link |
01:28:07.200
from my shallow look at it.
link |
01:28:09.760
It seems like something that Google did with TPUs
link |
01:28:11.920
but specialized for driving data.
link |
01:28:15.440
I don't think it's specialized for driving data.
link |
01:28:18.240
It's just legit, just TPU.
link |
01:28:19.920
They want to go the Apple way.
link |
01:28:22.000
Basically everything required in the chain is done in house.
link |
01:28:25.520
Well, so you have a problem right now
link |
01:28:27.680
and this is one of my concerns.
link |
01:28:31.600
I really would like to see somebody deal with this
link |
01:28:33.680
if anyone out there is doing it.
link |
01:28:35.120
I'd like to help them if I can.
link |
01:28:37.840
You basically have two options right now to train.
link |
01:28:41.680
Your options are Nvidia or Google.
link |
01:28:45.760
So Google is not even an option.
link |
01:28:50.000
Their TPUs are only available in Google Cloud.
link |
01:28:52.240
Google has absolutely onerous terms of service restrictions.
link |
01:28:56.960
They may have changed it but back in Google's terms of service
link |
01:29:00.400
it said explicitly you are not allowed to use Google Cloud ML
link |
01:29:03.600
for training autonomous vehicles
link |
01:29:05.120
or for doing anything that competes with Google
link |
01:29:07.120
without Google's prior written permission.
link |
01:29:10.080
I mean Google is not a platform company.
link |
01:29:13.920
I wouldn't touch TPUs with a 10 foot pole.
link |
01:29:16.560
So that leaves you with the monopoly.
link |
01:29:19.040
Nvidia.
link |
01:29:19.600
Nvidia.
link |
01:29:20.160
Yeah. So I mean.
link |
01:29:21.920
That you're not a fan of.
link |
01:29:23.840
Well, look, I was a huge fan of Nvidia in 2016.
link |
01:29:28.400
Jensen came and sat in the car.
link |
01:29:31.760
Cool guy when the stock was $30 a share.
link |
01:29:35.280
Nvidia stock has skyrocketed.
link |
01:29:38.080
I witnessed a real change in who was in management
link |
01:29:40.960
over there in like 2018.
link |
01:29:43.440
And now they're like, let's exploit.
link |
01:29:46.560
Let's take every dollar we possibly can out of this ecosystem.
link |
01:29:49.440
Let's charge $10,000 for A100s
link |
01:29:51.600
because we know we got the best shit in the game.
link |
01:29:54.080
And let's charge $10,000 for an A100
link |
01:29:57.760
when it's really not that different from a 3080
link |
01:30:00.000
which is $699.
link |
01:30:03.360
The margins that they are making off of those high end chips
link |
01:30:06.080
are so high that I mean I think they're shooting themselves
link |
01:30:10.080
in the foot just from a business perspective
link |
01:30:12.000
because there's a lot of people talking like me now
link |
01:30:14.800
who are like somebody's got to take Nvidia down.
link |
01:30:16.800
Yeah where they could dominate.
link |
01:30:20.880
Nvidia could be the new Intel.
link |
01:30:22.320
Yeah, to be inside everything essentially. And yet,
link |
01:30:29.040
in certain spaces, like in autonomous driving, the only people
link |
01:30:33.600
who are, like, desperately falling behind
link |
01:30:36.400
and trying to catch up and have a ton of money
link |
01:30:38.400
like the big automakers, are the ones interested
link |
01:30:40.960
in partnering with Nvidia.
link |
01:30:42.960
Oh and I think a lot of those things are going to fall through.
link |
01:30:45.760
If I were Nvidia sell chips.
link |
01:30:49.280
Sell chips at a reasonable markup.
link |
01:30:52.080
To everybody.
link |
01:30:52.800
To everybody.
link |
01:30:53.440
Without any restrictions.
link |
01:30:54.720
Without any restrictions.
link |
01:30:56.000
Intel did this.
link |
01:30:57.120
Look at Intel.
link |
01:30:58.080
They had a great long run.
link |
01:30:59.760
Nvidia is trying to turn their they're like trying to
link |
01:31:02.240
productize their chips way too much.
link |
01:31:05.440
They're trying to extract way more value
link |
01:31:07.680
than they can sustainably.
link |
01:31:09.280
Sure you can do it tomorrow.
link |
01:31:10.560
Is it going to up your share price?
link |
01:31:12.080
Sure, if you're one of those CEOs who's like, how much can I strip
link |
01:31:14.400
mine this company and you know and that's what's weird about it too.
link |
01:31:17.760
Like the CEO is the founder.
link |
01:31:19.200
It's the same guy.
link |
01:31:20.000
Yeah.
link |
01:31:20.320
I mean I still think Jensen's a great guy.
link |
01:31:22.240
He is great.
link |
01:31:23.280
Why do this?
link |
01:31:25.120
You have a choice.
link |
01:31:26.800
You have a choice right now.
link |
01:31:27.840
Are you trying to cash out?
link |
01:31:28.720
Are you trying to buy a yacht?
link |
01:31:30.560
If you are fine.
link |
01:31:32.000
But if you're trying to be the next huge semiconductor company
link |
01:31:36.080
sell chips.
link |
01:31:37.120
Well the interesting thing about Jensen
link |
01:31:40.080
is he is a big vision guy.
link |
01:31:42.000
So he has a plan like for 50 years down the road.
link |
01:31:48.640
So it makes me wonder like.
link |
01:31:50.400
How does price gouging fit into it?
link |
01:31:51.760
Yeah how does that like it's it doesn't seem to make sense as a plan.
link |
01:31:56.960
I worry that he's listening to the wrong people.
link |
01:31:59.200
Yeah that that's the sense I have too sometimes because I
link |
01:32:04.000
despite everything I think NVIDIA is an incredible company.
link |
01:32:08.880
Well one I'm deeply grateful to NVIDIA for the products they've created in the past.
link |
01:32:14.160
Me too.
link |
01:32:14.160
Right.
link |
01:32:14.720
And so.
link |
01:32:16.000
The 1080Ti was a great GPU.
link |
01:32:17.840
Still have a lot of them.
link |
01:32:18.800
Still is yeah.
link |
01:32:21.760
But at the same time it just feels like.
link |
01:32:26.720
Feels like you don't want to put all your stock in NVIDIA.
link |
01:32:29.360
And so the Elon is doing what Tesla is doing with autopilot and Dojo
link |
01:32:34.880
is the Apple way is because they're not going to share Dojo.
link |
01:32:38.720
With George Hotz.
link |
01:32:41.600
I know they should sell that chip.
link |
01:32:43.600
Oh they should sell that even their their accelerator.
link |
01:32:46.160
The accelerator that's in all the cars the 30 watt one.
link |
01:32:49.040
Sell it why not.
link |
01:32:51.440
So open it up.
link |
01:32:52.560
I mean, why does this have to be a car company?
link |
01:32:55.680
Well if you sell the chip here's what you get.
link |
01:32:57.920
Yeah.
link |
01:32:58.800
You make money on the chips.
link |
01:33:00.000
It doesn't take away from your chip.
link |
01:33:01.840
You're going to make some money free money.
link |
01:33:03.760
And also the world is going to build an ecosystem of tooling for you.
link |
01:33:08.880
Right.
link |
01:33:09.120
You're not going to have to fix the bug in your tanh layer.
link |
01:33:12.640
Someone else already did.
link |
01:33:14.960
Well the question that's an interesting question.
link |
01:33:16.560
I mean that's the question Steve Jobs asked.
link |
01:33:18.560
That's the question Elon Musk is perhaps asking is
link |
01:33:24.960
do you want Tesla stuff inside other vehicles
link |
01:33:27.920
potentially inside, like, an iRobot vacuum cleaner.
link |
01:33:32.480
Yeah.
link |
01:33:34.640
I think you should decide where your advantages are.
link |
01:33:37.120
I'm not saying Tesla should start selling battery packs to automakers
link |
01:33:40.240
because battery packs to automakers they're straight up in competition with you.
link |
01:33:43.520
If I were Tesla I'd keep the battery technology totally.
link |
01:33:45.920
Yeah.
link |
01:33:46.160
As far as we make batteries.
link |
01:33:47.840
But the thing about the Tesla TPU is anybody can build that.
link |
01:33:53.040
It's just a question of you know are you willing to spend the you know the money.
link |
01:33:57.280
It could be a huge source of revenue potentially.
link |
01:34:00.000
Are you willing to spend 100 million dollars.
link |
01:34:01.280
Right.
link |
01:34:02.240
Anyone can build it.
link |
01:34:03.600
And someone will.
link |
01:34:04.560
And a bunch of companies now are starting trying to build AI accelerators.
link |
01:34:08.000
Somebody's going to get the idea right.
link |
01:34:10.080
And yeah.
link |
01:34:11.520
Hopefully they don't get greedy because they'll just lose to the next guy who finally
link |
01:34:15.680
and then eventually the Chinese are going to make knockoff Nvidia chips and that's.
link |
01:34:19.440
From your perspective I don't know if you're also paying attention to Stan Tesla for a moment.
link |
01:34:24.080
Elon Musk has talked about a complete rewrite
link |
01:34:27.600
of the neural net that they're using that seems to again I'm half paying attention.
link |
01:34:34.640
But it seems to involve basically a kind of integration of all the sensors to where
link |
01:34:42.000
it's a four dimensional view you know you have a 3D model of the world over time.
link |
01:34:47.440
And then you can I think it's done both for the for the actually you know so the neural
link |
01:34:53.680
network is able to in a more holistic way deal with the world and make predictions and so on.
link |
01:34:59.120
But also to make the annotation task more you know easier like you can annotate the world
link |
01:35:07.120
in one place and it kind of distributes itself across the sensors and across the different
link |
01:35:12.800
like the hundreds of tasks that are involved in the hydranet.
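A minimal sketch of the general idea described here: annotate something once in a shared 3D world frame and let the label project into every camera. This is an illustration of the multi-camera annotation concept only, assuming a toy pinhole model; the cameras, calibration matrices, and numbers below are hypothetical placeholders, not Tesla's or comma's actual pipeline.

```python
# Sketch of "annotate once, distribute across sensors": a single 3D label in
# the world frame is projected into every camera image.
import numpy as np

def project_to_camera(point_world, T_cam_from_world, K):
    """Project a 3D world point into pixel coordinates for one camera."""
    p = T_cam_from_world @ np.append(point_world, 1.0)  # world -> camera frame (homogeneous)
    if p[2] <= 0:
        return None                                     # point is behind the camera
    uv = K @ (p[:3] / p[2])                             # pinhole projection
    return uv[:2]

# One 3D annotation (toy convention: x right, y down, z forward, units in meters).
label_world = np.array([1.0, 0.5, 20.0])

# Hypothetical per-camera calibration: extrinsics (4x4) and intrinsics (3x3).
cameras = {
    "main": (np.eye(4), np.array([[910.0, 0, 640], [0, 910.0, 360], [0, 0, 1]])),
    "wide": (np.eye(4), np.array([[400.0, 0, 640], [0, 400.0, 360], [0, 0, 1]])),
}

# The same label lands in every camera's image without re-annotating each one.
for name, (T, K) in cameras.items():
    print(name, project_to_camera(label_world, T, K))
```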
link |
01:35:16.400
What are your thoughts about this rewrite is it just like some details that are kind of obvious
link |
01:35:22.240
that are steps that should be taken or is there something fundamental that could challenge your
link |
01:35:27.120
idea that end to end is the right solution. We're in the middle of a big rewrite now as well
link |
01:35:32.880
we haven't shipped a new model in a bit. Of what kind? We're going from 2D to 3D.
link |
01:35:38.160
Right now all our stuff like for example when the car pitches back the lane lines also pitch back
link |
01:35:42.880
because we're assuming the flat world hypothesis the new models do not do this the new models
link |
01:35:48.800
output everything in 3D. But there's still no annotation so the 3D is it's more about the
link |
01:35:55.600
output. We have Z's in everything. We've added Z's. We unified a lot of stuff as well.
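A toy illustration of the flat-world hypothesis George mentions just above: if you recover lane-line positions by intersecting image rays with an assumed flat ground plane, a pitched car shifts where those rays hit the ground, so the recovered lane lines "pitch" too; a model that outputs x, y, and z directly avoids the assumption. This is a generic geometry sketch with made-up numbers, not openpilot code.

```python
# Toy example of why assuming a flat road breaks when the car pitches.
import numpy as np

def ray_to_ground(ray_cam, pitch_rad, cam_height=1.2):
    """Intersect a camera ray with the ground, assuming the ground is a flat
    plane below the camera. Toy convention: x right, y up, z forward."""
    c, s = np.cos(pitch_rad), np.sin(pitch_rad)
    R = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])  # pitch about the x-axis
    ray_world = R @ ray_cam
    t = cam_height / -ray_world[1]     # solve camera_height + t * ray_y = 0
    return t * ray_world               # ground point, relative to the camera

ray = np.array([0.0, -0.05, 1.0])      # a pixel slightly below the horizon
print("no pitch  :", ray_to_ground(ray, 0.0))
print("2deg pitch:", ray_to_ground(ray, np.radians(2.0)))
# Under a flat-world 2D model the same pixel maps to a very different spot on
# the road once the car pitches; a model with "Z's in everything" outputs the
# 3D position directly and sidesteps the assumption.
```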
link |
01:36:06.480
We switched from TensorFlow to PyTorch. My understanding of what Tesla's thing is is
link |
01:36:13.840
that their annotator now annotates across the time dimension. I mean cute. Why are you building an
link |
01:36:22.400
annotator? I find their entire pipeline. I find your vision I mean the vision of end to end
link |
01:36:31.600
very compelling but I also like the engineering of the data engine that they've created.
link |
01:36:36.560
In terms of supervised learning pipelines that thing is damn impressive. You're basically the
link |
01:36:45.120
idea is that you have hundreds of thousands of people that are doing data collection for you
link |
01:36:51.040
through their experience, so that's kind of similar to the comma AI model, and you're able to
link |
01:36:58.240
mine that data based on the kind of edge cases you need. I think it's harder to do in the end
link |
01:37:05.840
to end learning. The mining of the right edge case. That's where feature engineering is actually
link |
01:37:12.400
really powerful because us humans are able to do this kind of mining a little better.
link |
01:37:19.760
But yeah there's obvious constraints and limitations to that idea.
link |
01:37:25.680
Karpathy just tweeted. He's like, you get really interesting insights if you sort your validation
link |
01:37:31.680
set by loss and look at the highest loss examples. Yeah. So yeah I mean you can do we have we have
link |
01:37:40.640
a little data engine like thing. We're training a segnet anyway. It's not fancy, it's just like,
link |
01:37:45.520
okay, train the new segnet, run it on a hundred thousand images, and now take the thousand with
link |
01:37:50.800
highest loss, select a hundred of those by a human, get those ones labeled, retrain, do it
link |
01:37:56.880
again. So it's a much less well written data engine and yeah you can take these things really
link |
01:38:03.200
far and it is impressive engineering and if you truly need supervised data for a problem
link |
01:38:09.760
yeah, things like data engine are the high end of it.
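A rough sketch of the loss-sorted loop George describes, with hypothetical helper names and stub implementations standing in for the real segnet, labeling tool, and training step; it is an illustration of the idea, not comma's actual data engine.

```python
# Rough sketch of a "sort by loss, label the worst, retrain" loop.
# All helpers below are made-up stubs, not comma's tooling.
import random

def per_example_loss(model, example):
    """Stub: run the current segnet on one example and return its loss."""
    return random.random()                      # placeholder for a real forward pass

def label_by_human(examples):
    """Stub: send the selected examples out for human labeling."""
    return [(ex, "label") for ex in examples]

def retrain(model, labeled):
    """Stub: fine-tune the model on the newly labeled examples."""
    return model

def data_engine_round(model, unlabeled_pool, run_on=100_000, keep=1_000, send=100):
    # 1. Run the current model on a large batch of unlabeled images.
    batch = random.sample(unlabeled_pool, min(run_on, len(unlabeled_pool)))
    # 2. Sort by loss and keep the worst ones (highest loss = most informative).
    worst = sorted(batch, key=lambda ex: per_example_loss(model, ex), reverse=True)[:keep]
    # 3. Have a human label a subset of those, then retrain and repeat.
    newly_labeled = label_by_human(worst[:send])
    return retrain(model, newly_labeled)

pool = [f"image_{i}.png" for i in range(10_000)]
model = object()                                # placeholder model
for _ in range(3):                              # a few rounds of the loop
    model = data_engine_round(model, pool)
```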
link |
01:38:15.840
What is attention? Is a human paying attention? I mean, we're going to probably build something that looks like data engine to push our driver
link |
01:38:19.440
monitoring further but for driving itself you have it all annotated beautifully by what the human
link |
01:38:25.120
does so. Yeah that's interesting I mean that applies to driver attention as well. Do you want to
link |
01:38:30.320
detect the eyes do you want to detect blinking and pupil movement do you want to detect all the
link |
01:38:35.040
like a face alignment so landmark detection and so on and then doing kind of reasoning based on
link |
01:38:41.120
that or do you want to take the entirety of the face over time and do end to end. I mean it's
link |
01:38:45.680
obvious that over eventually you have to do end to end with some calibration with some fixes and
link |
01:38:50.320
so on but it's like I don't know when that's the right move. Even if it's end to end there actually
link |
01:38:58.080
is there is no kind of um you have to supervise that with humans. Whether a human is paying attention
link |
01:39:04.960
or not is a completely subjective judgment um like you can try to like automatically do it
link |
01:39:10.880
with some stuff but you don't have if I record a video of a human I don't have true annotations
link |
01:39:17.040
anywhere in that video. The only way to get them is with you know other humans labeling it really.
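For reference, a skeleton contrasting the two driver-monitoring designs Lex sketches above: an explicit landmark-plus-rules pipeline versus a single end-to-end network over the face, which, as George notes, still needs human "paying attention" labels. All model and function names here are hypothetical placeholders.

```python
# Skeleton of the two designs; thresholds and names are made up for illustration.

def attention_via_landmarks(frame, landmark_model, gaze_model):
    """Pipeline A: explicit intermediate features plus hand-tuned reasoning."""
    landmarks = landmark_model(frame)            # eyes, face outline, alignment, ...
    gaze = gaze_model(frame, landmarks)          # estimated gaze direction
    eyes_closed = landmarks["eye_openness"] < 0.2
    looking_away = abs(gaze["yaw"]) > 0.5        # thresholds are invented here
    return not (eyes_closed or looking_away)

def attention_end_to_end(frame_sequence, attention_net):
    """Pipeline B: one model maps the face over time straight to a score,
    but it still needs humans to supply the 'paying attention' labels."""
    return attention_net(frame_sequence) > 0.5

if __name__ == "__main__":
    # Tiny usage sketch with dummy models standing in for real networks.
    lm = lambda f: {"eye_openness": 0.8}
    gz = lambda f, l: {"yaw": 0.1}
    print(attention_via_landmarks("frame0", lm, gz))             # True
    print(attention_end_to_end(["f0", "f1"], lambda seq: 0.9))   # True
```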
link |
01:39:22.720
Well, I don't know, so if you think deeply about it, you could, you might be able to just,
link |
01:39:30.000
depending on the task, you may be able to discover self annotating things, like, you know, you can look at
link |
01:39:35.600
like steering wheel reversal or something like that. You can discover little moments of lapse of
link |
01:39:39.760
attention. Yeah I mean that's that's where psychology comes in is there indicate because you
link |
01:39:45.600
have so so much data to look at so you might be able to find moments when there's like just
link |
01:39:52.320
inattention, like even with smartphones, if you want to detect smartphone use, yeah, you can start to zoom
link |
01:39:57.440
in I mean that's the goldmine a sort of the comma AI I mean Tesla's doing this too right is there
link |
01:40:03.200
they're doing annotation based on it's like uh self supervised learning too it's just a small part
link |
01:40:11.440
of the entire picture it's that's kind of the challenge of solving a problem in machine learning
link |
01:40:18.560
if you can discover self annotating parts of the problem right. Our driver monitoring team is
link |
01:40:26.080
half a person right now. Half a person. You know, once we've scaled to a full, once we have two people,
link |
01:40:31.120
once we have two three people on that team I definitely want to look at self annotating stuff
link |
01:40:35.120
for attention. Let's go back for a sec to to a comma and what you know for people who are curious
link |
01:40:44.640
to try it out, how do you install a comma in, say, a 2020 Toyota Corolla, or like, what are the cars that
link |
01:40:52.480
are supported what are the cars that you recommend and what does it take you have a few videos out
link |
01:40:58.720
but maybe through words can you explain now what's it take to actually install a thing.
link |
01:41:02.720
So we support, I think it's 91 cars, 91 makes and models. We'll get to 100 this year.
link |
01:41:10.000
Nice. The yeah the 2020 Corolla great choice the 2020 Sonata it's using the stock longitudinal
link |
01:41:21.040
it's using just our lateral control but it's a very refined car their longitudinal control is not
link |
01:41:26.480
bad at all. So yeah Corolla Sonata or if you're willing to get your hands a little dirty and
link |
01:41:34.160
look in the right places on the internet the Honda Civic is great but you're going to have to install
link |
01:41:38.880
a modified EPS firmware in order to get a little bit more torque and I can't help you with that
link |
01:41:43.200
comma does not officially endorse that, but we have been doing it. We didn't ever release it,
link |
01:41:49.520
we waited for someone else to discover it and then you know. And you have a discord server
link |
01:41:54.320
where people there's a very active developer community I suppose so depending on the level
link |
01:42:02.240
of experimentation you're willing to do that's a community. If you if you just want to buy it and
link |
01:42:09.200
you have a supported car yeah it's 10 minutes to install there's YouTube videos it's Ikea
link |
01:42:16.160
furniture level if you can set up a table from Ikea you can install a comma two in your supported
link |
01:42:20.640
car and it will just work now you're like oh but I want this high end feature or I want to fix this
link |
01:42:25.680
bug okay well welcome to the developer community. So what if I wanted to this is something I asked
link |
01:42:31.760
you offline like a few months ago, if I wanted to run my own code, so use comma as a platform
link |
01:42:43.280
and try to run something like open pilot what does it take to do that? So there's a toggle in the
link |
01:42:49.520
settings called enable ssh and if you toggle that you can ssh into your device you can modify the
link |
01:42:55.200
code you can upload whatever code you want to it. There's a whole lot of people so about 60% of people
link |
01:43:00.960
are running stock comma about 40% of people are running forks and there's a community of there's
link |
01:43:07.360
a bunch of people who maintain these forks and these forks support different cars or they have
link |
01:43:14.080
you know different toggles we try to keep away from the toggles that are like disabled
link |
01:43:17.920
driver monitoring but you know there's some people might want that kind of thing and like you know
link |
01:43:22.160
yeah you can it's your car it's your I'm not here to tell you you know we have some you know we ban
link |
01:43:30.880
if you're trying to subvert safety features you're banned from our discord I don't want anything to
link |
01:43:34.240
do with you but there's some forks doing that. Got it. So you encourage responsible forking.
link |
01:43:42.720
Yeah yeah we encourage some people you know yeah some people like like there's forks that will do
link |
01:43:48.000
some people just like having a lot of readouts on the UI like a lot of like flashing numbers so
link |
01:43:53.360
there's forks that do that. Some people don't like the fact that it disengages when you press
link |
01:43:57.440
the gas pedal there's forks that disable that. Got it. Now the the stock experience is is what
link |
01:44:03.920
like so it does both lane keeping and longitudinal control all together so it's not separate like
link |
01:44:09.760
it is an autopilot. No so okay some cars we use the stock longitudinal control we don't do the
link |
01:44:15.280
longitudinal control in all the cars. Some cars the ACC's are pretty good in the cars it's the
link |
01:44:19.680
lane keep that's atrocious in anything except for autopilot and supercruise. But you know you just
link |
01:44:24.800
turn it on and it works what does this engagement look like? Yeah so we have I mean I'm very concerned
link |
01:44:31.600
about mode confusion I've experienced it on supercruise and autopilot where like autopilot like
link |
01:44:38.320
autopilot disengages I don't realize that the ACC is still on the lead car moves slightly over and
link |
01:44:44.560
then the Tesla accelerates to like whatever my set speed is super fast and like what's going on here.
link |
01:44:51.200
We have engaged and disengaged and this is similar to my understanding I'm not a pilot but my
link |
01:44:56.640
understanding is either the pilot is in control or the copilot is in control and we have the same
link |
01:45:03.280
kind of transition system either open pilot is engaged or open pilot is disengaged engage with
link |
01:45:09.040
cruise control, disengage with either gas, brake, or cancel. Let's talk about money. What's the business
link |
01:45:15.920
strategy for comma? Profitable? Well, it's, you did it, congratulations. So it's basically
link |
01:45:25.040
selling we should say comma cost a thousand bucks comma two two hundred for the interface
link |
01:45:30.800
to the car as well it's 1200 I'll send that nobody's usually up front like this you gotta add the
link |
01:45:37.280
tack on, right. Yeah, I love it. This, I'm not gonna lie to you, trust me, it will add $1,200 of value to
link |
01:45:43.360
your life yes it's still super cheap 30 days no questions asked money back guarantee and
link |
01:45:48.240
prices are only going up you know if there ever is future hardware it costs a lot more than 1200
link |
01:45:53.200
dollars so comma three is in the works so it could be all I all I will say is future hardware is
link |
01:45:59.600
going to cost a lot more than the current hardware yeah like the people that use the people have
link |
01:46:05.680
spoken with that use comma they use open pilot they first of all they use it a lot so people that
link |
01:46:12.640
use it they they fall in love with oh our retention rate is insane there's a good sign yeah it's a
link |
01:46:18.240
really good sign um 70 percent of comma two buyers are daily active users yeah it's amazing
link |
01:46:24.960
um oh also we don't plan on stopping selling the comma two like like it's you know so whatever
link |
01:46:32.800
you create that's beyond comma two it would be uh it would be potentially a phase shift
link |
01:46:40.640
like it's it's so much better that like you could use comma two and you can use comma whatever
link |
01:46:45.520
depends what you want it's kind of 41 42 yeah you know autopilot hardware one versus hardware two
link |
01:46:51.840
the comma two is kind of like hardware one got it got it you can still use both got it got it
link |
01:46:56.160
I think I heard you talk about retention rate with, uh, VR headsets, that the average is just
link |
01:47:00.640
once yeah just fast I mean it's such a fascinating way to think about technology
link |
01:47:05.680
and this is a really really good sign and the other thing that people say about comma is like
link |
01:47:09.120
they can't believe they're getting this for a thousand bucks, right. It's, it seems, it seems like
link |
01:47:14.560
some kind of steal so but in terms of like long term business strategies that basically to put so
link |
01:47:21.680
it's currently in like a thousand plus cars uh 1200 more uh so yeah dailies is about uh
link |
01:47:34.640
dailies is about 2000 weekly is about 2500 monthly is over 3000 wow we've grown a lot since
link |
01:47:40.880
we last talked is the goal that can we talk crazy for a second I mean what's the the goal to overtake
link |
01:47:46.880
Tesla? Let's talk, okay. So I mean, Android did overtake iOS. That's exactly it, right. So
link |
01:47:54.560
they did it I actually don't know the timeline of that one they but let let let's talk uh because
link |
01:48:00.560
everything is in alpha now the autopilot you could argue is in alpha in terms of towards the big
link |
01:48:05.360
mission of autonomous driving right and so what yeah it's your goal to overtake to get millions of
link |
01:48:12.000
cars essentially of course where would it stop like it's open source software it might not be
link |
01:48:18.560
millions of cars with a piece of comma hardware but yeah I think open pilot at some point will
link |
01:48:24.400
cross over autopilot in in in users just like android crossed over ios how does google make money
link |
01:48:30.320
from android uh it's it's complicated their own devices make money google google makes money by
link |
01:48:39.440
just kind of having you on the internet uh yes google search is built in gmail is built in android
link |
01:48:45.760
is just a shield for the rest of Google's ecosystem. Kind of, yeah. But the problem is, Android is, is a
link |
01:48:51.120
brilliant thing i mean android arguably changed the world so there you go that's you can you can
link |
01:48:58.960
feel good ethically speaking but as a business strategy it's questionable oh so hardware so
link |
01:49:05.840
hardware i mean it took google a long time to come around to it but they are now making money on the
link |
01:49:08.960
pixel you're not about money you're more about winning yeah but if only if only 10 percent of
link |
01:49:16.320
open pilot devices come from comma ai we still make a lot that is still yes that is a ton of
link |
01:49:21.920
money for our company but can't somebody create a better comma using open pilot or you're basically
link |
01:49:27.680
saying we'll outcompete them well i'll compete you is can you create a better android phone than
link |
01:49:31.280
the google pixel right i mean you can but like i love that so you're confident like you know what
link |
01:49:37.280
the hell you're doing yeah it's it's uh uh competence and merit i mean our money our money
link |
01:49:44.320
comes from we're consumer electronics company yeah and put it this way so we sold we sold like
link |
01:49:48.880
3 000 comma twos um i mean 2500 right now uh and like okay we're probably gonna sell 10 000 units
link |
01:50:00.560
next year right 10 000 units and even just a thousand dollars a unit okay we're at 10 million
link |
01:50:05.760
in uh in in revenue um get that up to a hundred thousand maybe double the price of the unit now
link |
01:50:12.080
we're talking like 200 million revenue yeah actually making money one of the rare semi autonomous
link |
01:50:18.080
autonomous vehicle companies that are actually making money yeah yeah you know if you have if you
link |
01:50:24.320
look at a model when we were just talking about this yesterday if you look at a model and like
link |
01:50:27.680
you're testing like your ab testing your model and if your your your one branch of the ab test
link |
01:50:32.240
the losses go down very fast in the first five epochs yeah that model is probably going to converge
link |
01:50:37.440
to something considerably better than the one with the losses going down slower why do people
link |
01:50:41.840
think this is going to stop why do people think one day there's going to be a great like well
link |
01:50:45.680
waymo's eventually going to surpass you guys oh they're not do you see like a world where like
link |
01:50:53.680
a tesla or a car like a tesla would be able to basically press a button and you like switch
link |
01:51:00.080
to open pilot you know you you know they've load in i don't know so i think so first off
link |
01:51:06.320
i think that we may surpass tesla in terms of users i do not think we're going to surpass tesla
link |
01:51:12.160
ever in terms of revenue i think tesla can capture a lot more revenue per user than we can but this
link |
01:51:17.920
mimics the android ios model exactly there may be more android devices but you know there's a
link |
01:51:22.560
lot more iphones than google pixels so i think there'll be a lot more tesla cars sold than pieces
link |
01:51:26.720
of comma hardware um and then as far as a tesla owner being able to switch to open pilot uh does
link |
01:51:36.800
ios does iphones run android no but you can if you really want to do it but it doesn't really make
link |
01:51:44.080
sense like it's not it doesn't make sense who cares what about if uh a large company like
link |
01:51:49.280
automakers like GM, Toyota, came to George Hotz, or in the tech space, Amazon, Facebook, Google
link |
01:51:57.920
came with a large pile of cash uh would would you consider being um purchased
link |
01:52:07.360
What, do you see that as one possibility? Not seriously, no. Um, I would probably, uh, see how much,
link |
01:52:15.760
uh shit they'll entertain for me um and if they're willing to like jump through a bunch of my hoops
link |
01:52:21.840
then maybe, but like, no, not the way that M&A works today. I mean, we've been approached and I laugh in
link |
01:52:27.040
these people's faces i'm like are you kidding yeah you know because it's so it's so it's so
link |
01:52:32.880
demeaning. The M&A people are so demeaning to companies. They treat the startup world as their
link |
01:52:39.680
innovation ecosystem and they think that i'm cool with going along with that so i can have some of
link |
01:52:43.920
their scam fake fed dollars you know fed coin i'm what am i gonna do with more fed coin you know
link |
01:52:49.360
Fed coin. Fed coin. Man, I love that. So that's the cool thing about podcasting actually is, uh,
link |
01:52:55.280
people criticize, I don't know if you're familiar with, uh, Spotify, uh, giving Joe Rogan a hundred
link |
01:53:00.720
million i'd talk about that and you know they respect despite all the shit that people are
link |
01:53:08.480
talking about spotify people understand that podcasters like joe rogan know what the hell they're
link |
01:53:16.000
doing yeah so they give them money and say just do what you do and like the equivalent for you
link |
01:53:24.320
would be like george do what the hell you do because you're good at it try not to murder
link |
01:53:29.360
too many people like try like there's some kind of common sense things like just don't go on a weird
link |
01:53:34.960
rampage of yeah it comes down to what companies i could respect right um you know could i respect
link |
01:53:44.000
GM? Never. Um, no, I couldn't. I mean, could I respect like a Hyundai more? Some, right. That's, that's a
link |
01:53:52.960
lot closer to yoda what's your nah no it's like the korean is the way i think i think that you
link |
01:54:00.160
know the japanese the germans the u.s they're all too they're all too you know they all think
link |
01:54:04.560
they're too great. What about the tech companies? Apple, Apple is, of the tech companies, one that I could
link |
01:54:10.480
respect. Apple's the closest. Yeah, I mean, I could never, it would be ironic, would be ironic, oh, if,
link |
01:54:16.400
if comma AI is, uh, is acquired by Apple. I mean, Facebook, look, I quit Facebook 10 years ago
link |
01:54:21.760
because i didn't respect the business model um google has declined so fast in the last five years
link |
01:54:27.680
What are your thoughts about Waymo and its present and its future? Let me, let me say, let me start
link |
01:54:34.880
by saying something, uh, nice, which is, uh, I've visited them a few times and I've, uh, have ridden
link |
01:54:43.600
in their cars and the engineering that they're doing both the research and the actual development
link |
01:54:51.600
and the engineering they're doing and the scale they're actually achieving by doing it all themselves
link |
01:54:56.240
is really impressive and the the balance of safety and innovation and like the cars work
link |
01:55:04.480
really well for the routes they drive like they drive fast which was very surprising to me like
link |
01:55:10.800
it drives like the speed limit or faster the speed limit is it goes and it works really damn well
link |
01:55:17.600
and the interface is nice. In Chandler, Arizona? Yeah, yeah, in Chandler. It's a very specific
link |
01:55:21.840
environment so it i you know it gives me enough material in my mind to push back against the
link |
01:55:28.960
madmen of the world like George Hotz, to be like, because you kind of imply there's zero probability
link |
01:55:37.040
they're going to win. Yeah, and after I've used it, after I've ridden in it, to me it's not zero. Oh,
link |
01:55:44.560
it's not for technology reasons bureaucracy no it's worse than that it's actually for product
link |
01:55:50.320
reasons i think oh you think they're just not capable of creating an amazing product uh no i
link |
01:55:56.000
think that the product that they're building doesn't make sense um so a few things uh you
link |
01:56:03.440
say the waymos are fast um benchmark a waymo against a competent uber driver right the uber
link |
01:56:10.160
driver is faster it's not even about speed it's the thing you said it's about the experience of
link |
01:56:14.560
being stuck at a stop sign because pedestrians are crossing nonstop that i like when my uber
link |
01:56:20.880
driver doesn't come to a full stop at the stop sign yeah you know and so let's say the waymos
link |
01:56:28.320
are 20% slower than an Uber, right. Um, you can argue that they're going to be cheaper,
link |
01:56:34.880
and i argue that users already have the choice to trade off money for speed it's called uber pool
link |
01:56:40.160
um, I think it's like 15 percent of rides are Uber Pools, right. Users are not willing to trade off
link |
01:56:47.920
money for speed so the whole product that they're building is not going to be competitive
link |
01:56:54.640
with traditional ride sharing networks right um like and also whether there's profit to be made
link |
01:57:04.240
depends entirely on one company having a monopoly. I think that the level four autonomous ride sharing
link |
01:57:11.120
vehicles market is going to look a lot like the scooter market if even the technology does come
link |
01:57:16.400
to exist which i question who's doing well in that market yeah it's a race to the bottom you know
link |
01:57:22.160
well, it could be, it could be closer to like an Uber and a Lyft where it's just one or two players.
link |
01:57:27.840
well the scooter people have given up trying to market scooters as a practical means of
link |
01:57:34.720
transportation and they're just like they're super fun to ride look at wheels i love those things and
link |
01:57:39.040
they're great on that front yeah but from an actual transportation product perspective
link |
01:57:44.480
i do not think scooters are viable and i do not think level four autonomous cars are viable
link |
01:57:49.040
if you uh let's play a fun experiment if you ran let's do uh tesla and let's do waymo
link |
01:57:56.800
if, uh, Elon Musk took a vacation for a year, he just said screw it, I'm gonna go live on an island,
link |
01:58:03.760
no electronics and the board decides that we need to find somebody to run the company
link |
01:58:09.040
and they they decide that you should run the company for a year how do you run tesla differently
link |
01:58:14.720
i wouldn't change much do you think they're on the right track i wouldn't change i mean i'd have
link |
01:58:19.520
some minor changes but even even my debate with tesla about you know end to end versus segnets
link |
01:58:27.440
like that's just software who cares right like it's not gonna it's not like you're doing something
link |
01:58:33.120
terrible with segnets you're probably building something that's at least going to help you debug
link |
01:58:36.640
the end to end system a lot right it's very easy to transition from what they have to like an end
link |
01:58:43.280
to end kind of thing right uh and then i presume you would uh in the model y or maybe in the model
link |
01:58:51.200
three start adding driver sensing with infrared yes i would add i would add i would add infrared
link |
01:58:56.000
camera infrared lights right away to those cars um and start collecting that data and do all that
link |
01:59:04.160
kind of stuff yeah very much i think they're already kind of doing it it's it's an incredibly minor
link |
01:59:08.880
change if i actually were to have tesla first off i'd be horrified that i wouldn't be able to do a
link |
01:59:12.560
better job as elon and then i would try to you know understand the way he's done things before
link |
01:59:17.600
you would also have to take over his twitter so i don't tweet yeah what's your twitter situation
link |
01:59:23.520
why why why are you so quiet on twitter i mean du comma is like what what's your social network
link |
01:59:29.520
presence like because you on instagram you're you uh you do live streams you're you're you're um
link |
01:59:36.720
you understand the music of the internet but you don't always fully engage into it you're
link |
01:59:42.000
part time why you still have a twitter yeah i mean it's the instagram is a pretty place
link |
01:59:47.520
instagram is a beautiful place it glorifies beauty i like i like instagram's values as a network
link |
01:59:52.640
Got it. Um, Twitter glorifies conflict. Twitter glorifies, you know, like, like,
link |
01:59:58.560
you know just shots taking shots of people and it's like you know
link |
02:00:03.520
you know twitter and donald trump are perfectly perfect for each other so teslas on uh teslas
link |
02:00:10.400
on the right track in your view yeah okay so let's try let's like really try this experiment if you
link |
02:00:16.800
ran waymo let's say they're i don't know if you agree but they seem to be at the head of the pack of
link |
02:00:22.640
the kind of, uh, what would you call that approach, like it's not necessarily lidar based because
link |
02:00:29.040
it's not about lidar. Level four robotaxi. Level four robotaxi, all in, before, before making
link |
02:00:34.880
any revenue uh so they're probably at the head of the pack if you were said uh hey george can
link |
02:00:42.640
you please run this company for a year how would you change it uh i would go i would get anthony
link |
02:00:48.320
levandowski out of jail and i would put him in charge of the company um let's try to break
link |
02:00:57.520
that apart one do you want to make you want to destroy the company by doing that or do you mean
link |
02:01:02.320
or do you mean uh you like renegade style thinking that pushes that that like throws away bureaucracy
link |
02:01:11.280
and goes to first principle thinking what what do you mean by that um i think anthony levandowski
link |
02:01:15.120
is a genius and i think he would come up with a much better idea of what to do with waymo than me
link |
02:01:22.240
so you mean that unironically he is a genius oh yes oh absolutely without a doubt i mean
link |
02:01:27.920
i'm not saying there's no shortcomings but in the interactions i've had with him yeah what um he's
link |
02:01:35.920
also willing to take like who knows what he would do with waymo i mean he's also out there like
link |
02:01:40.560
far more out there than i am yeah his big risks yeah what do you make of him i was i was going to
link |
02:01:45.840
talk to him on this podcast and I was going back and forth. I'm such a gullible, naive human, like,
link |
02:01:51.840
i see the best in people and i slowly started to realize that there might be some people out
link |
02:01:57.840
there that like have multiple faces to the world they're like deceiving and dishonest i still
link |
02:02:09.520
refuse to like i i just i trust people and i don't care if i get hurt by it but like
link |
02:02:15.600
you know sometimes you have to be a little bit careful especially platform wise and
link |
02:02:19.520
podcast wise what are you what am i supposed to think so you think you think he's a good person
link |
02:02:25.040
oh i don't know i don't really make moral judgments it's difficult to all i mean this
link |
02:02:31.920
about the waymo i actually i mean that whole idea very non ironically about what i would do
link |
02:02:36.080
the problem with putting me in charge of waymo is waymo is already 10 billion dollars in the
link |
02:02:40.400
hole, right. Whatever idea Waymo does, look, comma's profitable. Comma's raised 8.1 million dollars.
link |
02:02:46.560
that's small you know that's small money like i can build a reasonable consumer electronics company
link |
02:02:50.720
and succeed wildly at that and still never be able to pay back waymo's 10 billion so i i think the
link |
02:02:57.040
basic idea with waymo will forget the 10 billion because they have some backing but your basic
link |
02:03:02.560
thing is like what can we do to start making some money well no i mean my bigger idea is like
link |
02:03:07.840
whatever the idea is that's going to save waymo i don't have it it's going to have to be a big
link |
02:03:12.560
risk idea and I cannot think of a better person than Anthony Levandowski to do it. So that is
link |
02:03:18.320
completely what i would do ceo of waymo i would call myself a transitionary ceo
link |
02:03:22.480
do everything i can to fix that situation up yeah uh yeah because i can't i can't do it right like
link |
02:03:29.760
i can't i can't i mean i can talk about how what i really want to do is just apologize for all those
link |
02:03:35.520
corny uh you know ad campaigns and be like here's the real state of the technology yeah that's
link |
02:03:40.880
like i have several criticism i'm a little bit more bullish on waymo than than you seem to be
link |
02:03:46.000
but one criticism i have is it went into corny mode too early like it's still a startup it hasn't
link |
02:03:52.560
delivered on anything so it should be like more renegade and show off the engineering that they're
link |
02:03:58.800
doing which just can be impressive as opposed to doing these weird commercials of like your friendly
link |
02:04:05.360
your friendly car company i mean that's my biggest my biggest snipe at waymo was always
link |
02:04:09.840
that guy's a paid actor that guy's not a waymo user he's a paid actor look here i found his call
link |
02:04:14.480
sheet. Do kind of like what SpaceX is doing with the rocket launches, just put the nerds up front,
link |
02:04:20.800
put the engineers up front and just like show failures too just i love i love spacex is yeah
link |
02:04:27.760
yeah the thing they're doing it is right and it just feels like the right but we're all so excited
link |
02:04:33.040
to see them succeed yeah i can't wait to see waymo fail you know like you lie to me i want you to fail
link |
02:04:39.200
you tell me the truth you be honest with me i want you to succeed yeah
link |
02:04:42.160
yeah uh yeah and that requires the uh the renegade ceo right i'm with you i'm with you i still have
link |
02:04:51.760
a little bit of faith in waymo to for for the renegade ceo to step forward but it's not it's not
link |
02:04:58.880
John Krafcik. Yeah, it's, uh, you can't, it's not Chris homestead, and I'm, those people may be very good
link |
02:05:06.560
at certain things yeah but they're not renegades yeah because these companies are fundamentally
link |
02:05:11.920
even though we're talking about billion dollars all these crazy numbers they're still like early
link |
02:05:17.520
stage startups i mean and i i just i if you are pre revenue and you've raised 10 billion dollars i
link |
02:05:23.280
have no idea like like this just doesn't work you know it's against everything silicon valley where's
link |
02:05:28.000
your minimum viable product? You know, where's your users, where's your growth numbers? This is
link |
02:05:34.000
traditional Silicon Valley. Why do you not apply it? What, you think you're too big to fail already?
link |
02:05:39.360
like how do you think autonomous driving will change society so the mission is for comma to
link |
02:05:49.040
solve self driving do you have like a vision of the world of how it'll be different
link |
02:05:57.760
is it as simple as a to b transportation or is there like because these are robots
link |
02:06:02.080
it's not about autonomous driving in and of itself it's what the technology enables
link |
02:06:09.520
it's i think it's the coolest applied ai problem i like it because it has a clear path to monetary
link |
02:06:15.440
value um but as far as that being the thing that changes the world i mean no like like there's
link |
02:06:24.320
cute things we're doing in common like who'd have thought you could stick a phone on the
link |
02:06:27.040
windshield and it'll drive um but like really the product that you're building is not something that
link |
02:06:32.640
people were not capable of imagining 50 years ago so no it doesn't change the world on that front
link |
02:06:37.600
could people have imagined the internet 50 years ago only true junior genius visionaries yeah
link |
02:06:42.320
everyone could have imagined autonomous cars 50 years ago it's like a car but i don't drive it
link |
02:06:46.880
see i i have the sense and i told you like i'm my long term dream is robots with which you have
link |
02:06:55.040
deep with whom you have deep connections right and uh there's different trajectories towards that
link |
02:07:03.440
and i've been thinking so been thinking of launching a startup i see autonomous vehicles as a potential
link |
02:07:10.320
trajectory to that that that i'm that's not where the direction i would like to go but i also see
link |
02:07:17.520
Tesla or even comma AI, like, pivoting into robotics broadly defined, that's at some stage
link |
02:07:25.840
in the way, like you're mentioning, the internet, didn't expect. Let's solve, you know what I say
link |
02:07:31.520
at comma about this, we could talk about this, but let's solve self driving, guys, first. Gotta stay
link |
02:07:36.000
focused on the mission don't don't don't you're not too big to fail for however much i think
link |
02:07:40.160
comma is winning, like, no, no, no, no. You're winning when you solve level five self driving cars, and
link |
02:07:44.960
until then you haven't won and won and you know again you want to be arrogant in the face of
link |
02:07:49.200
other people great you want to be arrogant in the face of nature you're an idiot right stay
link |
02:07:53.920
mission focused brilliantly put uh like i mentioned thinking of launching a startup i've been considering
link |
02:07:59.440
actually before covid i've been thinking of moving to san francisco oh i wouldn't go there so why is uh
link |
02:08:07.520
well and now i'm thinking about potentially austin and we're in san diego now san diego come here
link |
02:08:14.800
so why what um i mean you're you're such an interesting human you've launched so many successful
link |
02:08:21.840
things what uh why san diego what do you recommend why not san francisco have you thought well so
link |
02:08:31.120
in your case, San Diego with Qualcomm and Snapdragon, I mean, that's an amazing combination,
link |
02:08:36.880
but that wasn't really why that wasn't the why no i mean qualcomm was an afterthought qualcomm
link |
02:08:41.520
was it was a nice thing to think about it's like you can have a tech company here and a good one
link |
02:08:45.600
i mean you know i like qualcomm but no um well so why san diego better than san francisco why does
link |
02:08:50.640
san francisco suck well so okay so first off we all kind of said like we want to stay in california
link |
02:08:55.120
people like the ocean you know california for for its flaws it's like a lot of the flaws of
link |
02:09:01.760
california are not necessarily california as a whole and they're much more san francisco specific
link |
02:09:05.600
yeah um san francisco so i think first year cities in general have stopped wanting growth uh well you
link |
02:09:13.680
have like in san francisco you know the voting class always votes to not build more houses because
link |
02:09:18.800
they own all the houses and they're like well you know once people have figured out how to vote
link |
02:09:23.360
themselves more money they're going to do it it is so insanely corrupt um it is not balanced at all
link |
02:09:29.120
like political party wise you know it's it's it's a one party city and for all the discussion of
link |
02:09:35.520
diversity, yeah, it's, it's lacking real diversity of thought, of background, of, uh, approaches,
link |
02:09:44.560
the strategies of yeah ideas it's it's kind of a strange place that it's the loudest people about
link |
02:09:52.960
diversity and the biggest lack of diversity well i mean that's that's what they say right it's the
link |
02:09:58.720
projection projection yeah yeah it's interesting and even people in silicon valley tell me that's uh
link |
02:10:06.000
like high up people that everybody is like this is a terrible place it doesn't make i mean and
link |
02:10:10.800
coronavirus is really what killed it yeah uh san francisco was the number one uh exodus during
link |
02:10:17.360
coronavirus we still think san diego's uh is a good place to be yeah yeah i mean we'll see we'll
link |
02:10:24.720
see what happens with california a bit longer term yeah like austin's and austin's an interesting
link |
02:10:31.520
choice. I wouldn't, I wouldn't, I don't really have anything bad to say about Austin either, except for the
link |
02:10:36.160
extreme heat in the summer um which you know but that's like very on the surface right i think as
link |
02:10:40.640
far as like an ecosystem goes it's it's cool i personally love colorado colorado is great uh
link |
02:10:46.880
yeah i mean you have these states that are you know like just way better run um california is you
link |
02:10:53.600
know it's especially san francisco it's on its high horse and like yeah can i ask you for advice to
link |
02:11:01.440
me and to others about what it takes to build a successful startup? Oh, I don't know, I haven't
link |
02:11:08.080
done that. Talk to someone who did that. Well, you've, you know, uh, this is like another book of yours
link |
02:11:16.320
i'll buy for sixty seven dollars i suppose uh so there's um one of these days i'll sell out
link |
02:11:24.000
yeah that's right jail breaks are going to be a dollar and books are going to be 67 how i uh how
link |
02:11:29.360
i jail broke the iphone by george hotz that's right how i jail broke the iphone and you can
link |
02:11:35.200
do you can't do 67 in in 21 days that's right that's right oh god okay i can't wait but quite
link |
02:11:43.200
so you haven't introspected you have built a very unique company i mean not not you but you and others
link |
02:11:53.040
but i don't know um there's no there's nothing you haven't introspect you haven't really sat down
link |
02:11:59.520
and thought about like well like if you and i were having a bunch of we're having some beers
link |
02:12:06.080
and you're seeing that i'm depressed and whatever i'm struggling there's no advice you can give
link |
02:12:11.520
oh i mean more beer more beer
link |
02:12:18.400
yeah i think it's all very like situation dependent um here's okay if i can give a generic
link |
02:12:25.040
piece of advice it's the technology always wins the better technology always wins and lying always
link |
02:12:32.560
loses build technology and don't lie i'm with you i agree very much the long run long run
link |
02:12:41.280
sure it's the long run you know what the market can remain irrational longer than you can remain
link |
02:12:45.440
solvent true fact well this is this is an interesting point because i ethically and just as a human
link |
02:12:51.680
believe that um like like hype and smoke and mirrors is not at any stage of the company is a good
link |
02:13:01.280
strategy i mean there's some like you know pr magic kind of like you know you want a new product
link |
02:13:08.080
yeah if there's a call to action if there's like a call to action like buy my new gpu look at it
link |
02:13:12.960
it takes up three slots and it's this big it's huge buy my gpu yeah that's great if you look at
link |
02:13:17.920
you know especially in that in ai space broadly but autonomous vehicles like you can raise a huge
link |
02:13:23.840
amount of money on nothing and the question to me is like i'm against that i'll never be part of
link |
02:13:31.200
that i don't think i hope not willingly not but like is there something to be said to uh
link |
02:13:42.000
essentially lying to raise money, like fake it till you make it kind of thing. I mean, this is
link |
02:13:48.480
Billy McFarland and the Fyre Festival, like we all, we all experienced, uh, you know, what happens with
link |
02:13:54.000
that no no don't fake it till you make it be honest and hope you make it the whole way the
link |
02:14:00.560
technology wins right the technology wins and like there is i'm not i use like the anti hype you
link |
02:14:06.800
know that's that's a slava kpss reference but um hype isn't necessarily bad i loved camping out for
link |
02:14:14.640
the iphones um you know and as long as the hype is backed by like substance as long as it's backed
link |
02:14:21.520
by something i can actually buy and like it's real then hype is great and it's a great feeling
link |
02:14:27.760
it's when the hype is backed by lies that it's a bad feeling. I mean, a lot of people call, you know,
link |
02:14:33.040
Musk a fraud. How could he be a fraud? I've noticed this, this kind of interesting effect, which is
link |
02:14:38.880
he does tend to over promise and deliver what's what's the better way to phrase it promise a
link |
02:14:46.560
timeline that he doesn't deliver on he delivers much later on what do you think about that because
link |
02:14:52.800
i i do that i think that's a programmer thing yeah i do that as well you think that's a really
link |
02:14:58.480
bad thing to do is is that okay i think that's again as long as like you're working toward it
link |
02:15:05.200
and you're gonna deliver on and it's not too far off right yeah right like like you know the whole
link |
02:15:12.960
the whole autonomous vehicle thing, it's like, I mean, I still think Tesla is on track to beat
link |
02:15:19.440
us i still think even with their even with their missteps they have advantages we don't have um
link |
02:15:25.520
you know elon is is better than me at at at like marshalling massive amounts of resources so you
link |
02:15:33.600
know i still think given the fact they're maybe making some wrong decisions they'll end up winning
link |
02:15:39.280
and like it's fine to hype it if you're actually gonna win right like if elon says look we're
link |
02:15:46.080
gonna be landing rockets back on earth in a year and it takes four like you know he landed a rocket
link |
02:15:52.160
back on earth and he was working toward it the whole time i think there's some amount of like
link |
02:15:56.960
i think what it becomes wrong is if you know you're not gonna meet that deadline if you're lying
link |
02:16:01.520
yeah that's brilliantly put like this is what people don't understand i think like elon believes
link |
02:16:08.320
everything he says he does as far as i can tell he does and i i detected that in myself too like if i
link |
02:16:16.320
it's only bullshit if you if you're like conscious of yourself lying yeah i think so yeah no you can't
link |
02:16:23.760
take that to such an extreme right like in a way i think maybe billy mcfarland believed everything
link |
02:16:28.160
he said too right that's how you start a cult and and everybody uh kills themselves yeah yeah like
link |
02:16:34.880
it's you need you need if there's like some factor on it it's fine and you need some people to like
link |
02:16:41.280
you know keep you in check but like if you deliver on most of the things you say and just the timelines
link |
02:16:47.360
are off man it does piss people off though i wonder but who cares in in a long arc of history the
link |
02:16:54.720
people everybody gets pissed off at the people who succeed which is one of the things that
link |
02:16:59.280
frustrates me about this world is um they don't celebrate the success of others like
link |
02:17:09.040
there's so many people that want elon to fail it's so fascinating to me like what is wrong with you
link |
02:17:18.000
like, so Elon Musk talks about, like, people who short, like they talk about financial,
link |
02:17:23.360
but i think it's much bigger than the financials i've seen like the human factors community
link |
02:17:27.120
be they want they want other people to fail why why why like even people the harshest thing is like
link |
02:17:36.560
you know even people that like seem to really hate donald trump they want him to fail
link |
02:17:41.840
or like the other president, or they want Barack Obama to fail. It's like,
link |
02:17:49.600
it's weird but i i want that i would love to inspire that part of the world to change because
link |
02:17:54.400
well dammit if the human species is going to survive we can celebrate success like it seems
link |
02:18:01.040
like the efficient thing to do in this objective function that like we're all striving for is to
link |
02:18:06.640
celebrate the ones that like figure out how to like do better at that objective function as
link |
02:18:11.840
opposed to like dragging them down back into them into the mud i think there is this is the speech
link |
02:18:18.560
i always give about the commenters on hacker news um so first off something to remember about the
link |
02:18:23.520
internet in general is commenters are not representative of the population yeah i don't
link |
02:18:29.520
comment on anything i don't you know commenters are are are representative of a a certain sliver
link |
02:18:35.200
of the population and on hacker news a common thing i'll see is when you'll see something that's like
link |
02:18:41.920
you know promises to be wild out there and innovative there is some amount of you know
link |
02:18:49.120
know checking them back to earth but there's also some amount of if this thing succeeds
link |
02:18:55.200
well i'm 36 and i've worked at large tech companies my whole life
link |
02:19:02.160
they can't succeed because if they succeed that would mean that i could have done something
link |
02:19:06.800
different with my life but we know that i couldn't have we know that i couldn't have and and that's
link |
02:19:10.720
why they're gonna fail and they have to root for them to fail to kind of maintain their world image
link |
02:19:15.040
so tune it out and they comment well it's hard i so one of the things one of the things i'm
link |
02:19:22.560
considering startup wise is to change that because i think the i think it's also a technology
link |
02:19:31.200
problem it's a platform problem i agree it's like because the thing you said most people don't comment
link |
02:19:37.040
it. I think most people want to comment, they just don't, because it's all the assholes who are
link |
02:19:45.280
commenting i don't want to be grouped in with them i'm not you don't want to be at a party
link |
02:19:49.280
where everyone is an asshole and so they but that's a platform's problem that's i can't believe
link |
02:19:54.880
what Reddit's become. I can't believe the groupthink in Reddit comments. There's, Reddit is an
link |
02:20:02.080
interesting one because there are subreddits, and so you can still see, especially small subreddits,
link |
02:20:09.120
that like that are little like havens of like joy and positivity and like deep even disagreement
link |
02:20:16.960
but like nuanced discussion but it's only like small little pockets but that's that's emergent
link |
02:20:23.280
the platform is not helping that or hurting that so i guess naturally something about the internet
link |
02:20:30.160
that if you don't put in a lot of effort to encourage nuance and positive good vibes
link |
02:20:37.120
it's naturally going to decline into chaos i would love to see someone do this well
link |
02:20:42.800
yeah i think it's yeah very doable this is i think actually so i feel like twitter could be overthrown
link |
02:20:51.280
yeah ashwabak talked about how like uh if you have like and retweet like that's only positive
link |
02:21:00.400
wiring right the only way to do anything like negative there is um with a comment and that's
link |
02:21:09.200
like that asymmetry is what gives you know twitter its particular toxicness whereas i find youtube
link |
02:21:16.320
comments to be much better because youtube comments have a have a have an up and a down and they don't
link |
02:21:21.760
show the downvotes without getting into the depth of this particular discussion the point is to explore
link |
02:21:28.560
possibilities and get a lot of data on it because uh i mean i could disagree with what you just said
link |
02:21:33.920
it's the point is it's unclear it's a it hasn't been explored in a really rich way uh like these
link |
02:21:41.040
questions of how to create platforms that encourage positivity yeah i think it's a it's a technology
link |
02:21:48.720
problem and i think we'll look back at twitter as it is now maybe it'll happen within twitter
link |
02:21:53.600
but most likely somebody overthrows them is we'll look back at twitter and and say we can't
link |
02:22:00.560
believe we put up with this level of toxicity you need a different business model too any any social
link |
02:22:05.840
network that fundamentally has advertising as a business model this was in the social dilemma
link |
02:22:10.320
which i didn't watch but i liked it it's like you know there's always the you know you're the
link |
02:22:13.200
product you're not the uh but they had a nuanced take on it that i really liked and it said the
link |
02:22:19.200
product being sold is influence over you the product being sold is literally your you know
link |
02:22:27.520
influence on you like that can't be if that's your idea okay well you know guess what it can't
link |
02:22:35.760
not be toxic yeah maybe there's ways to spin it like with with giving a lot more control to the
link |
02:22:42.000
user and transparency to see what is happening to them as opposed to in the shadows as possible
link |
02:22:47.120
but that can't be the primary source but the users aren't no one's going to use that it depends it
link |
02:22:52.560
depends it depends i think i think that the you're you're not going to you can't depend on
link |
02:22:58.400
self awareness of the users it's a it's another it's a longer discussion because uh you can't
link |
02:23:03.520
depend on it but you can reward self awareness like if for the ones who are willing to put in
link |
02:23:11.840
the work of self awareness you can reward them and incentivize and perhaps be pleasantly surprised
link |
02:23:17.680
how many people are willing to be self aware on the internet like we are in real life like i'm
link |
02:23:24.800
putting in a lot of effort with you right now being self aware about if i say something stupid
link |
02:23:29.200
or mean i'll like look at your like body language like i'm putting in that effort it's costly for
link |
02:23:34.320
an introvert very costly but on the internet fuck it like most people are like i don't care if this hurts
link |
02:23:42.800
somebody i don't care if this uh is not interesting or if this is yeah the mean or whatever i think
link |
02:23:49.040
so much of the engagement today on the internet is so disingenuous too yeah you're not doing this
link |
02:23:53.760
out of a genuine this is what you think you're doing this just straight up to manipulate others
link |
02:23:57.520
whether you're in you just became an ad okay let's talk about a fun topic which is programming
link |
02:24:04.240
here's another book idea for you let me pitch uh what's your uh perfect programming setup so like
link |
02:24:11.280
this by george hotz so uh like what listen give me a macbook air sit me in a corner
link |
02:24:19.280
of a hotel room and you know i'll still ask so you really don't care you don't fetishize like
link |
02:24:24.000
multiple monitors keyboard uh those things are nice and i'm not going to say no to them but
link |
02:24:30.480
do they automatically unlock tons of productivity no not at all i have definitely been more productive
link |
02:24:35.280
on a mac book air in a corner of a hotel room what about um ide so uh which operating system
link |
02:24:44.480
do you love what uh text editor do you use ide what um is there is there something that
link |
02:24:51.200
that is like the perfect if you could just say the perfect productivity setup for george hotz
link |
02:24:57.600
doesn't matter doesn't it doesn't matter literally doesn't matter you know i i guess i code most of
link |
02:25:01.840
the time in vim like literally i'm using an editor from the 70s you know you didn't
link |
02:25:05.840
make anything better okay vs code is nice for reading code there's a few things that are nice
link |
02:25:09.760
about it uh i think that there you can build much better tools how like ida's xrefs work way better
link |
02:25:15.440
than vs code's why yeah actually that's a good question like why i i still use sorry emacs for
link |
02:25:23.520
most uh i've actually never i have to confess something dark so i've never used vim okay
link |
02:25:31.920
this i think maybe i'm just afraid that my life has been a like a waste i'm so i'm not i'm not even
link |
02:25:42.400
evangelical about emacs i think this this is how i feel about tensorflow vs pytorch yeah having
link |
02:25:48.080
just like we've switched everything to pytorch now put months into the switch i have felt like
link |
02:25:52.400
i've wasted years on tensorflow i can't believe it i can't believe how much better pytorch is yeah
link |
02:25:59.440
i've used emacs and vim doesn't matter yeah still just my my heart somehow i fell in love with lisp
link |
02:26:04.560
i don't know why you can't the heart wants what the heart wants i don't i don't understand it but
link |
02:26:09.440
it just connected with me maybe it's the functional language at first i connected with maybe it's because
link |
02:26:13.840
so many of the ai courses before the deep learning revolution were taught with lisp in mind i don't
link |
02:26:19.520
know i don't know what it is but i'm i'm stuck with it but at the same time like why am i not using
link |
02:26:24.160
a modern ide for some of this programming i don't know they're not that much better i've used modern
link |
02:26:28.880
ides too but at the same time so like just not to disagree with you but like i like
link |
02:26:34.000
multiple monitors like i have to do work on a laptop and it's a it's a pain in the ass and also i'm
link |
02:26:42.160
addicted to the kinesis weird keyboard that you could you could see there uh yeah so you don't
link |
02:26:49.360
have any of that you can just be in a on a macbook i mean look at work i have three 24 inch monitors
link |
02:26:55.040
i have a happy hacking keyboard i have a razer deathadder mouse like but it's not essential for you
link |
02:27:01.200
now let's go to uh day in the life of george hotz what is the perfect day productivity wise so we're
link |
02:27:08.800
not talking about like hunter s thompson uh drugs and uh let's let's look at productivity like what
link |
02:27:16.800
what's the day look like i'm like hour by hour is there any regularities that create a magical
link |
02:27:24.000
george hotz experience i can remember three days in my life and i remember these days vividly
link |
02:27:30.240
when i've gone through kind of radical transformations to the way i think and what i would give i would
link |
02:27:38.880
pay a hundred thousand dollars if i could have one of these days tomorrow um the days have been so
link |
02:27:43.760
impactful and one was first discovering eliezer yudkowsky on the singularity and reading that stuff
link |
02:27:50.560
and like you know my mind was blown um and the next was discovering uh the hutter prize and that
link |
02:27:58.000
ai is just compression like finally understanding aixi and what all of that was you know i like
link |
02:28:04.080
read about it when i was 18 19 i didn't understand it and then the fact that like lossless compression
link |
02:28:08.320
implies intelligence the day that i was shown that um and then the third one is controversial
link |
02:28:14.000
the day i found a blog called unqualified reservations and uh read that and i was like
link |
02:28:20.400
wait which one is that that's uh what's the guy's name curtis yarvin yeah so many people tell me
link |
02:28:26.080
i'm supposed to talk to him yeah he sounds insane or brilliant but insane or both i don't know
link |
02:28:33.040
the day i found that blog was another like this was during like like gamer gate and kind of the
link |
02:28:37.440
run up to the 2016 election and i'm like wow okay the world makes sense now this this like i had a
link |
02:28:43.840
framework now to interpret this just like i got the framework for ai and a framework to
link |
02:28:47.760
interpret technological progress like those days when i discovered these new frameworks or oh
link |
02:28:52.960
interesting it's just not about but what was special about those days how those days come to be
link |
02:28:58.720
is it just you got lucky like sure like what you just encountered the hutter prize on uh on
link |
02:29:05.760
hacker news or something like that um like what but you see i don't think it's just see i don't
link |
02:29:12.160
think it's just that like i could have gotten lucky at any point i think that in a way you
link |
02:29:16.320
were ready at that moment yeah exactly to receive the information but is there some magic to the
link |
02:29:22.880
day today of like like eating breakfast and it's the mundane things nothing no i drift i drift
link |
02:29:31.680
through life without structure i drift through life hoping and praying that i will get another day
link |
02:29:37.280
like those days and there's nothing in particular you do to uh to be a receptacle for another for
link |
02:29:43.680
day number four no i didn't do anything to get the other ones so i don't think i have to really
link |
02:29:49.600
do anything now i took a month long trip to new york and the ethereum thing was the
link |
02:29:55.600
highlight of it but the rest of it was pretty terrible i did a two week road trip and i got
link |
02:30:00.720
i had to turn around i had to turn around i'm driving in uh in gunnison colorado i passed
link |
02:30:06.800
through gunnison and uh the snow starts coming down there's a pass up there called monarch pass
link |
02:30:12.000
in order to get through to denver you got to get over the rockies and i had to turn my car around
link |
02:30:16.080
i couldn't i watched uh i watched a f150 go off the road i'm like i gotta go back and
link |
02:30:23.120
like that day was meaningful because like like it was real like i actually had to turn my car
link |
02:30:27.520
around um it's rare that anything even real happens in my life even as you know mundane is the fact
link |
02:30:33.760
that yeah there was snow i had to turn around stay in gunnison leave the next day something about
link |
02:30:38.000
that moment felt real okay so actually it's interesting to break apart the three moments
link |
02:30:43.680
you mentioned if it's okay so it'll uh i always have trouble pronouncing his name but
link |
02:30:49.040
eliezer yudkowsky yeah so what how did your worldview change in starting to consider the
link |
02:31:01.280
the exponential growth of ai and agi that he thinks about and the the threats of artificial
link |
02:31:06.880
intelligence and all that kind of ideas like can you is it just like can you maybe uh break apart
link |
02:31:12.400
like what exactly was so magical to you as a transformational experience today everyone knows
link |
02:31:18.240
him for threats and ai safety um this was pre that stuff there was i don't think a mention of ai
link |
02:31:23.520
safety on the page um this is this is old yudkowsky stuff he'd probably denounce it all now he'd
link |
02:31:29.120
probably be like that's exactly what i didn't want to happen sorry man uh is there something
link |
02:31:35.840
specific you can take from his work that you can remember yeah uh it was this realization that
link |
02:31:40.960
uh computers double in power every 18 months and humans do not and they haven't crossed yet
link |
02:31:50.000
but if you have one thing that's doubling every 18 months and one thing that's staying like this
link |
02:31:54.720
you know here's your log graph here's your line you know you calculate that
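(editor's note: a back-of-the-envelope sketch of the crossover george is describing, purely illustrative -- the 18-month doubling period comes from the conversation, while the 1000x starting gap is a made-up placeholder, not an estimate of anything.)

```python
import math

# back-of-the-envelope sketch of the "doubling every 18 months" point above.
# the 1000x starting gap is a made-up placeholder, not a real estimate.
doubling_period_months = 18
human_advantage = 1000.0  # hypothetical: humans start 1000x "ahead"

# on a log plot the machine line rises one doubling per period while the human
# line stays flat, so the crossover is just log2(gap) doublings away.
doublings_needed = math.log2(human_advantage)
months_to_cross = doublings_needed * doubling_period_months
print(f"{doublings_needed:.1f} doublings, ~{months_to_cross / 12:.0f} years to crossover")
```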
link |
02:32:01.440
okay and that did that open the door to the exponential thinking like thinking that like
link |
02:32:06.320
you know what with technology we can actually transform the world it opened the door to human
link |
02:32:12.320
obsolescence it it opened the door to realize that in my lifetime humans are going to be replaced
link |
02:32:20.480
and then the matching idea to that of artificial intelligence with the hutter prize
link |
02:32:26.880
you know i'm torn i go back and forth on what i think about it yeah but the the the basic thesis
link |
02:32:33.520
is it's nice it's a nice compelling notion that we can reduce the task of creating an
link |
02:32:38.320
intelligent system a generally intelligent system into the task of compression so you
link |
02:32:44.160
you can think of all of intelligence in the universe in fact as a kind of compression
link |
02:32:50.080
do you find that a was that just at the time you found that as a compelling idea do you still
link |
02:32:54.320
find that a compelling idea i still find that a compelling idea um i think that it's not that
link |
02:33:00.240
useful day to day but actually um one of maybe my quests before that was a search for the definition
link |
02:33:07.600
of the word intelligence and i never had one and i definitely have a definition of the word compression
link |
02:33:14.560
it's a very uh simple uh straightforward one and uh you know what compression is you know what
link |
02:33:19.840
lossless is lossless compression not lossy lossless compression and that that is equivalent to
link |
02:33:24.960
intelligence which i believe i'm not sure how useful that definition is day to day but like i
link |
02:33:29.520
now have a framework to understand what it is and he just 10x the the uh the prize for that
link |
02:33:35.440
competition like recently a few months ago you ever thought of taking a crack at that oh i did
link |
02:33:40.880
oh i did i spent i spent the next after i found the prize i spent the next six months of my life
link |
02:33:46.960
trying it and uh well that's when i started learning everything about ai and then i worked
link |
02:33:52.240
with vicarious for a bit and then i read all the deep learning stuff and i'm like okay now
link |
02:33:55.840
i like i'm caught up to modern ai wow and i had i had a really good framework to put it all in
link |
02:34:01.280
from the compression stuff right like some of the first uh some of the first deep learning models
link |
02:34:06.640
i played with were uh gpt basically but before transformers when it was still uh rnns
link |
02:34:15.040
to to do uh character prediction but by the way on the compression side i mean the
link |
02:34:20.400
the especially neural networks what do you make of the lossless requirement with the hutter prize so
link |
02:34:27.760
you know human intelligence and neural networks can probably compress stuff pretty well but
link |
02:34:33.360
they would be lossy it's imperfect uh you can turn a lossy compressor into a lossless compressor
link |
02:34:38.720
pretty easily using an arithmetic encoder right you can take an arithmetic encoder and you can
link |
02:34:43.120
just encode the noise with maximum efficiency right so even if you can't predict exactly what
link |
02:34:48.880
the next character is the better a probability distribution you can put over the next character
link |
02:34:54.160
you can then use an arithmetic encoder to uh right you don't have to know whether it's an
link |
02:34:58.400
e or an i you just have to put good probabilities on them and then you know code those and if you
link |
02:35:03.840
have it's a bit of an entropy coding thing right
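(editor's note: a minimal sketch of the point above -- any predictor that puts probabilities on the next character can be turned into a lossless code whose length is the sum of -log2 p, which is what an arithmetic coder approaches. the unigram "model" here is a toy placeholder, not the hutter-prize or comma setup.)

```python
import math
from collections import Counter

def ideal_code_length_bits(text: str, model_probs) -> float:
    """Sum of -log2 p(next char): the length an ideal entropy coder
    (e.g. an arithmetic coder) would approach given these predictions."""
    return sum(-math.log2(model_probs(ch)) for ch in text)

text = "hello hello hello world"

# placeholder "model": unigram character frequencies estimated from the text itself
counts = Counter(text)
total = len(text)
unigram = lambda ch: counts[ch] / total

print(f"raw size: {len(text) * 8} bits")
print(f"ideal coded size with this model: {ideal_code_length_bits(text, unigram):.1f} bits")
# a better predictor -> higher probability on what actually comes next -> fewer bits,
# which is the sense in which better prediction equals better lossless compression
```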
link |
02:35:08.800
so let me on that topic it'd be interesting as a little side tour what are your thoughts this year about gpt-3 and these language models and these
link |
02:35:14.960
transformers is there something interesting to you as an ai researcher or is there something
link |
02:35:21.440
interesting to you as an autonomous vehicle developer nah i think uh i think it's arrived
link |
02:35:27.280
i mean it's not like it's cool it's cool for what it is but no we're not just going to be able to
link |
02:35:31.760
scale up to gpt-12 and get general purpose intelligence like your loss function is literally
link |
02:35:37.760
just you know you know cross entropy loss on the character right like that's not the loss function
link |
02:35:42.400
of general intelligence
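(editor's note: a minimal pytorch sketch of what "cross entropy loss on the character" means -- next-character prediction scored by cross-entropy. the tiny model and byte-level vocabulary are made up for illustration, not gpt's actual architecture.)

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, embed_dim = 256, 64          # byte-level "characters"; sizes are arbitrary
model = nn.Sequential(                   # stand-in for a real transformer stack
    nn.Embedding(vocab_size, embed_dim),
    nn.Linear(embed_dim, vocab_size),
)

text = b"hello world"
tokens = torch.tensor(list(text), dtype=torch.long)
inputs, targets = tokens[:-1], tokens[1:]        # predict the next character

logits = model(inputs)                           # (seq_len, vocab_size)
loss = F.cross_entropy(logits, targets)          # the loss function being discussed
print(loss.item())                               # in nats; divide by log(2) for bits per char
```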
link |
02:35:52.160
is that obvious to you yes can you imagine that like to play devil's advocate on yourself is it possible that gpt-12 will achieve general intelligence
link |
02:35:58.400
with something as dumb as this kind of loss function i guess it depends what you mean by
link |
02:36:03.520
general intelligence so there's another problem with the gpt's and that's that they don't have a
link |
02:36:09.840
uh they don't have long term memory right all right so like just gpt-12 a scaled up version
link |
02:36:19.600
of gpt-2 or 3 i find it hard to believe well you can scale it and it's yeah so it's a hard
link |
02:36:29.520
coded length but you can make it wider and wider and wider yeah you're gonna get
link |
02:36:37.200
you're gonna get cool things from those systems but i don't think you're ever gonna get something
link |
02:36:44.720
that can like you know build me a rocket ship what about solve driving so you know you can use
link |
02:36:51.840
transformer with video for example you think is there something in there no because i mean look
link |
02:36:59.680
we use we use a gru we use a gru we could change that gru out to a transformer
link |
02:37:03.520
um i think driving is much more markovian than language so markovian you mean like the memory
link |
02:37:11.360
which which aspect of markovian i mean that like most of the information in the state at t minus
link |
02:37:17.360
one is also in the state at t right and it kind of like drops off nicely like this whereas
link |
02:37:22.960
sometime with language you have to refer back to the third paragraph on the second page i feel like
link |
02:37:28.000
there's not many like like you can say like speed limit signs but there's really not many things
link |
02:37:32.240
in autonomous driving that look like that but if you look at uh to play devil's advocate is uh the
link |
02:37:37.760
risk estimation thing that you've talked about is kind of interesting is uh it feels like there might
link |
02:37:42.880
be some longer term uh aggregation of context necessary to be able to figure out like the context
link |
02:37:52.000
i'm not even sure i'm believing my own devil's advocate here we have a nice we have a nice like vision
link |
02:37:57.840
model which outputs like a 1024 dimensional perception space um can i try transformers
link |
02:38:04.400
on it sure i probably will at some point we'll try transformers and then we'll just see do
link |
02:38:08.960
they do better sure i'm but it might not be a game changer you know well i'm not like like
link |
02:38:13.440
might transformers work better than grus for autonomous driving sure might we switch sure
link |
02:38:17.760
is this some radical change no okay we use a slightly different you know we switch from
link |
02:38:22.000
rnns to grus like okay maybe it's grus to transformers but no it's not
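(editor's note: an illustrative pytorch sketch of the gru-versus-transformer swap being discussed. the shapes and layer sizes are invented and do not reflect comma's actual model -- the point is only that the temporal module is a swappable component over per-frame perception features.)

```python
import torch
import torch.nn as nn

# made-up shapes: a batch of 8 clips, 20 frames each, with a 1024-d perception
# feature per frame (nothing here reflects comma's real architecture)
feats = torch.randn(8, 20, 1024)

# temporal model option 1: a GRU over the frame features
gru = nn.GRU(input_size=1024, hidden_size=512, batch_first=True)
gru_out, _ = gru(feats)                      # (8, 20, 512)

# temporal model option 2: a small transformer encoder over the same features
layer = nn.TransformerEncoderLayer(d_model=1024, nhead=8, batch_first=True)
transformer = nn.TransformerEncoder(layer, num_layers=2)
tr_out = transformer(feats)                  # (8, 20, 1024)

# either output could feed the same downstream policy head, which is why
# "gru vs transformer" is an experiment to run rather than a radical change
```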
link |
02:38:28.960
yeah well on the topic of general intelligence i don't know how much i've talked to you about it like what um
link |
02:38:34.720
do you think we'll actually build an agi like if you look at ray kurzweil with the singularity do you
link |
02:38:41.120
do you have like an intuition about you're kind of saying driving is easy yeah and i i tend to
link |
02:38:49.040
personally believe that solving driving will have really deep important impacts on our ability to
link |
02:38:57.360
solve general intelligence like i i think driving doesn't require general intelligence but i think
link |
02:39:03.760
they're going to be neighbors in a way that it's like deeply tied because it's so like driving is so
link |
02:39:10.560
deeply connected to the human experience that i think solving one will help solve the other
link |
02:39:17.280
but but so i don't see i don't see driving is like easy and almost like separate than
link |
02:39:22.240
general intelligence but like what's your vision of the future with a singularity do you see there
link |
02:39:27.120
will be a single moment like a singularity where it'll be a phase shift are we in the singularity
link |
02:39:31.840
now like what do you have crazy ideas about the future in terms of agi we're definitely in the
link |
02:39:36.320
singularity now um we are of course of course look at the bandwidth between people the bandwidth
link |
02:39:41.760
between people goes up all right um the singularity is just you know in the bandwidth but what do you
link |
02:39:47.440
mean by the bandwidth of people communications tools the whole world is networked the whole world
link |
02:39:51.680
is networked and we raise the speed of that network right oh so you think the communication of
link |
02:39:56.240
information in a distributed way is a empowering thing for collective intelligence oh i didn't
link |
02:40:02.480
say it's necessarily a good thing but i think that's like when i think of the definition of the
link |
02:40:05.600
singularity yeah it seems kind of right i see like it's a change in the world beyond which
link |
02:40:13.440
like the world would be transformed in ways that we can't possibly imagine no i mean i think we're
link |
02:40:17.200
in the singularity now in the sense that there's like you know one world in a monoculture and it's
link |
02:40:21.120
also linked yeah i mean i kind of share the intuition that the the singularity will originate
link |
02:40:27.600
from the collective intelligence of us ants versus the like some single system agi type thing
link |
02:40:35.120
oh i totally agree with that yeah i don't i don't really believe in like like a hard takeoff agi
link |
02:40:40.400
kind of thing um yeah i don't think i don't even think ai is all that difference in kind
link |
02:40:49.440
from what we've already been building um with respect to driving i think driving is a subset
link |
02:40:55.280
of general intelligence and i think it's a pretty complete subset i think the tools we develop at
link |
02:40:59.440
comma will also be extremely helpful to solving general intelligence and that's i think the
link |
02:41:05.120
real reason why i'm doing it i don't care about self driving cars it's a cool problem to beat people
link |
02:41:09.440
at but yeah i mean yeah you're kind of you're of two minds so one you do have to have a mission
link |
02:41:16.240
and you want to focus and make sure you get you get there you can't forget that but at the same time
link |
02:41:22.160
there is a thread that's much bigger than uh than this the entirety of your effort that's much
link |
02:41:28.720
bigger than just driving with ai and with general intelligence it is so easy to delude yourself
link |
02:41:35.040
into thinking you've figured something out when you haven't if we build a level five
link |
02:41:38.640
self driving car we have indisputably built something yeah is it general intelligence
link |
02:41:44.640
i'm not going to debate that i will say we've built something that provides huge financial value
link |
02:41:49.600
yeah beautifully put that's the engineering credo like just just build the thing it's like
link |
02:41:54.160
that's why i'm with uh with uh with with elon on uh go to mars yeah that's a great one you can
link |
02:41:59.920
argue like who the hell cares about going to mars but the reality is set that as a mission get it
link |
02:42:06.560
done yeah and then you're going to crack some problem that you've never even expected in the
link |
02:42:11.760
process of doing that yeah yeah i mean no i think if i had a choice between you managed to go on a
link |
02:42:17.040
mars and solving self driving cars i think going to mars is uh better but i don't know i'm more
link |
02:42:22.000
suited for self driving cars i'm an information guy i'm not a modernist i'm a postmodernist
link |
02:42:26.400
postmodernist all right beautifully put let me let me drag you back to programming for a second
link |
02:42:32.160
what three maybe three to five programming languages should people learn do you think like if you look
link |
02:42:37.120
at yourself what did you get the most out of from learning uh well so everybody should learn
link |
02:42:44.800
c and assembly we'll start with those two right assembly yeah if you can't code in assembly you
link |
02:42:49.920
don't know what the computer's doing you don't understand like you don't have to be great in
link |
02:42:54.080
assembly but you have to code in it and then like you have to appreciate assembly in order to
link |
02:42:59.200
appreciate all the great things c gets you and then you have to code in c in order to appreciate
link |
02:43:04.080
all the great things python gets you so i'll just say assembly c and python we'll start with those three
link |
02:43:09.600
the memory allocation of c and the the the fact that so assembly is to give you a sense of just
link |
02:43:16.480
how many levels of abstraction you get to work on in modern day programming yeah yeah graph coloring
link |
02:43:21.680
for register assignment in compilers yeah like you know you got to do you know the
link |
02:43:26.080
compiler your computer only has a certain number of registers yet you can have all the variables
link |
02:43:29.280
you want in a c function you know so you get to start to build intuition about compilation like what a compiler gets you
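(editor's note: a toy sketch of the graph-coloring view of register allocation mentioned above -- variables that are live at the same time interfere and must get different registers. real compilers use far more sophisticated allocators with spilling and coalescing; the variables and register names here are made up.)

```python
# toy register allocation as graph coloring: variables are nodes, an edge means
# two variables are live at the same time, and each "color" is a machine register
interference = {
    "a": {"b", "c"},
    "b": {"a", "c", "d"},
    "c": {"a", "b"},
    "d": {"b"},
}
registers = ["r0", "r1", "r2"]

assignment = {}
# greedily color the most constrained (highest-degree) variables first
for var in sorted(interference, key=lambda v: len(interference[v]), reverse=True):
    taken = {assignment[n] for n in interference[var] if n in assignment}
    free = [r for r in registers if r not in taken]
    assignment[var] = free[0] if free else "spill"   # no register left -> spill to memory
print(assignment)
```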
link |
02:43:34.480
what else um well then there is then there's kind of uh so those are
link |
02:43:41.600
all very imperative programming languages um then there's two other paradigms for programming that
link |
02:43:47.680
everybody should be familiar with um one of them is functional uh you should learn haskell and take
link |
02:43:52.720
that all the way through learn a language with dependent types like coq um learn that whole
link |
02:43:58.480
space like the very pl theory heavy languages and haskell is your favorite functional what is that
link |
02:44:04.960
the go to you'd say yeah i'm not a great haskell programmer i wrote a compiler in haskell once
link |
02:44:10.400
there's another paradigm and actually there's one more paradigm that i'll even talk about
link |
02:44:13.600
after that that i never used to talk about when i would think about this but the next paradigm
link |
02:44:16.880
is learn verilog or an hdl um understand this idea of all of the instructions executing at once
link |
02:44:25.360
if i have a block in verilog and i write stuff in it it's not sequential they all execute at once
link |
02:44:33.600
and then like think like that that's how the hardware works so i guess assembly doesn't quite
link |
02:44:39.280
get you that assembly is more about compilation and verilog is more about the hardware like
link |
02:44:44.720
giving a sense of what the hardware is actually doing assembly c python are straight like
link |
02:44:50.560
they sit right on top of each other in fact c is well let me see it's kind of coded in c but you
link |
02:44:55.760
could imagine the first c was coded in assembly and python is actually coded in c um so you know you
link |
02:45:00.640
can straight up go on that got it and then verilog gives you that's brilliant okay and then i think
link |
02:45:08.320
there's another one now karpathy calls it programming 2.0 which is learn uh i'm not even
link |
02:45:15.680
gonna don't learn tensorflow learn pytorch so machine learning we've got to come up with a
link |
02:45:20.800
better term than programming 2.0 or um but yeah it's a programming language learn it
link |
02:45:29.680
i wonder if it can be formalized a little bit better which we feel like we're in the early days
link |
02:45:33.920
of what that actually entails data driven programming data driven programming yeah
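(editor's note: "data-driven programming" in miniature -- a minimal pytorch sketch where the desired behavior is specified by example data and a loss rather than by explicit rules. the toy task of recovering y = 3x + 2 is invented for illustration.)

```python
import torch
import torch.nn as nn

# the "program" (here, y = 3x + 2) is never written down explicitly --
# it is specified by example data and recovered by optimization
xs = torch.linspace(-1, 1, 100).unsqueeze(1)
ys = 3 * xs + 2                              # the behavior we want, given as data

model = nn.Linear(1, 1)                      # the program, with learnable parameters
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(200):
    opt.zero_grad()
    loss = ((model(xs) - ys) ** 2).mean()    # the spec is a loss, not an if/else
    loss.backward()
    opt.step()

print(model.weight.item(), model.bias.item())   # should approach 3 and 2
```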
link |
02:45:41.600
but it's so fundamentally different as a paradigm than the others like it almost requires a different
link |
02:45:47.360
skill set but you think it's still yeah and pytorch versus tensorflow pytorch wins it's
link |
02:45:56.560
the fourth paradigm it's the fourth paradigm that i've kind of seen there's like this you know
link |
02:46:00.560
imperative functional hardware i don't know a better word for it and then ml do you have advice
link |
02:46:11.520
for people uh that want to you know get into programming want to learn programming you have a
link |
02:46:17.200
video uh what is programming noob lessons exclamation point and i think the top comment
link |
02:46:24.160
is like warning this is not for noobs uh do you have a noob like uh tldw for that video but also
link |
02:46:34.640
a noob friendly advice on how to get into programming you are never going to learn
link |
02:46:40.640
programming by watching a video called learn programming the only way to learn programming
link |
02:46:46.080
i think and the only one is it the only way everyone i've ever met who can program well
link |
02:46:50.000
learned it all in the same way they had something they wanted to do and then they tried to do it
link |
02:46:56.400
and then they were like oh well okay it'd kind of you know be nice if a computer could kind of do
link |
02:47:02.320
this and then you know that's how you learn you just keep pushing on a project
link |
02:47:08.880
so the only advice i have for learning programming is go program somebody wrote to me a question like
link |
02:47:14.480
they're looking to learn about recurrent neural networks and saying like my company
link |
02:47:19.920
is thinking of using recurrent neural networks for time series data but we don't really
link |
02:47:24.800
have an idea of where to use it yet we just want to like do you have any advice on how to learn about
link |
02:47:30.480
these are these kind of general machine learning questions and i think the answer is like actually
link |
02:47:37.200
have a problem that you're trying to solve and and just i see that stuff oh my god when people
link |
02:47:41.840
talk like that they're like i heard machine learning is important could you help us integrate
link |
02:47:46.640
machine learning with macaroni and cheese production you just i don't even you can't help
link |
02:47:53.040
these people like who lets you run anything who lets that kind of person run anything i think we're
link |
02:47:59.280
all we're all beginners at some point so it's not like they're a beginner it's it's like my problem
link |
02:48:06.400
is not that they don't know about machine learning my problem is that they think that machine learning
link |
02:48:10.720
has something to say about macaroni and cheese production or like i heard about this new technology
link |
02:48:16.960
how can i use it for why like i don't know what it is but how can i use it for why that's true
link |
02:48:24.080
you have to build up an intuition of how because you might be able to figure out a way but like
link |
02:48:28.240
the prerequisites you should have a macaroni and cheese problem to solve first exactly and then two
link |
02:48:34.160
you should have more traditional like in the learning process involve more traditionally applicable
link |
02:48:40.880
problems in the space of whatever that is of machine learning and then see if it can be applied
link |
02:48:46.480
to me at least start with tell me about a problem like if you have a problem you're like you know
link |
02:48:50.960
some of my boxes aren't getting enough macaroni in them um can we use machine learning to solve
link |
02:48:55.760
this problem that's much much better than how do i apply machine learning to macaroni and cheese
link |
02:49:01.520
one big thing maybe this is me uh talking to the audience a little bit because i get these days
link |
02:49:08.000
so many messages asking for advice on how to like learn stuff okay this this this is not me being mean
link |
02:49:18.000
i think this is quite profound actually is you should google it oh yeah like one of the uh
link |
02:49:27.120
like skills that you should really acquire as an engineer as a researcher as a thinker like one
link |
02:49:34.000
there's two two two complementary skills like one is with a blank sheet of paper with no internet
link |
02:49:39.760
to think deeply and then the other is to google the crap out of the questions you have like that's
link |
02:49:46.400
actually a skill people don't often talk about but like doing research like pulling at the thread
link |
02:49:51.920
like looking up different words going into like github repositories with two stars and like looking
link |
02:49:58.640
how they did stuff like looking at the code or going on twitter seeing like there's little
link |
02:50:04.320
pockets of brilliant people that are like having discussions like if you're a neuroscientist go
link |
02:50:09.920
into signal processing community if you're an ai person going into the psychology community like
link |
02:50:16.320
the switch communities i keep searching searching searching because it's so much better to invest
link |
02:50:23.760
in like finding somebody else who already solved your problem than it is to try to solve the
link |
02:50:29.920
problem and because they've often invested years of their life like entire communities are probably
link |
02:50:36.560
already out there who have tried to solve your problem i think they're the same thing i think
link |
02:50:41.280
you go try to solve the problem and then in trying to solve the problem if you're good at
link |
02:50:46.320
solving problems you'll stumble upon the person who solved it already yeah but the stumbling is
link |
02:50:50.880
really important i think that's the skill that people should really put especially in undergrad
link |
02:50:55.520
like search if you ask me a question how should i get started in deep learning especially like
link |
02:51:00.640
especially like that is just so googleable like the whole point is you google that and you get
link |
02:51:09.840
and you get a million pages and just start looking at them yeah start pulling at the threads start
link |
02:51:15.840
exploring start taking notes start getting advice from a million people that already like spent their
link |
02:51:21.920
life answering that question actually oh well yeah i mean that's definitely also yeah when people
link |
02:51:26.640
like ask me things like that i'm like trust me the top answer on google is much much better than
link |
02:51:30.400
anything i'm going to tell you right yeah people ask it's an interesting question let me know if
link |
02:51:38.480
you have any recommendations what three books technical or fiction or philosophical had an
link |
02:51:44.160
impact on your life or you wouldn't recommend perhaps uh maybe we'll start with the least
link |
02:51:49.920
controversial uh infinite jest um infinite jest is a david foster wallace yeah it's a book
link |
02:51:59.360
about wireheading really uh very enjoyable to read very well written you know you will you will you
link |
02:52:08.480
will grow as a person reading this book uh it's effort um and i'll set that up for the second
link |
02:52:14.320
book which is pornography that's called atlas shrugged um which um atlas shrugged is pornography
link |
02:52:22.480
i mean it is i will not i will not defend the i will not say atlas shrugged is a well written
link |
02:52:27.520
book um it is entertaining to read certainly just like pornography the production value
link |
02:52:32.480
isn't great um you know there's a 60 page uh monologue in there that ayn rand's editor really
link |
02:52:37.520
wanted to take out and she uh paid she paid out of her pocket to keep that 60 page monologue in
link |
02:52:44.480
the book um but it is a great book for a kind of framework um of human relations and i know a lot
link |
02:52:55.200
of people are like yeah but it's a terrible framework yeah but it's a framework just for context
link |
02:53:01.440
in a couple days i'm speaking with for probably four plus hours with yaron brook who's the main
link |
02:53:09.520
living remaining objectivist uh so i've always found this philosophy quite
link |
02:53:17.040
interesting on many levels one of which is how repulsive some percent a large percent of the population
link |
02:53:25.280
find it which is always uh always funny to me when people are like unable to even read a philosophy
link |
02:53:32.880
because um of some i think that says more about their psychological perspective on it yeah but
link |
02:53:41.120
but there is something about objectivism and ayn rand's philosophy that's really deeply connected to this
link |
02:53:49.040
idea of capitalism of uh the ethical life is the productive life um that was always um compelling
link |
02:53:59.600
to me it didn't seem as like i didn't seem to interpret it in the negative sense that some
link |
02:54:04.560
people do to be fair i read the book when i was 19 so it had an impact at that point yeah yeah and
link |
02:54:11.360
the bad guys in the book have this slogan from each according to their ability to each according
link |
02:54:15.840
to their need and i'm looking at this and i'm like these are the most cartoonish this is team rocket level
link |
02:54:21.520
cartoonishness right no bad and then when i realized that was actually the slogan of the communist party
link |
02:54:27.280
i'm like wait a second wait no no no no no just you're telling me this really happened
link |
02:54:32.960
yeah it's interesting i mean one of the criticisms of her work is she has a cartoonish
link |
02:54:38.080
view of good and evil like that there's like the the reality as jordan peterson says is that each
link |
02:54:45.280
of us have the capacity for good and evil in us as opposed to like there's some characters who are
link |
02:54:50.000
purely evil and some characters are purely good and that's in a way why it's pornographic
link |
02:54:55.040
the production value i love it well evil is punished and there's very clearly like you know
link |
02:54:59.680
no there's no there's no you know uh just like porn doesn't have uh you know like character
link |
02:55:06.480
growth well you know neither does atlas shrugged like brilliant well put but to a 19 year old
link |
02:55:12.160
george hotz it was uh it was good enough yeah yeah yeah what uh what's the third you have something um
link |
02:55:18.960
i could give i these these two i'll just throw out uh they're sci-fi uh permutation city um great
link |
02:55:24.320
things just try thinking about copies of yourself and then um that's by uh greg egan
link |
02:55:31.680
uh he's uh that might not be his real name some australian guy might not be australian
link |
02:55:35.600
i don't know um and then this one's online it's called the metamorphosis of prime intellect
link |
02:55:42.080
um it's a story set in a post singularity world it's interesting is there uh can you
link |
02:55:47.920
either of the worlds do you find something uh philosophically interesting in them that you can
link |
02:55:51.760
comment on i mean it is clear to me that uh metamorphosis of prime intellect is like written by
link |
02:56:00.480
an engineer uh which is it's very it's very almost a pragmatic take on a utopia in a way
link |
02:56:12.480
positive or negative is that well that's up to you to decide reading the book and the ending of it
link |
02:56:18.560
is very interesting as well and i didn't realize what it was i first read that when i was 15 i've
link |
02:56:25.200
reread that book several times in my life and it's short it's 50 pages i want to go read it
link |
02:56:30.720
what's uh sorry it's a little tangent i've been working through the foundation i've been i've
link |
02:56:35.360
haven't read much sci fi my whole life and i'm trying to fix that uh the last few months
link |
02:56:40.080
that's been a little side project what's uh to use the greatest sci fi novel uh that uh people
link |
02:56:46.960
should read or is it or i mean i would i would yeah i would i would say like yeah permutation
link |
02:56:51.520
city metamorphosis of prime intellect um i don't know i i didn't like foundation i thought it was
link |
02:56:56.800
way too modernist you like dune and like all of those i've never read dune i've never read dune
link |
02:57:02.720
i i have to read it uh fire upon the deep is interesting uh okay i mean look everyone should
link |
02:57:10.240
read everyone should read neuromancer everyone should read snow crash if you haven't read those
link |
02:57:13.600
like start there um yeah i haven't read snow crash never snow crash oh it's i mean it's very
link |
02:57:18.080
interesting gödel escher bach and if you want the controversial one bronze age mindset
link |
02:57:25.120
all right i'll look into that one those aren't sci fi but just to round out books
link |
02:57:31.520
so a bunch of people asked me on twitter and read it and so on
link |
02:57:35.760
for advice so what advice would you give a young person today about life
link |
02:57:40.000
what uh yeah i mean looking back especially when you were younger you did uh and you
link |
02:57:50.640
continued it you've accomplished a lot of interesting things is there some advice from those
link |
02:57:59.600
from that life of yours that you can pass on if college ever opens again i would love
link |
02:58:04.720
to give a graduation speech um at that point i will put a lot of somewhat satirical effort
link |
02:58:11.200
into this question yeah at this uh you haven't written anything at this point oh you know what
link |
02:58:16.560
always wear sunscreen this is water like you're plagiarizing i mean you know but but that's the
link |
02:58:23.760
that's the like clean your room you know yeah you can plagiarize from all of this stuff and it's
link |
02:58:28.800
it's there is no self help books aren't designed to help you they're designed to make you feel good
link |
02:58:39.920
like whatever advice i could give you already know everyone already knows sorry it doesn't feel good
link |
02:58:50.080
right like you know you know i don't know what if i tell you that you should you know
link |
02:58:55.680
know eat well and and and read more and it's not going to do anything i think the whole like genre
link |
02:59:03.440
of those kind of questions is is is meaningless i don't know if anything it's don't worry so much
link |
02:59:09.760
about that stuff don't be so caught up in your head right i mean you're yeah in the sense that your
link |
02:59:15.520
whole life if your whole existence is like moving version of that advice i don't know yeah
link |
02:59:21.920
and there's there's something i mean there's something in you that resists that kind of
link |
02:59:27.440
thinking in that in itself is it's just illustrative of who you are and there's something to learn from
link |
02:59:35.600
that i think you're you're clearly not overthinking stuff yeah and you know there's a gut thing i
link |
02:59:43.280
even when i talk about my advice i'm like my advice is only relevant to me yeah it's not
link |
02:59:47.600
relevant to anybody else i'm not saying you should go out if you're the kind of person who
link |
02:59:50.880
overthinks things to stop overthinking things it's not bad it doesn't work for me maybe it works for
link |
02:59:55.360
you i you know i don't know let me ask you about love yeah uh so you i think last time we talked
link |
03:00:03.200
about the meaning of life and it was it was kind of about winning of course uh i don't think i've
link |
03:00:11.280
talked to you about love much whether romantic or just love for the common humanity amongst us all
link |
03:00:17.280
what role has love played in your life in this in this quest for winning where does love fit in
link |
03:00:26.160
well where love i think means several different things there's love in the sense of maybe i could
link |
03:00:32.560
just say there's like love in the sense of opiates and love in the sense of oxytocin and then love
link |
03:00:38.480
in the sense of maybe like a love for math i don't think fits into either those first two paradigms uh
link |
03:00:48.880
so each of those have they uh have they have they given something to you
link |
03:00:55.360
in your life i'm not that big of a fan of the first two um why
link |
03:01:00.880
why the same reason i'm not a fan of you know for the same reason i don't do opiates and don't
link |
03:01:08.880
take ecstasy right and there were times look i've tried both um i like opiates way more than
link |
03:01:16.320
ecstasy uh but they're not the ethical life is the productive life so maybe that's my
link |
03:01:25.280
problem with with with those and then like yeah a sense of i don't know like abstract love for humanity
link |
03:01:32.000
i mean the abstract love for humanity i'm like yeah i've always felt that and i guess it's hard
link |
03:01:38.720
for me to imagine not feeling it and maybe people who don't and i don't know but yeah that's just
link |
03:01:44.000
like a background thing that's there i mean since we brought up uh drugs let me ask you
link |
03:01:49.120
you this is becoming more and more part of my life because i'm talking to a few researchers
link |
03:01:55.360
that work on psychedelics i've eaten shrooms a couple times and it was fascinating to me that
link |
03:02:02.000
like the mind can go like just fascinating the mind can go to places i didn't imagine it
link |
03:02:08.880
could go and i was very friendly and positive and exciting and everything was kind of hilarious
link |
03:02:14.960
in in the in the place wherever my mind went that's where what is uh what do you think about
link |
03:02:19.760
psychedelics do you think they have what do you think the mind goes have you done psychedelics
link |
03:02:25.840
what do you where do you think the mind goes uh is there something useful to learn about the places
link |
03:02:30.560
it goes once you come back you know i find it interesting that this idea that psychedelics
link |
03:02:38.880
have something to teach is almost unique to psychedelics right people don't argue this about
link |
03:02:44.800
amphetamines and that's true and i'm not really sure why yeah i think all of the drugs have lessons
link |
03:02:52.480
to teach i think there's things to learn from opiates i think there's things to learn from
link |
03:02:56.000
amphetamines i think there's things to learn from psychedelics things to learn from marijuana
link |
03:03:00.000
um but also at the same time recognize that i don't think you're learning things about the
link |
03:03:06.960
world i think you're learning things about yourself yes um and you know what's the even
link |
03:03:12.880
i might have even been uh might have even been a timothy leary quote i don't know what's
link |
03:03:17.760
called but the idea is basically like you know everybody should look behind the door but then
link |
03:03:21.680
once you've seen behind the door you don't need to keep going back um so i mean and that's my
link |
03:03:27.200
thoughts on on all real drug use too so maybe for caffeine it's a it's a little experience
link |
03:03:34.320
that uh it's good to have but oh yeah no i mean yeah i guess yeah psychedelics are definitely
link |
03:03:41.680
so you're a fan of new experiences i suppose yes because they all contain a little especially the
link |
03:03:46.160
first few times it contains some lessons that can be picked up yeah and i'll i'll i'll revisit
link |
03:03:52.320
psychedelics maybe once a year um usually small smaller doses maybe they turn up the learning
link |
03:03:59.520
rate of your brain i've heard that like that yeah that's cool big learning rates have frozen
link |
03:04:05.440
comms last question this is a little weird one but you've called yourself crazy in the past
link |
03:04:12.640
uh first of all on a scale of one to ten how crazy would you say are you oh i mean it depends
link |
03:04:18.720
how you you know when you compare me to elon musk and i'd say levandowski not so crazy so like
link |
03:04:23.760
like a seven let's go with six six six six what uh well i like seven seven's a good number seven
link |
03:04:33.680
all right well yeah i'm sure day by day changes right so but you're in that in that area what uh
link |
03:04:42.320
in thinking about that what do you think is the role of madness is that a feature
link |
03:04:46.560
or a bug if you were to dissect your brain so okay from like a like mental health lens on crazy
link |
03:04:56.960
i'm not sure i really believe in that i'm not sure i really believe in like a lot of that stuff
link |
03:05:02.720
right this concept of okay you know when you get over to like like like like like hardcore bipolar
link |
03:05:08.720
and schizophrenia these things are clearly real somewhat biological and then over here on the
link |
03:05:14.000
spectrum you have like add and oppositional defiant disorder and these things that are like
link |
03:05:20.640
wait this is normal spectrum human behavior like this isn't
link |
03:05:25.920
you know where's the the line here and why is this like a problem so there's this whole
link |
03:05:32.880
you know the neurodiversity of humanity is it's huge like people think i'm always on drugs people
link |
03:05:37.760
are saying this to me on my streams and like guys you know like i'm real open with my drug use i'd
link |
03:05:41.520
tell you if i was on drugs and i mean i had like a cup of coffee this morning but other than that
link |
03:05:46.400
this is just me you're witnessing my brain in action you so so the word madness doesn't even
link |
03:05:54.480
make sense and then you're in the rich neurodiversity of humans i think it makes sense but only for
link |
03:06:03.200
our like some insane extremes like if you are actually like visibly hallucinating
link |
03:06:13.200
you know that's okay but there is the kind of spectrum on which you stand out like
link |
03:06:18.640
that that's uh like if i were to look you know at decorations on a christmas tree or
link |
03:06:23.760
something like that like if you were a decoration that stood out that would catch my eye like that thing is
link |
03:06:30.000
sparkly whatever the hell that thing is uh there's something to that just like refusing to be um
link |
03:06:41.040
boring or maybe boring is the wrong word but to um yeah i mean be willing to sparkle
link |
03:06:51.200
you know it's it's like somewhat constructed i mean i am who i choose to be uh i'm gonna say
link |
03:06:58.960
things as true as i can see them i'm not gonna i'm not gonna lie and but that's a really important
link |
03:07:05.600
feature in itself so like whatever the neurodiversity of your whatever your brain is not putting
link |
03:07:12.640
constraints on it that force it to to fit into the mold of what society is like
link |
03:07:18.560
defines what you're supposed to be so you're one of the specimens that
link |
03:07:22.400
that doesn't mind being yourself being right is super important except at the expense of being wrong
link |
03:07:37.040
without breaking that apart i think it's a beautiful way to end it george you're one
link |
03:07:41.120
of the most special humans i know it's truly an honor to talk to you thanks so much for doing it
link |
03:07:45.520
thank you for having me thanks for listening to this conversation with george hotz and thank you
link |
03:07:50.880
to our sponsors four sigmatic which is the maker of delicious mushroom coffee decoding digital
link |
03:07:58.480
which is a tech podcast that i listen to and enjoy and express vpn which is the vpn i've used
link |
03:08:05.680
for many years please check out these sponsors in the description to get a discount and to support
link |
03:08:11.520
this podcast if you enjoy this thing subscribe on youtube review it with five stars on apple
link |
03:08:16.880
podcast follow on spotify support on patreon or connect with me on twitter at lex fridman
link |
03:08:24.320
and now let me leave you with some words from the great and powerful linus torvalds
link |
03:08:29.920
talk is cheap show me the code thank you for listening and hope to see you next time