
Sebastian Thrun: Flying Cars, Autonomous Vehicles, and Education | Lex Fridman Podcast #59



link |
00:00:00.000
The following is a conversation with Sebastian Thrun.
link |
00:00:03.520
He's one of the greatest roboticists, computer scientists,
link |
00:00:06.200
and educators of our time.
link |
00:00:08.160
He led the development of the autonomous vehicles
link |
00:00:10.600
at Stanford that won the 2005 DARPA Grand Challenge
link |
00:00:14.360
and placed second in the 2007 DARPA Urban Challenge.
link |
00:00:18.200
He then led the Google Self Driving Car Program, which
link |
00:00:21.280
launched the Self Driving Car Revolution.
link |
00:00:24.680
He taught the popular Stanford course
link |
00:00:26.520
on artificial intelligence in 2011,
link |
00:00:29.080
which was one of the first massive open online courses,
link |
00:00:32.440
or MOOCs, as they're commonly called.
link |
00:00:35.080
That experience led him to cofound Udacity,
link |
00:00:37.600
an online education platform.
link |
00:00:39.800
If you haven't taken courses on it yet,
link |
00:00:41.640
I highly recommend it.
link |
00:00:43.320
Their Self Driving Car Program, for example, is excellent.
link |
00:00:47.160
He's also the CEO of Kitty Hawk, a company working
link |
00:00:50.720
on building flying cars, or more technically,
link |
00:00:54.000
eVTOLs, which stands for electric vertical takeoff
link |
00:00:56.920
and Landing Aircraft.
link |
00:00:58.680
He has launched several revolutions
link |
00:01:00.600
and inspired millions of people.
link |
00:01:02.720
But also, as many know, he's just a really nice guy.
link |
00:01:06.840
It was an honor and a pleasure to talk with him.
link |
00:01:10.560
This is the Artificial Intelligence Podcast.
link |
00:01:12.800
If you enjoy it, subscribe on YouTube, give it five stars
link |
00:01:15.920
on Apple Podcasts, follow it on Spotify,
link |
00:01:18.520
support it on Patreon, or simply connect with me on Twitter.
link |
00:01:21.880
at Lex Fridman, spelled F R I D M A N.
link |
00:01:25.840
If you leave a review on Apple Podcasts or YouTube
link |
00:01:28.280
or Twitter, consider mentioning ideas,
link |
00:01:30.600
people, topics you find interesting.
link |
00:01:32.880
It helps guide the future of this podcast.
link |
00:01:35.840
But in general, I just love comments with kindness
link |
00:01:38.840
and thoughtfulness in them.
link |
00:01:40.160
This podcast is a side project for me, as many people know,
link |
00:01:43.640
but I still put a lot of effort into it.
link |
00:01:45.880
So the positive words of support
link |
00:01:47.720
from an amazing community, from you, really help.
link |
00:01:52.200
I recently started doing ads at the end of the introduction.
link |
00:01:55.240
I'll do one or two minutes after introducing the episode,
link |
00:01:58.120
and never any ads in the middle
link |
00:01:59.640
that can break the flow of the conversation.
link |
00:02:01.800
I hope that works for you
link |
00:02:03.120
and doesn't hurt the listening experience.
link |
00:02:05.400
I provide timestamps for the start of the conversation
link |
00:02:08.000
that you can skip to, but it helps if you listen to the ad
link |
00:02:11.440
and support this podcast by trying out the product
link |
00:02:13.840
or service being advertised.
link |
00:02:16.480
This show is presented by Cash App,
link |
00:02:18.760
the number one finance app in the App Store.
link |
00:02:21.440
I personally use Cash App to send money to friends,
link |
00:02:24.040
but you can also use it to buy, sell,
link |
00:02:25.880
and deposit Bitcoin in just seconds.
link |
00:02:28.200
Cash App also has a new investing feature.
link |
00:02:31.040
You can buy fractions of a stock, say $1 worth,
link |
00:02:34.440
no matter what the stock price is.
link |
00:02:36.560
Brokerage services are provided by Cash App Investing,
link |
00:02:39.520
a subsidiary of Square and member SIPC.
link |
00:02:42.920
I'm excited to be working with Cash App
link |
00:02:44.680
to support one of my favorite organizations called FIRST,
link |
00:02:47.880
best known for their FIRST Robotics and Lego competitions.
link |
00:02:51.280
They educate and inspire hundreds of thousands of students
link |
00:02:54.640
in over 110 countries
link |
00:02:56.480
and have a perfect rating on Charity Navigator,
link |
00:02:59.000
which means the donated money
link |
00:03:00.680
is used to maximum effectiveness.
link |
00:03:03.080
When you get Cash App from the App Store or Google Play
link |
00:03:06.040
and use code LexPodcast, you'll get $10,
link |
00:03:09.280
and Cash App will also donate $10 to FIRST,
link |
00:03:12.080
which again is an organization
link |
00:03:13.920
that I've personally seen inspire girls and boys
link |
00:03:16.640
to dream of engineering a better world.
link |
00:03:19.720
And now here's my conversation with Sebastian Thrun.
link |
00:03:24.880
You mentioned that The Matrix may be your favorite movie.
link |
00:03:28.960
So let's start with a crazy philosophical question.
link |
00:03:32.160
Do you think we're living in a simulation
link |
00:03:34.800
and in general, do you find the thought experiment interesting?
link |
00:03:40.000
Define simulation, I would say.
link |
00:03:42.240
Maybe we are, maybe we are not,
link |
00:03:43.720
but it's completely irrelevant to the way we should act.
link |
00:03:47.160
Putting aside for a moment the fact
link |
00:03:50.960
that it might not have any impact
link |
00:03:52.320
on how we should act as human beings,
link |
00:03:54.920
for people studying theoretical physics,
link |
00:03:57.320
these kinds of questions might be kind of interesting,
link |
00:03:59.600
looking at the universe as an information processing system.
link |
00:04:03.800
The universe is an information processing system.
link |
00:04:05.960
It is. It's a huge physical, biological,
link |
00:04:07.720
chemical computer. There's no question.
link |
00:04:11.000
But I live here and now.
link |
00:04:12.920
I care about people. I care about us.
link |
00:04:15.640
What do you think it's trying to compute?
link |
00:04:17.680
I don't think there's an intention.
link |
00:04:18.840
I think the world evolves the way it evolves.
link |
00:04:22.080
And it's beautiful. It's unpredictable.
link |
00:04:25.400
And I'm really, really grateful to be alive.
link |
00:04:28.040
Spoken like a true human.
link |
00:04:30.480
Which last time I checked I was.
link |
00:04:33.400
Well, in fact, this whole conversation
link |
00:04:35.280
is just a Turing test to see if indeed you are.
link |
00:04:40.280
You've also said that one of the first programs
link |
00:04:42.760
or the first few programs you've written
link |
00:04:45.120
was for a TI 57 calculator.
link |
00:04:50.040
Maybe that's early 80s.
link |
00:04:52.040
I don't want to date calculators or anything.
link |
00:04:54.280
That's early 80s, correct.
link |
00:04:55.640
Yeah.
link |
00:04:56.480
So if you were to place yourself back into that time,
link |
00:04:59.800
into the mindset you were in,
link |
00:05:02.160
could you have predicted the evolution of computing,
link |
00:05:05.600
AI, the internet, technology in the decades that followed?
link |
00:05:10.760
I was super fascinated by Silicon Valley,
link |
00:05:13.160
which I'd seen on television once and thought, my God,
link |
00:05:15.680
this is so cool.
link |
00:05:16.520
They built like DRAMs there and CPUs.
link |
00:05:19.600
How cool is that?
link |
00:05:20.440
And as a college student, a few years later,
link |
00:05:23.960
I decided to study intelligence and study human beings
link |
00:05:26.920
and found that even back then in the 80s and 90s,
link |
00:05:30.520
artificial intelligence is what fascinated me the most.
link |
00:05:33.320
What was missing was that back in the day,
link |
00:05:35.840
the computers were really small.
link |
00:05:37.680
The brains you could build
link |
00:05:39.120
were not anywhere near as big as a cockroach's.
link |
00:05:41.600
And cockroaches aren't very smart.
link |
00:05:43.760
So we weren't at the scale yet where we are today.
link |
00:05:46.320
Did you dream at that time to achieve
link |
00:05:49.240
the kind of scale we have today?
link |
00:05:51.080
Did that seem possible?
link |
00:05:52.680
I always wanted to make robots smart.
link |
00:05:54.320
I felt it was super cool to build an artificial human.
link |
00:05:57.960
And the best way to build an artificial human
link |
00:05:59.680
would be to build a robot,
link |
00:06:00.680
because that's kind of the closest we could do.
link |
00:06:03.080
Unfortunately, we aren't there yet.
link |
00:06:04.960
The robots today are still very brittle,
link |
00:06:07.200
but it's fascinating to study intelligence
link |
00:06:09.440
from a constructive perspective where you build something.
link |
00:06:12.880
To understand, you build.
link |
00:06:15.120
what do you think it takes to build an intelligent system
link |
00:06:19.480
and an intelligent robot?
link |
00:06:20.880
I think the biggest innovation that we've seen
link |
00:06:22.680
is machine learning.
link |
00:06:23.760
And it's the idea that computers
link |
00:06:26.120
can basically teach themselves.
link |
00:06:28.640
Let's give an example.
link |
00:06:29.720
I'd say everybody pretty much knows how to walk.
link |
00:06:33.080
And we learn how to walk in the first year
link |
00:06:35.160
or two of our lives.
link |
00:06:36.840
But no scientist has ever been able
link |
00:06:38.800
to write down the rules of human gait.
link |
00:06:41.120
We don't understand it.
link |
00:06:42.840
We have it in our brains somehow, and we can practice it.
link |
00:06:45.160
We understand it, but we can't articulate it.
link |
00:06:47.760
We can't pass it on by language.
link |
00:06:50.280
And that to me is kind of the deficiency
link |
00:06:52.080
of today's computer programming.
link |
00:06:53.360
Even when you program a computer,
link |
00:06:55.240
they're so insanely dumb
link |
00:06:56.720
that you have to give them rules for every contingency.
link |
00:06:59.840
Very unlike the way people learn,
link |
00:07:01.720
who learn from data and experience,
link |
00:07:03.440
computers are being instructed.
link |
00:07:05.480
And because it's so hard to get this instruction set right,
link |
00:07:07.840
we pay software engineers $200,000 a year.
link |
00:07:11.520
Now, the most recent innovation,
link |
00:07:13.160
which has been in the making for like 30, 40 years,
link |
00:07:15.920
is an idea that computers can find their own rules.
link |
00:07:18.520
So they can learn from falling down and getting up
link |
00:07:20.680
the same way children can learn
link |
00:07:22.000
from falling down and getting up.
link |
00:07:23.840
And that revolution has led to a capability
link |
00:07:26.240
that's completely unmatched.
link |
00:07:28.760
Today's computers can watch experts do their jobs,
link |
00:07:31.800
whether you're a doctor or a lawyer,
link |
00:07:33.920
pick up the regularities, learn those rules,
link |
00:07:36.960
and then become as good as the best experts.
link |
00:07:39.440
So the dream in the 80s of expert systems, for example,
link |
00:07:42.720
had at its core the idea that humans could boil down
link |
00:07:46.600
their expertise on a sheet of paper.
link |
00:07:49.400
So to sort of reduce,
link |
00:07:50.800
sort of be able to explain to machines
link |
00:07:53.280
how to do something explicitly.
link |
00:07:55.520
So do you think,
link |
00:07:57.320
what's the use of human expertise into this whole picture?
link |
00:08:00.120
Do you think most of the intelligence
link |
00:08:01.800
will come from machines learning from experience
link |
00:08:04.320
without human expertise input?
link |
00:08:06.520
So the question for me is much more,
link |
00:08:08.400
how do you express expertise?
link |
00:08:10.720
You can express expertise by writing a book.
link |
00:08:13.000
You can express expertise by showing someone
link |
00:08:15.160
what you're doing.
link |
00:08:16.280
You can express expertise by applying it
link |
00:08:18.400
by many different ways.
link |
00:08:20.040
And I think the expert systems was our best attempt in AI
link |
00:08:23.760
to capture expertise in rules,
link |
00:08:26.040
where someone sat down and said,
link |
00:08:27.160
here are the rules of human gait.
link |
00:08:28.640
Here's when you put your big toe forward
link |
00:08:31.360
and your heel backwards, and here's how you stop stumbling.
link |
00:08:34.720
And as we now know, the set of rules,
link |
00:08:37.120
the set of language that we can command
link |
00:08:39.440
is incredibly limited.
link |
00:08:41.200
The majority of the human brain doesn't deal with language.
link |
00:08:43.760
It deals with like subconscious numerical,
link |
00:08:47.240
perceptual things that we're not even self aware of.
link |
00:08:51.320
Now, when an AI system watches an expert do their job
link |
00:08:55.760
and practice their job,
link |
00:08:57.880
it can pick up things that people can't even put into writing
link |
00:09:01.640
into books or rules.
link |
00:09:03.040
And that's what the real power is.
link |
00:09:04.520
We now have AI systems that, for example,
link |
00:09:07.160
look over the shoulders of highly paid human doctors
link |
00:09:10.560
like dermatologists or radiologists
link |
00:09:12.840
and they can somehow pick up those skills
link |
00:09:15.320
that no one can express in words.
link |
00:09:18.440
So you were a key person in launching three revolutions,
link |
00:09:22.160
online education, autonomous vehicles
link |
00:09:25.160
and flying cars, or VTOLs.
link |
00:09:28.200
So high level and I apologize
link |
00:09:32.600
for all the philosophical questions.
link |
00:09:34.680
No apology necessary.
link |
00:09:36.920
How do you choose what problems to try and solve?
link |
00:09:40.680
What drives you to make those solutions a reality?
link |
00:09:43.400
I have two desires in life.
link |
00:09:44.880
I want to literally make the lives of others better
link |
00:09:48.520
or as we often say,
link |
00:09:50.520
maybe jokingly make the world a better place.
link |
00:09:53.000
I actually believe in this.
link |
00:09:55.040
It's as funny as it sounds.
link |
00:09:57.720
And second, I want to learn.
link |
00:09:59.160
I want to gain new skills.
link |
00:10:00.480
I don't want to be in a job I'm good at,
link |
00:10:01.880
because if I'm in a job that I'm good at,
link |
00:10:04.320
the chance for me to learn something interesting
link |
00:10:05.880
is actually minimized.
link |
00:10:06.720
So I want to be in a job I'm bad at.
link |
00:10:09.400
That's really important to me.
link |
00:10:10.240
So we build, for example,
link |
00:10:11.600
what people often call flying cars,
link |
00:10:13.040
these are electric vertical takeoff and landing vehicles.
link |
00:10:17.960
I'm just no expert in any of this.
link |
00:10:19.720
And it's so much fun to learn on the job
link |
00:10:22.280
what it actually means to build something like this.
link |
00:10:24.920
Now I'd say the stuff that I've done lately,
link |
00:10:27.560
after I finished my professorship at Stanford,
link |
00:10:31.120
has really focused on
link |
00:10:32.560
like what has the maximum impact on society.
link |
00:10:34.880
Like transportation is something that has transformed
link |
00:10:37.560
the 21st or 20th century
link |
00:10:38.960
more than any other invention, in my opinion,
link |
00:10:40.520
and even more than communication.
link |
00:10:42.560
Cities are different, work is different, women's rights
link |
00:10:45.800
are different because of transportation.
link |
00:10:47.880
And yet we still have a very suboptimal transportation
link |
00:10:51.360
solution where we kill 1.2 or so million people
link |
00:10:56.080
every year in traffic.
link |
00:10:57.680
It's like the leading cause of death
link |
00:10:58.920
for young people in many countries
link |
00:11:01.160
where we are extremely inefficient resource wise,
link |
00:11:03.640
just go to any neighborhood or city
link |
00:11:06.800
and look at the number of parked cars
link |
00:11:08.320
that's a travesty in my opinion,
link |
00:11:10.400
or where we spend endless hours in traffic jams.
link |
00:11:13.840
And very, very simple innovations
link |
00:11:15.720
like a self driving car or what people call a flying car
link |
00:11:18.840
could completely change this and it's there.
link |
00:11:21.040
I mean, the technology is basically there.
link |
00:11:23.320
You have to close your eyes not to see it.
link |
00:11:25.480
So lingering on autonomous vehicles,
link |
00:11:29.480
a fascinating space, some incredible work
link |
00:11:32.120
you've done throughout your career there.
link |
00:11:33.640
So let's start with DARPA.
link |
00:11:37.120
I think the DARPA challenge through the desert
link |
00:11:40.400
and then the Urban Challenge through the streets.
link |
00:11:42.920
I think that inspired an entire generation of roboticists
link |
00:11:45.800
and obviously sprung this whole excitement
link |
00:11:49.520
about this particular kind of four wheeled robots
link |
00:11:52.680
that we call autonomous cars, self driving cars.
link |
00:11:55.560
So you led the development of Stanley,
link |
00:11:58.280
the autonomous car that won the race through the desert,
link |
00:12:01.320
the DARPA Grand Challenge in 2005.
link |
00:12:04.000
And Junior, the car that finished second
link |
00:12:07.400
in the DARPA Urban Challenge, also did incredibly well
link |
00:12:11.120
in 2007, I think.
link |
00:12:14.440
What are some painful inspiring or enlightening experiences
link |
00:12:18.680
from that time that stand out to you?
link |
00:12:20.680
Oh my God, painful were all these incredibly complicated
link |
00:12:28.200
stupid bugs that had to be found.
link |
00:12:30.520
We had a phase where Stanley,
link |
00:12:33.080
our car that eventually won the DARPA Grand Challenge,
link |
00:12:36.280
would every 30 miles just commit suicide.
link |
00:12:39.360
And we didn't know why.
link |
00:12:40.960
And it turned out to be that in the syncing
link |
00:12:43.600
of two computer clocks, occasionally a clock went backwards
link |
00:12:47.760
and that negative elapsed time screwed up
link |
00:12:50.880
the entire internal logic, but it took ages to find this.
link |
00:12:54.440
It was like bugs like that.
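As a minimal illustrative sketch (in Python, and not the actual Stanley code), the class of bug described here comes from computing elapsed time off a wall clock that can be stepped backwards when two machines resynchronize; keeping internal timing on a monotonic clock, or at least clamping the delta, keeps negative elapsed time out of the downstream logic.

import time

def wall_elapsed(last_wall: float) -> float:
    # Wall-clock delta: can go NEGATIVE if the clock is stepped
    # backwards during synchronization between two computers.
    return time.time() - last_wall

def safe_elapsed(last_mono: float) -> float:
    # Monotonic-clock delta: never goes backwards, so downstream
    # logic never sees a negative elapsed time.
    dt = time.monotonic() - last_mono
    return max(dt, 0.0)  # clamp as an extra guard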
link |
00:12:56.360
I'd say enlightening was that the Stanford team
link |
00:12:59.360
immediately focused on machine learning and on software
link |
00:13:02.520
whereas everybody else seemed to focus
link |
00:13:03.800
on building better hardware.
link |
00:13:05.240
Our analysis had been that a human being
link |
00:13:07.680
with an existing rental car can perfectly drive the course
link |
00:13:10.240
so we didn't have to build a better rental car.
link |
00:13:12.160
We just had to replace the human being.
link |
00:13:14.880
And the human being to me was a conjunction of three steps.
link |
00:13:18.840
We had like sensors, eyes and ears,
link |
00:13:21.040
mostly eyes, we had brains in the middle
link |
00:13:23.800
and then we had actuators, our hands and our feet.
link |
00:13:26.320
Now the actuators are easy to build.
link |
00:13:28.200
The sensors are actually also easy to build.
link |
00:13:29.720
What was missing was the brain.
link |
00:13:30.680
So we had to build a human brain
link |
00:13:32.600
and nothing was clearer to me
link |
00:13:35.200
than that the human brain is a learning machine.
link |
00:13:36.960
So why not just train our robot?
link |
00:13:38.200
So we built massive machine learning into our machine,
link |
00:13:42.280
and with that we were able to not just learn from human drivers
link |
00:13:45.640
(the entire speed control of the vehicle
link |
00:13:47.960
was copied from human driving)
link |
00:13:49.840
but also have the robot learn from experience
link |
00:13:51.640
where it made a mistake, and to recover from it
link |
00:13:53.680
and learn from it.
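Purely as an illustrative sketch of the idea, and not the actual Stanley pipeline, copying speed control from human driving can be framed as a small supervised learning problem: log sensed features alongside the speed the human chose, fit a regressor, and query it at drive time. The feature names and numbers below are hypothetical.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical log of human demonstrations: each row is
# [terrain_roughness, slope, path_curvature], and the label is the
# speed (m/s) the human driver chose in that situation.
X_human = np.array([
    [0.05, 0.00, 0.01],
    [0.40, 0.02, 0.10],
    [0.80, 0.05, 0.30],
])
y_speed = np.array([15.0, 8.0, 3.0])

# Fit a simple regressor that imitates the human's speed choices.
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X_human, y_speed)

# At drive time, the robot senses the same features and asks the
# model what speed a human would have picked here.
print(model.predict(np.array([[0.30, 0.01, 0.05]])))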
link |
00:13:55.560
You mentioned the pain point of software and clocks.
link |
00:14:00.680
Synchronization seems to be a problem
link |
00:14:04.600
that continues with robotics.
link |
00:14:06.040
It's a tricky one with drones and so on.
link |
00:14:08.040
So what does it take to build a thing,
link |
00:14:13.400
a system with so many constraints?
link |
00:14:16.640
You have a deadline, no time,
link |
00:14:20.280
you're unsure about anything really.
link |
00:14:22.080
It's the first time that people were really even attempting it.
link |
00:14:25.000
It's not even sure that anybody can finish
link |
00:14:26.800
When we're talking about the race through the desert,
link |
00:14:28.880
the year before nobody finished.
link |
00:14:30.640
What does it take to scramble and finish a product
link |
00:14:33.440
that actually, a system that actually works?
link |
00:14:35.880
We were lucky, we were a really small team.
link |
00:14:38.360
The core of the team were four people.
link |
00:14:40.480
It was four because five couldn't comfortably sit
link |
00:14:43.160
inside a car but four could.
link |
00:14:45.360
And I as a team leader, my job was to get pizza
link |
00:14:47.760
for everybody and wash the car and stuff like this
link |
00:14:51.200
and repair the radiator when it broke
link |
00:14:52.880
and debug the system.
link |
00:14:54.920
And we were very kind of open minded.
link |
00:14:56.920
We had like no egos involved.
link |
00:14:58.320
We just wanted to see how far we can get.
link |
00:15:00.880
What we did really, really well was time management.
link |
00:15:03.320
We were done with everything a month before the race
link |
00:15:06.240
and we froze the entire software a month before the race.
link |
00:15:08.760
And it turned out, looking at other teams,
link |
00:15:11.440
every other team complained if they had just one more week
link |
00:15:14.120
they would have won.
link |
00:15:15.440
And we decided we're not gonna fall into that mistake.
link |
00:15:18.760
We're gonna be early.
link |
00:15:19.880
And we had an entire month to shake the system.
link |
00:15:22.680
And we actually found two or three minor bugs
link |
00:15:24.920
in the last month that we had to fix
link |
00:15:27.080
and we were completely prepared when the race occurred.
link |
00:15:30.000
Okay, so first of all, that's such an incredibly rare
link |
00:15:33.880
achievement in terms of being able to be done on time
link |
00:15:37.720
or ahead of time.
link |
00:15:39.000
What do you, how do you do that in your future work?
link |
00:15:43.080
What advice do you have in general?
link |
00:15:44.760
Because it seems to be so rare,
link |
00:15:46.320
especially in highly innovative projects like this.
link |
00:15:49.280
People worked till the last second.
link |
00:15:50.840
Well, the nice thing about the DARPA Grand Challenge
link |
00:15:52.560
is that the problem was incredibly well defined.
link |
00:15:55.320
We were able for a while to drive
link |
00:15:57.160
the old DARPA Grand Challenge course
link |
00:15:58.800
which had been used the year before.
link |
00:16:00.840
And then for some reason we were kicked out of the region.
link |
00:16:04.080
So we had to go to different deserts,
link |
00:16:05.520
the Sonoran Desert,
link |
00:16:06.360
and we were able to drive desert trails
link |
00:16:08.920
just of the same type.
link |
00:16:10.640
So there was never any debate about like,
link |
00:16:12.400
what's actually the problem?
link |
00:16:13.320
We didn't sit down and say,
link |
00:16:14.440
hey, should we build a car or a plane?
link |
00:16:16.600
We knew we had to build a car.
link |
00:16:18.320
That made it very, very easy.
link |
00:16:20.480
Then I studied my own life and the lives of others.
link |
00:16:23.880
And we found that the typical mistake that people make
link |
00:16:26.440
is that there is this kind of crazy bug left
link |
00:16:29.680
that they haven't found yet.
link |
00:16:32.280
And it's just, they regret it
link |
00:16:34.400
and the bug would have been trivial to fix.
link |
00:16:36.240
They just haven't fixed it yet.
link |
00:16:37.800
And we didn't want to fall into that trap.
link |
00:16:39.640
So I built a testing team.
link |
00:16:41.160
We had a testing team that built a testing booklet
link |
00:16:43.800
of 160 pages of tests we had to go through
link |
00:16:46.840
just to make sure we shake out the system appropriately.
link |
00:16:49.800
And the testing team was with us all the time
link |
00:16:51.840
and dictated to us today,
link |
00:16:53.480
we do railroad crossings.
link |
00:16:55.560
Tomorrow we do, we practice the start of the event.
link |
00:16:58.480
And in all of these, we thought, oh my God,
link |
00:17:01.120
this is long solved, it's trivial, and then we tested it out.
link |
00:17:03.200
Oh my God, it doesn't do a railroad crossing, why not?
link |
00:17:05.160
Oh my God, it mistakes the rails for metal barriers.
link |
00:17:09.680
We have to fix this.
link |
00:17:10.600
Yes.
link |
00:17:11.600
So it was really a continuous focus
link |
00:17:14.480
on improving the weakest part of the system.
link |
00:17:16.400
And as long as you focus on improving
link |
00:17:19.160
the weakest part of the system,
link |
00:17:20.560
you eventually build a really great system.
link |
00:17:23.120
Let me just pause on that. To me as an engineer,
link |
00:17:25.880
it's just super exciting that you were thinking like that,
link |
00:17:28.280
especially at that stage. It's brilliant
link |
00:17:30.440
that testing was such a core part of it.
link |
00:17:33.400
Maybe to linger on the point of leadership.
link |
00:17:36.720
I think it's one of the first times
link |
00:17:39.120
you were really a leader
link |
00:17:42.000
and you've led many very successful teams since then.
link |
00:17:46.440
What does it take to be a good leader?
link |
00:17:48.480
I would say mostly I just take credit.
link |
00:17:51.120
Yeah, for the work of others.
link |
00:17:54.320
Right.
link |
00:17:55.320
That's very convenient, turns out,
link |
00:17:57.560
because I can't do all these things myself.
link |
00:18:00.200
I'm an engineer at heart, so I care about engineering.
link |
00:18:03.760
So I don't know what the chicken and the egg is,
link |
00:18:06.160
but as a kid, I loved computers
link |
00:18:07.880
because you could tell them to do something.
link |
00:18:09.560
And they actually did it.
link |
00:18:10.720
It was very cool.
link |
00:18:11.560
And you could, like, in the middle of the night,
link |
00:18:12.760
wake up at one in the morning and switch on your computer.
link |
00:18:15.200
And what you told it to do yesterday, it would still do.
link |
00:18:18.160
That was really cool.
link |
00:18:19.400
Unfortunately, that didn't quite work with people.
link |
00:18:21.320
So you go to people and tell them what to do
link |
00:18:22.840
and they don't do it and they hate you for it.
link |
00:18:25.800
Or they do it today and then a day later
link |
00:18:28.080
and they'll stop doing it, so you have to.
link |
00:18:30.240
So then the question really became,
link |
00:18:31.480
how can you put yourself in the brain of people
link |
00:18:34.120
as opposed to computers?
link |
00:18:35.120
And in terms of computers, it's super dumb.
link |
00:18:36.920
They're just so dumb.
link |
00:18:38.240
If people were as dumb as computers,
link |
00:18:39.640
I wouldn't want to work with them.
link |
00:18:41.280
But people are smart and people are emotional
link |
00:18:43.600
and people have pride and people have aspirations.
link |
00:18:45.920
So how can I connect to that?
link |
00:18:49.880
And that's the thing where most of leadership just fails
link |
00:18:52.600
because many, many engineers turned managers
link |
00:18:56.240
believe they can treat their team just the same way
link |
00:18:58.440
they can treat a computer, and it just doesn't work this way.
link |
00:19:00.440
It's just really bad.
link |
00:19:02.360
So how can I connect to people?
link |
00:19:05.120
And it turns out, as a college professor,
link |
00:19:07.680
the wonderful thing you do all the time
link |
00:19:10.000
is to empower other people.
link |
00:19:11.040
Like, your job is to make your students look great.
link |
00:19:14.760
That's all you do.
link |
00:19:15.560
You're the best coach.
link |
00:19:16.920
And it turns out, if you do a fantastic job
link |
00:19:18.760
with making your students look great,
link |
00:19:20.840
they actually love you and their parents love you.
link |
00:19:22.720
And they give you all the credit for stuff you don't deserve.
link |
00:19:24.760
Turns out, all my students were smarter than me.
link |
00:19:27.200
All the great stuff invented at Stanford
link |
00:19:28.760
was their stuff, not my stuff.
link |
00:19:30.040
And they give me credit and say, oh, Sebastian,
link |
00:19:32.520
when I was just making them feel good about themselves.
link |
00:19:35.160
So the question really is, can you take a team of people
link |
00:19:38.080
and what does it take to make them,
link |
00:19:40.400
to connect to what they actually want in life
link |
00:19:43.360
and turn this into productive action?
link |
00:19:45.760
It turns out, every human being that I know
link |
00:19:48.520
has incredibly good intentions.
link |
00:19:50.120
I've really rarely met a person with bad intentions.
link |
00:19:54.080
I believe every person wants to contribute.
link |
00:19:55.920
I think every person I've met wants to help others.
link |
00:19:59.440
It's amazing how much of an urge we have not to just help
link |
00:20:02.560
ourselves, but to help others.
link |
00:20:04.440
So how can we empower people and give them
link |
00:20:06.480
the right framework that they can accomplish this?
link |
00:20:10.640
In moments when it works, it's magical
link |
00:20:12.440
because you'd see the confluence of people
link |
00:20:17.200
being able to make the world a better place
link |
00:20:19.200
and deriving enormous confidence and pride out of this.
link |
00:20:22.880
And that's when my environment works the best.
link |
00:20:27.200
These are moments where I can disappear for a month
link |
00:20:29.440
and come back and things still work.
link |
00:20:31.600
It's very hard to accomplish, but when it works, it's amazing.
link |
00:20:35.080
So I agree with you very much, and it's not often
link |
00:20:37.960
heard that most people in the world have good intentions.
link |
00:20:43.560
At the core, their intentions are good,
link |
00:20:45.960
and they're good people.
link |
00:20:47.400
That's a beautiful message.
link |
00:20:48.880
It's not often heard.
link |
00:20:50.200
We make this mistake, and this is a friend of mine,
link |
00:20:52.640
Alex Zorta, talking about this, that we judge ourselves
link |
00:20:56.400
by our intentions and others by their actions.
link |
00:21:00.120
And I think the biggest skill, I mean, here in Silicon Valley,
link |
00:21:02.800
we're full of engineers who have very little empathy
link |
00:21:05.160
and are kind of befuddled by why it doesn't work for them.
link |
00:21:09.240
The biggest skill, I think, that people should acquire
link |
00:21:13.120
is to put themselves into the position of the other
link |
00:21:16.920
and listen, and listen to what the other has to say.
link |
00:21:20.040
And they'd be shocked how similar they are to themselves.
link |
00:21:23.480
And they might even be shocked how their own actions don't
link |
00:21:26.440
reflect their intentions.
link |
00:21:28.640
I often have conversations with engineers where I say, look,
link |
00:21:31.480
hey, I love you, you're doing a great job.
link |
00:21:33.400
And by the way, what you just did has the following effect.
link |
00:21:37.360
Are you aware of that?
link |
00:21:38.880
And then people would say, oh, my god, no, I wasn't,
link |
00:21:41.320
because my intention was that.
link |
00:21:43.120
And I say, yeah, I trust your intention,
link |
00:21:45.040
you're a good human being.
link |
00:21:46.360
But just to help you in the future,
link |
00:21:48.520
if you keep expressing it that way,
link |
00:21:51.360
then people will just hate you.
link |
00:21:53.440
And I've had many instances where you would say, oh, my god,
link |
00:21:55.680
thank you for telling me this, because it wasn't my intention
link |
00:21:58.320
to look like an idiot.
link |
00:21:59.320
It was my intention to help other people.
link |
00:22:00.920
I just didn't know how to do it.
link |
00:22:02.520
Very simple, by the way.
link |
00:22:04.040
There's a book, Dale Carnegie, 1936,
link |
00:22:07.480
How to Win Friends and Influence People,
link |
00:22:10.440
has the entire Bible.
link |
00:22:11.480
You just read it, and you're done,
link |
00:22:12.760
and you apply it every day.
link |
00:22:14.000
And I wish I was good enough to apply it every day.
link |
00:22:16.760
But it's just simple things, right?
link |
00:22:19.800
Be positive, remember people's names, smile,
link |
00:22:22.600
and eventually have empathy.
link |
00:22:24.480
Really think that the person that you hate
link |
00:22:27.440
and you think is an idiot is actually just like yourself.
link |
00:22:30.440
It's a person who's struggling, who
link |
00:22:32.080
means well, and who might need help.
link |
00:22:34.240
And guess what?
link |
00:22:35.080
You need help.
link |
00:22:36.640
I've recently spoken with Stephen Schwarzman.
link |
00:22:39.960
I'm not sure if you know who that is.
link |
00:22:41.640
I do.
link |
00:22:42.920
So he said.
link |
00:22:44.280
It's on my list.
link |
00:22:45.880
I know this.
link |
00:22:47.400
But he said to expand on what you're saying,
link |
00:22:52.760
that one of the biggest things you can do
link |
00:22:56.040
is hear people when they tell you what their problem is,
link |
00:23:00.040
and then help them with that problem.
link |
00:23:02.360
He says it's surprising how few people actually
link |
00:23:06.400
listen to what troubles others.
link |
00:23:09.320
And because it's right there in front of you,
link |
00:23:12.640
and you can benefit the world the most.
link |
00:23:15.240
And in fact, yourself and everybody around you
link |
00:23:18.040
by just hearing the problems and solving them.
link |
00:23:20.840
I mean, that's my little history of engineering.
link |
00:23:24.000
That is, while I was engineering with computers,
link |
00:23:28.240
I didn't care at all what the computer's problems were.
link |
00:23:32.400
I just told them what to do and they did it.
link |
00:23:34.800
And it just doesn't work this way with people.
link |
00:23:37.560
It doesn't work with me.
link |
00:23:38.480
If you come to me and say, do A, I do the opposite.
link |
00:23:43.600
But let's return to the comfortable world of engineering.
link |
00:23:47.120
And can you tell me in broad strokes
link |
00:23:50.800
how you see it?
link |
00:23:52.160
Because you're at the core of starting it,
link |
00:23:53.840
the core of driving it, the technical evolution
link |
00:23:56.520
of autonomous vehicles from the first DARPA Grand Challenge
link |
00:24:00.400
to the incredible success we see with the program you started
link |
00:24:04.200
with Google Self Driving Car and Waymo
link |
00:24:05.960
and the entire industry that sprung up
link |
00:24:08.320
of different kinds of approaches, debates, and so on.
link |
00:24:11.160
Well, the idea of self driving cars goes back to the 80s.
link |
00:24:13.840
There was a team in Germany and another team at Carnegie Mellon
link |
00:24:16.160
that did some very pioneering work.
link |
00:24:18.680
But back in the day, I'd say the computers
link |
00:24:20.800
were so deficient that even the best professors
link |
00:24:24.640
and engineers in the world basically stood no chance.
link |
00:24:28.200
It then folded into a phase where the US government spent
link |
00:24:32.120
at least half a billion dollars that I could count
link |
00:24:34.640
on research projects.
link |
00:24:36.120
But the way the procurement works,
link |
00:24:38.920
a successful stack of paper describing lots of stuff
link |
00:24:42.760
that no one's ever going to read was a successful product
link |
00:24:46.240
of a research project.
link |
00:24:47.640
So we trained our researchers to produce lots of paper.
link |
00:24:52.560
That all changed with the DARPA Grand Challenge.
link |
00:24:54.280
And I really have to credit the ingenious people at DARPA
link |
00:24:58.480
and the US government and Congress
link |
00:25:00.400
that took a completely new funding model where they said,
link |
00:25:02.960
let's not fund effort, let's fund outcomes.
link |
00:25:05.600
And it sounds very trivial, but there was no tax code
link |
00:25:08.800
that allowed the use of congressional tax money for a prize.
link |
00:25:13.680
It was all effort based.
link |
00:25:15.080
So if you put 100 hours in, you could charge 100 hours.
link |
00:25:17.440
If you put 1,000 hours in, you could bill 1,000 hours.
link |
00:25:20.640
By changing the focus and making it a prize, they said,
link |
00:25:22.840
we don't pay you for development, we pay you for the accomplishment.
link |
00:25:26.320
They automatically drew out all these contractors
link |
00:25:30.200
who are used to the drug of getting money per hour.
link |
00:25:33.400
And they drew in a whole bunch of new people.
link |
00:25:35.720
And these people are mostly crazy people.
link |
00:25:37.600
There were people who had a car and a computer
link |
00:25:40.760
and they wanted to make a million bucks.
link |
00:25:42.400
The million bucks was the original prize money,
link |
00:25:44.200
it was then doubled.
link |
00:25:45.480
And they felt if I put my computer in my car and program it,
link |
00:25:49.400
I can be rich.
link |
00:25:50.880
And that was so awesome.
link |
00:25:52.040
Like half the teams, there was a team that was surfer dudes
link |
00:25:55.440
and they had like two surfboards on their vehicle
link |
00:25:58.520
and brought like these fashion girls, super cute girls,
link |
00:26:01.560
like twin sisters.
link |
00:26:03.720
And you could tell these guys were not your common Beltway
link |
00:26:07.280
bandits who get all these big, multi million
link |
00:26:10.880
and billion dollar contracts from the US government.
link |
00:26:13.520
And there was a great reset.
link |
00:26:16.320
Universities moved in.
link |
00:26:18.560
I was very fortunate at Stanford that I had just received tenure.
link |
00:26:21.800
So I couldn't get fired, no matter what I do.
link |
00:26:23.360
Otherwise I wouldn't have done it.
link |
00:26:25.120
And I had enough money to finance this thing.
link |
00:26:28.240
And I was able to attract a lot of money from third parties.
link |
00:26:31.160
And even car companies moved in.
link |
00:26:32.520
They kind of moved in very quietly
link |
00:26:34.040
because they were super scared to be embarrassed
link |
00:26:36.560
that their car would flip over.
link |
00:26:38.520
But Ford was there and Volkswagen was there
link |
00:26:40.640
and a few others and GM was there.
link |
00:26:43.360
So it kind of reset the entire landscape of people.
link |
00:26:46.320
And if you look at who's a big name
link |
00:26:48.200
in self driving cars today,
link |
00:26:49.480
these are mostly people who participated in those challenges.
link |
00:26:53.400
OK, that's incredible.
link |
00:26:54.320
Can you just comment quickly on your sense of lessons
link |
00:26:58.680
learned from that kind of funding model
link |
00:27:01.240
and the research that's going on in academia
link |
00:27:04.400
in terms of producing papers.
link |
00:27:06.120
Is there something to be learned and scaled up bigger,
link |
00:27:09.840
having these kinds of grand challenges
link |
00:27:11.720
that could improve outcomes?
link |
00:27:14.560
So I'm a big believer in focusing on kind of an end
link |
00:27:17.200
to end system.
link |
00:27:19.640
I'm a really big believer in systems building.
link |
00:27:21.920
I've always built systems in my academic career,
link |
00:27:23.680
even though I do a lot of math and abstract stuff.
link |
00:27:27.040
But it's all derived from the idea of let's solve a real problem.
link |
00:27:29.640
And it's very hard for me to be an academic and say,
link |
00:27:34.160
let me solve a component of a problem.
link |
00:27:35.760
Like in fields like nonmonotonic logic
link |
00:27:38.640
or AI planning systems where people believe
link |
00:27:41.760
that a certain style of problem solving
link |
00:27:44.320
is the ultimate end objective.
link |
00:27:47.280
And I would always turn around and say, hey, what problem
link |
00:27:51.000
would my grandmother care about that
link |
00:27:53.000
doesn't understand computer technology
link |
00:27:54.680
and doesn't want to understand?
link |
00:27:56.480
And how can I make her love what I do?
link |
00:27:58.480
Because only then do I have an impact on the world.
link |
00:28:01.320
I can easily impress my colleagues.
link |
00:28:02.920
That is much easier.
link |
00:28:04.720
But impressing my grandmother is very, very hard.
link |
00:28:07.600
So I always thought if I can build a self driving car
link |
00:28:10.760
and my grandmother can use it even after she loses
link |
00:28:13.840
her driving privileges or children can use it
link |
00:28:16.160
or we save maybe a million lives a year,
link |
00:28:20.560
that would be very impressive.
link |
00:28:22.160
And then there's so many problems like these.
link |
00:28:23.920
Like there's the problem of curing cancer, or whatever it is.
link |
00:28:26.000
Live twice as long.
link |
00:28:27.880
Once a problem is defined, of course,
link |
00:28:29.960
I can't solve it in its entirety.
link |
00:28:31.440
Like it takes sometimes tens of thousands of people
link |
00:28:34.200
to find a solution.
link |
00:28:35.360
There's no way you can fund an army of 10,000 at Stanford.
link |
00:28:39.400
So you've got to build a prototype.
link |
00:28:41.080
Let's build a meaningful prototype.
link |
00:28:42.520
And the DARPA Grand Challenge was beautiful because it
link |
00:28:44.320
told me what this prototype had to do.
link |
00:28:46.080
I didn't even think about what it had to do.
link |
00:28:47.720
I just had to read the rules.
link |
00:28:48.840
And it was really, really beautiful.
link |
00:28:51.080
And the most beautiful thing, you think,
link |
00:28:52.840
what academia could aspire to is to build a prototype at
link |
00:28:57.160
the systems level that solves or gives you an inkling
link |
00:29:01.400
that this problem could be solved with this prototype.
link |
00:29:03.520
First of all, I want to emphasize what academia really is.
link |
00:29:06.560
And I think people misunderstand it.
link |
00:29:08.600
First and foremost, academia is a way to educate young people.
link |
00:29:13.360
First and foremost, a professor is an educator,
link |
00:29:15.440
no matter where you are at a small suburban college
link |
00:29:18.600
or whether you are a Harvard or Stanford professor.
link |
00:29:22.840
That's not the way most people think of themselves
link |
00:29:25.000
in academia because we have this kind of competition going
link |
00:29:28.240
on for citations and publication.
link |
00:29:31.440
That's a measurable thing.
link |
00:29:32.840
But that is secondary to the primary purpose
link |
00:29:35.480
of educating people to think.
link |
00:29:37.840
Now, in terms of research, most of the great science,
link |
00:29:42.880
the great research comes out of universities.
link |
00:29:45.560
You can trace almost everything back, including Google,
link |
00:29:47.880
to universities.
link |
00:29:48.880
So there's nothing really fundamentally broken here.
link |
00:29:52.160
It's a good system.
link |
00:29:53.440
And I think America has the finest university system
link |
00:29:55.960
on the planet.
link |
00:29:57.640
We can talk about reach and how to reach people outside
link |
00:30:01.000
the system.
link |
00:30:01.480
It's a different topic.
link |
00:30:02.360
But the system itself is a good system.
link |
00:30:04.840
If I had one wish, I would say it'd be really great
link |
00:30:08.360
if there was more debate about what the great big problems
link |
00:30:14.240
are in society and focus on those.
link |
00:30:18.800
And most of them are interdisciplinary.
link |
00:30:21.640
Unfortunately, it's very easy to fall
link |
00:30:24.640
into a disciplinary viewpoint where
link |
00:30:28.960
your problem is dictated by what your closest colleagues
link |
00:30:32.360
believe the problem is.
link |
00:30:33.720
It's very hard to break out and say,
link |
00:30:35.280
well, there's an entire new field of problems.
link |
00:30:37.960
So give an example.
link |
00:30:39.840
Prior to me working on self driving cars,
link |
00:30:41.640
I was a roboticist and a machine learning expert.
link |
00:30:44.600
And I wrote books on robotics, something called
link |
00:30:47.280
probabilistic robotics.
link |
00:30:48.480
It's a very methods driven kind of viewpoint of the world.
link |
00:30:51.520
I built robots that acted in museums as tour guides that
link |
00:30:54.600
led children around.
link |
00:30:55.600
It is something that at the time was moderately challenging.
link |
00:31:00.040
When I started working on cars, several colleagues
link |
00:31:03.360
told me, Sebastian, you're destroying your career
link |
00:31:06.080
because in our field of robotics,
link |
00:31:08.160
cars are looked at as a gimmick.
link |
00:31:10.400
And they're not expressive enough.
link |
00:31:11.760
They can only control the throttle and the brakes.
link |
00:31:15.080
There's no dexterity.
link |
00:31:16.440
There's no complexity.
link |
00:31:18.240
It's just too simple.
link |
00:31:19.480
And no one came to me and said, wow,
link |
00:31:21.600
if you solve that problem, you can save a million lives.
link |
00:31:25.000
Among all robotic problems that I've seen in my life,
link |
00:31:27.240
I would say the self driving car, transportation,
link |
00:31:29.760
is the one that has the most hope for society.
link |
00:31:32.080
So how come the robotics community wasn't all over the place?
link |
00:31:35.120
And it's because we focused on methods and solutions
link |
00:31:37.920
and not on problems.
link |
00:31:39.880
Like if you go around today and ask your grandmother,
link |
00:31:42.400
what bugs you?
link |
00:31:43.240
What really makes you upset?
link |
00:31:45.240
I challenge any academic to do this
link |
00:31:48.760
and then realize how far your research is probably
link |
00:31:52.600
away from that today.
link |
00:31:54.840
At the very least, that's a good thing for academics
link |
00:31:57.520
to deliberate on.
link |
00:31:58.960
The other thing that's really nice in Silicon Valley
link |
00:32:00.880
is Silicon Valley is full of smart people outside academia.
link |
00:32:04.400
So there's the Larry Pages and Mark Zuckerbergs
link |
00:32:06.320
in the world who are every bit as smart, or smarter,
link |
00:32:09.040
than the best academics I've met in my life.
link |
00:32:11.720
And what they do is they are at a different level.
link |
00:32:15.480
They build the systems.
link |
00:32:16.720
They build the customer facing systems.
link |
00:32:19.320
They build things that people can use
link |
00:32:21.920
without technical education.
link |
00:32:23.800
And they are inspired by research.
link |
00:32:25.840
They're inspired by scientists.
link |
00:32:27.520
They hire the best PhDs from the best universities
link |
00:32:30.320
for a reason.
link |
00:32:32.040
So I think this kind of vertical integration
link |
00:32:34.800
between the real product, the real impact,
link |
00:32:37.800
and the real thought, the real ideas,
link |
00:32:39.880
that's actually working surprisingly well in Silicon
link |
00:32:41.720
Valley.
link |
00:32:42.760
It did not work as well in other places in this nation.
link |
00:32:45.000
So when I worked at Carnegie Mellon,
link |
00:32:46.720
we had the world's finest computer science university.
link |
00:32:49.800
But there wasn't those people in Pittsburgh
link |
00:32:52.760
that would be able to take these very fine computer science
link |
00:32:55.720
ideas and turn them into massively impactful products.
link |
00:33:00.600
That symbiosis seemed to exist pretty much only in Silicon
link |
00:33:04.320
Valley and maybe a bit in Boston and Austin.
link |
00:33:06.600
Yeah.
link |
00:33:07.240
With Stanford, that's really interesting.
link |
00:33:11.200
So if we look a little bit further on
link |
00:33:14.080
from the DARPA Grand Challenge and the launch
link |
00:33:17.800
of the Google self driving car, what do you see
link |
00:33:21.080
as the state, the challenges of autonomous vehicles
link |
00:33:24.520
as they are now, in actually achieving that huge scale
link |
00:33:29.160
and having a huge impact on society?
link |
00:33:31.640
I'm extremely proud of what has been accomplished.
link |
00:33:35.200
And again, I'm taking a lot of credit for the work of others.
link |
00:33:38.280
And I'm actually very optimistic.
link |
00:33:40.000
And people have been kind of worrying,
link |
00:33:42.360
is it too fast, is it slow, why is it not there yet, and so on.
link |
00:33:45.880
It is actually quite an interesting hard problem
link |
00:33:48.800
in that a self driving car, to build one that manages 90%
link |
00:33:54.480
of the problems encountered in everyday driving is easy.
link |
00:33:57.280
We can literally do this over a weekend.
link |
00:33:59.520
To do 99% might take a month, then there's 1% left.
link |
00:34:03.280
So 1% would mean that you still have a fatal accident every week.
link |
00:34:08.160
Very unacceptable.
link |
00:34:09.120
So now you work on this 1%.
link |
00:34:11.000
And the 99% of that, the remaining 1%,
link |
00:34:13.720
is actually still relatively easy.
link |
00:34:15.800
But now you're down to like a hundredth of a percent.
link |
00:34:18.280
And it's still completely unacceptable in terms of safety.
link |
00:34:21.640
So the variety of things you encounter are just enormous.
link |
00:34:24.280
And that gives me enormous respect for a human being
link |
00:34:26.480
that we're able to deal with the couch on the highway,
link |
00:34:30.520
or the deer in the headlight, or the blown tire
link |
00:34:33.520
that we've never been trained for.
link |
00:34:34.960
And all of a sudden, I have to handle an emergency situation
link |
00:34:37.160
and often do very, very successfully.
link |
00:34:38.800
It's amazing.
link |
00:34:39.880
From that perspective, how safe driving actually is,
link |
00:34:42.240
given how many millions of miles we drive every year
link |
00:34:45.280
in this country.
link |
00:34:47.680
We are now at a point where I believe the technology is there.
link |
00:34:50.120
And I've seen it.
link |
00:34:51.680
I've seen it in Waymo.
link |
00:34:52.560
I've seen it in Aptiv, I've seen it in Cruise,
link |
00:34:55.480
and in a number of companies, and in Voyage,
link |
00:34:58.360
where vehicles are now driving around and basically
link |
00:35:02.400
flawlessly are able to drive people around
link |
00:35:04.280
in limited scenarios.
link |
00:35:06.160
In fact, you can go to Vegas today
link |
00:35:08.080
and summon a Lyft.
link |
00:35:09.960
And if you get the right setting on your app,
link |
00:35:13.560
you'll be picked up by a driverless car.
link |
00:35:15.880
Now, there's still safety drivers in there.
link |
00:35:18.120
But that's a fantastic way to kind of learn
link |
00:35:21.360
what the limits are of technology today.
link |
00:35:23.000
And there's still some glitches.
link |
00:35:24.760
But the glitches have become very, very rare.
link |
00:35:26.600
I think the next step is going to be to down cost it,
link |
00:35:29.720
to harden it.
link |
00:35:31.280
The equipment, the sensors, are not quite
link |
00:35:34.280
an automotive grade standard yet.
link |
00:35:36.240
And then to really build the business models,
link |
00:35:37.840
to really kind of go somewhere and make the business case.
link |
00:35:41.000
And the business case is hard work.
link |
00:35:42.600
It's not just, oh my god, we have this capability.
link |
00:35:44.600
People are just going to buy it.
link |
00:35:45.560
You have to make it affordable.
link |
00:35:46.760
You have to find the social acceptance
link |
00:35:51.320
of people.
link |
00:35:52.240
None of the teams yet has been able, or gutsy enough,
link |
00:35:55.360
to drive around without a person inside the car.
link |
00:35:59.360
And that's the next magical hurdle.
link |
00:36:01.320
We'll be able to send these vehicles around completely
link |
00:36:04.280
empty in traffic.
link |
00:36:05.760
And I think, I mean, I wait every day,
link |
00:36:08.120
wait for the news that Waymo has just done this.
link |
00:36:11.840
So interesting, you mentioned gutsy.
link |
00:36:14.680
Let me ask some maybe unanswerable question,
link |
00:36:20.200
maybe edgy questions.
link |
00:36:21.480
But in terms of how much risk is required,
link |
00:36:26.840
some guts, in terms of leadership style,
link |
00:36:30.360
it would be good to contrast approaches.
link |
00:36:32.600
And I don't think anyone knows what's right.
link |
00:36:34.680
But if we compare Tesla and Waymo, for example,
link |
00:36:38.560
Elon Musk and the Waymo team, there's slight differences
link |
00:36:44.880
in approach.
link |
00:36:45.720
So on the Elon side, there's more,
link |
00:36:49.600
I don't know what the right word to use,
link |
00:36:50.880
but aggression in terms of innovation.
link |
00:36:53.960
And on Waymo's side, there's more cautious, safety focused
link |
00:37:01.320
approach to the problem.
link |
00:37:03.520
What do you think it takes?
link |
00:37:06.240
What's leadership?
link |
00:37:07.520
But which moment is right?
link |
00:37:09.160
Which approach is right?
link |
00:37:11.640
Look, I don't sit in either of those teams.
link |
00:37:13.960
So I'm unable to even verify whether what somebody says is correct.
link |
00:37:18.040
In the end of the day, every innovator in that space
link |
00:37:21.280
will face a fundamental dilemma.
link |
00:37:23.200
And I would say you could put aerospace titans
link |
00:37:27.160
into the same bucket, which is you
link |
00:37:29.360
have to balance public safety with your drive to innovate.
link |
00:37:34.360
And this country in particular, the States,
link |
00:37:36.840
has a 100 plus year history of doing this very successfully.
link |
00:37:40.680
Air travel is, what, 100 times as safe per mile
link |
00:37:43.920
as ground travel, as cars.
link |
00:37:47.560
And there's a reason for it, because people
link |
00:37:49.520
have found ways to be very methodical about ensuring
link |
00:37:54.320
public safety while still being able to make progress
link |
00:37:56.960
on important aspects, for example, like noise
link |
00:38:00.280
and fuel consumption.
link |
00:38:03.680
So I think that those practices are proven,
link |
00:38:06.200
and they actually work.
link |
00:38:07.880
We live in a world safer than ever before.
link |
00:38:09.880
And yes, there will always be the risk
link |
00:38:11.880
that something goes wrong.
link |
00:38:12.760
There's always the possibility that someone makes a mistake,
link |
00:38:15.280
or there's an unexpected failure.
link |
00:38:17.160
We can never guarantee to 100% absolute safety
link |
00:38:21.000
other than just not doing it.
link |
00:38:23.280
But I think I'm very proud of the history of the United States.
link |
00:38:27.080
I mean, we've dealt with much more dangerous technology,
link |
00:38:30.120
like nuclear energy, and kept that safe too.
link |
00:38:33.720
We have nuclear weapons, and we keep those safe.
link |
00:38:36.360
So we have methods and procedures
link |
00:38:39.400
that really balance these two things very, very successfully.
link |
00:38:42.920
You've mentioned a lot of great autonomous vehicle companies
link |
00:38:46.280
that are taking sort of the level four, level five.
link |
00:38:48.720
They jump to full autonomy with a safety driver
link |
00:38:51.840
and take that kind of approach, and also
link |
00:38:53.600
through simulation and so on.
link |
00:38:55.760
There's also the approach that Tesla autopilot is doing,
link |
00:38:59.520
which is kind of incrementally taking a level two vehicle
link |
00:39:03.680
and using machine learning and learning from the driving
link |
00:39:07.080
of human beings, and trying to creep up,
link |
00:39:10.520
trying to incrementally improve the system
link |
00:39:12.320
until it's able to achieve level four autonomy.
link |
00:39:15.520
So perfect autonomy in certain kinds of geographical regions.
link |
00:39:19.720
What are your thoughts on these contrasting approaches?
link |
00:39:23.120
Well, first of all, I'm a very proud Tesla owner,
link |
00:39:25.520
and I literally use the autopilot every day,
link |
00:39:27.840
and it literally has kept me safe.
link |
00:39:30.760
It is a beautiful technology specifically
link |
00:39:33.880
for highway driving when I'm slightly tired,
link |
00:39:37.560
because then it turns me into a much safer driver,
link |
00:39:42.160
and I'm 100% confident that's the case.
link |
00:39:46.440
In terms of the right approach, I think the biggest change
link |
00:39:49.440
I've seen since I went in the Waymo team
link |
00:39:51.040
is this thing called deep learning.
link |
00:39:54.520
Deep learning was not a hot topic when I started Waymo,
link |
00:39:57.960
or Google self driving cars.
link |
00:39:59.360
It was there, but in fact, we started Google Brain
link |
00:40:01.720
at the same time in Google X, so I invested in deep learning,
link |
00:40:04.760
but people didn't talk about it.
link |
00:40:06.280
It wasn't a hot topic.
link |
00:40:07.800
And now it is.
link |
00:40:08.480
There's a shift of emphasis from a more geometric
link |
00:40:11.760
perspective, where you use geometric sensors.
link |
00:40:14.280
They give you a full 3D view, and you
link |
00:40:15.840
do a geometric reasoning about, oh, this box over here
link |
00:40:18.200
might be a car.
link |
00:40:19.600
Towards a more human like, oh, let's just learn about it.
link |
00:40:24.080
This looks like the thing I've seen 10,000 times before,
link |
00:40:26.520
so maybe it's the same thing.
link |
00:40:28.520
Machine learning perspective.
link |
00:40:30.280
And that has really put, I think,
link |
00:40:32.240
all these approaches on steroids.
link |
00:40:35.960
At Udacity, we teach a course in self driving cars.
link |
00:40:38.720
In fact, I think we've graduated over 20,000 or so people
link |
00:40:43.800
on self driving car skills.
link |
00:40:44.960
So every self driving car team in the world
link |
00:40:47.440
now uses our engineers.
link |
00:40:49.280
And in this course, the very first homework assignment
link |
00:40:51.880
is to do lane finding on images.
link |
00:40:54.920
And lane finding in images, for laymen, what this means
link |
00:40:57.440
is you put a camera into your car, or you open your eyes,
link |
00:40:59.880
and you want to know where the lane is,
link |
00:41:02.440
so you can stay inside the lane with your car.
link |
00:41:05.000
Humans can do this super easily.
link |
00:41:06.520
You just look and you know where the lane is.
link |
00:41:08.480
Intuitively.
link |
00:41:10.120
For machines, for a long time, it was super hard
link |
00:41:12.160
because people would write these kind of crazy rules.
link |
00:41:14.600
If there's like white lane markers,
link |
00:41:16.080
and this is what white really means,
link |
00:41:17.600
and this is not quite white enough, so let's call it not white.
link |
00:41:20.320
Or maybe the sun is shining, so when the sun shines,
link |
00:41:22.280
and this is white, and this is a straight line.
link |
00:41:24.600
Or maybe it's not quite a straight line
link |
00:41:25.680
because the road is curved.
link |
00:41:27.200
And do we know that there's 6 feet between lane markings
link |
00:41:30.080
or not, or 12 feet, whatever it is?
link |
00:41:33.840
And now, what the students are doing,
link |
00:41:36.280
they would take machine learning.
link |
00:41:37.560
Instead of like writing these crazy rules
link |
00:41:39.600
for the lane marker, they would say,
link |
00:41:40.720
hey, let's take an hour of driving,
link |
00:41:42.680
and label it by hand and tell the vehicle,
link |
00:41:44.400
this is actually the lane.
link |
00:41:45.760
And then these are examples,
link |
00:41:47.320
and have the machine find its own rules
link |
00:41:49.360
what lane markings are.
link |
00:41:51.400
And within 24 hours, now every student
link |
00:41:53.760
that's never done any programming before in this space
link |
00:41:56.040
can write a perfect lane finder
link |
00:41:58.280
as good as the best commercial lane finders.
link |
00:42:00.840
And that's completely amazing to me.
link |
00:42:02.760
We've seen progress using machine learning
link |
00:42:05.480
that completely dwarfs anything
link |
00:42:08.120
that I saw 10 years ago.
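(As a rough sketch of the contrast being described here, fitting the decision from hand-labeled examples instead of hand-tuning a rule, the snippet below uses synthetic pixel features and toy labels purely for illustration; the real Udacity assignment works on actual camera images and is more involved.)

```python
# A minimal sketch, assuming synthetic data: instead of hand-tuning a rule like
# "a pixel is lane paint if it's white enough", fit the decision from labeled
# examples. The two features per pixel (brightness, edge strength) and the
# classes below are made up so the example runs on its own.
import numpy as np

rng = np.random.default_rng(0)
n = 1500
# Stand-ins for "an hour of driving, labeled by hand":
lane  = np.column_stack([rng.normal(0.85, 0.05, n), rng.normal(0.60, 0.10, n)])  # bright, strong edges
road  = np.column_stack([rng.normal(0.30, 0.10, n), rng.normal(0.10, 0.05, n)])  # dark, weak edges
glare = np.column_stack([rng.normal(0.80, 0.10, n), rng.normal(0.10, 0.05, n)])  # sunlit pavement, weak edges
X = np.vstack([lane, road, glare])
y = np.concatenate([np.ones(n), np.zeros(2 * n)])  # 1 = lane marking, 0 = everything else

# Hand-written rule: "white enough" means brightness above a fixed threshold.
rule_pred = (X[:, 0] > 0.6).astype(float)

# Learned rule: logistic regression fit by plain gradient descent on the labels.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability of "lane"
    w -= (X.T @ (p - y)) / len(y)
    b -= np.mean(p - y)
learned_pred = ((X @ w + b) > 0.0).astype(float)

print("hand-written rule accuracy:", np.mean(rule_pred == y))
print("learned rule accuracy:     ", np.mean(learned_pred == y))
```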
link |
00:42:10.920
Yeah, and just as a side note,
link |
00:42:12.840
the self driving car nanodegree,
link |
00:42:15.200
the fact that you launched that many years ago now,
link |
00:42:18.880
maybe four years ago, three years ago,
link |
00:42:21.360
is incredible that that's a great example
link |
00:42:23.720
of system level thinking.
link |
00:42:24.720
Sort of just taking an entire course
link |
00:42:27.120
that teaches how to solve the entire problem.
link |
00:42:29.320
I definitely recommend it to people.
link |
00:42:31.200
It's been super popular,
link |
00:42:32.360
and it's become actually incredibly high quality
link |
00:42:34.240
created with Mercedes and various other companies
link |
00:42:37.360
in that space.
link |
00:42:38.200
And we find that engineers from Tesla
link |
00:42:40.120
and Waymo are taking it today.
link |
00:42:43.120
The insight was two things.
link |
00:42:45.480
One is existing universities will be very slow to move
link |
00:42:49.200
because they're departmentalized
link |
00:42:50.480
and there's no department for self driving cars.
link |
00:42:52.360
So between MechE and EE and computer science,
link |
00:42:56.240
getting those folks together into one room
link |
00:42:57.680
is really, really hard.
link |
00:42:59.640
And every professor listening here will know,
link |
00:43:01.280
will probably agree to that.
link |
00:43:02.920
And secondly, even if all the great universities
link |
00:43:06.400
just did this, which none so far
link |
00:43:08.400
has developed a curriculum in this field,
link |
00:43:11.120
it is just a few thousand students that can partake
link |
00:43:13.720
because all the great universities are super selective.
link |
00:43:16.280
So how about people in India?
link |
00:43:18.160
How about people in China or in the Middle East
link |
00:43:20.680
or Indonesia or Africa?
link |
00:43:23.480
Why should those be excluded
link |
00:43:25.200
from the skill of building self driving cars?
link |
00:43:27.280
Are they any dumber than we are?
link |
00:43:28.520
Are they any less privileged?
link |
00:43:30.240
And the answer is we should just give everybody the skill
link |
00:43:34.880
to build a self driving car.
link |
00:43:35.920
Because if we do this,
link |
00:43:37.480
then we have like a thousand self driving car startups.
link |
00:43:40.400
And if 10% succeed, that's like a hundred,
link |
00:43:42.960
that means a hundred countries now
link |
00:43:44.200
will have self driving cars and be safer.
link |
00:43:46.840
It's kind of interesting to imagine, impossible to quantify,
link |
00:43:50.400
but the number, over a period of several decades,
link |
00:43:55.080
the impact that has, like a single course,
link |
00:43:57.960
like a ripple effect to society.
link |
00:44:00.800
I just recently talked to Ann Druyan,
link |
00:44:03.040
who was a creator of Cosmos, so it's a show.
link |
00:44:06.560
It's interesting to think about
link |
00:44:08.200
how many scientists that show launched.
link |
00:44:10.680
And so it's really, in terms of impact,
link |
00:44:15.600
I can't imagine a better course
link |
00:44:17.200
than the self driving car course.
link |
00:44:18.680
That's, you know, there's other more specific disciplines
link |
00:44:21.840
like deep learning and so on
link |
00:44:22.880
that Udacity is also teaching,
link |
00:44:24.120
but self driving cars, it's a really, really interesting course.
link |
00:44:26.840
In the end, it came at the right moment.
link |
00:44:28.400
It came at a time when there were a bunch of acquihires.
link |
00:44:31.680
An acquihire is the acquisition of a company,
link |
00:44:34.160
not for its technology or its products or business,
link |
00:44:36.360
but for its people.
link |
00:44:38.280
So an acquihire means maybe a company of 70 people,
link |
00:44:40.600
they have no product yet,
link |
00:44:41.520
but they're super smart people
link |
00:44:43.120
and they pay a certain amount of money.
link |
00:44:44.320
So I took acquihires like GM Cruise and Uber and others
link |
00:44:48.400
and did the math and said,
link |
00:44:50.040
hey, how many people are there
link |
00:44:52.200
and how much money was paid?
link |
00:44:53.720
And as a lower bound,
link |
00:44:55.600
I estimated the value of a self driving car engineer
link |
00:44:58.520
in these acquisitions to be at least $10 million, right?
link |
00:45:02.240
So think about this, you get yourself a skill
link |
00:45:05.040
and you team up and build a company
link |
00:45:06.680
and your worth now is $10 million.
link |
00:45:09.760
I mean, that's kind of cool.
link |
00:45:10.800
I mean, what other thing could you do in life
link |
00:45:13.400
to be worth $10 million within a year?
link |
00:45:15.920
Yeah, amazing.
link |
00:45:17.600
But to come back for a moment onto deep learning
link |
00:45:21.000
and its application in autonomous vehicles,
link |
00:45:23.720
you know, what are your thoughts on Elon Musk's statement,
link |
00:45:28.520
provocative statement, perhaps, that lidar is a crutch.
link |
00:45:31.160
So there's a geometric way of thinking about the world
link |
00:45:34.080
that's maybe holding us back, and what we should instead be doing
link |
00:45:39.000
in this robotics space,
link |
00:45:39.960
in this particular space of autonomous vehicles
link |
00:45:42.600
is using camera as a primary sensor
link |
00:45:46.520
and using computer vision and machine learning
link |
00:45:48.280
as the primary way to...
link |
00:45:49.760
Okay, I have two comments.
link |
00:45:50.600
I think first of all, we all know that people can drive cars
link |
00:45:55.440
without lidars in their heads
link |
00:45:56.920
because we only have eyes
link |
00:45:59.040
and we mostly just use eyes for driving.
link |
00:46:02.120
Maybe we use some other perception about our bodies,
link |
00:46:04.600
accelerations, occasionally our ears,
link |
00:46:08.040
certainly not our noses.
link |
00:46:10.720
So the existence proof is there,
link |
00:46:12.480
that eyes must be sufficient.
link |
00:46:15.600
In fact, we could even drive a car
link |
00:46:17.960
if someone put a camera out
link |
00:46:19.480
and then gave us the camera image with no latency,
link |
00:46:23.480
we would be able to drive a car that way the same way.
link |
00:46:26.400
So a camera is also sufficient.
link |
00:46:28.800
Secondly, I really love the idea that in the Western world,
link |
00:46:31.840
we have many, many different people
link |
00:46:33.640
trying different hypotheses.
link |
00:46:35.720
It's almost like an anthill,
link |
00:46:36.880
like if an anthill tries to forage for food, right?
link |
00:46:39.600
You can sit there as two ants
link |
00:46:40.840
and agree what the perfect path is
link |
00:46:42.560
and then every single ant marches
link |
00:46:44.040
to where the most likely location of food is,
link |
00:46:46.360
or you can even just spread out.
link |
00:46:48.000
And I promise you the spread out solution will be better
link |
00:46:50.480
because if these discussing,
link |
00:46:52.960
philosophical, intellectual ants get it wrong
link |
00:46:55.600
and they're all moving in the wrong direction,
link |
00:46:56.960
they're gonna waste the day
link |
00:46:58.280
and then they're gonna discuss again for another week.
link |
00:47:00.560
Whereas if all these ants just go in all directions,
link |
00:47:02.480
someone's gonna succeed and they're gonna come back
link |
00:47:04.280
and claim victory and get the Nobel Prize
link |
00:47:06.840
or whatever the ant equivalent is.
link |
00:47:08.720
And then they all march in the same direction.
link |
00:47:10.560
And that's great about society.
link |
00:47:11.840
That's great about the Western society.
link |
00:47:13.200
We're not plan based, we're not centrally planned,
link |
00:47:15.520
we don't have a Soviet Union style central government
link |
00:47:18.880
that tells us where to forage.
link |
00:47:21.000
We just forage, we start a C Corp.
link |
00:47:24.040
We get investor money, go out and try it out.
link |
00:47:25.840
And who knows who's gonna win?
link |
00:47:28.720
I like it.
link |
00:47:30.680
When you look at the longterm vision of autonomous vehicles,
link |
00:47:35.200
do you see machine learning as fundamentally
link |
00:47:37.840
being able to solve most of the problems?
link |
00:47:39.600
So learning from experience.
link |
00:47:42.280
I'd say we should be very clear
link |
00:47:44.240
about what machine learning is and is not.
link |
00:47:46.120
And I think there's a lot of confusion.
link |
00:47:48.160
What it is today is a technology
link |
00:47:50.880
that can go through large databases
link |
00:47:54.680
of repetitive patterns and find those patterns.
link |
00:48:00.880
So for example, we did a study at Stanford two years ago
link |
00:48:03.560
where we applied machine learning
link |
00:48:05.440
to detecting skin cancer in images.
link |
00:48:07.880
And we harvested or built a data set of 129,000 skin
link |
00:48:12.880
photographs that had all been biopsied
link |
00:48:15.800
for what the actual situation was.
link |
00:48:18.240
And those included melanomas and carcinomas,
link |
00:48:21.320
also included rashes and other skin conditions, lesions.
link |
00:48:26.320
And then we had a network find those patterns
link |
00:48:29.600
and it was by and large able to then detect skin cancer
link |
00:48:33.360
with an iPhone as accurately as the best board certified
link |
00:48:37.360
Stanford level dermatologist.
link |
00:48:40.080
We proved that.
link |
00:48:41.440
We proved that.
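(As a hedged sketch of the general recipe behind a study like that, fine-tuning an ImageNet-pretrained convolutional network on labeled lesion photos; the directory layout, class names, and hyperparameters below are illustrative assumptions, not the study's actual code or data.)

```python
import tensorflow as tf

IMG_SIZE = (299, 299)  # InceptionV3's native input resolution

# Hypothetical folder of biopsied, labeled lesion photos:
#   lesions/benign/*.jpg and lesions/malignant/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "lesions/", image_size=IMG_SIZE, batch_size=32, label_mode="binary")

base = tf.keras.applications.InceptionV3(
    weights="imagenet", include_top=False, pooling="avg",
    input_shape=IMG_SIZE + (3,))
base.trainable = False  # first train only the new classification head

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),   # scale pixels to [-1, 1]
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),       # P(malignant)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
model.fit(train_ds, epochs=5)
```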
link |
00:48:42.840
Now this thing was great in this one thing
link |
00:48:45.920
and finding skin cancer, but it couldn't drive a car.
link |
00:48:49.720
So the difference to human intelligence
link |
00:48:51.640
is we do all these many, many things
link |
00:48:53.320
and we can often learn from a very small data set
link |
00:48:56.760
of experiences, whereas machines still need
link |
00:48:59.640
very large data sets of things that are very repetitive.
link |
00:49:03.360
Now that's still super impactful
link |
00:49:04.720
because almost everything we do is repetitive.
link |
00:49:06.480
So that's gonna really transform human labor.
link |
00:49:10.040
But it's not this almighty general intelligence.
link |
00:49:13.160
We're really far away from a system
link |
00:49:15.320
that would exhibit general intelligence.
link |
00:49:18.760
To that end, I actually regret the naming a little bit
link |
00:49:21.360
because artificial intelligence, if you believe Hollywood,
link |
00:49:24.480
is immediately mixed into the idea of human suppression
link |
00:49:27.360
and machine superiority.
link |
00:49:30.400
I don't think that we're gonna see this in my lifetime.
link |
00:49:33.000
I don't think human suppression is a good idea.
link |
00:49:36.480
I don't see it coming.
link |
00:49:37.480
I don't see the technology being there.
link |
00:49:39.760
What I see instead is a very pointed,
link |
00:49:42.360
focused pattern recognition technology
link |
00:49:44.360
that's able to extract patterns from large data sets.
link |
00:49:48.480
And in doing so, it can be super impactful.
link |
00:49:51.560
Super impactful.
link |
00:49:53.560
Let's take the impact of artificial intelligence
link |
00:49:55.880
on human work.
link |
00:49:57.680
We all know that it takes something like 10,000 hours
link |
00:50:00.600
to become an expert.
link |
00:50:02.560
If you're gonna be a doctor or a lawyer
link |
00:50:04.280
or even a really good driver,
link |
00:50:06.360
it takes a certain amount of time to become experts.
link |
00:50:09.600
Machines now are able and have been shown
link |
00:50:12.240
to observe people become experts and observe experts
link |
00:50:16.720
and then extract those rules from experts
link |
00:50:18.480
in some interesting way.
link |
00:50:19.760
They could go from law to sales,
link |
00:50:23.160
to driving cars, to diagnosing cancer
link |
00:50:29.280
and then giving that capability
link |
00:50:30.960
to people who are completely new in their job.
link |
00:50:33.080
We now can, and that's been done.
link |
00:50:35.520
It's been done commercially in many, many instantiations.
link |
00:50:38.520
That means we can use machine learning
link |
00:50:40.840
to make people an expert
link |
00:50:43.000
on their very first day of their work.
link |
00:50:45.520
Like think about the impact.
link |
00:50:46.560
If your doctor is still in their first 10,000 hours,
link |
00:50:51.040
you have a doctor who's not quite an expert yet.
link |
00:50:53.800
Who would not want a doctor
link |
00:50:55.240
who's the world's best expert?
link |
00:50:57.360
And now we can leverage machines
link |
00:50:59.200
to really eradicate error in decision making,
link |
00:51:02.720
error in lack of expertise for human doctors.
link |
00:51:06.240
That could save your life.
link |
00:51:08.360
If we can linger on that for a little bit,
link |
00:51:10.360
in which way do you hope machines in the medical field
link |
00:51:14.800
could help assist doctors?
link |
00:51:16.360
You mentioned this sort of accelerating the learning curve
link |
00:51:21.360
for people, if they start a job
link |
00:51:24.520
or are in their first 10,000 hours, can be assisted by machines.
link |
00:51:27.360
How do you envision that assistance looking?
link |
00:51:29.720
So we built this app for an iPhone
link |
00:51:32.320
that can detect and classify and diagnose skin cancer.
link |
00:51:36.320
And we proved two years ago
link |
00:51:39.440
that it does pretty much as good
link |
00:51:40.920
or better than the best human doctor.
link |
00:51:42.200
So let me tell you a story.
link |
00:51:43.600
So there's a friend of mine that's called Ben.
link |
00:51:45.480
Ben is a very famous venture capitalist.
link |
00:51:47.680
He goes to his doctor and the doctor looks at a mole
link |
00:51:50.720
and says, hey, that mole is probably harmless.
link |
00:51:55.360
And for some very funny reason,
link |
00:51:58.680
he pulls out that phone with our app.
link |
00:52:00.440
He's a collaborator in our study.
link |
00:52:02.640
And the app says, no, no, no, no.
link |
00:52:04.640
This is a melanoma.
link |
00:52:06.320
And for background,
link |
00:52:07.240
melanoma is a skin cancer, and skin cancer is the most common cancer
link |
00:52:10.800
in this country.
link |
00:52:12.400
Melanomas can go from stage zero to stage four
link |
00:52:16.640
within less than a year.
link |
00:52:18.120
Stage zero means you can basically cut it out yourself
link |
00:52:20.880
with a kitchen knife and be safe.
link |
00:52:23.200
And stage four means your chances
link |
00:52:25.000
of living five more years are less than 20%.
link |
00:52:28.000
So it's a very serious, serious, serious condition.
link |
00:52:31.160
So this doctor who took out the iPhone
link |
00:52:36.160
looked at the iPhone and was a little bit puzzled.
link |
00:52:37.680
He said, I mean, what, just to be safe,
link |
00:52:39.720
let's cut it out and biopsy it.
link |
00:52:41.600
That's the technical term for let's get an in depth diagnostics
link |
00:52:45.440
that is more than just looking at it.
link |
00:52:47.720
And it came back as cancerous as a melanoma.
link |
00:52:50.800
And it was then removed.
link |
00:52:52.240
And my friend Ben, I was hiking with him
link |
00:52:54.960
and we were talking about AI.
link |
00:52:56.280
And I told him about this work on skin cancer.
link |
00:52:58.880
And he said, oh, funny.
link |
00:53:00.720
My doctor just had an iPhone that found my cancer.
link |
00:53:05.480
So I was like completely intrigued.
link |
00:53:06.920
I didn't even know about this.
link |
00:53:08.200
So here's a person.
link |
00:53:09.040
I mean, this is a real human life, right?
link |
00:53:11.640
Now, who doesn't know somebody who has been affected
link |
00:53:13.520
by cancer?
link |
00:53:14.360
Cancer is cause of death number two.
link |
00:53:16.160
Cancer is this kind of disease that is mean.
link |
00:53:19.440
And in the following way, most cancers
link |
00:53:21.960
can actually be cured relatively easily
link |
00:53:24.520
if we catch them early.
link |
00:53:25.880
And the reason why we don't tend to catch them early
link |
00:53:28.360
is because they have no symptoms.
link |
00:53:30.560
Like your very first symptom of a gallbladder cancer
link |
00:53:33.840
or a pancreatic cancer might be a headache.
link |
00:53:37.040
And when you finally go to your doctor
link |
00:53:38.680
because of these headaches or your back pain
link |
00:53:41.600
and you're being imaged, it's usually stage four plus.
link |
00:53:45.880
And that's the time when your chances of a cure
link |
00:53:48.200
might be down to a single digit percentage.
link |
00:53:50.880
So if you could leverage AI to inspect your body
link |
00:53:54.520
on a regular basis without even a doctor in the room,
link |
00:53:58.080
maybe when you take a shower or what have you,
link |
00:54:00.360
I know this sounds creepy, but then we might be able
link |
00:54:03.040
to save millions and millions of lives.
link |
00:54:06.280
You've mentioned there's a concern that people have
link |
00:54:09.440
about near term impacts of AI in terms of job loss.
link |
00:54:12.800
So you've mentioned being able to assist doctors,
link |
00:54:15.480
being able to assist people in their jobs.
link |
00:54:17.880
Do you have a worry of people losing their jobs
link |
00:54:22.240
or the economy being affected by the improvements in AI?
link |
00:54:25.440
Yeah, anybody concerned about job losses,
link |
00:54:27.680
please come to Udacity.com, we teach contemporary tech skills
link |
00:54:32.320
and we have a kind of implicit job promise.
link |
00:54:36.680
We often, when we measure, we place way over 50%
link |
00:54:40.400
of our graduates in new jobs and they're very satisfied
link |
00:54:42.920
about it and it costs almost nothing
link |
00:54:44.800
cause it's like $1,500 max or something like that.
link |
00:54:47.120
And I saw there's a cool new program
link |
00:54:48.920
that you agreed with the U.S. government
link |
00:54:51.120
guaranteeing that you will help give scholarships
link |
00:54:54.880
that educate people in this kind of situation.
link |
00:54:57.840
Yeah, we're working with the U.S. government
link |
00:54:59.960
on the idea of basically rebuilding the American dream.
link |
00:55:03.880
So Udacity has just dedicated 100,000 scholarships
link |
00:55:07.440
for citizens of America for various levels of courses
link |
00:55:12.080
that eventually will get you a job.
link |
00:55:16.720
And those courses all somewhat relate to the tech sector
link |
00:55:19.480
because the tech sector is kind of the hottest sector
link |
00:55:21.520
right now, and they range from entry level digital marketing
link |
00:55:25.000
to very advanced self driving car engineering.
link |
00:55:28.120
And we're doing this with the White House
link |
00:55:29.480
because we think it's bipartisan.
link |
00:55:30.920
It's an issue that, if you wanna really make
link |
00:55:34.240
America great, being able to be part of the solution
link |
00:55:40.120
and live the American dream requires us to be proactive
link |
00:55:43.840
about our education and our skillset.
link |
00:55:45.840
It's just the way it is today.
link |
00:55:47.760
And it's always been this way.
link |
00:55:48.760
And we always had this American dream
link |
00:55:50.000
to send our kids to college
link |
00:55:51.200
and now the American dream has to be
link |
00:55:53.280
to send ourselves to college.
link |
00:55:54.640
We can do this very, very efficiently
link |
00:55:58.240
and very, very flexibly, we can squeeze it in in the evenings
link |
00:56:00.920
and so on, online, at all ages.
link |
00:56:03.160
All ages.
link |
00:56:04.000
So our learners go from age 11 to age 80.
link |
00:56:11.240
I just traveled Germany and the guy
link |
00:56:14.120
in the train compartment next to me was one of my students.
link |
00:56:17.400
It's like, wow, that's amazing.
link |
00:56:19.680
Now think about the impact.
link |
00:56:21.280
We've become the educator of choice
link |
00:56:23.320
for now, I believe officially six countries
link |
00:56:25.520
or five countries, most of them in the Middle East,
link |
00:56:27.360
like Saudi Arabia and in Egypt.
link |
00:56:30.080
In Egypt, we just had a cohort graduate
link |
00:56:33.440
where we had 1100 high school students
link |
00:56:37.280
that went through programming skills,
link |
00:56:39.800
proficient at the level of a computer science undergrad.
link |
00:56:42.920
And we had a 95% graduation rate
link |
00:56:45.200
even though everything's online.
link |
00:56:46.280
It's kind of tough, but we're kind of trying to figure out
link |
00:56:48.240
how to make this effective.
link |
00:56:50.120
The vision is, the vision is very, very simple.
link |
00:56:52.520
The vision is education ought to be a basic human right.
link |
00:56:58.320
It cannot be locked up behind ivory tower walls
link |
00:57:02.320
only for the rich people, for the parents
link |
00:57:04.400
who might buy themselves into the system
link |
00:57:06.920
and only for young people and only for people
link |
00:57:09.240
from the right demographics and the right geography
link |
00:57:11.880
and possibly even the right race.
link |
00:57:14.200
It has to be opened up to everybody.
link |
00:57:15.800
If we are truthful to the human mission,
link |
00:57:18.720
if we are truthful to our values,
link |
00:57:20.640
we're gonna open up education to everybody in the world.
link |
00:57:23.440
So Udacity's pledge of 100,000 scholarships,
link |
00:57:27.200
I think is the biggest pledge of scholarships ever
link |
00:57:29.160
in terms of numbers.
link |
00:57:30.720
And we're working, as I said, with the White House
link |
00:57:33.000
and with very accomplished CEOs like Tim Cook
link |
00:57:36.040
from Apple and others to really bring education
link |
00:57:39.000
to everywhere in the world.
link |
00:57:41.040
Not to ask you to pick the favorite of your children,
link |
00:57:44.600
but at this point. Oh, that's Jasper.
link |
00:57:46.680
I only have one that I know of.
link |
00:57:49.720
Okay, good.
link |
00:57:52.680
In this particular moment, what nanodegree,
link |
00:57:55.800
what set of courses are you most excited about at Udacity,
link |
00:58:00.040
or is that too impossible to pick?
link |
00:58:02.000
I've been super excited about something
link |
00:58:03.800
we haven't launched yet and are still building,
link |
00:58:05.480
which is when we talk to our partner companies,
link |
00:58:09.120
we have now a very strong footing in the enterprise world.
link |
00:58:12.680
With our students,
link |
00:58:14.560
we've kind of always focused on these hard skills
link |
00:58:17.240
like the programming skills or math skills
link |
00:58:19.720
or building skills or design skills.
link |
00:58:22.200
And a very common ask is soft skills.
link |
00:58:25.160
Like how do you behave in your work?
link |
00:58:26.880
How do you develop empathy?
link |
00:58:28.280
How do you work in a team?
link |
00:58:30.440
What are the very basics of management?
link |
00:58:32.400
How do you do time management?
link |
00:58:33.680
How do you advance your career
link |
00:58:36.240
in the context of a broader community?
link |
00:58:39.280
And that's something that we haven't done
link |
00:58:41.400
very well at Udacity, and I would say most universities
link |
00:58:43.840
are doing very poorly as well
link |
00:58:45.160
because we're so obsessed with individual test scores
link |
00:58:47.880
and pay so little
link |
00:58:49.480
attention to teamwork in education.
link |
00:58:52.600
So that's something I see us moving into as a company
link |
00:58:55.480
because I'm excited about this.
link |
00:58:56.920
And I think, look, we can teach people tech skills
link |
00:59:00.120
and they're gonna be great.
link |
00:59:00.960
But if you teach people empathy,
link |
00:59:02.720
that's gonna have the same impact.
link |
00:59:04.960
Maybe harder than self driving cars, but I don't think so.
link |
00:59:08.720
I think the rules are really simple.
link |
00:59:11.320
You just have to, you have to want to engage.
link |
00:59:14.400
It's, we literally, in school, in K through 12,
link |
00:59:18.200
we teach kids like get the highest math score.
link |
00:59:20.480
And if you are a rational human being,
link |
00:59:22.920
you might conclude from this education, say,
link |
00:59:25.640
having the best math score and the best English scores,
link |
00:59:28.080
makes me the best leader.
link |
00:59:29.680
And it turns out not to be the case.
link |
00:59:31.080
It's actually really wrong because making the,
link |
00:59:34.360
first of all, in terms of math scores,
link |
00:59:35.840
I think it's perfectly fine to hire somebody
link |
00:59:37.680
with great math skills.
link |
00:59:38.520
You don't have to do it yourself.
link |
00:59:40.640
You can't hire somebody with great empathy for you.
link |
00:59:42.760
That's much harder,
link |
00:59:43.880
but you can always hire somebody with great math skills.
link |
00:59:46.360
But we live in a fluid world
link |
00:59:49.000
where we constantly deal with other people
link |
00:59:51.040
and that's a beauty.
link |
00:59:51.920
It's not a nuisance, it's a beauty.
link |
00:59:53.360
So if we somehow develop that muscle
link |
00:59:55.960
that we can do that well and empower others
link |
00:59:59.920
in the workplace,
link |
01:00:00.920
I think we're gonna be super successful.
link |
01:00:02.920
And I know many fellow roboticists and computer scientists
link |
01:00:07.280
that I will insist take this course.
link |
01:00:09.840
Not to be named. You?
link |
01:00:12.200
Not to be named.
link |
01:00:13.760
Many, many years ago, 1903,
link |
01:00:17.960
the Wright Brothers flew in Kitty Hawk for the first time.
link |
01:00:22.600
And you've launched a company of the same name, Kitty Hawk,
link |
01:00:26.920
with the dream of building flying cars, eVTOLs.
link |
01:00:32.320
So at the big picture,
link |
01:00:34.560
what are the big challenges of making this thing
link |
01:00:36.640
that actually inspired generations of people
link |
01:00:40.000
about what the future looks like?
link |
01:00:41.760
What does it take?
link |
01:00:42.600
What are the biggest challenges?
link |
01:00:43.680
So flying cars has always been a dream.
link |
01:00:47.240
Every boy, every girl wants to fly.
link |
01:00:49.720
Let's be honest.
link |
01:00:51.040
And let's go back in our history
link |
01:00:52.360
of your dreaming of flying.
link |
01:00:53.800
I think honestly, my single most remembered childhood dream
link |
01:00:57.440
has been a dream where I was sitting on a pillow
link |
01:00:59.440
and I could fly.
link |
01:01:00.760
I was like five years old.
link |
01:01:02.080
I remember like maybe three dreams of my childhood,
link |
01:01:04.160
but that's the one that we remember most vividly.
link |
01:01:07.560
And then Peter Thiel famously said,
link |
01:01:09.400
they promised us flying cars
link |
01:01:10.720
and they gave us 140 characters,
link |
01:01:12.840
pointing at Twitter, which at the time
link |
01:01:15.280
limited message size to 140 characters.
link |
01:01:18.400
So we're coming back now to really go
link |
01:01:20.240
for this super impactful stuff like flying cars.
link |
01:01:23.280
And to be precise, they're not really cars.
link |
01:01:25.920
They don't have wheels.
link |
01:01:27.200
They're actually much closer to a helicopter
link |
01:01:28.640
than anything else.
link |
01:01:29.680
They take off vertically and then fly horizontally,
link |
01:01:32.120
but they have important differences.
link |
01:01:34.400
One difference is that they are much quieter.
link |
01:01:37.760
We just released a vehicle called Project Heaviside
link |
01:01:41.600
that can fly over you as low as a helicopter
link |
01:01:43.560
and you basically can't hear it.
link |
01:01:45.240
It's like 38 decibels.
link |
01:01:46.720
It's like, if you were inside the library,
link |
01:01:49.280
you might be able to hear it,
link |
01:01:50.240
but anywhere outdoors, your ambient noise is higher.
link |
01:01:54.600
Secondly, they're much more affordable.
link |
01:01:57.080
They're much more affordable than helicopters.
link |
01:01:59.000
And the reason is helicopters are expensive
link |
01:02:01.960
for many reasons.
link |
01:02:04.440
There's lots of single points of failure in a helicopter.
link |
01:02:07.040
There's a bolt between the blades
link |
01:02:09.160
that's called the Jesus bolt.
link |
01:02:10.840
And the reason why it's called Jesus bolt is
link |
01:02:13.080
if this bolt breaks, you will die.
link |
01:02:16.400
There is no second solution in helicopter flight.
link |
01:02:19.560
Whereas we have these distributed mechanisms.
link |
01:02:21.520
When you go from gasoline to electric,
link |
01:02:23.760
you can now have many, many, many small motors
link |
01:02:25.880
as opposed to one big motor.
link |
01:02:27.360
And that means if you lose one of those motors,
link |
01:02:28.840
not a big deal.
link |
01:02:29.680
Heaviside, if it loses a motor,
link |
01:02:31.360
it has eight of those.
link |
01:02:32.840
If it loses one of those eight motors,
link |
01:02:34.040
so it's seven left,
link |
01:02:35.200
you can take off just like before
link |
01:02:37.320
and land just like before.
link |
01:02:40.160
We are now also moving into a technology
link |
01:02:42.080
that doesn't require a commercial pilot.
link |
01:02:44.200
Because in some level,
link |
01:02:45.560
flight is actually easier than ground transportation.
link |
01:02:49.000
Like in self driving cars,
link |
01:02:51.440
the world is full of like children and bicycles
link |
01:02:54.520
and other cars and mailboxes and curbs and shrubs
link |
01:02:57.600
and whatever you,
link |
01:02:58.440
all these things you have to avoid.
link |
01:03:00.520
When you go above the buildings and tree lines,
link |
01:03:03.760
there's nothing there.
link |
01:03:04.600
I mean, you can do the test right now,
link |
01:03:06.120
look outside and count the number of things you see flying.
link |
01:03:09.440
I'd be shocked if you could see more than two things.
link |
01:03:11.480
It's probably just zero.
link |
01:03:13.840
In the Bay Area, the most I've ever seen was six.
link |
01:03:16.960
And maybe it's 15 or 20, but not 10,000.
link |
01:03:20.400
So the sky is very ample and very empty and very free.
link |
01:03:24.000
So the vision is,
link |
01:03:25.160
can we build a socially acceptable mass transit solution
link |
01:03:29.960
for daily transportation that is affordable?
link |
01:03:34.280
And we have an existence proof.
link |
01:03:36.320
Heaviside can fly 100 miles in range
link |
01:03:39.800
with still 30% electric reserves.
link |
01:03:43.280
It can fly up to like 180 miles an hour.
link |
01:03:46.080
We know that that solution at scale
link |
01:03:48.880
would make your ground transportation 10 times as fast
link |
01:03:52.640
as a car, based on US census and statistics data,
link |
01:03:57.520
which means we would take your 300 hours of annual commute
link |
01:04:01.960
down to 30 hours and give you 270 hours back.
link |
01:04:05.160
Who wouldn't want, I mean, who doesn't hate traffic?
link |
01:04:07.640
Like I hate, give me the person who doesn't hate traffic.
link |
01:04:10.760
I hate traffic every time I'm in traffic, I hate it.
link |
01:04:13.920
And if we could free the world from traffic,
link |
01:04:17.520
we have technology, we can free the world from traffic.
link |
01:04:20.000
We have the technology.
link |
01:04:21.320
It's there, we have an existence proof.
link |
01:04:23.040
It's not a technological problem anymore.
link |
01:04:25.400
Do you think there is a future where tens of thousands,
link |
01:04:29.320
maybe hundreds of thousands of both delivery drones
link |
01:04:34.360
and flying cars of this kind, eVTOLs, fill the sky?
link |
01:04:39.920
I absolutely believe this.
link |
01:04:40.920
And there's obviously the societal acceptance
link |
01:04:43.600
is a major question and of course safety is.
link |
01:04:46.920
I believe, on safety, we will exceed ground transportation
link |
01:04:49.520
safety, as has happened for aviation already,
link |
01:04:52.840
commercial aviation.
link |
01:04:54.480
And in terms of acceptance, I think
link |
01:04:56.880
one of the key things is noise.
link |
01:04:58.280
That's why we are focusing relentlessly on noise
link |
01:05:00.920
and we built perhaps the quietest electric VTOL vehicle
link |
01:05:05.600
ever built.
link |
01:05:07.600
The nice thing about the sky is it's three dimensional.
link |
01:05:09.720
So any mathematician will immediately
link |
01:05:11.880
recognize the difference between 1D
link |
01:05:13.400
of like a regular highway to 3D of a sky.
link |
01:05:17.240
But to make it clear for the layman,
link |
01:05:20.200
say you want to make 100 vertical lanes of highway 101
link |
01:05:23.880
in San Francisco, because you believe building
link |
01:05:26.160
a hundred vertical lanes is the right solution.
link |
01:05:28.920
Imagine how much it would cost to stack
link |
01:05:30.720
100 vertical lanes physically onto 101.
link |
01:05:33.360
They would be prohibitive.
link |
01:05:34.280
They would be consuming the world's GDP for an entire year
link |
01:05:37.720
just for one highway.
link |
01:05:39.160
It's amazingly expensive.
link |
01:05:41.200
In the sky, it would just be a recompilation
link |
01:05:43.640
of a piece of software because all these lanes are virtual.
link |
01:05:46.520
That means any vehicle that is in conflict with another vehicle
link |
01:05:49.880
would just go to different altitudes
link |
01:05:51.800
and the conflict is gone.
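(A minimal sketch of that "virtual lanes" idea: treat conflicts between planned ground tracks as edges in a graph and greedily assign conflicting vehicles to different altitude levels. The routes, the bounding-box conflict test, and the level values below are toy assumptions purely for illustration.)

```python
from itertools import combinations

# Hypothetical straight-line ground tracks, each as ((x1, y1), (x2, y2)).
routes = {
    "A": ((0, 0), (10, 10)),
    "B": ((0, 10), (10, 0)),   # crosses A
    "C": ((0, 5), (10, 5)),    # crosses A and B
    "D": ((20, 0), (30, 0)),   # far away, no conflicts
}

def conflicts(r1, r2):
    """Toy conflict test: do the two tracks' bounding boxes overlap?"""
    (ax1, ay1), (ax2, ay2) = r1
    (bx1, by1), (bx2, by2) = r2
    return not (max(ax1, ax2) < min(bx1, bx2) or max(bx1, bx2) < min(ax1, ax2) or
                max(ay1, ay2) < min(by1, by2) or max(by1, by2) < min(ay1, ay2))

# Build the conflict graph, then greedily color it with altitude levels.
neighbors = {v: set() for v in routes}
for a, b in combinations(routes, 2):
    if conflicts(routes[a], routes[b]):
        neighbors[a].add(b)
        neighbors[b].add(a)

LEVELS = [500, 600, 700, 800]  # feet above ground, one "virtual lane" each
altitude = {}
for v in sorted(routes, key=lambda v: -len(neighbors[v])):  # busiest vehicles first
    taken = {altitude[n] for n in neighbors[v] if n in altitude}
    altitude[v] = next(lvl for lvl in LEVELS if lvl not in taken)

print(altitude)  # conflicting vehicles end up on different levels
```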
link |
01:05:53.280
And if you don't believe this, that's
link |
01:05:55.560
exactly how commercial aviation works.
link |
01:05:58.520
When you fly from New York to San Francisco,
link |
01:06:01.400
another plane flies from San Francisco to New York,
link |
01:06:04.200
there are different altitudes so they don't hit each other.
link |
01:06:06.720
It's a solved problem for the jet space.
link |
01:06:10.320
And it will be a solved problem for the urban space.
link |
01:06:12.680
There's companies like Google Wing and Amazon
link |
01:06:15.280
working on very innovative solutions.
link |
01:06:17.000
How do we have space management?
link |
01:06:18.520
They use exactly the same principles
link |
01:06:20.160
as we use today to route today's jets.
link |
01:06:23.280
There's nothing hard about this.
link |
01:06:25.920
Do you envision autonomy being a key part of it
link |
01:06:28.960
so that the flying vehicles are either semi autonomous
link |
01:06:35.400
or fully autonomous?
link |
01:06:36.920
100% autonomous.
link |
01:06:37.880
You don't want idiots like me flying in the sky.
link |
01:06:40.440
I promise you.
link |
01:06:41.960
And if you have 10,000, watch the movie, The Fifth Element,
link |
01:06:46.000
to get a feel for what would happen if it's not autonomous.
link |
01:06:49.480
And a centralized, that's a really interesting idea
link |
01:06:51.720
of a centralized management system for lanes and so on.
link |
01:06:56.320
So actually just being able to have
link |
01:07:00.280
similar as we have in the current commercial aviation,
link |
01:07:03.000
but scale it up to much more vehicles.
link |
01:07:05.560
That's a really interesting optimization problem.
link |
01:07:07.680
It is mathematically very, very straightforward.
link |
01:07:11.120
Like the gap we leave between jets is gargantuan.
link |
01:07:13.560
And part of the reason is there isn't that many jets.
link |
01:07:16.440
So it just feels like a good solution.
link |
01:07:18.840
Today, when you get vectored by air traffic control,
link |
01:07:22.360
someone talks to you.
link |
01:07:23.920
So an ATC controller might have up to maybe 20 planes
link |
01:07:26.960
on the same frequency.
link |
01:07:28.200
And then they talk to you.
link |
01:07:29.160
You have to talk back.
link |
01:07:30.360
And it feels right because there isn't more than 20 planes
link |
01:07:32.720
around anyhow, so you can talk to everybody.
link |
01:07:34.920
But if there's 20,000 things around,
link |
01:07:36.720
you can't talk to everybody anymore.
link |
01:07:38.120
So we have to do something that's called digital,
link |
01:07:40.240
like text messaging.
link |
01:07:42.120
We do have solutions.
link |
01:07:43.040
Like we have what, four, five billion smartphones
link |
01:07:45.520
in the world now, and they're all connected.
link |
01:07:47.720
And somehow we solved the scale problem for smartphones.
link |
01:07:50.720
We know where they all are.
link |
01:07:51.960
They can talk to somebody.
link |
01:07:53.600
And they're very reliable.
link |
01:07:54.880
They're amazingly reliable.
link |
01:07:56.480
We could use the same system, the same scale,
link |
01:08:00.080
for air traffic control.
link |
01:08:01.080
So instead of me as a pilot talking to a human being
link |
01:08:04.080
in the middle of the conversation receiving
link |
01:08:06.800
a new frequency, like how ancient is that,
link |
01:08:09.680
we could digitize the stuff and digitally transmit
link |
01:08:13.480
the right flight coordinates.
link |
01:08:15.280
And that solution will automatically scale to 10,000
link |
01:08:18.680
vehicles.
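(A hedged sketch of what "digitize it and transmit the flight coordinates" could look like at the message level: a structured clearance that a vehicle's autopilot can parse and acknowledge without a voice channel. The field names and schema are illustrative assumptions, not any real ATC protocol.)

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Clearance:
    vehicle_id: str
    waypoints: list        # (lat, lon) pairs along the approved route
    altitude_ft: int       # the assigned "virtual lane"
    speed_kts: int
    valid_until_utc: str

def encode(clearance: Clearance) -> bytes:
    """Serialize a clearance for transmission over any data link."""
    return json.dumps(asdict(clearance)).encode("utf-8")

def decode(payload: bytes) -> Clearance:
    """Parse a received clearance back into a structured object."""
    return Clearance(**json.loads(payload.decode("utf-8")))

msg = Clearance(
    vehicle_id="KH-0042",
    waypoints=[(37.42, -122.10), (37.48, -122.15)],
    altitude_ft=600,
    speed_kts=90,
    valid_until_utc="2019-12-01T18:30:00Z",
)
received = decode(encode(msg))
print(received.vehicle_id, received.altitude_ft)  # the autopilot now knows its lane
```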
link |
01:08:20.080
We talked about empathy a little bit.
link |
01:08:22.480
Do you think we'll one day build an AI system
link |
01:08:25.840
that a human being can love and that
link |
01:08:28.000
loves that human back, like in the movie Her?
link |
01:08:31.360
Look, I'm a pragmatist.
link |
01:08:33.960
For me, AI is a tool.
link |
01:08:35.640
It's like a shovel.
link |
01:08:37.080
And the ethics of using the shovel
link |
01:08:39.400
are always with us, the people.
link |
01:08:41.880
And it has to be this way.
link |
01:08:44.240
In terms of emotions, I would hate to come into my kitchen
link |
01:08:49.880
and see that my refrigerator spoiled all my food,
link |
01:08:54.240
then have it explained to me that it fell in love
link |
01:08:56.560
with a dishwasher.
link |
01:08:58.000
And I wasn't as nice as the dishwasher.
link |
01:08:59.680
So as a result, it neglected me.
link |
01:09:02.200
That would just be a bad experience.
link |
01:09:05.160
And it would be a bad product.
link |
01:09:07.080
I would probably not recommend this refrigerator
link |
01:09:09.560
to my friends.
link |
01:09:11.760
And that's where I draw the line.
link |
01:09:13.040
To me, technology has to be reliable
link |
01:09:16.280
and has to be predictable.
link |
01:09:17.360
I want my car to work.
link |
01:09:19.520
I don't want to fall in love with my car.
link |
01:09:22.480
I just want it to work.
link |
01:09:24.320
I want it to complement me, not to replace me.
link |
01:09:26.840
I have very unique human properties.
link |
01:09:30.280
And I want the machines to turn me into a superhuman.
link |
01:09:35.360
Like, I'm already a superhuman today,
link |
01:09:37.440
thanks to the machines that surround me.
link |
01:09:38.960
And I'll give you examples.
link |
01:09:40.440
I can run across the Atlantic near the speed of sound
link |
01:09:45.360
at 36,000 feet today.
link |
01:09:48.120
That's kind of amazing.
link |
01:09:49.240
I can, my voice now carries me all the way to Australia
link |
01:09:54.320
using a smartphone today.
link |
01:09:56.280
And it's not the speed of sound, which would take hours.
link |
01:09:59.720
It's the speed of light.
link |
01:10:00.960
My voice travels at the speed of light.
link |
01:10:03.480
How cool is that?
link |
01:10:04.280
That makes me superhuman.
link |
01:10:05.960
I would even argue my flushing toilet makes me superhuman.
link |
01:10:10.280
Just think of the time before flushing toilets.
link |
01:10:13.560
And maybe you have a very old person in your family
link |
01:10:16.280
that you can ask about this.
link |
01:10:18.240
Or take a trip to rural India to experience it.
link |
01:10:23.160
It makes me superhuman.
link |
01:10:25.600
So to me, what technology does, it complements me.
link |
01:10:28.680
It makes me stronger.
link |
01:10:30.680
Therefore, words like love and compassion, I have very little interest
link |
01:10:37.520
in them for machines.
link |
01:10:38.400
I have interest in people.
link |
01:10:40.560
You don't think, first of all, beautifully put,
link |
01:10:44.080
beautifully argued.
link |
01:10:45.480
But do you think love has use in our tools, compassion?
link |
01:10:50.280
I think love is a beautiful human concept.
link |
01:10:53.120
And if you think of what love really is,
link |
01:10:55.240
love is a means to convey safety, to convey trust.
link |
01:11:03.080
I think trust has a huge need in technology as well,
link |
01:11:07.280
not just people.
link |
01:11:09.040
We want to trust our technology in a similar way
link |
01:11:13.120
we trust people.
link |
01:11:15.840
In human interaction, standards have emerged.
link |
01:11:19.240
And feelings, emotions have emerged, maybe genetically,
link |
01:11:22.400
maybe ideologically, that are able to convey
link |
01:11:24.960
a sense of trust, sense of safety,
link |
01:11:26.400
a sense of passion, of love, of dedication.
link |
01:11:28.760
That makes the human fabric.
link |
01:11:30.680
And I'm a big sucker for love.
link |
01:11:33.600
I want to be loved.
link |
01:11:34.480
I want to be trusted.
link |
01:11:35.240
I want to be admired.
link |
01:11:36.720
All these wonderful things.
link |
01:11:38.760
And because all of us, we have this beautiful system,
link |
01:11:42.080
I wouldn't just blindly copy this to the machines.
link |
01:11:44.800
Here's why.
link |
01:11:46.120
When you look at, say, transportation,
link |
01:11:49.280
you could have observed that up to the end of the 19th century,
link |
01:11:54.480
almost all transportation used any number of legs,
link |
01:11:58.080
from one leg to two legs to 1,000 legs.
link |
01:12:01.600
And you could have concluded that is the right way
link |
01:12:03.800
to move about the environment.
link |
01:12:06.680
With maybe the exception of birds, who use flapping wings.
link |
01:12:08.920
In fact, there are many people in aviation
link |
01:12:10.800
that strapped wings to their arms and jumped from cliffs.
link |
01:12:13.680
Most of them didn't survive.
link |
01:12:16.920
Then the interesting thing is that the technology solutions
link |
01:12:19.880
are very different.
link |
01:12:21.560
Like, in technology, it's really easy to build a wheel.
link |
01:12:23.840
In biology, it's super hard to build a wheel.
link |
01:12:25.680
There's very few perpetually rotating things in biology.
link |
01:12:30.040
And they're usually within cells and things.
link |
01:12:34.160
In engineering, we can build wheels.
link |
01:12:37.200
And those wheels gave rise to cars.
link |
01:12:41.280
Similar wheels gave rise to aviation.
link |
01:12:44.360
Like, there's no thing that flies that
link |
01:12:46.760
wouldn't have something that rotates,
link |
01:12:48.800
like a jet engine or helicopter blades.
link |
01:12:52.400
So the solutions have used very different physical laws
link |
01:12:55.480
than nature.
link |
01:12:56.480
And that's great.
link |
01:12:58.040
So for me to be too much focused on, oh, this
link |
01:13:00.400
is how nature does it, let's just replicate it,
link |
01:13:03.360
if you really believed that the solution to the agricultural
link |
01:13:06.160
revolution was a humanoid robot, you would still
link |
01:13:09.200
be waiting today.
link |
01:13:10.920
Again, beautifully put, you said that you don't take yourself
link |
01:13:14.680
too seriously.
link |
01:13:15.920
You don't say that?
link |
01:13:18.160
You want me to say that?
link |
01:13:19.160
Maybe.
link |
01:13:19.640
You don't take me seriously.
link |
01:13:20.960
I'm not.
link |
01:13:21.560
Yeah, that's right.
link |
01:13:22.800
Good.
link |
01:13:23.280
You're right.
link |
01:13:23.960
I don't want to.
link |
01:13:24.480
I just made that up.
link |
01:13:25.720
But you have a humor and a lightness about life
link |
01:13:29.120
that I think is beautiful and inspiring to a lot of people.
link |
01:13:33.480
Where does that come from?
link |
01:13:35.040
The smile, the humor, the lightness
link |
01:13:38.400
amidst all the chaos of the hard work that you're in.
link |
01:13:42.560
Where does that come from?
link |
01:13:43.640
I just love my life.
link |
01:13:44.560
I love the people around me.
link |
01:13:45.960
I love, I'm just so glad to be alive.
link |
01:13:49.720
Like, I'm, what, 52?
link |
01:13:52.320
Hard to believe.
link |
01:13:53.640
People say 52 is a new 51.
link |
01:13:55.480
So now I feel better.
link |
01:13:58.520
But in looking around the world, looking,
link |
01:14:03.280
just go back 200, 300 years.
link |
01:14:06.160
Humanity is what, 300,000 years old.
link |
01:14:09.320
But for the first 300,000 years minus the last 100,
link |
01:14:13.960
our life expectancy would have been plus or minus 30 years,
link |
01:14:18.320
roughly, give or take.
link |
01:14:20.240
So I would be long dead now.
link |
01:14:23.480
Like, that makes me just enjoy every single day of my life.
link |
01:14:26.800
Because I don't deserve this.
link |
01:14:28.040
Like, why am I born today when so many of my ancestors
link |
01:14:32.480
died of horrible deaths?
link |
01:14:34.520
Like, famines, massive wars that ravaged Europe
link |
01:14:39.880
for the last 1,000 years, mystically disappeared
link |
01:14:43.160
after World War II when the Americans and the Allies
link |
01:14:46.520
did something amazing to my country
link |
01:14:48.280
that didn't deserve it, the country of Germany.
link |
01:14:51.440
This is so amazing.
link |
01:14:52.600
And then when you're alive and feel this every day,
link |
01:14:56.960
then it's just so amazing what we can accomplish,
link |
01:15:02.040
what we can do.
link |
01:15:03.520
We live in a world that is so incredibly
link |
01:15:06.840
changing every day, almost everything
link |
01:15:09.840
that we cherish from your smartphone
link |
01:15:12.920
to your flushing toilet, to all these basic inventions,
link |
01:15:16.240
your new clothes you're wearing, your watch, your plane,
link |
01:15:19.640
penicillin, I don't know, anesthesia for surgery,
link |
01:15:25.800
have all been invented in the last 150 years.
link |
01:15:30.080
So in the last 150 years, something magical happened.
link |
01:15:32.400
And I would trace it back to Gutenberg and the printing
link |
01:15:34.920
press that has been able to disseminate information
link |
01:15:37.680
more efficiently than before, that all of a sudden we
link |
01:15:40.160
were able to invent agriculture and nitrogen
link |
01:15:43.840
fertilization that made agriculture so much more
link |
01:15:46.360
potent that we didn't have to work on farms anymore.
link |
01:15:48.800
And we could start reading and writing,
link |
01:15:50.160
and we could become all these wonderful things we are today,
link |
01:15:52.840
from airline pilot to massage therapist to software engineer.
link |
01:15:56.800
It's just amazing, living in that time is such a blessing.
link |
01:16:00.960
We should sometimes really think about this.
link |
01:16:04.440
Steven Pinker, who is a very famous author and philosopher
link |
01:16:07.360
whom I really adore, wrote a great book called Enlightenment
link |
01:16:09.920
Now, and that's maybe the one book I would recommend.
link |
01:16:11.920
And he asked the question, if there
link |
01:16:14.080
was only a single article written in the 20th century,
link |
01:16:17.200
only one article, what would it be?
link |
01:16:19.120
What's the most important innovation,
link |
01:16:21.160
the most important thing that happened?
link |
01:16:23.120
And he would say this article would credit a guy named Carl Bosch.
link |
01:16:27.080
And I'd challenge anybody, have you ever heard of the name Carl
link |
01:16:30.360
Bosch?
link |
01:16:31.240
I haven't.
link |
01:16:33.000
There's a Bosch corporation in Germany,
link |
01:16:35.440
but it's not associated with Carl Bosch.
link |
01:16:38.480
So I looked it up.
link |
01:16:39.920
Carl Bosch invented nitrogen fertilization.
link |
01:16:42.720
And in doing so, together with an older invention of irrigation,
link |
01:16:47.560
was able to increase the yield per agricultural land
link |
01:16:50.880
by a factor of 26, so a 2,500% increase in fertility of land.
link |
01:16:57.720
And that, so Steve Pinker argues,
link |
01:17:00.560
saved over 2 billion lives today, 2 billion people
link |
01:17:04.680
who would be dead if this man hadn't done what he had done.
link |
01:17:08.440
Think about that impact and what that means to society.
link |
01:17:12.200
That's the way I look at the world.
link |
01:17:14.200
I mean, it's so amazing to be alive and to be part of this.
link |
01:17:16.960
And I'm so glad I lived after Carl Bosch and not before.
link |
01:17:21.360
I don't think there's a better way to end this.
link |
01:17:23.520
Sebastian, it's an honor to talk to you,
link |
01:17:25.480
to have had the chance to learn from you.
link |
01:17:27.400
Thank you so much for talking to us.
link |
01:17:28.800
Thanks for coming out.
link |
01:17:29.800
It's a real pleasure.
link |
01:17:31.000
Thank you for listening to this conversation with Sebastian Thrun.
link |
01:17:34.400
And thank you to our presenting sponsor, Cash App.
link |
01:17:37.480
Download it, use code LEX Podcast.
link |
01:17:40.240
You'll get $10.
link |
01:17:41.520
And $10 will go to FIRST, a STEM education nonprofit
link |
01:17:44.920
that inspires hundreds of thousands of young minds
link |
01:17:47.480
to learn and to dream of engineering our future.
link |
01:17:50.560
If you enjoy this podcast, subscribe on YouTube,
link |
01:17:53.360
give it 5 stars on Apple Podcasts, support it on Patreon,
link |
01:17:56.640
or connect with me on Twitter.
link |
01:17:58.840
And now, let me leave you with some words of wisdom
link |
01:18:01.280
from Sebastian Thrun.
link |
01:18:03.280
It's important to celebrate your failures
link |
01:18:05.400
as much as your successes.
link |
01:18:07.720
If you celebrate your failures really well,
link |
01:18:09.800
if you say, wow, I failed, I tried, I was wrong,
link |
01:18:13.920
but I learned something,
link |
01:18:15.600
then you realize you have no fear.
link |
01:18:18.280
And when your fear goes away, you can move the world.
link |
01:18:22.520
Thank you for listening and hope to see you next time.