
Russ Tedrake: Underactuated Robotics, Control, Dynamics and Touch | Lex Fridman Podcast #114



link |
00:00:00.000
The following is a conversation with Russ Tedrake,
link |
00:00:03.000
a roboticist and professor at MIT
link |
00:00:05.560
and vice president of robotics research
link |
00:00:07.880
at Toyota Research Institute or TRI.
link |
00:00:11.240
He works on control of robots in interesting,
link |
00:00:15.160
complicated, underactuated, stochastic,
link |
00:00:18.000
difficult to model situations.
link |
00:00:19.960
He's a great teacher and a great person,
link |
00:00:22.640
one of my favorites at MIT.
link |
00:00:25.040
We'll get into a lot of topics in this conversation
link |
00:00:28.280
from his time leading MIT's DARPA Robotics Challenge team
link |
00:00:32.760
to the awesome fact that he often runs
link |
00:00:35.400
close to a marathon a day to and from work barefoot.
link |
00:00:40.480
For a world class roboticist interested in elegant,
link |
00:00:43.400
efficient control of underactuated dynamical systems
link |
00:00:46.920
like the human body, this fact makes Russ
link |
00:00:50.840
one of the most fascinating people I know.
link |
00:00:54.480
Quick summary of the ads.
link |
00:00:55.780
Three sponsors, Magic Spoon Cereal, BetterHelp,
link |
00:00:59.220
and ExpressVPN.
link |
00:01:00.760
Please consider supporting this podcast
link |
00:01:02.620
by going to magicspoon.com slash lex
link |
00:01:05.680
and using code lex at checkout,
link |
00:01:07.960
going to betterhelp.com slash lex
link |
00:01:10.480
and signing up at expressvpn.com slash lexpod.
link |
00:01:14.640
Click the links in the description,
link |
00:01:16.480
buy the stuff, get the discount.
link |
00:01:18.800
It really is the best way to support this podcast.
link |
00:01:21.800
If you enjoy this thing, subscribe on YouTube,
link |
00:01:24.000
review it with five stars on Apple Podcast,
link |
00:01:26.240
support it on Patreon, or connect with me
link |
00:01:28.280
on Twitter at lexfridman.
link |
00:01:31.280
As usual, I'll do a few minutes of ads now
link |
00:01:33.640
and never any ads in the middle
link |
00:01:34.880
that can break the flow of the conversation.
link |
00:01:37.880
This episode is supported by Magic Spoon,
link |
00:01:40.880
low carb keto friendly cereal.
link |
00:01:43.460
I've been on a mix of keto or carnivore diet
link |
00:01:45.800
for a very long time now.
link |
00:01:47.320
That means eating very little carbs.
link |
00:01:50.520
I used to love cereal.
link |
00:01:52.200
Obviously, most have crazy amounts of sugar,
link |
00:01:54.960
which is terrible for you, so I quit years ago,
link |
00:01:58.000
but Magic Spoon is a totally new thing.
link |
00:02:00.420
Zero sugar, 11 grams of protein,
link |
00:02:03.000
and only three net grams of carbs.
link |
00:02:05.720
It tastes delicious.
link |
00:02:07.240
It has a bunch of flavors, they're all good,
link |
00:02:09.660
but if you know what's good for you,
link |
00:02:11.200
you'll go with cocoa, my favorite flavor
link |
00:02:13.940
and the flavor of champions.
link |
00:02:15.820
Click the magicspoon.com slash lex link in the description,
link |
00:02:19.460
use code lex at checkout to get the discount
link |
00:02:22.160
and to let them know I sent you.
link |
00:02:24.400
So buy all of their cereal.
link |
00:02:26.680
It's delicious and good for you.
link |
00:02:28.640
You won't regret it.
link |
00:02:30.560
This show is also sponsored by BetterHelp,
link |
00:02:33.160
spelled H E L P Help.
link |
00:02:36.040
Check it out at betterhelp.com slash lex.
link |
00:02:39.440
They figure out what you need
link |
00:02:40.600
and match you with a licensed professional therapist
link |
00:02:43.240
in under 48 hours.
link |
00:02:44.960
It's not a crisis line, it's not self help,
link |
00:02:47.640
it is professional counseling done securely online.
link |
00:02:51.040
As you may know, I'm a bit from the David Goggins line
link |
00:02:53.720
of creatures and still have some demons to contend with,
link |
00:02:57.080
usually on long runs or all nighters full of self doubt.
link |
00:03:01.580
I think suffering is essential for creation,
link |
00:03:04.360
but you can suffer beautifully
link |
00:03:06.040
in a way that doesn't destroy you.
link |
00:03:08.200
For most people, I think a good therapist can help in this.
link |
00:03:11.540
So it's at least worth a try.
link |
00:03:13.400
Check out the reviews, they're all good.
link |
00:03:15.620
It's easy, private, affordable, available worldwide.
link |
00:03:19.220
You can communicate by text anytime
link |
00:03:21.640
and schedule weekly audio and video sessions.
link |
00:03:25.080
Check it out at betterhelp.com slash lex.
link |
00:03:28.500
This show is also sponsored by ExpressVPN.
link |
00:03:31.840
Get it at expressvpn.com slash lex pod
link |
00:03:34.860
to get a discount and to support this podcast.
link |
00:03:37.680
Have you ever watched The Office?
link |
00:03:39.680
If you have, you probably know it's based
link |
00:03:41.900
on a UK series also called The Office.
link |
00:03:45.120
Not to stir up trouble, but I personally think
link |
00:03:48.080
the British version is actually more brilliant
link |
00:03:50.320
than the American one, but both are amazing.
link |
00:03:53.120
Anyway, there are actually nine other countries
link |
00:03:56.120
with their own version of The Office.
link |
00:03:58.400
You can get access to them with no geo restriction
link |
00:04:01.180
when you use ExpressVPN.
link |
00:04:03.600
It lets you control where you want sites
link |
00:04:05.560
to think you're located.
link |
00:04:07.340
You can choose from nearly 100 different countries,
link |
00:04:10.360
giving you access to content
link |
00:04:12.120
that isn't available in your region.
link |
00:04:14.020
So again, get it on any device at expressvpn.com slash lex pod
link |
00:04:19.800
to get an extra three months free
link |
00:04:22.080
and to support this podcast.
link |
00:04:25.000
And now here's my conversation with Russ Tedrake.
link |
00:04:29.560
What is the most beautiful motion
link |
00:04:31.480
of an animal or robot that you've ever seen?
link |
00:04:36.160
I think the most beautiful motion of a robot
link |
00:04:38.280
has to be the passive dynamic walkers.
link |
00:04:41.120
I think there's just something fundamentally beautiful.
link |
00:04:43.320
The ones in particular that Steve Collins built
link |
00:04:45.360
with Andy Ruina at Cornell, a 3D walking machine.
link |
00:04:50.520
So it was not confined to a boom or a plane.
link |
00:04:54.680
You put it on top of a small ramp,
link |
00:04:57.460
give it a little push, it's powered only by gravity.
link |
00:05:00.500
No controllers, no batteries whatsoever.
link |
00:05:04.320
It just falls down the ramp.
link |
00:05:06.160
And at the time it looked more natural, more graceful,
link |
00:05:09.520
more human like than any robot we'd seen to date
link |
00:05:13.460
powered only by gravity.
link |
00:05:15.240
How does it work?
link |
00:05:17.160
Well, okay, the simplest model, it's kind of like a slinky.
link |
00:05:19.480
It's like an elaborate slinky.
link |
00:05:21.560
One of the simplest models we used to think about it
link |
00:05:23.840
is actually a rimless wheel.
link |
00:05:25.360
So imagine taking a bicycle wheel, but take the rim off.
link |
00:05:30.100
So it's now just got a bunch of spokes.
link |
00:05:32.640
If you give that a push,
link |
00:05:33.720
it still wants to roll down the ramp,
link |
00:05:35.840
but every time its foot, its spoke comes around
link |
00:05:38.180
and hits the ground, it loses a little energy.
link |
00:05:41.880
Every time it takes a step forward,
link |
00:05:43.280
it gains a little energy.
link |
00:05:45.800
Those things can come into perfect balance.
link |
00:05:48.200
And actually they want to, it's a stable phenomenon.
link |
00:05:51.240
If it's going too slow, it'll speed up.
link |
00:05:53.720
If it's going too fast, it'll slow down
link |
00:05:55.880
and it comes into a stable periodic motion.
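That energy balance can be sketched in a few lines. This is a minimal, illustrative model (point mass at the hub, inelastic spoke impacts, parameters chosen for demonstration rather than taken from the conversation): each step gains kinetic energy from the height drop down the slope and loses a fixed fraction at the spoke impact, and iterating the resulting return map shows the stable periodic motion he describes.

```python
import math

def rimless_wheel_step(omega, g=9.81, l=1.0, alpha=math.pi / 8, gamma=0.08):
    """One step of the rimless-wheel return map.

    omega is the post-impact angular velocity at the start of a stance phase.
    Pivoting through 2*alpha on a slope of angle gamma drops the mass by
    2*l*sin(alpha)*sin(gamma), adding kinetic energy; the inelastic spoke
    impact then scales angular velocity by cos(2*alpha), removing some.
    (Assumes the wheel is moving fast enough to get past the apex.)
    """
    omega_pre_impact = math.sqrt(
        omega**2 + (4 * g / l) * math.sin(alpha) * math.sin(gamma))
    return math.cos(2 * alpha) * omega_pre_impact

def fixed_point(g=9.81, l=1.0, alpha=math.pi / 8, gamma=0.08):
    """Speed at which energy gained per step exactly equals energy lost."""
    c2 = math.cos(2 * alpha) ** 2
    return math.sqrt(
        c2 * (4 * g / l) * math.sin(alpha) * math.sin(gamma) / (1 - c2))

# Start one wheel too slow and one too fast: both settle to the same gait.
slow, fast = 1.0, 3.0
for _ in range(30):
    slow = rimless_wheel_step(slow)
    fast = rimless_wheel_step(fast)
print(round(slow, 4), round(fast, 4), round(fixed_point(), 4))
```

Both initial conditions converge to the same fixed point of the map, which is exactly the stable phenomenon described: too slow speeds up, too fast slows down.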
link |
00:05:59.480
Now you can take that rimless wheel,
link |
00:06:02.120
which doesn't look very much like a human walking,
link |
00:06:05.040
take all the extra spokes away, put a hinge in the middle.
link |
00:06:08.080
Now it's two legs.
link |
00:06:09.720
That's called our compass gait walker.
link |
00:06:11.880
That can still, you give it a little push,
link |
00:06:13.800
it starts falling down a ramp.
link |
00:06:15.520
It looks a little bit more like walking.
link |
00:06:17.240
At least it's a biped.
link |
00:06:19.700
But what Steve and Andy,
link |
00:06:21.400
and Tad McGeer started the whole exercise,
link |
00:06:23.480
but what Steve and Andy did was they took it
link |
00:06:25.200
to this beautiful conclusion
link |
00:06:28.700
where they built something that had knees, arms, a torso.
link |
00:06:32.440
The arms swung naturally, give it a little push.
link |
00:06:36.320
And that looked like a stroll through the park.
link |
00:06:38.720
How do you design something like that?
link |
00:06:40.240
I mean, is that art or science?
link |
00:06:42.360
It's on the boundary.
link |
00:06:43.800
I think there's a science to getting close to the solution.
link |
00:06:47.640
I think there's certainly art in the way
link |
00:06:49.040
that they made a beautiful robot.
link |
00:06:52.000
But then the finesse, because they were working
link |
00:06:57.080
with a system that wasn't perfectly modeled,
link |
00:06:58.980
wasn't perfectly controlled,
link |
00:07:01.060
there's all these little tricks
link |
00:07:02.800
that you have to tune the suction cups at the knees,
link |
00:07:05.480
for instance, so that they stick,
link |
00:07:07.960
but then they release at just the right time.
link |
00:07:09.640
Or there's all these little tricks of the trade,
link |
00:07:12.360
which really are art, but it was a point.
link |
00:07:14.440
I mean, it made the point.
link |
00:07:16.200
We were, at that time, the walking robot,
link |
00:07:18.800
the best walking robot in the world was Honda's ASIMO.
link |
00:07:21.840
Absolute marvel of modern engineering.
link |
00:07:24.120
Is this 90s?
link |
00:07:25.240
This was in 97 when they first released.
link |
00:07:27.440
They sort of announced P2, and then it went through.
link |
00:07:29.920
It was ASIMO by then in 2004.
link |
00:07:32.360
And it looks like this very cautious walking,
link |
00:07:37.840
like you're walking on hot coals or something like that.
link |
00:07:41.320
I think it gets a bad rap.
link |
00:07:43.760
ASIMO is a beautiful machine.
link |
00:07:45.340
It does walk with its knees bent.
link |
00:07:47.000
Our Atlas walking had its knees bent.
link |
00:07:49.740
But actually, ASIMO was pretty fantastic.
link |
00:07:52.340
But it wasn't energy efficient.
link |
00:07:54.320
Neither was Atlas when we worked on Atlas.
link |
00:07:58.220
None of our robots that have been that complicated
link |
00:08:00.520
have been very energy efficient.
link |
00:08:04.040
But there's a thing that happens when you do control,
link |
00:08:09.680
when you try to control a system of that complexity.
link |
00:08:12.480
You try to use your motors to basically counteract gravity.
link |
00:08:17.360
Take whatever the world's doing to you and push back,
link |
00:08:20.680
erase the dynamics of the world,
link |
00:08:23.520
and impose the dynamics you want
link |
00:08:25.040
because you can make them simple and analyzable,
link |
00:08:28.220
mathematically simple.
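That cancel-and-impose pattern can be illustrated with a toy pendulum (a hypothetical example, not anything from the episode; all gains and parameters here are made up for illustration): the controller's torque first subtracts out gravity, erasing the world's dynamics, then substitutes simple, analyzable spring-damper dynamics toward a target.

```python
import math

def simulate_pendulum(controller, theta0=1.0, steps=5000, dt=0.002,
                      g=9.81, l=1.0, b=0.5):
    """Semi-implicit Euler integration of
    theta_ddot = -(g/l)*sin(theta) - b*theta_dot + u."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        u = controller(theta, omega)
        omega += (-(g / l) * math.sin(theta) - b * omega + u) * dt
        theta += omega * dt
    return theta

# Motors off: pure passive dynamics, gravity and damping do all the work.
passive = simulate_pendulum(lambda th, om: 0.0)

# Feedback cancellation: the torque erases gravity, then imposes simple
# spring-damper dynamics pulling the pendulum toward a target angle.
target = 0.5
def cancel_and_impose(th, om, g=9.81, l=1.0, kp=20.0, kd=2.0):
    return (g / l) * math.sin(th) + kp * (target - th) - kd * om

controlled = simulate_pendulum(cancel_and_impose)
print(round(passive, 2), round(controlled, 2))
```

The passive pendulum just settles to hanging, while the controlled one converges to the target because the closed-loop dynamics were made linear by design, which is the analyzability being paid for with energy.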
link |
00:08:30.760
And this was a very sort of beautiful example
link |
00:08:34.400
that you don't have to do that.
link |
00:08:36.380
You can just let go.
link |
00:08:37.480
Let physics do most of the work, right?
link |
00:08:40.280
And you just have to give it a little bit of energy.
link |
00:08:42.200
This one only walked down a ramp.
link |
00:08:43.560
It would never walk on the flat.
link |
00:08:45.340
To walk on the flat,
link |
00:08:46.180
you have to give a little energy at some point.
link |
00:08:48.480
But maybe instead of trying to take the forces imparted
link |
00:08:51.960
to you by the world and replacing them,
link |
00:08:55.200
what we should be doing is letting the world push us around
link |
00:08:58.200
and we go with the flow.
link |
00:08:59.360
Very zen, very zen robot.
link |
00:09:01.280
Yeah, but okay, so that sounds very zen,
link |
00:09:03.440
but I can also imagine how many like failed versions
link |
00:09:10.220
they had to go through.
link |
00:09:11.640
Like how many, like, I would say it's probably,
link |
00:09:14.040
would you say it's in the thousands
link |
00:09:15.320
that they've had to have the system fall down
link |
00:09:17.920
before they figured out how to get it?
link |
00:09:19.840
I don't know if it's thousands, but it's a lot.
link |
00:09:22.560
It takes some patience.
link |
00:09:23.560
There's no question.
link |
00:09:25.040
So in that sense, control might help a little bit.
link |
00:09:28.320
Oh, I think everybody, even at the time,
link |
00:09:32.100
said that the answer is to do that with control.
link |
00:09:35.020
But it was just pointing out
link |
00:09:36.340
that maybe the way we're doing control right now
link |
00:09:39.120
isn't the way we should.
link |
00:09:41.040
Got it.
link |
00:09:41.880
So what about on the animal side,
link |
00:09:43.800
the ones that figured out how to move efficiently?
link |
00:09:46.200
Is there anything you find inspiring or beautiful
link |
00:09:49.440
in the movement of any particular animal?
link |
00:09:51.160
I do have a favorite example.
link |
00:09:51.980
Okay.
link |
00:09:52.820
So it sort of goes with the passive walking idea.
link |
00:09:57.160
So is there, you know, how energy efficient are animals?
link |
00:10:01.400
Okay, there's a great series of experiments
link |
00:10:03.840
by George Lauder at Harvard and Michael Triantafyllou at MIT.
link |
00:10:07.520
They were studying fish swimming in a water tunnel.
link |
00:10:10.640
Okay.
link |
00:10:11.820
And one of these, the type of fish they were studying
link |
00:10:15.240
were these rainbow trout,
link |
00:10:17.240
because there was a phenomenon well understood
link |
00:10:20.360
that rainbow trout, when they're swimming upstream
link |
00:10:22.180
in mating season, they kind of hang out behind the rocks.
link |
00:10:25.120
And it looks like, I mean,
link |
00:10:26.080
that's tiring work swimming upstream.
link |
00:10:28.080
They're hanging out behind the rocks.
link |
00:10:29.180
Maybe there's something energetically interesting there.
link |
00:10:31.980
So they tried to recreate that.
link |
00:10:33.400
They put in this water tunnel, a rock basically,
link |
00:10:36.440
a cylinder that had the same sort of vortex street,
link |
00:10:40.560
the eddies coming off the back of the rock
link |
00:10:42.480
that you would see in a stream.
link |
00:10:44.240
And they put a real fish behind this
link |
00:10:46.080
and watched how it swims.
link |
00:10:48.000
And the amazing thing is that if you watch from above
link |
00:10:51.960
the way the fish swims when it's not behind a rock,
link |
00:10:53.800
it has a particular gait.
link |
00:10:56.120
You can identify the fish the same way you look
link |
00:10:58.240
at a human walking down the street.
link |
00:10:59.840
You sort of have a sense of how a human walks.
link |
00:11:02.420
The fish has a characteristic gait.
link |
00:11:05.360
You put that fish behind the rock, its gait changes.
link |
00:11:09.160
And what they saw was that it was actually resonating
link |
00:11:12.720
and kind of surfing between the vortices.
link |
00:11:16.560
Now, here was the experiment that really was the clincher.
link |
00:11:20.140
Because there was still, it wasn't clear how much of that
link |
00:11:22.160
was mechanics of the fish,
link |
00:11:24.000
how much of that is control, the brain.
link |
00:11:26.940
So the clincher experiment,
link |
00:11:28.480
and maybe one of my favorites to date,
link |
00:11:29.800
although there are many good experiments.
link |
00:11:33.700
They took, this was now a dead fish.
link |
00:11:38.380
They took a dead fish.
link |
00:11:40.200
They put a string that went,
link |
00:11:41.640
that tied the mouth of the fish to the rock
link |
00:11:44.160
so it couldn't go back and get caught in the grates.
link |
00:11:47.160
And then they asked what would that dead fish do
link |
00:11:49.180
when it was hanging out behind the rock?
link |
00:11:51.160
And so what you'd expect, it sort of flopped around
link |
00:11:52.920
like a dead fish in the vortex wake
link |
00:11:56.120
until something sort of amazing happens.
link |
00:11:57.800
And this video is worth putting in, right?
link |
00:12:02.880
What happens?
link |
00:12:04.040
The dead fish basically starts swimming upstream, right?
link |
00:12:07.520
It's completely dead, no brain, no motors, no control.
link |
00:12:12.160
But it's somehow the mechanics of the fish
link |
00:12:14.600
resonate with the vortex street
link |
00:12:16.360
and it starts swimming upstream.
link |
00:12:18.280
It's one of the best examples ever.
link |
00:12:20.520
Who do you give credit for that to?
link |
00:12:23.740
Is that just evolution constantly just figuring out
link |
00:12:27.980
by killing a lot of generations of animals,
link |
00:12:30.920
like the most efficient motion?
link |
00:12:33.360
Is that, or maybe the physics of our world completely like,
link |
00:12:38.660
is like if evolution applied not only to animals,
link |
00:12:40.920
but just the entirety of it somehow drives to efficiency,
link |
00:12:45.220
like nature likes efficiency?
link |
00:12:47.020
I don't know if that question even makes any sense.
link |
00:12:49.980
I understand the question.
link |
00:12:51.020
That's reasonable.
link |
00:12:51.860
I mean, do they co evolve?
link |
00:12:54.460
Yeah, somehow co, yeah.
link |
00:12:55.620
Like I don't know if an environment can evolve, but.
link |
00:13:00.020
I mean, there are experiments that people do,
link |
00:13:02.340
careful experiments that show that animals can adapt
link |
00:13:05.940
to unusual situations and recover efficiency.
link |
00:13:08.660
So there seems like at least in one direction,
link |
00:13:11.100
I think there is reason to believe
link |
00:13:12.740
that the animal's motor system and probably its mechanics
link |
00:13:18.100
adapt in order to be more efficient.
link |
00:13:20.060
But efficiency isn't the only goal, of course.
link |
00:13:23.140
Sometimes it's too easy to think about only efficiency,
link |
00:13:26.220
but we have to do a lot of other things first, not get eaten.
link |
00:13:30.540
And then all other things being equal, try to save energy.
link |
00:13:34.140
By the way, let's draw a distinction
link |
00:13:36.100
between control and mechanics.
link |
00:13:38.160
Like how would you define each?
link |
00:13:40.820
Yeah.
link |
00:13:41.720
I mean, I think part of the point is that
link |
00:13:43.940
we shouldn't draw a line as clearly as we tend to.
link |
00:13:47.860
But on a robot, we have motors
link |
00:13:51.460
and we have the links of the robot, let's say.
link |
00:13:54.840
If the motors are turned off,
link |
00:13:56.260
the robot has some passive dynamics, okay?
link |
00:13:59.780
Gravity does the work.
link |
00:14:01.380
You can put springs, I would call that mechanics, right?
link |
00:14:03.700
If we have springs and dampers,
link |
00:14:04.940
which our muscles are springs and dampers and tendons.
link |
00:14:08.540
But then you have something that's doing active work,
link |
00:14:10.440
putting energy in, which are your motors on the robot.
link |
00:14:13.240
The controller's job is to send commands to the motor
link |
00:14:16.580
that add new energy into the system, right?
link |
00:14:19.960
So the mechanics and control interplay somewhere,
link |
00:14:22.820
the divide is around, you know,
link |
00:14:24.820
did you decide to send some commands to your motor
link |
00:14:27.560
or did you just leave the motors off,
link |
00:14:28.980
let them do their work?
link |
00:14:30.580
Would you say is most of nature
link |
00:14:35.140
on the dynamic side or the control side?
link |
00:14:39.820
So like, if you look at biological systems,
link |
00:14:43.580
we're living in a pandemic now,
link |
00:14:45.100
like, do you think a virus is a,
link |
00:14:47.840
do you think it's a dynamic system
link |
00:14:50.100
or is there a lot of control, intelligence?
link |
00:14:54.100
I think it's both, but I think we maybe have underestimated
link |
00:14:57.040
how important the dynamics are, right?
link |
00:15:02.020
I mean, even our bodies, the mechanics of our bodies,
link |
00:15:04.300
certainly with exercise, they evolve.
link |
00:15:06.140
But so I actually, I lost a finger in the early 2000s
link |
00:15:11.060
and it's my fifth metacarpal.
link |
00:15:14.460
And it turns out you use that a lot
link |
00:15:16.620
in ways you don't expect when you're opening jars,
link |
00:15:19.340
even when I'm just walking around,
link |
00:15:20.620
if I bump it on something, there's a bone there
link |
00:15:23.220
that was used to taking contact.
link |
00:15:26.780
My fourth metacarpal wasn't used to taking contact,
link |
00:15:28.820
it used to hurt, it still does a little bit.
link |
00:15:31.100
But actually my bone has remodeled, right?
link |
00:15:34.180
Over a couple of years, the geometry,
link |
00:15:39.580
the mechanics of that bone changed
link |
00:15:42.100
to address the new circumstances.
link |
00:15:44.340
So the idea that somehow it's only our brain
link |
00:15:46.820
that's adapting or evolving is not right.
link |
00:15:50.140
Maybe sticking on evolution for a bit,
link |
00:15:52.560
because it's tended to create some interesting things.
link |
00:15:56.720
Bipedal walking, why the heck did evolution give us,
link |
00:16:01.720
I think we're, are we the only mammals that walk on two feet?
link |
00:16:05.040
No, I mean, there's a bunch of animals that do it a bit.
link |
00:16:09.040
A bit.
link |
00:16:09.880
I think we are the most successful bipeds.
link |
00:16:12.280
I think I read somewhere that the reason
link |
00:16:17.760
evolution made us walk on two feet
link |
00:16:22.760
is because there's an advantage
link |
00:16:24.720
to being able to carry food back to the tribe
link |
00:16:27.200
or something like that.
link |
00:16:28.040
So like you can carry, it's kind of this communal,
link |
00:16:31.960
cooperative thing, so like to carry stuff back
link |
00:16:35.080
to a place of shelter and so on to share with others.
link |
00:16:40.080
Do you understand at all the value of walking on two feet
link |
00:16:44.520
from both a robotics and a human perspective?
link |
00:16:48.000
Yeah, there are some great books written
link |
00:16:50.280
about evolution of, walking evolution of the human body.
link |
00:16:54.560
I think it's easy though to make bad evolutionary arguments.
link |
00:17:00.600
Sure, most of them are probably bad,
link |
00:17:03.740
but what else can we do?
link |
00:17:06.200
I mean, I think a lot of what dominated our evolution
link |
00:17:11.120
probably was not the things that worked well
link |
00:17:15.080
sort of in the steady state, you know,
link |
00:17:18.560
when things are good, but for instance,
link |
00:17:22.800
people talk about what we should eat now
link |
00:17:25.040
because our ancestors were meat eaters or whatever.
link |
00:17:28.320
Oh yeah, I love that, yeah.
link |
00:17:30.240
But probably, you know, the reason
link |
00:17:32.520
that one pre Homo sapiens species versus another survived
link |
00:17:39.640
was not because of whether they ate well
link |
00:17:43.440
when there was lots of food.
link |
00:17:45.300
But when the ice age came, you know,
link |
00:17:47.920
probably one of them happened to be in the wrong place.
link |
00:17:50.940
One of them happened to forage a food that was okay
link |
00:17:54.200
even when the glaciers came or something like that, I mean.
link |
00:17:58.240
There's a million variables that contributed
link |
00:18:00.560
and we can't, and our, actually the amount of information
link |
00:18:04.080
we're working with and telling these stories,
link |
00:18:06.680
these evolutionary stories is very little.
link |
00:18:10.220
So yeah, just like you said, it seems like,
link |
00:18:13.080
if you study history, it seems like history turns
link |
00:18:15.680
on like these little events that otherwise
link |
00:18:20.280
would seem meaningless, but in the grand,
link |
00:18:23.320
like when you, in retrospect, were turning points.
link |
00:18:27.560
Absolutely.
link |
00:18:28.400
And that's probably how like somebody got hit in the head
link |
00:18:31.280
with a rock because somebody slept with the wrong person
link |
00:18:35.160
back in the cave days and somebody got angry
link |
00:18:38.500
and that turned, you know, warring tribes
link |
00:18:41.920
combined with the environment, all those millions of things
link |
00:18:45.360
and the meat eating, which I get a lot of criticism
link |
00:18:47.680
because I don't know what your dietary processes are like,
link |
00:18:51.480
but these days I've been eating only meat,
link |
00:18:55.040
which is, there's a large community of people who say,
link |
00:18:59.080
yeah, probably make evolutionary arguments
link |
00:19:01.080
and say you're doing a great job.
link |
00:19:02.720
There's probably an even larger community of people,
link |
00:19:05.760
including my mom, who says it's deeply unhealthy,
link |
00:19:08.520
it's wrong, but I just feel good doing it.
link |
00:19:10.760
But you're right, these evolutionary arguments
link |
00:19:12.980
can be flawed, but is there anything interesting
link |
00:19:15.420
to pull out for?
link |
00:19:17.320
There's a great book, by the way,
link |
00:19:19.360
well, a series of books by Nassim Nicholas Taleb
link |
00:19:21.280
about Fooled by Randomness and The Black Swan.
link |
00:19:24.800
Highly recommend them, but yeah,
link |
00:19:26.840
they make the point nicely that probably
link |
00:19:29.160
it was a few random events that, yes,
link |
00:19:34.360
maybe it was someone getting hit by a rock, as you say.
link |
00:19:39.520
That said, do you think, I don't know how to ask this
link |
00:19:42.700
question or how to talk about this,
link |
00:19:44.080
but there's something elegant and beautiful
link |
00:19:45.680
about moving on two feet, obviously biased
link |
00:19:48.800
because I'm human, but from a robotics perspective, too,
link |
00:19:53.280
you work with robots on two feet,
link |
00:19:56.440
is it all useful to build robots that are on two feet
link |
00:20:00.120
as opposed to four?
link |
00:20:01.120
Is there something useful about it?
link |
00:20:02.320
I think the most, I mean, the reason I spent a long time
link |
00:20:05.540
working on bipedal walking was because it was hard
link |
00:20:09.000
and it challenged control theory in ways
link |
00:20:12.480
that I thought were important.
link |
00:20:13.920
I wouldn't have ever tried to convince you
link |
00:20:18.520
that you should start a company around bipeds
link |
00:20:22.440
or something like this.
link |
00:20:24.240
There are people that make pretty compelling arguments.
link |
00:20:26.120
I think the most compelling one is that the world
link |
00:20:28.920
is built for the human form, and if you want a robot
link |
00:20:32.320
to work in the world we have today,
link |
00:20:34.800
then having a human form is a pretty good way to go.
link |
00:20:39.680
There are places that a biped can go that would be hard
link |
00:20:42.560
for other form factors to go, even natural places,
link |
00:20:47.640
but at some point in the long run,
link |
00:20:51.360
we'll be building our environments for our robots, probably,
link |
00:20:54.220
and so maybe that argument falls aside.
link |
00:20:56.480
So you famously run barefoot.
link |
00:21:00.640
Do you still run barefoot?
link |
00:21:02.120
I still run barefoot.
link |
00:21:03.080
That's so awesome.
link |
00:21:04.760
Much to my wife's chagrin.
link |
00:21:07.800
Do you want to make an evolutionary argument
link |
00:21:09.320
for why running barefoot is advantageous?
link |
00:21:12.680
What have you learned about human and robot movement
link |
00:21:17.560
in general from running barefoot?
link |
00:21:21.160
Human or robot and or?
link |
00:21:23.640
Well, you know, it happened the other way, right?
link |
00:21:25.640
So I was studying walking robots,
link |
00:21:27.680
and there's a great conference called
link |
00:21:31.760
the Dynamic Walking Conference where it brings together
link |
00:21:35.320
both the biomechanics community
link |
00:21:36.980
and the walking robots community.
link |
00:21:39.880
And so I had been going to this for years
link |
00:21:41.660
and hearing talks by people who study barefoot running
link |
00:21:45.080
and other, the mechanics of running.
link |
00:21:48.080
So I did eventually read Born to Run.
link |
00:21:50.280
Most people read Born to Run first, right?
link |
00:21:54.080
The other thing I had going for me is actually
link |
00:21:55.720
that I wasn't a runner before,
link |
00:21:58.800
and I learned to run after I had learned
link |
00:22:01.560
about barefoot running, or I mean,
link |
00:22:03.640
started running longer distances.
link |
00:22:05.440
So I didn't have to unlearn.
link |
00:22:07.360
And I'm definitely, I'm a big fan of it for me,
link |
00:22:11.080
but I'm not going to,
link |
00:22:12.360
I tend to not try to convince other people.
link |
00:22:14.600
There's people who run beautifully with shoes on,
link |
00:22:17.240
and that's good.
link |
00:22:20.040
But here's why it makes sense for me.
link |
00:22:24.040
It's all about the longterm game, right?
link |
00:22:26.360
So I think it's just too easy to run 10 miles,
link |
00:22:29.440
feel pretty good, and then you get home at night
link |
00:22:31.560
and you realize my knees hurt.
link |
00:22:33.840
I did something wrong, right?
link |
00:22:37.880
If you take your shoes off,
link |
00:22:39.780
then if you hit hard with your foot at all,
link |
00:22:44.080
then it hurts.
link |
00:22:45.720
You don't like run 10 miles
link |
00:22:47.560
and then realize you've done some damage.
link |
00:22:50.800
You have immediate feedback telling you
link |
00:22:52.940
that you've done something that's maybe suboptimal,
link |
00:22:55.420
and you change your gait.
link |
00:22:56.520
I mean, it's even subconscious.
link |
00:22:57.720
If I, right now, having run many miles barefoot,
link |
00:23:00.640
if I put a shoe on, my gait changes
link |
00:23:03.160
in a way that I think is not as good.
link |
00:23:05.840
So it makes me land softer.
link |
00:23:09.520
And I think my goals for running
link |
00:23:13.160
are to do it for as long as I can into old age,
link |
00:23:16.860
not to win any races.
link |
00:23:19.000
And so for me, this is a way to protect myself.
link |
00:23:23.420
Yeah, I think, first of all,
link |
00:23:25.680
I've tried running barefoot many years ago,
link |
00:23:29.540
probably the other way,
link |
00:23:30.480
just reading Born to Run.
link |
00:23:33.920
But just to understand,
link |
00:23:36.440
because I felt like I couldn't put in the miles
link |
00:23:39.520
that I wanted to.
link |
00:23:40.840
And it feels like running for me,
link |
00:23:44.260
and I think for a lot of people,
link |
00:23:46.280
was one of those activities that we do often
link |
00:23:48.880
and we never really try to learn to do correctly.
link |
00:23:53.340
Like, it's funny, there's so many activities
link |
00:23:55.920
we do every day, like brushing our teeth, right?
link |
00:24:00.280
I think a lot of us, at least me,
link |
00:24:02.360
probably have never deeply studied
link |
00:24:04.320
how to properly brush my teeth, right?
link |
00:24:07.040
Or wash, as now with the pandemic,
link |
00:24:08.960
or how to properly wash our hands.
link |
00:24:10.640
We do it every day, but we haven't really studied,
link |
00:24:13.800
like, am I doing this correctly?
link |
00:24:15.200
But running felt like one of those things,
link |
00:24:17.120
it was absurd not to study how to do it correctly,
link |
00:24:20.220
because it's the source of so much pain and suffering.
link |
00:24:23.320
Like, I hate running, but I do it.
link |
00:24:25.680
I do it because I hate it, but I feel good afterwards.
link |
00:24:28.940
But I think it feels like you need
link |
00:24:30.280
to learn how to do it properly.
link |
00:24:31.440
So that's where barefoot running came in,
link |
00:24:33.540
and then I quickly realized that my gait
link |
00:24:35.760
was completely wrong.
link |
00:24:38.040
I was taking huge steps,
link |
00:24:41.440
and landing hard on the heel, all those elements.
link |
00:24:45.840
And so, yeah, from that I actually learned
link |
00:24:47.600
to take really small steps, look.
link |
00:24:50.520
I already forgot the number,
link |
00:24:52.280
but I feel like it was 180 a minute or something like that.
link |
00:24:55.600
And I remember I actually just took songs
link |
00:25:00.080
that are 180 beats per minute,
link |
00:25:03.360
and then like tried to run at that beat,
link |
00:25:06.520
and just to teach myself.
link |
00:25:07.660
It took a long time, and I feel like after a while,
link |
00:25:11.120
you learn to run, you adjust properly,
link |
00:25:14.320
without going all the way to barefoot.
link |
00:25:15.960
But I feel like barefoot is the legit way to do it.
link |
00:25:19.440
I mean, I think a lot of people
link |
00:25:21.640
would be really curious about it.
link |
00:25:23.360
Can you, if they're interested in trying,
link |
00:25:25.560
what would you, how would you recommend
link |
00:25:27.840
they start, or try, or explore?
link |
00:25:30.740
Slowly.
link |
00:25:31.580
That's the biggest thing people do,
link |
00:25:33.720
is they are excellent runners,
link |
00:25:35.920
and they're used to running long distances,
link |
00:25:37.620
or running fast, and they take their shoes off,
link |
00:25:39.240
and they hurt themselves instantly trying to do
link |
00:25:42.520
something that they were used to doing.
link |
00:25:44.280
I think I lucked out in the sense
link |
00:25:46.000
that I couldn't run very far when I first started trying.
link |
00:25:50.200
And I run with minimal shoes too.
link |
00:25:51.840
I mean, I will bring along a pair of,
link |
00:25:54.360
actually, like aqua socks or something like this,
link |
00:25:56.320
I can just slip on, or running sandals,
link |
00:25:58.320
I've tried all of them.
link |
00:26:00.360
What's the difference between a minimal shoe
link |
00:26:02.600
and nothing at all?
link |
00:26:03.760
What's, like, feeling wise, what does it feel like?
link |
00:26:07.020
There is a, I mean, I notice my gait changing, right?
link |
00:26:10.000
So, I mean, your foot has as many muscles
link |
00:26:15.080
and sensors as your hand does, right?
link |
00:26:17.600
Sensors, ooh, okay.
link |
00:26:19.960
And we do amazing things with our hands.
link |
00:26:23.200
And we stick our foot in a big, solid shoe, right?
link |
00:26:26.000
So there's, I think, you know, when you're barefoot,
link |
00:26:29.640
you're just giving yourself more proprioception.
link |
00:26:33.240
And that's why you're more aware of some of the gait flaws
link |
00:26:35.720
and stuff like this.
link |
00:26:37.080
Now, you have less protection too, so.
link |
00:26:40.720
Rocks and stuff.
link |
00:26:42.400
I mean, yeah, so I think people who are afraid
link |
00:26:45.160
of barefoot running are worried about getting cuts
link |
00:26:47.160
or stepping on rocks.
link |
00:26:49.800
First of all, even if that was a concern,
link |
00:26:51.560
I think those are all, like, very short term.
link |
00:26:54.240
You know, if I get a scratch or something,
link |
00:26:55.420
it'll heal in a week.
link |
00:26:56.520
If I blow out my knees, I'm done running forever.
link |
00:26:58.240
So I will trade the short term for the long term anytime.
link |
00:27:01.720
But even then, you know, and this, again,
link |
00:27:04.760
to my wife's chagrin, your feet get tough, right?
link |
00:27:07.760
And, yeah, I can run over almost anything now.
link |
00:27:13.760
I mean, what, can you talk about,
link |
00:27:17.240
is there, like, is there tips or tricks
link |
00:27:21.940
that you have, suggestions about,
link |
00:27:24.820
like, if I wanted to try it?
link |
00:27:26.620
You know, there is a good book, actually.
link |
00:27:29.580
There's probably more good books since I read them.
link |
00:27:32.700
But Ken Bob, Barefoot Ken Bob Saxton.
link |
00:27:37.340
He's an interesting guy.
link |
00:27:38.820
But I think his book captures the right way
link |
00:27:42.620
to describe running, barefoot running,
link |
00:27:44.180
to somebody better than any other I've seen.
link |
00:27:48.580
So you run pretty good distances, and you bike,
link |
00:27:52.540
and is there, you know, if we talk about bucket list items,
link |
00:27:57.820
is there something crazy on your bucket list,
link |
00:28:00.220
athletically, that you hope to do one day?
link |
00:28:04.620
I mean, my commute is already a little crazy.
link |
00:28:07.180
What are we talking about here?
link |
00:28:09.020
What distance are we talking about?
link |
00:28:11.420
Well, I live about 12 miles from MIT,
link |
00:28:14.680
but you can find lots of different ways to get there.
link |
00:28:16.620
So, I mean, I've run there for many years, I've biked there.
link |
00:28:20.540
Both ways?
link |
00:28:21.460
Yeah, but normally I would try to run in
link |
00:28:23.900
and then bike home, bike in, run home.
link |
00:28:25.980
But you have run there and back before?
link |
00:28:28.140
Sure.
link |
00:28:28.980
Barefoot?
link |
00:28:29.820
Yeah, or with minimal shoes or whatever.
link |
00:28:32.260
12, 12 times two?
link |
00:28:34.340
Yeah.
link |
00:28:35.180
Okay.
link |
00:28:36.020
It became kind of a game of how can I get to work?
link |
00:28:38.500
I've rollerbladed, I've done all kinds of weird stuff,
link |
00:28:41.020
but my favorite one these days,
link |
00:28:42.700
I've been taking the Charles River to work.
link |
00:28:45.060
So, I can put in the rowboat not so far from my house,
link |
00:28:50.740
but the Charles River takes a long way to get to MIT,
link |
00:28:53.300
so I can spend a long time getting there.
link |
00:28:56.380
And it's not about, I don't know, it's just about,
link |
00:29:01.620
I've had people ask me,
link |
00:29:02.560
how can you justify taking that time?
link |
00:29:05.820
But for me, it's just a magical time to think,
link |
00:29:10.140
to compress, decompress.
link |
00:29:13.740
Especially, I'll wake up, do a lot of work in the morning,
link |
00:29:16.220
and then I kind of have to just let that settle
link |
00:29:19.180
before I'm ready for all my meetings.
link |
00:29:20.700
And then on the way home, it's a great time to sort of
link |
00:29:23.160
let that settle.
link |
00:29:24.580
You lead a large group of people.
link |
00:29:31.860
Is there days where you're like,
link |
00:29:33.980
oh shit, I gotta get to work in an hour?
link |
00:29:36.620
Like, I mean, is there a tension there?
link |
00:29:45.420
And like, if we look at the grand scheme of things,
link |
00:29:47.940
just like you said, long term,
link |
00:29:49.500
that meeting probably doesn't matter.
link |
00:29:51.700
Like, you can always say, I'll just, I'll run
link |
00:29:54.660
and let the meeting happen, how it happens.
link |
00:29:57.100
Like, what, how do you, that zen, how do you,
link |
00:30:02.200
what do you do with that tension
link |
00:30:03.580
between the real world saying urgently,
link |
00:30:05.620
you need to be there, this is important,
link |
00:30:08.220
everything is melting down,
link |
00:30:10.060
how are we gonna fix this robot?
link |
00:30:11.820
There's this critical meeting,
link |
00:30:14.660
and then there's this, the zen beauty of just running,
link |
00:30:18.020
the simplicity of it, you along with nature.
link |
00:30:21.380
What do you do with that?
link |
00:30:22.700
I would say I'm not a fast runner, particularly.
link |
00:30:25.540
Probably my fastest splits ever were when
link |
00:30:27.940
I had to get to daycare on time
link |
00:30:29.220
because they were gonna charge me, you know,
link |
00:30:30.700
some dollar per minute that I was late.
link |
00:30:33.540
I've run some fast splits to daycare.
link |
00:30:36.980
But those times are past now.
link |
00:30:41.700
I think work, you can find a work life balance in that way.
link |
00:30:44.900
I think you just have to.
link |
00:30:47.260
I think I am better at work
link |
00:30:48.620
because I take time to think on the way in.
link |
00:30:52.180
So I plan my day around it,
link |
00:30:55.300
and I rarely feel that those are really at odds.
link |
00:31:00.300
So what, the bucket list item.
link |
00:31:03.380
If we're talking 12 times two, or approaching a marathon,
link |
00:31:10.620
what, have you run an ultra marathon before?
link |
00:31:15.060
Do you do races?
link |
00:31:16.740
Is there, what's a...
link |
00:31:17.580
Not to win.
link |
00:31:21.620
I'm not gonna like take a dinghy across the Atlantic
link |
00:31:23.720
or something if that's what you want.
link |
00:31:24.780
But if someone does and wants to write a book,
link |
00:31:27.920
I would totally read it
link |
00:31:28.760
because I'm a sucker for that kind of thing.
link |
00:31:31.140
No, I do have some fun things that I will try.
link |
00:31:33.420
You know, I like to, when I travel,
link |
00:31:35.300
I almost always bike to Logan Airport
link |
00:31:37.020
and fold up a little folding bike
link |
00:31:38.740
and then take it with me and bike to wherever I'm going.
link |
00:31:41.040
And it's taken me,
link |
00:31:42.420
or I'll take a stand up paddle board these days
link |
00:31:44.580
on the airplane,
link |
00:31:45.500
and then I'll try to paddle around where I'm going
link |
00:31:47.100
or whatever.
link |
00:31:47.940
And I've done some crazy things, but...
link |
00:31:50.720
But not for the, you know, I now talk,
link |
00:31:55.140
I don't know if you know who David Goggins is by any chance.
link |
00:31:57.500
Not well, but yeah.
link |
00:31:58.460
But I talk to him now every day.
link |
00:32:00.140
So he's the person who made me do this stupid challenge.
link |
00:32:05.940
So he's insane and he does things for the purpose
link |
00:32:10.160
in the best kind of way.
link |
00:32:11.380
He does things like for the explicit purpose of suffering.
link |
00:32:16.980
Like he picks the thing that,
link |
00:32:18.420
like whatever he thinks he can do, he does more.
link |
00:32:22.940
So is that, do you have that thing in you or are you...
link |
00:32:27.300
I think it's become the opposite.
link |
00:32:29.820
It's a...
link |
00:32:30.660
So you're like that dynamical system
link |
00:32:32.300
that the walker, the efficient...
link |
00:32:34.420
Yeah, it's leave no pain, right?
link |
00:32:38.860
You should end feeling better than you started.
link |
00:32:40.900
Okay.
link |
00:32:41.720
But it's mostly, I think, and COVID has tested this
link |
00:32:45.940
because I've lost my commute.
link |
00:32:47.740
I think I'm perfectly happy walking around town
link |
00:32:51.980
with my wife and kids if I could get them to go.
link |
00:32:55.220
And it's more about just getting outside
link |
00:32:57.780
and getting away from the keyboard for some time
link |
00:32:59.980
just to let things compress.
link |
00:33:02.580
Let's go into robotics a little bit.
link |
00:33:04.100
What to you is the most beautiful idea in robotics?
link |
00:33:07.800
Whether we're talking about control
link |
00:33:10.780
or whether we're talking about optimization
link |
00:33:12.740
and the math side of things or the engineering side of things
link |
00:33:16.180
or the philosophical side of things.
link |
00:33:20.380
I think I've been lucky to experience something
link |
00:33:23.540
that not so many roboticists have experienced,
link |
00:33:27.700
which is to hang out
link |
00:33:30.220
with some really amazing control theorists.
link |
00:33:34.420
And the clarity of thought
link |
00:33:40.700
that some of the more mathematical control theory
link |
00:33:43.140
can bring to even very complex, messy looking problems
link |
00:33:49.480
is really, it really had a big impact on me
link |
00:33:53.140
and I had a day even just a couple of weeks ago
link |
00:33:57.900
where I had spent the day on a Zoom robotics conference
link |
00:34:01.020
having great conversations with lots of people.
link |
00:34:04.020
Felt really good about the ideas
link |
00:34:06.780
that were flowing and the like.
link |
00:34:09.500
And then I had a late afternoon meeting
link |
00:34:12.940
with one of my favorite control theorists
link |
00:34:15.540
and we went from these abstract discussions
link |
00:34:20.540
about maybes and what ifs and what a great idea
link |
00:34:25.540
to these super precise statements
link |
00:34:30.100
about systems that aren't that much more simple
link |
00:34:33.660
or abstract than the ones I care about deeply.
link |
00:34:38.260
And the contrast of that is,
link |
00:34:42.540
I don't know, it really gets me.
link |
00:34:43.780
I think people underestimate
link |
00:34:47.580
maybe the power of clear thinking.
link |
00:34:51.580
And so for instance, deep learning is amazing.
link |
00:34:58.580
I use it heavily in our work.
link |
00:35:00.380
I think it's changed the world, unquestionable.
link |
00:35:04.700
It makes it easy to get things to work
link |
00:35:07.020
without thinking as critically about it.
link |
00:35:08.580
So I think one of the challenges as an educator
link |
00:35:11.300
is to think about how do we make sure people get a taste
link |
00:35:14.940
of the more rigorous thinking
link |
00:35:17.860
that I think goes along with some different approaches.
link |
00:35:22.620
Yeah, so that's really interesting.
link |
00:35:24.020
So understanding like the fundamentals,
link |
00:35:26.900
the first principles of the problem,
link |
00:35:31.900
where in this case it's mechanics,
link |
00:35:33.780
like how a thing moves, how a thing behaves,
link |
00:35:38.780
like all the forces involved,
link |
00:35:40.420
like really getting a deep understanding of that.
link |
00:35:42.740
I mean, from physics, the first principle thing
link |
00:35:45.340
comes from physics, and here it's literally physics.
link |
00:35:50.100
Yeah, and this applies, in deep learning,
link |
00:35:51.940
this applies to not just, I mean,
link |
00:35:54.980
it applies so cleanly in robotics,
link |
00:35:57.300
but it also applies to just in any data set.
link |
00:36:01.500
I find this true, I mean, driving as well.
link |
00:36:05.100
There's a lot of folks that work on autonomous vehicles
link |
00:36:09.100
that don't study driving,
link |
00:36:17.900
like deeply.
link |
00:36:20.300
I might be coming a little bit from the psychology side,
link |
00:36:23.100
but I remember I spent a ridiculous number of hours
link |
00:36:28.380
at lunch, at this like lawn chair,
link |
00:36:31.940
and I would sit somewhere in MIT's campus,
link |
00:36:35.740
there's a few interesting intersections,
link |
00:36:37.260
and we'd just watch people cross.
link |
00:36:39.380
So we were studying pedestrian behavior,
link |
00:36:43.220
and I felt like, as we record a lot of video,
link |
00:36:46.220
to try, and then there's the computer vision
link |
00:36:47.820
extracts their movement, how they move their head, and so on,
link |
00:36:50.860
but like every time, I felt like I didn't understand enough.
link |
00:36:55.340
I just, I felt like I wasn't understanding
link |
00:36:58.620
what, how are people signaling to each other,
link |
00:37:01.620
what are they thinking,
link |
00:37:03.580
how cognizant are they of their fear of death?
link |
00:37:07.820
Like, what's the underlying game theory here?
link |
00:37:11.900
What are the incentives?
link |
00:37:14.140
And then I finally found a live stream of an intersection
link |
00:37:17.860
that's like high def that I just, I would watch
link |
00:37:20.300
so I wouldn't have to sit out there.
link |
00:37:21.780
But it's interesting, so like, I feel.
link |
00:37:23.580
But that's tough, that's a tough example,
link |
00:37:25.180
because I mean, the learning.
link |
00:37:27.100
Humans are involved.
link |
00:37:28.780
Not just because human, but I think the learning mantra
link |
00:37:33.460
is that basically the statistics of the data
link |
00:37:35.500
will tell me things I need to know, right?
link |
00:37:37.940
And, you know, for the example you gave
link |
00:37:41.860
of all the nuances of, you know, eye contact,
link |
00:37:45.420
or hand gestures, or whatever that are happening
link |
00:37:47.620
for these subtle interactions
link |
00:37:48.900
between pedestrians and traffic, right?
link |
00:37:51.140
Maybe the data will tell that story.
link |
00:37:54.460
I maybe even, one level more meta than what you're saying.
link |
00:38:01.300
For a particular problem,
link |
00:38:02.660
I think it might be the case
link |
00:38:03.820
that data should tell us the story.
link |
00:38:07.220
But I think there's a rigorous thinking
link |
00:38:09.420
that is just an essential skill
link |
00:38:11.700
for a mathematician or an engineer
link |
00:38:14.580
that I just don't wanna lose it.
link |
00:38:18.380
There are certainly super rigorous control,
link |
00:38:22.460
or sorry, machine learning people.
link |
00:38:24.940
I just think deep learning makes it so easy
link |
00:38:28.020
to do some things that our next generation,
link |
00:38:31.580
are not immediately rewarded
link |
00:38:35.860
for going through some of the more rigorous approaches.
link |
00:38:38.540
And then I wonder where that takes us.
link |
00:38:40.740
Well, I'm actually optimistic about it.
link |
00:38:42.260
I just want to do my part
link |
00:38:44.860
to try to steer that rigorous thinking.
link |
00:38:48.020
So there's like two questions I wanna ask.
link |
00:38:50.940
Do you have sort of a good example of rigorous thinking
link |
00:38:56.860
where it's easy to get lazy and not do the rigorous thinking?
link |
00:39:00.860
And the other question I have is like,
link |
00:39:02.500
do you have advice of how to practice rigorous thinking
link |
00:39:09.140
in all the computer science disciplines that we've mentioned?
link |
00:39:16.380
Yeah, I mean, there are times where problems
link |
00:39:21.500
that can be solved with well known mature methods
link |
00:39:25.860
could also be solved with a deep learning approach.
link |
00:39:30.300
And there's an argument that you must use learning
link |
00:39:36.740
even for the parts we already think we know,
link |
00:39:38.380
because if the human has touched it,
link |
00:39:39.780
then you've biased the system
link |
00:39:42.460
and you've suddenly put a bottleneck in there
link |
00:39:44.340
that is your own mental model.
link |
00:39:46.300
But something like inverting a matrix,
link |
00:39:49.100
I think we know how to do that pretty well,
link |
00:39:50.780
even if it's a pretty big matrix,
link |
00:39:52.020
and we understand that pretty well.
link |
00:39:53.140
And you could train a deep network to do it,
link |
00:39:55.060
but you shouldn't probably.
link |
00:39:57.340
So in that sense, rigorous thinking is understanding
link |
00:40:02.220
the scope and the limitations of the methods that we have,
link |
00:40:07.340
like how to use the tools of mathematics properly.
link |
00:40:10.180
Yeah, I think taking a class on analysis
link |
00:40:15.100
is all I'm sort of arguing is to take a chance to stop
link |
00:40:18.620
and force yourself to think rigorously
link |
00:40:20.900
about even the rational numbers or something.
link |
00:40:25.140
It doesn't have to be the end all problem.
link |
00:40:27.740
But that exercise of clear thinking,
link |
00:40:31.100
I think goes a long way,
link |
00:40:33.420
and I just wanna make sure we keep preaching it.
link |
00:40:35.260
We don't lose it.
link |
00:40:36.380
But do you think when you're doing rigorous thinking
link |
00:40:39.540
or maybe trying to write down equations
link |
00:40:43.220
or sort of explicitly formally describe a system,
link |
00:40:47.980
do you think we naturally simplify things too much?
link |
00:40:51.580
Is that a danger you run into?
link |
00:40:53.500
Like in order to be able to understand something
link |
00:40:56.180
about the system mathematically,
link |
00:40:58.180
we make it too much of a toy example.
link |
00:41:01.700
But I think that's the good stuff, right?
link |
00:41:04.460
That's how you understand the fundamentals?
link |
00:41:07.060
I think so.
link |
00:41:07.900
I think maybe even that's a key to intelligence
link |
00:41:10.380
or something, but I mean, okay,
link |
00:41:12.460
what if Newton and Galileo had deep learning?
link |
00:41:15.100
And they had done a bunch of experiments
link |
00:41:18.340
and they told the world,
link |
00:41:20.360
here's your weights of your neural network.
link |
00:41:22.460
We've solved the problem.
link |
00:41:24.260
Where would we be today?
link |
00:41:25.380
I don't think we'd be as far as we are.
link |
00:41:28.420
There's something to be said
link |
00:41:29.260
about having the simplest explanation for a phenomenon.
link |
00:41:32.540
So I don't doubt that we can train neural networks
link |
00:41:37.180
to predict even physical F equals MA type equations.
link |
00:41:46.300
But I maybe, I want another Newton to come along
link |
00:41:51.300
because I think there's more to do
link |
00:41:52.940
in terms of coming up with the simple models
link |
00:41:56.020
for more complicated tasks.
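As a toy illustration of that parsimony point (this sketch is mine, not from the conversation; the numbers are made up): "learning" F equals ma in the simplest setting reduces to recovering one parameter, the mass, by least squares.

```python
import numpy as np

# Hypothetical sketch (not from the conversation): the parsimonious
# model of F = m*a is a single parameter, the mass, which ordinary
# least squares recovers directly from noisy measurements.
rng = np.random.default_rng(0)
true_mass = 2.5
accel = rng.uniform(-5.0, 5.0, size=200)                    # measured accelerations
force = true_mass * accel + rng.normal(0.0, 0.1, size=200)  # noisy forces

# Least-squares fit of force = m * accel (no intercept term).
m_hat = float(accel @ force / (accel @ accel))
print(m_hat)  # an estimate close to 2.5
```

The point of the sketch is the representation: a single physical parameter generalizes in a way an arbitrarily complex input-output function need not.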
link |
00:41:59.860
Yeah, let's not offend AI systems from 50 years
link |
00:42:04.240
from now that are listening to this
link |
00:42:06.340
that are probably better at,
link |
00:42:08.260
might be better coming up
link |
00:42:10.180
with F equals MA equations themselves.
link |
00:42:13.080
So sorry, I actually think learning is probably a route
link |
00:42:16.940
to achieving this, but the representation matters, right?
link |
00:42:21.180
And I think having a function that takes my inputs
link |
00:42:26.200
to outputs that is arbitrarily complex
link |
00:42:29.060
may not be the end goal.
link |
00:42:30.780
I think there's still the most simple
link |
00:42:34.140
or parsimonious explanation for the data.
link |
00:42:37.620
Simple doesn't mean low dimensional.
link |
00:42:39.000
That's one thing I think that we've,
link |
00:42:41.020
a lesson that we've learned.
link |
00:42:41.960
So a standard way to do model reduction
link |
00:42:46.080
or system identification in controls,
link |
00:42:47.860
the typical formulation is that you try to find
link |
00:42:50.460
the minimal state dimension realization of a system
link |
00:42:54.220
that hits some error bounds or something like that.
link |
00:42:57.760
And that's maybe not, I think we're learning
link |
00:43:00.340
that state dimension is not the right metric.
link |
00:43:05.980
Of complexity.
link |
00:43:06.820
Of complexity.
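A hedged sketch of that "minimal state dimension that hits an error bound" formulation, using SVD truncation of a data matrix as a simplified stand-in (the function name and tolerance are illustrative, not any specific system-identification algorithm):

```python
import numpy as np

def minimal_dimension(snapshots: np.ndarray, rel_tol: float) -> int:
    """Smallest rank r whose truncation keeps the relative error under rel_tol."""
    s = np.linalg.svd(snapshots, compute_uv=False)
    total = float(np.sum(s**2))
    captured = np.cumsum(s**2)
    # Smallest r such that the discarded energy is at most rel_tol**2 of total.
    return int(np.searchsorted(captured, (1.0 - rel_tol**2) * total)) + 1

# An exactly rank-2 data matrix needs only two "states".
X = (np.outer([1.0, 2.0, 3.0], [1.0, 0.0, 1.0, 2.0])
     + np.outer([0.0, 1.0, -1.0], [2.0, 1.0, 0.0, 1.0]))
print(minimal_dimension(X, 1e-6))  # 2
```

This captures the formulation being critiqued: the error bound picks the dimension, and dimension alone is the complexity measure.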
link |
00:43:07.640
But for me, I think a lot about contact,
link |
00:43:09.460
the mechanics of contact,
link |
00:43:10.820
if a robot hand is picking up an object or something.
link |
00:43:14.520
And when I write down the equations of motion for that,
link |
00:43:17.220
they look incredibly complex,
link |
00:43:19.100
not because, actually not so much
link |
00:43:23.420
because of the dynamics of the hand when it's moving,
link |
00:43:26.660
but it's just the interactions
link |
00:43:28.500
and when they turn on and off, right?
link |
00:43:30.860
So having a high dimensional,
link |
00:43:33.300
but simple description of what's happening out here is fine.
link |
00:43:36.420
But if when I actually start touching,
link |
00:43:38.480
if I write down a different dynamical system
link |
00:43:41.860
for every polygon on my robot hand
link |
00:43:45.420
and every polygon on the object,
link |
00:43:47.300
whether it's in contact or not,
link |
00:43:49.000
with all the combinatorics that explodes there,
link |
00:43:51.700
then that's too complex.
link |
00:43:54.460
So I need to somehow summarize that
link |
00:43:55.800
with a more intuitive physics way of thinking.
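The mode explosion he is describing can be put in back-of-the-envelope terms (a hypothetical illustration; the face counts are made up): if every face pair can independently be in or out of contact, the number of hybrid modes is exponential in the number of pairs.

```python
# Hypothetical back-of-the-envelope (not from the conversation): if each
# face pair between a hand mesh and an object mesh is a contact that can
# independently be on or off, a mode-per-combination hybrid model needs
# 2**(hand_faces * object_faces) distinct dynamical systems.
def num_contact_modes(hand_faces: int, object_faces: int) -> int:
    return 2 ** (hand_faces * object_faces)

for h, o in [(2, 2), (4, 4), (8, 8)]:
    print(h, o, num_contact_modes(h, o))  # 16, then 65536, then 2**64
```

Even toy meshes make per-mode modeling hopeless, which is the motivation for a more summarized, intuitive-physics description.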
link |
00:44:01.460
And yeah, I'm very optimistic
link |
00:44:03.500
that machine learning will get us there.
link |
00:44:05.700
First of all, I mean, I'll probably do it
link |
00:44:08.220
in the introduction,
link |
00:44:09.140
but you're one of the great robotics people at MIT.
link |
00:44:12.900
You're a professor at MIT.
link |
00:44:14.300
You teach a lot of amazing courses.
link |
00:44:16.480
You run a large group
link |
00:44:19.180
and you have an important history for MIT, I think,
link |
00:44:22.780
as being a part of the DARPA Robotics Challenge.
link |
00:44:26.340
Can you maybe first say,
link |
00:44:28.340
what is the DARPA Robotics Challenge
link |
00:44:30.000
and then tell your story around it, your journey with it?
link |
00:44:36.380
Yeah, sure.
link |
00:44:39.260
So the DARPA Robotics Challenge,
link |
00:44:41.060
it came on the tails of the DARPA Grand Challenge
link |
00:44:44.720
and DARPA Urban Challenge,
link |
00:44:45.940
which were the challenges that brought us,
link |
00:44:49.660
put a spotlight on self driving cars.
link |
00:44:55.400
Gill Pratt was at DARPA and pitched a new challenge
link |
00:45:01.360
that involved disaster response.
link |
00:45:04.980
It didn't explicitly require humanoids,
link |
00:45:07.140
although humanoids came into the picture.
link |
00:45:10.220
This happened shortly after the Fukushima disaster in Japan
link |
00:45:14.740
and the challenge was motivated roughly by that
link |
00:45:17.660
because that was a case where if we had had robots
link |
00:45:21.060
that were ready to be sent in,
link |
00:45:22.700
there's a chance that we could have averted disaster.
link |
00:45:26.580
And certainly after the, in the disaster response,
link |
00:45:30.620
there were times we would have loved
link |
00:45:32.380
to have sent robots in.
link |
00:45:34.740
So in practice, what we ended up with was a grand challenge,
link |
00:45:39.220
a DARPA Robotics Challenge,
link |
00:45:41.180
where Boston Dynamics was to make humanoid robots.
link |
00:45:48.660
People like me and the amazing team at MIT
link |
00:45:53.660
were competing first in a simulation challenge
link |
00:45:56.780
to try to be one of the ones that wins the right
link |
00:45:59.460
to work on one of the Boston Dynamics humanoids
link |
00:46:03.340
in order to compete in the final challenge,
link |
00:46:06.620
which was a physical challenge.
link |
00:46:08.580
And at that point, it was already, so it was decided
link |
00:46:11.260
as humanoid robots early on.
link |
00:46:13.420
There were two tracks.
link |
00:46:15.140
You could enter as a hardware team
link |
00:46:16.900
where you brought your own robot,
link |
00:46:18.480
or you could enter through the virtual robotics challenge
link |
00:46:21.380
as a software team that would try to win the right
link |
00:46:24.300
to use one of the Boston Dynamics robots.
link |
00:46:25.940
Sure, called Atlas.
link |
00:46:27.420
Atlas.
link |
00:46:28.260
Humanoid robots.
link |
00:46:29.080
Yeah, it was a 400 pound marvel,
link |
00:46:31.500
but a pretty big, scary looking robot.
link |
00:46:35.620
Expensive too.
link |
00:46:36.700
Expensive, yeah.
link |
00:46:38.260
Okay, so I mean, how did you feel
link |
00:46:42.300
at the prospect of this kind of challenge?
link |
00:46:44.780
I mean, it seems autonomous vehicles,
link |
00:46:48.820
yeah, I guess that sounds hard,
link |
00:46:51.060
but not really from a robotics perspective.
link |
00:46:53.980
It's like, didn't they do it in the 80s
link |
00:46:56.020
is the kind of feeling I would have,
link |
00:46:58.760
like when you first look at the problem,
link |
00:47:00.820
it's on wheels, but like humanoid robots,
link |
00:47:04.900
that sounds really hard.
link |
00:47:07.060
So what are your, psychologically speaking,
link |
00:47:12.860
what were you feeling, excited, scared?
link |
00:47:15.780
Why the heck did you get yourself involved
link |
00:47:18.020
in this kind of messy challenge?
link |
00:47:19.660
We didn't really know for sure what we were signing up for
link |
00:47:24.540
in the sense that you could have something that,
link |
00:47:26.820
as it was described in the call for participation,
link |
00:47:30.780
that could have put a huge emphasis on the dynamics
link |
00:47:33.900
of walking and not falling down
link |
00:47:35.700
and walking over rough terrain,
link |
00:47:37.380
or the same description,
link |
00:47:38.580
because the robot had to go into this disaster area
link |
00:47:40.780
and turn valves and pick up a drill,
link |
00:47:44.580
it had to cut a hole through a wall,
link |
00:47:45.780
it had to do some interesting things.
link |
00:47:48.420
The challenge could have really highlighted perception
link |
00:47:51.860
and autonomous planning,
link |
00:47:54.820
or it ended up that locomoting over complex terrain
link |
00:48:01.060
played a pretty big role in the competition.
link |
00:48:03.600
So...
link |
00:48:05.520
And the degree of autonomy wasn't clear.
link |
00:48:08.360
The degree of autonomy
link |
00:48:09.560
was always a central part of the discussion.
link |
00:48:11.920
So what wasn't clear was how we would be able,
link |
00:48:15.560
how far we'd be able to get with it.
link |
00:48:17.520
So the idea was always that you want semi autonomy,
link |
00:48:21.640
that you want the robot to have enough compute
link |
00:48:24.280
that you can have a degraded network link to a human.
link |
00:48:27.640
And so the same way we had degraded networks
link |
00:48:30.640
at many natural disasters,
link |
00:48:33.160
you'd send your robot in,
link |
00:48:34.960
you'd be able to get a few bits back and forth,
link |
00:48:37.540
but you don't get to have enough
link |
00:48:38.920
potentially to fully operate the robot
link |
00:48:42.080
in every joint of the robot.
link |
00:48:44.600
So, and then the question was,
link |
00:48:46.160
and the gamesmanship of the organizers
link |
00:48:48.880
was to figure out what we're capable of,
link |
00:48:50.680
push us as far as we could,
link |
00:48:52.600
so that it would differentiate the teams
link |
00:48:55.300
that put more autonomy on the robot
link |
00:48:57.540
and had a few clicks and just said,
link |
00:48:59.400
go there, do this, go there, do this,
link |
00:49:00.920
versus someone who's picking every footstep
link |
00:49:03.400
or something like that.
link |
00:49:05.280
So what were some memories,
link |
00:49:10.760
painful, triumphant from the experience?
link |
00:49:13.620
Like what was that journey?
link |
00:49:15.040
Maybe if you can dig in a little deeper,
link |
00:49:17.680
maybe even on the technical side, on the team side,
link |
00:49:21.120
that whole process of,
link |
00:49:24.120
from the early idea stages to actually competing.
link |
00:49:28.200
I mean, this was a defining experience for me.
link |
00:49:31.680
It came at the right time for me in my career.
link |
00:49:33.940
I had gotten tenure before I was due a sabbatical,
link |
00:49:37.480
and most people do something relaxing
link |
00:49:39.840
and restorative for a sabbatical.
link |
00:49:41.920
So you got tenure before this?
link |
00:49:44.520
Yeah, yeah, yeah.
link |
00:49:46.200
It was a good time for me.
link |
00:49:48.120
We had a bunch of algorithms that we were very happy with.
link |
00:49:50.960
We wanted to see how far we could push them,
link |
00:49:52.560
and this was a chance to really test our mettle
link |
00:49:54.920
to do more proper software engineering.
link |
00:49:56.880
So the team, we all just worked our butts off.
link |
00:50:01.420
We were in that lab almost all the time.
link |
00:50:07.680
Okay, so there were some, of course,
link |
00:50:09.600
high highs and low lows throughout that.
link |
00:50:12.080
Anytime you're not sleeping
link |
00:50:13.720
and devoting your life to a 400 pound humanoid.
link |
00:50:18.320
I remember actually one funny moment
link |
00:50:20.720
where we're all super tired,
link |
00:50:21.940
and so Atlas had to walk across cinder blocks.
link |
00:50:24.760
That was one of the obstacles.
link |
00:50:26.520
And I remember Atlas was powered down
link |
00:50:28.240
and hanging limp on its harness,
link |
00:50:31.280
and the humans were there picking up
link |
00:50:34.000
and laying the brick down
link |
00:50:35.200
so that the robot could walk over it.
link |
00:50:36.440
And I thought, what is wrong with this?
link |
00:50:38.240
We've got a robot just watching us
link |
00:50:41.560
do all the manual labor
link |
00:50:42.500
so that it can take its little stroll across the terrain.
link |
00:50:47.040
But I mean, even the virtual robotics challenge
link |
00:50:52.120
was super nerve wracking and dramatic.
link |
00:50:54.640
I remember, so we were using Gazebo as a simulator
link |
00:51:01.520
on the cloud,
link |
00:51:02.360
and there was all these interesting challenges.
link |
00:51:03.920
I think the investment that OSRF,
link |
00:51:08.560
whatever they were called at that time,
link |
00:51:10.020
Brian Gerkey's team at Open Source Robotics,
link |
00:51:14.160
they were pushing on the capabilities of Gazebo
link |
00:51:16.000
in order to scale it to the complexity of these challenges.
link |
00:51:20.380
So, you know, up to the virtual competition.
link |
00:51:23.900
So the virtual competition was,
link |
00:51:26.220
you will sign on at a certain time
link |
00:51:28.480
and we'll have a network connection
link |
00:51:29.840
to another machine on the cloud
link |
00:51:32.080
that is running the simulator of your robot.
link |
00:51:34.880
And your controller will run on this computer
link |
00:51:38.160
and the physics will run on the other
link |
00:51:40.920
and you have to connect.
link |
00:51:43.060
Now, the physics, they wanted it to run at real time rates
link |
00:51:48.140
because there was an element of human interaction.
link |
00:51:50.740
And humans, if you do want to teleop,
link |
00:51:53.280
it works way better if it's at frame rate.
link |
00:51:56.120
Oh, cool.
link |
00:51:57.120
But it was very hard to simulate
link |
00:51:58.720
these complex scenes at real time rate.
link |
00:52:03.240
So right up to like days before the competition,
link |
00:52:06.520
the simulator wasn't quite at real time rate.
link |
00:52:11.040
And that was great for me because my controller
link |
00:52:13.280
was solving a pretty big optimization problem
link |
00:52:16.280
and it wasn't quite at real time rate.
link |
00:52:17.760
So I was fine.
link |
00:52:18.880
I was keeping up with the simulator.
link |
00:52:20.480
We were both running at about 0.7.
link |
00:52:22.880
And I remember getting this email.
link |
00:52:24.960
And by the way, the perception folks on our team hated
link |
00:52:28.440
that they knew that if my controller was too slow,
link |
00:52:31.440
the robot was gonna fall down.
link |
00:52:32.520
And no matter how good their perception system was,
link |
00:52:34.920
if I can't make my controller fast.
link |
00:52:36.940
Anyways, we get this email
link |
00:52:37.920
like three days before the virtual competition.
link |
00:52:40.480
It's for all the marbles.
link |
00:52:41.480
We're gonna either get a humanoid robot or we're not.
link |
00:52:44.920
And we get an email saying,
link |
00:52:45.740
good news, we made the simulator faster.
link |
00:52:48.680
It's now at 1.0.
link |
00:52:50.560
And I was just like, oh man, what are we gonna do here?
link |
00:52:54.800
So that came in late at night for me.
link |
00:52:59.520
A few days ahead.
link |
00:53:00.560
A few days ahead.
link |
00:53:01.440
I went over, and it happened that Frank Permenter,
link |
00:53:04.000
who's very, very sharp.
link |
00:53:06.800
He was a student at the time working on optimization.
link |
00:53:11.160
He was still in lab.
link |
00:53:13.640
Frank, we need to make the quadratic programming solver
link |
00:53:16.680
faster, not like a little faster.
link |
00:53:18.360
It's actually, you know, and we wrote a new solver
link |
00:53:22.600
for that QP together that night.
link |
00:53:28.160
It was terrifying.
link |
00:53:29.400
So there's a really hard optimization problem
link |
00:53:31.920
that you're constantly solving.
link |
00:53:34.480
You didn't make the optimization problem simpler?
link |
00:53:36.820
You wrote a new solver?
link |
00:53:38.480
So, I mean, your observation is almost spot on.
link |
00:53:42.840
What we did was what everybody,
link |
00:53:44.520
I mean, people know how to do this,
link |
00:53:45.800
but we had not yet done this idea of warm starting.
link |
00:53:49.240
So we are solving a big optimization problem
link |
00:53:51.320
at every time step.
link |
00:53:52.680
But if you're running fast enough,
link |
00:53:54.280
the optimization problem you're solving
link |
00:53:55.680
on the last time step is pretty similar
link |
00:53:57.920
to the optimization you're gonna solve with the next.
link |
00:54:00.040
We had of course told our commercial solver
link |
00:54:02.240
to use warm starting, but even the interface
link |
00:54:05.520
to that commercial solver was causing us these delays.
link |
00:54:09.840
So what we did was we basically wrote,
link |
00:54:12.740
we called it fast QP at the time.
link |
00:54:15.360
We wrote a very lightweight, very fast layer,
link |
00:54:18.480
which would basically check if nearby solutions
link |
00:54:22.120
to the quadratic program,
link |
00:54:24.240
which were very easily checked,
link |
00:54:26.560
could stabilize the robot.
link |
00:54:28.000
And if they couldn't, we would fall back to the solver.
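The warm-start-plus-fallback scheme described here can be sketched for a convex QP, minimize ½xᵀQx + cᵀx subject to Ax ≤ b. This is an illustrative toy, not the team's actual fastQP: `solve_kkt`, the brute-force fallback, and all names are assumptions of the sketch. The fast path is a single linear solve that reuses the previous time step's active constraint set and simply checks the KKT conditions.

```python
import numpy as np
from itertools import combinations

def solve_kkt(Q, c, A, b, active):
    """One linear solve: minimize 0.5 x'Qx + c'x holding 'active' rows of Ax<=b tight."""
    n, m = Q.shape[0], len(active)
    if m == 0:
        return np.linalg.solve(Q, -c), np.zeros(0)
    Aa = A[active]
    K = np.block([[Q, Aa.T], [Aa, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b[active]]))
    return sol[:n], sol[n:]           # primal solution, multipliers for active rows

def fast_qp(Q, c, A, b, prev_active, tol=1e-8):
    """Try the previous time step's active set first; fall back only if it fails."""
    x, lam = solve_kkt(Q, c, A, b, prev_active)
    if np.all(A @ x <= b + tol) and np.all(lam >= -tol):
        return x, prev_active         # fast path: KKT conditions still hold
    # Slow path: enumerate active sets (a stand-in for a general-purpose solver).
    for k in range(len(b) + 1):
        for active in combinations(range(len(b)), k):
            try:
                x, lam = solve_kkt(Q, c, A, b, list(active))
            except np.linalg.LinAlgError:
                continue
            if np.all(A @ x <= b + tol) and np.all(lam >= -tol):
                return x, list(active)  # a KKT point of a convex QP is optimal
    raise ValueError("QP infeasible")
```

On a robot running fast enough, the fast path dominates, since consecutive control problems are nearly identical; the expensive fallback only fires when the contact configuration changes.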
link |
00:54:30.720
You couldn't really test this well, right?
link |
00:54:33.120
Or like?
link |
00:54:33.960
I mean, so we always knew that if we fell back to,
link |
00:54:37.360
if we, it got to the point where if for some reason
link |
00:54:40.440
things slowed down and we fell back to the original solver,
link |
00:54:42.840
the robot would actually literally fall down.
link |
00:54:46.040
So it was a harrowing sort of
link |
00:54:49.360
ledge we were sort of on.
link |
00:54:51.200
But I mean, it actually,
link |
00:54:53.200
like the 400 pound humanoid could come crashing to the ground
link |
00:54:55.840
if your solver's not fast enough.
link |
00:54:58.880
But you know, we had lots of good experiences.
link |
00:55:01.900
So can I ask you a weird question I get
link |
00:55:06.640
about idea of hard work?
link |
00:55:09.440
So actually people, like students of yours
link |
00:55:14.320
that I've interacted with and just,
link |
00:55:17.040
and robotics people in general,
link |
00:55:19.400
but they have moments,
link |
00:55:23.400
at moments have worked harder than most people I know
link |
00:55:28.360
in terms of, if you look at different disciplines
link |
00:55:30.600
of how hard people work.
link |
00:55:32.360
But they're also like the happiest.
link |
00:55:34.560
Like, just like, I don't know.
link |
00:55:37.000
It's the same thing with like running.
link |
00:55:39.200
People that push themselves to like the limit,
link |
00:55:41.380
they also seem to be like the most like full of life
link |
00:55:44.760
somehow.
link |
00:55:46.720
And I get often criticized like,
link |
00:55:48.680
you're not getting enough sleep.
link |
00:55:50.420
What are you doing to your body?
link |
00:55:52.000
Blah, blah, blah, like this kind of stuff.
link |
00:55:54.680
And I usually just kind of respond like,
link |
00:55:58.040
I'm doing what I love.
link |
00:55:59.720
I'm passionate about it.
link |
00:56:00.920
I love it.
link |
00:56:01.760
I feel like it's, it's invigorating.
link |
00:56:04.800
I actually think, I don't think the lack of sleep
link |
00:56:07.640
is what hurts you.
link |
00:56:08.860
I think what hurts you is stress and lack of doing things
link |
00:56:12.040
that you're passionate about.
link |
00:56:13.280
But in this world, yeah, I mean,
link |
00:56:14.920
can you comment about why the heck robotics people
link |
00:56:20.720
are willing to push themselves to that degree?
link |
00:56:26.200
Is there value in that?
link |
00:56:27.680
And why are they so happy?
link |
00:56:30.360
I think, I think you got it right.
link |
00:56:31.920
I mean, I think the causality is not that we work hard.
link |
00:56:36.440
And I think other disciplines work very hard too,
link |
00:56:38.500
but it's, I don't think it's that we work hard
link |
00:56:40.300
and therefore we are happy.
link |
00:56:43.160
I think we found something
link |
00:56:44.700
that we're truly passionate about.
link |
00:56:48.080
It makes us very happy.
link |
00:56:49.960
And then we get a little involved with it
link |
00:56:52.280
and spend a lot of time on it.
link |
00:56:54.600
What a luxury to have something
link |
00:56:55.980
that you wanna spend all your time on, right?
link |
00:56:59.140
We could talk about this for many hours,
link |
00:57:00.800
but maybe if we could pick,
link |
00:57:03.880
is there something on the technical side
link |
00:57:05.480
on the approach that you took that's interesting
link |
00:57:08.260
that turned out to be a terrible failure
link |
00:57:10.240
or a success that you carry into your work today
link |
00:57:13.800
about all the different ideas that were involved
link |
00:57:17.260
in making, whether in the simulation or in the real world,
link |
00:57:23.400
making this semi autonomous system work?
link |
00:57:25.520
I mean, it really did teach me something fundamental
link |
00:57:30.880
about what it's gonna take to get robustness
link |
00:57:33.560
out of a system of this complexity.
link |
00:57:35.320
I would say the DARPA challenge
link |
00:57:37.720
really was foundational in my thinking.
link |
00:57:41.040
I think the autonomous driving community thinks about this.
link |
00:57:43.720
I think lots of people thinking
link |
00:57:45.580
about safety critical systems
link |
00:57:47.080
that might have machine learning in the loop
link |
00:57:48.920
are thinking about these questions.
link |
00:57:50.360
For me, the DARPA challenge was the moment
link |
00:57:53.340
where I realized we've spent every waking minute
link |
00:57:57.480
running this robot.
link |
00:57:58.920
And again, for the physical competition,
link |
00:58:01.440
days before the competition,
link |
00:58:02.540
we saw the robot fall down in a way
link |
00:58:04.440
it had never fallen down before.
link |
00:58:05.980
I thought, how could we have found that?
link |
00:58:10.520
We only have one robot, it's running almost all the time.
link |
00:58:13.600
We just didn't have enough hours in the day
link |
00:58:15.560
to test that robot.
link |
00:58:17.120
Something has to change, right?
link |
00:58:19.380
And then I think that, I mean,
link |
00:58:21.080
I would say that the team that won,
link |
00:58:24.880
from KAIST, was the team that had two robots
link |
00:58:28.020
and was able to do not only incredible engineering,
link |
00:58:30.560
just absolutely top rate engineering,
link |
00:58:33.240
but also they were able to test at a rate
link |
00:58:36.080
and discipline that we didn't keep up with.
link |
00:58:39.600
What does testing look like?
link |
00:58:41.120
What are we talking about here?
link |
00:58:42.280
Like, what's a loop of tests?
link |
00:58:45.000
Like from start to finish, what is a loop of testing?
link |
00:58:48.800
Yeah, I mean, I think there's a whole philosophy to testing.
link |
00:58:51.880
There's the unit tests, and you can do that on a hardware,
link |
00:58:54.440
you can do that in a small piece of code.
link |
00:58:56.360
You write one function, you should write a test
link |
00:58:58.280
that checks that function's input and outputs.
link |
00:59:00.620
You should also write an integration test
link |
00:59:02.440
at the other extreme of running the whole system together,
link |
00:59:05.320
where they try to turn on all of the different functions
link |
00:59:09.120
that you think are correct.
link |
00:59:11.560
It's much harder to write the specifications
link |
00:59:13.400
for a system level test,
link |
00:59:14.520
especially if that system is as complicated
link |
00:59:17.360
as a humanoid robot.
link |
00:59:18.460
But the philosophy is sort of the same.
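The unit-versus-integration distinction described here can be made concrete with a toy controller; the PD control law and the unit-mass plant below are hypothetical stand-ins, not real robot code.

```python
def pd_torque(q, q_des, qd, kp=10.0, kd=1.0):
    """Unit under test: one PD control law, a pure function of its inputs."""
    return kp * (q_des - q) - kd * qd

def test_unit_pd_torque():
    # Unit test: pin down a single function's input/output contract.
    assert pd_torque(q=0.0, q_des=1.0, qd=0.0) == 10.0
    assert pd_torque(q=1.0, q_des=1.0, qd=0.0) == 0.0   # no error -> no torque

def test_integration_closed_loop():
    # Integration test: run the whole loop (controller + toy plant) together
    # and check a system-level spec: the state converges near the setpoint.
    q, qd, dt = 0.0, 0.0, 0.01
    for _ in range(2000):
        qd += pd_torque(q, 1.0, qd) * dt   # unit-mass double integrator
        q += qd * dt
    assert abs(q - 1.0) < 0.05
```

Notice how much vaguer the integration test's specification is ("converges near the setpoint") compared with the unit test's exact values; that gap is exactly what makes system-level specs hard to write for something as complicated as a humanoid.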
link |
00:59:21.040
On the real robot, it's no different,
link |
00:59:24.160
but on a real robot,
link |
00:59:26.040
it's impossible to run the same experiment twice.
link |
00:59:28.640
So if you see a failure,
link |
00:59:32.480
you hope you caught something in the logs
link |
00:59:34.380
that tell you what happened,
link |
00:59:35.620
but you'd probably never be able to run
link |
00:59:36.920
exactly that experiment again.
link |
00:59:39.400
And right now, I think our philosophy is just,
link |
00:59:45.720
basically Monte Carlo estimation,
link |
00:59:47.880
is just run as many experiments as we can,
link |
00:59:50.880
maybe try to set up the environment
link |
00:59:53.080
to make the things we are worried about happen
link |
00:59:58.120
as often as possible.
link |
00:59:59.880
But really we're relying on somewhat random search
link |
01:00:02.280
in order to test.
link |
01:00:04.220
Maybe that's all we'll ever be able to,
link |
01:00:05.480
but I think, you know,
link |
01:00:07.320
cause there's an argument that the things that'll get you
link |
01:00:10.520
are the things that are really nuanced in the world.
link |
01:00:14.040
And they'd be very hard to, for instance,
link |
01:00:15.700
put back in a simulation.
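The "run as many experiments as we can" philosophy described here is Monte Carlo estimation of a failure probability. A minimal sketch, with `simulate` as a hypothetical stand-in for one randomized trial (not a real robot interface):

```python
import math
import random

def simulate(rng):
    """Hypothetical trial: returns True if the (toy) system failed."""
    disturbance = rng.gauss(0.0, 1.0)
    return abs(disturbance) > 2.5        # toy failure condition, ~1.2% of trials

def estimate_failure_rate(n_trials, seed=0):
    """Estimate P(failure) by repeated randomized trials."""
    rng = random.Random(seed)
    failures = sum(simulate(rng) for _ in range(n_trials))
    p_hat = failures / n_trials
    # Normal-approximation 95% half-width; crude but standard for large n.
    half_width = 1.96 * math.sqrt(max(p_hat * (1 - p_hat), 1e-12) / n_trials)
    return p_hat, half_width
```

The confidence half-width shrinks like 1/√n, which is why rare edge cases are so expensive to find by random testing; biasing the environment toward the conditions you are worried about, as described above, is essentially importance sampling.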
link |
01:00:16.880
Yeah, I guess the edge cases.
link |
01:00:19.880
What was the hardest thing?
link |
01:00:21.840
Like, so you said walking over rough terrain,
link |
01:00:24.680
like just taking footsteps.
link |
01:00:27.120
I mean, people, it's so dramatic and painful
link |
01:00:31.360
in a certain kind of way to watch these videos
link |
01:00:33.520
from the DRC of robots falling.
link |
01:00:37.600
Yep.
link |
01:00:38.440
It's just so heartbreaking.
link |
01:00:39.440
I don't know.
link |
01:00:40.280
Maybe it's because for me at least,
link |
01:00:42.400
we anthropomorphize the robot.
link |
01:00:45.120
Of course, it's also funny for some reason,
link |
01:00:48.400
like humans falling is funny for, I don't,
link |
01:00:51.920
it's some dark reason.
link |
01:00:53.400
I'm not sure why it is so,
link |
01:00:55.300
but it's also like tragic and painful.
link |
01:00:57.880
And so speaking of which, I mean,
link |
01:01:00.380
what made the robots fall and fail in your view?
link |
01:01:05.000
So I can tell you exactly what happened on our,
link |
01:01:06.960
we, I contributed one of those.
link |
01:01:08.360
Our team contributed one of those spectacular falls.
link |
01:01:10.960
Every one of those falls has a complicated story.
link |
01:01:15.560
I mean, at one time,
link |
01:01:16.920
the power effectively went out on the robot
link |
01:01:20.200
because it had been sitting at the door
link |
01:01:21.720
waiting for a green light to be able to proceed
link |
01:01:24.400
and its batteries, you know,
link |
01:01:26.280
and therefore it just fell backwards
link |
01:01:28.080
and smashed its head against the ground.
link |
01:01:29.280
And it was hilarious,
link |
01:01:30.120
but it wasn't because of bad software, right?
link |
01:01:34.100
But for ours, so the hardest part of the challenge,
link |
01:01:37.120
the hardest task in my view was getting out of the Polaris.
link |
01:01:40.400
It was actually relatively easy to drive the Polaris.
link |
01:01:43.760
Can you tell the story?
link |
01:01:44.600
Sorry to interrupt.
link |
01:01:45.440
The story of the car.
link |
01:01:50.040
People should watch this video.
link |
01:01:51.240
I mean, the thing you've come up with is just brilliant,
link |
01:01:53.900
but anyway, sorry, what's...
link |
01:01:55.920
Yeah, we kind of joke.
link |
01:01:56.920
We call it the big robot, little car problem
link |
01:01:59.040
because somehow the race organizers decided
link |
01:02:03.440
to give us a 400 pound humanoid.
link |
01:02:05.360
And then they also provided the vehicle,
link |
01:02:07.480
which was a little Polaris.
link |
01:02:08.640
And the robot didn't really fit in the car.
link |
01:02:11.760
So you couldn't drive the car with your feet
link |
01:02:14.520
under the steering column.
link |
01:02:15.720
We actually had to straddle the main column of the,
link |
01:02:21.280
and have basically one foot in the passenger seat,
link |
01:02:23.580
one foot in the driver's seat,
link |
01:02:25.280
and then drive with our left hand.
link |
01:02:28.880
But the hard part was we had to then park the car,
link |
01:02:31.300
get out of the car.
link |
01:02:33.080
It didn't have a door, that was okay.
link |
01:02:34.320
But it's just getting up from crouched, from sitting,
link |
01:02:38.720
when you're in this very constrained environment.
link |
01:02:41.880
First of all, I remember after watching those videos,
link |
01:02:44.320
I was much more cognizant of how hard it is for me
link |
01:02:47.840
to get in and out of the car,
link |
01:02:49.600
and out of the car, especially.
link |
01:02:51.760
It's actually a really difficult control problem.
link |
01:02:54.240
Yeah.
link |
01:02:55.480
I'm very cognizant of it when I'm like injured
link |
01:02:58.360
for whatever reason.
link |
01:02:59.200
Oh, that's really hard.
link |
01:03:00.120
Yeah.
link |
01:03:01.440
So how did you approach this problem?
link |
01:03:03.560
So we had, you think of NASA's operations,
link |
01:03:08.160
and they have these checklists,
link |
01:03:09.800
prelaunch checklists and the like.
link |
01:03:11.080
We weren't far off from that.
link |
01:03:12.380
We had this big checklist.
link |
01:03:13.500
And on the first day of the competition,
link |
01:03:16.320
we were running down our checklist.
link |
01:03:17.520
And one of the things we had to do,
link |
01:03:19.120
we had to turn off the controller,
link |
01:03:21.320
the piece of software that was running
link |
01:03:23.320
that would drive the left foot of the robot
link |
01:03:25.560
in order to accelerate on the gas.
link |
01:03:28.120
And then we turned on our balancing controller.
link |
01:03:30.840
And the nerves, jitters of the first day of the competition,
link |
01:03:34.280
someone forgot to check that box
link |
01:03:35.660
and turn that controller off.
link |
01:03:37.560
So we used a lot of motion planning
link |
01:03:40.880
to figure out a sort of configuration of the robot
link |
01:03:45.320
that we could get up and over.
link |
01:03:47.200
We relied heavily on our balancing controller.
link |
01:03:50.320
And basically, when the robot was in one
link |
01:03:53.760
of its most precarious sort of configurations,
link |
01:03:57.560
trying to sneak its big leg out of the side,
link |
01:04:01.800
the other controller that thought it was still driving
link |
01:04:05.000
told its left foot to go like this.
link |
01:04:06.760
And that wasn't good.
link |
01:04:11.000
But it turned disastrous for us
link |
01:04:13.320
because what happened was a little bit of push here.
link |
01:04:16.980
Actually, we have videos of us running into the robot
link |
01:04:21.080
with a 10 foot pole and it kind of will recover.
link |
01:04:24.680
But this is a case where there's no space to recover.
link |
01:04:27.800
So a lot of our secondary balancing mechanisms
link |
01:04:30.180
about like take a step to recover,
link |
01:04:32.160
they were all disabled because we were in the car
link |
01:04:33.760
and there was no place to step.
link |
01:04:35.320
So we were relying on our just lowest level reflexes.
link |
01:04:38.380
And even then, I think just hitting the foot on the seat,
link |
01:04:42.200
on the floor, we probably could have recovered from it.
link |
01:04:44.960
But the thing that was bad that happened
link |
01:04:46.400
is when we did that and we jostled a little bit,
link |
01:04:49.440
the tailbone of our robot was only a little off the seat,
link |
01:04:53.720
it hit the seat.
link |
01:04:55.480
And the other foot came off the ground just a little bit.
link |
01:04:58.260
And nothing in our plans had ever told us what to do
link |
01:05:02.280
if your butt's on the seat and your feet are in the air.
link |
01:05:05.120
Feet in the air.
link |
01:05:06.040
And then the thing is once you get off the script,
link |
01:05:10.080
things can go very wrong
link |
01:05:11.040
because even our state estimation,
link |
01:05:12.760
our system that was trying to collect all the data
link |
01:05:15.200
from the sensors and understand
link |
01:05:16.760
what's happening with the robot,
link |
01:05:18.480
it didn't know about this situation.
link |
01:05:20.080
So it was predicting things that were just wrong.
link |
01:05:22.800
And then we did a violent shake and fell off
link |
01:05:26.560
face first out of the vehicle.
link |
01:05:29.180
But like into the destination.
link |
01:05:32.520
That's true, we fell in, we got our point for egress.
link |
01:05:36.320
But so is there any hope for, that's interesting,
link |
01:05:39.280
is there any hope for Atlas to be able to do something
link |
01:05:43.280
when it's just on its butt and feet in the air?
link |
01:05:46.320
Absolutely.
link |
01:05:47.200
So you can, what do you?
link |
01:05:48.520
No, so that is one of the big challenges.
link |
01:05:50.920
And I think it's still true, you know,
link |
01:05:53.840
Boston Dynamics and ANYmal and there's this incredible work
link |
01:05:59.120
on legged robots happening around the world.
link |
01:06:04.540
Most of them still are very good at the case
link |
01:06:07.620
where you're making contact with the world at your feet.
link |
01:06:10.080
And they have typically point feet relatively,
link |
01:06:12.200
they have balls on their feet, for instance.
link |
01:06:14.480
If those robots get in a situation
link |
01:06:16.600
where the elbow hits the wall or something like this,
link |
01:06:19.880
that's a pretty different situation.
link |
01:06:21.240
Now they have layers of mechanisms that will make,
link |
01:06:24.080
I think the more mature solutions have ways
link |
01:06:27.680
in which the controller won't do stupid things.
link |
01:06:31.240
But a human, for instance, is able to leverage
link |
01:06:34.720
incidental contact in order to accomplish a goal.
link |
01:06:36.760
In fact, I might, if you push me,
link |
01:06:37.800
I might actually put my hand out
link |
01:06:39.720
and make a new brand new contact.
link |
01:06:42.220
The feet of the robot are doing this on quadrupeds,
link |
01:06:44.940
but we mostly in robotics are afraid of contact
link |
01:06:49.120
on the rest of our body, which is crazy.
link |
01:06:53.180
There's this whole field of motion planning,
link |
01:06:56.040
collision free motion planning.
link |
01:06:58.040
And we write very complex algorithms
link |
01:06:59.800
so that the robot can dance around
link |
01:07:01.640
and make sure it doesn't touch the world.
link |
01:07:05.840
So people are just afraid of contact
link |
01:07:07.720
because contact sensing is difficult.
link |
01:07:09.880
It's still a difficult control problem and sensing problem.
link |
01:07:13.380
Now you're a serious person, I'm a little bit of an idiot
link |
01:07:21.180
and I'm going to ask you some dumb questions.
link |
01:07:24.140
So I do martial arts.
link |
01:07:27.140
So like jiu jitsu, I wrestled my whole life.
link |
01:07:30.380
So let me ask the question, like whenever people learn
link |
01:07:35.380
that I do any kind of AI or like I mentioned robots
link |
01:07:38.500
and things like that, they say,
link |
01:07:40.040
when are we going to have robots that can win
link |
01:07:45.020
in a wrestling match or in a fight against a human?
link |
01:07:49.880
So we just mentioned sitting on your butt,
link |
01:07:52.160
if you're in the air, that's a common position.
link |
01:07:53.940
Jiu jitsu, when you're on the ground,
link |
01:07:55.420
you're a downed opponent.
link |
01:07:59.100
Like how difficult do you think is the problem?
link |
01:08:03.800
And when will we have a robot that can defeat a human
link |
01:08:06.880
in a wrestling match?
link |
01:08:08.580
And we're talking about a lot, like, I don't know
link |
01:08:11.100
if you're familiar with wrestling, but essentially.
link |
01:08:15.340
Not very.
link |
01:08:16.180
It's basically the art of contact.
link |
01:08:19.580
It's like, it's because you're picking contact points
link |
01:08:24.580
and then using like leverage like to off balance
link |
01:08:29.300
to trick people, like you make them feel
link |
01:08:33.940
like you're doing one thing
link |
01:08:35.620
and then they change their balance
link |
01:08:38.840
and then you switch what you're doing
link |
01:08:41.620
and then results in a throw or whatever.
link |
01:08:44.100
So like, it's basically the art of multiple contacts.
link |
01:08:48.540
So.
link |
01:08:49.380
Awesome, that's a nice description of it.
link |
01:08:50.820
So there's also an opponent in there, right?
link |
01:08:53.040
So if.
link |
01:08:54.180
Very dynamic.
link |
01:08:55.060
Right, if you are wrestling a human
link |
01:08:58.520
and are in a game theoretic situation with a human,
link |
01:09:02.900
that's still hard, but just to speak to the, you know,
link |
01:09:08.220
quickly reasoning about contact part of it, for instance.
link |
01:09:11.340
Yeah, maybe even throwing the game theory out of it,
link |
01:09:13.380
almost like, yeah, almost like a non dynamic opponent.
link |
01:09:17.700
Right, there's reasons to be optimistic,
link |
01:09:20.060
but I think our best understanding of those problems
link |
01:09:22.660
are still pretty hard.
link |
01:09:24.820
I have been increasingly focused on manipulation,
link |
01:09:29.860
partly where that's a case where the contact
link |
01:09:31.720
has to be much more rich.
link |
01:09:35.800
And there are some really impressive examples
link |
01:09:38.260
of deep learning policies, controllers
link |
01:09:41.820
that can appear to do good things through contact.
link |
01:09:47.860
We've even got new examples of, you know,
link |
01:09:51.380
deep learning models of predicting what's gonna happen
link |
01:09:53.940
to objects as they go through contact.
link |
01:09:56.220
But I think the challenge you just offered there
link |
01:09:59.780
still eludes us, right?
link |
01:10:01.500
The ability to make a decision
link |
01:10:03.620
based on those models quickly.
link |
01:10:07.560
You know, I have to think though, it's hard for humans too,
link |
01:10:10.140
when you get that complicated.
link |
01:10:11.380
I think probably you had maybe a slow motion version
link |
01:10:16.100
of where you learned the basic skills
link |
01:10:17.980
and you've probably gotten better at it
link |
01:10:20.700
and it's much more subtle to you.
link |
01:10:24.660
But it might still be hard to actually, you know,
link |
01:10:27.940
really on the fly take a, you know, model of your humanoid
link |
01:10:32.140
and figure out how to plan the optimal sequence.
link |
01:10:35.260
That might be a problem we never solve.
link |
01:10:36.660
Well, the, I mean, one of the most amazing things to me
link |
01:10:40.360
about the, we can talk about martial arts.
link |
01:10:43.740
We could also talk about dancing.
link |
01:10:45.340
Doesn't really matter.
link |
01:10:46.740
Two humans, I think, is the most interesting study
link |
01:10:50.540
of contact.
link |
01:10:51.380
It's not even the dynamic element of it.
link |
01:10:53.040
It's the, like when you get good at it, it's so effortless.
link |
01:10:58.740
Like I can just, I'm very cognizant
link |
01:11:00.900
of the entirety of the learning process
link |
01:11:03.380
being essentially like learning how to move my body
link |
01:11:07.660
in a way that I could throw very large weights
link |
01:11:12.220
around effortlessly, like, and I can feel the learning.
link |
01:11:18.500
Like I'm a huge believer in drilling of techniques
link |
01:11:21.540
and you can just like feel your, I don't,
link |
01:11:23.580
you're not feeling, you're feeling, sorry,
link |
01:11:26.780
you're learning it intellectually a little bit,
link |
01:11:29.800
but a lot of it is the body learning it somehow,
link |
01:11:32.820
like instinctually and whatever that learning is,
link |
01:11:36.100
that's really, I'm not even sure if that's equivalent
link |
01:11:40.780
to like a deep learning, learning a controller.
link |
01:11:44.760
I think it's something more,
link |
01:11:46.820
it feels like there's a lot of distributed learning
link |
01:11:49.720
going on.
link |
01:11:50.560
Yeah, I think there's hierarchy and composition
link |
01:11:56.440
probably in the systems that we don't capture very well yet.
link |
01:12:00.840
You have layers of control systems.
link |
01:12:02.440
You have reflexes at the bottom layer
link |
01:12:03.960
and you have a system that's capable
link |
01:12:07.440
of planning a vacation to some distant country,
link |
01:12:11.320
which is probably, you probably don't have a controller,
link |
01:12:14.240
a policy for every possible destination you'll ever pick.
link |
01:12:18.260
Right?
link |
01:12:20.380
But there's something magical in the in between
link |
01:12:23.460
and how do you go from these low level feedback loops
link |
01:12:26.340
to something that feels like a pretty complex
link |
01:12:30.020
set of outcomes.
link |
01:12:32.740
You know, my guess is, I think there's evidence
link |
01:12:34.760
that you can plan at some of these levels, right?
link |
01:12:37.620
So Josh Tenenbaum just showed it in his talk the other day.
link |
01:12:41.740
He's got a game he likes to talk about.
link |
01:12:43.320
I think he calls it the pick three game or something,
link |
01:12:46.700
where he puts a bunch of clutter down in front of a person
link |
01:12:50.740
and he says, okay, pick three objects.
link |
01:12:52.380
And it might be a telephone or a shoe
link |
01:12:55.700
or a Kleenex box or whatever.
link |
01:12:59.880
And apparently you pick three items and then you pick,
link |
01:13:01.820
he says, okay, pick the first one up with your right hand,
link |
01:13:04.100
the second one up with your left hand.
link |
01:13:06.360
Now using those objects, now as tools,
link |
01:13:08.860
pick up the third object.
link |
01:13:11.060
Right, so that's down at the level of physics
link |
01:13:15.700
and mechanics and contact mechanics
link |
01:13:17.140
that I think we do learning or we do have policies for,
link |
01:13:21.880
we do control for, almost feedback,
link |
01:13:24.740
but somehow we're able to still,
link |
01:13:26.300
I mean, I've never picked up a telephone
link |
01:13:28.420
with a shoe and a water bottle before.
link |
01:13:30.220
And somehow, and it takes me a little longer to do that
link |
01:13:33.140
the first time, but most of the time
link |
01:13:35.180
we can sort of figure that out.
link |
01:13:37.260
So yeah, I think the amazing thing is this ability
link |
01:13:41.940
to be flexible with our models,
link |
01:13:44.100
plan when we need to, use our well oiled controllers
link |
01:13:48.700
when we don't, when we're in familiar territory.
link |
01:13:53.280
Having models, I think the other thing you just said
link |
01:13:55.560
was something about, I think your awareness
link |
01:13:58.140
of what's happening is even changing
link |
01:13:59.860
as you improve your expertise, right?
link |
01:14:02.380
So maybe you have a very approximate model
link |
01:14:04.980
of the mechanics to begin with.
link |
01:14:06.240
And as you gain expertise,
link |
01:14:09.300
you get a more refined version of that model.
link |
01:14:11.920
You're aware of muscles or balance components
link |
01:14:17.100
that you just weren't even aware of before.
link |
01:14:19.700
So how do you scaffold that?
link |
01:14:21.740
Yeah, plus the fear of injury,
link |
01:14:24.180
the ambition of goals, of excelling,
link |
01:14:28.780
and fear of mortality.
link |
01:14:32.020
Let's see, what else is in there?
link |
01:14:33.340
As the motivations, overinflated ego in the beginning,
link |
01:14:38.040
and then a crash of confidence in the middle.
link |
01:14:42.900
All of those seem to be essential for the learning process.
link |
01:14:46.700
And if all that's good,
link |
01:14:48.140
then you're probably optimizing energy efficiency.
link |
01:14:50.500
Yeah, right, so we have to get that right.
link |
01:14:53.080
So there was this idea that you would have robots
link |
01:14:58.580
play soccer better than human players by 2050.
link |
01:15:03.780
That was the goal.
link |
01:15:05.300
Basically, it was the goal to beat a world champion team,
link |
01:15:10.140
to beat, like, a World Cup level team.
link |
01:15:13.340
So are we gonna see that first?
link |
01:15:15.900
Or a robot, if you're familiar,
link |
01:15:19.580
there's an organization called UFC for mixed martial arts.
link |
01:15:23.440
Are we gonna see a World Cup championship soccer team
link |
01:15:27.100
that have robots, or a UFC champion mixed martial artist
link |
01:15:32.660
as a robot?
link |
01:15:33.860
I mean, it's very hard to say one thing is harder,
link |
01:15:37.140
some problem is harder than the other.
link |
01:15:38.580
What probably matters is who started the organization that,
link |
01:15:44.980
I mean, I think RoboCup has a pretty serious following,
link |
01:15:47.140
and there is a history now of people playing that game,
link |
01:15:50.860
learning about that game, building robots to play that game,
link |
01:15:53.620
building increasingly more humanlike robots.
link |
01:15:55.820
It's got momentum.
link |
01:15:57.020
So if you want to have mixed martial arts compete,
link |
01:16:00.900
you better start your organization now, right?
link |
01:16:05.460
I think almost independent of which problem
link |
01:16:07.740
is technically harder,
link |
01:16:08.660
because they're both hard and they're both different.
link |
01:16:11.400
That's a good point.
link |
01:16:12.240
I mean, those videos are just hilarious,
link |
01:16:14.700
like especially the humanoid robots
link |
01:16:17.140
trying to play soccer.
link |
01:16:21.260
I mean, they're kind of terrible right now.
link |
01:16:23.420
I mean, I guess there is robo sumo wrestling.
link |
01:16:26.020
There's like the Robo-One competitions,
link |
01:16:28.740
where they do have these robots that go on the table
link |
01:16:31.140
and basically fight.
link |
01:16:32.100
So maybe I'm wrong, maybe.
link |
01:16:33.720
First of all, do you have a year in mind for RoboCup,
link |
01:16:37.140
just from a robotics perspective?
link |
01:16:39.100
Seems like a super exciting possibility
link |
01:16:42.060
that like in the physical space,
link |
01:16:46.340
this is what's interesting.
link |
01:16:47.620
I think the world is captivated.
link |
01:16:50.560
I think it's really exciting.
link |
01:16:52.620
It inspires just a huge number of people
link |
01:16:56.400
when a machine beats a human at a game
link |
01:17:01.460
that humans are really damn good at.
link |
01:17:03.460
So you're talking about chess and go,
link |
01:17:05.740
but that's in the world of digital.
link |
01:17:09.820
I don't think machines have beat humans
link |
01:17:13.320
at a game in the physical space yet,
link |
01:17:16.020
but that would be just.
link |
01:17:17.700
You have to make the rules very carefully, right?
link |
01:17:20.340
I mean, if Atlas kicked me in the shins, I'm down
link |
01:17:22.980
and game over.
link |
01:17:25.440
So it's very subtle on what's fair.
link |
01:17:31.220
I think the fighting one is a weird one.
link |
01:17:33.020
Yeah, because you're talking about a machine
link |
01:17:35.180
that's much stronger than you.
link |
01:17:36.500
But yeah, in terms of soccer, basketball, all those kinds.
link |
01:17:39.740
Even soccer, right?
link |
01:17:40.580
I mean, as soon as there's contact or whatever,
link |
01:17:43.500
and there are some things that the robot will do better.
link |
01:17:46.540
I think if you really set yourself up to try to see
link |
01:17:51.540
could robots win the game of soccer
link |
01:17:53.140
as the rules were written, the right thing
link |
01:17:56.300
for the robot to do is to play very differently
link |
01:17:58.060
than a human would play.
link |
01:17:59.680
You're not gonna get the perfect soccer player robot.
link |
01:18:04.060
You're gonna get something that exploits the rules,
link |
01:18:07.900
exploits its super actuators, its super high bandwidth
link |
01:18:13.420
feedback loops or whatever, and it's gonna play the game
link |
01:18:15.340
differently than you want it to play.
link |
01:18:17.540
And I bet there's ways, I bet there's loopholes, right?
link |
01:18:21.380
We saw that in the DARPA challenge that it's very hard
link |
01:18:27.060
to write a set of rules that someone can't find
link |
01:18:30.660
a way to exploit.
link |
01:18:32.860
Let me ask another ridiculous question.
link |
01:18:35.020
I think this might be the last ridiculous question,
link |
01:18:37.980
but I doubt it.
link |
01:18:39.220
I aspire to ask as many ridiculous questions
link |
01:18:44.540
of a brilliant MIT professor.
link |
01:18:48.060
Okay, I don't know if you've seen the Black Mirror episode.
link |
01:18:53.660
It's funny, I never watched the episode.
link |
01:18:56.740
I know when it happened though, because I gave a talk
link |
01:19:00.620
to some MIT faculty one day, on an unassuming Monday
link |
01:19:05.380
or whatever, telling them about the state of robotics.
link |
01:19:08.500
And I showed some video from Boston Dynamics
link |
01:19:10.740
of the quadruped spot at the time.
link |
01:19:13.940
It was the early version of spot.
link |
01:19:15.900
And there was a look of horror that went across the room.
link |
01:19:19.300
And I said, I've shown videos like this a lot of times,
link |
01:19:23.220
what happened?
link |
01:19:24.060
And it turns out that this video had gone,
link |
01:19:26.780
this Black Mirror episode had changed
link |
01:19:28.380
the way people watched the videos I was putting out.
link |
01:19:33.180
The way they see these kinds of robots.
link |
01:19:34.740
So I talked to so many people who are just terrified
link |
01:19:37.780
because of that episode probably of these kinds of robots.
link |
01:19:41.020
I almost wanna say that they almost enjoy being terrified.
link |
01:19:44.540
I don't even know what it is about human psychology
link |
01:19:47.100
that kind of imagine doomsday,
link |
01:19:49.220
the destruction of the universe or our society
link |
01:19:52.780
and kind of like enjoy being afraid.
link |
01:19:57.340
I don't wanna simplify it, but it feels like
link |
01:19:59.300
they talk about it so often.
link |
01:20:01.020
It almost, there does seem to be an addictive quality to it.
link |
01:20:06.380
I talked to a guy, a guy named Joe Rogan,
link |
01:20:09.500
who's kind of the flag bearer
link |
01:20:11.580
for being terrified at these robots.
link |
01:20:14.660
So I have two questions.
link |
01:20:17.340
One, do you have an understanding
link |
01:20:18.620
of why people are afraid of robots?
link |
01:20:21.700
And the second question is, in Black Mirror,
link |
01:20:24.940
just to tell you the episode,
link |
01:20:26.380
I don't even remember it that much anymore,
link |
01:20:28.180
but these robots, I think they can shoot
link |
01:20:31.100
like a pellet or something.
link |
01:20:32.820
They basically have, it's basically a spot with a gun.
link |
01:20:36.540
And how far are we away from having robots
link |
01:20:41.940
that go rogue like that?
link |
01:20:44.100
Basically spot that goes rogue for some reason
link |
01:20:48.460
and somehow finds a gun.
link |
01:20:51.300
Right, so, I mean, I'm not a psychologist.
link |
01:20:56.420
I think, I don't know exactly why
link |
01:20:59.860
people react the way they do.
link |
01:21:01.700
I think we have to be careful about the way robots influence
link |
01:21:06.700
our society and the like.
link |
01:21:07.980
I think that's something, that's a responsibility
link |
01:21:09.860
that roboticists need to embrace.
link |
01:21:13.260
I don't think robots are gonna come after me
link |
01:21:15.460
with a kitchen knife or a pellet gun right away.
link |
01:21:18.460
And I mean, if they were programmed in such a way,
link |
01:21:21.420
but I used to joke with Atlas that all I had to do
link |
01:21:25.940
was run for five minutes and its battery would run out.
link |
01:21:28.340
But actually they've got to be careful
link |
01:21:30.620
and actually they've got a very big battery
link |
01:21:32.460
in there by the end.
link |
01:21:33.300
So it was over an hour.
link |
01:21:37.220
I think the fear is a bit cultural though.
link |
01:21:39.420
Cause I mean, you notice that, like, I think people my age,
link |
01:21:45.140
in the US, we grew up watching Terminator, right?
link |
01:21:48.260
If I had grown up at the same time in Japan,
link |
01:21:50.500
I probably would have been watching Astro Boy.
link |
01:21:52.740
And there's a very different reaction to robots
link |
01:21:55.860
in different countries, right?
link |
01:21:57.460
So I don't know if it's a human innate fear of metal marvels
link |
01:22:02.620
or if it's something that we've done to ourselves
link |
01:22:06.420
with our sci fi.
link |
01:22:09.860
Yeah, the stories we tell ourselves through movies,
link |
01:22:12.580
through just through popular media.
link |
01:22:16.780
But if I were to tell, you know, if you were my therapist
link |
01:22:21.100
and I said, I'm really terrified that we're going
link |
01:22:24.900
to have these robots very soon that will hurt us.
link |
01:22:30.900
Like, how do you approach making me feel better?
link |
01:22:36.620
Like, why shouldn't people be afraid?
link |
01:22:39.580
There's a, I think there's a video
link |
01:22:41.380
that went viral recently.
link |
01:22:44.500
Everything, everything with spot and Boston Dynamics,
link |
01:22:46.900
which goes viral in general.
link |
01:22:48.380
But usually it's like really cool stuff.
link |
01:22:50.060
Like they're doing flips and stuff
link |
01:22:51.420
or like sad stuff, the Atlas being hit with a broomstick
link |
01:22:56.140
or something like that.
link |
01:22:57.300
But there's a video where I think one of the new production
link |
01:23:02.420
spot robots, which are awesome.
link |
01:23:04.620
It was like patrolling somewhere in like in some country.
link |
01:23:08.540
And like people immediately were like saying like,
link |
01:23:11.920
this is like the dystopian future,
link |
01:23:14.580
like the surveillance state.
link |
01:23:16.380
For some reason, like you can just have a camera,
link |
01:23:18.940
like something about spot being able to walk on four feet
link |
01:23:23.420
really terrified people.
link |
01:23:25.940
So like, what do you say to those people?
link |
01:23:31.060
I think there is a legitimate fear there
link |
01:23:33.820
because so much of our future is uncertain.
link |
01:23:37.840
But at the same time, technically speaking,
link |
01:23:40.140
it seems like we're not there yet.
link |
01:23:41.920
So what do you say?
link |
01:23:42.820
I mean, I think technology is complicated.
link |
01:23:48.580
It can be used in many ways.
link |
01:23:49.940
I think there are purely software attacks
link |
01:23:56.360
that somebody could use to do great damage.
link |
01:23:59.000
Maybe they have already, you know,
link |
01:24:01.480
I think wheeled robots could be used in bad ways too.
link |
01:24:08.340
Drones.
link |
01:24:09.180
Drones, right, I don't think that, let's see.
link |
01:24:16.340
I don't want to be building technology
link |
01:24:19.920
just because I'm compelled to build technology
link |
01:24:21.860
and I don't think about it.
link |
01:24:23.580
But I would consider myself a technological optimist,
link |
01:24:27.740
I guess, in the sense that I think we should continue
link |
01:24:32.220
to create and evolve and our world will change.
link |
01:24:37.220
And we will introduce new challenges,
link |
01:24:40.780
we'll screw something up maybe,
link |
01:24:42.900
but I think also we'll invent ourselves
link |
01:24:46.220
out of those challenges and life will go on.
link |
01:24:49.380
So it's interesting because you didn't mention
link |
01:24:51.580
like this is technically too hard.
link |
01:24:54.540
I don't think robots are, I think people attribute
link |
01:24:57.380
a robot that looks like an animal
link |
01:24:59.140
as maybe having a level of self awareness
link |
01:25:02.140
or consciousness or something that they don't have yet.
link |
01:25:05.460
Right, so it's not, I think our ability
link |
01:25:09.380
to anthropomorphize those robots is probably,
link |
01:25:13.700
we're assuming that they have a level of intelligence
link |
01:25:16.540
that they don't yet have.
link |
01:25:17.940
And that might be part of the fear.
link |
01:25:20.060
So in that sense, it's too hard.
link |
01:25:22.260
But, you know, there are many scary things in the world.
link |
01:25:25.540
Right, so I think we're right to ask those questions.
link |
01:25:29.860
We're right to think about the implications of our work.
link |
01:25:33.600
Right, in the short term as we're working on it for sure,
link |
01:25:39.720
is there something long term that scares you
link |
01:25:43.840
about our future with AI and robots?
link |
01:25:47.680
A lot of folks, from Elon Musk to Sam Harris,
link |
01:25:52.400
talk about the existential threats
link |
01:25:56.860
about artificial intelligence.
link |
01:25:58.880
Oftentimes, robots kind of inspire that the most
link |
01:26:03.680
because of the anthropomorphism.
link |
01:26:05.840
Do you have any fears?
link |
01:26:07.400
It's an important question.
link |
01:26:12.120
I actually, I think I like Rod Brooks answer
link |
01:26:14.920
maybe the best on this, I think.
link |
01:26:17.080
And it's not the only answer he's given over the years,
link |
01:26:19.320
but maybe one of my favorites is he says,
link |
01:26:24.360
it's not gonna be, he's got a book,
link |
01:26:25.920
Flesh and Machines, I believe, it's not gonna be
link |
01:26:29.960
the robots versus the people,
link |
01:26:31.880
we're all gonna be robot people.
link |
01:26:34.240
Because, you know, we already have smartphones,
link |
01:26:38.000
some of us have serious technology implanted
link |
01:26:41.120
in our bodies already, whether we have a hearing aid
link |
01:26:43.780
or a pacemaker or anything like this,
link |
01:26:47.800
people with amputations might have prosthetics.
link |
01:26:50.880
And that's a trend I think that is likely to continue.
link |
01:26:57.340
I mean, this is now wild speculation.
link |
01:27:01.420
But I mean, when do we get to cognitive implants
link |
01:27:05.500
and the like, and.
link |
01:27:06.620
Yeah, with neural link, brain computer interfaces,
link |
01:27:09.500
that's interesting.
link |
01:27:10.340
So there's a dance between humans and robots
link |
01:27:12.620
that's going to be, it's going to be impossible
link |
01:27:17.220
to be scared of the other out there, the robot,
link |
01:27:23.380
because the robot will be part of us, essentially.
link |
01:27:26.060
It'd be so intricately sort of part of our society that.
link |
01:27:30.180
Yeah, and it might not even be implanted part of us,
link |
01:27:33.060
but just, it's so much a part of our, yeah, our society.
link |
01:27:37.220
So in that sense, the smartphone is already the robot
link |
01:27:39.380
we should be afraid of, yeah.
link |
01:27:41.660
I mean, yeah, and all the usual fears arise
link |
01:27:45.460
of the misinformation, the manipulation,
link |
01:27:51.860
all those kinds of things that,
link |
01:27:56.180
the problems are all the same.
link |
01:27:57.860
They're human problems, essentially, it feels like.
link |
01:28:00.700
Yeah, I mean, I think the way we interact
link |
01:28:03.420
with each other online is changing the value we put on,
link |
01:28:07.420
you know, personal interaction.
link |
01:28:08.940
And that's a crazy big change that's going to happen
link |
01:28:11.260
and rip through our, has already been ripping
link |
01:28:13.080
through our society, right?
link |
01:28:14.200
And that has implications that are massive.
link |
01:28:18.060
I don't know if we should be scared of it
link |
01:28:19.300
or go with the flow, but I don't see, you know,
link |
01:28:24.700
some battle lines between humans and robots
link |
01:28:26.500
being the first thing to worry about.
link |
01:28:29.580
I mean, I do want to just, as a kind of comment,
link |
01:28:33.340
maybe you can comment about your just feelings
link |
01:28:35.460
about Boston Dynamics in general, but you know,
link |
01:28:38.660
I love science, I love engineering,
link |
01:28:40.300
I think there's so many beautiful ideas in it.
link |
01:28:42.540
And when I look at Boston Dynamics
link |
01:28:45.300
or legged robots in general,
link |
01:28:47.620
I think they inspire people, curiosity and feelings
link |
01:28:54.620
in general, excitement about engineering
link |
01:28:57.460
more than almost anything else in popular culture.
link |
01:29:00.620
And I think that's such an exciting,
link |
01:29:03.660
like responsibility and possibility for robotics.
link |
01:29:06.820
And Boston Dynamics is riding that wave pretty damn well.
link |
01:29:10.460
Like they found it, they've discovered that hunger
link |
01:29:13.980
and curiosity in people, and they're doing magic with it.
link |
01:29:17.540
I don't care if the, I mean, I guess they're a company,
link |
01:29:19.820
they have to make money, right?
link |
01:29:21.340
But they're already doing incredible work
link |
01:29:24.300
and inspiring the world about technology.
link |
01:29:26.940
I mean, do you have thoughts about Boston Dynamics
link |
01:29:30.700
and maybe others, your own work in robotics
link |
01:29:34.620
and inspiring the world in that way?
link |
01:29:36.600
I completely agree, I think Boston Dynamics
link |
01:29:40.240
is absolutely awesome.
link |
01:29:42.640
I think I show my kids those videos, you know,
link |
01:29:46.160
and the best thing that happens is sometimes
link |
01:29:48.640
they've already seen them, you know, right?
link |
01:29:50.740
I think, I just think it's a pinnacle of success
link |
01:29:55.360
in robotics that is just one of the best things
link |
01:29:58.760
that's happened, absolutely completely agree.
link |
01:30:01.660
One of the heartbreaking things to me is how many
link |
01:30:06.220
robotics companies fail, how hard it is to make money
link |
01:30:11.300
with a robotics company.
link |
01:30:13.100
Like iRobot like went through hell just to arrive
link |
01:30:17.220
at a Roomba to figure out one product.
link |
01:30:19.740
And then there's so many home robotics companies
link |
01:30:23.900
like Jibo and Anki. Anki, the cutest toy, a great robot,
link |
01:30:32.720
I thought, went down. I'm forgetting a bunch of them,
link |
01:30:36.320
but a bunch of robotics companies fail,
link |
01:30:37.980
Rod's company, Rethink Robotics.
link |
01:30:42.340
Like, do you have anything hopeful to say
link |
01:30:47.260
about the possibility of making money with robots?
link |
01:30:50.340
Oh, I think you can't just look at the failures.
link |
01:30:54.220
I mean, Boston Dynamics is a success.
link |
01:30:55.940
There's lots of companies that are still doing amazingly
link |
01:30:58.500
good work in robotics.
link |
01:31:01.140
I mean, this is the capitalist ecology or something, right?
link |
01:31:05.360
I think you have many companies, you have many startups
link |
01:31:07.700
and they push each other forward and many of them fail
link |
01:31:11.380
and some of them get through and that's sort of
link |
01:31:13.820
the natural way of those things.
link |
01:31:17.040
I don't know that robotics is really that much worse.
link |
01:31:20.460
I feel the pain that you feel too.
link |
01:31:22.300
Every time I read one of these, sometimes it's friends
link |
01:31:26.480
and I definitely wish it went better or went differently.
link |
01:31:33.580
But I think it's healthy and good to have bursts of ideas,
link |
01:31:38.340
bursts of activities, ideas, if they are really aggressive,
link |
01:31:41.880
they should fail sometimes.
link |
01:31:45.180
Certainly that's the research mantra, right?
link |
01:31:46.940
If you're succeeding at every problem you attempt,
link |
01:31:50.780
then you're not choosing aggressively enough.
link |
01:31:53.380
Is it exciting to you, the new spot?
link |
01:31:55.980
Oh, it's so good.
link |
01:31:57.620
When are you getting one as a pet?
link |
01:32:00.140
Yeah, I mean, I'd have to dig up 75K right now.
link |
01:32:03.220
I mean, it's so cool that there's a price tag,
link |
01:32:05.740
you can go and then actually buy it.
link |
01:32:08.620
I have a Skydio R1, love it.
link |
01:32:11.500
So no, I would absolutely be a customer.
link |
01:32:18.580
I wonder what your kids would think about it.
link |
01:32:20.060
Actually, Zach from Boston Dynamics let my kid drive one
link |
01:32:25.660
in one of their demos one time.
link |
01:32:27.140
And that was just so good, so good.
link |
01:32:31.100
And again, I'll forever be grateful for that.
link |
01:32:34.220
And there's something magical about the anthropomorphization
link |
01:32:37.260
of that arm, it adds another level of human connection.
link |
01:32:42.580
I'm not sure we understand from a control aspect,
link |
01:32:47.480
the value of anthropomorphization.
link |
01:32:51.540
I think that's an understudied
link |
01:32:53.980
and under-understood engineering problem.
link |
01:32:57.060
There's been a, like psychologists have been studying it.
link |
01:33:00.160
I think it's partly that manipulating our mind
link |
01:33:02.860
to believe things is a valuable engineering tool.
link |
01:33:06.740
Like this is another degree of freedom
link |
01:33:08.820
that can be controlled.
link |
01:33:09.820
I like that, yeah, I think that's right.
link |
01:33:11.380
I think there's something that humans seem to do
link |
01:33:16.020
or maybe my dangerous introspection is,
link |
01:33:20.340
I think we are able to make very simple models
link |
01:33:23.820
that assume a lot about the world very quickly.
link |
01:33:27.780
And then it takes us a lot more time. Like your wrestling.
link |
01:33:31.220
You probably thought you knew what you were doing
link |
01:33:33.080
with wrestling and you were fairly functional
link |
01:33:35.340
as a complete wrestler.
link |
01:33:36.900
And then you slowly got more expertise.
link |
01:33:39.340
So maybe it's natural that our first level of defense
link |
01:33:45.740
against seeing a new robot is to think of it
link |
01:33:48.040
in our existing models of how humans and animals behave.
link |
01:33:52.420
And it's just, as you spend more time with it,
link |
01:33:55.060
then you'll develop more sophisticated models
link |
01:33:56.980
that will appreciate the differences.
link |
01:34:00.340
Exactly.
link |
01:34:01.620
Can you say what does it take to control a robot?
link |
01:34:05.700
Like what is the control problem of a robot?
link |
01:34:08.580
And in general, what is a robot in your view?
link |
01:34:10.980
Like how do you think of this system?
link |
01:34:15.020
What is a robot?
link |
01:34:16.020
What is a robot?
link |
01:34:17.580
I think robotics.
link |
01:34:18.400
I told you ridiculous questions.
link |
01:34:20.020
No, no, it's good.
link |
01:34:21.500
I mean, there's standard definitions
link |
01:34:22.980
of combining computation with some ability
link |
01:34:27.460
to do mechanical work.
link |
01:34:29.060
I think that gets us pretty close.
link |
01:34:30.980
But I think robotics has this problem
link |
01:34:34.180
that once things really work,
link |
01:34:37.200
we don't call them robots anymore.
link |
01:34:38.920
Like my dishwasher at home is pretty sophisticated,
link |
01:34:44.100
beautiful mechanisms.
link |
01:34:45.600
There's actually a pretty good computer,
link |
01:34:46.940
probably a couple of chips in there doing amazing things.
link |
01:34:49.580
We don't think of that as a robot anymore,
link |
01:34:51.620
which isn't fair.
link |
01:34:52.460
Which roughly means
link |
01:34:53.940
that robotics always has to solve the next problem
link |
01:34:58.340
and doesn't get to celebrate its past successes.
link |
01:35:00.580
I mean, even factory floor robots
link |
01:35:05.660
are super successful.
link |
01:35:06.860
They're amazing.
link |
01:35:08.260
But those aren't the ones,
link |
01:35:09.500
I mean, people think of them as robots,
link |
01:35:10.880
but they don't,
link |
01:35:11.720
if you ask what are the successes of robotics,
link |
01:35:14.500
somehow they don't come to mind immediately.
link |
01:35:17.860
So the definition of robot is a system
link |
01:35:20.560
with some level of automation that fails frequently.
link |
01:35:23.500
Something like, it's the computation plus mechanical work
link |
01:35:28.420
and an unsolved problem.
link |
01:35:30.540
It's an unsolved problem, yeah.
link |
01:35:32.300
So from a perspective of control and mechanics,
link |
01:35:37.020
dynamics, what is a robot?
link |
01:35:40.700
So there are many different types of robots.
link |
01:35:42.380
The control that you need for a Jibo robot,
link |
01:35:47.620
you know, some robot that's sitting on your countertop
link |
01:35:50.620
and interacting with you, but not touching you,
link |
01:35:53.580
for instance, is very different than what you need
link |
01:35:55.820
for an autonomous car or an autonomous drone.
link |
01:35:59.460
It's very different than what you need for a robot
link |
01:36:01.020
that's gonna walk or pick things up with its hands, right?
link |
01:36:04.740
My passion has always been for the places
link |
01:36:09.140
where you're interacting more,
link |
01:36:10.540
you're doing more dynamic interactions with the world.
link |
01:36:13.700
So walking, now manipulation.
link |
01:36:18.740
And the control problems there are beautiful.
link |
01:36:21.700
I think contact is one thing that differentiates them
link |
01:36:25.940
from many of the control problems we've solved classically,
link |
01:36:29.240
right, like modern control grew up stabilizing fighter jets
link |
01:36:32.780
that were passively unstable,
link |
01:36:34.060
and there's like amazing success stories from control
link |
01:36:37.020
all over the place.
link |
01:36:39.140
Power grid, I mean, there's all kinds of,
link |
01:36:41.340
it's everywhere that we don't even realize,
link |
01:36:44.640
just like AI is now.
link |
01:36:47.540
So you mentioned contact, like what's contact?
link |
01:36:51.500
So an airplane is an extremely complex system
link |
01:36:54.980
or a spacecraft landing or whatever,
link |
01:36:57.380
but at least it has the luxury
link |
01:36:59.340
that things change relatively continuously.
link |
01:37:03.640
That's an oversimplification.
link |
01:37:04.940
But if I make a small change
link |
01:37:07.060
in the command I send to my actuator,
link |
01:37:10.140
then the path that the robot will take
link |
01:37:12.680
tends to change only by a small amount.
link |
01:37:16.820
And there's a feedback mechanism here.
link |
01:37:18.860
That's what we're talking about.
link |
01:37:19.700
And there's a feedback mechanism.
link |
01:37:20.980
And thinking about this as locally,
link |
01:37:23.780
like a linear system, for instance,
link |
01:37:25.820
I can use more linear algebra tools
link |
01:37:29.220
to study systems like that,
link |
01:37:31.340
generalizations of linear algebra to these smooth systems.
link |
01:37:36.400
What is contact?
link |
01:37:37.380
The robot has something very discontinuous
link |
01:37:41.540
that happens when it makes or breaks contact,
link |
01:37:43.620
when it starts touching the world.
link |
01:37:45.420
And even the way it touches or the order of contacts
link |
01:37:48.080
can change the outcome in potentially unpredictable ways.
link |
01:37:53.080
Not unpredictable, but complex ways.
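As a toy numerical sketch of that discontinuity (not from the conversation; the stiffness and heights here are invented), consider a 1-D point mass above the ground, with the ground modeled as a very stiff one-sided spring. The net force on the mass switches regimes the instant contact is made, which is exactly the kind of jump that contact simulators have to handle:

```python
# Toy sketch of contact discontinuity (invented numbers): a 1-D point
# mass above stiff ground. The net force switches regimes the instant
# the mass touches the ground.
def acceleration(height, g=9.8, k=1e4):
    """Net acceleration of a unit mass at a given height above ground.

    The ground is a one-sided spring of stiffness k; as k grows, this
    approaches the rigid-contact limit, where the force jumps at 0."""
    contact_force = k * max(0.0, -height)  # penalty force, contact only
    return -g + contact_force

print(acceleration(0.001))   # just above the ground: free fall under gravity
print(acceleration(-0.001))  # just below: the contact force dominates
```

A millimeter of state change flips the sign of the net force, which is why small changes in when or how contact happens can change the outcome in complicated ways.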
link |
01:37:56.880
I do think there's a little bit of,
link |
01:38:01.440
a lot of people will say that contact is hard in robotics,
link |
01:38:04.580
even to simulate.
link |
01:38:06.360
And I think there's a little bit of a,
link |
01:38:08.720
there's truth to that,
link |
01:38:09.640
but maybe a misunderstanding around that.
link |
01:38:13.560
So what is limiting is that when we think about our robots
link |
01:38:19.600
and we write our simulators,
link |
01:38:21.400
we often make an assumption that objects are rigid.
link |
01:38:26.000
And when it comes down to it, that all their mass moves together,
link |
01:38:30.720
each point staying in a constant position relative to the others.
link |
01:38:37.080
And that leads to some paradoxes
link |
01:38:39.360
when you go to try to talk about
link |
01:38:40.560
rigid body mechanics and contact.
link |
01:38:43.200
And so for instance, if I have a three legged stool
link |
01:38:48.200
and just imagine each leg comes to a point.
link |
01:38:51.840
So it's only touching the world at a point.
link |
01:38:54.400
If I draw my physics,
link |
01:38:56.920
my high school physics diagram of the system,
link |
01:39:00.280
then there's a couple of things
link |
01:39:01.600
that I'm given by elementary physics.
link |
01:39:03.800
I know if the system, if the table is at rest,
link |
01:39:06.320
if it's not moving, zero velocities,
link |
01:39:09.520
that means that the normal force,
link |
01:39:11.120
all the forces are in balance.
link |
01:39:13.280
So the force of gravity is being countered
link |
01:39:16.400
by the forces that the ground is pushing on my table legs.
link |
01:39:21.240
I also know since it's not rotating
link |
01:39:23.880
that the moments have to balance.
link |
01:39:25.800
And since it's a three dimensional table,
link |
01:39:29.560
it could fall in any direction.
link |
01:39:31.120
It actually tells me uniquely
link |
01:39:33.040
what those three normal forces have to be.
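That uniqueness is easy to check numerically. A minimal sketch (the leg positions and weight are invented for illustration): the vertical force balance plus the two moment balances give three equations in the three unknown normal forces, so the linear system has exactly one solution.

```python
import numpy as np

# Hypothetical 3-legged table: leg contact points (x, y) in meters,
# center of mass at the origin, total weight W in newtons.
legs = np.array([[0.3, 0.0], [-0.15, 0.26], [-0.15, -0.26]])
W = 100.0

# Static equilibrium for the vertical normal forces f = (f1, f2, f3):
#   sum(f_i)       = W   (vertical force balance)
#   sum(x_i * f_i) = 0   (moment balance about one horizontal axis)
#   sum(y_i * f_i) = 0   (moment balance about the other)
A = np.vstack([np.ones(3), legs[:, 0], legs[:, 1]])
b = np.array([W, 0.0, 0.0])

f = np.linalg.solve(A, b)  # 3 equations, 3 unknowns: unique solution
print(f)
```

With these symmetric leg positions each leg carries a third of the weight, and any non-degenerate leg placement gives one and only one answer.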
link |
01:39:37.080
If I have four legs on my table,
link |
01:39:39.600
four legged table and they were perfectly machined
link |
01:39:43.280
to be exactly the right same height
link |
01:39:45.360
and they're set down and the table's not moving,
link |
01:39:48.040
then the basic conservation laws don't tell me,
link |
01:39:51.960
there are many solutions for the forces
link |
01:39:54.040
that the ground could be putting on my legs
link |
01:39:56.600
that would still result in the table not moving.
link |
01:40:00.200
Now, that seems fine, I could just pick one.
link |
01:40:03.920
But it gets funny now because if you think about friction,
link |
01:40:07.840
what we think about with friction is our standard model
link |
01:40:11.000
says the amount of force that the table will push back
link |
01:40:15.880
if I were to now try to push my table sideways,
link |
01:40:18.000
I guess I have a table here,
link |
01:40:20.880
is proportional to the normal force.
link |
01:40:24.040
So if I'm barely touching and I push, I'll slide,
link |
01:40:27.200
but if I'm pushing more and I push, I'll slide less.
link |
01:40:30.440
It's called Coulomb friction, that's our standard model.
link |
01:40:33.720
Now, if you don't know what the normal force is
link |
01:40:35.520
on the four legs and you push the table,
link |
01:40:38.840
then you don't know what the friction forces are gonna be.
link |
01:40:43.440
And so you can't actually tell,
link |
01:40:45.560
the laws just aren't explicit yet
link |
01:40:47.960
about which way the table's gonna go.
link |
01:40:49.680
It could veer off to the left,
link |
01:40:51.360
it could veer off to the right, it could go straight.
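The stool-versus-table point can be checked with a few lines of arithmetic. A minimal sketch, with leg positions and weight made up purely for illustration:

```python
# Static force/moment balance for a table at rest (illustrative numbers):
#   f1 + ... + fn = W          (vertical forces balance)
#   sum(xi * fi) = 0           (moments about the center, x axis)
#   sum(yi * fi) = 0           (moments about the center, y axis)

W = 9.81  # weight of a 1 kg table, in newtons

# Three legs at the corners of a symmetric triangle: 3 equations,
# 3 unknowns, so the normal forces are uniquely determined
# (here, by symmetry, each leg carries W/3).
legs3 = [(1.0, 0.0), (-0.5, 0.866), (-0.5, -0.866)]
f3 = [W / 3] * 3
assert abs(sum(f3) - W) < 1e-9
assert abs(sum(x * f for (x, _), f in zip(legs3, f3))) < 1e-9
assert abs(sum(y * f for (_, y), f in zip(legs3, f3))) < 1e-9

# Four legs at the corners of a square: still only 3 equations, but
# now 4 unknowns. Any force set W/4 + t*(+1, -1, -1, +1) also
# balances, so rigid-body physics alone does not pick the answer.
legs4 = [(1, 1), (1, -1), (-1, 1), (-1, -1)]

def balanced(forces):
    ok_f = abs(sum(forces) - W) < 1e-9
    ok_mx = abs(sum(x * f for (x, _), f in zip(legs4, forces))) < 1e-9
    ok_my = abs(sum(y * f for (_, y), f in zip(legs4, forces))) < 1e-9
    return ok_f and ok_mx and ok_my

t = 1.0  # any t with |t| <= W/4 keeps all forces non-negative
uniform = [W / 4] * 4
skewed = [W / 4 + t, W / 4 - t, W / 4 - t, W / 4 + t]
print(balanced(uniform), balanced(skewed))  # True True
```

Since the uniform and skewed solutions assign different normal forces to each leg, the Coulomb friction limits mu*N differ between them too, which is exactly why the sliding direction is ambiguous when you push the four-legged table.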
link |
01:40:54.720
So the rigid body assumption of contact
link |
01:40:58.440
leaves us with some paradoxes,
link |
01:40:59.840
which are annoying for writing simulators
link |
01:41:02.840
and for writing controllers.
link |
01:41:04.240
We still do that sometimes because soft contact
link |
01:41:07.720
is potentially harder numerically or whatever,
link |
01:41:11.400
and the best simulators do both
link |
01:41:12.920
or do some combination of the two.
link |
01:41:15.240
But anyways, because of these kinds of paradoxes,
link |
01:41:17.360
there's all kinds of paradoxes in contact,
link |
01:41:20.720
mostly due to these rigid body assumptions.
link |
01:41:23.560
It becomes very hard to write the same kind of control laws
link |
01:41:27.880
that we've been able to be successful with
link |
01:41:29.600
for fighter jets.
link |
01:41:32.000
Like fighter jets, we haven't been as successful
link |
01:41:34.560
writing those controllers for manipulation.
link |
01:41:37.440
And so you don't know what's going to happen
link |
01:41:39.160
at the point of contact, at the moment of contact.
link |
01:41:41.480
There are situations absolutely
link |
01:41:42.880
where our laws don't tell us.
link |
01:41:45.760
So the standard approach, that's okay.
link |
01:41:47.440
I mean, instead of having a differential equation,
link |
01:41:51.160
you end up with a differential inclusion, it's called.
link |
01:41:53.640
It's a set valued equation.
link |
01:41:56.080
It says that I'm in this configuration,
link |
01:41:58.320
I have these forces applied on me.
link |
01:42:00.000
And there's a set of things that could happen, right?
link |
01:42:03.480
And you can...
link |
01:42:04.320
And those aren't continuous, I mean, what...
link |
01:42:07.480
So when you're saying like non smooth,
link |
01:42:10.360
they're not only not smooth, but this is discontinuous?
link |
01:42:14.520
The non smooth comes in
link |
01:42:15.800
when I make or break a new contact first,
link |
01:42:18.760
or when I transition from stick to slip.
link |
01:42:21.200
So you typically have static friction,
link |
01:42:23.520
and then you'll start sliding,
link |
01:42:24.840
and that'll be a discontinuous change in velocity.
link |
01:42:28.920
In velocity, for instance,
link |
01:42:31.360
especially if you come to rest or...
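The set-valued friction law behind the differential-inclusion idea can be sketched in a few lines (illustrative code, not any particular library's API):

```python
# Coulomb friction as a set-valued map. When the contact is sliding,
# friction is a single well-defined force; when it is sticking, the
# law only constrains the force to an interval, a *set* of answers.

def coulomb_friction(v, normal_force, mu):
    """Return (lo, hi), the admissible friction forces at sliding speed v."""
    limit = mu * normal_force
    if v > 0:
        return (-limit, -limit)   # sliding right: force is pinned at -mu*N
    if v < 0:
        return (limit, limit)     # sliding left: force is pinned at +mu*N
    return (-limit, limit)        # sticking: a whole interval of forces

lo, hi = coulomb_friction(0.0, 10.0, 0.5)
print(lo, hi)  # -5.0 5.0 : at rest, any force in [-5, 5] N is consistent
lo, hi = coulomb_friction(1.0, 10.0, 0.5)
print(lo, hi)  # -5.0 -5.0 : once sliding, there is exactly one answer
```

The stick-to-slip transition is the non-smooth event: the right-hand side of the dynamics jumps from an interval to a single point, which is why these systems get written as inclusions rather than ordinary differential equations.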
link |
01:42:33.360
That's so fascinating.
link |
01:42:34.480
Okay, so what do you do?
link |
01:42:37.720
Sorry, I interrupted you.
link |
01:42:38.920
It's fine.
link |
01:42:41.600
What's the hope under so much uncertainty
link |
01:42:44.160
about what's going to happen?
link |
01:42:45.440
What are you supposed to do?
link |
01:42:46.360
I mean, control has an answer for this.
link |
01:42:48.520
Robust control is one approach,
link |
01:42:50.240
but roughly you can write controllers
link |
01:42:52.640
which try to still perform the right task
link |
01:42:55.920
despite all the things that could possibly happen.
link |
01:42:58.120
The world might want the table to go this way and this way,
link |
01:43:00.000
but if I write a controller that pushes a little bit more
link |
01:43:03.640
and pushes a little bit,
link |
01:43:04.480
I can certainly make the table go in the direction I want.
link |
01:43:08.000
It just puts a little bit more of a burden
link |
01:43:10.000
on the control system, right?
link |
01:43:12.120
And these discontinuities do change the control system
link |
01:43:15.440
because the way we write it down right now,
link |
01:43:21.200
every different control configuration,
link |
01:43:24.320
including sticking or sliding
link |
01:43:26.200
or parts of my body that are in contact or not,
link |
01:43:29.160
looks like a different system.
link |
01:43:30.840
And I think of them,
link |
01:43:31.880
I reason about them separately or differently
link |
01:43:34.680
and the combinatorics of that blow up, right?
link |
01:43:38.000
So I just don't have enough time to compute
link |
01:43:41.440
all the possible contact configurations of my humanoid.
link |
01:43:45.000
Interestingly, I mean, I'm a humanoid.
link |
01:43:49.000
I have lots of degrees of freedom, lots of joints.
link |
01:43:52.400
I've only been around for a handful of years.
link |
01:43:54.960
It's getting up there,
link |
01:43:55.800
but I haven't had time in my life
link |
01:43:59.200
to visit all of the states in my system,
link |
01:44:03.080
certainly all the contact configurations.
link |
01:44:05.240
So if step one is to consider
link |
01:44:08.320
every possible contact configuration that I'll ever be in,
link |
01:44:12.160
that's probably not a problem I need to solve, right?
link |
01:44:17.040
Just as a small tangent, what's a contact configuration?
link |
01:44:20.560
What like, just so we can enumerate
link |
01:44:24.920
what are we talking about?
link |
01:44:26.280
How many are there?
link |
01:44:27.600
The simplest example maybe would be,
link |
01:44:30.000
imagine a robot with a flat foot.
link |
01:44:32.720
And we think about the phases of gait
link |
01:44:35.440
where the heel strikes and then the front toe strikes,
link |
01:44:40.000
and then you can heel up, toe off.
link |
01:44:43.720
Those are each different contact configurations.
link |
01:44:46.720
I only had two different contacts,
link |
01:44:48.320
but I ended up with four different contact configurations.
link |
01:44:51.440
Now, of course, my robot might actually have bumps on it
link |
01:44:57.400
or other things,
link |
01:44:58.240
so it could be much more subtle than that, right?
link |
01:45:00.640
But it's just even with one sort of box
link |
01:45:03.160
interacting with the ground already in the plane
link |
01:45:06.240
has that many, right?
link |
01:45:07.120
And if I was just even a 3D foot,
link |
01:45:09.440
then it probably my left toe might touch
link |
01:45:11.240
just before my right toe and things get subtle.
link |
01:45:14.360
Now, if I'm a dexterous hand
link |
01:45:16.480
and I go to talk about just grabbing a water bottle,
link |
01:45:22.280
if I have to enumerate every possible order
link |
01:45:26.720
that my hand came into contact with the bottle,
link |
01:45:31.000
then I'm dead in the water.
link |
01:45:32.960
We were able to get away with that
link |
01:45:35.400
in walking because we mostly touched the ground
link |
01:45:38.480
within a small number of points, for instance,
link |
01:45:40.840
and we haven't been able to get dexterous hands that way.
link |
01:45:43.800
So you've mentioned that people think
link |
01:45:50.200
that contact is really hard
link |
01:45:52.520
and that that's the reason that robotic manipulation
link |
01:45:58.160
as a problem is really hard.
link |
01:46:00.560
Is there any flaws in that thinking?
link |
01:46:06.560
So I think simulating contact is one aspect.
link |
01:46:10.560
I know people often say that we don't,
link |
01:46:12.880
that one of the reasons that we have a limit in robotics
link |
01:46:16.320
is because we do not simulate contact accurately
link |
01:46:19.040
in our simulators.
link |
01:46:20.840
And I think that is the extent to which that's true
link |
01:46:25.600
is partly because our simulators,
link |
01:46:27.880
we haven't got mature enough simulators.
link |
01:46:31.240
There are some things that are still hard, difficult,
link |
01:46:34.120
that we should change,
link |
01:46:38.200
but we actually, we know what the governing equations are.
link |
01:46:41.520
They have some foibles like this indeterminacy,
link |
01:46:44.720
but we should be able to simulate them accurately.
link |
01:46:48.600
We have incredible open source community in robotics,
link |
01:46:51.440
but it actually just takes a professional engineering team
link |
01:46:54.360
a lot of work to write a very good simulator like that.
link |
01:46:59.080
Now, where does, I believe you've written, Drake.
link |
01:47:03.280
There's a team of people.
link |
01:47:04.520
I certainly spent a lot of hours on it myself.
link |
01:47:07.320
But what is Drake and what does it take to create
link |
01:47:12.060
a simulation environment for the kind of difficult control
link |
01:47:18.200
problems we're talking about?
link |
01:47:20.740
Right, so Drake is the simulator that I've been working on.
link |
01:47:24.640
There are other good simulators out there.
link |
01:47:26.780
I don't like to think of Drake as just a simulator
link |
01:47:29.680
because we write our controllers in Drake,
link |
01:47:31.780
we write our perception systems a little bit in Drake,
link |
01:47:34.360
but we write all of our low level control
link |
01:47:37.040
and even planning and optimization.
link |
01:47:40.840
So it has optimization capabilities as well?
link |
01:47:42.480
Absolutely, yeah.
link |
01:47:43.640
I mean, Drake is three things roughly.
link |
01:47:46.000
It's an optimization library, which sits on,
link |
01:47:49.800
it provides a layer of abstraction in C++ and Python
link |
01:47:54.240
for commercial solvers.
link |
01:47:55.920
You can write linear programs, quadratic programs,
link |
01:48:00.760
semi definite programs, sums of squares programs,
link |
01:48:03.340
the ones we've used, mixed integer programs,
link |
01:48:05.660
and it will do the work to curate those
link |
01:48:07.960
and send them to whatever the right solver is for instance,
link |
01:48:10.360
and it provides a level of abstraction.
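That abstraction-over-solvers idea can be caricatured in a few lines. This is a made-up toy, not the real pydrake MathematicalProgram API: the point is just that the user declares costs and constraints once, and the library inspects them to route the program to an appropriate backend.

```python
# Toy sketch of a solver-abstraction layer (hypothetical names, not the
# real Drake API): declare the program once, dispatch by inspection.

class Program:
    def __init__(self):
        self.costs = []        # ("linear" | "quadratic" | "nonlinear", expr)
        self.constraints = []  # ("linear" | "integer", expr)

    def add_cost(self, kind, expr):
        self.costs.append((kind, expr))

    def add_constraint(self, kind, expr):
        self.constraints.append((kind, expr))

    def classify(self):
        cost_kinds = {k for k, _ in self.costs}
        con_kinds = {k for k, _ in self.constraints}
        if "integer" in con_kinds:
            return "MIP"
        if cost_kinds <= {"linear"}:
            return "LP"
        if cost_kinds <= {"linear", "quadratic"}:
            return "QP"
        return "NLP"

def solve(prog):
    # Route the curated program to whichever backend handles its class.
    backends = {"LP": "linear-programming solver",
                "QP": "quadratic-programming solver",
                "MIP": "mixed-integer solver",
                "NLP": "general nonlinear solver"}
    return backends[prog.classify()]

prog = Program()
prog.add_cost("quadratic", "x0**2 + x1**2")
prog.add_constraint("linear", "x0 + x1 == 1")
print(solve(prog))  # quadratic-programming solver
```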
link |
01:48:13.720
The second thing is a system modeling language,
link |
01:48:18.360
a bit like LabVIEW or Simulink,
link |
01:48:20.880
where you can make block diagrams out of complex systems,
link |
01:48:24.840
or it's like ROS in that sense,
link |
01:48:26.640
where you might have lots of ROS nodes
link |
01:48:29.040
that are each doing some part of your system,
link |
01:48:31.960
but to contrast it with ROS, we try to write,
link |
01:48:36.560
if you write a Drake system, then you have to,
link |
01:48:40.120
it asks you to describe a little bit more about the system.
link |
01:48:43.000
If you have any state, for instance, in the system,
link |
01:48:46.240
any variables that are gonna persist,
link |
01:48:47.680
you have to declare them.
link |
01:48:49.120
Parameters can be declared and the like,
link |
01:48:51.620
but the advantage of doing that is that you can,
link |
01:48:54.160
if you like, run things all on one process,
link |
01:48:57.460
but you can also do control design against it.
link |
01:49:00.200
You can do, I mean, simple things like rewinding
link |
01:49:03.120
and playing back your simulations, for instance,
link |
01:49:07.960
these things, you get some rewards
link |
01:49:09.600
for spending a little bit more upfront cost
link |
01:49:11.380
in describing each system.
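The payoff of declaring state up front can be sketched in a toy way (hypothetical names, not the real Drake system API): because the simulator knows exactly which variables make up the state, it can checkpoint every step, which is what makes playback and rewind trivial.

```python
# Minimal "declare your state" system plus a recording simulator.

class Integrator:
    """x[k+1] = x[k] + dt * u, with the state declared explicitly."""
    dt = 0.1

    def __init__(self):
        self.state = {"x": 0.0}  # declared state: one persistent variable

    def update(self, state, u):
        return {"x": state["x"] + self.dt * u}

def simulate(system, inputs):
    history = [dict(system.state)]  # checkpoint the declared state
    for u in inputs:
        system.state = system.update(system.state, u)
        history.append(dict(system.state))
    return history  # the full trajectory supports playback and rewind

traj = simulate(Integrator(), [1.0] * 5)
print(traj[-1]["x"])  # 0.5 after five steps of dt=0.1 with u=1
print(traj[2]["x"])   # "rewind" to any earlier step, e.g. 0.2
```

A bag of opaque processes passing messages (the ROS-node style) can't offer this for free, because nothing outside each node knows what its state even is.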
link |
01:49:13.320
And I was inspired to do that
link |
01:49:16.920
because I think the complexity of Atlas, for instance,
link |
01:49:21.260
is just so great.
link |
01:49:22.600
And I think, although, I mean,
link |
01:49:24.140
ROS has been incredible, I'm an absolutely huge fan
link |
01:49:27.520
of what it's done for the robotics community,
link |
01:49:30.720
but the ability to rapidly put different pieces together
link |
01:49:35.480
and have a functioning thing is very good.
link |
01:49:38.960
But I do think that it's hard to think clearly
link |
01:49:42.880
about a bag of disparate parts,
link |
01:49:45.000
Mr. Potato Head kind of software stack.
link |
01:49:48.160
And if you can ask a little bit more
link |
01:49:53.060
out of each of those parts,
link |
01:49:54.200
then you can understand the way they work better.
link |
01:49:56.120
You can try to verify them and the like,
link |
01:50:00.160
or you can do learning against them.
link |
01:50:02.680
And then one of those systems, the last thing,
link |
01:50:04.760
I said the first two things that Drake is,
link |
01:50:06.480
but the last thing is that there is a set
link |
01:50:09.680
of multi body equations, rigid body equations,
link |
01:50:12.560
that is trying to provide a system that simulates physics.
link |
01:50:16.760
And we also have renderers and other things,
link |
01:50:20.060
but I think the physics component of Drake is special
link |
01:50:23.300
in the sense that we have done an excessive amount
link |
01:50:27.740
of engineering to make sure
link |
01:50:29.840
that we've written the equations correctly.
link |
01:50:31.580
Every possible tumbling satellite or spinning top
link |
01:50:34.160
or anything that we could possibly write as a test is tested.
link |
01:50:38.300
We are making some, I think, fundamental improvements
link |
01:50:42.000
on the way you simulate contact.
link |
01:50:44.240
Just what does it take to simulate contact?
link |
01:50:47.600
I mean, it just seems,
link |
01:50:50.920
I mean, there's something just beautiful
link |
01:50:52.400
to the way you were like explaining contact
link |
01:50:55.240
and you were like tapping your fingers
link |
01:50:56.720
on the table while you're doing it, just.
link |
01:51:00.720
Easily, right?
link |
01:51:01.560
Easily, just like, just not even like,
link |
01:51:04.800
it was like helping you think, I guess.
link |
01:51:10.640
So you have this like awesome demo
link |
01:51:12.280
of loading or unloading a dishwasher,
link |
01:51:16.720
just picking up a plate,
link |
01:51:18.840
or grasping it like for the first time.
link |
01:51:26.120
That just seems like so difficult.
link |
01:51:29.440
What, how do you simulate any of that?
link |
01:51:33.600
So it was really interesting that what happened was
link |
01:51:35.840
that we started getting more professional
link |
01:51:39.200
about our software development
link |
01:51:40.520
during the DARPA Robotics Challenge.
link |
01:51:43.360
I learned the value of software engineering
link |
01:51:46.040
and how these, how to bridle complexity.
link |
01:51:48.640
I guess that's what I want to somehow fight against
link |
01:51:52.800
and bring some of the clear thinking of controls
link |
01:51:54.760
into these complex systems we're building for robots.
link |
01:52:00.460
Shortly after the DARPA Robotics Challenge,
link |
01:52:02.940
Toyota opened a research institute,
link |
01:52:04.600
TRI, Toyota Research Institute.
link |
01:52:08.200
They put one of their, there's three locations.
link |
01:52:10.880
One of them is just down the street from MIT.
link |
01:52:13.040
And I helped ramp that up right up
link |
01:52:17.520
as a part of my, the end of my sabbatical, I guess.
link |
01:52:23.480
So TRI has given me, the TRI robotics effort
link |
01:52:29.480
has made this investment in simulation in Drake.
link |
01:52:32.640
And Michael Sherman leads a team there
link |
01:52:34.480
of just absolutely top notch dynamics experts
link |
01:52:37.800
that are trying to write those simulators
link |
01:52:40.120
that can pick up the dishes.
link |
01:52:41.960
And there's also a team working on manipulation there
link |
01:52:44.780
that is taking problems like loading the dishwasher.
link |
01:52:48.980
And we're using that to study these really hard corner cases
link |
01:52:53.180
kind of problems in manipulation.
link |
01:52:55.280
So for me, this, you know, simulating the dishes,
link |
01:52:59.760
we could actually write a controller.
link |
01:53:01.580
If we just cared about picking up dishes in the sink once,
link |
01:53:05.040
we could write a controller
link |
01:53:05.880
without any simulation whatsoever,
link |
01:53:07.760
and we could call it done.
link |
01:53:10.040
But we want to understand like,
link |
01:53:12.140
what is the path you take to actually get to a robot
link |
01:53:17.040
that could perform that for any dish in anybody's kitchen
link |
01:53:22.120
with enough confidence
link |
01:53:23.280
that it could be a commercial product, right?
link |
01:53:26.520
And it has deep learning perception in the loop.
link |
01:53:29.360
It has complex dynamics in the loop.
link |
01:53:31.040
It has controller, it has a planner.
link |
01:53:33.240
And how do you take all of that complexity
link |
01:53:36.320
and put it through this engineering discipline
link |
01:53:39.020
and verification and validation process
link |
01:53:42.440
to actually get enough confidence to deploy?
link |
01:53:46.440
I mean, the DARPA challenge made me realize
link |
01:53:49.840
that that's not something you throw over the fence
link |
01:53:52.000
and hope that somebody will harden it for you,
link |
01:53:54.080
that there are really fundamental challenges
link |
01:53:57.380
in closing that last gap.
link |
01:53:59.840
in doing the validation and the testing.
link |
01:54:03.520
I think it might even change the way we have to think about
link |
01:54:06.780
the way we write systems.
link |
01:54:09.580
What happens if you have the robot running lots of tests
link |
01:54:15.560
and it screws up, it breaks a dish, right?
link |
01:54:19.040
How do you capture that?
link |
01:54:19.960
I said, you can't run the same simulation
link |
01:54:23.580
or the same experiment twice on a real robot.
link |
01:54:27.920
Do we have to be able to bring that one off failure
link |
01:54:31.520
back into simulation
link |
01:54:32.640
in order to change our controllers, study it,
link |
01:54:35.120
make sure it won't happen again?
link |
01:54:37.240
Do we, is it enough to just try to add that
link |
01:54:40.600
to our distribution and understand that on average,
link |
01:54:43.800
we're gonna cover that situation again?
link |
01:54:45.920
There's like really subtle questions at the corner cases
link |
01:54:49.960
that I think we don't yet have satisfying answers for.
link |
01:54:53.240
Like how do you find the corner cases?
link |
01:54:55.120
That's one kind of, is there,
link |
01:54:57.160
do you think that's possible to create a systematized way
link |
01:55:01.260
of discovering corner cases efficiently?
link |
01:55:04.720
Yes.
link |
01:55:05.560
In whatever the problem is?
link |
01:55:07.600
Yes, I mean, I think we have to get better at that.
link |
01:55:10.760
I mean, control theory has for decades
link |
01:55:14.920
talked about active experiment design.
link |
01:55:17.840
What's that?
link |
01:55:19.560
So people call it curiosity these days.
link |
01:55:22.080
It's roughly this idea of exploration
link |
01:55:24.800
or exploitation, but active experiment design
link |
01:55:27.600
is even more specific.
link |
01:55:29.640
You could try to understand the uncertainty in your system,
link |
01:55:34.120
design the experiment that will provide
link |
01:55:36.480
the maximum information to reduce that uncertainty.
link |
01:55:40.120
If there's a parameter you wanna learn about,
link |
01:55:42.360
what is the optimal trajectory I could execute
link |
01:55:45.440
to learn about that parameter, for instance.
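A toy version of that last point, with made-up numbers: for a scalar model y = theta*u + noise, the variance of the least-squares estimate of theta is sigma^2 / sum(u_i^2), so under an input limit, the most informative experiment excites the system as hard as allowed.

```python
# Toy active experiment design (illustrative numbers): estimating theta
# in y = theta*u + noise, with noise standard deviation sigma. The
# least-squares estimate has variance sigma^2 / sum(u_i^2), so with the
# same number of samples, bigger excitation shrinks the uncertainty.

sigma = 0.1  # measurement noise standard deviation, assumed

def estimator_variance(inputs):
    return sigma**2 / sum(u * u for u in inputs)

timid = [0.1] * 10       # barely excite the system
aggressive = [1.0] * 10  # same sample budget, maximal allowed |u| <= 1

print(estimator_variance(timid))       # ~0.1
print(estimator_variance(aggressive))  # ~0.001, a 100x better experiment
```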
link |
01:55:49.520
Scaling that up to something that has a deep network
link |
01:55:51.720
in the loop and a planning in the loop is tough.
link |
01:55:55.660
We've done some work on, you know,
link |
01:55:58.200
with Matt O'Kelly and Aman Sinha,
link |
01:56:00.280
we've worked on some falsification algorithms
link |
01:56:03.600
that are trying to do rare event simulation
link |
01:56:05.600
that try to just hammer on your simulator.
link |
01:56:08.120
And if your simulator is good enough,
link |
01:56:10.000
you can spend a lot of time,
link |
01:56:13.840
or you can write good algorithms
link |
01:56:15.840
that try to spend most of their time in the corner cases.
link |
01:56:19.920
So you basically imagine you're building an autonomous car
link |
01:56:25.880
and you wanna put it in, I don't know,
link |
01:56:27.360
downtown New Delhi all the time, right?
link |
01:56:29.400
And accelerated testing.
link |
01:56:31.640
If you can write sampling strategies,
link |
01:56:33.340
which figure out where your controller's
link |
01:56:35.400
performing badly in simulation
link |
01:56:37.440
and start generating lots of examples around that.
link |
01:56:40.600
You know, it's just the space of possible places
link |
01:56:44.060
where that can be, where things can go wrong is very big.
link |
01:56:48.040
So it's hard to write those algorithms.
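The flavor of rare-event simulation can be shown on a toy problem (my own construction, not the algorithm from the work mentioned): estimate the probability that a standard-normal "disturbance" exceeds 4, which is about 3.2e-5. Naive sampling almost never sees the event; sampling from a proposal shifted into the failure region and reweighting finds it with the same budget.

```python
# Rare-event estimation: naive Monte Carlo vs. importance sampling.
import math
import random

random.seed(0)
N = 10000

# Naive Monte Carlo: count how often d > 4 under d ~ N(0, 1).
naive = sum(random.gauss(0, 1) > 4 for _ in range(N)) / N

# Importance sampling: draw d ~ N(4, 1), i.e. spend all our samples
# near the failure region, and reweight by p(d)/q(d), which for a
# unit-variance mean shift works out to exp(8 - 4*d).
total = 0.0
for _ in range(N):
    d = random.gauss(4, 1)
    if d > 4:
        total += math.exp(8 - 4 * d)
is_estimate = total / N

print(naive)        # almost certainly 0.0 with this few samples
print(is_estimate)  # close to the true value, ~3.2e-5
```

The falsification algorithms mentioned do something similar in spirit: bias the simulator's sampling toward the regimes where the controller looks worst, then reweight so the failure-probability estimate stays honest.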
link |
01:56:49.800
Yeah, rare event simulation
link |
01:56:51.720
is just a really compelling notion, if it's possible.
link |
01:56:55.760
We joked and we call it the black swan generator.
link |
01:56:58.600
It's a black swan.
link |
01:57:00.080
Because you don't just want the rare events,
link |
01:57:01.680
you want the ones that are highly impactful.
link |
01:57:04.020
I mean, that's the most,
link |
01:57:06.560
those are the most sort of profound questions
link |
01:57:08.780
we ask of our world.
link |
01:57:10.120
Like, what's the worst that can happen?
link |
01:57:16.720
But what we're really asking
link |
01:57:18.080
isn't some kind of like computer science,
link |
01:57:20.800
worst case analysis.
link |
01:57:22.560
We're asking like, what are the millions of ways
link |
01:57:25.600
this can go wrong?
link |
01:57:27.360
And that's like our curiosity.
link |
01:57:29.500
And we humans, I think are pretty bad at,
link |
01:57:34.900
we just like run into it.
link |
01:57:36.980
And I think there's a distributed sense
link |
01:57:38.580
because there's now like 7.5 billion of us.
link |
01:57:41.620
And so there's a lot of them.
link |
01:57:42.860
And then a lot of them write blog posts
link |
01:57:45.060
about the stupid thing they've done.
link |
01:57:46.540
So we learn in a distributed way.
link |
01:57:49.980
There's some.
link |
01:57:50.820
I think that's gonna be important for robots too.
link |
01:57:53.380
I mean, that's another massive theme
link |
01:57:55.940
at Toyota Research for Robotics
link |
01:57:58.800
is this fleet learning concept
link |
01:58:00.540
is the idea that I, as a human,
link |
01:58:04.780
I don't have enough time to visit all of my states, right?
link |
01:58:07.880
There's just a, it's very hard for one robot
link |
01:58:10.140
to experience all the things.
link |
01:58:12.640
But that's not actually the problem we have to solve, right?
link |
01:58:16.540
We're gonna have fleets of robots
link |
01:58:17.700
that can have very similar appendages.
link |
01:58:20.660
And at some point, maybe collectively,
link |
01:58:24.160
they have enough data
link |
01:58:26.220
that their computational processes
link |
01:58:29.340
should be set up differently than ours, right?
link |
01:58:31.860
It's this vision of just,
link |
01:58:34.180
I mean, all these dishwasher unloading robots.
link |
01:58:38.880
I mean, that robot dropping a plate
link |
01:58:42.580
and a human looking at the robot probably pissed off.
link |
01:58:46.860
Yeah.
link |
01:58:47.820
But that's a special moment to record.
link |
01:58:51.220
I think one thing in terms of fleet learning,
link |
01:58:54.500
and I've seen that because I've talked to a lot of folks,
link |
01:58:57.740
just like Tesla users or Tesla drivers,
link |
01:59:01.220
they're another company
link |
01:59:02.980
that's using this kind of fleet learning idea.
link |
01:59:05.300
One hopeful thing I have about humans
link |
01:59:08.220
is they really enjoy when a system improves, learns.
link |
01:59:13.260
So they enjoy fleet learning.
link |
01:59:14.680
And the reason it's hopeful for me
link |
01:59:17.260
is they're willing to put up with something
link |
01:59:20.300
that's kind of dumb right now.
link |
01:59:22.660
And they're like, if it's improving,
link |
01:59:25.540
they almost like enjoy being part of the, like teaching it.
link |
01:59:29.460
Almost like if you have kids,
link |
01:59:30.960
like you're teaching them something, right?
link |
01:59:33.540
I think that's a beautiful thing
link |
01:59:35.140
because that gives me hope
link |
01:59:36.300
that we can put dumb robots out there.
link |
01:59:40.100
I mean, the problem on the Tesla side with cars,
link |
01:59:43.340
cars can kill you.
link |
01:59:45.320
That makes the problem so much harder.
link |
01:59:47.740
Dishwasher unloading is a little safe.
link |
01:59:50.580
That's why home robotics is really exciting.
link |
01:59:54.220
And just to clarify, I mean, for people who might not know,
link |
01:59:57.580
I mean, TRI, Toyota Research Institute.
link |
02:00:00.100
So they're, I mean, they're pretty well known
link |
02:00:03.980
for like autonomous vehicle research,
link |
02:00:06.140
but they're also interested in home robotics.
link |
02:00:10.260
Yep, there's a big group working on,
link |
02:00:12.780
multiple groups working on home robotics.
link |
02:00:14.340
It's a major part of the portfolio.
link |
02:00:17.480
There's also a couple other projects
link |
02:00:19.100
in advanced materials discovery,
link |
02:00:21.300
using AI and machine learning to discover new materials
link |
02:00:24.420
for car batteries and the like, for instance, yeah.
link |
02:00:28.540
And that's been actually an incredibly successful team.
link |
02:00:31.500
There's new projects starting up too, so.
link |
02:00:33.540
Do you see a future of where like robots are in our home
link |
02:00:38.940
and like robots that have like actuators
link |
02:00:44.040
that look like arms in our home
link |
02:00:46.620
or like, you know, more like humanoid type robots?
link |
02:00:49.340
Or is this, are we gonna do the same thing
link |
02:00:51.820
that you just mentioned that, you know,
link |
02:00:53.860
the dishwasher is no longer a robot.
link |
02:00:55.980
We're going to just not even see them as robots.
link |
02:00:58.700
But I mean, what's your vision of the home of the future
link |
02:01:02.500
10, 20 years from now, 50 years, if you get crazy?
link |
02:01:06.220
Yeah, I think we already have Roombas cruising around.
link |
02:01:10.720
We have, you know, Alexas or Google Homes
link |
02:01:13.700
on our kitchen counter.
link |
02:01:16.240
It's only a matter of time until they sprout arms
link |
02:01:18.060
and start doing something useful like that.
link |
02:01:21.860
So I do think it's coming.
link |
02:01:23.860
I think lots of people have lots of motivations
link |
02:01:27.660
for doing it.
link |
02:01:29.380
It's been super interesting actually learning
link |
02:01:31.520
about Toyota's vision for it,
link |
02:01:33.900
which is about helping people age in place.
link |
02:01:38.700
Cause I think that's not necessarily the first entry,
link |
02:01:41.620
the most lucrative entry point,
link |
02:01:44.340
but it's the problem maybe that we really need to solve
link |
02:01:48.680
no matter what.
link |
02:01:50.020
And so I think there's a real opportunity.
link |
02:01:53.900
It's a delicate problem.
link |
02:01:55.740
How do you work with people, help people,
link |
02:01:59.340
keep them active, engaged, you know,
link |
02:02:03.300
but improve their quality of life
link |
02:02:05.060
and help them age in place, for instance.
link |
02:02:08.340
It's interesting because older folks are also,
link |
02:02:12.440
I mean, there's a contrast there
link |
02:02:13.700
because they're not always the folks
link |
02:02:18.080
who are the most comfortable with technology, for example.
link |
02:02:20.900
So there's a division that's interesting.
link |
02:02:24.860
You can do so much good with a robot for older folks,
link |
02:02:32.020
but there's a gap to fill of understanding.
link |
02:02:36.380
I mean, it's actually kind of beautiful.
link |
02:02:39.360
The robot is learning about the human
link |
02:02:41.140
and the human is kind of learning about this new robot thing.
link |
02:02:44.820
And it's also with, at least with,
link |
02:02:49.660
like when I talked to my parents about robots,
link |
02:02:51.460
there's a little bit of a blank slate there too.
link |
02:02:54.540
Like you can, I mean, they don't know anything
link |
02:02:58.020
about robotics, so it's completely like wide open.
link |
02:03:02.640
They don't have, they haven't,
link |
02:03:03.880
my parents haven't seen Black Mirror.
link |
02:03:06.780
So like they, it's a blank slate.
link |
02:03:09.460
Here's a cool thing, like what can it do for me?
link |
02:03:11.980
Yeah, so it's an exciting space.
link |
02:03:14.380
I think it's a really important space.
link |
02:03:16.340
I do feel like a few years ago,
link |
02:03:20.020
drones were successful enough in academia.
link |
02:03:22.740
They kind of broke out and started an industry
link |
02:03:25.980
and autonomous cars have been happening.
link |
02:03:29.100
It does feel like manipulation in logistics, of course,
link |
02:03:32.900
first, but in the home shortly after,
link |
02:03:35.700
seems like one of the next big things
link |
02:03:37.180
that's gonna really pop.
link |
02:03:40.060
So I don't think we talked about it,
link |
02:03:42.100
but what's soft robotics?
link |
02:03:44.540
So we talked about like rigid bodies.
link |
02:03:49.300
Like if we can just linger on this whole touch thing.
link |
02:03:52.940
Yeah, so what's soft robotics?
link |
02:03:54.620
So I told you that I really dislike the fact
link |
02:04:00.780
that robots are afraid of touching the world
link |
02:04:03.140
all over their body.
link |
02:04:04.860
So there's a couple reasons for that.
link |
02:04:06.900
If you look carefully at all the places
link |
02:04:08.740
that robots actually do touch the world,
link |
02:04:11.220
they're almost always soft.
link |
02:04:12.540
They have some sort of pad on their fingers
link |
02:04:14.700
or a rubber sole on their foot.
link |
02:04:17.900
But if you look up and down the arm,
link |
02:04:19.300
we're just pure aluminum or something.
link |
02:04:25.340
So that makes it hard actually.
link |
02:04:26.660
In fact, hitting the table with your rigid arm
link |
02:04:30.460
or nearly rigid arm has some of the problems
link |
02:04:34.580
that we talked about in terms of simulation.
link |
02:04:37.260
I think it fundamentally changes the mechanics of contact
link |
02:04:39.940
when you're soft, right?
link |
02:04:41.260
You turn point contacts into patch contacts,
link |
02:04:45.020
which can have torsional friction.
link |
02:04:47.020
You can have distributed load.
link |
02:04:49.260
If I wanna pick up an egg, right?
link |
02:04:52.460
If I pick it up with two points,
link |
02:04:54.300
then in order to put enough force
link |
02:04:56.220
to sustain the weight of the egg,
link |
02:04:57.340
I might have to put a lot of force to break the egg.
link |
02:04:59.980
If I envelop it with contact all around,
link |
02:05:04.460
then I can distribute my force across the shell of the egg
link |
02:05:07.540
and have a better chance of not breaking it.
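The intuition about distributing load can be put in numbers with a toy friction model (a hypothetical sketch; the figures and the `required_normal_force_per_contact` helper are illustrative, not from the conversation):

```python
def required_normal_force_per_contact(weight_newtons, mu, n_contacts):
    """Minimum squeeze force each contact must apply so that friction
    (coefficient mu) across n contacts supports the object's weight."""
    return weight_newtons / (mu * n_contacts)

# A ~60 g egg held with a two-point pinch versus an enveloping grasp.
two_point = required_normal_force_per_contact(0.6, 0.5, 2)
enveloping = required_normal_force_per_contact(0.6, 0.5, 12)

# Distributing contact over the shell needs far less force per point,
# which is why the enveloping grasp is less likely to break the egg.
assert enveloping < two_point
```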
link |
02:05:10.620
So soft robotics is for me a lot about changing
link |
02:05:12.860
the mechanics of contact.
link |
02:05:15.500
Does it make the problem a lot harder?
link |
02:05:19.380
Quite the opposite.
link |
02:05:24.020
It changes the computational problem.
link |
02:05:26.740
I think because of the, I think our world
link |
02:05:30.460
and our mathematics has biased us towards rigid.
link |
02:05:34.180
I see.
link |
02:05:35.020
But it really should make things better in some ways, right?
link |
02:05:40.740
I think the future is unwritten there.
link |
02:05:44.620
But the other thing it can do.
link |
02:05:45.460
I think ultimately, sorry to interrupt,
link |
02:05:46.820
but I think ultimately it will make things simpler
link |
02:05:49.540
if we embrace the softness of the world.
link |
02:05:51.580
It makes things smoother, right?
link |
02:05:55.740
So the result of small actions is less discontinuous,
link |
02:06:00.740
but it also means potentially less instantaneously bad.
link |
02:06:05.980
For instance, I won't necessarily contact something
link |
02:06:09.060
and send it flying off.
link |
02:06:12.300
The other aspect of it
link |
02:06:13.140
that just happens to dovetail really well
link |
02:06:14.860
is that soft robotics tends to be a place
link |
02:06:17.260
where we can embed a lot of sensors too.
link |
02:06:19.100
So if you change your hardware and make it more soft,
link |
02:06:23.540
then you can potentially have a tactile sensor,
link |
02:06:25.620
which is measuring the deformation.
link |
02:06:27.820
So there's a team at TRI that's working on soft hands
link |
02:06:32.180
and you get so much more information.
link |
02:06:35.500
You can put a camera behind the skin roughly
link |
02:06:38.820
and get fantastic tactile information,
link |
02:06:42.860
which is, it's super important.
link |
02:06:46.180
Like in manipulation,
link |
02:06:47.020
one of the things that really is frustrating
link |
02:06:49.820
is if you work super hard
link |
02:06:52.140
on your perception system for your head mounted cameras,
link |
02:06:54.540
and then you get a lot of information
link |
02:06:56.060
from your head mounted cameras,
link |
02:06:57.700
and then you've identified an object,
link |
02:06:59.460
you reach down to touch it,
link |
02:07:00.380
and the last thing that happens,
link |
02:07:01.900
right before the most important time,
link |
02:07:03.980
you stick your hand in
link |
02:07:04.820
and you're occluding your head mounted sensors.
link |
02:07:07.380
So in the part that really matters,
link |
02:07:10.220
all of your off board sensors are occluded.
link |
02:07:13.580
And really, if you don't have tactile information,
link |
02:07:15.900
then you're blind in an important way.
link |
02:07:19.300
So it happens that soft robotics and tactile sensing
link |
02:07:23.140
tend to go hand in hand.
link |
02:07:25.100
I think we've kind of talked about it,
link |
02:07:26.820
but you taught a course on underactuated robotics.
link |
02:07:31.060
I believe that was the name of it, actually.
link |
02:07:32.780
That's right.
link |
02:07:34.980
Can you talk about it in that context?
link |
02:07:37.340
What is underactuated robotics?
link |
02:07:40.380
Right, so underactuated robotics is my graduate course.
link |
02:07:43.740
It's online mostly now,
link |
02:07:46.620
in the sense that the lectures.
link |
02:07:47.460
Several versions of it, I think.
link |
02:07:49.060
Right, the YouTube.
link |
02:07:49.900
It's really great, I recommend it highly.
link |
02:07:52.060
Look on YouTube for the 2020 versions.
link |
02:07:55.060
Until March, and then you have to go back to 2019,
link |
02:07:57.460
thanks to COVID.
link |
02:08:00.740
No, I've poured my heart into that class.
link |
02:08:04.820
And lecture one is basically explaining
link |
02:08:06.620
what the word underactuated means.
link |
02:08:07.940
So people are very kind to show up
link |
02:08:09.860
and then maybe have to learn
link |
02:08:12.220
what the title of the course means
link |
02:08:13.460
over the course of the first lecture.
link |
02:08:15.420
That first lecture is really good.
link |
02:08:17.500
You should watch it.
link |
02:08:18.780
Thanks.
link |
02:08:19.860
It's a strange name,
link |
02:08:21.500
but I thought it captured the essence
link |
02:08:25.860
of what control was good at doing
link |
02:08:27.940
and what control was bad at doing.
link |
02:08:29.980
So what do I mean by underactuated?
link |
02:08:31.940
So a mechanical system
link |
02:08:36.340
has many degrees of freedom, for instance.
link |
02:08:39.500
I think of a joint as a degree of freedom.
link |
02:08:41.940
And it has some number of actuators, motors.
link |
02:08:46.180
So if you have a robot that's bolted to the table
link |
02:08:49.220
that has five degrees of freedom and five motors,
link |
02:08:54.100
then you have a fully actuated robot.
link |
02:08:57.140
If you take away one of those motors,
link |
02:09:00.540
then you have an underactuated robot.
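That definition can be captured in a one-line check (hypothetical Python; the function name and the actuator counts are illustrative, not from the conversation):

```python
def is_underactuated(num_dof, num_actuators):
    """A system with fewer independent actuators than degrees of freedom
    cannot command an arbitrary instantaneous acceleration: it is
    underactuated."""
    return num_actuators < num_dof

# Five-joint arm bolted to the table, five motors: fully actuated.
assert not is_underactuated(num_dof=5, num_actuators=5)

# Take away one of those motors and the same arm is underactuated.
assert is_underactuated(num_dof=5, num_actuators=4)

# Schematic jumping human: in the air, no motor connects the
# center of mass to the ground, so the counts are unequal.
assert is_underactuated(num_dof=6, num_actuators=3)
```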
link |
02:09:03.180
Now, why on earth?
link |
02:09:04.940
I have a good friend who likes to tease me.
link |
02:09:07.460
He said, Russ, if you had more research funding,
link |
02:09:09.500
would you work on fully actuated robots?
link |
02:09:11.740
Yeah.
link |
02:09:12.580
And the answer is no.
link |
02:09:15.180
The world gives us underactuated robots,
link |
02:09:17.420
whether we like it or not.
link |
02:09:18.460
I'm a human.
link |
02:09:19.860
I'm an underactuated robot,
link |
02:09:21.500
even though I have more muscles
link |
02:09:23.540
than my degrees of freedom,
link |
02:09:25.220
because I have in some places
link |
02:09:27.740
multiple muscles attached to the same joint.
link |
02:09:30.820
But still, there's a really important degree of freedom
link |
02:09:33.900
that I have, which is the location of my center of mass
link |
02:09:37.140
in space, for instance.
link |
02:09:39.580
All right, I can jump into the air,
link |
02:09:42.500
and there's no motor that connects my center of mass
link |
02:09:45.220
to the ground in that case.
link |
02:09:47.220
So I have to think about the implications
link |
02:09:49.420
of not having control over everything.
link |
02:09:52.740
The passive dynamic walkers are the extreme view of that,
link |
02:09:56.540
where you've taken away all the motors,
link |
02:09:57.860
and you have to let physics do the work.
link |
02:09:59.980
But it shows up in all of the walking robots,
link |
02:10:02.220
where you have to use some of the actuators
link |
02:10:04.540
to push and pull even the degrees of freedom
link |
02:10:06.980
that you don't have an actuator on.
link |
02:10:09.980
That's referring to walking if you're falling forward.
link |
02:10:13.140
Is there a way to walk that's fully actuated?
link |
02:10:16.260
So it's a subtle point.
link |
02:10:18.340
When you're in contact and you have your feet on the ground,
link |
02:10:23.940
there are still limits to what you can do, right?
link |
02:10:26.540
Unless I have suction cups on my feet,
link |
02:10:29.140
I cannot accelerate my center of mass towards the ground
link |
02:10:32.620
faster than gravity,
link |
02:10:33.780
because I can't get a force pushing me down, right?
link |
02:10:37.420
But I can still do most of the things that I want to.
link |
02:10:39.420
So you can get away with basically thinking of the system
link |
02:10:42.460
as fully actuated,
link |
02:10:43.420
unless you suddenly needed to accelerate down super fast.
link |
02:10:47.460
But as soon as I take a step,
link |
02:10:49.260
I get into the more nuanced territory,
link |
02:10:52.980
and to get to really dynamic robots,
link |
02:10:55.780
or airplanes or other things,
link |
02:10:59.220
I think you have to embrace the underactuated dynamics.
link |
02:11:02.620
Manipulation, people think, is manipulation underactuated?
link |
02:11:06.940
Even if my arm is fully actuated, I have a motor,
link |
02:11:10.580
if my goal is to control the position and orientation
link |
02:11:14.260
of this cup, then I don't have an actuator
link |
02:11:18.460
for that directly.
link |
02:11:19.300
So I have to use my actuators over here
link |
02:11:21.100
to control this thing.
link |
02:11:23.380
Now it gets even worse,
link |
02:11:24.340
like what if I have to button my shirt, okay?
link |
02:11:29.300
What are the degrees of freedom of my shirt, right?
link |
02:11:31.340
Suddenly, that's a hard question to think about.
link |
02:11:34.540
It kind of makes me queasy
link |
02:11:36.740
thinking about my state space control ideas.
link |
02:11:40.740
But actually those are the problems
link |
02:11:41.820
that make me so excited about manipulation right now,
link |
02:11:44.540
is that it breaks some of the,
link |
02:11:48.020
it breaks a lot of the foundational control stuff
link |
02:11:50.060
that I've been thinking about.
link |
02:11:51.420
Is there, what are some interesting insights
link |
02:11:54.580
you could say about trying to solve an underactuated,
link |
02:11:58.060
a control in an underactuated system?
link |
02:12:02.380
So I think the philosophy there
link |
02:12:04.820
is let physics do more of the work.
link |
02:12:08.460
The technical approach has been optimization.
link |
02:12:12.180
So you typically formulate your decision making
link |
02:12:14.260
for control as an optimization problem.
link |
02:12:17.140
And you use the language of optimal control
link |
02:12:19.420
and sometimes often numerical optimal control
link |
02:12:22.780
in order to make those decisions and balance,
link |
02:12:26.620
these complicated equations of motion.
link |
02:12:29.100
and in order to control,
link |
02:12:30.900
you don't have to use optimal control
link |
02:12:33.140
to do underactuated systems,
link |
02:12:34.900
but that has been the technical approach
link |
02:12:36.340
that has borne the most fruit in our,
link |
02:12:39.100
at least in our line of work.
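As a minimal sketch of that optimization view, here is textbook discrete-time LQR on a double integrator, assuming NumPy and SciPy are available (the system, cost matrices, and numbers are illustrative, not anything from the episode):

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Double integrator: a point mass pushed by one force input.
# State x = [position, velocity], discretized with a small timestep.
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])

# Quadratic cost trading off state error against actuator effort.
Q = np.eye(2)
R = np.array([[0.1]])

# Solving the discrete algebraic Riccati equation gives the
# optimal feedback gain K for the policy u = -K x.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

# Simulate the closed loop: the state is regulated toward zero.
x = np.array([[1.0], [0.0]])
for _ in range(2000):
    x = (A - B @ K) @ x
assert abs(x[0, 0]) < 1e-2  # position driven near the origin
```

The same decision-making framing scales up: swap the linear model for nonlinear dynamics and the Riccati solve for a numerical trajectory optimizer, and you have the kind of optimal control pipeline described here.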
link |
02:12:40.900
And there's some, so in underactuated systems,
link |
02:12:44.060
when you say let physics do some of the work,
link |
02:12:46.820
so there's a kind of feedback loop
link |
02:12:50.380
that observes the state that the physics brought you to.
link |
02:12:54.540
So there's a perception there,
link |
02:12:57.780
there's a feedback somehow.
link |
02:13:00.420
Do you ever loop in like complicated perception systems
link |
02:13:05.420
into this whole picture?
link |
02:13:06.900
Right, right around the time of the DARPA challenge,
link |
02:13:09.620
we had a complicated perception system
link |
02:13:11.340
in the DARPA challenge.
link |
02:13:12.700
We also started to embrace perception
link |
02:13:15.580
for our flying vehicles at the time.
link |
02:13:17.340
We had a really good project
link |
02:13:20.100
on trying to make airplanes fly
link |
02:13:21.820
at high speeds through forests.
link |
02:13:24.780
Sertac Karaman was on that project
link |
02:13:27.460
and we had, it was a really fun team to work on.
link |
02:13:30.700
He's carried it farther, much farther forward since then.
link |
02:13:34.220
And that's using cameras for perception?
link |
02:13:35.980
So that was using cameras.
link |
02:13:37.580
That was, at the time we felt like LIDAR
link |
02:13:40.300
was too heavy and too power heavy
link |
02:13:44.860
to be carried on a light UAV,
link |
02:13:47.740
and we were using cameras.
link |
02:13:49.220
And that was a big part of it was just
link |
02:13:50.660
how do you do even stereo matching
link |
02:13:53.100
at a fast enough rate with a small camera,
link |
02:13:56.460
small onboard compute.
link |
02:13:58.620
Since then we have now,
link |
02:14:00.700
so the deep learning revolution
link |
02:14:02.140
unquestionably changed what we can do
link |
02:14:05.540
with perception for robotics and control.
link |
02:14:09.020
So in manipulation, we can address,
link |
02:14:11.020
we can use perception in I think a much deeper way.
link |
02:14:14.660
And we get into not only,
link |
02:14:17.340
I think the first use of it naturally
link |
02:14:19.820
would be to ask your deep learning system
link |
02:14:22.940
to look at the cameras and produce the state,
link |
02:14:25.980
which is like the pose of my thing, for instance.
link |
02:14:28.900
But I think we've quickly found out
link |
02:14:30.460
that that's not always the right thing to do.
link |
02:14:34.460
Why is that?
link |
02:14:35.620
Because what's the state of my shirt?
link |
02:14:38.420
Imagine, I've always,
link |
02:14:39.740
Very noisy, you mean, or?
link |
02:14:41.300
If the first step of me trying to button my shirt
link |
02:14:46.140
is estimate the full state of my shirt,
link |
02:14:48.580
including like what's happening in the back here,
link |
02:14:50.460
whatever, whatever.
link |
02:14:51.820
That's just not the right specification.
link |
02:14:55.780
There are aspects of the state
link |
02:14:57.500
that are very important to the task.
link |
02:15:00.260
There are many that are unobservable
link |
02:15:03.220
and not important to the task.
link |
02:15:05.860
So you really need,
link |
02:15:06.940
it begs new questions about state representation.
link |
02:15:11.100
Another example that we've been playing with in lab
link |
02:15:13.100
has been just the idea of chopping onions, okay?
link |
02:15:17.660
Or carrots, turns out to be better.
link |
02:15:20.540
So onions stink up the lab.
link |
02:15:22.500
And they're hard to see in a camera.
link |
02:15:26.220
But so,
link |
02:15:27.900
Details matter, yeah.
link |
02:15:28.740
Details matter, you know?
link |
02:15:30.180
So if I'm moving around a particular object, right?
link |
02:15:35.220
Then I think about,
link |
02:15:36.060
oh, it's got a position or an orientation in space.
link |
02:15:38.020
That's the description I want.
link |
02:15:39.780
Now, when I'm chopping an onion, okay?
link |
02:15:42.300
Like the first chop comes down.
link |
02:15:44.260
I have now a hundred pieces of onion.
link |
02:15:48.420
Does my control system really need to understand
link |
02:15:50.300
the position and orientation and even the shape
link |
02:15:52.660
of the hundred pieces of onion in order to make a decision?
link |
02:15:56.100
Probably not, you know?
link |
02:15:56.940
And if I keep going, I'm just getting,
link |
02:15:58.900
more and more pieces. Is my state space getting bigger as I cut?
link |
02:16:04.740
It's not, right?
link |
02:16:06.020
So somehow there's a,
link |
02:16:08.100
I think there's a richer idea of state.
link |
02:16:13.100
It's not the state that is given to us
link |
02:16:15.740
by Lagrangian mechanics.
link |
02:16:17.180
There is a proper Lagrangian state of the system,
link |
02:16:21.340
but the relevant state for this is some latent state
link |
02:16:26.460
is what we call it in machine learning.
link |
02:16:28.540
But, you know, there's some different state representation.
link |
02:16:32.180
Some compressed representation, some.
link |
02:16:35.020
And that's why I worry about saying compressed,
link |
02:16:37.260
because it doesn't,
link |
02:16:38.260
I don't mind that it's low dimensional or not,
link |
02:16:43.020
but it has to be something that's easier to think about.
link |
02:16:46.260
By us humans.
link |
02:16:48.460
Or my algorithms.
link |
02:16:49.300
Or the algorithms being like control, optimal.
link |
02:16:53.860
So for instance, if the contact mechanics
link |
02:16:56.540
of all of those onion pieces and all the permutations
link |
02:16:59.660
of possible touches between those onion pieces,
link |
02:17:02.540
you know, you can give me
link |
02:17:03.620
a high dimensional state representation,
link |
02:17:05.100
I'm okay if it's linear.
link |
02:17:06.780
But if I have to think about all the possible
link |
02:17:08.660
shattering combinatorics of that,
link |
02:17:11.700
then my robot's gonna sit there thinking
link |
02:17:13.860
and the soup's gonna get cold or something.
link |
02:17:17.380
So since you taught the course,
link |
02:17:20.100
it kind of entered my mind,
link |
02:17:22.740
the idea of underactuated as really compelling
link |
02:17:25.980
to see the world in this kind of way.
link |
02:17:29.540
Do you ever, you know, if we talk about onions
link |
02:17:32.420
or you talk about the world with people in it in general,
link |
02:17:35.480
do you see the world as basically an underactuated system?
link |
02:17:39.980
Do you like often look at the world in this way?
link |
02:17:42.380
Or is this overreach?
link |
02:17:47.040
Underactuated is a way of life, man.
link |
02:17:49.160
Exactly, I guess that's what I'm asking.
link |
02:17:53.560
I do think it's everywhere.
link |
02:17:54.960
I think in some places,
link |
02:17:58.840
we already have natural tools to deal with it.
link |
02:18:01.380
You know, it rears its head.
link |
02:18:02.480
I mean, in linear systems, it's not a problem.
link |
02:18:04.280
We just, like an underactuated linear system
link |
02:18:07.340
is really not sufficiently distinct
link |
02:18:09.000
from a fully actuated linear system.
link |
02:18:10.760
It's a subtle point about when that becomes a bottleneck
link |
02:18:15.600
in what we know how to do with control.
link |
02:18:17.220
It happens to be a bottleneck,
link |
02:18:19.800
although we've gotten incredibly good solutions now,
link |
02:18:22.500
but for a long time that I felt
link |
02:18:24.200
that that was the key bottleneck in legged robots.
link |
02:18:27.100
And roughly now the underactuated course
link |
02:18:29.200
is me trying to tell people everything I can
link |
02:18:33.840
about how to make Atlas do a backflip, right?
link |
02:18:38.500
I have a second course now
link |
02:18:39.920
that I teach in the other semesters,
link |
02:18:41.280
which is on manipulation.
link |
02:18:43.600
And that's where we get into now more of the,
link |
02:18:45.840
that's a newer class.
link |
02:18:47.160
I'm hoping to put it online this fall completely.
link |
02:18:51.600
And that's gonna have much more aspects
link |
02:18:53.700
about these perception problems
link |
02:18:55.460
and the state representation questions,
link |
02:18:57.200
and then how do you do control.
link |
02:18:59.260
And the thing that's a little bit sad is that,
link |
02:19:04.040
for me at least, there are a lot of manipulation tasks
link |
02:19:07.480
that people wanna do and should wanna do.
link |
02:19:09.280
They could start a company with it and be very successful
link |
02:19:12.740
that don't actually require you to think that much
link |
02:19:15.600
about underactuation, or dynamics at all even,
link |
02:19:18.040
but certainly underactuated dynamics.
link |
02:19:20.020
Once I have, if I reach out and grab something,
link |
02:19:23.100
if I can sort of assume it's rigidly attached to my hand,
link |
02:19:25.720
then I can do a lot of interesting,
link |
02:19:26.920
meaningful things with it
link |
02:19:28.800
without really ever thinking about the dynamics
link |
02:19:30.960
of that object.
link |
02:19:32.860
So we've built systems that kind of reduce the need for that.
link |
02:19:37.860
Enveloping grasps and the like.
link |
02:19:40.780
But I think the really good problems in manipulation.
link |
02:19:43.060
So manipulation, by the way, is more than just pick and place.
link |
02:19:48.540
That's like a lot of people think of that, just grasping.
link |
02:19:51.780
I don't mean that.
link |
02:19:52.620
I mean buttoning my shirt, I mean tying shoelaces.
link |
02:19:56.500
How do you program a robot to tie shoelaces?
link |
02:19:59.060
And not just one shoe, but every shoe, right?
link |
02:20:02.860
That's a really good problem.
link |
02:20:05.580
It's tempting to write down like the infinite dimensional
link |
02:20:08.420
state of the laces, that's probably not needed
link |
02:20:13.180
to write a good controller.
link |
02:20:15.100
I know we could hand design a controller that would do it,
link |
02:20:18.340
but I don't want that.
link |
02:20:19.180
I want to understand the principles that would allow me
link |
02:20:22.460
to solve another problem that's kind of like that.
link |
02:20:25.380
But I think if we can stay pure in our approach,
link |
02:20:29.820
then the challenge of tying anybody's shoes
link |
02:20:33.820
is a great challenge.
link |
02:20:36.300
That's a great challenge.
link |
02:20:37.220
I mean, and the soft touch comes into play there.
link |
02:20:40.940
That's really interesting.
link |
02:20:43.100
Let me ask another ridiculous question on this topic.
link |
02:20:47.500
How important is touch?
link |
02:20:49.780
We haven't talked much about humans,
link |
02:20:52.300
but I have this argument with my dad
link |
02:20:56.220
where like I think you can fall in love with a robot
link |
02:20:59.620
based on language alone.
link |
02:21:02.580
And he believes that touch is essential.
link |
02:21:06.460
Touch and smell, he says.
link |
02:21:07.660
But so in terms of robots, connecting with humans,
link |
02:21:17.380
we can go philosophical in terms of like a deep,
link |
02:21:19.660
meaningful connection, like love,
link |
02:21:21.820
but even just like collaborating in an interesting way,
link |
02:21:25.580
how important is touch like from an engineering perspective
link |
02:21:30.580
and a philosophical one?
link |
02:21:32.780
I think it's super important.
link |
02:21:35.700
Even just in a practical sense,
link |
02:21:37.020
if we forget about the emotional part of it.
link |
02:21:40.700
But for robots to interact safely
link |
02:21:43.300
while they're doing meaningful mechanical work
link |
02:21:47.220
in the close contact with or vicinity of people
link |
02:21:52.420
that need help, I think we have to have them,
link |
02:21:55.220
we have to build them differently.
link |
02:21:57.500
They have to be not afraid of touching the world.
link |
02:21:59.860
So I think Baymax is just awesome.
link |
02:22:02.820
That's just like the movie of Big Hero 6
link |
02:22:06.260
and the concept of Baymax, that's just awesome.
link |
02:22:08.700
I think we should, and we have some folks
link |
02:22:13.060
at Toyota Research
link |
02:22:14.420
that are trying to build Baymax, roughly.
link |
02:22:16.860
And I think it's just a fantastically good project.
link |
02:22:21.900
I think it will change the way people physically interact.
link |
02:22:25.620
The same way, I mean, you gave a couple examples earlier,
link |
02:22:27.980
but if the robot that was walking around my home
link |
02:22:31.940
looked more like a teddy bear
link |
02:22:33.980
and a little less like the Terminator,
link |
02:22:35.980
that could change completely the way people perceive it
link |
02:22:38.900
and interact with it.
link |
02:22:39.820
And maybe they'll even wanna teach it, like you said, right?
link |
02:22:44.340
You could not quite gamify it,
link |
02:22:47.660
but somehow instead of people judging it
link |
02:22:50.060
and looking at it as if it's not doing as well as a human,
link |
02:22:54.340
they're gonna try to help out the cute teddy bear, right?
link |
02:22:57.060
Who knows, but I think we're building robots wrong
link |
02:23:01.260
and being more soft and more contact is important, right?
link |
02:23:07.780
Yeah, I mean, like all the magical moments
link |
02:23:09.860
I can remember with robots,
link |
02:23:12.380
well, first of all, just visiting your lab and seeing Atlas,
link |
02:23:16.900
but also Spotmini, when I first saw Spotmini in person
link |
02:23:21.660
and hung out with him, her, it,
link |
02:23:26.260
I don't have trouble gendering robots.
link |
02:23:28.380
I feel like robotics people really say, oh, it's an it?
link |
02:23:31.500
I kinda like the idea that it's a her or a him.
link |
02:23:35.780
There's a magical moment, but there's no touching.
link |
02:23:38.780
I guess the question I have, have you ever been,
link |
02:23:41.620
like, have you had a human robot experience
link |
02:23:44.940
where a robot touched you?
link |
02:23:49.580
And like, it was like, wait,
link |
02:23:51.660
like, was there a moment that you've forgotten
link |
02:23:53.980
that a robot is a robot and like,
link |
02:23:57.740
the anthropomorphization stepped in
link |
02:24:00.820
and for a second you forgot that it's not human?
link |
02:24:04.900
I mean, I think when you're in on the details,
link |
02:24:07.820
then we, of course, anthropomorphized our work with Atlas,
link |
02:24:12.380
but in verbal communication and the like,
link |
02:24:17.100
I think we were pretty aware of it
link |
02:24:18.980
as a machine that needed to be respected.
link |
02:24:21.740
And I actually, I worry more about the smaller robots
link |
02:24:26.260
that could still move quickly if programmed wrong
link |
02:24:29.540
and we have to be careful actually
link |
02:24:31.660
about safety and the like right now.
link |
02:24:33.740
And that, if we build our robots correctly,
link |
02:24:36.380
I think then those, a lot of those concerns could go away.
link |
02:24:40.300
And we're seeing that trend.
link |
02:24:41.260
We're seeing the lower cost, lighter weight arms now
link |
02:24:44.100
that could be fundamentally safe.
link |
02:24:46.740
I mean, I do think touch is so fundamental.
link |
02:24:49.060
Ted Adelson is great.
link |
02:24:51.100
He's a perceptual scientist at MIT
link |
02:24:55.740
and he studied vision most of his life.
link |
02:24:58.180
And he said, when I had kids,
link |
02:25:01.220
I expected to be fascinated by their perceptual development.
link |
02:25:06.380
But what he noticed, what felt
link |
02:25:09.260
more impressive, more dominant,
link |
02:25:10.780
was the way that they would touch everything
link |
02:25:13.060
and lick everything.
link |
02:25:13.900
And pick things up, stick it on their tongue and whatever.
link |
02:25:16.900
And he said, watching his daughter convinced him
link |
02:25:22.180
that actually he needed to study tactile sensing more.
link |
02:25:25.580
So there's something very important.
link |
02:25:30.580
I think it's a little bit also of the passive
link |
02:25:32.780
versus active part of the world, right?
link |
02:25:35.660
You can passively perceive the world.
link |
02:25:38.460
But it's fundamentally different if you can do an experiment
link |
02:25:41.460
and if you can change the world
link |
02:25:43.340
and you can learn a lot more than a passive observer.
link |
02:25:47.460
So you can in dialogue, that was your initial example,
link |
02:25:51.500
you could have an active experiment exchange.
link |
02:25:54.580
But I think if you're just a camera watching YouTube,
link |
02:25:57.460
I think that's a very different problem
link |
02:26:00.380
than if you're a robot that can apply force.
link |
02:26:03.700
And I think that's a very different problem
link |
02:26:05.900
than if you're a robot that can apply force and touch.
link |
02:26:13.260
I think it's important.
link |
02:26:15.540
Yeah, I think it's just an exciting area of research.
link |
02:26:18.020
I think you're probably right
link |
02:26:19.260
that this has been under researched.
link |
02:26:23.900
To me as a person who's captivated
link |
02:26:25.780
by the idea of human robot interaction,
link |
02:26:27.820
it feels like such a rich opportunity to explore touch.
link |
02:26:34.140
Not even from a safety perspective,
link |
02:26:35.860
but like you said, the emotional too.
link |
02:26:38.060
I mean, safety comes first,
link |
02:26:41.220
but the next step is like a real human connection.
link |
02:26:48.300
Even in the industrial setting,
link |
02:26:51.380
it just feels like it's nice for the robot.
link |
02:26:55.540
I don't know, you might disagree with this,
link |
02:26:58.060
but because I think it's important
link |
02:27:01.220
to see robots as tools often,
link |
02:27:04.340
but I don't know,
link |
02:27:06.060
I think they're just always going to be more effective
link |
02:27:08.540
once you humanize them.
link |
02:27:11.700
Like it's convenient now to think of them as tools
link |
02:27:14.340
because we want to focus on the safety,
link |
02:27:16.140
but I think ultimately to create like a good experience
link |
02:27:22.300
for the worker, for the person,
link |
02:27:24.860
there has to be a human element.
link |
02:27:27.980
I don't know, for me,
link |
02:27:30.140
it feels like an industrial robotic arm
link |
02:27:33.140
would be better if it has a human element.
link |
02:27:34.860
I think like Rethink Robotics had that idea
link |
02:27:37.060
with the Baxter and having eyes and so on,
link |
02:27:40.260
having, I don't know, I'm a big believer in that.
link |
02:27:45.220
It's not my area, but I am also a big believer.
link |
02:27:49.300
Do you have an emotional connection to Atlas?
link |
02:27:51.900
Like do you miss him?
link |
02:27:54.940
I mean, yes, I don't know if more so
link |
02:27:59.940
than if I had a different science project
link |
02:28:01.620
that I'd worked on super hard, right?
link |
02:28:03.420
But yeah, I mean, the robot,
link |
02:28:09.900
we basically had to do heart surgery on the robot
link |
02:28:11.780
in the final competition because we melted the core.
link |
02:28:18.380
Yeah, there was something about watching that robot
link |
02:28:20.140
hanging there.
link |
02:28:20.980
We knew we had to compete with it in an hour
link |
02:28:22.540
and it was getting its guts ripped out.
link |
02:28:25.260
Those are all historic moments.
link |
02:28:27.460
I think if you look back like a hundred years from now,
link |
02:28:32.140
yeah, I think those are important moments in robotics.
link |
02:28:35.140
I mean, these are the early days.
link |
02:28:36.660
You look at like the early days
link |
02:28:37.980
of a lot of scientific disciplines.
link |
02:28:39.500
They look ridiculous, they're full of failure,
link |
02:28:42.020
but it feels like robotics will be important
link |
02:28:45.060
in the coming a hundred years.
link |
02:28:48.940
And these are the early days.
link |
02:28:50.860
So I think a lot of people are,
link |
02:28:54.420
look at a brilliant person such as yourself
link |
02:28:57.900
and are curious about the intellectual journey they've took.
link |
02:29:01.740
Is there maybe three books, technical, fiction,
link |
02:29:06.260
philosophical that had a big impact on your life
link |
02:29:10.540
that you would recommend perhaps others reading?
link |
02:29:15.260
Yeah, so I actually didn't read that much as a kid,
link |
02:29:18.460
but I read fairly voraciously now.
link |
02:29:21.260
There are some recent books that if you're interested
link |
02:29:24.940
in this kind of topic, like AI Superpowers by Kai Fu Lee
link |
02:29:29.940
is just a fantastic read.
link |
02:29:31.660
You must read that.
link |
02:29:35.100
Yuval Harari is just, I think that can open your mind.
link |
02:29:40.500
Sapiens.
link |
02:29:41.580
Sapiens is the first one, Homo Deus is the second, yeah.
link |
02:29:51.060
We mentioned The Black Swan by Taleb.
link |
02:29:53.500
I think that's a good sort of mind opener.
link |
02:29:57.220
I actually, so there's maybe a more controversial
link |
02:30:04.420
recommendation I could give.
link |
02:30:06.220
Great, we love controversy.
link |
02:30:08.740
In some sense, it's so classical it might surprise you,
link |
02:30:11.580
but I actually recently read Mortimer Adler's
link |
02:30:16.020
How to Read a Book, not so long, it was a while ago,
link |
02:30:19.020
but some people hate that book.
link |
02:30:23.220
I loved it.
link |
02:30:24.820
I think we're in this time right now where,
link |
02:30:30.860
boy, we're just inundated with research papers
link |
02:30:33.780
that you could read on archive with limited peer review
link |
02:30:38.580
and just this wealth of information.
link |
02:30:40.980
I don't know, I think the passion of what you can get
link |
02:30:46.460
out of a book, a really good book or a really good paper
link |
02:30:49.460
if you find it, the attitude, the realization
link |
02:30:52.220
that you're only gonna find a few that really
link |
02:30:54.220
are worth all your time, but then once you find them,
link |
02:30:58.300
you should just dig in and understand it very deeply
link |
02:31:02.660
and it's worth marking it up and having the hard copy
link |
02:31:07.660
writing notes in the side margins.
link |
02:31:11.340
I think that was really, I read it at the right time
link |
02:31:16.340
where I was just feeling just overwhelmed
link |
02:31:19.260
with really low quality stuff, I guess.
link |
02:31:23.780
And similarly, I'm just giving more than three now,
link |
02:31:28.780
I'm sorry if I've exceeded my quota.
link |
02:31:31.460
But on that topic just real quick is,
link |
02:31:34.140
so basically finding a few companions to keep
link |
02:31:38.140
for the rest of your life in terms of papers and books
link |
02:31:41.340
and so on and those are the ones,
link |
02:31:44.140
like not doing, what is it, FOMO, fear of missing out,
link |
02:31:48.900
constantly trying to update yourself,
link |
02:31:50.820
but really deeply making a life journey
link |
02:31:53.700
of studying a particular paper, essentially a set of papers.
link |
02:31:57.500
Yeah, I think when you really start to understand
link |
02:32:02.500
when you really find something,
link |
02:32:06.100
which a book that resonates with you
link |
02:32:07.780
might not be the same book that resonates with me,
link |
02:32:10.420
but when you really find one that resonates with you,
link |
02:32:13.180
I think the dialogue that happens and that's what,
link |
02:32:16.260
I loved what Adler was saying, I think Socrates and Plato
link |
02:32:20.140
say the written word is never gonna capture
link |
02:32:25.740
the beauty of dialogue, right?
link |
02:32:28.020
But Adler says, no, no, a really good book
link |
02:32:33.100
is a dialogue between you and the author
link |
02:32:35.380
and it crosses time and space and I don't know,
link |
02:32:39.180
I think it's a very romantic,
link |
02:32:40.740
there's a bunch of like specific advice,
link |
02:32:42.740
which you can just gloss over,
link |
02:32:44.380
but the romantic view of how to read
link |
02:32:47.260
and really appreciate it is so good.
link |
02:32:52.140
And similarly, teaching,
link |
02:32:53.900
yeah, I thought a lot about teaching
link |
02:32:58.820
and so Isaac Asimov, great science fiction writer,
link |
02:33:03.300
has also actually spent a lot of his career
link |
02:33:05.340
writing nonfiction, right?
link |
02:33:07.260
His memoir is fantastic.
link |
02:33:09.940
He was passionate about explaining things, right?
link |
02:33:12.740
He wrote all kinds of books
link |
02:33:13.700
on all kinds of topics in science.
link |
02:33:16.100
He was known as the great explainer
link |
02:33:17.740
and I do really resonate with his style
link |
02:33:22.340
and just his way of talking about,
link |
02:33:28.420
how communicating and explaining something
link |
02:33:30.540
is really the way that you learn something.
link |
02:33:32.540
I think about problems very differently
link |
02:33:36.260
because of the way I've been given the opportunity
link |
02:33:39.220
to teach them at MIT.
link |
02:33:42.140
Having questions asked, the fear of the lecture,
link |
02:33:45.500
the experience of the lecture
link |
02:33:47.700
and the questions I get and the interactions
link |
02:33:50.220
just forces me to be rock solid on these ideas
link |
02:33:53.140
in a way that if I didn't have that,
link |
02:33:55.060
I don't know, I would be in a different intellectual space.
link |
02:33:58.260
Also, video, does that scare you
link |
02:34:00.420
that your lectures are online
link |
02:34:02.140
and people like me in sweatpants can sit sipping coffee
link |
02:34:05.460
and watch you give lectures?
link |
02:34:08.260
I think it's great.
link |
02:34:09.980
I do think that something's changed right now,
link |
02:34:12.820
which is, right now we're giving lectures over Zoom.
link |
02:34:16.900
I mean, giving seminars over Zoom and everything.
link |
02:34:21.260
I'm trying to figure out, I think it's a new medium.
link |
02:34:24.380
I'm trying to figure out how to exploit it.
link |
02:34:28.020
Yeah, I've been quite cynical
link |
02:34:34.500
about human to human connection over that medium,
link |
02:34:39.820
but I think that's because it hasn't been explored fully
link |
02:34:43.420
and teaching is a different thing.
link |
02:34:45.780
Every lecture is a, I'm sorry, every seminar even,
link |
02:34:49.100
I think every talk I give is an opportunity
link |
02:34:53.460
to give that differently.
link |
02:34:54.980
I can deliver content directly into your browser.
link |
02:34:57.940
You have a WebGL engine right there.
link |
02:35:00.020
I can throw 3D content into your browser
link |
02:35:04.900
while you're listening to me, right?
link |
02:35:06.900
And I can assume that you have at least
link |
02:35:10.020
a powerful enough laptop or something to watch Zoom
link |
02:35:13.020
while I'm doing that, while I'm giving a lecture.
link |
02:35:15.460
That's a new communication tool
link |
02:35:18.060
that I didn't have last year, right?
link |
02:35:19.980
And I think robotics can potentially benefit a lot
link |
02:35:24.180
from teaching that way.
link |
02:35:26.420
We'll see, it's gonna be an experiment this fall.
link |
02:35:28.180
It's interesting.
link |
02:35:29.020
I'm thinking a lot about it.
link |
02:35:30.340
Yeah, and also like the length of lectures
link |
02:35:35.580
or the length of like, there's something,
link |
02:35:38.820
so like I guarantee you, it's like 80% of people
link |
02:35:42.900
who started listening to our conversation
link |
02:35:44.900
are still listening now, which is crazy to me.
link |
02:35:48.180
But so there's a patience and interest
link |
02:35:51.140
in long form content, but at the same time,
link |
02:35:53.540
there's a magic to forcing yourself to condense
link |
02:35:57.940
an idea to as short as possible.
link |
02:36:02.740
As short as possible, like clip,
link |
02:36:04.660
it can be a part of a longer thing,
link |
02:36:06.180
but like just like really beautifully condense an idea.
link |
02:36:09.620
There's a lot of opportunity there
link |
02:36:11.900
that's easier to do in remote with, I don't know,
link |
02:36:17.500
with editing too.
link |
02:36:19.020
Editing is an interesting thing.
link |
02:36:20.980
Like what, you know, most professors don't get,
link |
02:36:25.020
when they give a lecture,
link |
02:36:25.860
they don't get to go back and edit out parts,
link |
02:36:28.220
like crisp it up a little bit.
link |
02:36:31.580
That's also, it can do magic.
link |
02:36:34.180
Like if you remove like five to 10 minutes
link |
02:36:37.620
from an hour lecture, it can actually,
link |
02:36:41.140
it can make something special of a lecture.
link |
02:36:43.220
I've seen that in myself and in others too,
link |
02:36:47.860
because I edit other people's lectures to extract clips.
link |
02:36:50.580
It's like, there's certain tangents that are like,
link |
02:36:52.740
that lose, they're not interesting.
link |
02:36:54.420
They're mumbling, they're just not,
link |
02:36:57.180
they're not clarifying, they're not helpful at all.
link |
02:36:59.780
And once you remove them, it's just, I don't know.
link |
02:37:02.820
Editing can be magic.
link |
02:37:04.580
It takes a lot of time.
link |
02:37:05.900
Yeah, it takes, it depends like what is teaching,
link |
02:37:08.940
you have to ask.
link |
02:37:09.780
Yeah, yeah.
link |
02:37:13.100
Cause I find the editing process is also beneficial
link |
02:37:18.020
for teaching, but also for your own learning.
link |
02:37:21.620
I don't know if, have you watched yourself?
link |
02:37:23.740
Yeah, sure.
link |
02:37:24.780
Have you watched those videos?
link |
02:37:26.180
I mean, not all of them.
link |
02:37:27.900
It could be painful to see like how to improve.
link |
02:37:33.340
So do you find that, I know you segment your podcast.
link |
02:37:37.180
Do you think that helps people with the,
link |
02:37:40.740
the attention span aspect of it?
link |
02:37:42.220
Or is it the segment like sections like,
link |
02:37:44.220
yeah, we're talking about this topic, whatever.
link |
02:37:46.380
Nope, nope, that just helps me.
link |
02:37:48.260
It's actually bad.
link |
02:37:49.420
So, and you've been incredible.
link |
02:37:53.820
So I'm learning, like I'm afraid of conversation.
link |
02:37:56.420
This is even today, I'm terrified of talking to you.
link |
02:37:59.180
I mean, it's something I'm trying to remove for myself.
link |
02:38:04.180
There's a guy, I mean, I've learned from a lot of people,
link |
02:38:07.420
but really there's been a few people
link |
02:38:10.740
who've been inspirational to me in terms of conversation.
link |
02:38:14.100
Whatever people think of him,
link |
02:38:15.700
Joe Rogan has been inspirational to me
link |
02:38:17.500
because comedians have been too.
link |
02:38:20.500
Being able to just have fun and enjoy themselves
link |
02:38:23.300
and lose themselves in conversation
link |
02:38:25.580
that requires you to be a great storyteller,
link |
02:38:28.820
to be able to pull a lot of different pieces
link |
02:38:31.500
of information together.
link |
02:38:32.820
But mostly just to enjoy yourself in conversations.
link |
02:38:36.500
And I'm trying to learn that.
link |
02:38:38.060
These notes are, you see me looking down.
link |
02:38:41.660
That's like a safety blanket
link |
02:38:43.020
that I'm trying to let go of more and more.
link |
02:38:45.260
Cool.
link |
02:38:46.260
So that's, people love just regular conversation.
link |
02:38:49.420
That's what they, the structure is like, whatever.
link |
02:38:52.660
I would say, I would say maybe like 10 to like,
link |
02:38:57.620
so there's a bunch of, you know,
link |
02:38:59.820
there's probably a couple of thousand PhD students
link |
02:39:03.820
listening to this right now, right?
link |
02:39:06.980
And they might know what we're talking about.
link |
02:39:09.540
But there is somebody, I guarantee you right now,
link |
02:39:13.460
in Russia, some kid who's just like,
link |
02:39:16.580
who's just smoked some weed, is sitting back
link |
02:39:19.380
and just enjoying the hell out of this conversation.
link |
02:39:22.580
Not really understanding.
link |
02:39:23.860
He kind of watched some Boston Dynamics videos.
link |
02:39:25.980
He's just enjoying it.
link |
02:39:27.300
And I salute you, sir.
link |
02:39:29.300
No, but just like, there's so much variety of people
link |
02:39:32.780
that just have curiosity about engineering,
link |
02:39:35.260
about sciences, about mathematics.
link |
02:39:37.980
And also like, I should, I mean,
link |
02:39:43.940
enjoying it is one thing,
link |
02:39:44.980
but I also often notice it inspires people to,
link |
02:39:49.180
there's a lot of people who are like
link |
02:39:50.860
in their undergraduate studies trying to figure out what,
link |
02:39:54.700
trying to figure out what to pursue.
link |
02:39:56.140
And these conversations can really spark
link |
02:39:59.220
the direction of their life.
link |
02:40:01.820
And in terms of robotics, I hope it does,
link |
02:40:03.580
because I'm excited about the possibilities
link |
02:40:06.540
of what robotics brings.
link |
02:40:07.580
On that topic, do you have advice?
link |
02:40:12.580
Like what advice would you give
link |
02:40:14.060
to a young person about life?
link |
02:40:18.260
A young person about life
link |
02:40:19.380
or a young person about life in robotics?
link |
02:40:23.060
It could be in robotics.
link |
02:40:24.380
Robotics, it could be in life in general.
link |
02:40:26.660
It could be career.
link |
02:40:28.460
It could be a relationship advice.
link |
02:40:31.300
It could be running advice.
link |
02:40:32.900
Just like they're, that's one of the things I see,
link |
02:40:36.620
like we talked to like 20 year olds.
link |
02:40:38.620
They're like, how do I do this thing?
link |
02:40:42.500
What do I do?
link |
02:40:45.620
If they come up to you, what would you tell them?
link |
02:40:48.020
I think it's an interesting time to be a kid these days.
link |
02:40:53.980
Everything points to this being sort of a
link |
02:40:57.860
winner-take-all economy and the like.
link |
02:40:59.300
I think the people that will really excel in my opinion
link |
02:41:04.500
are going to be the ones that can think deeply
link |
02:41:06.820
about problems.
link |
02:41:11.180
You have to be able to ask questions agilely
link |
02:41:13.940
and use the internet for everything it's good for
link |
02:41:15.820
and stuff like this.
link |
02:41:16.660
And I think a lot of people will develop those skills.
link |
02:41:19.460
I think the leaders, thought leaders,
link |
02:41:24.820
robotics leaders, whatever,
link |
02:41:26.860
are gonna be the ones that can do more
link |
02:41:29.100
and they can think very deeply and critically.
link |
02:41:32.420
And that's a harder thing to learn.
link |
02:41:35.020
I think one path to learning that is through mathematics,
link |
02:41:38.140
through engineering.
link |
02:41:41.660
I would encourage people to start math early.
link |
02:41:44.180
I mean, I didn't really start.
link |
02:41:46.900
I mean, I was always in the better math classes
link |
02:41:50.460
that I could take,
link |
02:41:51.300
but I wasn't pursuing super advanced mathematics
link |
02:41:54.700
or anything like that until I got to MIT.
link |
02:41:56.700
I think MIT lit me up
link |
02:41:59.020
and really started the life that I'm living now.
link |
02:42:05.580
But yeah, I really want kids to dig deep,
link |
02:42:10.740
really understand things, building things too.
link |
02:42:12.460
I mean, pull things apart, put them back together.
link |
02:42:15.180
Like that's just such a good way
link |
02:42:17.180
to really understand things
link |
02:42:19.980
and expect it to be a long journey, right?
link |
02:42:23.660
It's, you don't have to know everything.
link |
02:42:27.260
You're never gonna know everything.
link |
02:42:29.500
So think deeply and stick with it.
link |
02:42:32.860
Enjoy the ride, but just make sure you're not,
link |
02:42:37.580
yeah, just make sure you're stopping
link |
02:42:40.580
to think about why things work.
link |
02:42:43.180
And it's true, it's easy to lose yourself
link |
02:42:45.420
in the distractions of the world.
link |
02:42:51.180
We're overwhelmed with content right now,
link |
02:42:52.740
but you have to stop and pick some of it
link |
02:42:56.260
and really understand it.
link |
02:42:58.780
Yeah, on the book point,
link |
02:43:00.380
I've read Animal Farm by George Orwell
link |
02:43:04.940
a ridiculous number of times.
link |
02:43:06.100
So for me, like that book,
link |
02:43:07.860
I don't know if it's a good book in general,
link |
02:43:09.780
but for me it connects deeply somehow.
link |
02:43:13.340
It somehow connects, so I was born in the Soviet Union.
link |
02:43:18.260
So it connects me to the entirety of the history
link |
02:43:20.460
of the Soviet Union and to World War II
link |
02:43:23.180
and to the love and hatred and suffering
link |
02:43:26.500
that went on there and the corrupting nature of power
link |
02:43:33.140
and greed and just somehow I just,
link |
02:43:36.340
that book has taught me more about life
link |
02:43:38.100
than like anything else.
link |
02:43:39.380
Even though it's just like a silly childlike book
link |
02:43:42.860
about pigs, I don't know why,
link |
02:43:46.980
it just connects and inspires.
link |
02:43:49.300
The same, there's a few technical books too
link |
02:43:53.780
and algorithms that just, yeah, you return to often.
link |
02:43:58.020
I'm with you.
link |
02:44:01.900
Yeah, there's, and I've been losing that
link |
02:44:04.100
because of the internet.
link |
02:44:05.380
I've been like going on, I've been going on archive
link |
02:44:09.700
and blog posts and GitHub and the new thing
link |
02:44:12.420
and you lose your ability to really master an idea.
link |
02:44:18.100
Right.
link |
02:44:18.940
Wow.
link |
02:44:19.780
Exactly right.
link |
02:44:21.100
What's a fond memory from childhood?
link |
02:44:24.940
When you were baby Russ Tedrake.
link |
02:44:29.540
Well, I guess I just said that at least my current life
link |
02:44:33.940
began when I got to MIT.
link |
02:44:36.780
If I have to go farther than that.
link |
02:44:38.900
Yeah, what was, was there a life before MIT?
link |
02:44:42.260
Oh, absolutely, but let me actually tell you
link |
02:44:47.380
what happened when I first got to MIT
link |
02:44:48.900
because that I think might be relevant here,
link |
02:44:52.220
but I had taken a computer engineering degree at Michigan.
link |
02:44:57.540
I enjoyed it immensely, learned a bunch of stuff.
link |
02:45:00.420
I liked computers, I liked programming,
link |
02:45:04.580
but when I did get to MIT and started working
link |
02:45:07.340
with Sebastian Seung, theoretical physicist,
link |
02:45:10.300
computational neuroscientist, the culture here
link |
02:45:15.180
was just different.
link |
02:45:17.220
It demanded more of me, certainly mathematically
link |
02:45:20.260
and in the critical thinking.
link |
02:45:22.660
And I remember the day that I borrowed one of the books
link |
02:45:27.700
from my advisor's office and walked down
link |
02:45:29.780
to the Charles River and was like,
link |
02:45:32.140
I'm getting my butt kicked.
link |
02:45:36.620
And I think that's gonna happen to everybody
link |
02:45:38.180
who's doing this kind of stuff.
link |
02:45:40.220
I think I expected you to ask me the meaning of life.
link |
02:45:46.020
I think that somehow I think that's gotta be part of it.
link |
02:45:52.780
Doing hard things?
link |
02:45:55.140
Yeah.
link |
02:45:56.460
Did you consider quitting at any point?
link |
02:45:58.220
Did you consider this isn't for me?
link |
02:45:59.780
No, never that.
link |
02:46:01.740
I was working hard, but I was loving it.
link |
02:46:07.180
I think there's this magical thing
link |
02:46:08.860
where I'm lucky to surround myself with people
link |
02:46:11.900
that basically almost every day I'll see something,
link |
02:46:17.900
I'll be told something or something that I realize,
link |
02:46:20.340
wow, I don't understand that.
link |
02:46:22.020
And if I could just understand that,
link |
02:46:24.180
there's something else to learn.
link |
02:46:26.020
That if I could just learn that thing,
link |
02:46:28.140
I would connect another piece of the puzzle.
link |
02:46:30.220
And I think that is just such an important aspect
link |
02:46:36.220
and being willing to understand what you can and can't do
link |
02:46:40.260
and loving the journey of going
link |
02:46:43.580
and learning those other things.
link |
02:46:44.820
I think that's the best part.
link |
02:46:47.340
I don't think there's a better way to end it, Russ.
link |
02:46:51.500
You've been an inspiration to me since I showed up at MIT.
link |
02:46:55.580
Your work has been an inspiration to the world.
link |
02:46:57.700
This conversation was amazing.
link |
02:46:59.740
I can't wait to see what you do next
link |
02:47:01.700
with robotics, home robots.
link |
02:47:03.220
I hope to see you work in my home one day.
link |
02:47:05.780
So thanks so much for talking today, it's been awesome.
link |
02:47:08.100
Cheers.
link |
02:47:09.480
Thanks for listening to this conversation
link |
02:47:11.060
with Russ Tedrake and thank you to our sponsors,
link |
02:47:14.180
Magic Spoon Cereal, BetterHelp and ExpressVPN.
link |
02:47:18.220
Please consider supporting this podcast
link |
02:47:20.180
by going to magicspoon.com slash Lex
link |
02:47:23.420
and using code Lex at checkout.
link |
02:47:25.500
going to betterhelp.com slash lex
link |
02:47:27.780
and signing up at expressvpn.com slash lexpod.
link |
02:47:32.820
Click the links, buy the stuff, get the discount.
link |
02:47:36.180
It really is the best way to support this podcast.
link |
02:47:39.380
If you enjoy this thing, subscribe on YouTube,
link |
02:47:41.520
review it with five stars on Apple Podcast,
link |
02:47:43.700
support it on Patreon or connect with me on Twitter
link |
02:47:46.540
at Lex Fridman spelled somehow without the E
link |
02:47:50.620
just F R I D M A N.
link |
02:47:53.460
And now let me leave you with some words
link |
02:47:55.100
from Neil deGrasse Tyson talking about robots in space
link |
02:47:58.540
and the emphasis we humans put
link |
02:48:00.680
on human based space exploration.
link |
02:48:03.640
Robots are important.
link |
02:48:05.680
If I don my pure scientist hat,
link |
02:48:07.980
I would say just send robots.
link |
02:48:10.020
I'll stay down here and get the data.
link |
02:48:12.340
But nobody's ever given a parade for a robot.
link |
02:48:15.080
Nobody's ever named a high school after a robot.
link |
02:48:17.940
So when I don my public educator hat,
link |
02:48:20.180
I have to recognize the elements of exploration
link |
02:48:22.780
that excite people.
link |
02:48:24.180
It's not only the discoveries and the beautiful photos
link |
02:48:26.980
that come down from the heavens.
link |
02:48:29.020
It's the vicarious participation in discovery itself.
link |
02:48:33.020
Thank you for listening and hope to see you next time.