
Kevin Scott: Microsoft CTO | Lex Fridman Podcast #30



Lex Fridman: The following is a conversation with Kevin Scott, the CTO of Microsoft. Before that, he was the senior vice president of engineering and operations at LinkedIn, and before that, he oversaw mobile ads engineering at Google. He also has a podcast called Behind the Tech with Kevin Scott, which I'm a fan of. This was a fun and wide-ranging conversation that covered many aspects of computing. It happened over a month ago, before the announcement of Microsoft's investment in OpenAI that a few people have asked me about. I'm sure there'll be one or two people in the future that'll talk with me about the impact of that investment.

This is the Artificial Intelligence Podcast. If you enjoy it, subscribe on YouTube, give it five stars on iTunes, support it on Patreon, or simply connect with me on Twitter at Lex Fridman, spelled F R I D M A N. And I'd like to give a special thank you to Tom and Nelante Bighousen for their support of the podcast on Patreon. Thanks, Tom and Nelante. Hope I didn't mess up your last name too bad. Your support means a lot and inspires me to keep this series going. And now, here's my conversation with Kevin Scott.
Lex Fridman: You've described yourself as a kid in a candy store at Microsoft because of all the interesting projects that are going on. Can you try to do the impossible task and give a brief whirlwind view of all the spaces that Microsoft is working in, both research and product?
Kevin Scott: If you include research, it becomes even more difficult. I think, broadly speaking, Microsoft's product portfolio includes everything from a big cloud business and a big set of SaaS services. We have some of what are among the original productivity software products that everybody uses. We have an operating system business. We have a hardware business where we make everything from computer mice and headphones to high-end personal computers and laptops. We have a fairly broad-ranging research group, where we have people doing everything from economics research. So there's this really, really smart young economist, Glen Weyl, who my group works with a lot, who's doing this research on these things called radical markets. He's written an entire technical book about this whole notion of radical markets. So the research group spans from that to human-computer interaction to artificial intelligence. And we have GitHub, we have LinkedIn, we have a search, advertising, and news business, and probably a bunch of stuff that I'm embarrassingly not recounting in this list.
Lex Fridman: Gaming to Xbox and so on, right?
Kevin Scott: Yeah, gaming for sure. I was having a super fun conversation this morning with Phil Spencer. When I was in college, there was this game that LucasArts made called Day of the Tentacle that my friends and I played forever. And we're doing some interesting collaboration now with the folks who made Day of the Tentacle. This morning I was completely nerding out with Tim Schafer, the guy who wrote Day of the Tentacle, just a complete fanboy, which happens a lot. Microsoft has been doing so much stuff at such breadth for such a long period of time that, being CTO, most of the time my job is very, very serious, and sometimes I get caught up in how amazing it is to be able to have the conversations that I have with the people I get to have them with.
Lex Fridman: Yeah, to reach back into the sentimental. And what's the radical markets research in economics?
Kevin Scott: So the idea with radical markets is: can you come up with new market-based mechanisms to... you know, we're having this debate right now: does capitalism work, do free markets work? Can the incentive structures that are built into these systems produce outcomes that are creating equitably distributed benefits for every member of society? I think it's a reasonable set of questions to be asking. One mode of thought there, if you have doubts that the markets are actually working, is to tip towards, okay, let's become more socialist, with central planning, and governments or some other central organization making a bunch of decisions about how work gets done and where the investments and the outputs of those investments get distributed. Glen's notion is: lean more into the market-based mechanisms.

So, for instance, and this is one of the more radical ideas, suppose that you had a radical pricing mechanism for assets like real estate, where you could be bid out of your position in your home. If somebody came along and said, I can find higher economic utility for this piece of real estate that you're running your business in, then you either have to bid to stay, or the thing that's got the higher economic utility takes over the asset. That would make it very difficult to have the same sort of rent-seeking behaviors that you've got right now, because if you did speculative bidding, you would very quickly lose a whole lot of money. And so the prices of the assets would be very closely indexed to the value that they could produce. And because you'd have this real-time mechanism that would force you to mark the value of the asset to the market, it could be taxed appropriately. You couldn't sit on this thing and say, oh, this house is only worth 10,000 bucks when everything around it is worth 10 million.
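What Scott is describing is essentially a self-assessed (sometimes called "Harberger") pricing scheme. A minimal sketch of the incentive, with invented numbers and a flat illustrative tax rate, might look like this:

```python
# Sketch of a self-assessed ("Harberger") pricing mechanism.
# An owner declares a price for an asset; they pay tax on the declared
# price, and anyone may force a sale at that price. Declaring too low
# invites a takeover; declaring too high inflates the tax bill.

TAX_RATE = 0.07  # illustrative annual tax on the self-assessed price

def annual_cost(declared_price: float) -> float:
    """Tax the owner pays per year on their own declared valuation."""
    return TAX_RATE * declared_price

def attempt_purchase(declared_price: float, buyer_valuation: float) -> bool:
    """A buyer takes the asset iff it is worth more to them than the
    owner's declared price -- the 'bid out of your position' case."""
    return buyer_valuation > declared_price

if __name__ == "__main__":
    for declared in (100_000, 1_000_000, 5_000_000):
        tax = annual_cost(declared)
        lost = attempt_purchase(declared, buyer_valuation=1_200_000)
        print(f"declared ${declared:>9,}: tax ${tax:>9,.0f}/yr, "
              f"bought out by a $1.2M bidder: {lost}")
    # Underdeclaring (e.g. $100k) minimizes tax but loses the asset;
    # overdeclaring keeps it but the tax burden grows, so the declared
    # price is pushed toward the owner's true valuation.
```

This is why the declared prices end up "closely indexed to the value they could produce": the tax punishes overstatement and the forced sale punishes understatement.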
Lex Fridman: That's really... so it's an incentive structure where the prices match the value much better.
Kevin Scott: Yeah, and Glen does a much better job than I do at selling it, and I probably picked the world's worst example, and it's intentionally provocative. I'm not sure whether I like this notion that we could have a set of market mechanisms where I could get bid out of my property. But if you're thinking about something like Elizabeth Warren's wealth tax, for instance, it would be really interesting to see how you would actually set the price on the assets, and you might have to have a mechanism like that if you put a tax like that in place.
Lex Fridman: It's really interesting that that kind of research, at least tangentially, is touching Microsoft Research, that you're really thinking broadly. Maybe you can speak to this, since it connects to AI: we have a candidate, Andrew Yang, who talks about artificial intelligence and the concern that people have about automation's impact on society. And arguably, Microsoft is at the cutting edge of innovation in all these kinds of ways, so it's pushing AI forward. How do you think about combining all our conversations together here, the radical markets and socialism, and the innovation in AI that Microsoft is doing, and then Andrew Yang's worry that it will result in job loss at the lower end and so on? How do you think about that?
Kevin Scott: I think it's one of the most important questions in technology, maybe even in society, right now: how is AI going to develop over the course of the next several decades, what's it going to be used for, what benefits will it produce, what negative impacts will it produce, and who gets to steer this whole thing?

I'll say, at the highest level, one of the real joys of getting to do what I do at Microsoft is that Microsoft has this heritage as a platform company. Bill has this thing that he said a bunch of years ago: the measure of a successful platform is that it produces far more economic value for the people who build on top of the platform than is created for the platform owner or builder. And I think we have to think about AI that way.
Lex Fridman: As a platform.
Kevin Scott: Yeah, it has to be a platform that other people can use to build businesses, to fulfill their creative objectives, to be entrepreneurs, to solve problems that they have in their work and in their lives. It can't be a thing where a handful of companies, sitting in a very small handful of cities geographically, are making all the decisions about what goes into the AI, and then, on top of all this infrastructure, build all of the commercially valuable uses for it. I think that's bad from an economics and equitable-distribution-of-value perspective, sort of back to this whole notion of: do the markets work? But I think it's also bad from an innovation perspective, because I have infinite amounts of faith in human beings that if you give folks powerful tools, they will go do interesting things. And it's more than just a few tens of thousands of people with the interesting tools; it should be millions of people with the tools.

Think about the steam engine in the late 18th century. It was maybe the first large-scale substitute for human labor that we'd built as a machine. In the beginning, when these things were getting deployed, the folks who got most of the value from the steam engines were the folks who had capital, so they could afford to build them, and they built factories around them and businesses, and the experts who knew how to build and maintain them. But access to that technology democratized over time. Now an engine isn't a differentiated thing. There isn't one engine company that builds all the engines, where all of the things that use engines are made by that company and it gets all the economics from all of that. It's fully democratized. We're sitting here in this room, and there are probably things like MEMS gyroscopes in both of our phones; there are little engines sort of everywhere. They're just a component in how we build the modern world. AI needs to get there.
Lex Fridman: Yeah, that's a really powerful way to think. If we think of AI as a platform, versus a tool that Microsoft owns, as a platform that enables creation on top of it, that's the way to democratize it. That's really interesting, actually. And Microsoft throughout its history has been positioned well to do that.
Kevin Scott: And here's the tie back to this radical markets thing: my team has been working with Glen on this, and Jaron Lanier, actually. Jaron is sort of the father of virtual reality. He's one of the most interesting human beings on the planet, a sweet, sweet guy. And so Jaron and Glen and folks in my team have been working on this notion of data as labor, or, as they also call it, data dignity. The idea is that, again going back to the industrial analogy, if you think about data as the raw material that is consumed by the machine of AI in order to do useful things, then we're not doing a really great job right now of having transparent marketplaces for valuing those data contributions.

We all make them explicitly. You go to LinkedIn, you set up your profile on LinkedIn; that's an explicit contribution. You know exactly the information that you're putting into the system, and you put it there because you have some nominal notion of what value you're going to get in return. But it's only nominal; you don't know exactly what value you're getting in return. The service is free, so there's a low amount of perceived debt. And then you've got all this indirect contribution that you're making just by virtue of interacting with all of the technology that's in your daily life.

So what Glen and Jaron and this data dignity team are trying to do is: can we figure out a set of mechanisms that let us value those data contributions, so that you could create an economy, and a set of controls and incentives, that would allow people, maybe even in the limit, to earn part of their living through the data that they're creating? And you can see it in explicit ways. There are these companies like Scale AI, and there are a whole bunch of them in China right now, that are basically data-labeling companies. If you're doing supervised machine learning, you need lots and lots of labeled training data, and the people who work for those companies are getting compensated for their data contributions into the system. And so...
Lex Fridman: That's easier to put a number on, their contribution, because they're explicitly labeling data.
Kevin Scott: Correct.
Lex Fridman: But you're saying that we're all contributing data in different kinds of ways, and it's fascinating to start to explicitly try to put a number on it. Do you think that's possible?
Kevin Scott: I don't know. It's hard. It really is, because we don't have as much transparency as I think we need into how the data is getting used, and it's super complicated. I think we as technologists appreciate some of the subtlety there. The data exhaust that you give off, or the explicit data that I am putting into the system, isn't super valuable atomically. It's only valuable when you aggregate it together in large numbers. This is true even for the folks who are getting compensated for labeling things: for supervised machine learning, you need lots of labels to train a model that performs well. And so I think that's one of the challenges: because this data is getting combined in so many ways, how do you figure out, through these combinations, how the value is flowing?
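One standard tool for reasoning about how value flows through combinations of contributions is Shapley-value attribution: average each contributor's marginal value over all the orders in which they could have joined. A toy sketch, assuming a made-up value function that is deliberately worthless below a critical mass, to mirror the "not valuable atomically" point:

```python
from itertools import permutations

# Toy Shapley-value attribution for data contributions.
# The value function is superadditive: individual records are nearly
# worthless, and value appears only in aggregate.

def coalition_value(members: frozenset) -> float:
    """Hypothetical value of a model trained on these contributors' data."""
    n = len(members)
    return 0.0 if n < 2 else 10.0 * (n - 1)  # value needs critical mass

def shapley(players: list) -> dict:
    """Average marginal contribution of each player over all join orders."""
    totals = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        seen = frozenset()
        for p in order:
            totals[p] += coalition_value(seen | {p}) - coalition_value(seen)
            seen = seen | {p}
    return {p: t / len(orders) for p, t in totals.items()}

if __name__ == "__main__":
    print(shapley(["alice", "bob", "carol"]))
    # Each contributor gets an equal share here because the toy value
    # function is symmetric; estimating a realistic value function (and
    # getting the transparency to do so) is exactly the hard part.
```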
Lex Fridman: Yeah, that's fascinating. And it's fascinating that you're thinking about this. I wasn't even going into this conversation expecting the breadth of research, really, that Microsoft broadly, that you, are thinking about at Microsoft. So if we go back to '89, when Microsoft released Office, or 1990, when they released Windows 3.0: in your view, and I know you weren't there through its history, how has the company changed in the 30 years since, as you look at it now?
Kevin Scott: The good thing is, it started off as a platform company, and it's still a platform company. The parts of the business that are thriving and most successful are those that are building platforms. The mission of the company now has changed, and it's changed in a very interesting way. Back in '89, '90, they were still on the original mission, which was to put a PC on every desk and in every home. It was basically about democratizing access to this new personal computing technology. When Bill started the company, integrated circuit microprocessors were a brand-new thing, and people were building homebrew computers from kits, the way people build ham radios right now.

And I think this is the interesting thing for folks who build platforms in general: Bill saw the opportunity there, and what personal computers could do, and it was sort of a reach. You just have to imagine where things were when they started the company versus where things are now. In success, when you've democratized a platform, it just sort of vanishes into the platform. You don't pay attention to it anymore. Operating systems aren't a thing anymore. They're super important, completely critical, and when you see one fail, you understand. But you're not waiting for the next operating system in the same way that you were in 1995. In 1995, we had the Rolling Stones on the stage with the Windows 95 rollout. It was the biggest thing in the world. Everybody lined up for it the way that people used to line up for the iPhone. But eventually, and this isn't necessarily a bad thing, the success is that it becomes ubiquitous. It's everywhere, and human beings, when their technology becomes ubiquitous, just start taking it for granted.

So the mission now, which Satya rearticulated five-plus years ago when he took over as CEO of the company, is to empower every individual and every organization in the world to be more successful. And again, that's a platform mission. The way that we do it now is different. We have a hyperscale cloud that people are building their applications on top of. We have a bunch of AI infrastructure that people are building their AI applications on top of. We have a productivity suite of software, like Microsoft Dynamics, which some people might not think is the sexiest thing in the world, but it's helping people figure out how to automate all of their business processes and workflows, and helping the businesses using it to grow and be more.

So it's a much broader vision, in a way, now than it was back then. It was a very particular thing back then, and now we live in this world where technology is so powerful, and it's such a basic fact of life that it both exists and is going to get better and better over time, or at least more and more powerful over time. So what you have to do as a platform player is just much bigger.
Lex Fridman: Right, there are so many directions in which you can transform. You didn't mention mixed reality, too. That's probably early days, or it depends how you think of it, but if we think on a scale of centuries, it's the early days of mixed reality.
Kevin Scott: Oh, for sure.
Lex Fridman: And so with HoloLens, Microsoft is doing some really interesting work there. Do you touch that part of the effort? What's the thinking? Do you think of mixed reality as a platform, too?
Kevin Scott: Oh, sure. When we look at what the platforms of the future could be, it's fairly obvious that AI is one. You sort of say it to someone and they get it. But we also think of mixed reality and quantum as these two interesting, potentially...
Lex Fridman: Quantum computing?
Kevin Scott: Yeah.
Lex Fridman: Okay, so let's get crazy then. So you're talking about some futuristic things here. Well, the mixed reality, Microsoft is really... it's not even futuristic, it's here.
Kevin Scott: It is.
Lex Fridman: It's incredible stuff.
Kevin Scott: And look, it's having an impact right now. One of the more interesting things that's happened with mixed reality over the past couple of years, which I didn't clearly see coming, is that it's become the computing device for doing the work of folks who haven't used any computing device at all to do their work before: technicians, service folks, people who are doing machine maintenance on factory floors. Because they're mobile, and they're out in the world working with their hands, servicing these very complicated things, they don't use their mobile phone, they don't carry a laptop with them, and they're not tethered to a desk. And so mixed reality, where it's getting traction right now, where HoloLens is selling a lot of units, is for these sorts of applications for these workers. And people love it. For them, it's the same sort of productivity boost that an office worker had when they got their first personal computer.
Lex Fridman: Yeah. You did mention AI is certainly obvious as a platform, but can we dig into it a little bit? How does AI begin to infuse some of the products in Microsoft? So, currently, providing training of, for example, neural networks in the cloud, or providing pre-trained models, or just even providing computing resources and whatever different inference you want to do using neural networks: how do you think of AI infusing, as a platform, what Microsoft can provide?
Kevin Scott: Yeah, I think it's super interesting. It's everywhere. We run these review meetings now where it's me and Satya and members of Satya's leadership team, and a cross-functional group of folks across the entire company who are working on either AI infrastructure or who have some substantial part of their product work using AI in some significant way.

The important thing to understand is, when you think about how the AI is going to manifest in an experience, in something that it's going to make better, I think you don't want the AI-ness to be the first-order thing. It's whatever the product is, and the thing that it's trying to help you do; the AI just sort of makes it better. This is a gross exaggeration, but people get super excited about where the AI is showing up in products, and I'm like, do you get that excited about where you're using a hash table in your code? It's just another...
Lex Fridman: It's just a tool.
Kevin Scott: It's a very interesting programming tool, but it's an engineering tool. And so it shows up everywhere. We've got dozens and dozens of features now in Office that are powered by fairly sophisticated machine learning. Our search engine wouldn't work at all if you took the machine learning out of it. And, increasingly, things like content moderation on our Xbox and xCloud platform.
Lex Fridman: When you say moderation, do you mean like the recommender, showing what you want to look at next?
Kevin Scott: No, no, no. It's like anti-bullying stuff.
Lex Fridman: So the usual social network stuff that you have to deal with.
Kevin Scott: Yeah, correct. But it's really targeted towards a gaming audience. So it's a very particular type of thing, where the line between playful banter and legitimate bullying is a subtle one, and you have to... it's sort of tough. Like I have...
Lex Fridman: I'd love it if we could dig into it, because you also led the engineering efforts of LinkedIn, and if we look at LinkedIn as a social network, and at Xbox gaming as a social component, there are very different kinds of communication, I imagine, going on on the two platforms, and the line in terms of bullying and so on is different on the two platforms. So how do you... I mean, it's such a fascinating philosophical discussion of where that line is. I don't think anyone knows the right answer. Twitter folks are under fire now, Jack at Twitter, for trying to find that line. Nobody knows what that line is. But how do you try to find the line for preventing abusive behavior while at the same time letting people be playful and joke around and that kind of thing?
Kevin Scott: I think in a certain way, if you have what I would call vertical social networks, it gets to be a little bit easier. If you have a clear notion of what your social network should be used for, or what you are designing a community around, then you don't have as many dimensions to your content safety problem as you do in a general-purpose platform. On LinkedIn, the whole social network is about connecting people with opportunity, whether it's helping them find a job, or find mentors, or find their next sales lead, or just allowing them to broadcast their professional identity to their network of peers and collaborators and professional community. In some ways that's very, very broad, but in other ways it's narrow. And so you can build machine learning systems that are capable, within those boundaries, of making better automated decisions about what is an inappropriate and offensive comment, or a dangerous comment, or illegal content, when you have some constraints.
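A minimal sketch of what such a domain-constrained classifier could look like, assuming scikit-learn and a tiny invented label set; this is purely illustrative, not Microsoft's actual moderation pipeline:

```python
# Minimal sketch of a domain-constrained content classifier. The training
# examples below are invented placeholders, not real moderation data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "gg, nice clutch play",                                      # banter
    "uninstall the game, you're trash and everyone hates you",   # abuse
    "anyone want to queue for ranked tonight?",                  # banter
    "keep playing like that and I'll find where you live",       # abuse
]
train_labels = [0, 1, 0, 1]  # 0 = acceptable, 1 = abusive

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict_proba(["you got carried so hard lol"])[:, 1])
# A vertical network narrows the label space: the model only needs to
# separate gaming banter from abuse, not every possible kind of content.
```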
Same thing with the gaming social network. It's about playing games and having fun, and the thing that you don't want to have happen on the platform is why bullying is such an important thing: bullying is not fun. So you want to do everything in your power to encourage that not to happen.

But I think it's a tough problem in general, and it's one where I think eventually we're going to have to have some sort of clarification from our policymakers about what it is that we should be doing, where the lines are, because it's tough. In a democracy, you want some sort of democratic involvement. People should have a say in where the lines are drawn. You don't want a bunch of people making unilateral decisions. And we are in a state right now, for some of these platforms, where you actually do have to make unilateral decisions, where the policymaking isn't going to happen fast enough to prevent very bad things from happening. But we need the policymaking side of that to catch up, I think, as quickly as possible, because you want that whole process to be a democratic thing, not some sort of weird thing where you've got a non-representative group of people making decisions that have national and global impact.
Lex Fridman: And it's fascinating, because the digital space is different than the physical space in which nations and governments were established. And so what policy looks like globally, what bullying looks like globally, what healthy communication looks like globally, is an open question, and we're all figuring it out together, which is fascinating.
Kevin Scott: Yeah, I mean, with sort of fake news, for instance, and...
Lex Fridman: Deepfakes, and fake news generated by humans?
Kevin Scott: Yeah, we can talk about deepfakes; I think that is another very interesting level of complexity. But if you think about just the written word: we invented papyrus, what, 3,000 years ago, where you could put word on paper. Then 500 years ago we get the printing press, where the word gets a little bit more ubiquitous. And you really didn't get ubiquitous printed word until the end of the 19th century, when the offset press was invented, and then it just explodes. The cross product of that and the Industrial Revolution's need for educated citizens resulted in this rapid expansion of literacy and the rapid expansion of the word. But we had 3,000 years up to that point to figure out journalism, editorial integrity, scientific peer review. You built all of this mechanism to try to filter through all of the noise that the technology made possible, to get to something that society could cope with.

And if you think about it, the PC didn't exist 50 years ago. So in this span of half a century, we've gone from no ubiquitous digital technology to having a device that sits in your pocket where you can say whatever is on your mind to... Mary Meeker just released her new slide deck last week: we've got 50% penetration of the internet to the global population. There are about three and a half billion people who are connected now. It's crazy, inconceivable, how fast all of this happened. So it's not surprising that we haven't figured out what to do yet, but we've got to really lean into this set of problems, because we basically have three millennia worth of work to do about how to deal with all of this, in what probably amounts to the next decade worth of time.
Lex Fridman: So, since we're on the topic of tough, challenging problems: another thing on the tooling side of AI that Microsoft is looking at is face recognition software. There are a lot of powerful, positive use cases for face recognition, but there are some negative ones, and we're seeing those in different governments in the world. So how does Microsoft think about the use of face recognition software as a platform, in governments and companies? How do we strike an ethical balance here?
Kevin Scott: Yeah, I think we've articulated a clear point of view. Brad Smith wrote a blog post last fall, I believe, that outlined very specifically what our point of view is there. We believe that there are certain uses to which face recognition should not be put, and we believe, again, that there's a need for regulation there. The government should really come in and say, this is where the lines are. And we very much want figuring out where the lines are to be a democratic process. But in the short term, we've drawn some lines where we push back against uses of face recognition technology. The city of San Francisco, for instance, I think has completely outlawed any government agency from using face recognition tech, and that may prove to be a little bit overly broad. But for certain law enforcement things, I would personally rather be overly cautious in terms of restricting use of it, until we have defined a reasonable, democratically determined regulatory framework for where we could and should use it.

And the other thing there is, we've got a bunch of research that we're doing, and a bunch of progress that we've made, on bias. There are all sorts of weird biases that these models can have, all the way from the most noteworthy one, where you may have minorities who are underrepresented in the training data and then you start learning strange things, but there are even other weird things. I think we've seen in the public research that models can learn strange things, like "all doctors are men," for instance. And so it really is a thing where it's very important for everybody who is working on these things, before they push publish, launch the experiment, push the code online, or even publish the paper, that they are at least starting to think about what some of the potential negative consequences of this stuff are.

This is where I find the deepfake stuff very worrisome, just because there are going to be some very good, beneficial uses of GAN-generated imagery. Funny enough, one of the places where it's actually useful is that we're using the technology right now to generate synthetic visual data for training some of the face recognition models, to get rid of the bias. So that's one super good use of the tech, but it's getting good enough now that it's going to challenge a normal human being's ability to tell what's real. Right now you can say it's very expensive for someone to fabricate a photorealistic fake video, and GANs are going to make it fantastically cheap to fabricate a photorealistic fake video. So what you assume you can trust as true, versus be skeptical about, is about to change. And we're not ready for it, I don't think.
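A toy sketch of that debiasing idea: pad underrepresented groups with synthetic samples until the training set is balanced. The generator function here is a hypothetical stand-in for a conditional GAN, and the group labels are invented:

```python
from collections import Counter

def rebalance_with_synthetic(dataset, generate_synthetic_face):
    """Pad every demographic group up to the size of the largest one.

    `dataset` is a list of (image, group) pairs; `generate_synthetic_face`
    stands in for a GAN conditioned on a group label -- hypothetical,
    not Microsoft's actual pipeline.
    """
    counts = Counter(group for _, group in dataset)
    target = max(counts.values())
    augmented = list(dataset)
    for group, n in counts.items():
        augmented += [(generate_synthetic_face(group), group)
                      for _ in range(target - n)]
    return augmented

# Example with a fake generator: groups end up equally represented.
data = [("img", "a")] * 90 + [("img", "b")] * 10
balanced = rebalance_with_synthetic(data, lambda g: f"synthetic-{g}")
print(Counter(group for _, group in balanced))  # Counter({'a': 90, 'b': 90})
```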
Lex Fridman: The nature of truth, right. It's also exciting, because I think both you and I would probably agree that the way to take on that challenge is with technology. There are probably going to be ideas for ways to verify which kind of video is legitimate and which kind is not. So to me that's an exciting possibility, most likely just for the comedic genius that the internet usually creates with these kinds of videos, and hopefully it will not result in any serious harm.
Kevin Scott: Yeah. And it could be... I think we will have technology that may be able to detect whether or not something's fake or real, although the fakes are pretty convincing, even when you subject them to machine scrutiny. But we also have these increasingly interesting social networks that are under fire right now for some of the bad things that they do. One of the things you could choose to do with a social network is use crypto and the networks to have content signed, where you could have a full chain of custody that accompanied every piece of content. So when you're viewing something and you want to ask yourself, how much can I trust this?, you can click something and have a verified chain of custody that shows, oh, this is coming from this source, and it's signed by someone whose identity I trust.
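A minimal sketch of the signing half of that idea, using Ed25519 from the widely used `cryptography` package; binding the key to a real-world identity (the rest of the chain of custody) is elided here:

```python
# Sketch of signed content with a verifiable origin. A real system would
# also bind the public key to an identity (certificates, a ledger, etc.).
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)
from cryptography.exceptions import InvalidSignature

publisher_key = Ed25519PrivateKey.generate()   # held by the content source
public_key = publisher_key.public_key()        # distributed to viewers

video_bytes = b"...raw media bytes..."
signature = publisher_key.sign(video_bytes)

try:
    public_key.verify(signature, video_bytes)          # untampered: passes
    public_key.verify(signature, video_bytes + b"x")   # tampered: raises
except InvalidSignature:
    print("content does not match the signed original")
```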
Yeah, I think having that chain of custody, being able to say, here's this video, it may or may not have been produced using some of this deepfake technology, but if you've got a verified chain of custody where you can trace it all the way back to an identity, you can decide whether or not you trust that identity. Oh, this is really from the White House, or this is really from the office of this particular presidential candidate, or it's really from Jeff Weiner, CEO of LinkedIn, or Satya Nadella, CEO of Microsoft. That might be one way that you can solve some of the problems. And that's not super high tech; we've had all of this technology forever. But I think you're right, it has to be some sort of technological thing, because the underlying tech that is used to create this is not going to do anything but get better over time, and the genie is out of the bottle. There's no stuffing it back in.
Lex Fridman: And there's a social component, which I think is really healthy for a democracy, where people will be skeptical about the things they watch in general. Which is good; skepticism in general is good for content. So deepfakes, in that sense, are creating a global skepticism about whether people can trust what they read. It encourages further research. I come from the Soviet Union, where basically nobody trusted the media, because you knew it was propaganda, and that kind of skepticism encouraged further research about ideas, as opposed to just trusting any one source.
Kevin Scott: Well, look, I think it's one of the reasons why the scientific method and our apparatus of modern science are so good. Because you don't have to trust anything. The whole notion of modern science, beyond the fact that this is a hypothesis, and this is an experiment to test the hypothesis, and this is a peer review process for scrutinizing published results, is that stuff's also supposed to be reproducible. So you know it's been vetted by this process, but you also are expected to publish enough detail where, if you are sufficiently skeptical of the thing, you can go try to reproduce it yourself. And I don't know what it is; I think a lot of engineers are like this, where your brain is sort of wired for skepticism. You don't just first-order trust everything that you see and encounter, and you're curious to understand the next thing. But I think it's an entirely healthy thing, and we need a little bit more of that right now.
Lex Fridman: So, I'm not a large business owner; I'm just a huge fan of many of Microsoft's products. I mean, actually, I generate a lot of graphics and images, and I still use PowerPoint to do that. It beats Illustrator for me, even professionally, which is fascinating. So I wonder, what does the future of, let's say, Windows and Office look like? I remember looking forward to XP. Was it exciting when XP was released? Just like you said, I don't remember when 95 was released, but XP for me was a big celebration, and when 10 came out, I was like, oh, okay, it's a nice improvement. So what do you see as the future of these products?
I think there's a bunch of excite.
link |
00:43:04.700
I mean, on the Office front,
link |
00:43:07.260
there's gonna be this like increasing productivity wins
link |
00:43:13.900
that are coming out of some of these AI powered features
link |
00:43:17.200
that are coming.
link |
00:43:18.040
Like the products will sort of get smarter and smarter
link |
00:43:20.000
in like a very subtle way.
link |
00:43:21.240
Like there's not gonna be this big bang moment
link |
00:43:24.260
where like Clippy is gonna reemerge and it's gonna be.
Lex Fridman: Wait a minute. Okay, we'll have to wait. Is Clippy coming back? But quite seriously, on this injection of AI: there's not much, or at least I'm not familiar with, assistive-type stuff going on inside the Office products, like a Clippy-style personal assistant. Do you think there's a possibility of that in the future?
Kevin Scott: So I think there are a bunch of very small ways in which machine learning-powered assistive things are in the product right now, and there are a bunch of interesting things coming. The auto-response stuff's getting better and better, and it's getting to the point where it can auto-respond with, okay, this person's clearly trying to schedule a meeting, so it looks at your calendar and automatically tries to find a time and a space that's mutually interesting.
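The scheduling piece reduces to intersecting the free time on two calendars. A toy version, with hours as integers and invented busy blocks, might look like this (purely illustrative, not any product's actual logic):

```python
WORK_START, WORK_END = 9, 18  # working-hours window, in hours

def free_slots(busy: list[tuple[int, int]]) -> list[tuple[int, int]]:
    """Complement of the busy intervals inside working hours."""
    slots, cursor = [], WORK_START
    for start, end in sorted(busy):
        if start > cursor:
            slots.append((cursor, start))
        cursor = max(cursor, end)
    if cursor < WORK_END:
        slots.append((cursor, WORK_END))
    return slots

def first_mutual_slot(a_busy, b_busy, length=1):
    """Earliest overlap of both calendars' free time of at least `length`."""
    for a0, a1 in free_slots(a_busy):
        for b0, b1 in free_slots(b_busy):
            start, end = max(a0, b0), min(a1, b1)
            if end - start >= length:
                return (start, start + length)
    return None

print(first_mutual_slot([(9, 11), (13, 15)], [(10, 12), (16, 17)]))
# -> (12, 13): the first hour both calendars have open.
```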
We have this notion of Microsoft Search, where it's not just web search, but search across all of your information that's sitting inside of your Office 365 tenant, and potentially in other products. And we have this thing called the Microsoft Graph, which is basically an API federator that gets you hooked up across the entire breadth of all of what were information silos before they got woven together with the Graph. That is getting plumbed, with increasing effectiveness, into some of these auto-response things, where you're going to be able to see the system automatically retrieve information for you. I frequently send out emails to folks when I can't find a paper or a document or whatnot; there's no reason why the system won't be able to do that for you.
link |
00:45:27.540
And like, I think the, it's building towards
link |
00:45:31.980
like having things that look more like,
link |
00:45:34.480
like a fully integrated, you know, assistant,
link |
00:45:37.900
but like you'll have a bunch of steps
link |
00:45:40.740
that you will see before then,
link |
00:45:42.820
like it will not be this like big bang thing
link |
00:45:45.140
where like Clippy comes back and you've got this like,
link |
00:45:47.420
you know, manifestation of, you know,
link |
00:45:49.400
like a fully, fully powered assistant.
link |
00:45:53.380
So I think that's, that's definitely coming. And
link |
00:45:56.940
like all of the, you know, collaboration,
link |
00:45:58.700
coauthoring stuff's getting better.
link |
00:46:00.740
You know, it's like really interesting.
link |
00:46:02.220
Like if you look at how we use
link |
00:46:06.500
the Office product portfolio at Microsoft,
link |
00:46:09.020
like more and more of it is happening inside of
link |
00:46:12.140
like Teams as a canvas.
link |
00:46:14.500
And like, it's this thing where, you know,
link |
00:46:17.180
you've got collaboration is like at the center
link |
00:46:20.620
of the product and like we built some like really cool stuff
link |
00:46:25.620
some of which is about to be open sourced,
link |
00:46:28.420
that are sort of framework level things
link |
00:46:30.980
for doing coauthoring.
link |
00:46:34.540
That's awesome.
link |
00:46:35.380
So, is there a cloud component to that?
link |
00:46:37.860
So on the web, or is it,
link |
00:46:40.300
and forgive me if I don't already know this,
link |
00:46:42.660
but with Office 365, we still,
link |
00:46:45.580
the collaboration we do if we're doing Word,
link |
00:46:47.540
we still send the file around.
link |
00:46:49.660
No, no.
link |
00:46:50.500
So this is...
link |
00:46:51.340
We're already a little bit better than that.
link |
00:46:54.300
A little bit better than that and like, you know,
link |
00:46:55.900
so like the fact that you're unaware of it means
link |
00:46:57.700
we've got to do a better job,
link |
00:46:59.180
like helping you discover this stuff.
link |
00:47:02.900
But yeah, I mean, it's already like got a huge,
link |
00:47:06.380
huge cloud component.
link |
00:47:07.220
And like part of, you know, part of this framework stuff,
link |
00:47:09.700
I think we're calling it, like,
link |
00:47:12.660
like we've been working on it for a couple of years.
link |
00:47:14.540
So like, I know the internal code name for it,
link |
00:47:17.220
but I think when we launched it at Build,
link |
00:47:18.660
it's called the Fluid Framework.
link |
00:47:20.760
And, but like what Fluid lets you do is like,
link |
00:47:25.060
you can go into a conversation that you're having in Teams
link |
00:47:27.900
and like reference like part of a spreadsheet
link |
00:47:30.240
that you're working on where somebody's like sitting
link |
00:47:33.900
in the Excel canvas,
link |
00:47:35.580
like working on the spreadsheet with a, you know,
link |
00:47:37.740
chart or whatnot,
link |
00:47:38.860
and like you can sort of embed like part of the spreadsheet
link |
00:47:41.940
in the Teams conversation where like you can dynamically
link |
00:47:45.400
update it and like all of the changes that you're making
link |
00:47:48.740
to the, to this object are like, you know,
link |
00:47:51.220
coordinated, and everything is sort of updating in real time.
link |
00:47:54.620
So like you can be in whatever canvas is most convenient
link |
00:47:57.940
for you to get your work done.
link |
00:48:00.380
So I, out of my own sort of curiosity as an engineer,
link |
00:48:03.380
I know what it's like to sort of lead a team
link |
00:48:06.220
of 10, 15 engineers.
link |
00:48:08.220
Microsoft has, I don't know what the numbers are,
link |
00:48:11.660
maybe 50,000, maybe 60,000 engineers, maybe 40,000.
link |
00:48:14.940
I don't know exactly what the number is, it's a lot.
link |
00:48:17.220
It's tens of thousands.
link |
00:48:18.900
Right, so it's more than 10 or 15.
link |
00:48:20.700
What, I mean, you've led different sizes,
link |
00:48:28.700
mostly large sized groups of engineers.
link |
00:48:30.540
What does it take to lead such a large group
link |
00:48:33.820
to continued innovation,
link |
00:48:37.500
to continue being highly productive
link |
00:48:40.260
and yet develop all kinds of new ideas and yet maintain,
link |
00:48:44.220
like what does it take to lead such a large group
link |
00:48:47.100
of brilliant people?
link |
00:48:48.980
I think the thing that you learn
link |
00:48:52.060
as you manage larger and larger scale
link |
00:48:55.140
is that there are three things
link |
00:48:57.940
that are like very, very important
link |
00:49:00.500
for big engineering teams.
link |
00:49:02.340
Like one is like having some sort of forethought
link |
00:49:06.300
about what it is that you're gonna be building
link |
00:49:09.860
over large periods of time.
link |
00:49:11.060
Like not exactly, like you don't need to know
link |
00:49:13.100
that like, you know, I'm putting all my chips
link |
00:49:15.300
on this one product and like this is gonna be the thing,
link |
00:49:17.820
but like it's useful to know like what sort of capabilities
link |
00:49:21.460
you think you're going to need to have
link |
00:49:23.140
to build the products of the future.
link |
00:49:24.740
And then like invest in that infrastructure,
link |
00:49:28.060
like whether, and like I'm not just talking
link |
00:49:30.180
about storage systems or cloud APIs,
link |
00:49:32.740
it's also like what does your development process look like?
link |
00:49:35.380
What tools do you want?
link |
00:49:36.780
Like what culture do you want to build around?
link |
00:49:40.020
Like how you're, you know, sort of collaborating together
link |
00:49:42.780
to like make complicated technical things.
link |
00:49:45.780
And so like having an opinion and investing in that
link |
00:49:48.100
is like, it just gets more and more important.
link |
00:49:50.500
And like the sooner you can get a concrete set of opinions,
link |
00:49:54.540
like the better you're going to be.
link |
00:49:57.700
Like you can wing it for a while at small scales,
link |
00:50:01.620
like, you know, when you start a company,
link |
00:50:03.180
like you don't have to be like super specific about it,
link |
00:50:06.340
but like the biggest miseries that I've ever seen
link |
00:50:09.980
as an engineering leader are in places
link |
00:50:12.660
where you didn't have a clear enough opinion
link |
00:50:14.500
about those things soon enough.
link |
00:50:16.780
And then you just sort of go create a bunch
link |
00:50:18.740
of technical debt and like culture debt
link |
00:50:21.940
that is excruciatingly painful to clean up.
link |
00:50:25.820
So like, that's one bundle of things.
link |
00:50:28.700
Like the other, you know, another bundle of things
link |
00:50:33.260
is like, it's just really, really important
link |
00:50:36.620
to like have a clear mission
link |
00:50:41.620
that's not just some cute crap you say
link |
00:50:46.260
because like you think you should have a mission,
link |
00:50:48.940
but like something that clarifies for people
link |
00:50:52.940
like where it is that you're headed together.
link |
00:50:57.220
Like, I know it's like probably like a little bit
link |
00:50:59.180
too popular right now,
link |
00:51:00.380
but Yuval Harari's book, Sapiens,
link |
00:51:05.380
one of the central ideas in his book is that
link |
00:51:10.380
like storytelling is like the quintessential thing
link |
00:51:15.380
for coordinating the activities of large groups of people.
link |
00:51:18.780
Like once you get past Dunbar's number,
link |
00:51:21.380
and like I've really, really seen that
link |
00:51:23.980
just managing engineering teams.
link |
00:51:25.580
Like you can just brute force things
link |
00:51:30.580
when you're less than 120, 150 folks
link |
00:51:33.500
where you can sort of know and trust
link |
00:51:35.980
and understand what the dynamics are
link |
00:51:38.380
between all the people, but like past that,
link |
00:51:40.380
like things just sort of start to catastrophically fail
link |
00:51:43.780
if you don't have some sort of set of shared goals
link |
00:51:47.180
that you're marching towards.
link |
00:51:48.980
And so like, even though it sounds touchy feely
link |
00:51:51.380
and you know, like a bunch of technical people
link |
00:51:54.180
will sort of balk at the idea that like,
link |
00:51:56.180
you need to like have a clear mission. Like the mission is
link |
00:52:00.180
like very, very, very important.
link |
00:52:02.180
You're absolutely right.
link |
00:52:04.180
Stories, that's how our society,
link |
00:52:06.180
that's the fabric that connects us,
link |
00:52:08.180
all of us, is these powerful stories.
link |
00:52:10.180
And that works for companies too, right?
link |
00:52:12.180
It works for everything.
link |
00:52:14.180
Like, I mean, even down to like, you know,
link |
00:52:16.180
you sort of really think about it,
link |
00:52:18.180
like our currency, for instance, is a story.
link |
00:52:20.180
Our constitution is a story.
link |
00:52:22.180
Our laws are stories.
link |
00:52:24.180
I mean, like we believe very, very, very strongly in them.
link |
00:52:27.180
And thank God we do.
link |
00:52:29.180
But like they are,
link |
00:52:31.180
they're just abstract things.
link |
00:52:33.180
Like they're just words.
link |
00:52:34.180
Like if we don't believe in them, they're nothing.
link |
00:52:36.180
And in some sense, those stories are platforms,
link |
00:52:39.180
some of which Microsoft is creating, right?
link |
00:52:43.180
They have platforms on which we define the future.
link |
00:52:46.180
So last question, what do you,
link |
00:52:48.180
let's get philosophical maybe,
link |
00:52:50.180
bigger than even Microsoft,
link |
00:52:51.180
what do you think the next 20, 30 plus years
link |
00:52:56.180
look like for computing, for technology, for devices?
link |
00:53:00.180
Do you have crazy ideas about the future of the world?
link |
00:53:04.180
Yeah, look, I think we, you know,
link |
00:53:06.180
we're entering this time where we've got,
link |
00:53:10.180
we have technology that is progressing
link |
00:53:13.180
at the fastest rate that it ever has.
link |
00:53:15.180
And you've got,
link |
00:53:18.180
you've got some really big social problems,
link |
00:53:21.180
like society scale problems that we have to tackle.
link |
00:53:26.180
And so, you know, I think we're going to rise to the challenge
link |
00:53:28.180
and like figure out how to intersect
link |
00:53:30.180
like all of the power of this technology
link |
00:53:32.180
with all of the big challenges that are facing us,
link |
00:53:35.180
whether it's, you know, global warming,
link |
00:53:37.180
whether it's the fact that the biggest remainder of the population boom
link |
00:53:41.180
is in Africa for the next 50 years or so.
link |
00:53:46.180
And like global warming is going to make it increasingly difficult
link |
00:53:49.180
to feed the global population in particular,
link |
00:53:52.180
like in this place where you're going to have
link |
00:53:54.180
like the biggest population boom.
link |
00:53:57.180
I think we, you know, like AI is going to,
link |
00:54:01.180
like if we push it in the right direction,
link |
00:54:03.180
like it can do like incredible things to empower all of us
link |
00:54:07.180
to achieve our full potential and to, you know,
link |
00:54:12.180
like live better lives.
link |
00:54:15.180
But like that also means focusing on like
link |
00:54:20.180
some super important things.
link |
00:54:21.180
Like how can you apply it to healthcare to make sure that,
link |
00:54:26.180
you know, like our quality, cost,
link |
00:54:29.180
and sort of ubiquity of health coverage get better
link |
00:54:33.180
and better over time.
link |
00:54:35.180
Like that's more and more important every day, as
link |
00:54:38.180
in the United States and like the rest of the industrialized world,
link |
00:54:43.180
so Western Europe, China, Japan, Korea,
link |
00:54:45.180
like you've got this population bubble of like aging,
link |
00:54:50.180
working, you know, working age folks who are,
link |
00:54:54.180
you know, at some point over the next 20, 30 years,
link |
00:54:56.180
they're going to be largely retired.
link |
00:54:58.180
And like you're going to have more retired people
link |
00:55:00.180
than working age people.
link |
00:55:01.180
And then like you've got, you know,
link |
00:55:02.180
sort of natural questions about who's going to take care of
link |
00:55:05.180
all the old folks and who's going to do all the work.
link |
00:55:07.180
And the answers to like all of these sorts of questions,
link |
00:55:11.180
like where you're sort of running into, you know,
link |
00:55:13.180
like constraints of the, you know,
link |
00:55:16.180
the world and of society, have always been like,
link |
00:55:20.180
what tech is going to like help us get around this?
link |
00:55:23.180
Like when I was a kid in the 70s and 80s,
link |
00:55:26.180
like we talked all the time about like population boom,
link |
00:55:29.180
population boom, like we're going to,
link |
00:55:31.180
like we're not going to be able to like feed the planet.
link |
00:55:34.180
And like we were like right in the middle of the Green Revolution
link |
00:55:38.180
where like this massive technology driven increase
link |
00:55:44.180
in crop productivity was happening worldwide.
link |
00:55:47.180
And like some of that was like taking some of the things
link |
00:55:49.180
that we knew in the West and like getting them distributed
link |
00:55:52.180
to the, you know, to the developing world.
link |
00:55:55.180
And like part of it was things like, you know,
link |
00:55:59.180
just smarter biology like helping us increase yields.
link |
00:56:03.180
And like we don't talk about like overpopulation anymore
link |
00:56:08.180
because like we can more or less,
link |
00:56:10.180
we sort of figured out how to feed the world.
link |
00:56:12.180
Like that's a technology story.
link |
00:56:14.180
And so like I'm super, super hopeful about the future
link |
00:56:19.180
and in the ways where we will be able to apply technology
link |
00:56:24.180
to solve some of these super challenging problems.
link |
00:56:28.180
Like I've, like one of the things that I'm trying to spend
link |
00:56:33.180
my time doing right now is trying to get everybody else
link |
00:56:36.180
to be hopeful as well because, you know, back to Harari,
link |
00:56:39.180
like we are the stories that we tell.
link |
00:56:41.180
Like if we, you know, if we get overly pessimistic right now
link |
00:56:44.180
about like the potential future of technology,
link |
00:56:48.180
like we, you know, like we may fail to get all of the things
link |
00:56:53.180
in place that we need to like have our best possible future.
link |
00:56:56.180
And that kind of hopeful optimism, I'm glad that you have it
link |
00:57:00.180
because you're leading large groups of engineers
link |
00:57:03.180
that are actually defining, that are writing that story,
link |
00:57:06.180
that are helping build that future, which is super exciting.
link |
00:57:09.180
And I agree with everything you said except I do hope
link |
00:57:13.180
Clippy comes back.
link |
00:57:15.180
We miss him. I speak for the people.
link |
00:57:19.180
So, Kevin, thank you so much for talking to me.
link |
00:57:21.180
Thank you so much for having me. It was a pleasure.