
Kevin Scott: Microsoft CTO | Lex Fridman Podcast #30


00:00:00 The following is a conversation with Kevin Scott, the CTO of Microsoft. Before that, he was the senior vice president of engineering and operations at LinkedIn, and before that, he oversaw mobile ads engineering at Google. He also has a podcast called Behind the Tech with Kevin Scott, which I'm a fan of. This was a fun and wide-ranging conversation that covered many aspects of computing. It happened over a month ago, before the announcement of Microsoft's investment in OpenAI that a few people have asked me about. I'm sure there'll be one or two people in the future that'll talk with me about the impact of that investment. This is the Artificial Intelligence podcast. If you enjoy it, subscribe on YouTube, give it five stars on iTunes, support it on Patreon, or simply connect with me on Twitter at Lex Fridman, spelled F R I D M A N. And I'd like to give a special thank you to Tom and Elanti Bighausen for their support of the podcast on Patreon. Thanks Tom and Elanti, hope I didn't mess up your last name too bad. Your support means a lot, and inspires me to keep this series going. And now, here's my conversation with Kevin Scott.
00:01:18 You've described yourself as a kid in a candy store at Microsoft, because of all the interesting projects that are going on. Can you try to do the impossible task and give a brief whirlwind view of all the spaces that Microsoft is working in, both research and product?
00:01:37 If you include research, it becomes even more difficult. So I think, broadly speaking, Microsoft's product portfolio includes everything from a big cloud business, like a big set of SaaS services. We have sort of the original, or like some of what are among the original, productivity software products that everybody uses. We have an operating system business. We have a hardware business where we make everything from computer mice and headphones to high end personal computers and laptops. We have a fairly broad ranging research group, where we have people doing everything from economics research. So there's this really smart young economist, Glen Weyl, who like my group works with a lot, who's doing this research on these things called radical markets. Like he's written an entire technical book about this whole notion of radical markets. So the research group sort of spans from that, to human computer interaction, to artificial intelligence. And we have GitHub, we have LinkedIn, we have a search, advertising, and news business, and like probably a bunch of stuff that I'm embarrassingly not recounting in this list.
00:03:11 Oh, gaming too, Xbox and so on, right?
00:03:12 Yeah, gaming for sure. Like I was having a super fun conversation this morning with Phil Spencer. So when I was in college, there was this game that LucasArts made called Day of the Tentacle that my friends and I played forever. And like we're doing some interesting collaboration now with the folks who made Day of the Tentacle. And I was completely nerding out this morning with Tim Schafer, like the guy who wrote Day of the Tentacle, just a complete fanboy, which, you know, sort of happens a lot. Like, you know, Microsoft has been doing so much stuff at such breadth for such a long period of time that, you know, being CTO, most of the time my job is very, very serious, and sometimes I get caught up in how amazing it is to be able to have the conversations that I have with the people I get to have them with.
00:04:14 You had to reach back into the sentimental there. And what's radical markets in economics?
00:04:21 So the idea with radical markets is, like, can you come up with new market based mechanisms to... you know, we're having this debate right now: does capitalism work, like, do free markets work? Can the incentive structures that are built into these systems produce outcomes that are creating sort of equitably distributed benefits for every member of society? You know, and I think it's a reasonable set of questions to be asking. And so, like, one mode of thought there, if you have doubts that the markets are actually working, you can sort of tip towards, okay, let's become more socialist, and have central planning, and governments or some other central organization make a bunch of decisions about how work gets done and where the investments and the outputs of those investments get distributed. Glen's notion is: lean more into the market based mechanisms. So for instance, and this is one of the more radical ideas, suppose that you had a radical pricing mechanism for assets like real estate, where you could be bid out of your position in your home, you know, for instance. So if somebody came along and said, you know, I can find higher economic utility for this piece of real estate that you're running your business in, then you either have to, you know, sort of bid to stay, or the thing that's got the higher economic utility, you know, sort of takes over the asset. And that would make it very difficult to have the same sort of rent seeking behaviors that you've got right now, because if you did speculative bidding, you would very quickly lose a whole lot of money. And so the prices of the assets would be very closely indexed to the value that they can produce. And because you'd have this sort of real time mechanism that would force you to mark the value of the asset to the market, then it could be taxed appropriately. Like you couldn't sort of sit on this thing and say, oh, this house is only worth 10,000 bucks, when everything around it is worth 10 million.
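
To make that concrete, here is a minimal sketch of a Harberger-style self-assessed price scheme, one flavor of the radical markets idea described above; the owner names, tax rate, and prices are invented for illustration:

```python
# Minimal sketch of a self-assessed ("Harberger") price mechanism.
# Owners declare a price, pay tax on it, and must sell at it on demand.

from dataclasses import dataclass

TAX_RATE = 0.07  # annual tax as a fraction of the self-assessed price

@dataclass
class Asset:
    owner: str
    declared_price: float  # the owner marks the asset to market themselves

    def annual_tax(self) -> float:
        # Taxing the declared price makes under-valuing the asset costly...
        return self.declared_price * TAX_RATE

    def try_buy(self, buyer: str, offer: float) -> bool:
        # ...while the standing obligation to sell at that price makes
        # over-valuing (rent seeking, squatting on it) costly too.
        if offer >= self.declared_price:
            self.owner, self.declared_price = buyer, offer
            return True
        return False

house = Asset(owner="alice", declared_price=10_000)
print(house.annual_tax())            # 700.0: declaring a low price is cheap...
print(house.try_buy("bob", 15_000))  # True: ...but anyone can then bid you out
print(house.owner, house.declared_price)  # bob 15000 (re-marked to market)
```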
00:07:06 That's really interesting. So it's an incentive structure where the prices match the value much better.
00:07:13 Yeah. And Glen does a much, much better job than I do at selling it, and I probably picked the world's worst example, and it's intentionally provocative. So like this whole notion, you know, I'm not sure whether I like this notion that we can have a set of market mechanisms where I could get bid out of my property. But, you know, if you're thinking about something like Elizabeth Warren's wealth tax, for instance, it'd be really interesting how you would actually set the price on the assets. And you might have to have a mechanism like that if you put a tax like that in place.
00:07:54 It's really interesting that that kind of research is, at least tangentially, touching Microsoft Research.

00:07:59 Yeah.

00:08:00 So if you're really thinking broadly, maybe you can speak to how this connects to AI. So we have a candidate, Andrew Yang, who kind of talks about artificial intelligence and the concern that people have about, you know, automation's impact on society. And arguably, Microsoft is at the cutting edge of innovation in all these kinds of ways, and so it's pushing AI forward. How do you think about combining all our conversations together here, with radical markets and socialism and innovation in AI that Microsoft is doing, and then Andrew Yang's worry that that will result in job loss for the lower and so on? How do you think about that?
00:08:47 I think it's sort of one of the most important questions in technology, like maybe even in society right now: how is AI going to develop over the course of the next several decades, and what's it going to be used for, and what benefits will it produce, and what negative impacts will it produce, and, you know, who gets to steer this whole thing? You know, I'll say at the highest level, one of the real joys of getting to do what I do at Microsoft is that Microsoft has this heritage as a platform company. And so, you know, Bill has this thing that he said a bunch of years ago, where the measure of a successful platform is that it produces far more economic value for the people who build on top of the platform than is created for the platform owner or builder. And I think we have to think about AI that way. Like it has to be a platform that other people can use to build businesses, to fulfill their creative objectives, to be entrepreneurs, to solve problems that they have in their work and in their lives. It can't be a thing where there are a handful of companies, sitting in a very small handful of cities geographically, who are making all the decisions about what goes into the AI, and then, on top of all this infrastructure, build all of the commercially valuable uses for it. So I think that's bad from a, you know, sort of economics and equitable distribution of value perspective, you know, sort of back to this whole notion of, do the markets work? But I think it's also bad from an innovation perspective, because I have infinite amounts of faith in human beings that if you, you know, give folks powerful tools, they will go do interesting things. And it's more than just a few tens of thousands of people with the interesting tools; it should be millions of people with the tools. So, you know, you think about the steam engine in the late 18th century. It was, you know, maybe the first large scale substitute for human labor that we've built, like a machine. And, you know, in the beginning, when these things were getting deployed, the folks who got most of the value from the steam engines were the folks who had capital, so they could afford to build them, and they built factories around them and businesses, and the experts who knew how to build and maintain them. But access to that technology democratized over time. Like now an engine is not a differentiated thing. There isn't one engine company that builds all the engines, and all of the things that use engines are made by this company, and they get all the economics from all of that. No, it's fully democratized. You know, we're sitting here in this room, and there are probably things, you know, like the MEMS gyroscopes that are in both of our... like there's little engines, you know, sort of everywhere. They're just a component in how we build the modern world. Like AI needs to get there.
00:12:17 Yeah, so that's a really powerful way to think about it: if we think of AI as a platform, versus a tool that Microsoft owns, as a platform that enables creation on top of it, that's the way to democratize it. That's really interesting, actually. And Microsoft throughout its history has been positioned well to do that.
00:12:38 And, you know, the tie back to this radical markets thing: my team has been working with Glen on this, and Jaron Lanier, actually. So Jaron is like the sort of father of virtual reality. He's one of the most interesting human beings on the planet, like a sweet, sweet guy. And so Jaron and Glen and folks in my team have been working on this notion of data as labor, or, like they call it, data dignity as well. And so the idea is that if you, again, going back to this sort of industrial analogy, if you think about data as the raw material that is consumed by the machine of AI in order to do useful things, then we're not doing a really great job right now in having transparent marketplaces for valuing those data contributions. And we all make them, like, explicitly: you go to LinkedIn, you sort of set up your profile on LinkedIn; that's an explicit contribution. You know exactly the information that you're putting into the system, and you put it there because you have some nominal notion of what value you're going to get in return. But it's only nominal; you don't know exactly what value you're getting in return. Like, the service is free, you know; it's some low amount of perceived value. And then you've got all this indirect contribution that you're making just by virtue of interacting with all of the technology that's in your daily life. And so what Glen and Jaron and this data dignity team are trying to do is, can we figure out a set of mechanisms that let us value those data contributions, so that you could create an economy and a set of controls and incentives that would allow people to, maybe even in the limit, earn part of their living through the data that they're creating. And you can sort of see it in explicit ways. There are these companies, like Scale AI, and there are a whole bunch of them in China right now, that are basically data labeling companies. So if you're doing supervised machine learning, you need lots and lots of labeled training data. And the people who work for those companies are getting compensated for their data contributions into the system. And so...
00:15:07 That's easier to put a number on their contribution, because they're explicitly labeling data.

00:15:11 Correct.

00:15:12 But you're saying that we're all contributing data in different kinds of ways, and it's fascinating to start to explicitly try to put a number on it. Do you think that's possible?
00:15:22 I don't know; it's hard. It really is. Because, you know, we don't have as much transparency as I think we need in how the data is getting used. And it's, you know, super complicated. I think we as technologists sort of appreciate some of the subtlety there. It's like, you know, the data gets created, and then... it's not valuable on its own. Like the data exhaust that you give off, or the explicit data that I am putting into the system, isn't super valuable atomically. It's only valuable when you sort of aggregate it together into, you know, sort of large numbers. That's true even for these folks who are getting compensated for labeling things: for supervised machine learning now, you need lots of labels to train, you know, a model that performs well. And so, you know, I think that's one of the challenges. It's like, how do you sort of figure out, because this data is getting combined in so many ways, like, through these combinations, how the value is flowing?
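
One toy way to picture "how the value flows" from individual data contributions is leave-one-out valuation, a simple cousin of the Shapley-value methods studied in the data-valuation literature; the synthetic dataset and model below are assumptions for illustration, not anything Microsoft uses:

```python
# Leave-one-out data valuation: price each training example by how much
# test accuracy drops when that single example is removed.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def accuracy(X, y):
    return LogisticRegression(max_iter=1000).fit(X, y).score(X_te, y_te)

base = accuracy(X_tr, y_tr)
values = []
for i in range(len(X_tr)):
    keep = np.arange(len(X_tr)) != i          # drop contributor i
    values.append(base - accuracy(X_tr[keep], y_tr[keep]))

print(f"baseline accuracy: {base:.3f}")
print(f"largest single-example value: {max(values):.4f}")
# The deltas are tiny: atomically each point is nearly worthless, and value
# only appears in aggregate, which is exactly the attribution problem above.
```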
00:16:35 Yeah, that's fascinating. And it's fascinating that you're thinking about this. I wasn't even going into this conversation expecting the breadth of research, really, that Microsoft broadly is thinking about, that you are thinking about at Microsoft. So if we go back to '89, when Microsoft released Office, or 1990, when they released Windows 3.0: I know you weren't there through its entire history, but in your view, how has the company changed in the 30 years since, as you look at it now?
00:17:12 The good thing is it started off as a platform company, and it's still a platform company. Like the parts of the business that are thriving and most successful are those that are building platforms. And the mission of the company now... the mission's changed. It's changing in a very interesting way. So, you know, back in '89, '90, they were still on the original mission, which was: put a PC on every desk and in every home. And it was basically about democratizing access to this new personal computing technology, which, when Bill started the company... integrated circuit microprocessors were a brand new thing, and people were building, you know, homebrew computers from kits, the way people build ham radios right now. And I think this is sort of the interesting thing for folks who build platforms in general: Bill saw the opportunity there, and what personal computers could do. And it was sort of a reach. Like, you just sort of imagine where things were, you know, when they started the company, versus where things are now. In success, when you democratize a platform, it just sort of vanishes into the platform. You don't pay attention to it anymore. Like, operating systems aren't a thing anymore. They're super important, like completely critical, and, you know, when you see one fail, you sort of understand. But, you know, it's not a thing where you're waiting for the next operating system thing in the same way that you were in 1995, right? Like in 1995, you know, we had the Rolling Stones on the stage with the Windows 95 rollout. It was like the biggest thing in the world. Everybody lined up for it the way that people used to line up for the iPhone. But, you know, eventually, and this isn't necessarily a bad thing, the success is that it becomes ubiquitous. It's everywhere, and human beings, when their technology becomes ubiquitous, just sort of start taking it for granted. So the mission now, which Satya rearticulated five plus years ago when he took over as CEO of the company: our mission is to empower every individual and every organization in the world to be more successful. And so, you know, again, that's a platform mission. And the way that we do it now is different. We have a hyperscale cloud that people are building their applications on top of. We have a bunch of AI infrastructure that people are building their AI applications on top of. We have, you know, a productivity suite of software, like Microsoft Dynamics, which, you know, some people might not think is the sexiest thing in the world, but it's helping people figure out how to automate all of their business processes and workflows, and, you know, helping the businesses using it to grow and be more successful. So it's a much broader vision, in a way, now than it was back then; it was a sort of very particular thing. And now we live in this world where technology is so powerful, and it's such a basic fact of life that it, you know, both exists and is going to get better and better over time, or at least more and more powerful over time. So, you know, what you have to do as a platform player is just much bigger.
00:20:49 Right. There are so many directions in which you can transform. You didn't mention mixed reality, too. You know, that's probably early days, or depends how you think of it; but if we think on a scale of centuries, it's the early days of mixed reality.

00:21:04 Oh, for sure.

00:21:04 And so, yeah, with how it lands, Microsoft is doing some really interesting work there. Do you touch that part of the effort? What's the thinking? Do you think of mixed reality as a platform too?
00:21:17 Oh, sure. When we look at what the platforms of the future could be, it's fairly obvious that AI is one. Like you don't have to, I mean, you sort of say it to someone, and, you know, they get it. But we also think of mixed reality and quantum as these two interesting, you know, potentially...

00:21:40 Quantum computing.

00:21:41 Yeah.
00:21:42 Okay, so let's get crazy then. So you're talking about some futuristic things here. Well, the mixed reality, Microsoft is really... it's not even futuristic, it's here.

00:21:52 It is.

00:21:53 Incredible stuff.
00:21:54 And look, it's having an impact right now. Like one of the more interesting things that's happened with mixed reality over the past couple of years, that I didn't clearly see, is that it's become the computing device, for doing their work, for folks who haven't used any computing device at all to do their work before: technicians and service folks and people who are doing machine maintenance on factory floors. Because, you know, they're mobile, and they're out in the world, and they're working with their hands, you know, sort of servicing these very complicated things, they don't use their mobile phone, and they don't carry a laptop with them, and they're not tethered to a desk. And so mixed reality, like where it's getting traction right now, where HoloLens is selling a lot of units, is for these sorts of applications for these workers. And it's become, I mean, the people love it. They're like, oh my God, this is, for them, the same sort of productivity boost that, you know, an office worker had when they got their first personal computer.
00:23:08 Yeah. But you did mention it's certainly obvious, AI as a platform. Can we dig into it a little bit? How does AI begin to infuse some of the products in Microsoft? So currently, providing training of, for example, neural networks in the cloud, or providing pre-trained models, or just even providing computing resources and whatever different inference that you want to do using neural networks.

00:23:39 Yep.

00:23:40 Well, how do you think of AI infusing the products, as a platform that Microsoft can provide?
00:23:45 Yeah, I mean, I think it's super interesting. It's like everywhere. And we run these review meetings now where it's me and Satya and members of Satya's leadership team, and a cross functional group of folks across the entire company who are working on either AI infrastructure, or have some substantial part of their product work using AI in some significant way. Now, the important thing to understand is, when you think about how the AI is going to manifest in an experience, for something that's going to make it better, I think you don't want the AI itself to be the first order thing. It's like, whatever the product is and the thing that it's trying to help you do, the AI just sort of makes it better. And, you know, this is a gross exaggeration, but yeah, people get super excited about where the AI is showing up in products, and I'm like, do you get that excited about where you're using a hash table in your code? It's just another... it's a very interesting programming tool, but it's sort of an engineering tool, and so it shows up everywhere. So we've got dozens and dozens of features now in Office that are powered by fairly sophisticated machine learning. Our search engine wouldn't work at all if you took the machine learning out of it. And, increasingly, you know, things like content moderation on our Xbox and xCloud platform.

00:25:37 When you say moderation, you mean like the recommender, like showing what you want to look at next?

00:25:41 No, no, no. It's like anti-bullying stuff.

00:25:44 So the usual social network stuff that you have to deal with.

00:25:47 Yeah, correct. But it's really targeted towards a gaming audience. So it's a very particular type of thing, where, you know, the line between playful banter and legitimate bullying is a subtle one, and you have to... it's sort of tough.
00:26:06 I'd love to, if we could, dig into it, because you also led the engineering efforts at LinkedIn. And if we look at LinkedIn as a social network, and if we look at the Xbox gaming as the social component, there are very different kinds of, I imagine, communication going on on the two platforms, right? And the line in terms of bullying and so on is different on the two platforms. So how do you... I mean, it's such a fascinating philosophical discussion of where that line is. I don't think anyone knows the right answer. Twitter folks are under fire now, Jack at Twitter, for trying to find that line. Nobody knows what that line is. But how do you try to find the line for, you know, trying to prevent abusive behavior, and at the same time letting people be playful and joke around and that kind of thing?
00:27:02 I think in a certain way, you know, if you have what I would call vertical social networks, it gets to be a little bit easier. So if you have a clear notion of what your social network should be used for, or what you are designing a community around, then you don't have as many dimensions to your sort of content safety problem as you do in a general purpose platform. I mean, on LinkedIn, the whole social network is about connecting people with opportunity, whether it's helping them find a job, or, you know, sort of find mentors, or help them find their next sales lead, or just sort of allow them to broadcast their, you know, sort of professional identity to their network of peers and collaborators and, you know, sort of professional community. Like that is, I mean, in some ways that's very, very broad, but in other ways, you know, it's narrow. And so you can build AIs, like machine learning systems, that are, you know, capable, within those boundaries, of making better automated decisions about what is, you know, sort of an inappropriate and offensive comment, or dangerous comment, or illegal content, when you have some constraints. You know, same thing with the gaming social network. So for instance, it's about playing games, about having fun, and the thing that you don't want to have happen on the platform is why bullying is such an important thing. Like, bullying is not fun. So you want to do everything in your power to encourage that not to happen.
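
As a toy sketch of the kind of bounded classifier this describes (not Microsoft's actual system), here is a tiny banter-versus-bullying scorer for a gaming vertical; the training comments, labels, and threshold are invented placeholders, and a real system would use far more data and richer models:

```python
# A narrow-domain moderation sketch: score gaming chat for bullying.
# Constraining the vertical (gaming banter) is what makes a simple
# classifier plus a single review threshold even remotely plausible.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

comments = [
    "gg nice shot", "lol you got me", "rematch?",              # banter
    "uninstall the game, you're worthless",
    "nobody wants you here", "go away, loser",                 # bullying
]
labels = [0, 0, 0, 1, 1, 1]  # 1 = flag for human review

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

score = model.predict_proba(["you're worthless, quit"])[0][1]
print(f"bullying score: {score:.2f}")  # route to human review above a threshold
```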
00:28:56 And, yeah, I think it's a really important thing, but I think it's sort of a tough problem in general. It's one where I think, you know, eventually we're going to have to have some sort of clarification from our policymakers about what it is that we should be doing, like where the lines are, because it's tough. Like, in a democracy, right, you want some sort of democratic involvement: people should have a say in where the lines are drawn. You don't want a bunch of people making unilateral decisions. And we are in a state right now, for some of these platforms, where you actually do have to make unilateral decisions, where the policymaking isn't going to happen fast enough to prevent very bad things from happening. But we need the policymaking side of that to catch up, I think, as quickly as possible, because you want that whole process to be a democratic thing, not, you know, some sort of weird thing where you've got a non-representative group of people making decisions that have, you know, national and global impact.
00:30:12 And it's fascinating, because the digital space is different from the physical space in which nations and governments were established. And so what policy looks like globally, what bullying looks like globally, what healthy communication looks like globally, is an open question, and we're all figuring it out together, which is fascinating.

00:30:32 Yeah, I mean, with, you know, sort of fake news, for instance, and deep fakes, and fake news generated by humans.
00:30:44 Yeah, so we can talk about deep fakes. I think that is another, you know, sort of very interesting level of complexity. But if you think about just the written word, right: we invented papyrus, what, 3,000 years ago, where you could sort of put word on paper. And then 500 years ago, we get the printing press, where the word gets a little bit more ubiquitous. And then you really didn't get ubiquitous printed word until the end of the 19th century, when the offset press was invented. And then, you know, it just sort of explodes, and, you know, the cross product of that and the industrial revolution's need for educated citizens resulted in this rapid expansion of literacy and the rapid expansion of the word. But we had 3,000 years up to that point to figure out, like, what's journalism, what's editorial integrity, what's scientific peer review. And so you built all of this mechanism to try to filter through all of the noise that the technology made possible, to, you know, sort of get to something that society could cope with. And if you think about just the PC: the PC didn't exist 50 years ago. And so in this span of, you know, half a century, we've gone from no ubiquitous digital technology to having a device that sits in your pocket where you can sort of say whatever is on your mind to... like, Mary Meeker just released her new slide deck last week: you know, we've got 50% penetration of the internet to the global population. Like there are three and a half billion people who are connected now. So it's crazy, like, inconceivable how fast all of this happened. So, you know, it's not surprising that we haven't figured out what to do yet, but we've got to really lean into this set of problems, because we basically have three millennia worth of work to do about how to deal with all of this, in what probably amounts to the next decade's worth of time.
00:33:07 So since we're on the topic of tough, you know, tough, challenging problems: let's look at one more on the tooling side in AI that Microsoft is working on, face recognition software. So there are a lot of powerful, positive use cases for face recognition, but there are some negative ones, and we're seeing those in different governments in the world. So how does Microsoft think about the use of face recognition software, as a platform, in governments and companies? How do we strike an ethical balance here?
00:33:42 Yeah, I think we've articulated a clear point of view. So Brad Smith wrote a blog post last fall, I believe, that sort of outlined very specifically what, you know, our point of view is there. And, you know, I think we believe that there are certain uses to which face recognition should not be put, and we believe, again, that there's a need for regulation there. Like, the government should really come in and say, you know, this is where the lines are. And we very much want figuring out where the lines are to be a democratic process. But in the short term, we've drawn some lines, where, you know, we push back against uses of face recognition technology. You know, the city of San Francisco, for instance, I think, has completely outlawed any government agency from using face recognition tech, and that may prove to be a little bit overly broad. But for certain law enforcement things, I would personally rather be overly cautious in terms of restricting use of it until we have, you know, sort of defined a reasonable, democratically determined regulatory framework for where we could and should use it. And, you know, the other thing there is, we've got a bunch of research that we're doing, and a bunch of progress that we've made, on bias there. And there are all sorts of weird biases that these models can have, all the way from the most noteworthy one, where, you know, you may have underrepresented minorities who are underrepresented in the training data, and then you start learning strange things. But there are even, you know, other weird things. Like, I think we've seen in the public research that models can learn strange things, like "all doctors are men," for instance. Yeah, I mean, and so it really is a thing where it's very important for everybody who is working on these things, before they push publish, they launch the experiment, they push the code online, or they even publish the paper, that they are at least starting to think about what some of the potential negative consequences of some of this stuff are. I mean, this is where, you know, the deep fake stuff I find very worrisome, just because there are going to be some very good, beneficial uses of GAN generated imagery. And, funny enough, one of the places where it's actually useful is that we're using the technology right now to generate synthetic visual data for training some of the face recognition models, to get rid of the bias. So that's one, like, super good use of the tech. But, you know, it's getting good enough now where it's going to sort of challenge a normal human being's ability to tell the difference. Like, right now you can sort of say it's very expensive for someone to fabricate a photorealistic fake video, and GANs are going to make it fantastically cheap to fabricate a photorealistic fake video. And so what you assume you can sort of trust as true, versus be skeptical about, is about to change. And we're not ready for it, I don't think.
00:37:40 The nature of truth, right. It's also exciting, because I think both you and I probably would agree that the way to take on that challenge is with technology.

00:37:52 Yeah.

00:37:52 Right. There are probably going to be ideas of ways to verify which kind of video is legitimate and which kind is not. So to me, that's an exciting possibility; most likely for just the comedic genius that the internet usually creates with these kinds of videos, and hopefully it will not result in any serious harm.
00:38:13 Yeah. And it could be, you know, I think we will have technology that may be able to detect whether or not something's fake or real, although the fakes are pretty convincing, even when you subject them to machine scrutiny. But, you know, we also have these increasingly interesting social networks, you know, that are under fire right now for some of the bad things that they do. One of the things you could choose to do with a social network is you could use crypto and the networks to have content signed, where you could have a full chain of custody that accompanied every piece of content. So when you're viewing something, and you want to ask yourself, how much can I trust this? You can click something and have a verified chain of custody that shows, oh, this is coming from, you know, this source, and it's signed by someone whose identity I trust. Yeah, I think having that chain of custody, being able to say, oh, here's this video; it may or may not have been produced using some of this deep fake technology; but if you've got a verified chain of custody, where you can sort of trace it all the way back to an identity, you can decide whether or not, like, I trust this identity. Oh, no, this is really from the White House, or this is really from the, you know, the office of this particular presidential candidate, or it's really from, you know, Jeff Weiner, CEO of LinkedIn, or Satya Nadella, CEO of Microsoft. That might be one way that you can solve some of the problems. And that's not super high tech; we've had all of this technology forever. But I think you're right: it has to be some sort of technological thing, because the underlying tech that is used to create this is not going to do anything but get better over time, and the genie is sort of out of the bottle. There's no stuffing it back in.
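
A minimal sketch of that content-signing idea, using an Ed25519 signature as one link in a chain of custody; the keys and content here are toys, and a real system would also need key distribution, identity verification, and a full provenance chain:

```python
# Content signing sketch: a publisher signs content, and a consumer
# verifies the signature against a public key tied to an identity
# they trust. A broken signature means the custody chain is broken.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

publisher_key = Ed25519PrivateKey.generate()   # held by, say, a campaign office
public_key = publisher_key.public_key()        # published for verification

video_bytes = b"...raw video content..."
signature = publisher_key.sign(video_bytes)    # attached as metadata

def custody_intact(content: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, content)        # raises if content was altered
        return True
    except InvalidSignature:
        return False

print(custody_intact(video_bytes, signature))        # True: provenance checks out
print(custody_intact(b"tampered video", signature))  # False: don't trust it
```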
00:40:22 And there's a social component, which I think is really healthy for democracy, where people will be skeptical about the thing they watch.

00:40:30 Yeah.

00:40:31 In general, so, you know, which is good. Skepticism in general is good for how you consume content. So deep fakes, in that sense, are creating global skepticism about whether people can trust what they read. It encourages further research. I come from the Soviet Union, where basically nobody trusted the media, because you knew it was propaganda. And that kind of skepticism encouraged further research about ideas, as opposed to just trusting any one source.
00:41:02 Well, I think it's one of the reasons why the, you know, the scientific method and our apparatus of modern science are so good. Because you don't have to trust anything. The whole notion of, you know, modern science, beyond the fact that, you know, this is a hypothesis and this is an experiment to test the hypothesis, and, you know, this is a peer review process for scrutinizing published results: stuff's also supposed to be reproducible. So, you know, it's been vetted by this process, but you also are expected to publish enough detail where, if you are sufficiently skeptical of the thing, you can go try to reproduce it yourself. And, like, I don't know what it is; I think a lot of engineers are like this, where, you know, your brain is sort of wired for skepticism. You don't just first order trust everything that you see and encounter, and you're sort of curious to understand, you know, the next thing. But I think it's an entirely healthy thing, and we need a little bit more of that right now.
00:42:12 So I'm not a large business owner; I'm just a huge fan of many of Microsoft's products. I mean, actually, I generate a lot of graphics and images, and I still use PowerPoint to do that. It beats Illustrator for me, even professionally. It's fascinating. So I wonder, what does the future of, let's say, Windows and Office look like? Do you see it? I mean, I remember looking forward to XP; it was exciting when XP was released. Just like you said, I don't remember when 95 was released, but XP for me was a big celebration. And when 10 came out, I was like, okay, well, it's nice; it's a nice improvement. So what do you see as the future of these products?
00:43:02 You know, I think there's a bunch of excitement. I mean, on the Office front, there are going to be these increasing productivity wins that are coming out of some of these AI powered features that are coming. Like, the products will sort of get smarter and smarter in a very subtle way. There's not going to be this big bang moment where, you know, Clippy is going to reemerge and it's going to be...
00:43:27 Wait a minute. Okay, well, wait, wait, wait. Is Clippy coming back? Well, quite seriously: the injection of AI... there's not much, or at least I'm not familiar with, sort of assistive type of stuff going on inside the Office products, like a Clippy style assistant, a personal assistant. Do you think that there's a possibility of that in the future?
link |
00:43:52.000
So I think there are a bunch of like very small ways
link |
00:43:54.680
in which like machine learning power
link |
00:43:57.320
and assistive things are in the product right now.
link |
00:44:00.080
So there are a bunch of interesting things,
link |
00:44:04.800
like the auto response stuff's getting better and better
link |
00:44:09.280
and it's like getting to the point where, you know,
link |
00:44:12.160
it can auto respond with like, okay,
link |
00:44:14.960
like this person is clearly trying to schedule a meeting,
link |
00:44:19.080
so it looks at your calendar and it automatically
link |
00:44:21.520
like tries to find like a time and a space
link |
00:44:24.080
that's mutually interesting.
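To make that scheduling idea concrete, here is a minimal sketch of finding a mutually free slot across two calendars. The interval representation and function names are assumptions for illustration, not any actual Office or Outlook API:

```typescript
// Hypothetical sketch: find the first time slot free on both calendars.
interface Interval {
  start: number; // minutes from midnight
  end: number;
}

// Merge both people's busy intervals, then scan the gaps for one that
// can fit the requested meeting length.
function findMutualSlot(
  busyA: Interval[],
  busyB: Interval[],
  dayStart: number,
  dayEnd: number,
  lengthMin: number
): Interval | null {
  const busy = [...busyA, ...busyB].sort((x, y) => x.start - y.start);
  let cursor = dayStart;
  for (const b of busy) {
    if (b.start - cursor >= lengthMin) {
      return { start: cursor, end: cursor + lengthMin };
    }
    cursor = Math.max(cursor, b.end);
  }
  return dayEnd - cursor >= lengthMin
    ? { start: cursor, end: cursor + lengthMin }
    : null;
}

// Example: a 9:00-17:00 day (in minutes), looking for a 30-minute slot.
const slot = findMutualSlot(
  [{ start: 540, end: 600 }], // A is busy 9:00-10:00
  [{ start: 600, end: 660 }], // B is busy 10:00-11:00
  540, 1020, 30
);
console.log(slot); // { start: 660, end: 690 } -> 11:00-11:30
```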
link |
00:44:26.240
Like we have this notion of Microsoft Search
link |
00:44:33.520
where it's like not just web search,
link |
00:44:34.960
but it's like search across like all of your information
link |
00:44:38.200
that's sitting inside of like your Office 365 tenant
link |
00:44:43.320
and like, you know, potentially in other products.
link |
00:44:46.880
And like we have this thing called the Microsoft Graph
link |
00:44:49.680
that is basically an API federator that, you know,
link |
00:44:53.400
sort of like gets you hooked up across the entire breadth
link |
00:44:57.960
of like all of the, you know,
link |
00:44:59.760
like what were information silos
link |
00:45:01.640
before they got woven together with the graph.
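As a rough illustration of that federator idea, here is a sketch of reaching several of those formerly siloed surfaces through one client. The endpoint paths reflect Microsoft Graph v1.0 as commonly documented, but treat the details, and the omitted token acquisition, as assumptions:

```typescript
// One authenticated endpoint fronting mail, files, identity, etc.
const GRAPH = "https://graph.microsoft.com/v1.0";

async function graphGet(path: string, accessToken: string) {
  const res = await fetch(`${GRAPH}${path}`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Graph call failed: ${res.status}`);
  return res.json();
}

// The same client and token reach what used to be separate silos:
async function demo(token: string) {
  const me = await graphGet("/me", token);                 // identity
  const mail = await graphGet("/me/messages", token);      // email
  const docs = await graphGet("/me/drive/recent", token);  // files
  console.log(me.displayName, mail.value.length, docs.value.length);
}
```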
link |
00:45:05.680
Like that is like getting,
link |
00:45:07.880
with increasing effectiveness,
link |
00:45:09.160
sort of plumbed
link |
00:45:11.280
into some of these auto response things
link |
00:45:13.120
where you're going to be able to see the system
link |
00:45:15.840
like automatically retrieve information for you.
link |
00:45:18.200
Like if, you know, like I frequently send out,
link |
00:45:21.160
you know, emails to folks where like I can't find a paper
link |
00:45:24.080
or a document or whatnot.
link |
00:45:25.400
There's no reason why the system won't be able
link |
00:45:26.840
to do that for you.
link |
00:45:27.680
And like, I think
link |
00:45:29.560
it's building towards like having things that look more
link |
00:45:33.640
like a fully integrated, you know, assistant,
link |
00:45:37.880
but like you'll have a bunch of steps
link |
00:45:40.720
that you will see before you,
link |
00:45:42.800
like it will not be this like big bang thing
link |
00:45:45.120
where like Clippy comes back and you've got this like,
link |
00:45:47.400
you know, manifestation of, you know,
link |
00:45:49.360
like a fully, fully powered assistant.
link |
00:45:53.320
So I think that's, that's definitely coming out.
link |
00:45:56.920
Like all of the, you know, collaboration,
link |
00:45:58.680
co-authoring stuff's getting better.
link |
00:46:00.720
You know, it's like really interesting.
link |
00:46:02.200
Like if you look at how we use the Office product portfolio
link |
00:46:08.320
at Microsoft, like more and more of it is happening
link |
00:46:10.840
inside of like Teams as a canvas.
link |
00:46:14.480
And like it's this thing where, you know,
link |
00:46:17.160
you've got collaboration, like,
link |
00:46:19.840
at the center of the product.
link |
00:46:21.560
And like we built some like really cool stuff,
link |
00:46:26.720
some of which is about to be open sourced,
link |
00:46:29.440
that are sort of framework level things
link |
00:46:33.120
for doing co-authoring.
link |
00:46:35.600
That's awesome.
link |
00:46:36.440
So, is there a cloud component to that?
link |
00:46:38.920
So on the web or is it,
link |
00:46:41.880
forgive me if I don't already know this,
link |
00:46:43.640
but with Office 365,
link |
00:46:45.600
we still, the collaboration we do, if we're doing Word,
link |
00:46:48.480
we're still sending the file around.
link |
00:46:50.640
No, no, no, no.
link |
00:46:51.480
So this is,
link |
00:46:53.400
we're already a little bit better than that.
link |
00:46:55.240
And like, you know, so like the fact that you're unaware
link |
00:46:57.360
of it means we've got a better job to do,
link |
00:46:59.120
like helping you discover this stuff.
link |
00:47:02.880
But yeah, I mean, it's already like got a huge,
link |
00:47:06.360
huge cloud component.
link |
00:47:07.200
And like part of, you know, part of this framework stuff,
link |
00:47:09.680
I think we're calling it, like,
link |
00:47:12.640
like we've been working on it for a couple of years.
link |
00:47:14.520
So like, I know the internal code name for it,
link |
00:47:17.200
but I think when we launched it at Build,
link |
00:47:18.640
it's called the Fluid Framework.
link |
00:47:21.920
But like what Fluid lets you do is like,
link |
00:47:25.080
you can go into a conversation that you're having in Teams
link |
00:47:27.920
and like reference, like part of a spreadsheet
link |
00:47:30.280
that you're working on,
link |
00:47:32.600
where somebody's like sitting in the Excel canvas,
link |
00:47:35.600
like working on the spreadsheet with a, you know,
link |
00:47:37.760
chart or whatnot.
link |
00:47:39.120
And like, you can sort of embed like part of the spreadsheet
link |
00:47:42.000
in the Teams conversation,
link |
00:47:43.240
where like you can dynamically update it, and like all
link |
00:47:46.520
of the changes that you're making
link |
00:47:49.400
to this object, like, you know,
link |
00:47:51.280
coordinate and everything is sort of updating in real time.
link |
00:47:54.680
So like you can be in whatever canvas is most convenient
link |
00:47:58.000
for you to get your work done.
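For a sense of what a co-authoring framework has to do under the hood, here is a toy sketch: each canvas holds a replica of a shared object, edits are broadcast as sequenced operations, and a last-write-wins rule keeps replicas converging. This is an illustrative model under those assumptions, not the Fluid Framework's actual API:

```typescript
type Listener = (key: string, value: string) => void;

// Fans each edit out to every other replica (stand-in for a relay service).
class SharedCellNetwork {
  private replicas: SharedCell[] = [];
  register(cell: SharedCell) { this.replicas.push(cell); }
  broadcast(from: SharedCell, key: string, value: string, seq: number) {
    for (const r of this.replicas) if (r !== from) r.receive(key, value, seq);
  }
}

class SharedCell {
  private data = new Map<string, { value: string; seq: number }>();
  private listeners: Listener[] = [];
  constructor(private net: SharedCellNetwork, private clock = 0) {
    net.register(this);
  }
  // Local edit: bump the logical clock, apply, and broadcast the op.
  set(key: string, value: string) {
    this.clock++;
    this.data.set(key, { value, seq: this.clock });
    this.net.broadcast(this, key, value, this.clock);
  }
  // Remote op: apply only if it is at least as new (last write wins).
  receive(key: string, value: string, seq: number) {
    const cur = this.data.get(key);
    this.clock = Math.max(this.clock, seq);
    if (!cur || seq >= cur.seq) {
      this.data.set(key, { value, seq });
      this.listeners.forEach((l) => l(key, value));
    }
  }
  onChange(l: Listener) { this.listeners.push(l); }
  get(key: string) { return this.data.get(key)?.value; }
}

// One replica lives in the Excel canvas, one in the Teams chat;
// an edit in either shows up in the other.
const net = new SharedCellNetwork();
const excelView = new SharedCell(net);
const teamsView = new SharedCell(net);
teamsView.onChange((k, v) => console.log(`Teams sees ${k} = ${v}`));
excelView.set("B2", "=SUM(A1:A10)"); // Teams sees B2 = =SUM(A1:A10)
```

Real systems layer richer merge semantics, presence, and persistence on top, but this replicate-and-broadcast core is the part that lets two canvases stay in sync in real time.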
link |
00:48:00.400
So out of my own sort of curiosity as an engineer,
link |
00:48:03.400
I know what it's like to sort of lead a team
link |
00:48:06.280
of 10, 15 engineers.
link |
00:48:08.280
Microsoft has, I don't know what the numbers are,
link |
00:48:11.680
maybe 15, maybe 60,000 engineers, maybe 40.
link |
00:48:14.920
I don't know exactly what the number is.
link |
00:48:16.160
It's a lot.
link |
00:48:17.000
It's tens of thousands.
link |
00:48:18.520
Right. This is more than 10 or 15.
link |
00:48:23.640
I mean, you've led different sizes,
link |
00:48:28.720
mostly large groups of engineers.
link |
00:48:30.560
What does it take to lead such a large group
link |
00:48:33.840
into continued innovation,
link |
00:48:37.480
to continue being highly productive
link |
00:48:40.240
and yet develop all kinds of new ideas
link |
00:48:43.200
and yet maintain like, what does it take
link |
00:48:45.120
to lead such a large group of brilliant people?
link |
00:48:49.000
I think the thing that you learn
link |
00:48:52.080
as you manage larger and larger scale
link |
00:48:55.120
is that there are three things
link |
00:48:57.920
that are like very, very important
link |
00:49:00.480
for big engineering teams.
link |
00:49:02.360
Like one is like having some sort of forethought
link |
00:49:06.320
about what it is that you're going to be building
link |
00:49:09.840
over large periods of time.
link |
00:49:11.040
Like not exactly.
link |
00:49:11.880
Like you don't need to know that like,
link |
00:49:13.760
I'm putting all my chips on this one product
link |
00:49:16.440
and like this is going to be the thing.
link |
00:49:17.760
But it's useful to know what sort of capabilities
link |
00:49:21.440
you think you're going to need to have
link |
00:49:23.080
to build the products of the future
link |
00:49:24.720
and then like invest in that infrastructure.
link |
00:49:28.000
Like whether, and I'm not just talking about storage systems
link |
00:49:31.520
or cloud APIs, it's also like,
link |
00:49:33.480
what does your development process look like?
link |
00:49:35.360
What tools do you want?
link |
00:49:36.720
Like what culture do you want to build
link |
00:49:39.560
around like how you're sort of collaborating together
link |
00:49:42.760
to like make complicated technical things?
link |
00:49:45.720
And so like having an opinion and investing in that
link |
00:49:48.080
is like, it just gets more and more important.
link |
00:49:50.480
And like the sooner you can get a concrete set of opinions,
link |
00:49:54.520
like the better you're going to be.
link |
00:49:57.680
Like you can wing it for a while at small scales.
link |
00:50:01.600
Like, you know, when you start a company,
link |
00:50:03.160
like you don't have to be like super specific about it.
link |
00:50:06.320
But like the biggest miseries that I've ever seen
link |
00:50:10.000
as an engineering leader are in places
link |
00:50:12.640
where you didn't have a clear enough opinion
link |
00:50:14.440
about those things soon enough.
link |
00:50:16.800
And then you just sort of go create a bunch of technical debt
link |
00:50:20.240
and like culture debt that is excruciatingly painful
link |
00:50:24.000
to clean up.
link |
00:50:25.760
So like that's one bundle of things.
link |
00:50:28.640
Like the other, you know, another bundle of things is
link |
00:50:33.640
like it's just really, really important to
link |
00:50:38.960
like have a clear mission that's not just some cute crap
link |
00:50:45.520
you say because like you think you should have a mission,
link |
00:50:48.880
but like something that clarifies for people
link |
00:50:52.880
like where it is that you're headed together.
link |
00:50:57.160
Like I know it's like probably
link |
00:50:58.520
like a little bit too popular right now,
link |
00:51:00.320
but Yuval Harari's book, Sapiens,
link |
00:51:07.240
one of the central ideas in his book is that
link |
00:51:12.440
like storytelling is like the quintessential thing
link |
00:51:16.840
for coordinating the activities of large groups of people.
link |
00:51:20.480
Like once you get past Dunbar's number
link |
00:51:23.360
and like I've really, really seen that
link |
00:51:25.800
just managing engineering teams.
link |
00:51:27.320
Like you can just brute force things
link |
00:51:32.080
when you're less than 120, 150 folks
link |
00:51:35.160
where you can sort of know and trust
link |
00:51:37.520
and understand what the dynamics are between all the people.
link |
00:51:40.920
But like past that,
link |
00:51:41.840
like things just sort of start to catastrophically fail
link |
00:51:45.440
if you don't have some sort of set of shared goals
link |
00:51:48.760
that you're marching towards.
link |
00:51:50.480
And so like even though it sounds touchy feely
link |
00:51:52.960
and you know, like a bunch of technical people
link |
00:51:55.640
will sort of balk at the idea that like you need
link |
00:51:58.200
to like have a clear, like, the mission is
link |
00:52:01.680
like very, very, very important.
link |
00:52:03.560
Yuval's right, right?
link |
00:52:04.640
Stories, that's how our society,
link |
00:52:07.520
that's the fabric that connects all of us,
link |
00:52:09.360
is these powerful stories.
link |
00:52:11.120
And that works for companies too, right?
link |
00:52:13.440
It works for everything.
link |
00:52:14.520
Like I mean, even down to like, you know,
link |
00:52:16.520
if you sort of really think about it, our currency
link |
00:52:18.280
for instance is a story.
link |
00:52:19.960
Our constitution is a story, our laws are a story.
link |
00:52:23.360
I mean, like we believe very, very, very strongly in them
link |
00:52:27.840
and thank God we do.
link |
00:52:29.960
But like they are, they're just abstract things.
link |
00:52:33.040
Like they're just words.
link |
00:52:34.000
Like if we don't believe in them, they're nothing.
link |
00:52:36.520
And in some sense, those stories are platforms
link |
00:52:39.440
and the kinds, some of which Microsoft is creating, right?
link |
00:52:43.040
Yeah, platforms in which we define the future.
link |
00:52:46.360
So last question, what do you,
link |
00:52:48.600
let's get philosophical maybe,
link |
00:52:50.080
bigger than even Microsoft.
link |
00:52:51.480
What do you think the next 20, 30 plus years
link |
00:52:56.280
looks like for computing, for technology, for devices?
link |
00:53:00.120
Do you have crazy ideas about the future of the world?
link |
00:53:04.600
Yeah, look, I think we, you know,
link |
00:53:06.400
we're entering this time where
link |
00:53:10.640
we have technology that is progressing
link |
00:53:13.360
at the fastest rate that it ever has.
link |
00:53:15.800
And you've got some really big social problems,
link |
00:53:20.800
like society scale problems that we have to tackle.
link |
00:53:26.320
And so, you know, I think we're gonna rise to the challenge
link |
00:53:28.720
and like figure out how to intersect
link |
00:53:30.560
like all of the power of this technology
link |
00:53:32.400
with all of the big challenges that are facing us,
link |
00:53:35.320
whether it's, you know, global warming,
link |
00:53:37.840
whether it's that, like, the biggest remainder of the population
link |
00:53:41.000
boom is in Africa for the next 50 years or so.
link |
00:53:46.800
And like global warming is gonna make it increasingly
link |
00:53:49.360
difficult to feed the global population in particular,
link |
00:53:52.600
like in this place where you're gonna have
link |
00:53:54.200
like the biggest population boom.
link |
00:53:57.720
I think we, you know, like AI is gonna,
link |
00:54:01.520
like if we push it in the right direction,
link |
00:54:03.560
like it can do like incredible things
link |
00:54:05.680
to empower all of us to achieve our full potential
link |
00:54:10.160
and to, you know, like live better lives.
link |
00:54:15.160
But like that also means focusing on, like,
link |
00:54:20.520
some super important things,
link |
00:54:22.040
like how can you apply it to healthcare
link |
00:54:23.960
to make sure that, you know, like our quality and cost of,
link |
00:54:29.640
and sort of ubiquity of health coverage
link |
00:54:32.080
is better and better over time.
link |
00:54:35.080
Like that's more and more important every day
link |
00:54:37.960
like in the United States
link |
00:54:40.880
and like the rest of the industrialized world.
link |
00:54:43.280
So Western Europe, China, Japan, Korea,
link |
00:54:45.720
like you've got this population bubble
link |
00:54:48.880
of like aging, you know, working age folks
link |
00:54:52.880
who are, you know, at some point over the next 20, 30 years
link |
00:54:56.200
they're gonna be largely retired
link |
00:54:58.000
and like you're gonna have more retired people
link |
00:55:00.160
than working age people.
link |
00:55:01.200
And then like you've got, you know,
link |
00:55:02.520
sort of natural questions about who's gonna take care
link |
00:55:04.800
of all the old folks and who's gonna do all the work.
link |
00:55:07.120
And the answers to like all of these sorts of questions
link |
00:55:11.040
like where you're sort of running into, you know,
link |
00:55:13.200
like constraints of, you know,
link |
00:55:16.080
the world and of society, have always been like,
link |
00:55:20.080
what tech is gonna like help us get around this.
link |
00:55:23.000
You know, like when I was a kid in the 70s and 80s,
link |
00:55:26.360
like we talked all the time about like population boom,
link |
00:55:29.800
population boom, like we're gonna,
link |
00:55:32.200
like we're not gonna be able to like feed the planet.
link |
00:55:34.360
And like we were like right in the middle
link |
00:55:36.800
of the green revolution
link |
00:55:38.200
where like this massive technology driven increase
link |
00:55:44.560
in crop productivity, like, worldwide.
link |
00:55:47.520
And like some of that was like taking some of the things
link |
00:55:49.320
that we knew in the West and like getting them distributed
link |
00:55:52.560
to the, you know, to the developing world.
link |
00:55:55.760
And like part of it was things like, you know,
link |
00:55:59.360
just smarter biology, like, helping us increase crop productivity.
link |
00:56:03.280
And like we don't talk about like, yeah,
link |
00:56:06.760
overpopulation anymore because like we can more or less,
link |
00:56:10.320
we sort of figured out how to feed the world.
link |
00:56:12.000
Like that's a technology story.
link |
00:56:14.760
And so like I'm super, super hopeful about the future
link |
00:56:19.480
and about the ways where we will be able to apply technology
link |
00:56:24.080
to solve some of these super challenging problems.
link |
00:56:28.040
Like one of the things
link |
00:56:31.360
that I'm trying to spend my time doing right now
link |
00:56:34.680
is trying to get everybody else to be hopeful
link |
00:56:36.600
as well because, you know, back to Harari,
link |
00:56:38.720
like we are the stories that we tell.
link |
00:56:41.160
Like if we, you know, if we get overly pessimistic right now
link |
00:56:44.320
about like the potential future of technology, like we,
link |
00:56:49.320
you know, like we may fail to get all the things
link |
00:56:53.680
in place that we need to like have our best possible future.
link |
00:56:56.880
And that kind of hopeful optimism.
link |
00:56:59.440
I'm glad that you have it because you're leading large groups
link |
00:57:03.160
of engineers that are actually defining
link |
00:57:05.600
that are writing that story,
link |
00:57:06.720
that are helping build that future,
link |
00:57:08.320
which is super exciting.
link |
00:57:10.000
And I agree with everything you said,
link |
00:57:12.320
except I do hope Clippy comes back.
link |
00:57:16.400
We miss him.
link |
00:57:17.760
I speak for the people.
link |
00:57:19.360
So, Kevin, thank you so much for talking to me.
link |
00:57:21.800
Thank you so much for having me.
link |
00:57:22.640
It was a pleasure.