
Stephen Schwarzman: Going Big in Business, Investing, and AI | Lex Fridman Podcast #96



link |
00:00:00.000
The following is a conversation with Stephen Schwarzman, CEO and cofounder of Blackstone,
link |
00:00:05.920
one of the world's leading investment firms with over $530 billion of assets under management.
link |
00:00:13.120
He's one of the most successful business leaders in history.
link |
00:00:17.920
I recommend his recent book called What It Takes that tells stories and lessons from
link |
00:00:22.720
his personal journey.
link |
00:00:24.560
Stephen is a philanthropist and one of the wealthiest people in the world, recently signing
link |
00:00:29.840
the Giving Pledge, thereby committing to give the majority of his wealth to philanthropic causes.
link |
00:00:35.920
As an example, in 2018, he donated $350 million to MIT to help establish its new College of
link |
00:00:43.680
Computing, the mission of which promotes interdisciplinary, big, bold research in
link |
00:00:49.440
artificial intelligence.
link |
00:00:51.680
Those of you who know me know that MIT is near and dear to my heart and always will be.
link |
00:00:56.560
It was and is a place where I believe big, bold, revolutionary ideas have a home,
link |
00:01:02.720
and that is what is needed in artificial intelligence research in the coming decades.
link |
00:01:08.240
Yes, there are institutional challenges, but there's also power in the passion of individual
link |
00:01:14.320
researchers, from undergrad to PhD, from young scientists to senior faculty.
link |
00:01:19.600
I believe the dream to build intelligent systems burns brighter than ever in the halls of MIT.
link |
00:01:27.120
This conversation was recorded recently, but before the outbreak of the pandemic.
link |
00:01:31.600
For everyone feeling the burden of this crisis, I'm sending love your way.
link |
00:01:35.680
Stay strong.
link |
00:01:36.720
We're in this together.
link |
00:01:39.040
This is the Artificial Intelligence Podcast.
link |
00:01:41.520
If you enjoy it, subscribe on YouTube, review it with five stars on Apple Podcasts,
link |
00:01:46.000
support it on Patreon, or simply connect with me on Twitter.
link |
00:01:49.120
I'm Lex Fridman, spelled F R I D M A N.
link |
00:01:53.200
As usual, I'll do a few minutes of ads now and never any ads in the middle that can break
link |
00:01:57.760
the flow of the conversation.
link |
00:01:59.440
I hope that works for you and doesn't hurt the listening experience.
link |
00:02:02.640
Quick summary of the ads.
link |
00:02:04.160
Two sponsors, Masterclass and ExpressVPN.
link |
00:02:07.760
Please consider supporting the podcast by signing up to Masterclass at masterclass.com
link |
00:02:12.480
slash Lex and getting ExpressVPN at expressvpn.com slash Lex pod.
link |
00:02:19.840
This show is sponsored by Masterclass.
link |
00:02:22.800
Sign up at masterclass.com slash Lex to get a discount and support this podcast.
link |
00:02:28.880
When I first heard about Masterclass, I thought it was too good to be true.
link |
00:02:32.560
For $180 a year, you get an all access pass to watch courses from, to list some of my favorites,
link |
00:02:39.200
Chris Hadfield on Space Exploration, Neil deGrasse Tyson on Scientific Thinking and Communication,
link |
00:02:44.400
Will Wright, creator of SimCity and The Sims, on Game Design, Carlos Santana on Guitar,
link |
00:02:50.800
Garry Kasparov on Chess, Daniel Negreanu on Poker, and many, many more.
link |
00:02:56.080
Chris Hadfield explaining how rockets work and the experience of being launched into space alone
link |
00:03:01.440
is worth the money.
link |
00:03:03.120
By the way, you can watch it on basically any device.
link |
00:03:05.840
Once again, sign up at masterclass.com slash Lex to get a discount and to support this podcast.
link |
00:03:13.440
This show is sponsored by ExpressVPN.
link |
00:03:16.400
Get it at expressvpn.com slash Lex pod to get a discount and to support this podcast.
link |
00:03:23.040
I've been using ExpressVPN for many years.
link |
00:03:25.600
I love it.
link |
00:03:26.480
It's easy to use, press the big power on button and your privacy is protected.
link |
00:03:31.200
And if you like, you can make it look like your location is anywhere else in the world.
link |
00:03:36.240
I might be in Boston now, but I can make it look like I'm in New York, London, Paris,
link |
00:03:42.160
or anywhere else in the world.
link |
00:03:44.240
This has a large number of obvious benefits.
link |
00:03:46.800
Certainly, it allows you to access international versions of streaming websites
link |
00:03:50.880
like the Japanese Netflix or the UK Hulu.
link |
00:03:54.320
ExpressVPN works on any device you can imagine.
link |
00:03:57.200
I use it on Linux.
link |
00:03:58.400
Shout out to Ubuntu 20.04.
link |
00:04:00.320
Windows, Android, but it's available everywhere else too.
link |
00:04:05.360
Once again, get it at expressvpn.com slash Lex pod to get a discount and to support this podcast.
link |
00:04:12.960
And now here's my conversation with Stephen Schwarzman.
link |
00:04:17.520
Let's start with a tough question.
link |
00:04:19.600
What idea do you believe, whether grounded in data or intuition,
link |
00:04:24.640
that many people you respect disagree with you on?
link |
00:04:27.200
Well, there isn't all that much anymore since the world's so transparent.
link |
00:04:34.720
But one of the things I believe in, and put in the book What It Takes, is,
link |
00:04:42.240
if you're going to do something very consequential,
link |
00:04:46.240
do something that's quite large, if you can, that's unique.
link |
00:04:50.080
Because if you operate in that kind of space, when you're successful, it's a huge impact.
link |
00:04:57.600
The prospect of success enables you to recruit people who want to be part of that.
link |
00:05:04.320
And those types of large opportunities are pretty easily described.
link |
00:05:09.360
And so not everybody likes to operate at scale.
link |
00:05:14.400
Some people like to do small things because it is meaningful for them emotionally.
link |
00:05:21.200
And so occasionally you get a disagreement on that.
link |
00:05:25.200
But those are life choices rather than commercial choices.
link |
00:05:29.840
That's interesting. What good and bad comes with going big?
link |
00:05:34.640
See, we often in America think big is good.
link |
00:05:40.320
What's the benefit? What's the cost? In terms of not just business,
link |
00:05:45.040
but life, happiness, the pursuit of happiness?
link |
00:05:48.240
Well, you do things to make you happy. It's not mandated.
link |
00:05:52.720
And everybody's different. And some people have a talent, like playing pro football;
link |
00:06:01.440
other people just like throwing the ball around, not even being on a team.
link |
00:06:07.920
What's better depends what your objectives are. It depends what your talent is.
link |
00:06:14.720
It depends what gives you joy.
link |
00:06:19.600
So in terms of going big, is it both for impact on the world
link |
00:06:24.160
and because it personally gives you joy?
link |
00:06:27.440
Well, it makes it easier to succeed, actually, because if you catch something,
link |
00:06:33.360
for example, that's cyclical, that's a huge opportunity,
link |
00:06:40.000
then you usually can find some place within that huge opportunity where you can make it work.
link |
00:06:46.880
If you're prosecuting a really small thing and you're wrong,
link |
00:06:53.280
you don't have many places to go. So I've always found that the easy place to be
link |
00:07:00.000
is where you can concentrate human resources,
link |
00:07:07.600
get people excited about doing really impactful big things,
link |
00:07:13.200
and you can afford to pay them, actually, because the bigger thing can generate much more in the way
link |
00:07:21.120
of financial resources. So that brings out people of talent to help you.
link |
00:07:28.160
And so altogether, it's a virtuous circle, I think.
link |
00:07:33.280
How do you know an opportunity when you see one in terms of the one you want to go big on?
link |
00:07:40.080
Is it intuition? Is it facts? Is it back and forth deliberation with people you trust?
link |
00:07:48.640
What's the process? Is it art? Is it science?
link |
00:07:51.680
Well, it's pattern recognition.
link |
00:07:53.840
And how do you get to pattern recognition? First, you need to understand the patterns
link |
00:07:58.960
and the changes that are happening. And that's observational on some level.
link |
00:08:07.440
You can call it data or you can just call it listening to unusual things that people are
link |
00:08:16.400
saying that they haven't said before. And I've always tried to describe this.
link |
00:08:23.760
It's like seeing a piece of white lint on a black dress.
link |
00:08:28.320
But most people disregard that piece of lint. They just see the dress.
link |
00:08:33.520
I always see the lint. And I'm fascinated by how did something get some place it's not supposed to be.
link |
00:08:42.080
So it doesn't even need to be a big discrepancy. But if something shouldn't be some place
link |
00:08:48.480
in a constellation of facts that made sense in a traditional way, I've learned that if you focus
link |
00:08:59.920
on why one discordant note is there, that's usually a key to something important.
link |
00:09:07.760
And if you can find two of those discordant notes, that's usually a straight line to someplace.
link |
00:09:17.520
And that someplace is not where you've been. And usually when you figure out that things
link |
00:09:23.440
are changing or have changed and you describe them, which you have to be able to do because it's not
link |
00:09:29.680
some odd intuition. It's just focusing on facts. It's almost like a scientific discovery, if you
link |
00:09:38.240
will. When you describe it to other people in the real world, they tend to do absolutely nothing
link |
00:09:45.200
about it. And that's because humans are comfortable in their own reality. And if there's no particular
link |
00:09:55.360
reason at that moment to shake them out of their reality, they'll stay in it even if they're
link |
00:10:02.880
ultimately completely wrong. And I've always been stunned that when I explain where we're going,
link |
00:10:10.240
what we're doing, and why almost everyone just says, that's interesting. And they continue doing
link |
00:10:19.520
what they're doing. And so I think it's pretty easy to do that. But what you need is a huge
link |
00:10:28.240
data set. So before AI and people's focus on data, I've sort of been doing this mostly my whole life.
link |
00:10:36.080
I'm not a scientist, let alone a computer scientist. And you can just hear what people are
link |
00:10:42.880
saying when somebody says something or you observe something that simply doesn't make sense.
link |
00:10:47.680
That's when you really go to work. The rest of it's just processing.
link |
00:10:52.080
So you know, on a quick tangent, pattern recognition is a term often used throughout
link |
00:10:57.600
the history of AI. The goal of artificial intelligence is pattern recognition, right?
link |
00:11:03.360
But there are, I would say, various flavors of that. So usually, pattern recognition refers to the
link |
00:11:11.440
process of, as we said, the dress and the lint on the dress. Pattern recognition is very good at identifying
link |
00:11:19.920
the dress, looking at the pattern that's always there, that's very common, and so on.
link |
00:11:27.040
You're almost referring to a pattern that's like what's called outlier detection in computer science,
link |
00:11:33.920
right? The rare thing, the small thing. Now, AI is not often good at that.
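For the curious, outlier detection, the idea referenced here, can be sketched in a few lines of code. The following is a minimal illustration, not anything from the conversation: a z-score rule that flags points far from the mean, the lint rather than the dress. The data and the 3.0 threshold are made-up, conventional choices.

    # A minimal z-score outlier detector: flag the "lint on the dress",
    # the points that sit far from the pattern everything else follows.
    # Illustrative sketch only; data and threshold are made up.
    def find_outliers(values, threshold=3.0):
        n = len(values)
        mean = sum(values) / n
        variance = sum((v - mean) ** 2 for v in values) / n
        std = variance ** 0.5
        if std == 0:
            return []  # all points identical: nothing stands out
        # Keep the indices whose z-score exceeds the threshold.
        return [i for i, v in enumerate(values)
                if abs(v - mean) / std > threshold]

    if __name__ == "__main__":
        data = [10.0] * 20 + [100.0]  # the 100.0 is the lint
        print(find_outliers(data))   # -> [20]

Real systems tend to use more robust methods (median-based statistics, isolation forests), but the principle is the same: model the common pattern, then flag what deviates from it.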
link |
00:11:44.800
Just almost philosophically, the kind of decisions you've made in your life, based almost scientifically on data,
link |
00:11:52.160
do you think AI in the future will be able to do that? Is it something that could be put
link |
00:11:57.600
down into code? Or is it still deeply human? It's tough for me to say since I don't have
link |
00:12:05.440
domain knowledge in AI to know everything that could or might occur. I know sort of,
link |
00:12:17.840
in my own case, that most people don't see any of that. I just assumed it was motivational.
link |
00:12:25.120
But it's also sort of, it's hard wiring. What are you wired or programmed to be finding
link |
00:12:37.040
or looking for? It's not what happens every day. That's not interesting, frankly. I mean,
link |
00:12:44.880
that's what people mostly do. I do a bunch of that too, because that's what you do in normal life.
link |
00:12:51.920
But I've always been completely fascinated by the stuff that doesn't fit. Or the other way of
link |
00:13:01.360
thinking about it, it's determining what people want without them saying it. That's a different
link |
00:13:13.200
kind of pattern. You can see everything they're doing. There's a missing piece. They don't know
link |
00:13:18.800
it's missing. You think it's missing, given the other facts you know about them. You deliver that
link |
00:13:26.960
and then that becomes sort of very easy to sell to them. To linger on this point a little bit,
link |
00:13:35.120
you've mentioned that in your family, when you were growing up, nobody raised their voice
link |
00:13:40.000
in anger or otherwise. You said that this allowed you to learn to listen and hear some interesting
link |
00:13:45.280
things. Can you elaborate as you have been on that idea? What do you hear about the world if you
link |
00:13:52.400
listen? Well, you have to listen really intensely to understand what people are saying as well as
link |
00:14:01.600
what people are intending, because it's not necessarily the same thing. People mostly
link |
00:14:10.960
give themselves away, no matter how clever they think they are. Particularly if you
link |
00:14:19.680
have the full array of inputs. In other words, if you look at their face, you look at their eyes,
link |
00:14:26.320
which are the window on the soul, it's very difficult to conceal what you're thinking.
link |
00:14:32.560
You look at facial expressions and posture. You listen to their voice, which changes.
link |
00:14:42.800
When you're talking about something you're comfortable with or not, or you're speaking
link |
00:14:47.600
faster, or the amplitude of what you're saying is higher. Most people just give away what's really
link |
00:14:54.560
on their mind. They're not that clever. They're busy spending their time thinking about what
link |
00:15:00.640
they're in the process of saying. If you just observe that, not in a hostile way, but just
link |
00:15:08.880
in an evocative way and just let them talk for a while, they'll more or less tell you almost
link |
00:15:14.560
completely what they're thinking, even the stuff they don't want you to know. Once you know that,
link |
00:15:22.960
of course, it's easy to play that kind of game, because they've already told you everything you
link |
00:15:31.600
need to know. It's easy to get to a conclusion if there's meant to be an area of common interest,
link |
00:15:40.320
since you know almost exactly what's on their mind. That's an enormous advantage, as opposed to just
link |
00:15:48.880
walking in in some place and somebody telling you something and you believing what they're saying.
link |
00:15:56.400
There are so many different levels of communication.
link |
00:16:00.880
So a powerful approach to life you discuss in the book on the topic of listening and really
link |
00:16:06.560
hearing people, is figuring out what the biggest problem bothering a particular individual or
link |
00:16:11.600
group is and coming up with a solution to that problem and presenting them with a solution. In
link |
00:16:20.080
fact, you brilliantly describe a lot of simple things that most people just don't do. It's
link |
00:16:26.400
kind of obvious, find the problem that's bothering somebody deeply. And as you said, I think you've
link |
00:16:32.960
implied that they will usually tell you what the problem is. But can you talk about this process
link |
00:16:39.200
of seeing what the biggest problem for a person is, trying to solve it, and maybe a particularly
link |
00:16:46.080
memorable example? Sure. When you meet somebody, there are two types of
link |
00:16:52.960
situations. Chance meetings, and the second is you know you're going to meet somebody. So
link |
00:16:59.680
let's take the easiest one, which is you know you're going to meet somebody.
link |
00:17:03.120
And you start trying to make pretend you're them. It's really easy. What's on their mind?
link |
00:17:13.840
What are they thinking about in their daily life? What are the big problems they're facing? So if
link |
00:17:19.520
they're, you know, to make it a really easy example, you know, and make pretend, you know,
link |
00:17:26.160
they're like president of the United States. It doesn't have to be this president,
link |
00:17:30.160
could be any president. So you sort of know what's more or less on their mind, because
link |
00:17:34.960
the press keeps reporting it. And you see it on television, you hear it, people discuss it. So you
link |
00:17:42.640
know if you're going to be running into somebody in that kind of position, you sort of know what
link |
00:17:48.320
they look like already. You know what they sound like, you know what their voice is like, and you
link |
00:17:56.240
know what they're focused on. And so if you're going to meet somebody like that, what you should
link |
00:18:02.720
do is take the biggest unresolved issue that they're facing and come up with a few interesting
link |
00:18:10.400
solutions that basically haven't been out there, or that you haven't heard anybody else
link |
00:18:18.960
was thinking about. So just to give you an example, it was sort of the early 1990s, and
link |
00:18:24.320
I was invited to something at the White House, which was a big deal for me, because I was like,
link |
00:18:29.200
you know, a person from no place. And you know, I had met the president once before,
link |
00:18:35.360
because it was President Bush, because his son was in my dormitory. So I had met him at
link |
00:18:41.600
parents day. I mean, it's just like the oddity of things. So I knew I was going to see him,
link |
00:18:47.680
because you know, that's where the invitation came from. And so there was something going on,
link |
00:18:53.920
and I just thought about, you know, two or three ways to approach that issue. And you know, at that
link |
00:19:02.560
point, I was separated. And so I had brought a date to the White House. And so I saw the president,
link |
00:19:12.720
we sort of went over in a corner for about 10 minutes and discussed whatever this issue was.
link |
00:19:18.320
And I later, you know, went back to my date. It was a little rude, but it was meant to be a
link |
00:19:23.920
confidential conversation. And I barely knew her. And you know, she said, what were you talking about
link |
00:19:30.720
all that time? I said, well, you know, there's something going on in the world. And I've thought
link |
00:19:36.000
about different ways of perhaps approaching that. And he was interested. And the answer is, of course,
link |
00:19:42.800
he was interested. Why wouldn't he be interested? There didn't seem to be an easy outcome. And
link |
00:19:48.160
so, you know, conversations of that type, once somebody knows you're really thinking about
link |
00:19:53.920
what's good for them, and good for the situation, it has nothing to do with me. I mean, it's really
link |
00:20:02.560
about being in service, you know, to the situation, that then people trust you. And they'll tell you
link |
00:20:10.800
other things, because they know your motives are basically very pure. You're just trying to resolve
link |
00:20:19.200
a difficult situation and help somebody do it. So these types of things, you know, that's a plan
link |
00:20:25.520
situation. That's easy. Sometimes you just come upon somebody and they start talking. And, you
link |
00:20:31.680
know, that requires, you know, like different skills. You can ask them, what have you
link |
00:20:38.160
been working on lately? What are you thinking about? You can ask them, you know, has anything been
link |
00:20:44.240
particularly difficult? And you know, you can ask them, most people, if they trust you for some reason,
link |
00:20:53.840
they'll tell you. And then you have to instantly go to work on it. And, you know, that's not as good
link |
00:21:02.000
as having some advanced planning. But, you know, almost everything going on is like out there.
link |
00:21:10.800
And people who are involved with interesting situations, they're playing in the same ecosystem.
link |
00:21:20.800
They just have different roles in the ecosystem. And, you know, you can do that with somebody who
link |
00:21:29.680
owns a pro football team that loses all the time. We specialize in those in New York. And, you know,
link |
00:21:38.480
you already have analyzed why they're losing, right? Inevitably, it's because they don't have a
link |
00:21:46.160
great quarterback. They don't have a great coach. And they don't have a great general manager who
link |
00:21:52.480
knows how to hire the best talent. Those are the three reasons why a team fails, right? Because
link |
00:21:59.360
there are salary caps. So, every team pays the same amount of money for all their players. So, it's
link |
00:22:04.960
got to be those three positions. So, if you're talking with somebody like that, inevitably,
link |
00:22:11.040
even though it's not structured, you'll know how their team's doing and you'll know
link |
00:22:18.080
pretty much why. And if you start asking questions about that, they're typically very happy to talk
link |
00:22:24.320
about it because they haven't solved that problem. In some cases, they don't even know that's the problem.
link |
00:22:29.440
It's pretty easy to see it. So, you know, I do stuff like that, which I find is intuitive as a
link |
00:22:38.240
process, but, you know, leads to really good results. Well, the funny thing is,
link |
00:22:48.160
for smart people, it's hard to escape their own ego and the space of their own problems,
link |
00:22:53.360
which is what's required to think about other people's problems. It requires for you to let go
link |
00:23:00.400
of the fact that your own problems are all important. And your point, I think,
link |
00:23:07.200
while it seems obvious, and I think quite brilliant, it's a difficult leap for many people,
link |
00:23:14.480
especially smart people, to empathize with, truly empathize with the problems of others.
link |
00:23:19.760
Well, I have a competitive advantage, which is, I don't think I'm so smart.
link |
00:23:29.920
You know, it's not a problem for me.
link |
00:23:31.680
Well, the truly smartest people I know say that exact same thing. Yeah, being humble
link |
00:23:36.880
is a really useful competitive advantage, as you said. How do you stay humble?
link |
00:23:43.440
Well, I haven't changed much since I was in my mid teens, you know, I was raised partly in the city
link |
00:23:54.720
and partly in the suburbs. And, you know, whatever the values I had at that time, those are still
link |
00:24:04.640
my values. I call them like middle class values, that's how I was raised. And I've never changed,
link |
00:24:13.760
why would I? That's who I am. And so the accoutrement of, you know, the rest of your life
link |
00:24:22.880
has got to be put on the same, you know, like solid foundation of who you are. Because if you
link |
00:24:28.880
start losing who you really are, who are you? So I've never had the desire to be somebody else.
link |
00:24:37.040
I just do other things now that I wouldn't do as a, you know, sort of as a middle class kid
link |
00:24:43.840
from Philadelphia. I mean, my life has morphed on a certain level. But part of the strength
link |
00:24:49.680
of having integrity of personality is that you can remain in touch with everybody who
link |
00:25:01.760
comes from that kind of background. And, you know, even though I do some things that aren't like that,
link |
00:25:08.560
you know, in terms of people I'd meet or situations I'm in, I always look at it through
link |
00:25:13.280
the same lens. And that's very psychologically comfortable. And doesn't require me to make
link |
00:25:21.120
any real adjustments in my life. And I just keep plowing ahead.
link |
00:25:26.320
There's a lot of activity and progress in recent years around effective altruism.
link |
00:25:32.080
I wanted to bring up this topic with you, because it's an interesting one from your perspective.
link |
00:25:36.640
You can put it in any kind of terms, but it's philanthropy that focuses on maximizing impact.
link |
00:25:44.160
How do you see the goal of philanthropy, both from a personal motivation perspective and the
link |
00:25:50.240
societal big picture impact perspective? Yeah, I don't think about philanthropy the
link |
00:25:56.000
way you would expect me to. Okay, I look at, you know, sort of solving big issues, addressing
link |
00:26:04.960
big issues, starting new organizations to do it, much like we do in our business. You know,
link |
00:26:12.400
we keep growing our business, not by taking the original thing and making it larger,
link |
00:26:17.520
but continually seeing new things and building those and, you know, sort of marshaling
link |
00:26:25.040
financial resources, human resources. And in our case, because we're in the investment
link |
00:26:31.680
business, we find something new that looks like it's going to be terrific. And we do that,
link |
00:26:36.640
and it works out really well. All I do in what you would call philanthropy is look at
link |
00:26:45.040
other opportunities to help society. And I end up starting something new,
link |
00:26:54.240
marshaling people, marshaling a lot of money. And then at the end of that kind of creative process
link |
00:27:00.640
somebody typically asks me to write a check. I don't wake up and say, how can I, like, give
link |
00:27:07.520
large amounts of money away? I look at issues that are important for people. In some cases,
link |
00:27:15.520
I do smaller things, because it's important to a person. And, you know, I have, you know,
link |
00:27:22.640
sort of, I can relate to that person, there's some unfairness that's happened to them. And so,
link |
00:27:28.400
in situations like that, I'd give money anonymously and help them out. And, you know, it's like a
link |
00:27:37.440
miniature version of addressing something really big. So, you know, at MIT, I've done a big thing,
link |
00:27:46.480
you know, helping to start this new school of computing. And I did that because, you know,
link |
00:27:51.840
I saw that, you know, there's sort of like a global race on AI, quantum and other major
link |
00:27:59.760
technologies. And I thought that the US could use more enhancement from a competitive perspective.
link |
00:28:09.600
And I also, because I get to China a lot, and I travel around a lot compared to a regular person,
link |
00:28:15.680
you know, I can see the need to have control of these types of technologies. So, when they're
link |
00:28:23.600
introduced, we don't create a mess like we did with the internet, and with social media,
link |
00:28:30.640
unintended consequence, you know, that's creating all kinds of issues and freedom of speech and the
link |
00:28:37.120
functioning of liberal democracies. So, with AI, it was pretty clear that there was enormous
link |
00:28:42.800
difference of views around the world by the relatively few practitioners in the world who
link |
00:28:49.360
really knew what was going on. And by accident, I knew a bunch of these people, you know, who were
link |
00:28:56.560
like big famous people, and I could talk to them and say, why do you think this is a force for bad?
link |
00:29:04.960
And someone else, why do you feel this is a force for good? And how do we move forward with the
link |
00:29:11.760
technology but, at the same time, make sure that whatever is potentially, you know, sort of on the bad
link |
00:29:20.800
side of this technology with, you know, for example, disruption of workforces and things like that,
link |
00:29:27.840
that could happen much faster than the industrial revolution? What do we do about that? And how
link |
00:29:33.600
do we keep that under control so that the really good things about these technologies, which will
link |
00:29:40.080
be great things, not good things, are allowed to happen? So, to me, you know, this was one of the
link |
00:29:49.360
great issues facing society. The number of people who were aware of it were very small.
link |
00:29:57.280
I just accidentally got sucked into it. And as soon as I saw it, I went, oh my god, this is mega.
link |
00:30:04.640
Yeah. Yeah. Both on a competitive basis globally, but also in terms of protecting
link |
00:30:12.480
society and benefiting society. So, that's how I got involved. And at the end, you know,
link |
00:30:18.880
sort of the right thing that we figured out was, you know, to sort of double MIT's computer
link |
00:30:24.960
science faculty and basically create the first AI enabled university in the world.
link |
00:30:31.360
And, you know, in effect, be an example, a beacon to the rest of the research community around the
link |
00:30:38.160
world academically, and create, you know, a much more robust US situation, competitive situation
link |
00:30:49.760
among the universities. Because if MIT was going to raise a lot of money and double its faculty,
link |
00:30:56.800
you could bet that, you know, a number of other universities were going to do the same
link |
00:31:03.040
thing. At the end of it, it would be great for knowledge creation, you know, great for the
link |
00:31:09.520
United States, great for the world. And so, I like to do things that I think are really positive
link |
00:31:19.440
things that other people aren't acting on, that I see, for whatever reason. First,
link |
00:31:26.160
it's just people I meet and what they say. And I can recognize when something really profound
link |
00:31:32.800
is about to happen or needs to. And I do it. At the end of the situation, somebody says,
link |
00:31:39.920
can you write a check to help us? And then the answer is sure. I mean, because if I don't,
link |
00:31:46.400
the vision won't happen. But it's the vision of whatever I do that is compelling.
link |
00:31:53.280
And essentially, I love that idea, whether it's small at the individual level or really big,
link |
00:32:01.840
like the gift to MIT to launch the College of Computing. It starts with the vision and
link |
00:32:11.120
you see that the biggest impact philanthropy can have is by launching something new,
link |
00:32:17.040
especially on an issue that others aren't really addressing. And I also love the notion,
link |
00:32:24.480
and you're absolutely right, that there are other universities,
link |
00:32:29.520
Stanford, CMU, I'm looking at you, where essentially the seed will
link |
00:32:36.640
create a ripple effect that potentially might help the US be a leader
link |
00:32:42.320
or continue to be a leader in AI, this potentially very transformative research direction.
link |
00:32:49.920
Just to linger on that point a little bit, what is your hope long term
link |
00:32:54.960
for the impact the college here at MIT might have in the next 5, 10, even 20,
link |
00:33:01.120
or let's get crazy 30, 50 years? Well, it's very difficult to predict the future
link |
00:33:06.000
when you're dealing with knowledge production and creativity. MIT has obviously some unique aspects.
link |
00:33:16.160
Globally, there are four big academic surveys. I forget whether it was QS, there's the Times
link |
00:33:27.200
in London, US News, and whatever. And in one of these recently, MIT was ranked number one
link |
00:33:34.800
in the world. So, leave aside whether you're number three somewhere else,
link |
00:33:41.360
in the great sweep of humanity, this is pretty amazing. So, you have a really
link |
00:33:49.280
remarkable aggregation of human talent here. And where it goes, it's hard to tell, you have to be
link |
00:33:58.800
a scientist to have the right feel. But what's important is you have a critical mass of people.
link |
00:34:08.480
And I think it breaks into two buckets. One is scientific advancement. And if the new college
link |
00:34:17.360
can help either serve as a convening force within the university or help coordination
link |
00:34:30.080
and communication among people, that's a good thing. An absolute good thing. The second thing
link |
00:34:37.360
is in the AI ethics area, which is in a way equally important because if the science side
link |
00:34:52.240
creates blowback, so that science is a bit crippled in terms of going forward because
link |
00:35:05.040
society's reaction to knowledge advancement in this field becomes really hostile, then
link |
00:35:13.600
you've sort of lost the game in terms of scientific progress and innovation. And so,
link |
00:35:19.760
the AI ethics piece is super important because in a perfect world, MIT would serve as a global
link |
00:35:30.400
convener because what you need is you need the research universities. You need the companies
link |
00:35:40.640
that are driving AI and quantum work. You need governments who will ultimately be regulating
link |
00:35:50.640
certain elements of this. And you also need the media to be knowledgeable and trained.
link |
00:35:58.800
so that we don't get sort of overreactions to one situation, which then goes viral.
link |
00:36:10.480
And it ends up shutting down avenues that are perfectly fine to be walking down or running
link |
00:36:18.640
down that avenue. But if enough discordant information, not even correct necessarily,
link |
00:36:32.240
sort of gets pushed around society, then you can end up with a really hostile regulatory
link |
00:36:38.720
environment and other things. So, you have four drivers that have to be sort of integrated.
link |
00:36:50.240
And so, if the new school of computing can be really helpful in that regard,
link |
00:36:58.480
then that's a real service to science and it's a service to MIT. So, that's why I wanted to get
link |
00:37:07.920
involved, for both areas. And the hope, for me, for others, for everyone, for the world, is for
link |
00:37:15.200
this particular college of computing to be a beacon and a connector for these ideas.
link |
00:37:22.880
Yeah, that's right. I mean, I think MIT is perfectly positioned to do that.
link |
00:37:31.040
All right. So, you've mentioned the media, social media, the internet as this complex
link |
00:37:38.960
network of communication with flaws, perhaps. Perhaps you can speak to them.
link |
00:37:44.080
But, you know, I personally think that science and technology has its flaws,
link |
00:37:52.080
but ultimately is, one, sexy and exciting. It's the way for us to explore and understand the
link |
00:38:01.200
mysteries of our world. And two, perhaps more importantly for some people, it's a huge
link |
00:38:08.240
way, a really powerful way, to grow the economy, to improve the quality of life for everyone.
link |
00:38:13.600
So, how do you see the media, social media, the internet, us as a society
link |
00:38:20.960
having, you know, a healthy discourse about science, first of all, one that's factual,
link |
00:38:30.160
and two, one that finds science exciting, that invests in science, that pushes it forward,
link |
00:38:36.480
especially in this science fiction, fear filled field of artificial intelligence.
link |
00:38:43.360
Well, I think that's a little above my pay grade, because, you know, trying to control
link |
00:38:49.200
social media to make it do what you want to do, appears to be beyond almost anybody's control.
link |
00:38:56.640
And the technology is being used to create what I call the tyranny of the minorities.
link |
00:39:03.680
Okay, a minority is defined as, you know, two or three people on a street corner.
link |
00:39:09.200
Doesn't matter what they look like. It doesn't matter where they came from. They're united
link |
00:39:14.240
by that one issue that they care about. And their job is to enforce their views
link |
00:39:25.440
on the world. And, you know, in the political world, people just are manufacturing
link |
00:39:32.720
truth. And they throw it all over. And it affects all of us. And, you know, sometimes
link |
00:39:41.360
people are just hired to do that. I mean, it's amazing. And you think it's one person,
link |
00:39:48.640
it's really, you know, just sort of a front, you know, for a particular point of view.
link |
00:39:55.200
And this has become exceptionally disruptive for society. And it's dangerous. And it's
link |
00:40:03.840
undercutting, you know, the ability of liberal democracies to function. And I don't know how
link |
00:40:09.920
to get a grip on this. And I was really surprised when I, you know, was up here for the announcement
link |
00:40:18.080
last spring of the College of Computing. And they had all these famous scientists,
link |
00:40:26.880
some of whom were involved with the invention of the internet. And almost every one of them
link |
00:40:34.560
got up and said, I think I made a mistake. And as a non scientist, I never thought I'd hear anyone
link |
00:40:42.960
say that. And what they said is, more or less, to make it simple, we thought this would be really
link |
00:40:50.160
cool, inventing the internet. We could connect everyone in the world. We can move knowledge
link |
00:40:56.160
around. It was instantaneous. It's a really amazing thing. They said, I don't know that there was
link |
00:41:03.280
anyone who ever thought about social media coming out of that and the actual consequences
link |
00:41:09.600
for people's lives. You know, there's always some, some younger person, I just saw one of these
link |
00:41:18.560
yesterday, it was reported on the national news: he killed himself when people used social media
link |
00:41:24.560
to basically, you know, sort of ridicule him or something of that type. This is, this is
link |
00:41:33.920
dangerous. And, you know, so I don't have a solution for that other than going forward.
link |
00:41:45.360
You can't end up with this type of outcome using AI. To make this kind of mistake twice
link |
00:41:52.720
is unforgivable. So interestingly, at least in the West, and parts of China, people are quite
link |
00:42:02.720
sympathetic to, you know, sort of the whole concept of AI ethics and what gets introduced when,
link |
00:42:11.440
and cooperation within your own country, within your own industry, as well as globally to make
link |
00:42:20.000
sure that the technology is a force for good. On that really interesting topic, since 2007,
link |
00:42:27.040
you've had a relationship with senior leadership, with a lot of people in China,
link |
00:42:32.800
and an interest in understanding modern China, their culture, their world.
link |
00:42:38.320
Much like with Russia, I'm from Russia originally, Americans are told a very narrow one sided story
link |
00:42:44.880
about China, that I'm sure misses a lot of fascinating complexity, both positive and negative.
link |
00:42:53.200
What lessons about Chinese culture, its ideas as a nation, its future, do you think Americans should
link |
00:42:59.520
know about, deliberate on, think about? Well, it's sort of a wide question that you're
link |
00:43:06.400
asking about. You know, China is a pretty unusual place. First, it's huge. You know, you
link |
00:43:14.960
got, it's physically huge. It's got a billion three people. And the character of the people
link |
00:43:22.480
isn't as well understood in the United States. Chinese people are amazingly energetic.
link |
00:43:32.160
If you're one of a billion three people, one of the things you've got to be focused on
link |
00:43:40.000
is how do you make your way, you know, through a crowd of a billion 2.99999 other people.
link |
00:43:50.800
Another word for that is competitive. Yes, they are individually highly energetic,
link |
00:43:57.040
highly focused, always looking for some opportunity for themselves, because they need to,
link |
00:44:07.840
because there's an enormous amount of just literally people around. And so, you know,
link |
00:44:14.240
what I've found is they'll try and find a way to win for themselves. And their country is
link |
00:44:23.200
complicated, because it basically doesn't have the same kind of functional laws that we do
link |
00:44:30.880
in the United States and in the West. And the country is controlled really through a web of
link |
00:44:41.120
relationships you have with other people, and the relationships that those other people have
link |
00:44:47.360
with other people. So it's an incredibly dynamic culture where if somebody knocks somebody off
link |
00:44:56.800
at the top, who's three levels above you and is in effect protecting you, then, you know, you're
link |
00:45:03.360
like a, you know, sort of a floating molecule there, you know, without tethering, except the
link |
00:45:10.560
one or two layers above you, but that's going to get affected. So it's a very dynamic system and
link |
00:45:16.000
getting people to change is not that easy, because if there aren't really functioning laws,
link |
00:45:23.600
it's only the relationships that everybody has. And so when you decide to make a major change
link |
00:45:30.720
and you sign up for it, something is changing in your life. There won't necessarily be all the
link |
00:45:38.160
same people on your team. And that's a very high risk enterprise. So when you're dealing with China,
link |
00:45:46.160
it's important to know almost what everybody's relationship is with somebody. So when you
link |
00:45:53.200
suggest doing something differently, you have to line up all these forces. In the West, you usually
link |
00:46:01.840
talk to a person and they figure out what's good for them. It's a lot easier. And in that sense,
link |
00:46:07.760
in a funny way, it's easier to make change in the West, just the opposite of what people think.
link |
00:46:14.720
But once the Chinese system adjusts to something that's new, everybody's on the team.
link |
00:46:22.320
It's hard to change them, but once they're changed, they are incredibly focused in a way
link |
00:46:28.560
that it's hard for the West to do in a more individualistic culture. So there are all kinds of
link |
00:46:36.320
fascinating things. One thing that might interest the people who are listening
link |
00:46:46.640
who are more technologically based than some other group. I was with one of the top people in
link |
00:46:53.760
the government a few weeks ago and he was telling me that every school child in China is going to be
link |
00:47:04.080
taught computer science. Now imagine 100% of these children. This is such a large number of human
link |
00:47:19.040
beings. Now, that doesn't mean that every one of them will be good at computer science, but if it's
link |
00:47:26.000
sort of like in the West, if it's like math or English, everybody's going to take it.
link |
00:47:32.400
Yes. Not everybody's great at English. They don't write books. They don't write poetry.
link |
00:47:38.800
Not everybody's good at math. Somebody like myself, I sort of evolved to the third grade
link |
00:47:44.640
and I'm still doing flashcards. I didn't make it further in math, but imagine
link |
00:47:51.840
everybody in their society is going to be involved with computer science.
link |
00:47:57.360
Just to pause on that: I think computer science involves at the basic beginner level
link |
00:48:06.640
programming and the idea that everybody in the society would have some ability to program a
link |
00:48:13.280
computer is incredible. For me, it's incredibly exciting, and I think that should give the United
link |
00:48:22.560
States pause and consider what, talking about sort of philanthropy and launching things.
link |
00:48:31.120
There's nothing like, sort of, investing in the youth, in the education
link |
00:48:37.600
system because that's where everything launches. Yes. Well, we've got a complicated system because
link |
00:48:42.800
we have over 3,000 school districts around the country. China doesn't worry about that as a
link |
00:48:49.600
concept. They make a decision at the very top of the government that that's what they want to have
link |
00:48:56.400
happen and that is what will happen. We're really handicapped by this distributed power.
link |
00:49:06.880
In the education area, although some people involved with that area will think it's great,
link |
00:49:12.640
but you would know better than I do what percent of American children have computer science exposure.
link |
00:49:23.120
My guess, with no knowledge, would be 5% or less. If we're going to be going into a world
link |
00:49:33.040
where the other major economic power, sort of like ourselves, has got like 100% and we've got 5,
link |
00:49:41.920
and the whole computer science area is the future, then we're, not purposely,
link |
00:49:50.960
but accidentally, actually handicapping ourselves, and our system doesn't allow us
link |
00:49:57.760
to adjust quickly to that. Issues like this I find fascinating, and if you're lucky enough to
link |
00:50:08.320
go to other countries, which I do and you learn what they're thinking, then it informs what we
link |
00:50:17.680
ought to be doing in the United States. The current administration, Donald Trump,
link |
00:50:25.520
has released an executive order on artificial intelligence. I'm not sure if you're familiar
link |
00:50:30.720
with it in 2019. Looking several years ahead, how does America, we've mentioned in terms of the big
link |
00:50:40.800
impact, we hope your investment in MIT will have a ripple effect, but from a federal perspective,
link |
00:50:49.280
from a government perspective, how does America establish, with respect to China, leadership in
link |
00:50:56.240
the world at the top for research and development in AI? I think that you have to get the federal
link |
00:51:03.440
government in the game in a big way and that this leap forward technologically, which is going to
link |
00:51:14.080
happen with or without us, really should be with us and it's an opportunity, in effect,
link |
00:51:22.320
for another moonshot kind of mobilization by the United States. I think the appetite actually is
link |
00:51:34.000
there to do that. At the moment, what's getting in the way is the kind of poisonous politics we
link |
00:51:43.520
have, but if you go below the lack of cooperation, which is almost the defining element of American
link |
00:51:57.680
democracy right now in the Congress, if you talk to individual members, they get it and
link |
00:52:05.600
they would like to do something. Another part of the issue is we're running huge deficits.
link |
00:52:10.800
We're running trillion dollar plus deficits. How much money do you need for this initiative?
link |
00:52:19.520
Where does it come from? Who's prepared to stand up for it? Because if it involves taking away
link |
00:52:26.320
resources from another area, our political system is not real flexible to do that. If you're creating
link |
00:52:35.840
this kind of initiative, which we need, where does the money come from? Trying to get money
link |
00:52:47.280
when you've got trillion dollar deficits, in a way, could be easy. What's the difference between a
link |
00:52:51.680
trillion and a trillion a little more? It's hard with the mechanisms of Congress, but what's really
link |
00:53:00.400
important is this is not an issue that is unknown and it's viewed as a very important issue.
link |
00:53:12.960
There's almost no one in the Congress when you sit down and explain what's going on
link |
00:53:18.720
who doesn't say, we've got to do something. Let me ask the impossible question. You didn't
link |
00:53:26.400
endorse Donald Trump, but after he was elected, you have given him advice,
link |
00:53:35.440
which seems to me a great thing to do, no matter who the president is, to positively
link |
00:53:43.440
contribute to this nation by giving advice. Yet, you've received a lot of criticism for this.
link |
00:53:49.520
On the previous topic of science and technology and government, how do we have healthy discourse,
link |
00:53:59.600
give advice, and have an exciting conversation with the government about science and technology
link |
00:54:06.960
without it becoming politicized? It's very interesting. When I was young,
link |
00:54:13.760
before there was a moonshot, we had a president named John F. Kennedy from Massachusetts here.
link |
00:54:23.520
In his inaugural address as president, he said, ask not what your country can do for you,
link |
00:54:31.600
ask what you can do for your country. We had a generation of people, my age, basically people,
link |
00:54:39.680
who grew up with that credo. Sometimes you don't need to innovate. You can go back to basic
link |
00:54:51.280
principles, and that's a good basic principle. What can we do? Americans have GDP per capita
link |
00:55:00.400
of around $60,000. It's not equally distributed, but it's big. People have, I think, an obligation
link |
00:55:14.480
to help their country. I do that. Apparently, I take some grief from some people who
link |
00:55:26.640
project on me things I don't even vaguely believe, but I'm quite simple. I tried to help
link |
00:55:38.560
the previous president, President Obama. He was a good guy, and he was a different party,
link |
00:55:44.320
and I tried to help President Bush, and he's a different party. I sort of don't care that much
link |
00:55:53.680
about what the parties are. I care about, even though I'm a big donor for the Republicans,
link |
00:56:01.280
but what motivates me is, what are the problems we're facing? Can I help people get to a good
link |
00:56:12.080
outcome that'll stand any test? We live in a world now where the filters and the hostility are so
link |
00:56:26.960
unbelievable. In the 1960s, when I went to school, in university, I went to Yale,
link |
00:56:35.040
and we had so much stuff going on. We had a war called the Vietnam War. We had black power
link |
00:56:46.320
starting, and we had a sexual revolution with the birth control pill. And there was one other major
link |
00:56:59.600
thing going on: the drug revolution. There hasn't been a generation that had more stuff
link |
00:57:09.920
going on in a four year period than my era. Yet, there wasn't this kind of
link |
00:57:20.240
instant hostility if you believed something different. Everybody lived together and respected
link |
00:57:31.120
the other person. I think that this type of change needs to happen, and it's got to happen
link |
00:57:38.640
from the leadership of our major institutions. I don't think that leaders can be bullied
link |
00:57:50.720
by people who are against the classical version of free speech and letting open expression and
link |
00:57:59.600
inquiry. That's what universities are for, among other things, the Socratic method. In the midst of
link |
00:58:13.360
this onslaught of oddness, I still believe in the basic principles, and we're going to have to find a
link |
00:58:24.240
way to get back to that. That doesn't start with the people in the middle to the bottom who are
link |
00:58:33.040
using these kinds of screens to shout people down and create an uncooperative environment.
link |
00:58:41.120
It's got to be done at the top with core principles that are articulated. Ironically,
link |
00:58:49.920
if people don't sign on to these kind of core principles where people are equal and
link |
00:58:58.640
speech can be heard and you don't have these enormous shout down biases subtly or
link |
00:59:06.240
out loud, then they don't belong at those institutions. They're violating the core
link |
00:59:10.880
principles. That's how you end up making change, but you have to have courageous people
link |
00:59:21.040
who are willing to lay that out for the benefit of not just their institutions,
link |
00:59:27.680
but for society as a whole. I believe that will happen,
link |
00:59:33.680
but it needs the commitment of senior people to make it happen.
link |
00:59:42.480
Courage. I think for such great leaders, great universities, there's a huge hunger
link |
00:59:48.640
for it. I, too, am very optimistic that it will come. I'm now personally taking a step into
link |
00:59:55.120
building a startup for the first time, hoping to change the world, of course. There are thousands,
link |
01:00:01.760
maybe more, maybe millions of other first time entrepreneurs like me. What advice
link |
01:00:07.840
have gone through this process and talked about the suffering, the emotional turmoil,
link |
01:00:13.920
and all it entails. What advice do you have for those people taking that step?
link |
01:00:20.000
I'd say it's a rough ride and you have to be psychologically prepared for things going wrong
link |
01:00:30.800
with frequency. You have to be prepared to be put in situations where you're being asked to
link |
01:00:39.040
solve problems you didn't even know existed. For example, renting space.
link |
01:00:45.680
It's not really a problem unless you've never done it. You have no idea what a lease looks like.
link |
01:00:52.320
You don't even know the relevant rent in a market, so everything is new. Everything
link |
01:00:58.960
has to be learned. What you realize is that it's good to have other people with you
link |
01:01:05.120
who've had some experience in areas where you don't know what you're doing. Unfortunately,
link |
01:01:11.120
an entrepreneur starting out doesn't know much of anything, so everything is something new.
link |
01:01:16.960
Yeah. I think it's important not to be alone because it's overwhelming
link |
01:01:28.560
and you need somebody to talk to other than a spouse or a loved one because even they get
link |
01:01:35.440
bored with your problems. Getting a group, if you look at Alibaba, Jack Ma was telling me
link |
01:01:45.120
that they basically were at financial death's door at least twice. The fact that it wasn't
link |
01:01:55.040
just Jack. People think it was, because he became the public face and the driver,
link |
01:02:02.480
but there was a group of people who could give advice, share situations to talk about. That's really important.
link |
01:02:12.720
And that's not just referring to the small details like renting space.
link |
01:02:17.360
It's also the psychological burden. Because most entrepreneurs at some point question what
link |
01:02:25.200
they're doing because it's not going so well or they're screwing it up and they don't know how to
link |
01:02:30.320
unscrew it up because we're all learning and it's hard to be learning when there are like
link |
01:02:37.440
25 variables going on. If you're missing four big ones, you can really make a mess.
link |
01:02:44.000
And so the ability to in effect have either an outsider who's really smart that you can rely on
link |
01:02:53.200
for certain types of things, or other people who are working with you on a daily basis.
link |
01:02:59.040
Most people who haven't had experience believe in the myth of the one person, one great
link |
01:03:08.800
person, you know, who makes outcomes, creates outcomes that are positive. For most of us,
link |
01:03:17.520
it's not like that. If you look back over a lot of the big successful tech companies,
link |
01:03:23.360
it's not typically one person. And you will know these stories better than I do. Because it's
link |
01:03:31.120
your world, not mine. But even I know that almost every one of them had two people. I mean,
link |
01:03:36.720
if you look at Google, that's what they had. And that was the same at Microsoft at the beginning.
link |
01:03:43.520
And it was the same at Apple. People have different skills and they need to play off of
link |
01:03:51.600
other people. So, you know, the advice that I would give you is make sure you understand that so
link |
01:04:01.840
you don't head off in some direction as a lone wolf and find that either you can't invent all the
link |
01:04:09.280
solutions, or you make bad decisions on certain types of things. This is a team sport. Entrepreneur
link |
01:04:18.640
means you're alone, in effect. And that's the myth. But it's mostly a myth.
link |
01:04:27.200
Yeah, I think, and you talk about this in your book, and I could talk to you about it forever,
link |
01:04:31.840
the harshly self critical aspect to your personality, and to mine as well in the face
link |
01:04:38.480
of failure. It's a powerful tool, but it's also a burden. It's very interesting, very interesting,
link |
01:04:46.880
to walk that line. But let me ask in terms of people around you, in terms of friends,
link |
01:04:55.200
in the bigger picture of your own life, where do you put the value of love, family, friendship
link |
01:05:02.560
in the big picture journey of your life? Well, ultimately, all journeys are alone.
link |
01:05:09.440
It's great to have support. And when you go forward and say, your job is to make something
link |
01:05:24.640
work, and that's your number one priority, and you're going to work at it to make it work,
link |
01:05:31.280
it's like superhuman effort. People don't become successful as part time workers. It doesn't work
link |
01:05:38.960
that way. And if you're prepared to make that 100 to 120% effort, you're going to need support,
link |
01:05:50.640
and you're going to have to have people involved with your life who understand that that's really
link |
01:05:56.240
part of your life. Sometimes you're involved with somebody, and they don't really understand that,
link |
01:06:04.640
and that's a source of conflict and difficulty. But if you're involved with the right people,
link |
01:06:13.360
whether it's a dating relationship or a spousal relationship, you have to involve them
link |
01:06:24.960
in your life, but not burden them with every sort of minor triumph or mistake. They actually get
link |
01:06:37.440
bored with it after a while, and so you have to set up different types of ecosystems. You have your
link |
01:06:46.720
home life, you have your love life, you have children, and that's like the enduring part of
link |
01:06:54.640
what you do. And then on the other side, you've got the sort of unpredictable nature of this type
link |
01:07:06.000
of work. What I say to people at my firm who are younger, usually, well, everybody's younger,
link |
01:07:14.000
but people who are of an age where they're just having their first child,
link |
01:07:22.160
or maybe they have two children, that it's important to make sure they go away with their
link |
01:07:31.440
spouse at least once every two months, to just some lovely place where there are no children,
link |
01:07:40.080
no issues, sometimes once a month, if they're sort of energetic and clever.
link |
01:07:49.520
Escape the craziness of it all? Yeah, and reaffirm your values as a couple,
link |
01:07:58.080
and you have to have fun. If you don't have fun with the person you're with and all you're doing
link |
01:08:05.520
is dealing with issues, then that gets pretty old. And so you have to protect the fun element
link |
01:08:13.840
of your life together. And the way to do that isn't by hanging around the house
link |
01:08:18.640
and dealing with more problems. You have to get away and reinforce and reinvigorate
link |
01:08:27.200
your relationship. And whenever I tell one of our younger people about that, they sort of look at
link |
01:08:32.480
me and it's like the scales are falling off of their eyes and they're saying, jeez, I hadn't
link |
01:08:37.760
thought about that. I'm so enmeshed in all these things, but that's a great idea. And that's something
link |
01:08:43.440
as an entrepreneur, you also have to do. You just can't let relationships slip because you're half
link |
01:08:51.440
overwhelmed. Beautifully put, and I think there's no better place to end it. Steve, thank you so
link |
01:08:58.160
much. I really appreciate it. It was an honor to talk to you. My pleasure. Thanks for listening to
link |
01:09:03.760
this conversation with Stephen Schwarzman. And thank you to our sponsors ExpressVPN and Masterclass.
link |
01:09:09.840
Please consider supporting the podcast by signing up to Masterclass at masterclass.com slash lex
link |
01:09:16.240
and getting ExpressVPN at expressvpn.com slash lex pod. If you enjoy this podcast,
link |
01:09:22.320
subscribe on YouTube, review it with five stars on Apple Podcasts, support it on Patreon, or simply
link |
01:09:27.600
connect with me on Twitter at Lex Fridman. And now let me leave you with some words from Stephen
link |
01:09:34.240
Schwarzman's book, What It Takes. It's as hard to start and run a small business as it is to start
link |
01:09:42.240
a big one. You'll suffer the same toll financially and psychologically as you bludgeon it into
link |
01:09:48.560
existence. It's hard to raise the money and to find the right people. So if you're going to dedicate
link |
01:09:54.640
your life to a business, which is the only way it will ever work, you should choose one with
link |
01:10:00.800
the potential to be huge. Thank you for listening and hope to see you next time.