[0:27] Announcer: Andrew Ross Sorkin and his guest, the founder and CEO of Nvidia, Jensen Huang.

[0:36] Sorkin: Welcome back, everybody. Jensen is here, of course, the CEO of Nvidia. As I mentioned at the top of the day, this is the clear winner in the world of artificial intelligence thus far. His company powers everything from OpenAI to Google's programs, and they're all frenemies in some ways; we'll talk about it. He founded the company back in 1993 over breakfast at Denny's with two friends. Since then, as CEO, he's led Nvidia to become the world's most valuable semiconductor company, and Nvidia stock has been on a tear, up about 240 percent this year and reaching a market cap above a trillion dollars. We are so grateful to have you here today as we all try to make sense of what is happening in the world of AI, and in so many ways you saw this first, so I'm hoping to start with this. As I said, you power what OpenAI and ChatGPT have become. We've all been reading about OpenAI and all of the travails inside that company and nonprofit, and maybe we'll talk about some of the governance issues there as well. But you delivered, and I don't know what year we're talking about now, the first box, the first chips, to Elon Musk, who was one of the founders of OpenAI, only a couple of years ago. What happened?
[2:10] Huang: Well, I delivered to him the first AI supercomputer the world ever made. It took us five years to make. It's called a DGX, and it's everywhere in the world today. People think that we build GPUs, but this GPU is 70 pounds and 35,000 parts, and out of those 35,000 parts, eight of the chips come from TSMC. It is so heavy it needs robots to build it. It's like an electric car. It consumes 10,000 amps. We sell it for two hundred fifty thousand dollars. It's a supercomputer, so it takes another supercomputer to test it. This is a computer, the first of its kind, and we started working on it in 2012; it took me five years to build. At first I built it for our own engineers, and I spoke about it at one of our conferences, and Elon saw it. He goes, "I want one of those," and he told me about OpenAI. I also knew Pieter Abbeel, who was a Berkeley professor and one of the early people at OpenAI, and Ilya Sutskever, whom I met during the AlexNet days five years earlier; he's involved in all the drama that we've been reading about. And so anyway, I delivered the world's first AI supercomputer to OpenAI on that day, and people took pictures of it, so it's on the internet somewhere.
[3:37] Sorkin: When you did that, and you said you didn't do it originally for him, what was it that you saw at that point, five years before you even delivered it, in 2012, when this all first started, first happened?

[3:48] Huang: AlexNet did something remarkable. Here's a neural network, a software program where the way you programmed it was to show it the results that you wanted, which is the backwards of most programs up to then. Programs up to then were where engineers would sit down and write software, and then you would test it to see if it produced the outputs you wanted. But here, you showed it examples; you taught it what outputs to expect. And when we first saw the results of AlexNet, the results were so spectacular. Alex Krizhevsky and Ilya Sutskever and of course Geoffrey Hinton achieved results in computer vision, in recognition, that no computer vision expert was able to achieve before that. So the first observation was how remarkable it was, but then we were fortunate to have taken a step back and asked ourselves: what is the implication of this for the future of computing? And we drew the right conclusions: that this was going to change the way computing was going to be done, this was going to change the way software was going to be written, and this was going to change the type of applications we could write.
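[Editor's note: a minimal, hypothetical sketch of the "program it by showing it the results you want" idea Huang describes, in plain NumPy; this is not the AlexNet code, and the numbers are invented for illustration.]

```python
# Minimal sketch of programming by example (hypothetical illustration).
import numpy as np

# Traditional programming: an engineer writes the rule explicitly.
def handwritten_rule(x):
    return 3.0 * x + 2.0

# Learning from examples: we only show input/output pairs we want.
rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, size=100)
ys = handwritten_rule(xs)                # the "desired results" shown to the model

w, b = 0.0, 0.0                          # untrained parameters
lr = 0.1
for _ in range(2000):                    # gradient descent on mean squared error
    err = w * xs + b - ys
    w -= lr * 2 * np.mean(err * xs)      # dL/dw
    b -= lr * 2 * np.mean(err)           # dL/db

print(f"learned w={w:.2f}, b={b:.2f}")   # approaches 3 and 2 without anyone coding the rule
```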
[5:13] Sorkin: Did you get to work... was there any part of you that was scared when all this happened? You just mentioned two names, Geoffrey Hinton as well, and you also mentioned Ilya, and those are names, by the way, if you've been following what's happening, that have been very outspoken about the dangers of AI. I want to get into what you actually think happened at OpenAI in the past couple of weeks, but it may very well be that there has been a new step change in terms of what this technology is. But was there ever a part of you, when you were seeing this all happen, that said, oh my goodness, we're on the cusp of a revolution, in a great way, but this is dangerous?

[5:52] Huang: What I would say is, 12 years ago nobody expected the results we're getting, and I think anybody who would have said so back then would have over-exaggerated, you know, our understanding of the rate of progress. There's no question that the rate of progress is high. And what we realize today is that, of course, what we can do today with these models and intelligence are related, but not the same. We're very good at perception today, and we're very good at those one-shot, knee-jerk reactions: I recognize that that's a dog; I can finish that sentence. But there's a whole bunch of things that we can't do yet. We can't reason yet, you know, this multi-step reasoning that humans are very good at, AI can't do.

[6:42] Sorkin: And how far away do you think we are from that?

[6:43] Huang: Well, we'll see. I think just about everybody's working on it; all the researchers are working on it. We're trying to figure out, you know, how do you take a goal, break it down into a whole bunch of steps, create the decision tree, and then walk down the decision tree to figure out which one of the paths leads to the most optimal answer. This is how we reason through things, how we iterate through problems.
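[Editor's note: a toy, hypothetical sketch of the "break a goal into steps and walk the decision tree" idea Huang sketches; the candidate steps and the scoring heuristic are invented for illustration and are not any company's planning algorithm.]

```python
# Toy sketch: break a goal into steps, branch on every ordering, and walk
# the decision tree to find the highest-scoring path (hypothetical example).

CANDIDATE_STEPS = ["gather data", "train model", "evaluate", "deploy"]

PREREQS = {"train model": {"gather data"},
           "evaluate": {"train model"},
           "deploy": {"evaluate"}}

def step_value(done, step):
    """Invented heuristic: a step is worth more once its prerequisites are done."""
    return 2 if PREREQS.get(step, set()) <= set(done) else 0

def best_plan(done=()):
    """Depth-first walk of the decision tree: branch on each remaining step,
    score every completion, and return the best one."""
    remaining = [s for s in CANDIDATE_STEPS if s not in done]
    if not remaining:
        return 0, list(done)
    options = []
    for step in remaining:
        value, path = best_plan(done + (step,))
        options.append((step_value(done, step) + value, path))
    return max(options, key=lambda option: option[0])

score, plan = best_plan()
print(plan, score)   # the ordering that respects all prerequisites scores highest
```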
[7:12] Sorkin: Today, as you know, you're making bets now in terms of technology that you have to build, investments you have to make, on where we're going to be five years from now, ten years from now, right? So, you know, people talk about AGI, artificial general intelligence. Do you think that in 10 years from now we are there?

[7:26] Huang: Depending on how you define it, I think the answer is yes. And so the question is, what is AGI? If we define AGI as a piece of software, a computer, that can take a whole bunch of tests, and these tests reflect basic intelligence tests, and by completing those tests deliver results that are fairly competitive with a normal human, I would say that within the next five years you're obviously going to see AIs that can achieve those tests.

[8:06] Sorkin: And design the chips that you're making right now?

[8:09] Huang: Yeah. Well, you need the same staff that designs them. In fact, none of our chips are possible today without AI. Literally, the H100s we're shipping today were designed with the assistance of a whole lot of AIs; otherwise we wouldn't be able to cram so many transistors on a chip or optimize the algorithms to the level that we have. And, you know, software can't be written without AI, chips can't be designed without AI, nothing can. Yeah, nothing's possible.

[8:34] Sorkin: We started by talking about OpenAI, and everybody's focused on that. What did you make of what happened, the ousting of Sam Altman, all of it?
[8:45] Huang: Well, first of all, I'm happy that they're settled, and I hope they're settled. It's a really great team, they're doing important work, they've achieved great results, and I'm just really happy that they're settled. You know, it also brings to mind the importance of corporate governance. Nvidia is here 30 years after our founding, and we've gone through a lot of adversity. If we didn't set up our company properly, who knows what would have been, who knows what we would have done. And so I think when you're architecting an industry, you want to apply some of that wisdom to architecting a company. And so I'm really proud of Nvidia's corporate governance, by the way, and if not for the architecture that we established when I was 29 years old...
[9:36] Sorkin: You'd be kind of... you're a for-profit company, though. What's so interesting, I think, about this sort of dynamic is that that is a firm that is effectively operated, from a governance perspective, as a not-for-profit, and one of the reasons they set it up that way was because they did think it was dangerous. Elon Musk said it was dangerous at the beginning; Ilya said it was dangerous. And so the question is, in the sort of multitude of these different businesses that are in AI, do you think you do need these not-for-profits? Do you think that the incentive system is just fundamentally off and it should be a for-profit? I mean, a lot of people now think the capitalists have taken over.

[10:12] Huang: Well, regulators are not-for-profit, and we should regulate these. First of all, just take a step back and think about what AI is. AI is an autonomous system, an autonomous information system with more sophistication. We have a lot of autonomous systems today: self-driving cars; robots are already autonomous within factories; planes are autonomous, with autopilot, self-landing, all of those capabilities. We ought to make sure that we apply the first principles of autonomous systems in the same way. We have to design it properly, test it properly, stress test it properly, monitor it properly. There's inside-out safety, there's outside-in safety: the FAA, flight control, air traffic control, redundancy, diversity. There's a whole bunch of different systems that we have to put in place for autonomous systems, and there's a lot for industries to learn from.
[11:11] Sorkin: At the beginning I mentioned there's sort of a frenemy situation going on with a lot of companies that use your chips. They're desperate for your chips, they want your GPUs, and at the same time they're also trying to build their own, frankly. I'm curious, since you've seen it all, how you would stack-rank the success of the various companies that are in this AI space. We have somebody from Google DeepMind here today, their CEO. I'm curious where you think OpenAI ranks in that, there's Inflection, Amazon is trying to play...

[11:44] Huang: And I'm not going to rank my friends, you know.

[11:45] Sorkin: But you have a sense of... and part of the question that I want to...

[11:49] Huang: But I'm not going to do it. I'm just kidding.

[11:52] Sorkin: But there is a question about, frankly, whether all of these things converge, meaning, does this all become some kind of commoditized business?

[12:02] Huang: No, I don't think so. I don't think so. I think what's going to happen is we're going to have off-the-shelf AIs, and these off-the-shelf AIs are going to be really, really good at solving a lot of problems. But you're going to have companies in healthcare, for example, with supervised, you know, super-tuned AIs that take these off-the-shelf AIs and make them super good at drug discovery, or super good at chip design. Just use our company as an example: the vast majority of our company's value is in the data and the intelligence and the know-how to craft that's inside our company, and none of that data is out on the internet. You can't get an AI to go learn it. And so I've got to take a really smart AI, which is what we do: we build a smart AI and then we teach it how to design chips, we teach it how to write software, you teach it how to do drug discovery, you teach it how to do, you know, radiology.
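[Editor's note: a schematic, hypothetical sketch of the two-stage pattern Huang describes, starting from a general "off-the-shelf" model and then continuing to train it on proprietary in-house examples; this plain-NumPy stand-in is not Nvidia's chip-design pipeline.]

```python
# Schematic sketch: generic pretraining, then "super-tuning" on proprietary
# domain data (hypothetical illustration in plain NumPy).
import numpy as np

def train(w, xs, ys, lr=0.1, steps=500):
    """Gradient-descent fit of y ~ w * x on the given examples."""
    for _ in range(steps):
        w -= lr * 2 * np.mean((w * xs - ys) * xs)
    return w

rng = np.random.default_rng(0)

# Stage 1: the "off-the-shelf" model, trained on broad, public-style data.
public_x = rng.uniform(-1, 1, 1000)
public_y = 1.0 * public_x                 # generic relationship
w = train(0.0, public_x, public_y)

# Stage 2: continue training on a small proprietary set (the in-house
# know-how that never appears on the internet), shifting the model
# toward the specialist task.
domain_x = rng.uniform(-1, 1, 50)
domain_y = 3.0 * domain_x                 # the in-house relationship
w = train(w, domain_x, domain_y)

print(f"specialized weight: {w:.2f}")     # moves from about 1.0 to about 3.0
```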
[12:55] Sorkin: Let me ask you a geopolitical question. We're going to hear from the President of Taiwan just after this, and there is a big debate, as you know, about chip independence: the big investment that we're making in chips manufactured here in the United States, and whether we should be exporting certain types of chips to China. Where are we on the journey of being chip independent, if you will, and do you think that is a worthy goal?

[13:21] Huang: We are somewhere between a decade and two decades away from supply chain independence. As I mentioned earlier, our systems have 35,000 parts, and eight of them come from TSMC. The supply chain, when you think it through... parts are in Taiwan, of course, there are a lot in Taiwan, and they're all over the world. But supply chain independence is going to be really challenging. We should try it. We should endeavor it; I mean, we should absolutely go down the journey of it. But total independence of the supply chain is not a real practical thing for a decade or two.
[14:01] Sorkin: One of the other things that's happening, as you know so well, is that the U.S. government has effectively told you that you need to throttle the speed of the chips that you are exporting to China. This is having an impact on the business itself, but I'm curious how you think about that geopolitically as a business, the national security concerns. Jamie Dimon, we were talking earlier about, you know, what companies you should do business with. Should you do business with people in China or not, given all of the concerns that people have?

[14:30] Huang: Well, on first principles, we're a company that was built for business, and so we try to do business with everybody we can. On the other hand, our national security matters and our national competitiveness matters, so somewhere between the two is what makes sense. Our country, of course, wants our industries to, on the one hand, be successful, lead the world, invent amazing technology, have technology independence, and be the leader of the world in technology; on the other hand, we need to make sure that we ensure our national security. Our regulations provide for that, so that the most critical technology, the leading edge of it, is not made available to China. And so what we have to do: a new regulation just came out, one came out a year ago and one just came out this year, and so we have to come up with new chips that comply with the regulation, and once we comply with the regulation, we'll go back to market and do the best we can.

[15:34] Sorkin: Do you think the regulation is a good idea? Because I have heard you say that you think that, potentially, by throttling these chips we are just inspiring and creating competitors in places like China that you can't control.
[15:48] Huang: Look, there are always unintended consequences to everything that we do in complicated systems. If we want to limit them from access to technology like Nvidia's, maybe it doesn't really work: they find a way to get it, or they find a way to inspire their local industry. There are some 50 companies being built in China that are going to go provide this technology. So, you know, it's a complicated thing. And so what can you do? Well, you can make your own choices.

[16:17] Sorkin: But the other thing that's happened, literally in the past couple of months now, is that Huawei came out with a new phone, and it surprised everybody in terms of the chips in that phone, in terms of it being a 7-nanometer chip. There was a view that China was never going to get there, that we had this sort of real opportunity ahead of them by many years. Were you surprised by that?

[16:46] Huang: The rumors of it in the market had been around for a long time. And so were we surprised? I don't think so. I don't think anybody in the industry was really surprised. And is it possible to take something that was, say, 16 nanometers and shrink it to seven? You know, these are just numbers. Is it really 7? Did they shrink it down to something that was sufficiently good that you can make a phone from? Yeah, I think so. And so, you know, there's no magic in these numbers; seven is just the number. But the question is, what is our lead over them?

[17:24] Sorkin: What do you think it is, in semiconductors?

[17:26] Huang: Yeah, in semiconductors, you know, call it a decade; you could debate it, but call it a decade. But could you take the decade-old technology and just squeeze the living daylights out of it until it produces something that's kind of like something from five years ago? Yeah, probably. And so I think there are a lot of clever engineers all over the world, and they're trying to, you know, get the most out of it.
[17:55] Sorkin: Let me ask you about a different company: ASML in the Netherlands, which is basically responsible for every chip that everybody makes. Some people might call them a monopoly. How powerful are they in all of this, and should we be worried about that power?

[18:15] Huang: Well, a lot of people depend on them to build the instrument, and they do build very, very good instruments, and the technology is very complicated; it took a long time for them to build it. There's no reason why they wouldn't want to provide it to the world. And so I'm not sure what the question is, but I'm not concerned. I didn't wake up this morning concerned about ASML. I think they're an excellent provider, and they're motivated, and this applies to us as well, you know, so I think everybody's incentives are aligned.
[18:48] Sorkin: Let me ask you a management question, because it's just fascinating given the success of this company. You constantly say, even at this point in the ballgame, "I do everything I can not to go out of business. I do everything I can not to fail." That is like a mantra inside the company, even at this point. What is that about?

[19:11] Huang: What is that about? I think when you build a company from the ground up and you've experienced real adversity, and you really experienced nearly going out of business several times, that feeling stays with you. I wake up every morning in, you know, some condition of concern. I don't wake up proud and confident; I wake up worried and concerned, you know. And so it just depends on which side of the bed you get out on.
[19:47] Sorkin: This is the Andy Grove "only the paranoid survive."

[19:48] Huang: Well, I think paranoia needs therapy. I don't think people are trying to put me out of business... I probably know they're trying to, and so that's different. And so I live in this condition where we're partly desperate and partly, you know, aspirational.
20:15
ask you then about this you
20:16
said this to
20:17
the New Yorker and I found
20:19
it fascinating again goes
20:20
to this idea of failure or
20:22
worries about failure,
20:22
but you said this and I
20:24
was like news you can this
20:24
is a selfish question.
20:26
You said I find that I
20:27
think best when I’m under
20:29
adversity and then you said
20:31
my heart rate actually goes
20:33
down. When I’m under
20:36
adversity, my heart rate
20:36
goes up by a lot. Uh-huh.
20:42
Oh my let’s see. Well,
20:45
I think I think during
20:49
adversity you’re more focused
20:52
and when you’re more
20:52
focused you could you
20:54
perform better and I like
20:56
I like, you know know, the
20:57
last last five minutes
21:00
before before something
21:02
you’re more focused. And so,
21:04
you know, I like to live in
21:05
that state where we’re
21:08
we’re about to perish about
21:08
we’re we’re where state that
21:09
live in to like I to
21:09
perish and
21:13
Everything you know,
21:14
everything you know, and so
21:14
so I enjoy that condition
21:15
and Everything you know,
21:16
everything you know, and
21:16
I do my best work in
21:17
that condition and I
21:18
you know, I like going home
21:19
and telling condition, my
21:20
condition wife condition, I
21:21
saved the company today and
21:23
and maybe maybe it wasn’t
21:26
true. But but I like
21:27
to think so and so
[21:29] Sorkin: Another question: we have a lot of business leaders and CEOs here, and I think they're going to be surprised to hear this. You have 40 direct reports...

[21:37] Huang: 50.

[21:39] Sorkin: So, at the company, 50 direct reports. Most people, and I don't know if we have any consultants in the room, they'd say, you know what, half a dozen, maybe 10, that should be the limit. What's your philosophy or theory here?
[21:52] Huang: Well, the people that report to the CEO should require the least amount of pampering. And so I don't think they need life advice, I don't think they need career guidance; they should be at the top of their game, incredibly good at their craft. And unless they need my personal help, you know, they should require very little management. And so I think that the more direct reports a CEO has, the fewer layers there are in the company, and it allows us to keep information fluid, allows us to make sure that everyone is empowered by information, and our company, you know, just performs better, because everybody is aligned and everybody's informed of what's going on.
[22:38] Sorkin: I want to open it up to questions in just a moment, so please do raise your hand so I can find you, but I want to ask you this. You did a podcast recently, and there were a lot of headlines about it, and you said during the podcast that if you could do it all over again, like if you could start Nvidia again... you wouldn't.

[22:56] Huang: No.

[23:01] Sorkin: What did you mean? Why? I mean, you've done this amazing thing. You're worth forty billion dollars personally.
[23:07] Huang: That wasn't what I meant. First of all, you know, I think it would be disingenuous if I said that it wasn't, quote, worth it. I enjoy a lot of good things in life. I've got a great family. We built a great company. All of that is worth it. That wasn't what I meant. What I meant was, if people realized how hard something is... if I were to realize how hard it was, how many times we were going to fail, how the original business plan had no hope of succeeding, that the early founders we built the whole company with had to completely relearn just about everything; if I would have known everything, all of the things that I had to know in order to be a CEO, everything that we had to solve in order to be where we are, that mountain of work, that mountain of, you know, challenges, the mountain of adversity and setback and some amount of humiliation and a lot of embarrassment; if you want to pile all of that, in 1993, on the table of a 29-year-old, I don't think I would have done it. I would have said there's no way I would know all this, there's no way I could learn all this, there's no way we can overcome all this, there's no way, you know, this is a game plan that's not going to work. And so that's what I meant. I think the ignorance of entrepreneurs, this attitude, and I try to keep that today, which is to ask yourself, "How hard could it be?" You know, you approach life with this attitude of, how hard could it be, they could do it, I could do it. That attitude is completely helpful, but it's also completely wrong. It's very helpful because it gives you courage, but it's wrong because it is way harder than you think. And the amount of skill that is necessary, the amount of knowledge that's necessary... you know, I think it's one of those teenager attitudes, and I try to keep that in the company, that teenage attitude: how hard can something be? You know, it gives you courage, gives you confidence.
[25:24] Sorkin: Let's see if we can sneak in one question or two if we could. I know Ron Conway had a question last time, at a different moment; I don't know if he's still in the room, and I felt like I should give him an opportunity, but I see Gary Lauder there. Hey, Gary.

[25:40] Gary Lauder: So there are a lot of startups, and some non-startups, doing AI chips optimized for LLMs, and they claim to be dramatically more effective or energy efficient than GPUs. Can you talk about what you're planning in that regard?
[25:58] Huang: Yeah. First of all, this is one of the great observations that we made: we realized that deep learning and AI was not a chip problem. It's a reinvention of computing, everything from how the computer works, how computer software works, the type of software that we were going to write, the way that we write it. The way we develop software today, using AI, creating AI, that method of software is fundamentally different from the way we did it before. So every aspect of computing has changed. And in fact, one of the things that people don't realize is that the vast majority of computing today is a retrieval model, meaning, just ask yourself what happens when you touch your phone: some electrons go to a data center somewhere, retrieve a file, and bring it back to you. In the future, the vast majority of computing is going to be retrieval plus generation, and so the way that computing is done has fundamentally changed. Now, we observed that and realized that about a decade and a half ago. I think a lot of people are still trying to sort that out. It is the reason why, you know, people say, oh, we're practically the only company doing it; it's probably because we're the only company that got it, and people are still trying to get it. You can't solve this new way of doing computing by just designing a chip. Every aspect of the computer has fundamentally changed, and so everything from the networking to the switching to the way the computers are designed to the chips themselves, all of the software that sits on top of it, and the methodology that pulls it all together. It's a big deal because it's a complete reinvention of the computer industry. And now we have a trillion dollars' worth of data centers in the world, and all of that is going to get retooled. That's the amazing thing. We're at the beginning of a brand new generation of computing. It hasn't been reinvented in 60 years. This is why it's such a big deal; it's hard for people to wrap their head around it. But that's the great observation that we made: it includes a chip, but it's not about the chip.
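[Editor's note: a tiny, hypothetical sketch of the retrieval versus "retrieval plus generation" distinction Huang draws above; generate() is a trivial stand-in for a real generative model, and the documents are drawn from facts stated earlier in this transcript.]

```python
# Tiny sketch of "retrieval" versus "retrieval plus generation"
# (hypothetical illustration; not any production system).

DOCUMENTS = {
    "dgx": "The DGX is a 70-pound AI supercomputer with roughly 35,000 parts.",
    "alexnet": "AlexNet showed that networks trained on examples beat hand-written vision code.",
}

def retrieve(query):
    """Classic computing: fetch the stored file that matches and return it as-is."""
    return next((text for key, text in DOCUMENTS.items() if key in query.lower()),
                "not found")

def generate(query, context):
    """Stand-in for a generative model: compose a new reply from retrieved context."""
    return f"Q: {query}\nA (drafted from retrieved context): {context}"

def retrieve_and_generate(query):
    """The future pattern described above: retrieval supplies context, generation writes the answer."""
    return generate(query, retrieve(query))

print(retrieve("tell me about the DGX"))
print(retrieve_and_generate("tell me about the DGX"))
```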
[28:07] Sorkin: Jensen Huang, everybody. Thank you very, very much. Thanks, everybody.