WEBVTT
1
00:00:02.600 --> 00:00:06.919
Chairman of Stagwell here with Linda Yaccarino and an assembled
2
00:00:06.960 --> 00:00:11.480
group of about twenty five CMOs of major corporations here
3
00:00:11.519 --> 00:00:16.519
at CES during the show to really see the latest
4
00:00:16.559 --> 00:00:20.399
in technology. And we're graced with Elon Musk who has
5
00:00:20.480 --> 00:00:23.640
consented to give us some of his time to answer
6
00:00:23.679 --> 00:00:28.879
some of these really burning questions about how technology's going to
7
00:00:28.920 --> 00:00:32.759
develop and change our lives, and some of his interactions
8
00:00:32.759 --> 00:00:35.039
with the government may also change
9
00:00:34.880 --> 00:00:35.759
our lives as well.
10
00:00:36.399 --> 00:00:40.119
Yeah, and so if I might start off with kind
11
00:00:40.159 --> 00:00:43.840
of a big overall question and I'll do some questions
12
00:00:43.880 --> 00:00:46.640
and then we'll get some questions in from the group.
13
00:00:47.679 --> 00:00:50.479
You know, I have to say that I've been actually
14
00:00:50.479 --> 00:00:52.759
a Tesla owner for many years.
15
00:00:53.039 --> 00:00:53.399
Thank you.
16
00:00:53.880 --> 00:00:58.039
You're clearly someone of great taste.
17
00:00:59.320 --> 00:01:02.840
Well, I will say that I drove from Miami to
18
00:01:02.880 --> 00:01:05.920
Fort Lauderdale to meet my brother for dinner, about forty
19
00:01:05.599 --> 00:01:09.079
miles, and I did not touch the wheel.
20
00:01:09.120 --> 00:01:10.560
But let me tell you that the Tesla.
21
00:01:14.719 --> 00:01:18.439
The car drove itself is what you're saying, and you do
22
00:01:18.560 --> 00:01:20.200
not need to intervene.
23
00:01:20.879 --> 00:01:23.439
It did not need me for this entire ride.
24
00:01:24.239 --> 00:01:28.040
Yeah, it's pretty magical. It's like, when you tell
25
00:01:28.079 --> 00:01:31.480
people who have not experienced it, they don't believe you.
26
00:01:31.680 --> 00:01:35.840
You know, yeah, yes, And I didn't believe it either
27
00:01:37.400 --> 00:01:40.000
because I've been a skeptic about how far it would go.
28
00:01:40.640 --> 00:01:45.000
And your latest releases of software are you know, really incredible,
29
00:01:45.359 --> 00:01:48.040
So I'd encourage people to try them. As a
30
00:01:48.120 --> 00:01:52.640
big question, then, looking at the next decade,
31
00:01:53.239 --> 00:01:55.480
what do you think is going to be the greatest
32
00:01:55.519 --> 00:01:58.959
advances in technology that will affect people's lives? What should
33
00:01:59.000 --> 00:02:03.400
they be expecting to see from technology in their lives?
34
00:02:04.000 --> 00:02:06.760
Okay, well, I don't want to blow your minds, but
35
00:02:06.959 --> 00:02:13.319
AI is going to be big. I feel confident in
36
00:02:13.479 --> 00:02:15.680
that prediction. But you know, the funny thing is, if
37
00:02:15.719 --> 00:02:20.599
you go back even five years ago, certainly ten years ago,
38
00:02:21.120 --> 00:02:25.039
or even fifteen years ago, I was saying that AI
39
00:02:25.319 --> 00:02:28.560
is going to be like this massive thing that has deep
40
00:02:28.719 --> 00:02:34.960
super intelligence, smarter than the smartest human, people thought I
41
00:02:35.000 --> 00:02:38.120
was kidding, and thought that's ridiculous. There's no
42
00:02:38.159 --> 00:02:40.159
way a computer's going to be smarter than a human, or it'd
43
00:02:40.159 --> 00:02:41.520
be able to do all these complicated things.
44
00:02:41.960 --> 00:02:42.879
And now.
45
00:02:44.439 --> 00:02:48.599
You know, the latest AIs are able
46
00:02:48.680 --> 00:02:53.520
to pass complicated tests better than most humans. Like they
47
00:02:53.520 --> 00:02:57.039
can pass the medical tests better than like eighty
48
00:02:57.039 --> 00:03:01.360
percent of doctors or something. You know, AI can
49
00:03:01.919 --> 00:03:06.360
diagnose radiographs better than most people who have been doing
50
00:03:06.400 --> 00:03:12.400
it their whole life. So that's just accelerating, obviously.
51
00:03:12.919 --> 00:03:17.080
I think probably if you haven't seen Jensen's talk, it's
52
00:03:17.319 --> 00:03:23.080
it's excellent and it really shows how much AI is advancing,
53
00:03:24.000 --> 00:03:25.000
and it's advancing on the
54
00:03:25.240 --> 00:03:27.560
hardware front and on the software front.
55
00:03:28.280 --> 00:03:31.439
In terms of data, this is like the new sort
56
00:03:31.439 --> 00:03:34.520
of thing is synthetic data because we've actually run out
57
00:03:34.560 --> 00:03:38.000
of all the books and literally run out. If you
58
00:03:38.120 --> 00:03:42.199
take the entire Internet and all books ever written, and
59
00:03:42.400 --> 00:03:46.000
all interesting videos, like you don't need a thousand
60
00:03:46.039 --> 00:03:47.840
cat videos that are exactly the same, but all the
61
00:03:47.879 --> 00:03:50.639
interesting videos, and you sort of just distill that
62
00:03:50.759 --> 00:03:56.879
down into tokens, essentially bits of information, and we've
63
00:03:56.960 --> 00:04:01.719
now exhausted all of that. Basically the cumulative
64
00:04:02.840 --> 00:04:04.479
sum of human knowledge has been
65
00:04:04.400 --> 00:04:05.520
exhausted in AI training.
66
00:04:06.680 --> 00:04:09.680
That happened basically last year, and so the only
67
00:04:09.759 --> 00:04:12.879
way to then supplement that is with synthetic data, where
68
00:04:12.879 --> 00:04:15.919
the AI creates it. It'll sort of write an essay or
69
00:04:16.000 --> 00:04:18.160
come up with a thesis, and then
70
00:04:18.680 --> 00:04:21.480
it will grade itself and
71
00:04:21.759 --> 00:04:23.920
sort of go through this process of self learning with
72
00:04:24.040 --> 00:04:27.160
synthetic data, which is always challenging because how
73
00:04:27.199 --> 00:04:30.800
do you know? How do you know if it hallucinated
74
00:04:30.839 --> 00:04:34.920
the answer or it's a real answer. So it's challenging
75
00:04:34.959 --> 00:04:38.040
to find the ground truth. But it is pretty
76
00:04:38.079 --> 00:04:40.399
wild that AI at this point has run out of
77
00:04:40.519 --> 00:04:42.800
all human knowledge to train on.
78
00:04:45.279 --> 00:04:45.600
Crazy.
79
00:04:46.240 --> 00:04:49.680
I know that you're building by far the largest AI
80
00:04:49.920 --> 00:04:54.959
center on the planet. We already have it. Yeah, it's in
81
00:04:54.959 --> 00:04:57.240
operation. Microsoft
82
00:04:56.920 --> 00:04:59.920
is planning eighty billion dollars, a company I used to work
83
00:04:59.800 --> 00:05:03.839
for. That's a lot of money by anyone's standards.
84
00:05:03.480 --> 00:05:06.560
Really, you know, I mean, you know, I did a
85
00:05:06.639 --> 00:05:09.600
poll and we asked, is AI making a difference in
86
00:05:09.680 --> 00:05:12.800
your life today? Thirteen percent said yes. Then we asked,
87
00:05:12.879 --> 00:05:15.040
in five years from now, will AI make a difference
88
00:05:15.079 --> 00:05:18.199
in your life? Eighty seven percent expect in five years
89
00:05:18.240 --> 00:05:19.199
it will make a difference.
90
00:05:19.639 --> 00:05:21.079
It'll be a gigantic difference.
91
00:05:21.279 --> 00:05:23.079
Okay, what is it going to do for people?
92
00:05:23.360 --> 00:05:24.199
Is it going to do anything
93
00:05:24.279 --> 00:05:27.959
you want? Right, AI will do anything
94
00:05:28.040 --> 00:05:30.000
you want and even suggest things you never even thought of.
95
00:05:32.319 --> 00:05:35.360
So I mean, AI really within the next few years
96
00:05:35.360 --> 00:05:38.920
will be able to do any cognitive task, which
97
00:05:38.920 --> 00:05:41.000
obviously begs the question what are we all going to do?
98
00:05:41.360 --> 00:05:45.959
You know, but pretty much any cognitive task that doesn't
99
00:05:46.000 --> 00:05:49.959
involve atoms, AI will be able to do within, I'd
100
00:05:50.000 --> 00:05:56.360
say, three or four years maximum. And then
101
00:05:57.199 --> 00:06:00.720
the other element of it is the robotics
102
00:06:00.759 --> 00:06:04.680
that you need. The AI
103
00:06:04.759 --> 00:06:07.720
can't just be thinking in a data center.
104
00:06:07.839 --> 00:06:10.600
It's got to do things. That's where you need the robots.
105
00:06:11.439 --> 00:06:16.519
So you need, you know, self driving cars, which
106
00:06:16.519 --> 00:06:19.360
obviously you've experienced, and that rate of
107
00:06:19.399 --> 00:06:22.759
improvement is exponential in how good the self driving cars are.
108
00:06:24.600 --> 00:06:30.240
You know, we feel confident of basically being
109
00:06:30.319 --> 00:06:35.360
better than human driving in about three months, basically
110
00:06:35.439 --> 00:06:38.360
Q two of this year. We feel confident of
111
00:06:40.720 --> 00:06:43.720
having a probability of accident that is lower than the
112
00:06:44.079 --> 00:06:45.560
average experienced driver.
113
00:06:47.040 --> 00:06:48.839
And then it'll keep going from there.
114
00:06:49.920 --> 00:06:52.560
Ultimately, I think it's going to be ten times safer
115
00:06:52.600 --> 00:06:55.759
than a human driver and then one hundred times safer,
116
00:06:56.439 --> 00:06:57.279
like it gets to
117
00:06:57.639 --> 00:06:59.879
the point where really it just won't crash.
118
00:07:00.079 --> 00:07:04.959
Yeah. So that's happening this year
119
00:07:05.120 --> 00:07:07.800
with Tesla. And this is
120
00:07:07.839 --> 00:07:11.800
a software update to you know, our cars. So as
121
00:07:11.839 --> 00:07:15.759
you experienced yourself, it's the same car, it's got a
122
00:07:15.800 --> 00:07:18.079
software update, and suddenly it's way smarter at driving.
123
00:07:19.399 --> 00:07:21.959
Well, let me try a few timelines then,
124
00:07:22.040 --> 00:07:24.519
because you know, look, I'm not the youngest guy around,
125
00:07:24.600 --> 00:07:25.920
so I want to.
126
00:07:27.480 --> 00:07:30.079
Technology stand up for what he is young.
127
00:07:30.319 --> 00:07:33.000
Uh get older every year as I get older.
128
00:07:33.360 --> 00:07:35.720
I used to build computers as a kid, right when
129
00:07:35.800 --> 00:07:38.439
you couldn't buy them yet, And I don't have to
130
00:07:38.519 --> 00:07:41.560
do that anymore. And so, some timelines: self
131
00:07:41.639 --> 00:07:46.199
driving cars, government certified self driving, you think will
132
00:07:46.240 --> 00:07:47.439
be within a year.
133
00:07:48.439 --> 00:07:52.160
Well, I mean there already are autonomous vehicles, you know, in
134
00:07:52.519 --> 00:07:56.199
some regions, like Waymo has autonomous vehicles with
135
00:07:56.279 --> 00:07:59.600
no one in them, but they're limited to, like,
136
00:07:59.639 --> 00:08:02.879
a few cities in the US. The Tesla solution,
137
00:08:03.160 --> 00:08:05.680
which is a much more difficult path to go but
138
00:08:05.800 --> 00:08:07.519
ultimately much more
139
00:08:07.399 --> 00:08:11.279
powerful, is a general solution to self driving.
140
00:08:11.959 --> 00:08:16.040
So the Tesla software is just purely AI and vision, it
141
00:08:16.839 --> 00:08:20.519
doesn't rely on any expensive sensors, no lidars, no radars.
142
00:08:22.079 --> 00:08:24.199
It doesn't even require knowing
143
00:08:24.279 --> 00:08:27.360
the area beforehand.
144
00:08:27.639 --> 00:08:29.560
You could have it drive someplace it has never been before,
145
00:08:29.600 --> 00:08:31.959
where no Tesla has ever been before. It could be
146
00:08:32.000 --> 00:08:34.120
an alien planet, I mean it, and the car will
147
00:08:34.159 --> 00:08:36.960
still work, still drive.
148
00:08:38.879 --> 00:08:43.240
So that's this year, you know. And when
149
00:08:43.320 --> 00:08:44.399
can I get a home robot?
150
00:08:44.639 --> 00:08:44.919
Okay?
151
00:08:45.480 --> 00:08:45.639
Right?
152
00:08:46.399 --> 00:08:50.720
Well, so that's the other element, which is our
153
00:08:50.759 --> 00:08:55.480
humanoid robots. So I think probably most people, if
154
00:08:55.519 --> 00:08:58.200
not everyone, would like to have their own personal
155
00:08:58.360 --> 00:09:03.600
C-3PO or R2-D2. And I think,
156
00:09:03.600 --> 00:09:05.840
I actually think humanoid robots will be the biggest product
157
00:09:06.639 --> 00:09:07.720
ever in history by far.
158
00:09:08.360 --> 00:09:08.799
I agree.
159
00:09:09.519 --> 00:09:10.559
Yeah, it's just it's as wild.
160
00:09:10.600 --> 00:09:13.679
Because you can just say, well, every human is
161
00:09:13.720 --> 00:09:16.360
going to want one most likely, and some will want two,
162
00:09:16.639 --> 00:09:19.559
and then there'll be all of the industry in terms
163
00:09:19.559 --> 00:09:22.919
of making and providing products and services. So you have, say,
164
00:09:22.960 --> 00:09:26.320
what's the ratio of humanoid robots to humans? My guess
165
00:09:26.440 --> 00:09:28.559
is it's at least three to one, four to one,
166
00:09:29.000 --> 00:09:31.879
maybe five to one. So we're talking about twenty or thirty
167
00:09:31.919 --> 00:09:39.879
billion humanoid robots, and you know, it's not
168
00:09:39.919 --> 00:09:43.159
even clear what money means at that point, or if
169
00:09:43.200 --> 00:09:45.360
there's any meaningful cap on the economy.
170
00:09:46.679 --> 00:09:49.279
I think at that point, assuming that things haven't
171
00:09:49.039 --> 00:09:53.399
gone awry, you know, in the good AI scenario,
172
00:09:54.440 --> 00:09:56.799
I think we won't have universal basic income,
173
00:09:56.840 --> 00:09:58.039
we'll have universal high income.
174
00:10:00.360 --> 00:10:03.080
So do you think five years for my first robot?
175
00:10:05.279 --> 00:10:08.720
Well, for Tesla, you know, the Optimus robot really
176
00:10:08.799 --> 00:10:11.679
is, unless somebody's got something secret we don't
177
00:10:11.679 --> 00:10:17.159
know about, the Optimus robot is the most sophisticated humanoid
178
00:10:17.360 --> 00:10:20.000
robot in the world. It's got a hand that has
179
00:10:20.039 --> 00:10:22.039
twenty two degrees of freedom. It looks and feels like
180
00:10:22.120 --> 00:10:28.080
a human hand. And you know, we're aiming to have
181
00:10:28.159 --> 00:10:30.960
several thousand of those built this year. Initially we'll
182
00:10:31.000 --> 00:10:36.399
test them out at Tesla factories, but then assuming things
183
00:10:36.440 --> 00:10:39.440
go well, we will ten x that output
184
00:10:39.600 --> 00:10:42.440
next year. So we'll aim to do maybe fifty to
185
00:10:42.519 --> 00:10:45.440
one hundred thousand humanoid robots next year and then ten
186
00:10:45.600 --> 00:10:49.080
x that again the following year. So it's like five hundred
187
00:10:49.120 --> 00:10:50.519
thousand robots in three years.
188
00:10:51.960 --> 00:10:52.399
That's a lot.
189
00:10:54.000 --> 00:10:57.240
Yeah, So I guess, well, maybe we should think of
190
00:10:57.360 --> 00:10:59.679
this in terms of Roman legions. How many legions of
191
00:10:59.759 --> 00:11:03.759
robots do we have? Like, a Roman legion
192
00:11:03.639 --> 00:11:04.399
is five thousand.
193
00:11:06.559 --> 00:11:08.200
When will we have a colony on Mars?
194
00:11:09.799 --> 00:11:14.559
Well, I think we'll be able to send the first
195
00:11:16.080 --> 00:11:20.000
uncrewed spacecraft to Mars in two years. So Earth and
196
00:11:20.039 --> 00:11:24.919
Mars synchronize every two years, and so we're at a
197
00:11:24.960 --> 00:11:27.600
synchronous point right now, So then the next one will
198
00:11:27.639 --> 00:11:29.720
be roughly two years from now, and then there'll be
199
00:11:29.840 --> 00:11:32.519
two years from then there will be another one. So for
200
00:11:32.600 --> 00:11:34.200
the first trip, obviously we want to make sure that
201
00:11:34.279 --> 00:11:40.600
we can land Starship without crashing. Like, we need to
202
00:11:40.639 --> 00:11:43.240
prove that we can land Starship on Mars without incrementing
203
00:11:43.320 --> 00:11:49.639
the crater count. And if those land safely, then maybe
204
00:11:49.679 --> 00:11:52.879
on the next trip we would send people and then
205
00:11:52.960 --> 00:11:54.519
hopefully that would grow exponentially.
206
00:11:55.000 --> 00:11:58.960
So eventually there will be thousands of starships going to Mars.
207
00:11:59.480 --> 00:12:02.960
And it might have this like really cool visual, like Battlestar
208
00:12:02.559 --> 00:12:04.159
Galactica or something, you know, the sort of
209
00:12:04.600 --> 00:12:08.759
colony ships departing altogether with these like bright points of
210
00:12:08.799 --> 00:12:12.840
light in space. I think it would look really cool. But
211
00:12:13.919 --> 00:12:15.720
I think the goal has to be to get to
212
00:12:15.759 --> 00:12:19.639
the point where Mars is self sustaining. So the point
213
00:12:19.639 --> 00:12:23.159
at which Mars is self sustaining, which is really defined
214
00:12:23.200 --> 00:12:26.159
as the point at which, if the resupply ships from Earth
215
00:12:26.200 --> 00:12:28.879
stopped coming for any reason, Mars doesn't die out,
216
00:12:29.279 --> 00:12:32.039
that Mars can continue to grow. So if there's something
217
00:12:32.039 --> 00:12:33.720
that happens on Earth, like let's say there's World War
218
00:12:33.759 --> 00:12:37.279
III or some natural disaster or who knows what, but
219
00:12:37.399 --> 00:12:41.559
for whatever reason the resupply ships stopped coming, if Mars
220
00:12:41.600 --> 00:12:44.080
can still continue to survive, then
221
00:12:46.399 --> 00:12:50.200
you know, the probable lifespan of civilization is dramatically greater.
222
00:12:51.559 --> 00:12:54.000
So you know, if you sort of stand back and say,
223
00:12:54.000 --> 00:12:56.879
how would you evaluate any civilization, you'd say, like, well,
224
00:12:56.919 --> 00:12:59.679
is that civilization still stuck on its own planet, or is it
225
00:13:00.159 --> 00:13:04.399
a multiplanet civilization? And we don't want to be one
226
00:13:04.399 --> 00:13:06.039
of those lame one planet civilizations.
227
00:13:06.279 --> 00:13:09.000
Okay, we're going to have a respectable outcome here.
228
00:13:10.080 --> 00:13:11.840
Even if we don't make it beyond our Solar system,
229
00:13:11.879 --> 00:13:13.519
we at least got to get to another planet.
230
00:13:14.279 --> 00:13:18.799
Yeah, well, and finally on my list, brain to technology communication.
231
00:13:19.480 --> 00:13:22.000
Yeah, back to communication. Am I going to
232
00:13:22.039 --> 00:13:24.519
see that also because right now this is looking pretty
233
00:13:24.559 --> 00:13:24.919
good for me.
234
00:13:26.480 --> 00:13:29.279
Yeah. So we've got Neuralink.
235
00:13:29.360 --> 00:13:32.440
We've now got three patients, three humans with
236
00:13:32.600 --> 00:13:37.840
Neuralinks implanted and all working well. And we've upgraded
237
00:13:37.879 --> 00:13:42.480
the devices. The devices will have more electrodes,
238
00:13:42.519 --> 00:13:48.519
basically higher bandwidth, longer battery life and everything. And so
239
00:13:48.679 --> 00:13:52.840
we expect to you know, hopefully do twenty or thirty
240
00:13:53.799 --> 00:13:57.039
patients next year or this year, I should say, with
241
00:13:57.200 --> 00:14:02.080
the upgraded Neuralink devices. And with this our first product,
242
00:14:02.360 --> 00:14:04.960
we're trying to enable people who have lost their brain
243
00:14:05.039 --> 00:14:10.679
body connection, so they're tetraplegic or paraplegic, or
244
00:14:12.159 --> 00:14:15.679
basically, like, you can imagine, say, Stephen Hawking. If
245
00:14:15.679 --> 00:14:19.399
Stephen Hawking could communicate as fast or even faster than
246
00:14:19.440 --> 00:14:23.000
a normal human, that would be transformational. Yeah, So that
247
00:14:23.120 --> 00:14:28.440
that's sort of our first product is being able to
248
00:14:28.519 --> 00:14:31.440
read the motor cortex of the brain, so that
249
00:14:31.559 --> 00:14:34.440
if you think about moving your hand, it will move
250
00:14:34.519 --> 00:14:38.000
the cursor on the screen. And it enables people
251
00:14:38.080 --> 00:14:40.240
to control their computer or their phone just by thinking.
252
00:14:41.360 --> 00:14:44.240
And then our next product will be Blindsight, where
253
00:14:44.320 --> 00:14:47.000
even if somebody has lost both eyes or has lost
254
00:14:47.039 --> 00:14:50.120
the optic nerve, or if they've been blind
255
00:14:50.159 --> 00:14:53.679
from birth, we can interface directly with the visual cortex
256
00:14:53.720 --> 00:14:55.840
in the brain and enable them to see.
257
00:14:57.279 --> 00:14:59.320
And we already have that working in monkeys.
258
00:15:00.879 --> 00:15:02.840
I actually have a monkey who's now had that for
259
00:15:02.960 --> 00:15:07.960
I think two years. And so it's basically
260
00:15:08.039 --> 00:15:16.360
enabling people to control devices, and ultimately we think if
261
00:15:16.399 --> 00:15:19.919
you have a second Neuralink device that is past the
262
00:15:20.000 --> 00:15:24.000
point where the spinal damage occurred, we can actually transmit
263
00:15:24.080 --> 00:15:27.639
the signals from the brain past where essentially the
264
00:15:27.679 --> 00:15:28.080
wires are
265
00:15:28.080 --> 00:15:30.279
broken, and enable someone to walk again.
266
00:15:31.679 --> 00:15:36.200
So that would really be profound, obviously, and I'm
267
00:15:36.200 --> 00:15:40.399
confident that that is physically possible. And then the long
268
00:15:40.480 --> 00:15:42.559
term goal of Neuralink is to be able to
269
00:15:42.639 --> 00:15:48.639
improve the bandwidth. So right now when we're speaking,
270
00:15:48.879 --> 00:15:51.679
our bandwidth in bits per second is quite low, and the
271
00:15:51.759 --> 00:15:54.519
sustained bandwidth of a human is less than one bit
272
00:15:54.639 --> 00:15:58.159
per second over a twenty four hour period. So there's eighty
273
00:15:58.159 --> 00:16:01.240
six thousand four hundred seconds in a day, and the average
274
00:16:01.279 --> 00:16:02.879
human outputs
275
00:16:02.519 --> 00:16:05.080
much less than eighty six thousand four hundred bits a day.
276
00:16:06.679 --> 00:16:08.919
If someone's a writer, they might well exceed that, but
277
00:16:09.039 --> 00:16:12.279
most people do not output more than the number of
278
00:16:12.320 --> 00:16:16.799
seconds in the day. And with a neural link, you
279
00:16:17.080 --> 00:16:22.000
could increase that output capability by a thousand or maybe
280
00:16:22.000 --> 00:16:27.360
a million. So it would be a profoundly different experience that
281
00:16:27.440 --> 00:16:28.639
could be super human essentially.
282
00:16:29.559 --> 00:16:31.759
Well, put me down for all of this so far
283
00:16:33.159 --> 00:16:34.200
early adopter.
284
00:16:34.519 --> 00:16:37.519
Yeah, and trust me, you'll really like the chip,
285
00:16:37.879 --> 00:16:39.159
I can guarantee.
286
00:16:38.679 --> 00:16:43.559
it. And let's kind of bring us down to earth
287
00:16:43.679 --> 00:16:46.240
for a question on DOGE.
288
00:16:46.679 --> 00:16:49.559
Yeah, I mean I worked very closely actually with President
289
00:16:49.559 --> 00:16:53.399
Clinton in the nineties when we did Reinventing Government.
290
00:16:53.519 --> 00:16:55.600
We did balance the budget in two years.
291
00:16:55.679 --> 00:16:58.279
Actually, that's awesome. Oh those were the days.
292
00:17:00.279 --> 00:17:05.119
Well, it didn't last very long because it got blown
293
00:17:05.200 --> 00:17:08.759
up very quickly. Have you identified some cuts that you're
294
00:17:08.759 --> 00:17:10.960
really looking at, that you think will
295
00:17:11.000 --> 00:17:13.279
be successful? Do you think the two trillion is
296
00:17:13.319 --> 00:17:15.720
a realistic number now that you're looking more closely at it.
297
00:17:16.759 --> 00:17:23.920
Yeah. Well, I think we'll try and go for
298
00:17:24.000 --> 00:17:28.000
two trillion. I think that's like the best case outcome.
299
00:17:28.960 --> 00:17:31.640
But I do think that you kind of have to
300
00:17:31.680 --> 00:17:33.640
have some overage. I think if we try for two trillion,
301
00:17:33.680 --> 00:17:37.559
we've got a good shot at getting one. And if
302
00:17:38.079 --> 00:17:40.880
we can drop the budget deficit from two trillion
303
00:17:40.960 --> 00:17:42.319
to one trillion and
304
00:17:43.960 --> 00:17:45.640
kind of free up the economy
305
00:17:45.240 --> 00:17:50.160
to, you know, have additional growth such that the output
306
00:17:50.200 --> 00:17:54.000
of goods and services keeps pace with the increase
307
00:17:54.000 --> 00:17:56.119
in the money supply, then there will be no inflation.
308
00:17:57.920 --> 00:18:00.839
So that I think would be an epic outcome.
309
00:18:03.079 --> 00:18:06.119
And in terms of saving money in the government, well,
310
00:18:06.720 --> 00:18:09.599
as you know, it's a
311
00:18:09.680 --> 00:18:11.799
very target rich environment for saving money. Like if you
312
00:18:12.000 --> 00:18:14.720
look in any direction, it's like,
313
00:18:14.759 --> 00:18:16.400
where will you find places to save money? I'm like,
314
00:18:16.519 --> 00:18:18.480
it's like being in a room full of targets. Like,
315
00:18:18.599 --> 00:18:22.839
you could close your eyes and you can't miss. So
316
00:18:24.000 --> 00:18:27.880
there's just a lot of waste in government, especially
317
00:18:27.920 --> 00:18:30.119
the federal government. You've just got a situation where
318
00:18:30.240 --> 00:18:32.519
the checks never bounce, like they've got like the infinite
319
00:18:32.559 --> 00:18:38.160
money computer. And then the people
320
00:18:38.240 --> 00:18:40.240
that spend the money, it's
321
00:18:40.279 --> 00:18:42.319
not their money, you know. It's very
322
00:18:42.359 --> 00:18:45.440
hard for people to care about spending someone else's money.
323
00:18:47.400 --> 00:18:50.200
And then even, you know, actually I
324
00:18:50.279 --> 00:18:52.480
know people in the government who do care about,
325
00:18:52.839 --> 00:18:55.039
just as a matter of principle, spending money effectively, and
326
00:18:55.079 --> 00:18:58.160
they try to do so and they can't. The system
327
00:18:58.240 --> 00:19:00.920
prevents them from doing so. And then they even
328
00:19:00.960 --> 00:19:04.440
get told to do crazy things, as probably sounds familiar,
329
00:19:04.759 --> 00:19:06.720
where you get towards the end of the budget cycle
330
00:19:06.799 --> 00:19:09.359
and they're told to spend up to their
331
00:19:09.400 --> 00:19:13.400
budget, even on nonsense stuff, because if they
332
00:19:13.440 --> 00:19:17.119
don't spend their budget, the budget gets reduced. So it's
333
00:19:17.160 --> 00:19:20.480
actually sort of a perverse incentive to waste money,
334
00:19:21.680 --> 00:19:23.400
and then they kind of get punished for
335
00:19:23.519 --> 00:19:25.559
not wasting money. So it's totally bananas.
336
00:19:28.039 --> 00:19:29.599
Well, I agree, you will find that.
337
00:19:30.440 --> 00:19:33.559
You know, I did a mathematical analysis in terms of
338
00:19:33.759 --> 00:19:36.119
how government used to do things. So if you take
339
00:19:36.200 --> 00:19:39.960
the Brooklyn Bridge or the Lincoln Tunnel, and you adjust for inflation,
340
00:19:40.799 --> 00:19:44.519
the infrastructure bill should actually get you four thousand adjusted
341
00:19:44.640 --> 00:19:48.960
Brooklyn Bridges or Lincoln Tunnels, which of course it does not,
342
00:19:49.200 --> 00:19:52.039
because government isn't run the way it used to be.
343
00:19:52.880 --> 00:19:57.039
No, exactly. Essentially we've had an accumulation of
344
00:19:57.160 --> 00:20:01.880
laws and regulations that make basically any large project essentially
345
00:20:01.920 --> 00:20:07.319
illegal. And even if you
346
00:20:07.359 --> 00:20:09.119
try to do it, you'd better be prepared to spend
347
00:20:09.160 --> 00:20:11.240
way more money on the paperwork than on the thing itself.
348
00:20:13.720 --> 00:20:17.200
So then it gets delayed. And so
349
00:20:17.279 --> 00:20:19.319
there's an element of DOGE, which is
350
00:20:19.400 --> 00:20:24.039
very important, which is looking at regulations and getting
351
00:20:24.119 --> 00:20:28.599
rid of ones where the harm is
352
00:20:28.759 --> 00:20:30.440
worse than the good, like you say, like, well, any
353
00:20:30.480 --> 00:20:31.920
given regulation, like, okay, there's some
354
00:20:31.920 --> 00:20:33.799
amount of good, some amount of harm, but you know,
355
00:20:34.279 --> 00:20:35.319
what's that ratio? Is it?
356
00:20:35.480 --> 00:20:38.000
Like, you know, there's a lot of regulations that
357
00:20:38.319 --> 00:20:43.079
frankly are just completely nonsensical. And we want to get
358
00:20:43.119 --> 00:20:45.680
rid of nonsensical regulations that do not serve the public good.
359
00:20:46.079 --> 00:20:49.599
Well, Linda Yaccarino in her keynote here mentioned the DOGE thing,
360
00:20:49.680 --> 00:20:51.359
and she got enormous applause.
361
00:20:51.880 --> 00:20:54.799
So I think the country is really waiting to see
362
00:20:55.200 --> 00:20:58.400
this effort. They're behind it, they're optimistic.
363
00:20:58.960 --> 00:21:01.640
Let me try to get in one more topic here
364
00:21:01.759 --> 00:21:04.119
before I get one or two other questions out there,
365
00:21:04.160 --> 00:21:08.640
which is obviously Mark Zuckerberg made an amazing one
366
00:21:09.640 --> 00:21:10.799
eighty degree turn.
367
00:21:11.559 --> 00:21:12.759
Yeah cool.
368
00:21:14.880 --> 00:21:17.759
What's your reaction to what he did and his acknowledgment
369
00:21:17.880 --> 00:21:21.400
frankly that the government was in fact censoring things, or
370
00:21:21.480 --> 00:21:24.680
he was censoring things, or the government or some combination thereof.
371
00:21:25.599 --> 00:21:27.359
Yeah, I mean there's no question. I mean
372
00:21:27.799 --> 00:21:30.799
one hundred percent the government was censoring things. We know
373
00:21:30.880 --> 00:21:34.480
that for a fact from the Twitter files. I
374
00:21:34.519 --> 00:21:38.119
mean some of the stuff was like, like pretty illegal, frankly.
375
00:21:39.000 --> 00:21:42.079
I mean, the FBI had this portal into Twitter where
376
00:21:42.119 --> 00:21:45.920
they could spy on anything and and censor anything and
377
00:21:46.079 --> 00:21:47.480
had a two week auto delete.
378
00:21:47.559 --> 00:21:48.839
So we don't even know what they did.
379
00:21:50.960 --> 00:21:53.200
Except that they had immense power to do whatever they wanted,
380
00:21:53.200 --> 00:21:54.640
which doesn't sound legal.
381
00:21:56.079 --> 00:21:59.440
And you know, that sounds pretty crazy.
382
00:21:59.599 --> 00:22:02.880
And you know, there's also a lot of self censoring,
383
00:22:04.079 --> 00:22:06.200
and I don't know, there was just a lot of
384
00:22:06.240 --> 00:22:10.880
censoring going on. And I feel very
385
00:22:10.880 --> 00:22:13.720
strongly that you have to have freedom of speech
386
00:22:13.920 --> 00:22:17.119
to have a functioning democracy. You know, if you don't
387
00:22:17.200 --> 00:22:22.400
have freedom of speech, freedom of expression, then how do you
388
00:22:22.440 --> 00:22:24.680
know what's really going on? And if you can't
389
00:22:24.720 --> 00:22:27.640
make an informed vote, then you don't
390
00:22:27.640 --> 00:22:31.400
have a real democracy. So it's you know, just that's
391
00:22:31.440 --> 00:22:35.480
incredibly important. I think, listen to the wisdom of
392
00:22:35.519 --> 00:22:37.480
the founders of the country and say, why did
393
00:22:37.480 --> 00:22:40.839
they make that the First Amendment? You know, and
394
00:22:41.039 --> 00:22:42.680
they did it for a reason. It's because
395
00:22:42.680 --> 00:22:46.200
they came from places where there was
396
00:22:46.279 --> 00:22:50.799
massive censorship and the penalties for speaking your mind would be,
397
00:22:51.839 --> 00:22:54.640
you know, fines, imprisonment or death. And they're like, we
398
00:22:54.759 --> 00:22:56.519
really don't like that. We want that to not be
399
00:22:56.640 --> 00:23:00.440
the case in America. So then, you know, freedom of speech.
400
00:23:01.240 --> 00:23:06.680
So you've just posted, you are the media now.
401
00:23:07.119 --> 00:23:11.319
Citizen journalism, I mean, is really becoming more and more important
402
00:23:11.599 --> 00:23:12.440
in the state of media.
403
00:23:12.480 --> 00:23:14.519
And you see it with the wildfires here.
404
00:23:14.640 --> 00:23:15.240
Yeah, exactly.
405
00:23:15.720 --> 00:23:19.119
It really takes the citizens to report and tell people.
406
00:23:19.200 --> 00:23:24.400
Yes, yeah, actually, if you think of it, like, before
407
00:23:24.480 --> 00:23:26.240
the Internet, you kind of had to have
408
00:23:27.799 --> 00:23:30.119
what I call legacy media, which is,
409
00:23:30.599 --> 00:23:34.799
you have to have some aggregation points where you know
410
00:23:35.119 --> 00:23:37.039
that, you know, have reporters go and find
411
00:23:37.079 --> 00:23:38.519
things out. Then they would go
412
00:23:38.640 --> 00:23:42.359
to their office, they would write up articles.
413
00:23:42.799 --> 00:23:45.839
They would then print those articles on paper. That paper
414
00:23:45.880 --> 00:23:48.839
would then be distributed, and it was
415
00:23:48.920 --> 00:23:50.480
the only way to know what was
416
00:23:50.519 --> 00:23:53.240
going on. But it was very slow, especially in the
417
00:23:53.279 --> 00:23:53.720
old days.
418
00:23:54.720 --> 00:23:54.839
You know.
419
00:23:54.880 --> 00:23:55.599
I think,
420
00:23:56.960 --> 00:23:59.440
like when Lincoln was assassinated, I think it
421
00:23:59.480 --> 00:24:02.640
took like three weeks for that news to reach
422
00:24:02.720 --> 00:24:03.599
Asia or something like that.
423
00:24:03.880 --> 00:24:04.079
You know.
424
00:24:05.519 --> 00:24:07.799
And in fact, in the old days, like you wouldn't
425
00:24:07.799 --> 00:24:10.680
even know that your country had gone to war because
426
00:24:11.079 --> 00:24:13.119
it would take like a month for the fact of, hey,
427
00:24:13.160 --> 00:24:15.160
we're at war, to reach your village.
428
00:24:16.400 --> 00:24:21.240
We learned about Pearl Harbor because we decoded the Japanese.
429
00:24:21.720 --> 00:24:26.920
I didn't know that we decoded their cipher system, right,
430
00:24:27.200 --> 00:24:30.160
I mean, that's how we actually learned about it. Yeah,
431
00:24:31.039 --> 00:24:33.160
and you know, if I can
432
00:24:33.480 --> 00:24:35.400
squeeze in, I just got three minutes here.
433
00:24:35.759 --> 00:24:38.279
It's fine, I've got a little longer if you want. Okay.
434
00:24:39.440 --> 00:24:39.759
Thank you.
435
00:24:40.039 --> 00:24:44.039
So I got three questions in from
436
00:24:44.079 --> 00:24:47.599
the audience. Dan Gardner and Toby Daniels of On Discourse
437
00:24:47.640 --> 00:24:50.240
want to know, do you think the Internet sucks?
438
00:24:50.880 --> 00:24:51.920
And what do you think we need to do?
439
00:24:53.440 --> 00:24:56.119
I mean, the content on the Internet or your Internet connection,
440
00:24:56.200 --> 00:24:58.559
because Starlink can help you with the Internet connection.
441
00:24:59.240 --> 00:25:04.759
Starlink is great for Internet connectivity, you know, especially
442
00:25:04.880 --> 00:25:07.920
for like places that have bad connectivity. I think
443
00:25:07.960 --> 00:25:11.400
Starlink is really having a significant effect in terms
444
00:25:11.440 --> 00:25:14.599
of lifting people out of poverty. In fact, in many
445
00:25:14.640 --> 00:25:18.039
parts of the world where people have a product that
446
00:25:18.039 --> 00:25:19.599
they want to sell but if they don't have an
447
00:25:19.640 --> 00:25:23.680
Internet connection, they can't do it, or if they want
448
00:25:23.720 --> 00:25:26.400
to learn things like basically you can learn anything on
449
00:25:26.440 --> 00:25:28.359
the Internet for free, like MIT has all these
450
00:25:28.440 --> 00:25:31.039
lectures that are available for free on the Internet, for which
451
00:25:31.039 --> 00:25:33.720
you need an Internet connection. And so once you
452
00:25:33.759 --> 00:25:35.920
have an Internet connection, you've got access to education,
453
00:25:36.000 --> 00:25:39.039
you've got access to a global market. So I think
454
00:25:39.119 --> 00:25:43.759
it's very significant how connectivity makes a difference in people's lives.
455
00:25:44.200 --> 00:25:47.160
You know, with respect to maybe what the question was
456
00:25:47.359 --> 00:25:49.519
geared at like is there too much negativity on the Internet.
457
00:25:51.200 --> 00:25:53.279
I think at times there is too
458
00:25:53.359 --> 00:25:55.640
much negativity.
459
00:25:57.160 --> 00:25:59.720
You know, actually on the X platform, I proposed, like, well,
460
00:26:00.279 --> 00:26:02.799
we were going to tweak the algorithm to be a
461
00:26:02.839 --> 00:26:04.920
bit more positive, and then people got upset at
462
00:26:04.960 --> 00:26:05.880
me for that, I'm like.
463
00:26:07.400 --> 00:26:09.160
Like, okay, like what do you guys want? You know?
464
00:26:11.039 --> 00:26:14.240
Well, actually the second question, from Zac Moffatt of Targeted
465
00:26:14.319 --> 00:26:17.720
Victory, was how do you make pessimism uncool again? So
466
00:26:17.839 --> 00:26:22.039
I think it's maybe exactly what you're saying, which is that,
467
00:26:22.839 --> 00:26:25.960
you know, people have become afraid. You know,
468
00:26:26.079 --> 00:26:28.880
this used to be a can do nation. Yeah, always
469
00:26:29.000 --> 00:26:32.160
very positive, and now the country, when I ask
470
00:26:32.240 --> 00:26:33.640
if we're on the right track or the wrong track,
471
00:26:33.680 --> 00:26:35.720
they never say that we're on the right track anymore.
472
00:26:36.160 --> 00:26:40.960
Okay, well, hopefully, you know, hopefully with this.
473
00:26:41.480 --> 00:26:45.720
You know, I'm actually pretty optimistic about the next four
474
00:26:45.799 --> 00:26:47.920
or five years. I think
475
00:26:47.960 --> 00:26:52.440
we have the potential for a golden age. So we
476
00:26:52.720 --> 00:26:54.720
need to, it's very important to get rid of
477
00:26:55.000 --> 00:26:59.359
the mountain of regulations that are holding things back.
478
00:26:59.599 --> 00:27:01.640
And I mean, there's some good regulations, but
479
00:27:02.039 --> 00:27:04.279
there's just so much that we just can't get anything done.
480
00:27:04.599 --> 00:27:05.480
I mean, you take, sort of,
481
00:27:05.480 --> 00:27:09.759
the California wildfires for example. We really need to have
482
00:27:09.880 --> 00:27:12.440
fire breaks, and we need to clear the brush back
483
00:27:12.480 --> 00:27:16.480
away from houses, and we need to make
484
00:27:16.519 --> 00:27:18.559
sure the reservoirs are full. These are all kind of
485
00:27:18.599 --> 00:27:21.559
obvious things. But due to a bunch of
486
00:27:21.759 --> 00:27:25.680
environmental rulings, you can't actually do that in California. So
487
00:27:26.000 --> 00:27:27.680
they're not allowed to do the fire breaks, and they're
488
00:27:27.680 --> 00:27:30.000
not allowed to push the brush back away from houses
489
00:27:31.039 --> 00:27:35.440
because it might hurt some, like, red legged
490
00:27:35.480 --> 00:27:36.920
frog or something like that. You know, it's like some
491
00:27:37.039 --> 00:27:40.000
sort of creature, usually a creature you've never heard of,
492
00:27:40.160 --> 00:27:43.880
that is preventing this from occurring. You know, there's
493
00:27:44.039 --> 00:27:46.400
like this fish called the smelt, for example,
494
00:27:47.839 --> 00:27:50.680
and so we have far more fresh water
495
00:27:50.759 --> 00:27:53.640
run off into the ocean than we really should,
496
00:27:54.000 --> 00:27:57.000
on the theory that it helps this one little
497
00:27:57.039 --> 00:28:01.559
fish that likes a slightly briny freshwater salt water mix,
498
00:28:02.200 --> 00:28:04.559
and if we keep more
499
00:28:04.640 --> 00:28:09.039
fresh water then the smelt fish will not be happy.
500
00:28:09.799 --> 00:28:12.319
But there's no actual evidence that the smelt fish
501
00:28:12.400 --> 00:28:16.000
is going to be unhappy if we keep
502
00:28:16.039 --> 00:28:16.920
a bit more fresh water.
503
00:28:17.160 --> 00:28:19.920
In fact, we should
504
00:28:19.680 --> 00:28:22.319
keep more fresh water, keep the reservoirs full, and
505
00:28:22.519 --> 00:28:25.200
just have some sensible fire breaks and move the
506
00:28:25.240 --> 00:28:28.279
brush away from houses. That's just an example, like
507
00:28:29.079 --> 00:28:31.279
we'd have saved a lot of trouble and a lot of
508
00:28:31.359 --> 00:28:33.680
tragedy in LA if we'd done that.
509
00:28:35.640 --> 00:28:39.359
So you know, and yeah, I think
510
00:28:40.920 --> 00:28:43.400
I think AI and robotics is going to lead to
511
00:28:43.640 --> 00:28:46.319
a higher standard of living for people beyond what
512
00:28:46.359 --> 00:28:52.000
they can imagine. So AI doctors and medicine that are
513
00:28:52.160 --> 00:28:52.920
pretty incredible.
514
00:28:53.079 --> 00:28:56.279
So that's the final question, it comes from,
515
00:28:56.400 --> 00:28:57.160
uh,
516
00:28:59.000 --> 00:29:04.279
Catherine Heritage, I got that right? Uh, if
517
00:29:04.359 --> 00:29:08.440
all the robots and everything free up time for humans,
518
00:29:09.039 --> 00:29:11.279
what is it they will do with that time? Or
519
00:29:11.319 --> 00:29:13.240
what can we ask them to do with that time?
520
00:29:13.680 --> 00:29:17.359
And I think that's it. Yes, it rounds out the circle
521
00:29:17.559 --> 00:29:18.359
of technology.
522
00:29:19.359 --> 00:29:24.000
Yeah, I guess it would be a bit like being retired. Yeah,
523
00:29:26.079 --> 00:29:28.680
I mean, this will take a few years,
524
00:29:28.720 --> 00:29:32.000
but at some point,
525
00:29:32.400 --> 00:29:35.119
as AI and robotics get better, eventually AI will
526
00:29:35.160 --> 00:29:37.039
be able to do everything that humans can do. So
527
00:29:37.720 --> 00:29:40.079
any task you do will be optional, like it'll be
528
00:29:40.240 --> 00:29:47.440
like a hobby, you know. So then it's just the
529
00:29:47.559 --> 00:29:50.559
big question, like, well, will our lives have meaning
530
00:29:50.920 --> 00:29:54.680
if the computers and the robots can do everything better
531
00:29:54.720 --> 00:29:55.240
than we can.
532
00:29:56.039 --> 00:29:56.119
And.
533
00:29:59.079 --> 00:30:00.599
That is a real question.
534
00:30:00.680 --> 00:30:03.240
I do wonder about that myself, and maybe
535
00:30:03.279 --> 00:30:04.640
that's why we need the Neuralink, so we can
536
00:30:04.759 --> 00:30:07.240
enhance human capability so we can keep up.
537
00:30:10.640 --> 00:30:10.839
Yeah.
538
00:30:11.640 --> 00:30:14.720
Well, I just want to thank you. Well, it's
539
00:30:14.839 --> 00:30:18.799
obviously an incredibly optimistic view about technology and where it's going.
540
00:30:19.839 --> 00:30:24.039
I feel reassured that the kind of leaps that
541
00:30:24.119 --> 00:30:26.680
are going to be made, particularly what you're working on,
542
00:30:27.000 --> 00:30:29.640
are just incredible in the next few years, not
543
00:30:29.799 --> 00:30:33.720
decades away, but really very close, talked about in terms
544
00:30:33.759 --> 00:30:38.200
of years consistently, and I'm very thankful for that. And
545
00:30:38.519 --> 00:30:42.240
I think that's, you know, an absolutely tremendous message
546
00:30:42.279 --> 00:30:44.640
here, you know, where we are in Las
547
00:30:44.759 --> 00:30:48.039
Vegas kind of studying what technology is going to
548
00:30:48.079 --> 00:30:51.880
be available to people. And really, again, thank
549
00:30:51.920 --> 00:30:54.200
you for the tremendous role in opening up free speech.
550
00:30:54.960 --> 00:30:57.319
And for the role that X and, you know,
551
00:30:57.799 --> 00:30:59.240
Linda is playing in terms of that.
552
00:30:59.440 --> 00:31:02.640
I know that we're working to get full recognition
553
00:31:02.799 --> 00:31:06.000
by everyone of the platform as it should be,
554
00:31:06.119 --> 00:31:08.960
because free speech is, I think, the
555
00:31:09.039 --> 00:31:09.720
proper way to go.
556
00:31:10.200 --> 00:31:12.440
Yeah, I think on balance it's good.
557
00:31:13.599 --> 00:31:17.000
And you know, one of the things I think of conceptually
558
00:31:17.039 --> 00:31:20.200
with the X platform is, it's like a global consciousness.
559
00:31:20.240 --> 00:31:23.799
It's like the collective consciousness of humanity. Now, if you
560
00:31:23.880 --> 00:31:25.880
have a collective consciousness of humanity, well you're going to
561
00:31:25.920 --> 00:31:29.519
get every aspect of humanity, good and bad. That's just
562
00:31:29.640 --> 00:31:36.519
naturally what happens. But, you know, I do
563
00:31:36.640 --> 00:31:38.559
want it to be a good and productive thing. And
564
00:31:39.279 --> 00:31:44.160
you know, the aspiration is to maximize unregretted user minutes.
565
00:31:44.680 --> 00:31:47.559
So like you spend time on the platform, it's like, well,
566
00:31:47.720 --> 00:31:48.440
did you regret it.
567
00:31:48.519 --> 00:31:51.519
Or not regret it? And we want to maximize unregretted
568
00:31:51.640 --> 00:31:52.240
user time.
569
00:31:53.559 --> 00:31:55.319
Well, and I always go back, you know, I always
570
00:31:55.359 --> 00:31:58.400
say that everything in technology was either in the Jetsons
571
00:32:00.279 --> 00:32:04.160
or the like. You know, it's either in Star Trek or
572
00:32:04.200 --> 00:32:05.720
the Jetsons more or less right.
573
00:32:06.759 --> 00:32:09.319
But the one thing that was never predicted in any
574
00:32:09.359 --> 00:32:13.119
book that I could find is social media. So,
575
00:32:14.039 --> 00:32:16.960
it's the one thing where there really is no book
576
00:32:17.079 --> 00:32:20.359
built around how social media would develop, how it would
577
00:32:20.400 --> 00:32:24.680
really impact society, how it would move political movements and so forth.
578
00:32:25.079 --> 00:32:27.680
And it's interesting that everybody missed that in
579
00:32:27.759 --> 00:32:31.160
their projections. And so I guess the last, closing
580
00:32:31.200 --> 00:32:32.039
question maybe
581
00:32:31.920 --> 00:32:34.039
is, what do you want X to be and
582
00:32:34.119 --> 00:32:35.519
where is that platform going to go?
583
00:32:36.359 --> 00:32:41.119
And I think, well, I mean, I do want
584
00:32:41.240 --> 00:32:55.079
X to be a force for good. Exactly, so I do want
585
00:32:55.200 --> 00:32:57.680
X to be a force for good. And I do view
586
00:32:57.720 --> 00:33:00.559
it as sort of like the group mind of humanity,
587
00:33:01.200 --> 00:33:04.279
and we want to have a sort of a healthy, happy,
588
00:33:04.359 --> 00:33:14.359
sane group mind versus the opposite. And, I mean, yeah, so,
589
00:33:14.720 --> 00:33:17.359
and I want it to be just like the
590
00:33:17.400 --> 00:33:19.799
best source of truth, like if you're trying to understand
591
00:33:19.880 --> 00:33:22.640
what's going on in the world, that it has the
592
00:33:22.720 --> 00:33:27.920
most accurate, the most up to date information about
593
00:33:28.200 --> 00:33:33.799
anything large or small. So it gives you the
594
00:33:33.839 --> 00:33:38.759
best understanding of what's going on in the world, you know, anywhere, anytime.
595
00:33:40.480 --> 00:33:44.279
Yeah, perfect. Thank you. Any closing thought you want
596
00:33:44.279 --> 00:33:46.039
to leave us with?
597
00:33:46.200 --> 00:33:49.480
Thank you for being so generous with your time. You're welcome.
598
00:33:49.559 --> 00:33:52.680
I think I would encourage people to be optimistic about
599
00:33:52.680 --> 00:33:54.759
the future. I think it is much more likely to
600
00:33:54.839 --> 00:33:59.720
be good than bad. So that's my prediction. Thank you,
601
00:34:08.840 --> 00:34:09.719
all right, Thanks guys,
1
00:00:02.600 --> 00:00:06.919
Chairman of Stagwell here with Linda Yakarino and an assembled
2
00:00:06.960 --> 00:00:11.480
group of about twenty five cmos of major corporations here
3
00:00:11.519 --> 00:00:16.519
at Cees during the CEES to really see the latest
4
00:00:16.559 --> 00:00:20.399
in technology. And we're graced with Elon Musk who has
5
00:00:20.480 --> 00:00:23.640
contented to give us some of his time to answer
6
00:00:23.679 --> 00:00:28.879
some of these really burning questions about technology's going to
7
00:00:28.920 --> 00:00:32.759
develop and change our lives, and some of his interactions
8
00:00:32.759 --> 00:00:35.039
with the government may also change.
9
00:00:34.880 --> 00:00:35.759
Our lives as well.
10
00:00:36.399 --> 00:00:40.119
Yeah, and so if I might start off with kind
11
00:00:40.159 --> 00:00:43.840
of a big overall question and I'll do some questions
12
00:00:43.880 --> 00:00:46.640
and then we'll get some questions in from the group.
13
00:00:47.679 --> 00:00:50.479
You know, I have to say that I've been actually
14
00:00:50.479 --> 00:00:52.759
a Tesla owner for many years.
15
00:00:53.039 --> 00:00:53.399
Thank you.
16
00:00:53.880 --> 00:00:58.039
Clearly you're clearly someone of a great taste.
17
00:00:59.320 --> 00:01:02.840
Well, I will say that I drove from Miami to
18
00:01:02.880 --> 00:01:05.920
Fort Laurida Vale to meet my brother for dinner, about forty.
19
00:01:05.599 --> 00:01:09.079
Miles and I did not touch I touched the wheel.
20
00:01:09.120 --> 00:01:10.560
But let me tell you that the test.
21
00:01:14.719 --> 00:01:18.439
The Chaudrove itself is what you're saying, and you do
22
00:01:18.560 --> 00:01:20.200
not need to intervene.
23
00:01:20.879 --> 00:01:23.439
It did not need me for this entire ride.
24
00:01:24.239 --> 00:01:28.040
Yeah, it's pretty magical. It's like like when you tell
25
00:01:28.079 --> 00:01:31.480
people that they have not experienced it, they don't believe you.
26
00:01:31.680 --> 00:01:35.840
You know, yeah, yes, And I didn't believe me either
27
00:01:37.400 --> 00:01:40.000
because I've been a skeptic about how far it would go.
28
00:01:40.640 --> 00:01:45.000
And your latest releases of software are you know, really incredible,
29
00:01:45.359 --> 00:01:48.040
So thank you for people to try them. As a
30
00:01:48.120 --> 00:01:52.640
big question, and in that thing coming in the next decade,
31
00:01:53.239 --> 00:01:55.480
what do you think is going to be the greatest
32
00:01:55.519 --> 00:01:58.959
advances in technology that will affect people's lives? What should
33
00:01:59.000 --> 00:02:03.400
they be affecting to see here from technology in their lives?
34
00:02:04.000 --> 00:02:06.760
Okay, well, I don't want to blow your minds, but
35
00:02:06.959 --> 00:02:13.319
AI is going to be big. I feel confident for
36
00:02:13.479 --> 00:02:15.680
that prediction. But you know, the funny thing is, if
37
00:02:15.719 --> 00:02:20.599
you go back even five years ago, certainly ten years ago,
38
00:02:21.120 --> 00:02:25.039
if ten years ago, even fifteen years ago, I was saying,
39
00:02:25.319 --> 00:02:28.560
is can be like this massive thing that has deep
40
00:02:28.719 --> 00:02:34.960
super intelligence, smarter than the smartest human People thought I
41
00:02:35.000 --> 00:02:38.120
was kidding, and I thought that that's ridiculous. There's no
42
00:02:38.159 --> 00:02:40.159
way computer's going to be smarter than a human. I'd
43
00:02:40.159 --> 00:02:41.520
be able to do all these complicated things.
44
00:02:41.960 --> 00:02:42.879
And now.
45
00:02:44.439 --> 00:02:48.599
You know the it's it's the latest AIS are able
46
00:02:48.680 --> 00:02:53.520
to pass complicated tests better than most humans. Like they
47
00:02:53.520 --> 00:02:57.039
can pass the medical tests better than than like eighty
48
00:02:57.039 --> 00:03:01.360
percent of doctors or something. Know they can AI can
49
00:03:01.919 --> 00:03:06.360
diagnose radiography better than most people who have been doing
50
00:03:06.400 --> 00:03:12.400
it their whole life. So that that's that's just accelerating obviously,
51
00:03:12.919 --> 00:03:17.080
I think probably if you haven't seen Jensen's talk, it's
52
00:03:17.319 --> 00:03:23.080
it's excellent and it really shows how much AI is advancing,
53
00:03:24.000 --> 00:03:25.000
and it's advancing on the.
54
00:03:25.240 --> 00:03:27.560
Hardware fronts on the software front.
55
00:03:28.280 --> 00:03:31.439
In terms of data, this is like the new sort
56
00:03:31.439 --> 00:03:34.520
of thing is synthetic data because we've actually run out
57
00:03:34.560 --> 00:03:38.000
of all the books and literally run out of you
58
00:03:38.120 --> 00:03:42.199
take the entire Internet and all books ever written, and
59
00:03:42.400 --> 00:03:46.000
all interesting videos, like like you don't need a thousand
60
00:03:46.039 --> 00:03:47.840
cat videos that are exactly the same, but all the
61
00:03:47.879 --> 00:03:50.639
interesting videos and and you sort of just strole that
62
00:03:50.759 --> 00:03:56.879
down into tokens essentially bits of information, and you've we've
63
00:03:56.960 --> 00:04:01.719
now exhausted all of the but basically that the cumulative
64
00:04:02.840 --> 00:04:04.479
some of human knowledge has been.
65
00:04:04.400 --> 00:04:05.520
Exhausted in AI training.
66
00:04:06.680 --> 00:04:09.680
That happened basically last last year, and so the only
67
00:04:09.759 --> 00:04:12.879
way to then supplement that is with synthetic data, where
68
00:04:12.879 --> 00:04:15.919
the AI creates, it'll sort of write an essay or
69
00:04:16.000 --> 00:04:18.160
we'll come up with the thesis, and then and then
70
00:04:18.680 --> 00:04:21.480
and then it will grade itself and and and and
71
00:04:21.759 --> 00:04:23.920
sort of go through this process of self learning with
72
00:04:24.040 --> 00:04:27.160
synthetic data, which is which is always challenging because how
73
00:04:27.199 --> 00:04:30.800
do you know it? How do you know if hallucinated
74
00:04:30.839 --> 00:04:34.920
the answer or it's a real answer. So it's challenging
75
00:04:34.959 --> 00:04:38.040
to find the ground truth. But but it is pretty
76
00:04:38.079 --> 00:04:40.399
well that AI at this point has run out of
77
00:04:40.519 --> 00:04:42.800
all human knowledge to train on.
78
00:04:45.279 --> 00:04:45.600
Crazy.
79
00:04:46.240 --> 00:04:49.680
I know that you're building for for the largest AI
80
00:04:49.920 --> 00:04:54.959
center on the planet we already have. Yeah, it's an
81
00:04:54.959 --> 00:04:57.240
operation Microsoft.
82
00:04:56.920 --> 00:04:59.920
Is planning eighty billion dollars or I used to work.
83
00:04:59.800 --> 00:05:03.839
For But there's a lot of money by anyone's standards.
84
00:05:03.480 --> 00:05:06.560
Really, you know, I mean, you know, I did a
85
00:05:06.639 --> 00:05:09.600
poll and we asked, is AI making a difference in
86
00:05:09.680 --> 00:05:12.800
your life today? Thirteen percent said yes. And we asked,
87
00:05:12.879 --> 00:05:15.040
five years from now, will AI make a difference
88
00:05:15.079 --> 00:05:18.199
in your life? Eighty-seven percent expect that in five years
89
00:05:18.240 --> 00:05:19.199
it will make a difference.
90
00:05:19.639 --> 00:05:21.079
It will be a gigantic difference.
91
00:05:21.279 --> 00:05:23.079
Okay, what is it going to do for people?
92
00:05:23.360 --> 00:05:24.199
Is it going to be anything
93
00:05:24.279 --> 00:05:27.959
you want? Right, AI will do anything
94
00:05:28.040 --> 00:05:30.000
you want and even suggest things you never even thought of.
95
00:05:32.319 --> 00:05:35.360
So I mean, really, within the next few years it
96
00:05:35.360 --> 00:05:38.920
will be able to do any cognitive task. It
97
00:05:38.920 --> 00:05:41.000
obviously begs the question what are we all going to do?
98
00:05:41.360 --> 00:05:45.959
You know, but pretty much any cognitive task that doesn't
99
00:05:46.000 --> 00:05:49.959
involve atoms, AI will be able to do within, I'd
100
00:05:50.000 --> 00:05:56.360
say, three or four years maximum. And then
101
00:05:57.199 --> 00:06:00.720
the other element of it is the robotics
102
00:06:00.759 --> 00:06:04.680
that you need. So it can't just be thinking; it
103
00:06:04.759 --> 00:06:07.720
can't just be thinking in a data center.
104
00:06:07.839 --> 00:06:10.600
It's got to do things. That's where you need the robots.
105
00:06:11.439 --> 00:06:16.519
So you need, you know, self-driving cars, which
106
00:06:16.519 --> 00:06:19.360
obviously you've experienced, and that rate of
107
00:06:19.399 --> 00:06:22.759
improvement is exponential in how good the self driving cars are.
108
00:06:24.600 --> 00:06:30.240
You know, we feel confident of basically being
109
00:06:30.319 --> 00:06:35.360
better than human driving in about three months, basically
110
00:06:35.439 --> 00:06:38.360
Q2 of this year. We feel confident of
111
00:06:40.720 --> 00:06:43.720
having a probability of accident that is better than the
112
00:06:44.079 --> 00:06:45.560
average experienced driver.
113
00:06:47.040 --> 00:06:48.839
And then it'll keep going from there.
114
00:06:49.920 --> 00:06:52.560
Ultimately, I think it's going to be ten times safer
115
00:06:52.600 --> 00:06:55.759
than a human driver and then one hundred times safer,
116
00:06:56.439 --> 00:06:57.279
like, to
117
00:06:57.639 --> 00:06:59.879
the point where really it just won't crash.
118
00:07:00.079 --> 00:07:04.959
Yeah. So that's happening this year
119
00:07:05.120 --> 00:07:07.800
with Tesla. And this is
120
00:07:07.839 --> 00:07:11.800
a software update to you know, our cars. So as
121
00:07:11.839 --> 00:07:15.759
you experienced yourself, it's the same car, it's got a
122
00:07:15.800 --> 00:07:18.079
software update, and suddenly it's way smarter at driving.
123
00:07:19.399 --> 00:07:21.959
Well, let me try a few timelines then,
124
00:07:22.040 --> 00:07:24.519
because you know, look, I'm not the youngest guy around,
125
00:07:24.600 --> 00:07:25.920
so I want to.
126
00:07:27.480 --> 00:07:30.079
Technology stand up for what he is young.
127
00:07:30.319 --> 00:07:33.000
Uh get older every year as I get older.
128
00:07:33.360 --> 00:07:35.720
I used to build computers as a kid, right when
129
00:07:35.800 --> 00:07:38.439
you couldn't buy them yet, and I don't have to
130
00:07:38.519 --> 00:07:41.560
do that anymore. And so, some timelines: self-
131
00:07:41.639 --> 00:07:46.199
driving cars, government-certified self-driving, you think will
132
00:07:46.240 --> 00:07:47.439
be within a year?
133
00:07:48.439 --> 00:07:52.160
Well, I mean, there already are autonomous vehicles, you know, in
134
00:07:52.519 --> 00:07:56.199
some regions. Like, Waymo has autonomous vehicles with
135
00:07:56.279 --> 00:07:59.600
no one in them, but they're limited to, like, a few
136
00:07:59.639 --> 00:08:02.879
cities in the US. The Tesla solution,
137
00:08:03.160 --> 00:08:05.680
which is a much more difficult path to go but
138
00:08:05.800 --> 00:08:07.519
ultimately much more
139
00:08:07.399 --> 00:08:11.279
powerful, is a general solution to self-driving.
140
00:08:11.959 --> 00:08:16.040
So the Tesla software is just purely AI and vision. It
141
00:08:16.839 --> 00:08:20.519
doesn't rely on any expensive sensors, no lidars, no radars.
142
00:08:22.079 --> 00:08:24.199
It doesn't even require knowing
143
00:08:24.279 --> 00:08:27.360
the area beforehand. Like, you could
144
00:08:27.639 --> 00:08:29.560
have it drive someplace it has never been before
145
00:08:29.600 --> 00:08:31.959
and no Tesla has ever been before. It could be
146
00:08:32.000 --> 00:08:34.120
an alien planet, I mean, and the car will
147
00:08:34.159 --> 00:08:36.960
still work, still drive.
148
00:08:38.879 --> 00:08:43.240
So that's this year, you know. And when
149
00:08:43.320 --> 00:08:44.399
can I get a home robot?
150
00:08:44.639 --> 00:08:44.919
Okay?
151
00:08:45.480 --> 00:08:45.639
Right?
152
00:08:46.399 --> 00:08:50.720
Well, so that's the other element, which is our
153
00:08:50.759 --> 00:08:55.480
humanoid robots. So I think probably most people, if
154
00:08:55.519 --> 00:08:58.200
not everyone, would like to have their own personal
155
00:08:58.360 --> 00:09:03.600
C-3PO or R2-D2. And
156
00:09:03.600 --> 00:09:05.840
I actually think humanoid robots will be the biggest product
157
00:09:06.639 --> 00:09:07.720
ever in history by far.
158
00:09:08.360 --> 00:09:08.799
I agree.
159
00:09:09.519 --> 00:09:10.559
Yeah, it's just, it's so wild,
160
00:09:10.600 --> 00:09:13.679
because you can just say, well, every human is
161
00:09:13.720 --> 00:09:16.360
going to want one most likely, and some will want two,
162
00:09:16.639 --> 00:09:19.559
and then there will be all of the industry in terms
163
00:09:19.559 --> 00:09:22.919
of making and providing products and services. So then you have to say,
164
00:09:22.960 --> 00:09:26.320
what's the ratio of humanoid robots to humans? My guess
165
00:09:26.440 --> 00:09:28.559
is it's at least three to one, four to one,
166
00:09:29.000 --> 00:09:31.879
maybe five to one. So we're talking about twenty to thirty
167
00:09:31.919 --> 00:09:39.879
billion humanoid robots. And you know, it's not
168
00:09:39.919 --> 00:09:43.159
even clear what money means at that point, or if
169
00:09:43.200 --> 00:09:45.360
there's any meaningful cap on the economy.
170
00:09:46.679 --> 00:09:49.279
I think at that point, assuming that things haven't
171
00:09:49.039 --> 00:09:53.399
gone awry, you know, in the good AI scenario,
172
00:09:54.440 --> 00:09:56.799
I think we won't have universal basic income;
173
00:09:56.840 --> 00:09:58.039
we'll have universal high income.
174
00:10:00.360 --> 00:10:03.080
So do you think five years for my first robot, or...
175
00:10:05.279 --> 00:10:08.720
Well, for Tesla, you know, the Optimus robot really
176
00:10:08.799 --> 00:10:11.679
is, unless somebody's got something secret we don't
177
00:10:11.679 --> 00:10:17.159
know about, the most sophisticated humanoid
178
00:10:17.360 --> 00:10:20.000
robot in the world. It's got a hand that has
179
00:10:20.039 --> 00:10:22.039
twenty two degrees of freedom. It looks and feels like
180
00:10:22.120 --> 00:10:28.080
a human hand. And you know, we're aiming to have
181
00:10:28.159 --> 00:10:30.960
several thousand of those built this year. Initially we'll
182
00:10:31.000 --> 00:10:36.399
test them out at Tesla factories, but then assuming things
183
00:10:36.440 --> 00:10:39.440
go well, we will ten-x that output
184
00:10:39.600 --> 00:10:42.440
next year. So we'll aim to do maybe fifty to
185
00:10:42.519 --> 00:10:45.440
one hundred thousand humanoid robots next year, and then ten-
186
00:10:45.600 --> 00:10:49.080
x that again the following year. So it's like five hundred
187
00:10:49.120 --> 00:10:50.519
thousand robots in three years.
188
00:10:51.960 --> 00:10:52.399
That's a lot.
189
00:10:54.000 --> 00:10:57.240
Yeah. So I guess, well, maybe we should think of
190
00:10:57.360 --> 00:10:59.679
this in terms of Roman legions. How many legions of
191
00:10:59.759 --> 00:11:03.759
robots do we have? Like, a Roman legion
192
00:11:03.639 --> 00:11:04.399
is five thousand.
193
00:11:06.559 --> 00:11:08.200
When will we have a colony on Mars?
194
00:11:09.799 --> 00:11:14.559
Well, I think we'll be able to send the first
195
00:11:16.080 --> 00:11:20.000
uncrewed spacecraft to Mars in two years. So Earth and
196
00:11:20.039 --> 00:11:24.919
Mars synchronize every two years, and so we're at a
197
00:11:24.960 --> 00:11:27.600
synchronous point right now. So the next one will
198
00:11:27.639 --> 00:11:29.720
be roughly two years from now, and then there'll be
199
00:11:29.840 --> 00:11:32.519
two years from then, there will be another one. So for
200
00:11:32.600 --> 00:11:34.200
the first trip, obviously we want to make sure that
201
00:11:34.279 --> 00:11:40.600
we can land Starship without crashing. Like, we need to
202
00:11:40.639 --> 00:11:43.240
prove that we can land Starship on Mars without incrementing
203
00:11:43.320 --> 00:11:49.639
the crater count. And if those land safely, then maybe
204
00:11:49.679 --> 00:11:52.879
on the next trip we would send people and then
205
00:11:52.960 --> 00:11:54.519
hopefully that would grow exponentially.
206
00:11:55.000 --> 00:11:58.960
So eventually there will be thousands of Starships going to Mars.
207
00:11:59.480 --> 00:12:02.960
And it might have this really cool visual, like Battlestar
208
00:12:02.559 --> 00:12:04.159
Galactica or something, you know, the sort of
209
00:12:04.600 --> 00:12:08.759
colony ships departing all together, with these bright points of
210
00:12:08.799 --> 00:12:12.840
light in space. I think it would look really cool. But
211
00:12:13.919 --> 00:12:15.720
I think the goal has to be to get to
212
00:12:15.759 --> 00:12:19.639
the point where Mars is self-sustaining. The point
213
00:12:19.639 --> 00:12:23.159
at which Mars is self-sustaining is really defined
214
00:12:23.200 --> 00:12:26.159
as the point at which, if the resupply ships from Earth
215
00:12:26.200 --> 00:12:28.879
stopped coming for any reason, Mars doesn't die out,
216
00:12:29.279 --> 00:12:32.039
that Mars can continue to grow. So if there's something
217
00:12:32.039 --> 00:12:33.720
that happens on Earth, like let's say there's World War
218
00:12:33.759 --> 00:12:37.279
III or some natural disaster or who knows what, but
219
00:12:37.399 --> 00:12:41.559
for whatever reason the resupply ships stopped coming, if Mars
220
00:12:41.600 --> 00:12:44.080
can still continue to survive, then,
221
00:12:46.399 --> 00:12:50.200
you know, the probable lifespan of civilization is dramatically greater.
222
00:12:51.559 --> 00:12:54.000
So you know, if you sort of stand back and say,
223
00:12:54.000 --> 00:12:56.879
how would you evaluate any civilization, you'd say, like, well,
224
00:12:56.919 --> 00:12:59.679
is that civilization still stuck on its own planet, or is it
225
00:13:00.159 --> 00:13:04.399
a multi-planet civilization? And we don't want to be one
226
00:13:04.399 --> 00:13:06.039
of those lame one planet civilizations.
227
00:13:06.279 --> 00:13:09.000
Okay, we've got to have a respectable outcome here.
228
00:13:10.080 --> 00:13:11.840
Even if we don't make it beyond our Solar system,
229
00:13:11.879 --> 00:13:13.519
we've at least got to get to another planet.
230
00:13:14.279 --> 00:13:18.799
Yeah. Well, and finally on my list, brain-to-technology communication.
231
00:13:19.480 --> 00:13:22.000
Yeah. Brain-to-technology communication: am I going to
232
00:13:22.039 --> 00:13:24.519
see that also? Because right now this is looking pretty
233
00:13:24.559 --> 00:13:24.919
good for me.
234
00:13:26.480 --> 00:13:29.279
Yeah. So we've got Neuralink.
235
00:13:29.360 --> 00:13:32.440
We've now got three patients, three humans with
236
00:13:32.600 --> 00:13:37.840
Neuralinks implanted and now working well. And we've upgraded
237
00:13:37.879 --> 00:13:42.480
the devices. The devices will have more electrodes,
238
00:13:42.519 --> 00:13:48.519
basically higher bandwidth, longer battery life and everything. And so
239
00:13:48.679 --> 00:13:52.840
we expect to you know, hopefully do twenty or thirty
240
00:13:53.799 --> 00:13:57.039
patients next year or this year, I should say, with
241
00:13:57.200 --> 00:14:02.080
the upgraded Neuralink devices. And so our first product is,
242
00:14:02.360 --> 00:14:04.960
we're trying to enable people who have lost their brain
243
00:14:05.039 --> 00:14:10.679
body connection, so they're tetraplegic or paraplegic, or, uh,
244
00:14:12.159 --> 00:14:15.679
basically, you can imagine, say, Stephen Hawking. If
245
00:14:15.679 --> 00:14:19.399
Stephen Hawking could communicate as fast or even faster than
246
00:14:19.440 --> 00:14:23.000
a normal human, that would be transformational. Yeah. So
247
00:14:23.120 --> 00:14:28.440
that's sort of our first product: being able to
248
00:14:28.519 --> 00:14:31.440
read the motor cortex of the brain, so that
249
00:14:31.559 --> 00:14:34.440
if you think about moving your hand, it will move
250
00:14:34.519 --> 00:14:38.000
the cursor on the screen. And it enables people
251
00:14:38.080 --> 00:14:40.240
to control their computer or their phone just by thinking.
252
00:14:41.360 --> 00:14:44.240
And then our next product will be Blindsight, so that
253
00:14:44.320 --> 00:14:47.000
even if somebody has lost both eyes or has lost
254
00:14:47.039 --> 00:14:50.120
the optic nerve, or if they've been blind
255
00:14:50.159 --> 00:14:53.679
from birth, we can interface directly with the visual cortex
256
00:14:53.720 --> 00:14:55.840
in the brain and enable them to see.
257
00:14:57.279 --> 00:14:59.320
And we already have that working in monkeys.
258
00:15:00.879 --> 00:15:02.840
We actually have a monkey who's now had that for
259
00:15:02.960 --> 00:15:07.960
I think two years. And so, basically,
260
00:15:08.039 --> 00:15:16.360
enabling people to control devices. And ultimately we think, if
261
00:15:16.399 --> 00:15:19.919
you have a second Neuralink device that is past the
262
00:15:20.000 --> 00:15:24.000
point where the spinal damage occurred, we can actually transmit
263
00:15:24.080 --> 00:15:27.639
the signals from the brain past where essentially the
264
00:15:27.679 --> 00:15:28.080
wiring is
265
00:15:28.080 --> 00:15:30.279
broken, and enable someone to walk again.
266
00:15:31.679 --> 00:15:36.200
So that would really be profound, obviously, but I'm
267
00:15:36.200 --> 00:15:40.399
confident that that is physically possible. And then the long
268
00:15:40.480 --> 00:15:42.559
term goal of Neuralink is to be able to
269
00:15:42.639 --> 00:15:48.639
improve the bandwidth. So right now, when we're speaking,
270
00:15:48.879 --> 00:15:51.679
our bandwidth in bits per second is quite low, and the
271
00:15:51.759 --> 00:15:54.519
sustained bandwidth of a human is less than one bit
272
00:15:54.639 --> 00:15:58.159
per second over a twenty-four-hour period. So there are eighty-
273
00:15:58.159 --> 00:16:01.240
six thousand four hundred seconds in a day, and the average
274
00:16:01.279 --> 00:16:02.879
human outputs
275
00:16:02.519 --> 00:16:05.080
much less than eighty-six thousand four hundred bits a day.
276
00:16:06.679 --> 00:16:08.919
If someone's a writer, they might well exceed that, but
277
00:16:09.039 --> 00:16:12.279
most people do not output more than the number of
278
00:16:12.320 --> 00:16:16.799
seconds in the day. And with a Neuralink, you
279
00:16:17.080 --> 00:16:22.000
could increase that output capability by a thousand or maybe
280
00:16:22.000 --> 00:16:27.360
a million. So it would be a profoundly different experience that
281
00:16:27.440 --> 00:16:28.639
could be superhuman, essentially.
282
00:16:29.559 --> 00:16:31.759
Well, put me down for all of this so far,
283
00:16:33.159 --> 00:16:34.200
as an early adopter.
284
00:16:34.519 --> 00:16:37.519
Yeah, and trust me, you'll really like the chip,
285
00:16:37.879 --> 00:16:39.159
I can guarantee
286
00:16:38.679 --> 00:16:43.559
it. And let's kind of bring us down to earth
287
00:16:43.679 --> 00:16:46.240
for a question on DOGE.
288
00:16:46.679 --> 00:16:49.559
Yeah, I mean I worked very closely actually with President
289
00:16:49.559 --> 00:16:53.399
Clinton in the nineties, where we did have Reinventing Government.
290
00:16:53.519 --> 00:16:55.600
We did balance the budget in two years.
291
00:16:55.679 --> 00:16:58.279
Actually, that's awesome. Oh those were the days.
292
00:17:00.279 --> 00:17:05.119
Well, it didn't last very long because it got blown
293
00:17:05.200 --> 00:17:08.759
up very quickly. Have you identified some cuts that you're
294
00:17:08.759 --> 00:17:10.960
really looking at, that you think will
295
00:17:11.000 --> 00:17:13.279
be successful? Do you think the two trillion is
296
00:17:13.319 --> 00:17:15.720
a realistic number now that you're looking more closely at it?
297
00:17:16.759 --> 00:17:23.920
Yeah. Well, I think we'll try for
298
00:17:24.000 --> 00:17:28.000
two trillion. I think that's like the best case outcome.
299
00:17:28.960 --> 00:17:31.640
But I do think that you kind of have to
300
00:17:31.680 --> 00:17:33.640
have some overage. I think if we try for two trillion,
301
00:17:33.680 --> 00:17:37.559
we've got a good shot at getting one. And if
302
00:17:38.079 --> 00:17:40.880
we can drop the budget deficit from two trillion
303
00:17:40.960 --> 00:17:42.319
to one trillion and
304
00:17:43.960 --> 00:17:45.640
kind of free up the economy
305
00:17:45.240 --> 00:17:50.160
to, you know, have additional growth such that the output
306
00:17:50.200 --> 00:17:54.000
of goods and services keeps pace with the increase
307
00:17:54.000 --> 00:17:56.119
in the money supply, then there will be no inflation.
308
00:17:57.920 --> 00:18:00.839
So that, I think, would be an epic outcome.
309
00:18:03.079 --> 00:18:06.119
And in terms of saving money in the government, well,
310
00:18:06.720 --> 00:18:09.599
as you know, it's a
311
00:18:09.680 --> 00:18:11.799
very target-rich environment for saving money. Like, if you
312
00:18:12.000 --> 00:18:14.720
look in any direction, it was like,
313
00:18:14.759 --> 00:18:16.400
where will you find places to save money? I'm like,
314
00:18:16.519 --> 00:18:18.480
it's like being in a room full of targets. Like,
315
00:18:18.599 --> 00:18:22.839
you could close your eyes and you can't miss. So
316
00:18:24.000 --> 00:18:27.880
there's just a lot of waste in government because, especially in
317
00:18:27.920 --> 00:18:30.119
the federal government, you've just got a situation where
318
00:18:30.240 --> 00:18:32.519
the checks never bounce. Like, they've got the infinite
319
00:18:32.559 --> 00:18:38.160
money computer. And, uh, then the people
320
00:18:38.240 --> 00:18:40.240
that spend the money are spending money that's
321
00:18:40.279 --> 00:18:42.319
not their money, you know. It's very, very
322
00:18:42.359 --> 00:18:45.440
hard for people to care about spending someone else's money.
323
00:18:47.400 --> 00:18:50.200
And then, you know, actually I
324
00:18:50.279 --> 00:18:52.480
know people in the government who do care,
325
00:18:52.839 --> 00:18:55.039
just as a matter of principle, about spending money effectively, and
326
00:18:55.079 --> 00:18:58.160
they try to do so and they can't. The system
327
00:18:58.240 --> 00:19:00.920
prevents them from doing so. And they even
328
00:19:00.960 --> 00:19:04.440
get told to do crazy things, which probably sounds familiar,
329
00:19:04.759 --> 00:19:06.720
where you get towards the end of the budget cycle
330
00:19:06.799 --> 00:19:09.359
and they're told to spend up to their
331
00:19:09.400 --> 00:19:13.400
budget, even on nonsense stuff, because if they
332
00:19:13.440 --> 00:19:17.119
don't spend their budget, the budget gets reduced. So there's
333
00:19:17.160 --> 00:19:20.480
actually sort of a perverse incentive to waste money,
334
00:19:21.680 --> 00:19:23.400
and then they kind of get punished for
335
00:19:23.519 --> 00:19:25.559
not wasting money. So it's totally bananas.
336
00:19:28.039 --> 00:19:29.599
Well, I agree, you will find it.
337
00:19:30.440 --> 00:19:33.559
You know, I did a mathematical analysis in terms of
338
00:19:33.759 --> 00:19:36.119
how government used to do things. So if you take
339
00:19:36.200 --> 00:19:39.960
the Brooklyn Bridge or the Lincoln Tunnel and you adjust for inflation,
340
00:19:40.799 --> 00:19:44.519
the infrastructure bill should actually get you four thousand
341
00:19:44.640 --> 00:19:48.960
Brooklyn Bridges or Lincoln Tunnels, which of course it does not,
342
00:19:49.200 --> 00:19:52.039
because government isn't what it used to be, right?
343
00:19:52.880 --> 00:19:57.039
No, exactly. Essentially, we've had an accumulation of
344
00:19:57.160 --> 00:20:01.880
laws and regulations that make basically any large project essentially
345
00:20:01.920 --> 00:20:07.319
illegal. And, uh, even if you
346
00:20:07.359 --> 00:20:09.119
try to do it, you'd have to spend
347
00:20:09.160 --> 00:20:11.240
way more money on the paperwork than on the thing itself.
348
00:20:13.720 --> 00:20:17.200
So then it gets delayed. And so
349
00:20:17.279 --> 00:20:19.319
there's an element of DOGE, which is
350
00:20:19.400 --> 00:20:24.039
very important, which is looking at regulations and getting
351
00:20:24.119 --> 00:20:28.599
rid of ones where the harm is
352
00:20:28.759 --> 00:20:30.440
worse than the good. Like, you say, well, any
353
00:20:30.480 --> 00:20:31.920
given regulation, okay, there's some
354
00:20:31.920 --> 00:20:33.799
amount of good, some amount of harm, but you know,
355
00:20:34.279 --> 00:20:35.319
what's that ratio? Is it,
356
00:20:35.480 --> 00:20:38.000
like, you know... And there are a lot of regulations that are
357
00:20:38.319 --> 00:20:43.079
frankly just completely nonsensical. And we want to get
358
00:20:43.119 --> 00:20:45.680
rid of nonsensical regulations that do not serve the public good.
359
00:20:46.079 --> 00:20:49.599
Well, Linda Yaccarino in her keynote here mentioned the DOGE thing,
360
00:20:49.680 --> 00:20:51.359
and she got enormous applause.
361
00:20:51.880 --> 00:20:54.799
So I think the country is really waiting to see
362
00:20:55.200 --> 00:20:58.400
this effort. They're behind it, they're optimistic.
363
00:20:58.960 --> 00:21:01.640
Let me try to get in one more topic here
364
00:21:01.759 --> 00:21:04.119
before I get one or two other questions out there,
365
00:21:04.160 --> 00:21:08.640
which is, obviously, Mark Zuckerberg made an amazing one-
366
00:21:09.640 --> 00:21:10.799
eighty-degree turn.
367
00:21:11.559 --> 00:21:12.759
Yeah cool.
368
00:21:14.880 --> 00:21:17.759
What's your reaction to what he did and his acknowledgment
369
00:21:17.880 --> 00:21:21.400
frankly that the government was in fact censoring things, or
370
00:21:21.480 --> 00:21:24.680
he was censoring things, or the government, or some combination thereof?
371
00:21:25.599 --> 00:21:27.359
Yeah, I mean, there's no question. I mean,
372
00:21:27.799 --> 00:21:30.799
one hundred percent the government was censoring things. We know
373
00:21:30.880 --> 00:21:34.480
that for a fact from the Twitter Files. I
374
00:21:34.519 --> 00:21:38.119
mean, some of the stuff was, like, pretty illegal, frankly.
375
00:21:39.000 --> 00:21:42.079
I mean, the FBI had this portal into Twitter where
376
00:21:42.119 --> 00:21:45.920
they could spy on anything and censor anything, and it
377
00:21:46.079 --> 00:21:47.480
had a two-week auto-delete.
378
00:21:47.559 --> 00:21:48.839
So we don't even know what they did.
379
00:21:50.960 --> 00:21:53.200
Except that they had immense power to do whatever they wanted,
380
00:21:53.200 --> 00:21:54.640
which doesn't sound legal.
381
00:21:56.079 --> 00:21:59.440
And you know, that sounds pretty crazy.
382
00:21:59.599 --> 00:22:02.880
And you know, there was also a lot of self-censoring,
383
00:22:04.079 --> 00:22:06.200
and I don't know, there was just a lot of
384
00:22:06.240 --> 00:22:10.880
censoring going on. And I feel very
385
00:22:10.880 --> 00:22:13.720
strongly that you have to have freedom of speech
386
00:22:13.920 --> 00:22:17.119
to have a functioning democracy. You know, if you don't
387
00:22:17.200 --> 00:22:22.400
have freedom of speech, freedom of expression, then how do you
388
00:22:22.440 --> 00:22:24.680
know what's really going on? And if you can't
389
00:22:24.720 --> 00:22:27.640
make an informed vote, then you don't
390
00:22:27.640 --> 00:22:31.400
have a real democracy. So it's, you know, just
391
00:22:31.440 --> 00:22:35.480
incredibly important to, I think, listen to the wisdom of
392
00:22:35.519 --> 00:22:37.480
the founders of the country and say, why did
393
00:22:37.480 --> 00:22:40.839
they make that the First Amendment? You know,
394
00:22:41.039 --> 00:22:42.680
they did it for a reason. It's because
395
00:22:42.680 --> 00:22:46.200
they came from places where, uh, there was
396
00:22:46.279 --> 00:22:50.799
massive censorship and the penalties for speaking your mind would be,
397
00:22:51.839 --> 00:22:54.640
you know, fines, imprisonment, or death. And they're like, we
398
00:22:54.759 --> 00:22:56.519
really don't like that. We want that to not be
399
00:22:56.640 --> 00:23:00.440
the case in America. So then, you know, freedom of speech.
400
00:23:01.240 --> 00:23:06.680
So you've just posted, you are the media.
401
00:23:07.119 --> 00:23:11.319
Citizen journalism, I mean, is really becoming more and more important
402
00:23:11.599 --> 00:23:12.440
in the state of media.
403
00:23:12.480 --> 00:23:14.519
And you see it with the wildfires here.
404
00:23:14.640 --> 00:23:15.240
Yeah, exactly.
405
00:23:15.720 --> 00:23:19.119
It really takes the citizens to report and tell people.
406
00:23:19.200 --> 00:23:24.400
Yes, yeah. Actually, if you think of it, before there
407
00:23:24.480 --> 00:23:26.240
was the Internet, you kind of had to have
408
00:23:27.799 --> 00:23:30.119
what I call legacy media, which is,
409
00:23:30.599 --> 00:23:34.799
you have to have some aggregation points where, you know,
410
00:23:35.119 --> 00:23:37.039
you have reporters go out and find
411
00:23:37.079 --> 00:23:38.519
things out. Then they would go
412
00:23:38.640 --> 00:23:42.359
to their office, they would write up articles.
413
00:23:42.799 --> 00:23:45.839
They would then print those articles on paper. That paper
414
00:23:45.880 --> 00:23:48.839
would then be distributed, and it was kind of
415
00:23:48.920 --> 00:23:50.480
the only way to know what was
416
00:23:50.519 --> 00:23:53.240
going on. But it was very slow, especially in the
417
00:23:53.279 --> 00:23:53.720
old days.
418
00:23:54.720 --> 00:23:54.839
You know.
419
00:23:54.880 --> 00:23:55.599
I think, like, when
420
00:23:56.960 --> 00:23:59.440
Lincoln was assassinated, I think it
421
00:23:59.480 --> 00:24:02.640
took like three weeks for the news to reach
422
00:24:02.720 --> 00:24:03.599
Asia or something like that.
423
00:24:03.880 --> 00:24:04.079
You know.
424
00:24:05.519 --> 00:24:07.799
And in fact, in the old days, like you wouldn't
425
00:24:07.799 --> 00:24:10.680
even know that your country had gone to war because
426
00:24:11.079 --> 00:24:13.119
it would take like a month for the fact that, hey,
427
00:24:13.160 --> 00:24:15.160
we're at war, to reach your village.
428
00:24:16.400 --> 00:24:21.240
We learned about Pearl Harbor because we decoded the Japanese.
429
00:24:21.720 --> 00:24:26.920
I didn't know that. We decoded their cipher system, right?
430
00:24:27.200 --> 00:24:30.160
I mean, that's how we actually learned about it. Yeah.
431
00:24:31.039 --> 00:24:33.160
And, you know, we have just... if I can
432
00:24:33.480 --> 00:24:35.400
squeeze in, I just got three minutes here.
433
00:24:35.759 --> 00:24:38.279
It's fine if you go a little longer if you want. Okay,
434
00:24:39.440 --> 00:24:39.759
Thank you.
435
00:24:40.039 --> 00:24:44.039
I, uh... so, I just got three questions in from
436
00:24:44.079 --> 00:24:47.599
the audience. Dan Gardner and Toby Daniels of On Discourse
437
00:24:47.640 --> 00:24:50.240
want to know, do you think the Internet sucks?
438
00:24:50.880 --> 00:24:51.920
And what do you think we need to.
439
00:24:53.440 --> 00:24:56.119
I mean, the content on the Internet or your Internet connection,
440
00:24:56.200 --> 00:24:58.559
because Starlink can help you on the Internet connection.
441
00:24:59.240 --> 00:25:04.759
Starlink is very good for Internet connectivity. You know, especially
442
00:25:04.880 --> 00:25:07.920
for, like, places that have bad connectivity. I think
443
00:25:07.960 --> 00:25:11.400
Starlink is really having a significant effect in terms
444
00:25:11.440 --> 00:25:14.599
of lifting people out of poverty. In fact, in many
445
00:25:14.640 --> 00:25:18.039
parts of the world, people have a product that
446
00:25:18.039 --> 00:25:19.599
they want to sell but if they don't have an
447
00:25:19.640 --> 00:25:23.680
Internet connection, they can't do it, or if they want
448
00:25:23.720 --> 00:25:26.400
to learn things like basically you can learn anything on
449
00:25:26.440 --> 00:25:28.359
the Internet for free. Like, MIT has all these
450
00:25:28.440 --> 00:25:31.039
lectures that are available for free on the Internet, for which
451
00:25:31.039 --> 00:25:33.720
you need an Internet connection. And so once you
452
00:25:33.759 --> 00:25:35.920
have an Internet connection, you've got access to education,
453
00:25:36.000 --> 00:25:39.039
you've got access to a global market. So I think
454
00:25:39.119 --> 00:25:43.759
it's very significant how connectivity makes a difference in people's lives.
455
00:25:44.200 --> 00:25:47.160
You know, with respect to... maybe the question was
456
00:25:47.359 --> 00:25:49.519
geared at, like, is there too much negativity on the Internet?
457
00:25:51.200 --> 00:25:53.279
I think at times there is too
458
00:25:53.359 --> 00:25:55.640
much negativity, and,
459
00:25:57.160 --> 00:25:59.720
you know, actually on the X platform, I proposed, like, well,
460
00:26:00.279 --> 00:26:02.799
we were going to tweak the algorithm to be a
461
00:26:02.839 --> 00:26:04.920
bit more positive, and then people got upset at
462
00:26:04.960 --> 00:26:05.880
me for that. I'm like,
463
00:26:07.400 --> 00:26:09.160
okay, what do you guys want? You know?
464
00:26:11.039 --> 00:26:14.240
Well, actually, the second question, from Zac Moffatt of Targeted
465
00:26:14.319 --> 00:26:17.720
Victory, was how do you make pessimism uncool again? So
466
00:26:17.839 --> 00:26:22.039
I think it's maybe exactly what you're saying, which is that,
467
00:26:22.839 --> 00:26:25.960
you know, people have become afraid. You know,
468
00:26:26.079 --> 00:26:28.880
this used to be a can do nation. Yeah, always
469
00:26:29.000 --> 00:26:32.160
very positive. And now, when I ask the country
470
00:26:32.240 --> 00:26:33.640
if we're on the right track or the wrong track,
471
00:26:33.680 --> 00:26:35.720
they never say that we're on the right track anymore.
472
00:26:36.160 --> 00:26:40.960
Okay, well, hopefully, you know, hopefully with this...
473
00:26:41.480 --> 00:26:45.720
You know, I'm actually pretty optimistic about the next four
474
00:26:45.799 --> 00:26:47.920
or five years. I actually think
475
00:26:47.960 --> 00:26:52.440
we have the potential for a golden age. So we
476
00:26:52.720 --> 00:26:54.720
need to... it's very important to get rid of
477
00:26:55.000 --> 00:26:59.359
the mountain of regulations that are holding things back.
478
00:26:59.599 --> 00:27:01.640
And, I mean, there are some good regulations, but
479
00:27:02.039 --> 00:27:04.279
there's just so much that we just can't get anything done.
480
00:27:04.599 --> 00:27:05.480
I mean, you take, sort of,
481
00:27:05.480 --> 00:27:09.759
the California wildfires, for example. We really need to have
482
00:27:09.880 --> 00:27:12.440
firebreaks, and we need to clear the brush back
483
00:27:12.480 --> 00:27:16.480
away from houses, and, uh, we need to make
484
00:27:16.519 --> 00:27:18.559
sure the reservoirs are full. These are all kind of
485
00:27:18.599 --> 00:27:21.559
obvious things. But due to a bunch of
486
00:27:21.759 --> 00:27:25.680
environmental rulings, you can't actually do that in California. So
487
00:27:26.000 --> 00:27:27.680
they're not allowed to do the firebreaks, and they're
488
00:27:27.680 --> 00:27:30.000
not allowed to push the brush back away from houses
489
00:27:31.039 --> 00:27:35.440
because it might hurt some, like, red-legged
490
00:27:35.480 --> 00:27:36.920
frog or something like that. You know, it's like some
491
00:27:37.039 --> 00:27:40.000
sort of creature, usually a creature you've never heard of,
492
00:27:40.160 --> 00:27:43.880
that is preventing this from occurring. You know, there's
493
00:27:44.039 --> 00:27:46.400
this fish called the smelt, for example,
494
00:27:47.839 --> 00:27:50.680
and so we have far more fresh water
495
00:27:50.759 --> 00:27:53.640
running off into the ocean than we really should,
496
00:27:54.000 --> 00:27:57.000
on the theory that it helps this one little
497
00:27:57.039 --> 00:28:01.559
fish that likes a slightly briny freshwater-saltwater mix,
498
00:28:02.200 --> 00:28:04.559
and if we keep more
499
00:28:04.640 --> 00:28:09.039
fresh water, then the smelt fish will not be happy.
500
00:28:09.799 --> 00:28:12.319
But there's no actual evidence that the smelt fish
501
00:28:12.400 --> 00:28:16.000
is going to be unhappy if we keep
502
00:28:16.039 --> 00:28:16.920
a bit more fresh water.
503
00:28:17.160 --> 00:28:19.920
In fact, we should
504
00:28:19.680 --> 00:28:22.319
keep more fresh water, keep the reservoirs full, and
505
00:28:22.519 --> 00:28:25.200
just have some sensible firebreaks and move the
506
00:28:25.240 --> 00:28:28.279
brush away from houses. That's just an example of, like,
507
00:28:29.079 --> 00:28:31.279
we'd have saved a lot of trouble and a lot of
508
00:28:31.359 --> 00:28:33.680
tragedy in LA if we'd done that.
509
00:28:35.640 --> 00:28:39.359
So, you know, and, yeah, I think
510
00:28:40.920 --> 00:28:43.400
AI and robotics are going to lead to
511
00:28:43.640 --> 00:28:46.319
a higher standard of living for people beyond what
512
00:28:46.359 --> 00:28:52.000
they can imagine. So AI doctors and medicine that are
513
00:28:52.160 --> 00:28:52.920
pretty incredible.
514
00:28:53.079 --> 00:28:56.279
So that's the final question, which is
515
00:28:56.400 --> 00:28:57.160
from, uh,
516
00:28:59.000 --> 00:29:04.279
Catherine Heritage, I got that right? Uh, if
517
00:29:04.359 --> 00:29:08.440
all the robots and everything free up time for humans,
518
00:29:09.039 --> 00:29:11.279
what is it they will do with that time? Or
519
00:29:11.319 --> 00:29:13.240
what can we ask them to do with that time?
520
00:29:13.680 --> 00:29:17.359
And I think that, yes, rounds out the circle
521
00:29:17.559 --> 00:29:18.359
of technology.
522
00:29:19.359 --> 00:29:24.000
Yeah, I guess it would be a bit like being retired. Yeah,
523
00:29:26.079 --> 00:29:28.680
I mean, this will take a few years,
524
00:29:28.720 --> 00:29:32.000
but at some point, as AI
525
00:29:32.400 --> 00:29:35.119
and robotics get better, eventually AI will
526
00:29:35.160 --> 00:29:37.039
be able to do everything that humans can do. So
527
00:29:37.720 --> 00:29:40.079
any task you do will be optional; it'll be
528
00:29:40.240 --> 00:29:47.440
like a hobby, you know. So then it's just a
529
00:29:47.559 --> 00:29:50.559
big question, like, well, will our lives have meaning
530
00:29:50.920 --> 00:29:54.680
if the computers and the robots can do everything better
531
00:29:54.720 --> 00:29:55.240
than we can?
532
00:29:56.039 --> 00:29:56.119
And.
533
00:29:59.079 --> 00:30:00.599
That is a real question.
534
00:30:00.680 --> 00:30:03.240
I do wonder about that myself, and maybe
535
00:30:03.279 --> 00:30:04.640
that's why we need the Neuralink, so we can
536
00:30:04.759 --> 00:30:07.240
enhance human capability so we can keep up.
537
00:30:10.640 --> 00:30:10.839
Yeah.
538
00:30:11.640 --> 00:30:14.720
Well, I just want to thank you for what is
539
00:30:14.839 --> 00:30:18.799
obviously an incredibly optimistic view about technology and where it's going.
540
00:30:19.839 --> 00:30:24.039
I feel reassured that the kind of leaps that
541
00:30:24.119 --> 00:30:26.680
are going to be made, particularly what you're working on,
542
00:30:27.000 --> 00:30:29.640
are just incredible in the next few years, not
543
00:30:29.799 --> 00:30:33.720
decades away, but really very close. You talked in terms
544
00:30:33.759 --> 00:30:38.200
of years consistently, and I'm very thankful for that. And, uh,
545
00:30:38.519 --> 00:30:42.240
I think that's, you know, an absolutely tremendous message
546
00:30:42.279 --> 00:30:44.640
here, you know, where we are in Las
547
00:30:44.759 --> 00:30:48.039
Vegas, kind of studying what technology is going to
548
00:30:48.079 --> 00:30:51.880
be available to people. And, uh, really, again, thank
549
00:30:51.920 --> 00:30:54.200
you for the tremendous role in opening up free speech,
550
00:30:54.960 --> 00:30:57.319
and for the role that X and, you know,
551
00:30:57.799 --> 00:30:59.240
Linda is playing in terms of that.
552
00:30:59.440 --> 00:31:02.640
I know that we're working to get full recognition
553
00:31:02.799 --> 00:31:06.000
by everyone of the platform as it should be,
554
00:31:06.119 --> 00:31:08.960
because free speech is, I think, the
555
00:31:09.039 --> 00:31:09.720
proper way to go.
556
00:31:10.200 --> 00:31:12.440
Yeah, I think on balance it is good.
557
00:31:13.599 --> 00:31:17.000
And you know, one of the ways I think of it conceptually,
558
00:31:17.039 --> 00:31:20.200
with the X platform, is it's like a global consciousness.
559
00:31:20.240 --> 00:31:23.799
It's like the collective consciousness of humanity. Now, if you
560
00:31:23.880 --> 00:31:25.880
have a collective consciousness of humanity, well you're going to
561
00:31:25.920 --> 00:31:29.519
get every aspect of humanity, good and bad. That's just
562
00:31:29.640 --> 00:31:36.519
naturally what happens. But I, you know, I do
563
00:31:36.640 --> 00:31:38.559
want it to be a good and productive thing. And
564
00:31:39.279 --> 00:31:44.160
you know, the aspiration is to maximize unregretted user minutes.
565
00:31:44.680 --> 00:31:47.559
So, like, if you spend time on the platform, it's like, well,
566
00:31:47.720 --> 00:31:48.440
did you regret it
567
00:31:48.519 --> 00:31:51.519
or not regret it? And we want to maximize unregretted
568
00:31:51.640 --> 00:31:52.240
user time.
569
00:31:53.559 --> 00:31:55.319
Well, and I always go back, you know, I always
570
00:31:55.359 --> 00:31:58.400
say that everything in technology was either in the Jetsons,
571
00:32:00.279 --> 00:32:04.160
like, you know, it's either in Star Trek or
572
00:32:04.200 --> 00:32:05.720
the Jetsons, more or less, right?
573
00:32:06.759 --> 00:32:09.319
But the one thing that was never predicted in any
574
00:32:09.359 --> 00:32:13.119
book that I could find is social media. So uh,
575
00:32:14.039 --> 00:32:16.960
it's the one thing where there really is no book
576
00:32:17.079 --> 00:32:20.359
built around how social media would develop, how it would
577
00:32:20.400 --> 00:32:24.680
really impact society, how it would move political movements and so forth.
578
00:32:25.079 --> 00:32:27.680
And it's interesting that everybody missed that in
579
00:32:27.759 --> 00:32:31.160
their projections. And so I guess the last, closing
580
00:32:31.200 --> 00:32:32.039
question, maybe,
581
00:32:31.920 --> 00:32:34.039
is, what do you want X to be and
582
00:32:34.119 --> 00:32:35.519
where is that platform going to go?
583
00:32:36.359 --> 00:32:41.119
And I think that... well, I mean, I do want
584
00:32:41.240 --> 00:32:55.079
X to be a force for good. Exactly that. So I do want
585
00:32:55.200 --> 00:32:57.680
X to be a force for good. And I do view
586
00:32:57.720 --> 00:33:00.559
it as sort of like the group mind of humanity,
587
00:33:01.200 --> 00:33:04.279
and we want to have a sort of a healthy, happy,
588
00:33:04.359 --> 00:33:14.359
sane group mind versus the opposite. I mean, yeah. So,
589
00:33:14.720 --> 00:33:17.359
and I want it to be just the
590
00:33:17.400 --> 00:33:19.799
best source of truth, like if you're trying to understand
591
00:33:19.880 --> 00:33:22.640
what's going on in the world, that it has the
592
00:33:22.720 --> 00:33:27.920
most accurate, most up-to-date information about
593
00:33:28.200 --> 00:33:33.799
anything large or small. So it gives you the
594
00:33:33.839 --> 00:33:38.759
best understanding of what's going on in the world, you know, anywhere, anytime.
595
00:33:40.480 --> 00:33:44.279
Yeah. Perfect. Thank you. Any closing thought you want
596
00:33:44.279 --> 00:33:46.039
to leave us with, otherwise...
597
00:33:46.200 --> 00:33:49.480
Thank you for being so generous with your time. You're welcome.
598
00:33:49.559 --> 00:33:52.680
I think I would encourage people to be optimistic about
599
00:33:52.680 --> 00:33:54.759
the future. I think it is much more likely to
600
00:33:54.839 --> 00:33:59.720
be good than bad. So that's my prediction. Thank you,
601
00:34:08.840 --> 00:34:09.719
All right, thanks guys.