24-Hour Click Ranking, Top 10:
- This site automatically shares trending topics from around the web in real time
- Updated in real time, 24 hours a day
- Opinions expressed do not represent this site's position
- Comments and ratings on items are welcome
- Higher-rated and newer items rank closer to the top
230k GPUs, including 30k GB200s, are operational for training Grok in a single supercluster called Colossus 1 (inference is done by our cloud providers). At Colossus 2, the first batch of 550k GB200s & GB300s, also for training, starts going online in a few weeks. As Jensen…
btc (twitter.com) 00:04:34
WATCH: Elon Musk and David Sacks brainstorm Grokipedia live
Elon: "If you take, say Wikipedia as an example, but this really applies to books, PDFs, websites, every form of information." "Grok is using heavy amounts of inference compute to look at, as an example, a Wikipedia…" (a rough sketch of this idea follows below)
btc (twitter.com) 00:00:54
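The clip describes the pipeline only at a high level: pull a source document, then spend LLM inference reviewing it. Below is a minimal sketch of that idea, assuming the review step is a single LLM call. The Wikipedia REST summary endpoint is real, but review_with_llm, its prompt, and everything about the review step are illustrative assumptions; the clip names no API.

```python
import requests


def fetch_wikipedia_summary(title: str) -> str:
    """Fetch the plain-text summary of a Wikipedia article (real endpoint)."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    resp = requests.get(
        url, headers={"User-Agent": "grokipedia-sketch/0.1"}, timeout=10
    )
    resp.raise_for_status()
    return resp.json()["extract"]


def review_with_llm(source_text: str) -> str:
    """Hypothetical inference-heavy review step; no public API is described,
    so the actual model call is left abstract on purpose."""
    prompt = (
        "Review the following article for factual errors, missing context, "
        "and one-sided framing, then produce a corrected version:\n\n"
        + source_text
    )
    # A real implementation would send `prompt` to an LLM endpoint here.
    raise NotImplementedError("LLM call intentionally left abstract")


if __name__ == "__main__":
    # The fetch step runs as-is; the review step is the assumed part.
    article = fetch_wikipedia_summary("Supercomputer")
    print(article[:200])
    # corrected = review_with_llm(article)  # hypothetical review step
```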