Do we have any idea how many TPUs there are?

(A TPU, or Tensor Processing Unit, is Google's custom accelerator chip for AI workloads; thousands of them are networked into the pods/supercomputers used to train modern AI models.)

@WomanCorn That's a really good question! I haven't been able to find an answer. The closest I found was an estimate of the world's total computing capacity in 2015[1], which put it at 2 x 10²⁰ – 1.5 x 10²¹ FLOPS.

[1]: aiimpacts.org/global-computing
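To get a rough sense of scale, here's a back-of-envelope sketch in Python. The per-chip figure (~100 TFLOPS) is my assumption, a round number in the ballpark of a single modern TPU chip's peak throughput, not something from the thread; and since the 2015 estimate doesn't distinguish numeric precisions, the result is only an order-of-magnitude bound on "TPU-chip equivalents," not a count of actual TPUs.

```python
# Back-of-envelope: if all of the world's 2015 compute were TPU chips,
# roughly how many chips would that be? All figures are rough assumptions.

GLOBAL_FLOPS_LOW = 2e20    # low end of the aiimpacts.org 2015 estimate
GLOBAL_FLOPS_HIGH = 1.5e21 # high end of the same estimate

TPU_CHIP_FLOPS = 1e14      # ~100 TFLOPS: an assumed round figure near the
                           # peak throughput of one modern TPU chip

low = GLOBAL_FLOPS_LOW / TPU_CHIP_FLOPS
high = GLOBAL_FLOPS_HIGH / TPU_CHIP_FLOPS
print(f"TPU-chip equivalents: {low:.1e} to {high:.1e}")
# -> roughly 2e6 to 1.5e7 chip-equivalents
```

So even taking the whole world's 2015 compute as a ceiling, the answer would be on the order of millions to tens of millions of chip-equivalents; the actual number of TPUs Google has built is presumably far smaller and, as far as I know, not public.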


@WomanCorn But someone should figure out an answer to your question.
