Do we have any idea how many TPUs there are?
(A TPU is Google's Tensor Processing Unit, a custom accelerator chip; large clusters of them are what's used to train a modern AI.)
@WomanCorn That's a really good question! I haven't been able to find an answer. The closest I found was a page on the total compute capacity in the world in 2015[1], which it estimated at 2 × 10²⁰ – 1.5 × 10²¹ FLOPS.
[1]: https://aiimpacts.org/global-computing-capacity/
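As a very rough illustration, dividing a global-capacity figure like that by a per-chip throughput gives an order-of-magnitude equivalent chip count. A minimal sketch, assuming a hypothetical round number of ~10¹⁴ FLOPS (100 TFLOPS) per chip, in the ballpark of a modern ML accelerator; both inputs are estimates, so only the exponent is meaningful:

```python
# Back-of-the-envelope: how many accelerator chips would it take to
# match the estimated 2015 global compute capacity?

GLOBAL_CAPACITY_LOW = 2e20     # FLOPS, low end of the AI Impacts estimate [1]
GLOBAL_CAPACITY_HIGH = 1.5e21  # FLOPS, high end of the AI Impacts estimate [1]

# Assumed per-chip peak throughput; a hypothetical round figure,
# not a spec for any particular TPU generation.
PER_CHIP_FLOPS = 1e14

low = GLOBAL_CAPACITY_LOW / PER_CHIP_FLOPS
high = GLOBAL_CAPACITY_HIGH / PER_CHIP_FLOPS
print(f"Equivalent chip count: {low:.1e} to {high:.1e}")
# -> Equivalent chip count: 2.0e+06 to 1.5e+07
```

So under those assumptions, the 2015 world total is equivalent to somewhere in the millions to tens of millions of such chips, which doesn't answer how many TPUs actually exist but does bound the scale.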
@WomanCorn But someone should figure out the answer to your question.