Do we have any idea how many TPUs there are?
(A TPU is Google's custom chip for accelerating machine learning; clusters of them, called TPU pods, are the supercomputers used to train modern AI.)
@WomanCorn That's a really good question! I haven't been able to find an answer. The closest I found was a page on the total compute capacity in the world in 2015[1], which was 2 x 10²⁰ – 1.5 x 10²¹ FLOPS.
[1]: https://aiimpacts.org/global-computing-capacity/
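To make the comparison concrete, here is a back-of-envelope sketch dividing the AI Impacts global-compute range by an assumed per-cluster throughput. The 1e17 FLOP/s figure for one "pod-scale" cluster is an assumption for illustration (on the order of a TPU v3 pod's peak), not a measured number.

```python
# Back-of-envelope: how many pod-scale clusters could the 2015
# global compute estimate support in principle?
GLOBAL_FLOPS_LOW = 2e20     # AI Impacts lower bound, 2015 (FLOP/s)
GLOBAL_FLOPS_HIGH = 1.5e21  # AI Impacts upper bound, 2015 (FLOP/s)

# Assumed throughput of one large training cluster; roughly the
# order of a TPU v3 pod's peak. This is an illustrative assumption.
POD_FLOPS = 1e17

low = GLOBAL_FLOPS_LOW / POD_FLOPS    # 2,000 pod-equivalents
high = GLOBAL_FLOPS_HIGH / POD_FLOPS  # 15,000 pod-equivalents
print(f"{low:,.0f} to {high:,.0f} pod-equivalents")
```

This only bounds how many such clusters the world's total compute could have contained at once; the actual count of TPU pods is a separate (and unanswered) question.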
@WomanCorn But someone should figure out the answer to your question.