https://twitter.com/mark_riedl/status/1488331425267040259 okay, but what's *actually* the best alignment paper to date?
Probably https://www.alignmentforum.org/posts/fRsjBseRuvRhMPPE5/an-overview-of-11-proposals-for-building-safe-advanced-ai
If valuing only novel contributions, maybe this: https://arxiv.org/abs/1805.00899
a Schelling point for those who seek one