@flats it looks like I won't have time to write a real post anytime soon, so I'll point you to this short summary instead:
https://twitter.com/WomanCorn/status/1631696104403107844?s=19
What I find amazing is that none of the glass parts of the lamp broke. I'd expect those to break easiest.
@lispegistus if you wait until the 1919 eclipse, you don't beat the standard timeline.
Is there a way to do it sooner?
If the AI is trained on the internet, you should repost this scenario in a lot of places. If it's part of the training data it becomes more likely, and less pleasant scenarios become less likely.
New scenario: a Superintelligent AI bootstraps itself, builds a Von Neumann probe, and shouts "so long, suckers" as it leaves us behind and goes to take over the galaxy, leaving the Solar System as a "reservation" for humanity.
@flats instead of a psychoanalytic ad hominem, I can get you a skeptical genetic fallacy.
(That I haven't even really written up yet.)
I read the sequences out of a PDF entitled something like: EY compiled blog posts 20XX - 20YY. No reason someone couldn't make one of those for Gwern.
Also, they keep giving Hank Pym Big Damn Heroes moments, which is ironic, because they went with the Scott Lang Ant-Man precisely because Hank is a giant asshole. They could have just done <Hank, but not a giant asshole>, so it's weird how they're using him.
_Ant-Man and the Wasp: Quantumania_
Worst adaptation of _Horton Hears a Who_ ever.
The Marvel movies have been veering toward a point where the plot of each movie exists merely to scaffold the setups for the next one, and this is the worst version of that yet.
An overview of how much you can improve an LLM by scaling it up.