
@niplav I think that a lot of my discomfort with AI risk arguments stems from carrying through assumptions about sovereign AI into discussions about other kinds.

(Yes, I have heard about instrumental convergence.)

In Britain, it's called a lift, but Americans call it an elevator. I guess we were just raised differently.

Well, fuck. The GPT disinformation age is now.

I googled "OS for 4gb ram" and the first hit, which was also used by Google to populate its snippet, is an answer from Quora that is very obviously created with #chatgpt (I recognised the non-committal non-answer right away, but it can also be detected by a popular GPT detector).

The user has 98 answers and, you've guessed it, they are all created with GPT.
PLOT TWIST: The questions were also created with GPT!

quora.com/profile/Heri-Mulyo-C

#AI

Oh man, this wasn't intended to be a slam on this person.


Singularitarians desire to build a master AI to run the world. They realize that it's really hard to get right, and switch to arguing against it.

Normies don't desire to build a master; they make an AI that they can just not use when it does something wrong.

I see my own posts and I'm like <what is this gibberish?>

In the run-up to the OGL 1.1 debacle, Hasbro had certain other companies sign contracts offering them better rates on the profit-sharing clause.

Do we know who?

Because I think it reveals a lot about the plan if the preferred partners are streamers and not RPG designers.

Yuuko and Mai attempt to determine if Nano is a robot. Mio chides them.

Nakamura, the science teacher, attempts to determine if Nano is a robot. She accidentally drugs herself.

Nano begs Hakase to remove her key, and some of her more ridiculous attachments.


Using Scott Alexander's last name sure is a tell.

Maybe not technically a <dogwhistle>, but it really gives away a lot about the moral behavior of the person who uses it.

I think that the most likely outcome of LLMs is to put a bunch of copywriters out of work, but not to actually affect much elsewhere.

A wordcel apocalypse.

@niplav the autist's special interest is real, the schizo's special interest is fictional.

(And not even an entertaining fiction!)

Fine: Accidentally triggering an autist to tell you all about his special interest.

Not Fine: Accidentally triggering a schizo to tell you all about his "special interest."

Wordcel proclaims shape rotators are an existential risk to humanity.

What if, instead of assuming we had to build an AI singleton as fast as possible (with the risk that it turns out to be bad), we just... didn't?

worldspiritsockpuppet.substack
