a trace exists in the false self https://twitter.com/StoreyDexter/status/1629217962962874369
"How is it so small and capable of so much? Because it forgets irrelevant details. There is another term for this: abstraction... no evidence will convince them a computer is doing anything but shuffling symbols, because “Concepts” are exclusive to humans"
https://borretti.me/article/and-yet-it-understands
'It just happens to be in the nature of knowledge that it cannot be conserved if it does not grow. Scientists — “knowers” [from the Greek] — need to continuously satisfy their curiosity if what they know is to remain valid and retain its vitality.'
some messy notes on Disco Elysium. Can't remember the last time I wrote about politics, so you can tell they really got to me.
Not wild about the Sophisticated pandering to social reality tho
"[unlike] folks who see the world as... tamed with the right set of facts & rules; I tend to think things are more complex, because humans are more complex... the breakthrough AI product is not substance but style"
Quite fun to see @TetraspaceWest's Shoggoth show up on the king of mainstream rambling business strategy blogs.
https://stratechery.com/2023/from-bing-to-sydney-search-as-distraction-sentient-ai/
The House of Lords once included a communist (hereditary peer, lol).
https://en.wikipedia.org/wiki/Wogan_Philipps,_2nd_Baron_Milford
His maiden speech is supposed to have called for the abolition of the House, but annoyingly it's not recorded in Hansard.
Melancholy about successful technology (of which Ford has a nice nuanced version I'm not taking issue with) is very striking to me:
we don't really want our problems solved, or we only want 'em solved in filmic ways - the big speech, the big march, the bad guys usurped, 0 profit
Ford is "sometimes angry, sometimes ashamed, and often grateful" for the drug,
the sticking points being his years of wasted effort and suffering, and his journalistic unease with technical solutions to human problems (particularly one bucketed as a social problem for so long).
"How long before there’s an injection for your appetites, your vices? A weekly anti-avarice shot? Can Pharma cure your sloth, lust, wrath...? Is this how we fix climate change—injecting harmony instead of hoping for it?"
https://www.wired.com/story/new-drug-switched-off-appetite-mounjaro/
Some interesting work recently under the neologism "epistemic corruption"
"to find hyperparams about twice as fast, start a bunch of networks training and after a while copy the weights of the one improving fastest. repeat"
https://www.deepmind.com/blog/population-based-training-of-neural-networks
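The trick in toy form, a minimal sketch of population-based training: the 1-D quadratic "loss", population size, and perturbation factors are all invented for illustration, standing in for real network training and real hyperparameters.

```python
import random

random.seed(0)

def train_step(w, lr):
    # One gradient step on the toy loss (w - 3)^2; the optimum is w = 3.
    return w - lr * 2 * (w - 3)

def pbt(pop_size=4, steps=200, exploit_every=20):
    # Each member is (weights, learning rate); the LR is the hyperparam.
    pop = [(random.uniform(-5, 5), 10 ** random.uniform(-2, -0.5))
           for _ in range(pop_size)]
    for step in range(1, steps + 1):
        pop = [(train_step(w, lr), lr) for w, lr in pop]
        if step % exploit_every == 0:
            # Exploit: copy the weights (and LR) of the best member;
            # explore: perturb the copied LR (clamped for stability).
            best_w, best_lr = min(pop, key=lambda p: (p[0] - 3) ** 2)
            pop = [(best_w, min(0.4, best_lr * random.choice([0.8, 1.2])))
                   for _ in pop]
    return min(pop, key=lambda p: (p[0] - 3) ** 2)

w, lr = pbt()
print(f"w={w:.3f} lr={lr:.4f}")
```

The "twice as fast" claim is the blog's; the point of the sketch is just the copy-the-winner-then-perturb loop.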
"to reduce resource use by 50%(!), use a large model to do rejection sampling of small models' output"
"to increase performance by 10% absolute, just take the majority-vote answer of several LM answers"
To detect whether text comes from LM X, randomly modify it and get X's logprobs for both the original and the modified version.
If p(original) > p(modified), classify as LM-generated.
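The detection rule as a sketch, extended to a majority over several perturbations. The logprob function is a toy stand-in (string overlap with a fixed "favourite" sentence); a real test would query model X for log p(text).

```python
import random

random.seed(3)

MODEL_FAVOURITE = "the cat sat on the mat"

def model_logprob(text):
    # Toy scorer for "model X": closer to its favourite string = higher.
    overlap = sum(a == b for a, b in zip(text, MODEL_FAVOURITE))
    return overlap - len(MODEL_FAVOURITE)

def perturb(text):
    # Random modification: swap one character for a random letter.
    i = random.randrange(len(text))
    return text[:i] + random.choice("abcdefghij") + text[i + 1:]

def looks_machine_generated(text, n_perturbations=20):
    # If the original sits at a local logprob peak, most perturbations
    # score lower: classify as generated by X.
    orig = model_logprob(text)
    drops = sum(model_logprob(perturb(text)) < orig
                for _ in range(n_perturbations))
    return drops > n_perturbations // 2

print(looks_machine_generated(MODEL_FAVOURITE))
print(looks_machine_generated("a dog ran in the park"))
```

The intuition: a model assigns its own output a local maximum of probability, so random edits almost always hurt; human text has no such peak under the model.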
context maximiser @Arb