@a_lizard@mastodon.social thanks for the tip! trying this later

a tensor is just an element in the tensor product of vector spaces and their duals, what's the problem?
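
In symbols, the definition the joke leans on: a type-(p, q) tensor over a vector space V is an element

    T \in \underbrace{V \otimes \cdots \otimes V}_{p} \otimes \underbrace{V^* \otimes \cdots \otimes V^*}_{q}

with p contravariant and q covariant slots.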

(Non-joke question) Is there a tensor product of tensor products?
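
For what it's worth, the tensor product is associative up to canonical isomorphism, so a tensor product of tensor products is again just a tensor product:

    (U \otimes V) \otimes W \cong U \otimes (V \otimes W) \cong U \otimes V \otimes W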

@mira Thanks! These seem interesting but not quite what I was asking for :-) (though you probably weren't optimising for that anyway)

@rune Ah, that's what you meant. Seems true 👍

@mira
Makes sense, hadn't connected regularization with simplicity.

(tho I don't think I understand regularization well enough yet to see why it'd result in simplicity).
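
A minimal sketch of the usual intuition, assuming plain L2 (weight-decay) regularization on a linear model; the names (`loss`, `lam`, the learning rate) are made up for illustration:

    import numpy as np

    def loss(w, X, y, lam=0.1):
        # Fit term plus an L2 penalty: among weight vectors that fit the
        # data about equally well, the penalty favours the smaller-norm
        # (smoother, "simpler") one.
        residual = X @ w - y
        return residual @ residual + lam * (w @ w)

    def grad_step(w, X, y, lam=0.1, lr=0.01):
        grad_fit = 2 * X.T @ (X @ w - y)
        grad_penalty = 2 * lam * w  # pulls every weight toward zero
        return w - lr * (grad_fit + grad_penalty)

The penalty gradient shrinks every weight toward zero on each step ("weight decay"), which is one concrete sense in which regularization buys simplicity.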

@rune I think that we mostly can't reach into the resulting buckets of neural nodes and change stuff we don't like.

Strong kudos to Jaron Lanier for foreseeing the current discourse on generative models. My respect has increased.

Calibration of the intellect, optimism of the will

Wait, do neural networks implement a sensible prior?

(like the speed or simplicity prior?)

If yes, which one?
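
For reference, the two candidates named above, in one common informal rendering (K is Kolmogorov complexity, \ell(p) program length, t(p) runtime; the speed-prior line is a rough paraphrase of Schmidhuber's definition, not the exact one):

    P_{\text{simplicity}}(h) \propto 2^{-K(h)}
    P_{\text{speed}}(h) \propto \max_{p \,:\, p \text{ computes } h} 2^{-\ell(p) - \log_2 t(p)}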

Is there another AI paradigm that could result in AGI[1] other than neural networks[2]?

[1]: Please don't ask me to pin down that term.
[2]: Assuming the scaling hypothesis. "Neural network"="Stacked matrix multiplication with non-linearities thrown in".
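
The definition in [2], as a runnable sketch (layer sizes, ReLU, and the random init are arbitrary choices for illustration):

    import numpy as np

    def mlp(x, weights, biases):
        # "Stacked matrix multiplication with non-linearities thrown in."
        for W, b in zip(weights[:-1], biases[:-1]):
            x = np.maximum(W @ x + b, 0.0)  # affine map, then ReLU
        return weights[-1] @ x + biases[-1]  # final affine layer

    rng = np.random.default_rng(0)
    dims = [4, 8, 8, 2]
    weights = [rng.normal(size=(m, n)) for n, m in zip(dims, dims[1:])]
    biases = [rng.normal(size=m) for m in dims[1:]]
    print(mlp(rng.normal(size=4), weights, biases))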

Wikipedia article “List of Largest Snakes”:

»There are eleven living snakes«

Does regularization of RL policies act as an impact measure?
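
A hedged sketch of the shape such a penalty usually takes (`d`, `beta`, and `baseline_state` are made-up names; real proposals like relative reachability or attainable utility preservation use much richer deviation measures than a raw distance):

    def shaped_reward(r_task, state, baseline_state, d, beta=1.0):
        # Impact-penalised reward: the agent pays beta * d(state, baseline)
        # for moving the world away from where it would be under some
        # baseline (e.g. inaction) policy.
        return r_task - beta * d(state, baseline_state)

Policy regularization (entropy bonuses, KL-to-prior penalties) instead penalizes in action space, so whether it tracks state-space impact like the above is exactly the open part of the question.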

Which way modern man

@WomanCorn I wonder how much of an advantage LLMs give over what a search-engine-savvy user could get out of a search engine in 2017.

E.g. ChatGPT lied to me about the linearity of Jacobians, which is exactly the kind of thing I often want to find out (see the sketch below).

Probably still enough of an advantage.
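
For the record, here's the distinction I'd check numerically (the functions, tolerances, and the helper are made up; I can't reconstruct what ChatGPT actually claimed): differentiation is linear in the function, but the Jacobian J_f is generally not linear in x.

    import numpy as np

    def num_jacobian(f, x, eps=1e-6):
        # Central-difference Jacobian of f at x.
        x = np.asarray(x, dtype=float)
        J = np.zeros((len(f(x)), len(x)))
        for i in range(len(x)):
            dx = np.zeros_like(x)
            dx[i] = eps
            J[:, i] = (f(x + dx) - f(x - dx)) / (2 * eps)
        return J

    f = lambda x: np.array([x[0] * x[1], np.sin(x[1])])
    g = lambda x: np.array([x[0] ** 2, x[0] + x[1]])
    x0 = np.array([0.7, -1.3])

    # J[2f + 3g] == 2 J[f] + 3 J[g]: linear in the function.
    lhs = num_jacobian(lambda x: 2 * f(x) + 3 * g(x), x0)
    rhs = 2 * num_jacobian(f, x0) + 3 * num_jacobian(g, x0)
    assert np.allclose(lhs, rhs, atol=1e-4)

    # But J_f(2 * x0) != 2 * J_f(x0) in general: not linear in x.
    assert not np.allclose(num_jacobian(f, 2 * x0), 2 * num_jacobian(f, x0))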

Man, Google is just not very helpful anymore. I remember a story on LW about a singularity in the 1980s from a EURISKO-like system, but Google just doesn't have a clue what I want to find.
