excellent thread on several core concepts of ML, namely encoding spaces for datasets, and how training approximates the latent manifolds thereof
these concepts are applicable well beyond ML; modeling the state spaces of complex systems, for example
---
RT @fchollet
A common beginner mistake is to misunderstand the meaning of the term "interpolation" in machine learning.
Let's take a look ⬇️⬇️⬇️
https://twitter.com/fchollet/status/1450524400227287040
the thread discusses how the encoding space is larger than the latent manifold, which is a subset of it; but it's worth mentioning that the actual state space of the system being modeled can be much larger than the encoding space!
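a toy sketch of the distinction (the circle manifold here is my own illustrative choice, not from the thread): a 1-d latent manifold embedded in a 3-d encoding space. the interpolation the thread means happens along the manifold, not linearly through the encoding space

```python
import numpy as np

def decode(theta):
    """map a 1-d latent coordinate into the 3-d encoding space;
    the latent manifold is the unit circle traced out by this map."""
    return np.array([np.cos(theta), np.sin(theta), 0.0])

a, b = decode(0.0), decode(np.pi / 2)

# naive interpolation in the encoding space falls off the manifold
mid_ambient = 0.5 * (a + b)
print(np.linalg.norm(mid_ambient))  # ~0.707, not on the unit circle

# interpolation along the latent coordinate stays on the manifold
mid_latent = decode(0.5 * (0.0 + np.pi / 2))
print(np.linalg.norm(mid_latent))   # 1.0, on-manifold
```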
---
RT @pee_zombie
this is why we praise those able to explain complex concepts simply; minimally-lossy dimensional reduction is a difficult skill to master! mapping btwn the super high-dimensional spa…
https://twitter.com/pee_zombie/status/1439992610781794305
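how lossy the reduction is depends on whether the data actually has low-dimensional structure; a quick sketch using PCA (my own example, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_reconstruction_error(X, k):
    """project X onto its top-k principal components, decode back,
    and report the relative reconstruction error."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    X_hat = (Xc @ Vt[:k].T) @ Vt[:k]  # encode to k dims, decode to full dims
    return np.linalg.norm(Xc - X_hat) / np.linalg.norm(Xc)

# structured data: 2 latent factors mapped linearly into 50 dims, plus noise
Z = rng.normal(size=(1000, 2))
W = rng.normal(size=(2, 50))
structured = Z @ W + 0.01 * rng.normal(size=(1000, 50))
print(pca_reconstruction_error(structured, k=2))  # tiny: near-lossless reduction

# unstructured data: no low-dimensional explanation exists
noise = rng.normal(size=(1000, 50))
print(pca_reconstruction_error(noise, k=2))       # large: very lossy reduction
```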
state spaces are representations of the possible states of some system; an encoding space, in the terminology of the QTd thread
these come in quite handy for understanding chaotic systems, among others
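for a concrete taste (my own sketch, not from the thread): the Lorenz system's state space is R^3, and two nearly identical points in it trace rapidly diverging trajectories

```python
import numpy as np

def lorenz_step(s, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """one forward-Euler step of the Lorenz system; s is a state in R^3."""
    x, y, z = s
    return s + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])  # perturb one coordinate by a billionth

for step in range(40001):
    if step % 10000 == 0:
        print(step, np.linalg.norm(a - b))  # separation grows exponentially
    a, b = lorenz_step(a), lorenz_step(b)
```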
---
RT @pee_zombie
great illustration of the initial state phase space of a 2d chaotic system https://twitter.com/Rainmaker1973/status/1342037328836128771
https://twitter.com/pee_zombie/status/1342223630403514374
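the linked illustration isn't reproduced here, so as a stand-in for the same idea (my own sketch): color each initial state (angle, angular velocity) of a damped pendulum by which rest angle it eventually settles into. the resulting partition of the initial state space has intricately mixed basin boundaries

```python
import numpy as np

def basin_indices(theta0, omega0, damping=0.1, dt=0.01, steps=20000):
    """integrate a grid of damped pendulums and return, for each initial
    state, the index k of the rest angle 2*pi*k it settles into."""
    theta, omega = theta0.copy(), omega0.copy()
    for _ in range(steps):
        theta = theta + dt * omega
        omega = omega + dt * (-np.sin(theta) - damping * omega)
    return np.round(theta / (2 * np.pi)).astype(int)

# grid over the initial state space: angle on one axis, angular velocity on the other
T, W = np.meshgrid(np.linspace(-8, 8, 60), np.linspace(-4, 4, 24))
basins = basin_indices(T, W)
for row in basins:
    print("".join(str(k % 10) for k in row))  # each digit labels one basin
```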