@niplav > "The more ambitious plan may have more chances of success […] provided it is not based on a mere pretension but on some vision of the things beyond those immediately present."
I call this "abstract leverage": given a specific problem, it is sometimes *easier* to find a more general solution, one that solves more than what you bargained for.
Spaced repetition is "memory leverage".
@rime
There are levels to this! If we assume a decreasing-marginal-returns model and a weakly efficient market for altruism (vis-à-vis EA), we're left with a bunch of difficult but useless problems plus a few medium-difficulty useful problems (see the toy sketch below)
But one can also be good along some axis, e.g. being able to deal with boring/tedious/stupid/low-status stuff that has high leverage
And then there are also problems which, if solved, unlock many low-hanging fruits in a cascade
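A minimal toy sketch of the claim above, with made-up numbers: every distribution and threshold here is an illustrative assumption, not anything stated in the thread. The "weakly efficient market" is modeled as: problems with a high value-to-difficulty ratio have already been taken, so (given decreasing marginal returns) piling onto them adds little, and we only look at what's left over.

```python
# Toy model of problem selection under a weakly efficient market for altruism.
# All numbers are illustrative assumptions, not from the conversation.
import random

random.seed(0)

# Problems with a difficulty and a value, each drawn uniformly from (0.05, 1.0].
problems = [
    {"difficulty": random.uniform(0.05, 1.0), "value": random.uniform(0.05, 1.0)}
    for _ in range(10_000)
]

# Assumed cutoff: anything with value/difficulty above this has already been taken.
RATIO_CUTOFF = 0.8
remaining = [p for p in problems if p["value"] / p["difficulty"] < RATIO_CUTOFF]

# Rough (arbitrary) buckets for the two kinds of leftover problems.
hard_useless = [p for p in remaining if p["difficulty"] > 0.7 and p["value"] < 0.3]
medium_useful = [p for p in remaining if 0.3 < p["difficulty"] < 0.7 and p["value"] > 0.5]

print(f"remaining after the market picks:   {len(remaining)}")
print(f"difficult but low-value:            {len(hard_useless)}")
print(f"medium-difficulty and still useful: {len(medium_useful)}")
```

Under these arbitrary cutoffs the leftovers are dominated by hard, low-value problems, with only a handful of medium-difficulty ones still worth much, which is the shape of the distribution being pointed at above.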