We will reach a point of diminishing returns on parameter scaling within the next 20 years, where the hardware cost of increasing parameter counts no longer justifies the added value you get from the model.
We won't ever hit Peak Parameters, because a new paradigm will appear and draw people away from LLMs before we do.
@WomanCorn This feels quite true to me. (Where "new paradigm" could also just be "better activation function found".)