Let's say you have a programming language C, and a programming language C+.
Every valid C program is a valid C+ program, but not the other way around.
There are at least two ways this can happen:
1. C+ is "stricter" than C, but the strictness is optional: e.g. you can add type signatures that don't exist in C.
2. C+ is more "featureful" than C: e.g. it adds generics or whatever. (Both cases are sketched below.)
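To make the two cases concrete, here is a minimal sketch. I'm using TypeScript over JavaScript purely as a stand-in for C+ over C; the specific languages are my own choice of illustration, not part of the question.

```typescript
// Case 1: optional strictness. This function is plain JavaScript and
// (with strict checks like noImplicitAny turned off) already valid TypeScript.
function add(a, b) {
  return a + b;
}

// The annotated version tells the checker more, but adds no construct
// that the base language lacks; erase the annotations and you're back in JS.
const addTyped = (a: number, b: number): number => a + b;

// Case 2: a genuinely new construct. Generic type parameters, and explicit
// type arguments at the call site, have no JavaScript counterpart syntax.
function firstOrDefault<T>(xs: T[], fallback: T): T {
  return xs.length > 0 ? xs[0] : fallback;
}

console.log(addTyped(1, 2));                      // 3
console.log(firstOrDefault([10, 20], 0));         // 10
console.log(firstOrDefault<string>([], "none"));  // "none"
```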
Is there a way to distinguish those two rigorously? Or is there no deep underlying difference?