Pondering numeric representation in computers.

Most programming languages don't have numbers. They have some kind of constrained number subtype.

"Integers" are not even integers, they're bounded integers.

This gets me wondering what kinds of numbers could be represented.

Actual integers could be represented with a BigInt type. (That's a list of bounded machine integers with logic to make basic operations handle overflow.)
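A minimal sketch of that idea, assuming little-endian "limbs" in base 2^16 to stand in for bounded machine words (real bignum libraries use full machine words and much more careful algorithms):

```python
BASE = 2 ** 16  # pretend machine word size

def to_limbs(n):
    """Split a non-negative int into little-endian base-BASE limbs."""
    limbs = []
    while True:
        limbs.append(n % BASE)
        n //= BASE
        if n == 0:
            return limbs

def from_limbs(limbs):
    return sum(limb * BASE ** i for i, limb in enumerate(limbs))

def add(a, b):
    """Add two limb lists, carrying so that no single limb overflows."""
    out, carry = [], 0
    for i in range(max(len(a), len(b))):
        total = carry
        if i < len(a):
            total += a[i]
        if i < len(b):
            total += b[i]
        out.append(total % BASE)
        carry = total // BASE
    if carry:
        out.append(carry)
    return out
```

The carry logic is the whole trick: each limb stays within machine bounds, and overflow spills into the next list element instead of being lost.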

Rationals could be represented with two BigInts.
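Python ships exactly this: `fractions.Fraction` is a pair of arbitrary-precision integers (numerator, denominator) kept in lowest terms, so rational arithmetic is exact where binary floats are not:

```python
from fractions import Fraction

third = Fraction(1, 3)
tenth = Fraction(1, 10)
print(third + tenth)     # 13/30, exactly
print(0.1 + 0.2)         # 0.30000000000000004 with binary floats
print(Fraction(1, 10) + Fraction(2, 10) == Fraction(3, 10))  # True
```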

Irrational numbers probably can't be represented in a sane way, without going to full symbolic representation.

Ironically, it's easier to represent complex numbers with rational parts than reals.
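That's because a complex number with rational parts is just two rationals, and complex multiplication only needs addition and multiplication of the parts, which `Fraction` does exactly. A small sketch (the class name is mine, not a standard library type; Python's built-in `complex` uses floats):

```python
from fractions import Fraction

class RationalComplex:
    """A Gaussian rational: a complex number with exact Fraction parts."""

    def __init__(self, re, im):
        self.re = Fraction(re)
        self.im = Fraction(im)

    def __mul__(self, other):
        # (a + bi)(c + di) = (ac - bd) + (ad + bc)i, all exact
        return RationalComplex(
            self.re * other.re - self.im * other.im,
            self.re * other.im + self.im * other.re,
        )

    def __eq__(self, other):
        return self.re == other.re and self.im == other.im
```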

I'm now interested in what a programming language would look like that had a fully general number type, but used bounded numbers for speed when appropriate constraints were followed.
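A hypothetical sketch of that fast-path idea, simulating a 64-bit bound in Python (Python's own ints never overflow, so the check is explicit here; tagged-pointer schemes in JS engines and CPython 2's int/long split are real versions of this):

```python
INT64_MAX = 2 ** 63 - 1
INT64_MIN = -(2 ** 63)

def checked_add(a, b):
    """Add with an explicit overflow check, like a hardware add-and-trap.

    Returns which representation the result would live in: the bounded
    machine-word fast path, or a spill into bignum arithmetic.
    """
    result = a + b
    if INT64_MIN <= result <= INT64_MAX:
        return ("fast", result)   # stayed within machine bounds
    return ("big", result)        # would fall back to a BigInt

print(checked_add(1, 2))             # ('fast', 3)
print(checked_add(INT64_MAX, 1)[0])  # 'big'
```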


@ThrowawayVR

Python does something like this, but for integers only. In Python 2, it would auto-promote from int -> long, but not auto-demote back from long -> int.

In Python 3, that conversion is hidden, so I'm not sure if it still applies.
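For the record, in Python 3 the distinction is gone entirely: there is a single `int` type with arbitrary precision, so integer overflow simply can't happen:

```python
n = 2 ** 100                # far beyond any machine word
print(type(n).__name__)     # int
print(n + 1 - n)            # 1, exact at any magnitude
```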

@WomanCorn Right. It would be nice if float auto-promoted to Decimal, which is roughly Python's arbitrary-precision counterpart for floating point. For what it's worth, my understanding is that in Python 3 all ints use the old long representation internally. I don't think that implementation has a significant speed or memory disadvantage versus the old auto-promotion scheme, which is why they decided to drop bounded ints.
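To be precise about what Decimal offers: it's decimal floating point with user-settable precision (set via the context), not an unbounded integer type, but it does make values like 0.1 exact where binary floats can't:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # significant digits for arithmetic

print(Decimal("0.1") + Decimal("0.2"))  # 0.3, exactly
print(0.1 + 0.2)                        # 0.30000000000000004
print(Decimal(1) / Decimal(7))          # rounded to 50 significant digits
```

Note that no auto-promotion happens: mixing a float into Decimal arithmetic raises a TypeError rather than converting, which is exactly the gap being pointed out.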
