¹²³4
Oh so that’s what exponents are for. They’re just lowercase numbers in the sky.
I guess OP’s never seen Roman numerals before. 🤪🤓
Although, the Romans never saw one trillion of anything.
VXXXII!
ONE TWO THREE FOUR FIVE
Relevant video by mad lad Tom7: https://www.youtube.com/watch?v=HLRdruqQfRk
There truly is an XKCD for everything…
The subreddit law of comics (Lemmy communities just can’t reach that level for now)
☺☻♥♦♣♠•◘○
Just use an exclamation mark. It’ll be fine. Like so:
5!
10!
See? No real difference and now the number feels bigger!
Big difference there, those are
120
3628800
I think you mean:
120!
3628800!
Really emphasize it, ya know!?
Embiggened by several factors.
Calculator 2: The Overbiggening
You can use absolute numbers like “The answer to life, the universe and everything is |42|.”
What if the number is negative, do I say “-|42|?”
Please surrender your license to math.
XXXXII!!! So simple!
I don’t know how to explain it, but 5 is loud
nerd mode engaged.
https://graphicdesign.stackexchange.com/questions/54423/why-dont-upper-case-numbers-exist#54425
They actually are usually upper case; it’s the lower case ones that are less widely used.
IDK, I don’t think old style and lining figures are analogous to lower- and uppercase letters. They’re not really different glyphs, at least not like lower- and uppercase letters are, and I would see them more as different ways of typesetting the same glyph.
Edit: Wikipedia does not agree with me.

6 and 8 being all “no one tells me what to do!” and staying the same.
That can be used to play pranks on people…
As far as I know, that is the ‘number case’, where the difference between upper and lower case is defined by the alignment of the numbers with the typographic baseline.
I think the post is talking about ‘letter case’, which we commonly use to yell at people through text. I don’t think there is an equivalent for numbers, mainly because numbers came from languages that are unicase by default, like the Indian languages and Arabic.
Wikipedia says that they are sometimes referred to as lowercase.
Most font packages call them old-style figures.
In typography class we were taught to call them lining and non-lining figures.
It is lowercase only, but lowercase in number case. In number case, upper and lower case are distinguished by alignment, whereas in letter case they are distinguished by shape and/or size.
Edit: my terms ‘number case’ and ‘letter case’ don’t seem to be the standard ones, but the concept still exists. Check this: https://totallytype.com/figures.php
Neat-lowercase0
𝟙𝟚𝟛𝟜𝟝𝟞𝟟𝟠𝟡𝟘
Um, @&$!#
That explains a lot. It turns out people often switch to upper case when speaking numbers to me.
Chinese entered the chat
I’m ethnic Chinese* and I’m entering the chat
(*Chinese American)
And I’ll conquer the fediverse with 壹貳參肆伍陸柒捌玖拾
— It will cost you $253. — Are you kidding me? TWO HUNDRED AND FIFTY THREE DOLLARS?
$253? Try, SEVENTEEN THOUSAND EIGHT HUNDRED AND NINETY SIX DOLLARS!
I’m not sure if Roman Numerals are the best choice for statistics
iii III!
It’s made-up numbers anyway…
So is Arabic
Use roman numerals.
How about, ONE POINT SEVEN FIVE TRILLION KILOGRAMS PER HECTARE SQUARED!
Or perhaps, TEN BILLION FOOT POUNDS PER SQUARE INCH!
Or even, FOURTEEN MILLION ONE HUNDRED AND SEVENTY FIVE THOUSAND, TWO HUNDRED AND SIXTY THREE CUBIC DECILITERS PER YEAR!
For some reason this reminds me of [the COBOL programming language](https://en.wikipedia.org/wiki/COBOL), though not even COBOL was batshit enough to use numerals written in plain English. Everything else was in plain English, though, which was supposed to make it easier to read and write, but is in reality a horrible idea.
Though all caps just reminds me of early programming languages in general, since not all machines back then distinguished uppercase from lowercase; many used encoding schemes like DEC SIXBIT instead, saving memory by spending only six bits per character instead of seven or eight. Six-bit characters had matching word lengths, too: before the concept of a byte there were loads of 12-bit and 36-bit architectures, which more or less went away when the industry almost collectively decided to switch to byte-addressed memory.
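If anyone’s curious what that looked like in practice, here’s a minimal sketch (in Python, purely for illustration; the function name is mine) of SIXBIT-style packing, assuming the usual PDP-10 convention of code = ASCII value minus 32, covering space through underscore, six characters per 36-bit word:

```python
def pack_sixbit(text: str) -> int:
    """Pack up to six characters into one 36-bit word, space-padded."""
    text = text.upper().ljust(6)[:6]   # SIXBIT has no lowercase letters
    word = 0
    for ch in text:
        code = ord(ch) - 32            # ' ' -> 0, 'A' -> 33, '_' -> 63
        if not 0 <= code <= 63:
            raise ValueError(f"{ch!r} has no SIXBIT encoding")
        word = (word << 6) | code      # six bits per character
    return word                        # result fits in 36 bits

print(oct(pack_sixbit("HELLO")))       # 0o504554545700
```

Six characters in one 36-bit word instead of four or five eight-bit ones is exactly the saving they were after.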