# advent-of-code
k
It seems I'm the only one with this opinion, but I really disliked today's challenge. I feel like "support large numbers" is not only very weird and unexpected for an **Int**Computer, but also really favors scripting languages like Python (which has used arbitrary-precision integers by default since 3.0) and JavaScript, Lua, and the like, which use doubles.
On top of that, "support large numbers" is very vague. What is a large number? I think 2 billion is large, but clearly not large enough. I guess I should have known that the values would fit in doubles and longs, since AoC doesn't normally force people to use libraries and not every language has a BigInt in its stdlib, but by this time I was already too salty to think clearly.
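
For reference, here is a minimal sketch (Python, since it came up above) of where the usual limits actually sit: IEEE-754 doubles represent every integer exactly only up to 2^53, while a signed 64-bit long goes up to 2^63 - 1, and per the comment above the puzzle's values fit in both. The helper name is just for illustration:

```python
# Sketch of the usual integer limits (names here are illustrative).

DOUBLE_EXACT_MAX = 2**53   # doubles represent every integer exactly up to here
INT64_MAX = 2**63 - 1      # largest signed 64-bit long

def survives_double_roundtrip(n: int) -> bool:
    """True if n passes through an IEEE-754 double without changing."""
    return int(float(n)) == n

assert survives_double_roundtrip(2**53)          # still exact
assert not survives_double_roundtrip(2**53 + 1)  # first integer a double drops

# Python ints are arbitrary precision, so even past 64 bits they just work:
print(2**100)  # 1267650600228229401496703205376
```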
a
Well, I would say Int is kinda hijacked by some languages to mean a fixed range. So I don't see why large numbers would be weird or unexpected for an *Integer*Computer.
p
"integer" in math sense does not really have any bounds