this post was submitted on 24 Jul 2022

Programmer Humor

[–] Lakso@ttrpg.network 1 point 1 year ago* (last edited 1 year ago)

Those are valid points and make some practical sense, but I've talked too much with mathematicians about this, so let me give you another point of view.

First of all, modular arithmetic is done with integers, not natural numbers, and the same goes for all the objects you listed.
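To see why the integers are the right home for modular arithmetic: congruence mod n partitions all of ℤ, negatives included, into residue classes. A quick sketch (Python's `%` operator happens to follow this convention, returning a result with the sign of the divisor):

```python
# Congruence mod 3 is defined on all of Z, not just counting numbers.
# -7 lies in the residue class of 2, since -7 = (-3)*3 + 2.
print(-7 % 3)             # Python's % yields the non-negative representative: 2
print((-7 - 2) % 3 == 0)  # -7 ≡ 2 (mod 3), i.e. 3 divides their difference
```

If we restricted ourselves to the counting numbers, expressions like -7 mod 3 wouldn't even be defined.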

On the first point, we are not talking about 0 as a digit but as a number. The main argument against 0 being in N is more a philosophical one. What are we looking at when we study N? What is this set? "The integers starting from 0" seems a bit of a weird definition. Historically, the natural numbers were always the counting numbers, and those don't include 0, because you can't have 0 apples; so when we talk about N, we're talking about the counting numbers. That's just the consensus where I'm from: if it's more practical to include 0 in whatever you're doing, you use N~0~. The axiomatization of N is also more natural that way, IMO.
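On that last point: as I understand it, Peano's original 1889 axiomatization did start the naturals at 1. One standard rendering (with S the successor function) looks roughly like:

```latex
\begin{itemize}
  \item $1 \in \mathbb{N}$
  \item $\forall n \in \mathbb{N},\; S(n) \in \mathbb{N}$
  \item $\forall n, m \in \mathbb{N},\; S(n) = S(m) \implies n = m$ \quad (S is injective)
  \item $\forall n \in \mathbb{N},\; S(n) \neq 1$ \quad (1 is not a successor)
  \item (Induction) if $1 \in A$ and $n \in A \implies S(n) \in A$, then $\mathbb{N} \subseteq A$
\end{itemize}
```

Modern treatments that include 0 use the same five axioms with 0 in place of 1; the structure is identical either way, which is partly why the convention never settled.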