Douglas_Knight comments on Bizarre Illusions - Less Wrong

11 Post author: MrHen 27 January 2010 06:25PM


Comment author: byrnema 28 January 2010 12:41:00AM *  3 points [-]

Hmm... I agree this is compelling. However, since I'm resistant to updating my world view about 1-the-discriminated-prime-number, I'll continue to proffer counter-arguments:

  • the Fundamental Theorem of Arithmetic is pretty important, but may still not be the "essence" of what prime is

  • the FTA itself requires the "except 1" clause: "all natural numbers can be uniquely factored into primes except 1" -- which would make someone think 1 ought to be prime

  • the FTA already assumes 'modulo permutations', we could easily throw in 'modulo 1'

  • Wikipedia -- the first and last authority on such things -- carefully writes in an entire sentence unto itself, "The number 1 is by definition not a prime number," suggesting just how arbitrary this is. (My own emphasis added.)
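The 'modulo 1' point can be made concrete. A quick Python sketch (the list of factorizations is my own illustration, not anything from the thread): if 1 counted as a prime, 6 would have many distinct "prime factorizations", so the FTA's uniqueness would need an extra "up to factors of 1" caveat.

```python
from math import prod

# If 1 were admitted as a prime, all of these would be distinct prime
# factorizations of 6 -- and you could keep prepending 1s forever.
factorizations = [[2, 3], [1, 2, 3], [1, 1, 2, 3]]

assert all(prod(f) == 6 for f in factorizations)
for f in factorizations:
    print(f, "->", prod(f))
```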

Since I tend to worry about how things are constructed, the best argument I came up with for not including 1 as a prime was the sieve of Eratosthenes.

The sieve of Eratosthenes says that you can find the primes by starting with all the natural numbers > 1; let 2 be the first prime number, and then begin eliminating all multiples of 2 and the multiples of subsequent primes as you find them. If you included 1 as the first prime, then eliminating its multiples in the first step would wipe out every other number.
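The procedure above can be sketched in a few lines of Python (the function name and the `smallest_prime` parameter are mine, purely to show what goes wrong when the sieve starts at 1):

```python
def sieve(limit, smallest_prime=2):
    """Sieve of Eratosthenes over smallest_prime..limit."""
    candidates = set(range(smallest_prime, limit + 1))
    primes = []
    while candidates:
        p = min(candidates)  # the smallest survivor is the next prime
        primes.append(p)
        # Cross off p and all of its multiples up to the limit.
        candidates -= set(range(p, limit + 1, p))
    return primes

print(sieve(30))                    # the primes up to 30
print(sieve(30, smallest_prime=1))  # [1]: multiples of 1 wipe out everything
```

Starting the sieve at 1 returns only `[1]`, since every other number is a multiple of 1 and gets eliminated in the first pass.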

Comment author: Douglas_Knight 28 January 2010 05:52:15AM *  3 points [-]

Wikipedia -- the first and last authority on such things -- carefully writes in an entire sentence unto itself, "The number 1 is by definition not a prime number," suggesting just how arbitrary this is.

A measure of the arbitrariness is the history, which is that 1 was considered prime up to the 19th century and remained a matter of fashion during it. That suggests that unique factorization is not, in itself, enough to motivate the definition. Perhaps its extension to the Gaussian integers, or the more radical version for general number rings, prompted the definition.

Comment author: Jack 28 January 2010 08:01:47AM 0 points [-]

This reminds me. Pre-19th century it was thought that part of what it was to be a mammal was to give live birth, in addition to having mammary glands. 1 is the platypus of numbers.