I was thinking that kids are taught that zero is neither negative nor positive around the same time they are learning about even and odd numbers, so they jumble “zero isn’t negative or positive” with “zero isn’t odd or even.”

In any case the wiki article goes deep into the misconception. Apparently a substantial fraction of teachers believe zero isn’t odd or even, including all the teachers at one school that was studied.

Even if you as a programmer know that 0 is technically not even, you will still probably check whether a number is even with `x % 2 == 0`.

So they will miss out on seeing how a line of reasoning can be used to deduce things; they will listen to random ramblings and unconnected, emotive sentences, and be convinced by “logic” that is non-existent.

People say 0 is even because in the vast majority of real world applications for the test, it is appropriate to treat zero as even, not because they are afraid of your big brained maths.

Even if you as a programmer know that 0 is technically not even, you will still probably check whether a number is even with `x % 2 == 0`.

That is the actual test to see if a number is even. To be more precise, an integer `n` is even if and only if it can be written as `n = 2*m`, where `m` is also an integer. So 6 is even, because we can set `m` to 3: `6 = 2*3`. 0, then, is even, because `0 = 2*0`.

The definition of odd, on the other hand, is `m = (2*n) + 1`, with `m` and `n` both integers.
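That definition translates directly into code; here is a minimal sketch (the helper name is mine, for illustration), showing that the usual remainder test is exactly the `n = 2*m` definition:

```python
def is_even(n: int) -> bool:
    # n is even iff n == 2*m for some integer m; the remainder test
    # n % 2 == 0 is exactly that check, and it treats 0 (0 == 2*0)
    # and negative integers the same way as everything else.
    return n % 2 == 0
```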

Even if you as a programmer know that 0 is technically not even

0 is “technically even”. Did you read the post? This isn’t about discussing whether or not it is, it’s about why some people don’t realise it is, and how maths could be better taught/communicated to rectify this.

`a` is divisible by a nonzero `b` if there exists a whole number `x` such that `a = x·b`. Then we call `b` a divisor of `a`. Clearly every nonzero whole number is a divisor of 0, and since 2 is a divisor of 0, 0 is even.
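That divisibility argument is easy to check mechanically; a quick sketch (the function name is my own, for illustration):

```python
def divides(b: int, a: int) -> bool:
    # b is a divisor of a iff a == x*b for some integer x (b nonzero);
    # for integers this is the same as a % b == 0.
    return a % b == 0

# Every nonzero integer divides 0, and in particular 2 does,
# which is exactly the argument above that 0 is even.
```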

It seems to me that a lot of people think even-ness is a function over N+, and they hesitate to extend it to N. The 2/3-1/3 split fascinates me, but I’m not really picky on terminology.

To go further, you could ask: is 3.5 odd, or even? I was taught that factorial was a function over integers, but it turns out the gamma function extends it beyond that, in between (though, to be fair, gamma doesn’t work on negative integers). If you could extend odd/even, what even would it mean? I wonder what people would say.
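For the factorial/gamma comparison, Python’s standard library happens to expose the gamma function, so the extension is easy to poke at (recall that gamma(n) equals (n-1)! at positive integers):

```python
import math

# Gamma reproduces factorial at positive integers: gamma(5) = 4! = 24...
print(math.gamma(5))
# ...and, unlike factorial, it is also defined between the integers:
print(math.gamma(4.5))
# ...but it has poles at 0 and the negative integers, where
# math.gamma raises ValueError instead of returning a number.
```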

I’d be curious about the results of this poll with a follow-up question that either asks how confident they are in their answer, or why they gave the answer they gave.

While, as this post says, “The article on Wikipedia is clear” about the issue, it’s also the case that math is only “correct” up to the point that we choose certain definitions over others, sometimes without either being “incorrect” (though there’s a lot of philosophising that can be done here). And in different fields of math, or even different places that practice it (geographically, or by a particular university’s conventions), certain definitions are taken as implicit until specified otherwise.
Aside from the parity of zero, another example of a similar case is whether zero is part of the natural numbers.

As far as I’m aware there’s no technical reason why we couldn’t define zero as “not even” when that’s more beneficial for some proof or result we’re trying to achieve, as long as it’s consistent with the rest of the math we’re using (which usually means with most of the rest of math in general). I’m not that fluent in this matter, but I don’t recall that it should break anything, as opposed to, say, trying to define 0 as equal to 1, which goes against certain axioms. Yes, it means we are choosing the definition of what even means, or some nuance of it, but that’s fine if we can make it work.

There’s a story (that I hope I’m recalling correctly) about the definition of log(-1): for most of us these days it would equal πi, which comes from Euler, while Bernoulli (under whom Euler studied) defined it as 0. Both definitions can be shown/argued to be consistent, and (broadly speaking) the one that was more conducive to developing math won out.
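On the modern convention: Python’s `cmath.log` uses the principal branch of the complex logarithm, which matches Euler’s choice here:

```python
import cmath

# Principal-branch complex log: log(-1) = pi*i, per Euler's convention.
print(cmath.log(-1))  # approximately 3.141592653589793j
```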

In some sense math is a language and it helps that there’s a clear definition and that we’ll all agree on what it means, but there’s room to pick and choose and have some flexibility.

I wonder if it also gets mixed up with ‘1 is neither prime nor composite’ (though you don’t learn that until later).

How odd. I do not even.

Complete the pattern:

3 2 1 0 -1 -2 -3

Odd Even Odd ??? Odd Even Odd
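The pattern is easy to generate; a quick sketch in Python (where `n % 2` is 0 exactly for the even integers, negatives included):

```python
# Walk from 3 down to -3 and label each integer's parity.
pattern = ["Even" if n % 2 == 0 else "Odd" for n in range(3, -4, -1)]
print(pattern)
# The blank in the middle of the pattern comes out "Even".
```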
