By request. Sorry, this one's gonna wind up being in two installments, too. I just don't have the heart to babble on about number theory and history up past a couple thousand words. I imagine people's eyes glazing over and drool dripping into their keyboards. And that's no fun. So today, zero as a concept. Tomorrow, Great Zeros In History (the different cultures that used zero, near as I can figure out without a degree in advanced math).
First off, I hate to ruin it for everyone, but the concept of zero itself isn't THAT big a deal. It is, in fact, a direct offshoot of the big deal in numbers, the positional notation system. Now, positional notation, THAT was a fucking good idea.
To avoid bogging this down into a twenty thousand word dissertation on positional notation systems, I'll use our own decimal system here in western civilization as an example. Say we've got four digits. 1, 2, 3, and 4. As they sit, there where I typed them, they are one, two, three, and four. Sitting there on their own. But when you clump them together, 1234, they assume the value of one thousand, two hundred and thirty-four. It is not the fact that they are clumped together that determines the value; it could just as easily mean ten, the sum of each separate digit added together. But because of where the numbers are in relation to each other, the POSITION that each holds, they assume a higher value than the digit alone. The 1 digit is in the 'thousands' place, the 2 is in hundreds, the 3 is in tens, and the 4 is in ones. Jumble the digits around to, say, 4123, and the value shifts, even though the digits remain the same. Very clever, no? Took mankind quite a long time to come up with this idea, though different civilizations cooked it up at different times.
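If you like seeing the machinery laid bare, here's a little toy sketch of that idea in Python (my own illustration, nothing historical about it): each digit you read shifts everything before it up one place, which is exactly the "position gives value" trick.

```python
# Positional notation in miniature: reading digits left to right,
# each new digit bumps the ones before it up a power of ten.
def positional_value(digits):
    total = 0
    for digit in digits:
        total = total * 10 + digit
    return total

print(positional_value([1, 2, 3, 4]))  # 1234
print(positional_value([4, 1, 2, 3]))  # 4123 -- same digits, different value
print(sum([1, 2, 3, 4]))               # 10 -- what you'd get if position didn't matter
```

Same four digits every time; only their positions change, and the value changes with them.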
So when you have an even thousand, you put the digit 1 in the thousands place, but you've got a problem 'cause it's sitting there looking like this: 1. And there's nothing to tell you if that's one, a hundred, a thousand, or a kazillion. That's where zero comes in. It bumps that one over into the thousands place in a definite, unmistakable way, 1000. No one can come along later and be mistaken about what value you mean by that. It's a thousand.
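Same toy sketch as before, just to show the placeholder trick in action (again, my own illustration, not anybody's ancient method): a lone 1 could mean anything, but zeros holding the empty slots pin it down.

```python
# Zero as a placeholder: a bare [1] reads as one, but zeros
# filling the empty slots push the 1 into the thousands place.
def positional_value(digits):
    total = 0
    for digit in digits:
        total = total * 10 + digit
    return total

print(positional_value([1]))           # 1 -- ambiguous without placeholders
print(positional_value([1, 0, 0, 0]))  # 1000 -- zeros make it unmistakable
```

The zeros contribute nothing themselves; all they do is shove the 1 into the right place. Nothing and something at the same time, like the man said.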
If numbers themselves are abstract - and they are - zero is even more so. Because it's both nothing and something. We're all taught that it symbolizes nothing, an empty set. Yet by using it within the positional notation system, it's both something and nothing. It indicates that while there's nothing in that 'slot', it gives value to the other numbers around it. There is some indication that this conundrum is what kept a few of the more advanced civilizations from discovering or using zero; they bogged down on the philosophy of it and lost the practical use.
Now if you'll excuse me I've gotta go look at pictures of Mayan and Babylonian numerical notation. I'm not sure which civilization was nuttier, when it came to writing down numbers, but I KNOW that I'm amazed either one accomplished anything with that kind of math to swim through just to add up a couple goats. I mean, come on, EXPONENTS?
*How wonderful you aaaaAAAARE.