Before I get into this, I am not a mathematician. I’m not even an amateur. I’m just a guy who barely passed high school math 7 years ago, and who is interested in learning more.
It’s a generally (though not universally) accepted fact that 0.999… = 1. I’m not going to go into proofs; if you want to see them for yourself, you can find several at the Wikipedia article for 0.999… And I’m not going to debate or even acknowledge that people disagree, because that’s not the point of this post.
The point is that 0.999… = 1 demonstrates an important idea: the abstraction of the numeral from the number. The people who see 0.999… = 1 and can’t believe it haven’t, I think, adequately realised that there is a difference between a number and a numeral.
A numeral is a symbol that represents a number – that’s all it is. On its own, it is useless: a numeral means nothing without the number behind it. A number, however, has meaning no matter which numeral you use to represent it. This is what’s going on with 0.999… = 1.
This distinction matters because it underpins progress in mathematics, specifically in algebra and other relatively abstract fields. Once you realise that the numeral 1 represents the number 1, and that the numeral 0.999… can also represent the number 1, you realise that 4/4 is also a valid numeral for the number 1. From there, it’s a very small step to understanding that x can represent the number 1, and that in the equation 2x + y = 5, y also represents a number, one we need to find.
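To make this concrete, here is a minimal Python sketch (my own illustration, not from the original post) showing that several different numerals all denote the same number, and that a letter like x or y is just one more way of naming a number:

```python
from fractions import Fraction

# Different numerals, one number: 1, 1.0, and 4/4 all name the number one.
assert Fraction(4, 4) == 1
assert Fraction(4, 4) == 1.0
assert Fraction(4, 4) == Fraction(1, 1)

# In 2x + y = 5, x and y are also numerals for numbers.
# If x names the number 1, then y must name the number 3.
x = 1
y = 5 - 2 * x
assert y == 3
```

The assertions pass because equality compares the underlying numbers, not the symbols used to write them down.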
Although the example I gave is trivial, the underlying concept is an important one to grasp, and it’s one many people never do.