In my last post, I gave a definition for the real numbers intended to fill the holes in the rational numbers. To restate it: the real numbers are an extension of the rational numbers in which every nonempty set of real numbers that has an upper bound has a least upper bound.
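Written out symbolically (my notation, not a quotation from the textbook), the least-upper-bound property says:

```latex
% Completeness (least-upper-bound) property of the reals:
% every nonempty subset that is bounded above has a supremum.
\text{If } \emptyset \neq S \subseteq \mathbb{R}
\text{ and } \exists\, b \in \mathbb{R}\ \forall s \in S\ (s \le b),
\text{ then } \sup S \text{ exists in } \mathbb{R}.
```

Here the least upper bound, sup S, is an upper bound u of S satisfying u <= b for every upper bound b of S. The "nonempty" and "bounded above" qualifiers both matter: the empty set is bounded by everything and so has no least bound, and a set with no upper bound at all has nothing to take the least of.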
There are two points to make about this definition. The first is that it is just a definition: it doesn't prove that the real numbers actually exist. Fortunately, there are techniques for constructing the real numbers which prove that they exist. My analysis textbook includes one of these techniques, known as Dedekind cuts. Interestingly, the construction by Dedekind cuts is in an appendix rather than the main text. Essentially the book says, "here's the definition, here's the proof in case you really want to know, now let's ignore that and start talking about the properties of the real numbers." The course notes I have actually go further, and say, "there's a proof in the book, but don't bother to read it." I went ahead and read it anyway.
I think I followed the Dedekind cut construction, although I certainly couldn't reconstruct it myself. On first reading, I'm not convinced it gives me any insight into the real numbers that the definition doesn't already provide. I expect that if I study the subject enough, I will reach the point where I can say what the basic outline of the proof looks like and why it is useful, but I'm nowhere near there yet.
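For what it's worth, the core definition is short even though the verification is long. As a sketch (paraphrasing the usual statement, not necessarily the textbook's exact wording), a Dedekind cut is a set of rational numbers alpha satisfying:

```latex
% A Dedekind cut: a set \alpha of rationals that is
\alpha \neq \emptyset \ \text{ and }\ \alpha \neq \mathbb{Q}
  \quad\text{(nonempty and proper),} \\
q < p \in \alpha \implies q \in \alpha
  \quad\text{(closed downward),} \\
p \in \alpha \implies \exists\, r \in \alpha,\ p < r
  \quad\text{(no largest element).}
```

Each real number is then identified with such a cut (for example, the square root of 2 corresponds to the set of rationals q with q < 0 or q^2 < 2), and the bulk of the proof is verifying that the collection of cuts forms an ordered field with the least-upper-bound property.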
The second point is that in thinking informally about real numbers, I have thought of them as numbers that can be written as decimals with an infinite number of digits. The textbook has a very short section on decimal numbers, basically showing that this is valid. The last sentence of the section is, "Since we shall never use decimals, we do not enter into a detailed discussion."
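As a quick sanity check of that claim (my own sketch, nothing from the textbook), the decimal expansion of a real number like the square root of 2 is just a nondecreasing sequence of rational truncations, 1, 1.4, 1.41, 1.414, and so on, and the number itself is their least upper bound:

```python
# Sketch: the decimal expansion of sqrt(2) as a sequence of exact
# rational truncations. The k-digit truncation is the largest m/10**k
# whose square is at most 2. Function name is my own invention.
from fractions import Fraction
from math import isqrt

def sqrt2_truncations(n_digits):
    """Return the 0- through n-digit decimal truncations of sqrt(2)."""
    return [Fraction(isqrt(2 * 10 ** (2 * k)), 10 ** k)
            for k in range(n_digits + 1)]

truncs = sqrt2_truncations(5)
# Every truncation squares to at most 2, and they increase toward sqrt(2).
assert all(t * t <= 2 for t in truncs)
assert truncs == sorted(truncs)
# truncs[-1] is Fraction(141421, 100000), i.e. 1.41421
```

Each truncation is an ordinary rational number; the irrational "infinite decimal" only appears as the supremum of the whole sequence, which is exactly where the least-upper-bound property earns its keep.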
Beyond the fact that going to the trouble of defining decimals just to say they won't be used is kind of funny, I think there's a useful point here. I work with decimals every day, basically every time I turn on a computer or use a calculator. As a result, it's really easy to think that "numbers" means "decimals". The textbook flat out rejects that premise.
The slightly subtle part is that thinking of real numbers as decimals does not actually help in understanding the basic properties of the real numbers. In fact, it obscures more than it reveals. I can't help thinking that various Internet debates about the real numbers are driven by the assumption that real numbers are decimal numbers, and rejecting that assumption might help defuse those debates.
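To guess at one such debate (my example, not one named above): the perennial argument over whether 0.999... equals 1. Read as a string of digits, the question looks mysterious; read through the definition, 0.999... simply denotes the least upper bound of its truncations, and that supremum is exactly 1:

```latex
0.\overline{9} \;=\; \sup\,\{\,0.9,\ 0.99,\ 0.999,\ \ldots\,\}
\;=\; \sup\,\{\,1 - 10^{-n} : n \ge 1\,\} \;=\; 1,
```

since every truncation is below 1, and any number below 1 is exceeded by some truncation.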