In my last post, I gave a definition for the real numbers intended to fill the holes in the rational numbers. To restate it: the real numbers are an extension of the rational numbers such that any nonempty set of real numbers which has an upper bound has a least upper bound.
There are two points to make about this definition. The first is that it is just a definition. It doesn't prove that the real numbers actually exist. Fortunately, there are techniques for constructing the real numbers which prove that they exist. My analysis textbook includes one of these techniques, known as Dedekind cuts. Interestingly, the construction by Dedekind cuts is in an appendix rather than the main text. Essentially the book says, "here's the definition, here's the proof in case you really want to know, now let's ignore that and start talking about the properties of the real numbers." The course notes I have go even further and say, "there's a proof in the book, but don't bother to read it." I went ahead and read it anyway.
I think I followed the Dedekind cuts proof, although I certainly couldn't reconstruct it. On first reading, I'm not convinced it gives me any insight into the real numbers not provided by the definition. I expect that if I study the subject enough, I will get to the point where I can say that this is what the basic outline of the proof looks like and that is why it is useful, but I'm nowhere near there yet.
The second thing is that in thinking informally about real numbers, I have thought of them as numbers that can be written as decimals with an infinite number of digits. The textbook has a very short section on decimal numbers, basically showing that this is valid. The last sentence of the section is, "Since we shall never use decimals, we do not enter into a detailed discussion."
Beyond the fact that going to the trouble to define decimals just to say that they won't be used is kind of funny, I think there's a useful point here. I work with decimals every day, basically every time I turn on a computer or use a calculator. As a result, it's really easy to think that "numbers" means "decimals". The textbook flat out rejects that premise.
The slightly subtle part of this is that thinking of real numbers as decimals does not actually help in understanding the basic properties of real numbers. In fact, this thinking obscures more than it reveals. I can't help thinking that various Internet debates about real numbers are driven by the idea that real numbers are decimal numbers, and rejecting that idea might help defuse these debates.
Last time I outlined the proof that the square root of 2 is not a rational number. Now I will discuss why this is a problem, and define the real numbers as a solution.
Consider a set of rational numbers, for example all rational numbers less than 1. If there exists a number such that every number in the set is less than or equal to that number, that number is an upper bound for the set. For example, 2 is an upper bound of the set of rational numbers less than 1. Of course, 1 is also an upper bound. If a set has any upper bound it has lots of upper bounds.
So instead of talking about any upper bound for a set, we want to talk about the least upper bound. That is, the smallest number that is an upper bound. If we choose a smaller number, some number in the set will be greater than the one we chose, so it isn't an upper bound at all. If we choose a larger number, then there exists a number smaller than it which is still an upper bound. So the least upper bound sits right in the middle, and could be thought of as the "best" upper bound.
The least upper bound for the set of rational numbers less than 1 is obviously 1. The problem comes when we try to find a least upper bound for the set of rational numbers less than the square root of 2. Since the square root of 2 is not a rational number, it cannot be the least upper bound. Any rational number greater than the square root of 2 is an upper bound, but if you try to claim that any particular rational number is the least upper bound, it is always possible to find a smaller upper bound. The only conclusion is that there is no rational least upper bound for this set.
An informal interpretation of this result is to say, okay, the square root of 2 is not a rational number. Why don't we use the "best" rational approximation of the square root of 2 instead? Leaving aside the fact that approximations aren't really a mathematical way of thinking, the least upper bound result shows that there is no best approximation, because you can always find a better one.
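This "always a better one" process can be made concrete. Here's a small Python sketch (my own illustration, not from the textbook) using Newton's method on x² − 2 with exact rational arithmetic: starting from the rational upper bound 2, each step produces a strictly smaller rational that is still an upper bound, so no candidate can ever be the least.

```python
from fractions import Fraction

def better_upper_bound(x):
    """One Newton step for the square root of 2: given a rational
    upper bound x (so x**2 > 2), return a strictly smaller rational
    that is still an upper bound."""
    return (x + 2 / x) / 2

bound = Fraction(2)  # 2 is a rational upper bound, since 2**2 > 2
for _ in range(4):
    new = better_upper_bound(bound)
    # The new candidate is smaller, yet still squares to more than 2.
    assert new < bound and new ** 2 > 2
    bound = new

print(bound)  # 665857/470832 after four steps, still an upper bound
```

Every value printed along the way (3/2, 17/12, 577/408, ...) is a rational upper bound of the set, and the loop could run forever without the improvements stopping, which is exactly why no rational least upper bound exists.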
Essentially, this means that the rational numbers have holes. It's true that given any two rational numbers, you can always find a rational number between them, so the rational numbers are very close together. However, there are still numbers we would like to use (like the square root of 2) which just don't exist in the set of rational numbers, and there's no rational number we can use as a substitute for these missing numbers.
This is where the real numbers come in. The real numbers are defined as an extension of the rational numbers with the property that any nonempty set of real numbers with an upper bound has a least upper bound. The least upper bound of the set of rational numbers less than the square root of 2 is the square root of 2. Therefore, the square root of 2 is a real number.
Now that the real numbers have been defined, I will post some thoughts about them next time.
Okay, enough talk about studying math. I want to talk about math. I've been studying real analysis recently, and so far the discussion has been on the properties of real numbers.
I mentioned that the course notes started with the natural numbers and addition, and proceeded to derive multiplication and exponents. The notes also develop zero and negative numbers before getting to rational numbers. The textbook skips straight to rational numbers and I will too.
A rational number is any number that can be written as a fraction where both the numerator and the denominator are integers (and the denominator is not zero). In this conceptualization, the natural number 2 is a different concept than the rational number 2/1, even though they have all the same properties and can be used interchangeably in most contexts. Similarly, 1/2 is a different number than 2/4, even though they are equivalent.
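For contrast, Python's standard fractions module takes the opposite approach to this conceptualization: it reduces every fraction to lowest terms on construction, so the "different numbers" 1/2 and 2/4 collapse into a single value.

```python
from fractions import Fraction

a = Fraction(1, 2)
b = Fraction(2, 4)

# The constructor reduces to lowest terms, so 2/4 is stored as 1/2
# and the two "different" rational numbers compare equal.
print(a == b)                       # True
print(b.numerator, b.denominator)   # 1 2
```

Neither choice is wrong; the textbook's version keeps equivalent fractions distinct and defines an equivalence between them, while the library picks one canonical representative per equivalence class.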
One sometimes interesting, sometimes frustrating thing about the project of starting over and redefining everything from scratch is that you have to revisit lots of basic math. In the early stages, this involves things which are totally internalized. I know that 1/2 = 2/4, but what does that mean in the context of this definition? When trying to develop ideas from the ground up, I end up second-guessing myself frequently. Just because I "know" something is true doesn't mean I've proved it. Sometimes I embrace the challenge of proving everything, and other times I want to say, "I know this is true. Why can't I just say so and move on?"
Anyway, once the properties of rational numbers are established, the next step is to prove that the square root of 2 is not a rational number. I've seen this proof many times, but it took multiple readings before it really sank in, so I will restate it here. Assume that x is a rational number such that x² = 2. Then x can be written as a fraction, p/q, where both p and q are integers. Many different values for p and q can be chosen to give the same equivalent fraction, so pick p and q so that at most one is an even number. (If both are even, just divide both by 2 as many times as necessary until one is odd.) Then (p/q)² = 2, so p² = 2q². Therefore p² is even, so p is also even. But that means we can choose an integer m so that 2m = p. Substituting back into the earlier equation, (2m)² = 2q², so 4m² = 2q², which simplifies to q² = 2m². By the same reasoning as before, q is also even. But we chose p and q so that at most one of the two numbers is even, and this contradiction means p and q do not exist, so the square root of 2 is not rational.
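A brute-force search can't substitute for the proof, but it makes the claim concrete. Here's a sketch in Python (the search bounds are my own, arbitrary choice): exact rational arithmetic finds no fraction squaring to 2, and the parity fact the proof hinges on checks out numerically too.

```python
from fractions import Fraction

# Illustration, not proof: no fraction p/q with 1 <= p, q <= 100
# squares to exactly 2. Fraction arithmetic is exact, so there is
# no floating-point fuzz hiding a solution.
assert all(Fraction(p, q) ** 2 != 2
           for p in range(1, 101)
           for q in range(1, 101))

# The key step of the proof: whenever p**2 is even, p is even.
assert all(p % 2 == 0
           for p in range(1, 101)
           if (p * p) % 2 == 0)
```

The proof, of course, covers all integers at once; the search only confirms there's nothing surprising in the small cases.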
Now that we've shown that the square root of 2 is not a rational number, it's reasonable to ask what kind of number the square root of 2 is. I will go into the problems that the square root of 2 causes for the rational numbers and why the real numbers are the solution next time.
I want to develop some related thoughts to my last post, in a different (and hopefully more optimistic) context. For a long time I thought I wasn't interested in studying mathematics at the graduate level. I used two basic arguments to support this. The first is that I didn't even understand what graduate level math was talking about. Upper level math features things like groups and rings and differential geometry and I don't know what else, but even the basic vocabulary is totally foreign to me. The second is that, in my understanding, upper level math isn't even about numbers anymore, and I like numbers. I have since concluded that I was wrong, wrong, wrong.
First of all, it's not even true that all upper level math is unfamiliar to me. Looking at graduate math department course descriptions, I can find lots of courses on subjects like statistics and Fourier analysis, things that I actually do have an understanding of and interest in. In some cases, like Fourier analysis, my understanding is a result of the engineering courses I took after I basically decided I should get a degree in engineering rather than math. I may not have known I was interested in the subject before I took the engineering courses, but with the benefit of hindsight it's clear that I should have skipped the engineering and gone straight for the math.
Second of all, I don't know where I got the idea that if I was unfamiliar with the subject, I wouldn't be interested in it. (This may be related to the being smart vs. working hard distinction from last time. The existence of topics I'm not familiar with may imply that I'm not smart, so if I was being motivated by seeming smart rather than by working hard, that was reason to stay away.) Regardless, this idea is clearly ridiculous. I studied some abstract algebra last fall. I went in not knowing what groups or rings were, and now I have at least the basics down. Getting the basics is less important to me than the fact that the whole time I was studying, I was thinking, "This is great. Why didn't I take this course a long time ago?" Lack of knowledge is not an excuse. There was always plenty of evidence that I was interested in all the math I did know, so it's not surprising that this interest extends to subjects I don't know.
Abstract algebra also dents the objection that upper level math isn't about numbers. Abstract algebra is about sets, which are made up of things, and those things are not necessarily numbers. So a lot of abstract algebra isn't really about numbers. But one of the central ideas of abstract algebra is that different sets can behave the same way, so when you're talking about one set you are really talking about all similar sets. Which means it's often convenient, when talking about some arbitrary set, to find a similar set of numbers and talk about that one instead.
One interpretation is that abstract algebra isn't really about numbers. Another interpretation is that abstract algebra allows you to talk about all kinds of things which are not numbers as if they in fact are numbers. The numbers don't go away. Instead abstract algebra almost makes the numbers more powerful by making them more universal.
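A minimal sketch of that idea in Python (my own toy example, not from any text): addition mod 4 and multiplication of the powers of i look like different sets with different operations, but the map k → iᵏ translates one into the other exactly, so anything you learn about one group is a fact about the other.

```python
# Map each residue k mod 4 to the complex number i**k. Python
# computes small integer powers of complex numbers exactly here,
# so the four values are 1, i, -1, -i with no rounding error.
phi = {k: 1j ** k for k in range(4)}

for a in range(4):
    for b in range(4):
        # "Add then map" gives the same answer as "map then multiply":
        # the map turns + (mod 4) into * on the powers of i.
        assert phi[(a + b) % 4] == phi[a] * phi[b]
```

This is the "talk about a similar set of numbers instead" move in miniature: questions about rotations, roots of unity, or residues mod 4 all reduce to the same four-element structure.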
Last fall, speed was more important than depth in my crash course in abstract algebra. I ended up with a pretty cursory understanding of the subject. But now I'm looking forward to going back and studying it in more depth. It's clear I was making excuses for why I shouldn't be doing this before, but the evidence is pretty overwhelming that those excuses were wrong. Now I'm looking for opportunities to study more math instead, and I'm a lot happier as a result.
A side note on the psychology of independently studying math: for several years, I've been big on the idea that it's far better to praise people for working hard than for being smart. There's some independent evidence for this. People who have been praised for being smart eventually hit problems which are hard for them, and then they stop trying. People who have been praised for working hard view hard problems as an opportunity to demonstrate how hard they work.
Generally I feel this way about myself and math. My strength isn't that I'm smart (which is not to say I'm not). My strength is that I have an almost obsessive need to solve problems that are put in front of me. At least, that's what I tell myself most of the time.
However, I've been flipping through my real analysis text and noticing my emotional reactions to some topics that are coming up. There's basic cluelessness, such as topology, where I don't know the basic vocabulary. I look at the text and it's all jargon that I don't understand. There's a level on which I feel like I can't be interested in this because I don't even understand what it's about. On the other hand, I have confidence that I will figure it out when I get there.
Infinite series is a subject that's a little more complicated. I have a long-standing feeling that I don't really understand them. However, I reviewed the chapter on infinite series in my calculus textbook recently, and I felt like I was on top of the material as far as it went. The presentation in my calculus textbook was more concerned with whether a particular series converges than with what value it converges to. It presented a bunch of convergence tests, but I'm not even sure it demonstrated why the tests work. I can look at a series and apply the tests, but I still feel unsatisfied. Maybe the issue is just that the calculus text didn't go far enough. In that case, real analysis may give me what I'm looking for.
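To make the complaint concrete, here's roughly what the calculus-book treatment amounts to, sketched in Python for the geometric series 1 + 1/2 + 1/4 + ... (my own example): the ratio test certifies convergence but says nothing about the value, which the partial sums have to reveal separately.

```python
from fractions import Fraction

# Terms of the geometric series 1 + 1/2 + 1/4 + ..., kept exact.
terms = [Fraction(1, 2) ** n for n in range(20)]

# Ratio test: every ratio of consecutive terms is 1/2, which is
# less than 1, so the test says the series converges...
ratios = {terms[n + 1] / terms[n] for n in range(19)}
assert ratios == {Fraction(1, 2)}

# ...but the test is silent about the limit. The partial sum shows
# the value: it sits exactly (1/2)**19 below 2.
partial = sum(terms)
print(partial)  # 1048575/524288
assert 2 - partial == Fraction(1, 2) ** 19
```

For a general series the gap is worse, since most convergence tests give no closed form for the limit at all; a proof-level treatment of why the tests work is exactly what an analysis course promises.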
And then there's vector calculus. I came out of multivariable calculus not understanding vector calculus at all. Last fall, I worked out enough understanding that I could correctly answer test problems, but I still feel like I have no idea what's going on. There's a section on vector calculus in my analysis textbook that fills me with dread. On a certain level, I think this is totally irrational. When I get there, I'll be well prepared, and I will take as long as I need to work out what's going on. In all likelihood, once I figure it out, I'll wonder what the big deal was.
I went through this process with trigonometric substitution. I completely failed to get it the first time around in calculus, then tried it again recently and found it straightforward but tedious. At that point I sort of resented it, until I realized that it's actually useful for evaluating line integrals.
So I expect that similar patterns will play out in the future. Topics will pass from completely unknown to incomprehensible to doable but pointless or annoying to actually useful. They may not all go through every stage, or in that order, but as long as I keep working hard, the comprehensible and exciting stuff should keep growing.
This leads to what I feel is the big weakness of independent study. There's no feedback. I think I mostly sort of understand what's going on so far as I've been working on real analysis, but I'm not sure. When I try to work out a problem from the book, I can feel like I'm both overthinking the problem and missing the point. Am I assuming things which I haven't proved? Is my reasoning not rigorous enough? Or am I trying to unnecessarily reinvent the wheel? In a classroom, there's much more opportunity to see how others are approaching problems and to just ask the professor for clarification. I have the book and the notes, but mostly I have to figure it out on my own. I know I've missed badly in some cases, but I hope most of the time I'm doing well enough. I also hope that the farther I get, the easier it will be to determine which parts of the stuff I've already covered were the important ones.