O Sweet Mr Math

wherein is detailed Matt's experiences as he tries to figure out what to do with his life. Right now, that means lots of thinking about math.

Thursday, August 30, 2012

10:35 PM

So, continuing my discussion of the big picture of real analysis, I ended last time talking about sequences of functions, and the idea that under the right conditions (the sequence is uniformly convergent, or the sequence of derivatives is uniformly convergent), the limit of the sequence shares the important properties of the functions in the sequence. This means that if you have a function which is computationally ugly, you can potentially rewrite it as the limit of a sequence of functions which are easy to work with.

There are two important examples of sequences of functions which converge uniformly. The first is polynomials. Polynomials are easy to work with, so if you can rewrite an ugly function as a polynomial, you can turn hard problems into easy problems. In particular, a Taylor series is a sequence of polynomials which approximates a function. Taylor series have important limitations, in that the original function must be infinitely differentiable and Taylor series do not always converge, but when they do work, they are a powerful and convenient tool.
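
To make that concrete, here's a quick numerical sketch (a toy example of my own, not anything from the textbook) in Python, comparing partial sums of the Taylor series for e^x against the real thing. The error shrinks as more terms are added:

    import math

    def taylor_exp(x, n):
        # Sum the first n+1 terms of the Taylor series for e^x about 0.
        return sum(x**k / math.factorial(k) for k in range(n + 1))

    x = 1.5
    for n in (2, 5, 10, 20):
        approx = taylor_exp(x, n)
        print(n, approx, abs(approx - math.exp(x)))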

The second example of a sequence of functions which converges uniformly is the Fourier series. Given a function which is bounded and periodic (meaning it repeats itself), you can write the function as a sum of sine and cosine functions. Sines and cosines are difficult to evaluate by hand, but they are easy to work with (for example, to differentiate and integrate). Their smoothly undulating curves are also pretty. It's aesthetically appealing to be able to convert an angular, sharp-edged function into a sum of beautiful waves.
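
For example (again just a toy sketch of my own, using the standard textbook series for a triangle wave, the periodic extension of |x| on [-pi, pi]), you can watch the partial sums of sines and cosines close in on the true value at a point:

    import math

    def triangle_partial(x, n_terms):
        # Partial Fourier sum for the periodic extension of |x| on [-pi, pi]:
        # pi/2 - (4/pi) * sum of cos(kx)/k^2 over odd k.
        total = math.pi / 2
        for k in range(1, 2 * n_terms, 2):   # odd harmonics 1, 3, 5, ...
            total -= (4 / math.pi) * math.cos(k * x) / (k * k)
        return total

    x = 1.0                                  # the true value here is |1.0| = 1.0
    for n in (1, 3, 10, 50):
        print(n, triangle_partial(x, n))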

I've previously encountered Fourier series in other contexts, and while I studied the basic math to some extent, there was an element of, "we are justified in using Fourier series because they give the correct results in practice." I found it personally satisfying to come back to them and be able to say that we are mathematically justified in using them because we can mathematically prove that they give the expected results.

This is as far as I've gone in studying real analysis. It all comes back to using limits as a tool to say that these are the conditions under which we are allowed to do certain mathematical operations, and these are the conditions under which the operations will fail. Along the way, I started with the concept of sequences of numbers, and eventually extended that idea to sequences of functions. The fact that functions can be inserted in a place where I expected to use numbers has also had the effect of changing how I think about functions in general.

Real analysis goes on from here, leading to questions like "what is the mathematical definition of length?" (Think about it. A line segment has some length. But a line is made up of points, and points have no length. So where does the length of the line come from?) This leads to questions like, "can you have a set of points which is not a line but which also has a length?" I'm interested in these questions, which start to have a metaphysical significance, but I'm happy to stop here with my current studies for now.


Wednesday, August 29, 2012

9:43 PM

I want to continue my thoughts about real analysis from last time. As I said then, the central concept behind real analysis is the limit. But limits aren't the point of real analysis, they are a tool. So what can we do with limits?

First, we can determine continuity. Continuity is the idea that there are no breaks or gaps in something. As applied to functions, continuity means that there are no sudden jumps in the value of the function. In other words, if a particular input produces a particular output, any inputs near the original input will have outputs near the original output. Or mathematically, a function is continuous at a point if the limit of the function at that point equals the value of the function at that point. Being "near" a value is sort of a nebulous concept, and I'm not going to define it precisely here. I will say that being "near" an output at a particular input does have a precise definition, but that the definition depends on the function and also on the input value.
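
A tiny sketch of my own in Python (not a proof, just poking at the idea along one sequence of inputs) shows the difference between a function that behaves this way at a point and one that doesn't:

    def f_continuous(x):
        return x * x                      # continuous everywhere

    def f_jump(x):
        return 0.0 if x < 1.0 else 1.0    # jumps suddenly at x = 1

    x0 = 1.0
    for n in (1, 2, 4, 8, 16):
        x = x0 - 1.0 / n                  # inputs creeping up on x0 from below
        print(x, f_continuous(x), f_jump(x))
    # f_continuous(x) closes in on f_continuous(1.0) = 1.0,
    # but f_jump(x) stays at 0.0 even though f_jump(1.0) = 1.0.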

Starting with the statement that a function is continuous at a particular point, we can extend this to say that the function is continuous everywhere. Sometimes we can only say that a function is continuous at most places, but there are some places where it is not continuous. This is nearly as useful as being continuous everywhere. We may also be able to say that a function is uniformly continuous, in which case we can use the same definition for "near" for every point in the domain of the function.

Why do we care about continuity? If we know that a function is continuous, we know a lot about that function. There are a bunch of important theorems from calculus, such as the intermediate value theorem and the mean value theorem, which depend on the continuity of the function. Calculus uses these theorems, but often does not prove them. A calculus textbook will say something like, "this looks like it should be true, and it is true, but for the proof, check an analysis book." Continuous functions are easier to work with than discontinuous functions. The bad news is that plenty of functions are not continuous, so continuity can't be taken for granted, and proving it is an important step in working with a function.

Continuity leads to calculus. Again using limits as a tool, we can demonstrate that differentiation and integration actually work. Both concepts are based on approximation, and there's an assumption that the approximations are actually meaningful. Using limits, analysis proves that the approximations are correct.

When approaching limits using sequences, derivatives are based on a sequence of points. Starting with a fixed input point, take a sequence of points approaching that point; the limit of the ratio of the change in outputs to the change in inputs between the fixed point and the points of the sequence is the derivative of the function. Integrals are based on a sequence of subdivisions. Subdivide the total area into a set of smaller areas which approximate the original. As the subdivisions get finer, with more and smaller pieces, the limit of the approximate areas is the true area, or the integral.
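
Here's what both of those limits look like numerically (a rough sketch of my own, using the sine function just because it's easy to check):

    import math

    f = math.sin
    x0 = 1.0

    # Derivative: the limit of (change in output) / (change in input)
    # as a second point slides toward x0. It should approach cos(1.0) ~ 0.5403.
    for n in (1, 4, 8, 16):
        h = 2.0 ** -n
        print(n, (f(x0 + h) - f(x0)) / h)

    # Integral: the limit of sums over finer and finer subdivisions of [0, pi].
    # The true area under sin on [0, pi] is 2.
    for n in (4, 16, 64, 256):
        width = math.pi / n
        print(n, sum(f(i * width) * width for i in range(n)))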

You can also take the limit of a sequence of functions, and this is where the real fun begins. Start with some function which is hard to work with. Maybe there's no way to directly find the value of the function for a particular input, or you can do it but it's too much work. You can approximate the original function with another function that is easier to use. If you have a sequence of these approximate functions, you may be able to show that the limit of the sequence equals the original function. This justifies using one of the approximations instead of the original function.

How do you prove that these approximations are actually good, useful approximations? With limits, of course. Looking at the continuity of a function, we distinguished between ordinary continuity, which means that if two inputs are near each other then their outputs will also be near each other, but where the definition of "near" depends on the input, and uniform continuity, which states that "near" has the same definition for every input. Likewise, we can define uniform convergence for a sequence of functions.

Take a particular function in the sequence. This function approximates the original function, but is not exactly the same. The difference at a particular input between the approximation and the original is called the error of the approximation at that point. If every function in the sequence has a maximum error, and if the limit of the maximum error is 0 as you progress through the sequence, then the sequence converges uniformly. (For a counterexample, picture a function which goes to infinity at one point. Then imagine a sequence of functions which are generally close to the original function at all other points, but have a finite value at that point. Even if that value grows with each function in the sequence, so that the limit of the sequence is the same as the original function, the error near that point is unbounded for every function in the sequence, so the sequence does not converge uniformly.)
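
You can watch the maximum error shrink numerically. Here's a sketch of my own (checking the error on a grid of sample points rather than proving anything), using Taylor polynomials for e^x on the interval [0, 1]:

    import math

    def taylor_exp(x, n):
        # Sum of the first n+1 Taylor series terms for e^x about 0.
        return sum(x**k / math.factorial(k) for k in range(n + 1))

    grid = [i / 1000 for i in range(1001)]     # sample points covering [0, 1]
    for n in (1, 3, 6, 10):
        max_err = max(abs(taylor_exp(x, n) - math.exp(x)) for x in grid)
        print(n, max_err)                      # the worst-case error heads toward 0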

You can use limits to show that if every function in a sequence is continuous and the sequence converges uniformly, then the function they converge to must be continuous. Likewise, for a sequence of functions which converges uniformly, the limit of the sequence of integrals equals the integral of the limit. However, if a sequence converges uniformly, the limit of the sequence of derivatives does not necessarily equal the derivative of the limit of the sequence. This is one of those cases where it's easy to detect a pattern and assume it continues. Analysis and the application of limits show that your expectations break down. Applying the same techniques leads to the good news that if the derivatives converge uniformly to a function, then that function is the derivative of the limit. In the case of derivatives, the question isn't whether the functions in the sequence converge uniformly. It's whether the derivatives of the functions in the sequence converge uniformly.
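
The usual counterexample for derivatives (a standard textbook-style example, not something from my earlier posts) is easy to see numerically: the functions sin(nx)/sqrt(n) converge uniformly to the zero function, since the biggest they ever get is 1/sqrt(n), but their derivatives sqrt(n)*cos(nx) blow up instead of converging to the derivative of the zero function:

    import math

    def f(n, x):
        # Max size is 1/sqrt(n), so these converge uniformly to the zero function.
        return math.sin(n * x) / math.sqrt(n)

    def f_prime(n, x):
        # The derivative of f(n, x).
        return math.sqrt(n) * math.cos(n * x)

    x = 0.0
    for n in (1, 10, 100, 10000):
        print(n, f(n, x), f_prime(n, x))
    # f(n, 0) is always 0, but f_prime(n, 0) = sqrt(n) grows without bound,
    # so the sequence of derivatives does not converge at all.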

There's a big payoff to sequences of functions and uniform convergence, but this post has gone on long enough, so it will have to wait until tomorrow.


Tuesday, August 28, 2012

4:18 PM

I had been posting about real analysis. I worked through the definition of the real numbers, then moved on to topology and sets, and finally had started working on sequences and series. Then I stopped. Truthfully, I hadn't intended to cover the material in quite so much detail when I started, but going over all the details came pretty naturally to me. I had also been intending to keep pace with my own studying. I ended up falling behind, and then I ran out of time for any blogging at all.

In the meantime I kept studying, and now I have worked through two semesters' worth of analysis. While I'd like to return to blogging in detail, I've decided to take advantage of the pause to step back and look at the larger picture, now that I have enough understanding to see it.

The fundamental idea behind real analysis is the limit. Limits are a powerful tool for understanding lots of math concepts, and real analysis is about developing the use of limits and then applying them to various problems.

Loosely speaking, the idea behind limits is that two things are near each other. The things in question could be numbers, or points in space, or sets, or functions. There's a precise mathematical definition for limits, which involves Greek letters (and causes some people to run in terror), but today I just want to talk about the general concept.

In many mathematical contexts, the standard is exact equality. High school algebra is all about showing that the left hand side of an equation is exactly equal to the right hand side. With limits, we say that two things are not exactly the same, but that's okay as long as they are near each other. This can feel like it's a step back from true equality, and it can also feel unfocused.

But there's a tradeoff. Equality can only say that this thing is exactly the same as this other thing. Limits can let you say that everything near this thing is close to everything near this other thing. The ability to speak about lots of things at the same time gives limits more power than strict equality has.

You may be wondering why, if analysis is all about limits, I spent months blogging about sets and sequences. I did not know the answer at the time, but now I do. Just like limits are a tool used by analysis to talk about other stuff, we need tools to talk about limits. The first tool is sets and topology. One of the fundamental concepts of topology is distance, which gives us the ability to talk about whether two things are near each other. The theory of sets we developed, for example the properties of compact sets, gives us tools to talk about limits.

Similarly, sequences give us different tools to talk about limits. The important thing here is that although sets and sequences give us different tools, they come to the same conclusions. Anything that can be demonstrated about limits using sets can also be demonstrated using sequences, and which one to use is just a question of convenience. This equivalence can also be used for sets and sequences to say things about each other, so using both tools allows us to get a deeper understanding of each tool individually.

I plan to have another post soon in which I will talk about what limits are useful for, again at a big picture level. I may also post about the big picture with sequences and series. My introduction to the concepts of sequences and series was in Calculus 2, and the idea has always felt a little half-baked. Now that I'm looking at them from the other side of analysis, I have a much better understanding of why we study them the way we do.



FAQ

What does "rolls a hoover" mean, anyway?

"Roll a hoover" was coined by Christopher Locke, aka RageBoy (not worksafe). He enumerated some Hooverian Principles, but that might not be too helpful. My interpretation is that rolling a hoover means doing something that you know is stupid without any clear sense of what the outcome will be, just to see what will happen. In my case, I quit my job in an uncertain economy to try to start a business. I'm still not sure how that will work out.

Why is the HTML for this page not valid?

BlogSpot adds the advertisement that appears at the top of this page. That advertisement is not valid HTML and is outside of my control. I believe that aside from that ad, this page is valid HTML.