O Sweet Mr Math

wherein is detailed Matt's experiences as he tries to figure out what to do with his life. Right now, that means lots of thinking about math.

Wednesday, August 29, 2012

9:43 PM

I want to continue my thoughts about real analysis from last time. As I said then, the central concept behind real analysis is the limit. But limits aren't the point of real analysis; they are a tool. So what can we do with limits?

First, we can determine continuity. Continuity is the idea that there are no breaks or gaps in something. As applied to functions, continuity means that there are no sudden jumps in the value of the function. In other words, if a particular input produces a particular output, any inputs near the original input will have outputs near the original output. Or mathematically, a function is continuous at a point if the limit of the function at that point equals the value of the function at that point. Being "near" a value sounds like a nebulous concept, and I'm not going to define it precisely here. I will say that "near" does have a precise definition, but that the definition depends on the function and also on the input value.
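To make that a little more concrete, here is the standard way to write the definition (standard textbook notation, not spelled out in the post): a function $f$ is continuous at a point $c$ if

$$\lim_{x \to c} f(x) = f(c),$$

which unpacks to: for every $\varepsilon > 0$ there is a $\delta > 0$ such that

$$|x - c| < \delta \implies |f(x) - f(c)| < \varepsilon.$$

The $\varepsilon$ is the precise version of "near the output," and the $\delta$, which may depend on both the function and the point $c$, is the precise version of "near the input."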

Starting with the statement that a function is continuous at a particular point, we can extend this to say that the function is continuous everywhere. Sometimes we can only say that a function is continuous at most places, but there are some places where it is not continuous. This is nearly as useful as being continuous everywhere. We may also be able to say that a function is uniformly continuous, in which case we can use the same definition for "near" for every point in the domain of the function.
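To show the contrast in symbols (again, standard definitions rather than anything from the post): ordinary continuity on a set $E$ lets $\delta$ depend on the point,

$$\forall c \in E \;\forall \varepsilon > 0 \;\exists \delta > 0 : |x - c| < \delta \implies |f(x) - f(c)| < \varepsilon,$$

while uniform continuity demands one $\delta$ that works everywhere at once,

$$\forall \varepsilon > 0 \;\exists \delta > 0 \;\forall c \in E : |x - c| < \delta \implies |f(x) - f(c)| < \varepsilon.$$

A standard example of the difference: $f(x) = 1/x$ is continuous on $(0,1)$ but not uniformly continuous there, because the $\delta$ you need shrinks as $c$ gets close to $0$.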

Why do we care about continuity? If we know that a function is continuous, we know a lot about that function. There are a bunch of important theorems from calculus, such as the intermediate value theorem and the mean value theorem, which depend on the continuity of the function. Calculus uses these theorems, but often does not prove them. A calculus textbook will say something like, "this looks like it should be true, and it is true, but for the proof, check an analysis book." Continuous functions are easier to work with than discontinuous functions. The bad news is that most functions are discontinuous, so proving continuity is an important step for working with a function.
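For reference, here is what those two theorems say (standard statements, quoted from memory rather than from any particular textbook): the intermediate value theorem says that if $f$ is continuous on $[a,b]$ and $y$ lies between $f(a)$ and $f(b)$, then

$$\exists\, c \in [a,b] : f(c) = y,$$

and the mean value theorem says that if $f$ is continuous on $[a,b]$ and differentiable on $(a,b)$, then

$$\exists\, c \in (a,b) : f'(c) = \frac{f(b) - f(a)}{b - a}.$$

Both conclusions can fail if you drop the continuity hypothesis.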

Continuity leads to calculus. Again using limits as a tool, we can demonstrate that differentiation and integration actually work. Both concepts are based on approximation, and there's an assumption that the approximations are actually meaningful. Using limits, analysis proves that the approximations are correct.

When approaching limits using sequences, derivatives are based on a sequence of points. Start with a fixed input point and take a sequence of points approaching it. For each point in the sequence, form the ratio of the change in output to the change in input between it and the fixed point. The limit of that ratio is the derivative of the function at the fixed point. Integrals are based on a sequence of subdivisions. Subdivide the total area into smaller pieces whose combined area approximates the original. As the subdivisions get finer and the pieces smaller, the limit of the approximate area is the true area, or the integral.
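Written out (standard formulas, not in the original post), the derivative at a point $a$ is the limit of difference quotients along any sequence $x_n \to a$ with $x_n \neq a$,

$$f'(a) = \lim_{n \to \infty} \frac{f(x_n) - f(a)}{x_n - a},$$

and the integral is the limit of approximating sums built from subdivisions $a = x_0 < x_1 < \dots < x_n = b$ with sample points $t_k \in [x_{k-1}, x_k]$,

$$\int_a^b f(x)\,dx = \lim \sum_{k=1}^{n} f(t_k)\,(x_k - x_{k-1}),$$

where the limit is taken as the widths of the subintervals shrink to zero.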

You can also take the limit of a sequence of functions, and this is where the real fun begins. Start with some function which is hard to work with. Maybe there's no way to directly find the value of the function for a particular input, or you can do it but it's too much work. You can approximate the original function with another function that is easier to use. If you have a sequence of these approximate functions, you may be able to show that the limit of the sequence equals the original function. This justifies using one of the approximations instead of the original function.
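A concrete example (my illustration, not one from the post): the partial sums of the exponential series,

$$f_n(x) = \sum_{k=0}^{n} \frac{x^k}{k!},$$

are plain polynomials, easy to evaluate, and on any bounded interval they converge uniformly to $e^x$. That is exactly the kind of justification that lets you compute with the polynomial approximation and trust that the answer says something about the harder function.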

How do you prove that these approximations are actually good, useful approximations? With limits, of course. And limits can be very useful here. Looking at the continuity of a function, we distinguished between ordinary continuity, where nearby inputs produce nearby outputs but the definition of "near" can change from input to input, and uniform continuity, where "near" has the same definition for every input. Likewise, we can define uniform convergence for a sequence of functions.

Take a particular function in the sequence. This function approximates the original function, but is not exactly the same. The difference between the approximation and the original at a particular input is called the error of the approximation at that point. If every function in the sequence has a maximum error, and if that maximum error goes to 0 as you progress through the sequence, then the sequence converges uniformly. (For a counterexample, picture a function which goes to infinity at one point. Then imagine a sequence of functions which are close to the original function everywhere else, but take a finite value at that point. If that value grows with each function in the sequence, the limit of the sequence will be the same as the original function, but the error at that point is infinite for every function in the sequence, so the convergence is not uniform.)
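In symbols, the maximum-error condition is the usual definition of uniform convergence (standard notation, not used in the post): $f_n \to f$ uniformly when

$$\lim_{n \to \infty} \sup_{x} |f_n(x) - f(x)| = 0.$$

Pointwise convergence only asks that $f_n(x) \to f(x)$ separately at each input $x$; uniform convergence asks that the worst-case error over all inputs go to zero.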

You can use limits to show that if every function in a sequence is continuous and the sequence converges uniformly, then the function they converge to must be continuous. Likewise, for a sequence of functions which converges uniformly, the limit of the sequence of integrals equals the integral of the limit. However, even if a sequence converges uniformly, the limit of the sequence of derivatives does not necessarily equal the derivative of the limit of the sequence. This is one of those cases where it's easy to detect a pattern and assume it continues, and analysis and the application of limits show where that expectation breaks down. Applying the same techniques leads to the good news that if the derivatives converge uniformly to a function, then that function is the derivative of the limit. In the case of derivatives, the question isn't whether the functions in the sequence converge uniformly. It's whether the derivatives of the functions in the sequence converge uniformly.
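A standard counterexample (my illustration, not from the post) shows why the hypothesis has to be on the derivatives: take

$$f_n(x) = \frac{\sin(n^2 x)}{n}, \qquad \sup_x |f_n(x)| = \frac{1}{n} \to 0,$$

so the $f_n$ converge uniformly to the zero function, whose derivative is $0$ everywhere. But

$$f_n'(x) = n \cos(n^2 x), \qquad f_n'(0) = n \to \infty,$$

so the derivatives do not converge at all, let alone to the derivative of the limit.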

There's a big payoff to sequences of functions and uniform convergence, but this post has gone on long enough, so it will have to wait until tomorrow.

FAQ

What does "rolls a hoover" mean, anyway?

"Roll a hoover" was coined by Christopher Locke, aka RageBoy (not worksafe). He enumerated some Hooverian Principles, but that might not be too helpful. My interpretation is that rolling a hoover means doing something that you know is stupid without any clear sense of what the outcome will be, just to see what will happen. In my case, I quit my job in an uncertain economy to try to start a business. I'm still not sure how that will work out.

Why is the HTML for this page not valid?

BlogSpot adds the advertisement that appears at the top of this page. That advertisement is not valid HTML and is outside of my control. I believe that aside from that ad, this page is valid HTML.